b2cloud

27th September 2011

Obtaining Luminosity from an iOS Camera


Unfortunately Apple does not give developers access to the ambient light sensor on the top of its iOS devices (the one used to measure ambient brightness and adjust the screen's brightness accordingly), and by access I mean direct access to its output rather than the proximity state that can be inferred from it. This leaves the camera as the only way to measure the ambient light in the operating environment.
Something you may be wondering is: how do we measure light? What values can we use to represent it? The answer is a unit called the candela. To paraphrase Wikipedia, the candela measures the power of a light source in a particular direction (in this case, the direction the iPhone's camera is facing). However, the candela alone is not the value we need here, because we are not focusing on just one point on the camera's sensor but on as many as we can. That brings us to luminance, which is candela per square metre.
To calculate the luminance of a particular pixel, we simply take its red, green and blue values (since the source is a camera there should be no meaningful alpha values) and use the following formula to convert them into a grayscale pixel:

double luminance = r*0.299 + g*0.587 + b*0.114;

The reason these values are weighted is that the human eye perceives pure red, green and blue at different brightnesses, with green appearing the brightest and blue the darkest (hence the large gap between the green and blue weightings). What we have actually done here is calculate a grayscale value for our pixel, but here's the trick: grayscale is luminance. The grayscale value of a pixel and its luminance are one and the same number. To calculate the luminance over an entire image, you simply take the mean of the luminance values of all its pixels.
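To make those weightings concrete, here is a quick worked check (the pixel values are hypothetical pure colours, not taken from any of the test images):

// Pure red: only the 0.299 weighting contributes
double redLuminance = 255*0.299 + 0*0.587 + 0*0.114;	// 76.245
// Pure green: the 0.587 weighting makes this the brightest
double greenLuminance = 0*0.299 + 255*0.587 + 0*0.114;	// 149.685
// Pure blue: the 0.114 weighting makes this the darkest
double blueLuminance = 0*0.299 + 0*0.587 + 255*0.114;	// 29.07

A pure green pixel maps to a grey almost twice as bright as pure red, and more than five times brighter than pure blue.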
So, to test this theory, six images will be used: two of night, two of twilight and two of day. The following code was used (along with the UIImage pixel category I have previously made):

NSArray* dayArray = [NSArray arrayWithObjects:@"night",@"twilight",@"day",nil];
for(NSString* day in dayArray)
{
	for(int i=1;i<=2;i++)
	{
		UIImage* image = [UIImage imageNamed:[NSString stringWithFormat:@"%@%d.png",day,i]];
		unsigned char* pixels = [image rgbaPixels];
		double totalLuminance = 0.0;
		for(int p=0;p<image.size.width*image.size.height*4;p+=4)
		{
			totalLuminance += pixels[p]*0.299 + pixels[p+1]*0.587 + pixels[p+2]*0.114;
		}
		// Mean over every pixel, normalised from 0-255 down to 0-1
		totalLuminance /= image.size.width*image.size.height*255.0;
		NSLog(@"Luminosity: %f",totalLuminance);
	}
}

Below are the test images and their luminosity values (the images themselves are omitted here):

- Luminosity: 0.688431
- Luminosity: 0.465598
- Luminosity: 0.493491
- Luminosity: 0.381365
- Luminosity: 0.159368
- Luminosity: 0.100030

As you can see, these pictures all display different luminosities: the darker scenes have less light power and the brighter ones have more (if it helps, imagine how much power a solar cell would generate in each scene). Now simply plug this logic into the iOS camera using the AVCaptureVideoDataOutput API, as sketched below, and you can measure the amount of ambient light. There is one problem: the cameras in most iOS devices automatically adjust their exposure based on the amount of ambient light, which can skew the results and make them slightly less accurate. However, the adjustment is not large enough to make the output of this kind of algorithm meaningless.
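As a rough illustration, here is a minimal sketch of how the same calculation could run over live video frames. The AVFoundation and Core Video calls are the real APIs, but the LuminanceMeter class name, the startMetering method and the queue label are hypothetical, and error handling and session teardown are omitted:

#import <AVFoundation/AVFoundation.h>

@interface LuminanceMeter : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
- (AVCaptureSession*)startMetering;
@end

@implementation LuminanceMeter

- (AVCaptureSession*)startMetering
{
	AVCaptureSession* session = [[AVCaptureSession alloc] init];
	AVCaptureDevice* camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
	[session addInput:[AVCaptureDeviceInput deviceInputWithDevice:camera error:nil]];

	// Ask for 32BGRA frames so the byte layout is known ahead of time
	AVCaptureVideoDataOutput* output = [[AVCaptureVideoDataOutput alloc] init];
	output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
	[output setSampleBufferDelegate:self queue:dispatch_queue_create("luminance", NULL)];
	[session addOutput:output];
	[session startRunning];
	return session;
}

- (void)captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection
{
	CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
	CVPixelBufferLockBaseAddress(pixelBuffer, 0);

	unsigned char* pixels = (unsigned char*)CVPixelBufferGetBaseAddress(pixelBuffer);
	size_t width = CVPixelBufferGetWidth(pixelBuffer);
	size_t height = CVPixelBufferGetHeight(pixelBuffer);
	size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);	// rows may be padded

	double totalLuminance = 0.0;
	for(size_t y = 0; y < height; y++)
	{
		unsigned char* row = pixels + y*bytesPerRow;
		for(size_t x = 0; x < width*4; x += 4)
		{
			// BGRA ordering: blue is byte 0, green byte 1, red byte 2
			totalLuminance += row[x+2]*0.299 + row[x+1]*0.587 + row[x]*0.114;
		}
	}
	totalLuminance /= width*height*255.0;

	CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
	NSLog(@"Luminosity: %f",totalLuminance);
}

@end

Because the frames arrive continuously, this also meters the light without any shutter sound.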

Links:
- Luminance of images
- RGB to Grayscale Conversion Calculator

  • http://ruadvertising.com.au houman

    that you for the solution
    Where is rgbaPixels coming from??

    cheers

    • http://ruadvertising.com.au houman

      Bad English, I meant to say. Thank you for the solution.
      Cheers

    • http://www.b2cloud.com.au Josh Guest

      Hi there, please see our Obtaining pixel data from a UIImage blog post

  • Sam

    Dear sir,
    I was trying to implement this code using the AVCaptureVideoDataOutput API, but I have a few problems I need to get solved:
    1. Do we need to create an image from the frames to implement this?
    2. I created images and applied the algorithm. I set the video setting to 32BGRA, but it creates an error on the line
    totalLuminance += pixels[p]*0.299 + pixels[p+1]*0.587 + pixels[p+2]*0.114;
    with error -> CGBitmapContextCreate: unsupported parameter combination: 8 integer bits/component; 24 bits/pixel; 3-component color space; kCGImageAlphaNone; 1920 bytes/row
    Can you please help me?

    • http://www.b2cloud.com.au will

      Can you paste more of your code? You don’t necessarily need images as long as you have the pixel values

  • Mike

    Just wondering, is it possible to constantly get the ambient luminance? Like instead of using a photo taken by the camera (due to the sound that it makes), use video frames? I've seen some tutorials but none worked, and the tutorials on this site were the only ones that worked with no problem! I got the luminance calculation working taking a photo, but I need to calculate it constantly and the shutter sound is not "compatible" with what I want .. :( Thanks and congrats for the excellent material on this site!!!

  • Gal Blank

    the problem with all this is, if your phone is in your pocket or the camera is facing down, it will always get the night result
