26th October 2011

Face Detection in iOS 5

Guides | Tutorial

With iOS 5, Apple has opened up the Core Image framework to developers. It offers many image processing features, but one class in particular caught my attention: CIDetector. Currently it only does face detection, but it hints at further feature detection in the future.

With minimal code you can easily detect faces within a picture, including the locations of the eyes and mouth. Here is how:

Begin a ‘Single View’ application and find your app delegate class. At the top import the framework’s headers.

#import <CoreImage/CoreImage.h>

Also add the CoreImage framework under your target's Link Binary With Libraries build phase.

In the application:didFinishLaunchingWithOptions: method just before the return YES;, add the following code.

// Load the photo, then flip both the image view and the window vertically
UIImageView* imageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"steves.jpg"]];
[imageView setTransform:CGAffineTransformMakeScale(1, -1)];
[self.window setTransform:CGAffineTransformMakeScale(1, -1)];
[imageView setFrame:CGRectMake(0, 0, imageView.image.size.width, imageView.image.size.height)];
[self.window addSubview:imageView];
[imageView release];

This will add an image to your screen. You may notice I transform the UIImageView and then transform the window. Core Image uses a coordinate system whose origin is in the bottom-left corner, so the detected positions come out upside down relative to UIKit. As a quick fix I flip the image and then flip the window back, so everything looks upright again. The image will now be at the bottom of the screen. If you have a picture of a face, include it in the project, or use the one below; just make sure it is called ‘steves.jpg’.
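If you would rather not flip your view hierarchy, another approach is to convert each detected rectangle from Core Image's bottom-left-origin coordinates into UIKit's top-left-origin coordinates yourself. A minimal sketch (the helper function name is my own):

// Convert a rect from Core Image coordinates (origin bottom-left)
// to UIKit coordinates (origin top-left) for an image of the given height.
static CGRect CIRectToUIKitRect(CGRect rect, CGFloat imageHeight)
{
    rect.origin.y = imageHeight - rect.origin.y - rect.size.height;
    return rect;
}

With this you could leave the window and image view untransformed and simply convert each feature's bounds before building the overlay views shown later.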

The detector analyzes a new type of image called a CIImage (you may have previously seen CGImage and UIImage; this is another). Create a CIImage version of your photo:

CIImage* image = [CIImage imageWithCGImage:imageView.image.CGImage];

Now create a face detector.

CIDetector* detector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:[NSDictionary dictionaryWithObject:CIDetectorAccuracyHigh forKey:CIDetectorAccuracy]];

When making a detector, you first specify the type of features to be detected. At the moment there are only faces, specified with the CIDetectorTypeFace constant. Next come the detector's options, as an NSDictionary. I have gone with the highest level of accuracy, which I recommend for still shots; if you are doing real-time face detection you should probably use the CIDetectorAccuracyLow constant instead.
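For real-time use, the low-accuracy variant would look like this (same call as above, just swapping the accuracy constant):

// Trade accuracy for speed, e.g. when processing camera frames
CIDetector* fastDetector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:nil
                                              options:[NSDictionary dictionaryWithObject:CIDetectorAccuracyLow
                                                                                  forKey:CIDetectorAccuracy]];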

Pull out the features of the face and loop through them:

NSArray* features = [detector featuresInImage:image];

for(CIFaceFeature* feature in features) {
	// we will fill this in next
}
To show where the features are, I will add semi-transparent UIViews to the window.

for(CIFaceFeature* feature in features) {
	// Highlight the whole face in yellow
	UIView* face = [[UIView alloc] initWithFrame:feature.bounds];
	[face setBackgroundColor:[[UIColor yellowColor] colorWithAlphaComponent:0.4]];
	[self.window addSubview:face];
	[face release];

	// Mark the left eye in blue (check it was actually found first)
	if (feature.hasLeftEyePosition) {
		UIView* leftEye = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 15, 15)];
		[leftEye setBackgroundColor:[[UIColor blueColor] colorWithAlphaComponent:0.2]];
		[leftEye setCenter:feature.leftEyePosition];
		[self.window addSubview:leftEye];
		[leftEye release];
	}

	// Mark the right eye in red
	if (feature.hasRightEyePosition) {
		UIView* rightEye = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 15, 15)];
		[rightEye setBackgroundColor:[[UIColor redColor] colorWithAlphaComponent:0.2]];
		[rightEye setCenter:feature.rightEyePosition];
		[self.window addSubview:rightEye];
		[rightEye release];
	}

	// Mark the mouth in green
	if (feature.hasMouthPosition) {
		UIView* mouth = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 15, 15)];
		[mouth setBackgroundColor:[[UIColor greenColor] colorWithAlphaComponent:0.2]];
		[mouth setCenter:feature.mouthPosition];
		[self.window addSubview:mouth];
		[mouth release];
	}
}

Run the app and the features will be highlighted on top of the photo. Detection can take a couple of seconds on a device, so running it off the main thread is encouraged.
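Moving the work off the main thread could look something like this with Grand Central Dispatch (a sketch; the variable names match the code above):

// Run detection on a background queue, then hop back to the
// main queue before touching any UIKit views.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    NSArray* features = [detector featuresInImage:image];
    dispatch_async(dispatch_get_main_queue(), ^{
        for(CIFaceFeature* feature in features) {
            UIView* face = [[UIView alloc] initWithFrame:feature.bounds];
            [face setBackgroundColor:[[UIColor yellowColor] colorWithAlphaComponent:0.4]];
            [self.window addSubview:face];
            [face release];
        }
    });
});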

The detection works very well when the person is facing the camera head-on, but you can see it hasn’t detected the bottom-left Steve.
