We all love Siri’s witty little quips, but I recently read a heartwarming article over on Mashable that illustrates the power of technology to change lives in ways most of us would never appreciate. I won’t spoil it for you.
With iOS 5, Apple has introduced the Core Image framework to developers. It offers many image-processing functions, but one in particular caught my attention: the CIDetector class. Currently it only supports face detection, but the API hints at further feature detection in the future.
With minimal code you can easily detect faces within a picture, including the locations of the eyes and mouth. Here’s how:
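The gist is to create a CIDetector of type face and ask it for the features in your image. A minimal sketch (the function name is my own, and it assumes you already have a CIImage, e.g. created from a UIImage’s CGImage):

```objc
#import <UIKit/UIKit.h>
#import <CoreImage/CoreImage.h>

// Detect faces in a CIImage and log each face's bounds,
// along with the eye and mouth positions where available.
void logFaceFeatures(CIImage *image)
{
    // CIDetectorTypeFace is currently the only detector type.
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:nil
                                              options:[NSDictionary dictionaryWithObject:CIDetectorAccuracyHigh
                                                                                  forKey:CIDetectorAccuracy]];

    // Returns an array of CIFaceFeature objects, one per detected face.
    NSArray *features = [detector featuresInImage:image];
    for (CIFaceFeature *face in features) {
        NSLog(@"Face at %@", NSStringFromCGRect(face.bounds));
        if (face.hasLeftEyePosition)  NSLog(@"  left eye at %@",  NSStringFromCGPoint(face.leftEyePosition));
        if (face.hasRightEyePosition) NSLog(@"  right eye at %@", NSStringFromCGPoint(face.rightEyePosition));
        if (face.hasMouthPosition)    NSLog(@"  mouth at %@",     NSStringFromCGPoint(face.mouthPosition));
    }
}
```

One thing to watch: the coordinates come back in Core Image’s coordinate space, with the origin at the bottom-left, so you may need to flip them before drawing over a UIKit view.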
So you’re coding away, everything is coloured nicely so you can distinguish between reserved words, data types and variables, but then the unthinkable happens: all your code turns black. Not to worry, you can live without the colours, but when you see the “Symbol not found” message and Xcode is no longer autocompleting variables and functions for you, you start to panic. Your development grinds to a halt and you can no longer quickly jump from method to method. You realise Xcode has broken its intellisense index.