Over the last four months b2cloud has been busily building Google Glass apps for some of the most forward-thinking companies in Australia, not to mention the neat little apps we have developed for demonstration at events around Australia, including Glass Tabs – learning guitar through Google Glass.
With iOS 5, Apple has opened up the Core Image framework to developers. It offers many image processing functions, but one in particular caught my attention: the CIDetector class. Currently it only does face detection, but it hints at further feature detection in the future.
With minimal code you can easily detect faces within a picture, including the locations of the eyes and mouth. Here is how:
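As a rough sketch of the approach, here is the detection flow in modern Swift (the iOS 5 API was Objective-C, but the classes and constants are the same); the blank placeholder image is just an assumption so the snippet is self-contained:

```swift
import CoreImage
import CoreGraphics

// Placeholder input: a solid grey CIImage. In a real app you would
// create the CIImage from a photo, e.g. CIImage(image: uiImage).
let image = CIImage(color: CIColor(red: 0.5, green: 0.5, blue: 0.5))
    .cropped(to: CGRect(x: 0, y: 0, width: 100, height: 100))

// Create a face detector. CIDetectorTypeFace is the only detector
// type available at this point.
let detector = CIDetector(ofType: CIDetectorTypeFace,
                          context: nil,
                          options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])

// Run detection; each result for a face detector is a CIFaceFeature,
// which exposes the bounds plus eye and mouth positions when found.
let features = detector?.features(in: image) ?? []
for case let face as CIFaceFeature in features {
    print("Face at \(face.bounds)")
    if face.hasLeftEyePosition  { print("Left eye:  \(face.leftEyePosition)")  }
    if face.hasRightEyePosition { print("Right eye: \(face.rightEyePosition)") }
    if face.hasMouthPosition    { print("Mouth:     \(face.mouthPosition)")    }
}
```

The detector is relatively expensive to create, so in practice you would build it once and reuse it across images rather than constructing it per call.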
So you’re coding away, everything is coloured nicely so you can distinguish between reserved words, datatypes and variables, but then the unthinkable happens: all your code turns black. Not to worry, you can live without the colours, but when you see the “symbol not found” message and Xcode is no longer autocompleting variables and functions for you, you start to panic. Your development speed grinds to a halt and you can no longer quickly jump from method to method. You realise Xcode has broken its code-completion index.
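One common remedy (an assumption on my part, not a step the post itself has prescribed yet) is to close Xcode and delete its DerivedData folder, which holds the build products and the index, then reopen the project and let Xcode re-index from scratch:

```shell
# Quit Xcode first, then remove the cached index and build products.
# Xcode recreates this folder and rebuilds the index on next launch.
rm -rf ~/Library/Developer/Xcode/DerivedData
```

The re-index can take a few minutes on a large project, but syntax colouring and autocompletion normally return once it finishes.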