We are thrilled to announce some groundbreaking work we have been doing with Telstra for Google Glass. Over the past six months we have developed two world-first Google Glass apps designed for the visually and
With iOS 5, Apple has introduced the Core Image framework to developers. It offers many image-processing functions, but one class in particular caught my attention: CIDetector. Currently it only performs face detection, but the API hints at further feature detection in the future.
With minimal code you can easily detect faces within a picture, including the locations of the eyes and mouth. Here is how:
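The original post predates Swift, so here is a minimal modern Swift sketch of the same idea. The image path `"photo.jpg"` is a hypothetical placeholder; the CIDetector API calls themselves are the real Core Image ones.

```swift
import CoreImage

// Load an image to analyse ("photo.jpg" is a placeholder path).
guard let image = CIImage(contentsOf: URL(fileURLWithPath: "photo.jpg")) else {
    fatalError("Could not load image")
}

// Create a face detector; CIDetectorAccuracyHigh trades speed for precision.
let detector = CIDetector(ofType: CIDetectorTypeFace,
                          context: nil,
                          options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])

// Each detected feature is a CIFaceFeature carrying the face's bounds
// plus optional eye and mouth positions.
for case let face as CIFaceFeature in detector?.features(in: image) ?? [] {
    print("Face at \(face.bounds)")
    if face.hasLeftEyePosition  { print("  left eye:  \(face.leftEyePosition)") }
    if face.hasRightEyePosition { print("  right eye: \(face.rightEyePosition)") }
    if face.hasMouthPosition    { print("  mouth:     \(face.mouthPosition)") }
}
```

Note that the positions are reported in Core Image's coordinate space, which has its origin at the bottom-left, so you may need to flip the y-axis before drawing over a UIKit view.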
So you’re coding away, everything is coloured nicely so you can distinguish between reserved words, datatypes and variables, but then the unthinkable happens: all your code turns black. Not to worry, you can live without the colours, but when you see the “symbol not found” message and Xcode is no longer autocompleting variables and functions for you, you start to panic. Your development grinds to a halt and you can no longer quickly jump from method to method. You realise Xcode has broken its intellisense index.