Though Google put down a clear marker by pioneering smart glasses, a number of startups and research institutes now seem to be obtaining better results in the fields of gesture recognition and eye control.
Many hailed Google Glass as the most important mass-market technology innovation of 2013. Detractors labelled it a gimmicky gadget, but the product remains the benchmark for wearable devices, and Google appears to enjoy a clear advantage – unlike, for instance, the various contestants on the smartwatch playing field. Nevertheless, the Mountain View giant seems to be pulling along in its wake a swarm of innovators in the field of gesture recognition and eye control. While these developers are certainly not among the biggest players in the electronics industry, the startups and research institutes involved could well prove to be sharp pebbles in the seven-league boots of this particular giant. In any event, manipulating digital objects in augmented reality and navigating by eye control are certainly no longer Google’s exclusive preserve.
Non-geek glasses designed to improve gesture recognition
Creating “a device that is as elegant to look at as a pair of sunglasses”: this is a key ambition for Silicon Valley startup Meta in developing its version of smart glasses. The Meta Pro is a micro-computer that lets users operate their smartphone and computer via holograms; they can also sculpt ideas in augmented reality (AR) and then print them for real on a 3D printer. The prototype of the glasses looked more like a boxy headset and was decidedly geeky rather than elegant, but the second version is much sleeker and boasts fifteen times the display capacity of Google Glass. It is scheduled to go on general sale in July this year, by which time Meta hopes to do away entirely with the wire to a pocket computer that currently runs down the user’s back. Meanwhile, over 500 AR apps developed for the Meta Pro glasses are already available.
However, while the ability to control your environment with simple gestures can be very useful for a range of tasks, a person using these glasses in public risks coming across as rather eccentric, and may at the very least feel isolated. With this in mind, Germany’s Fraunhofer research institute has been working to develop 100% eye-controlled navigation. The Fraunhofer system provides access to a huge range of virtual information, and users can browse through virtual pages while still keeping in touch with the real world. Camera sensors embedded in the OLED screen register the direction of the wearer’s eye movements, and an image-processing program calculates the exact position of the pupils in real time. The developers have also incorporated an infrared light source into the frame of the glasses to ensure accurate positioning even in low light. The glasses are targeted primarily at the medical sector, as a particularly useful tool for surgeons and other practitioners, but the device also has the potential to make everyday life easier for physically disabled people. Accordingly, Germany’s Federal Ministry of Education and Research has launched a joint research project along these lines in which the Fraunhofer Institute is taking part. The key focus of the project is to find ways of helping people to perform physical tasks through ‘hands-free applications in augmented reality’, abbreviated to FAIR.
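The article does not disclose the details of Fraunhofer’s image-processing program, but the core step it describes – locating the wearer’s pupils in each camera frame – is commonly approached by exploiting the fact that the pupil is the darkest region of the eye, particularly under infrared illumination. The following is a minimal, hypothetical NumPy sketch of that general idea applied to a synthetic eye image; it is an illustration of the technique, not Fraunhofer’s actual algorithm:

```python
import numpy as np

def estimate_pupil_center(frame, dark_threshold=60):
    """Estimate the pupil position as the centroid of the darkest pixels.

    frame: 2-D uint8 grayscale image. Under IR lighting the pupil is
    normally the darkest region, so a simple intensity threshold is
    often enough to isolate it for this kind of sketch.
    Returns (x, y) in pixel coordinates, or None if no dark region found.
    """
    ys, xs = np.nonzero(frame < dark_threshold)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Synthetic 100x100 eye image: bright sclera (value 200) with a dark
# circular "pupil" (value 20) of radius 8 centred at (x=70, y=40).
frame = np.full((100, 100), 200, dtype=np.uint8)
yy, xx = np.ogrid[:100, :100]
frame[(xx - 70) ** 2 + (yy - 40) ** 2 <= 8 ** 2] = 20

cx, cy = estimate_pupil_center(frame)
print(round(cx), round(cy))  # the centroid recovers the pupil centre: 70 40
```

A real system would of course add calibration, per-eye tracking and robustness to eyelids, glints and motion blur, but the centroid-of-dark-pixels step above is a reasonable first approximation of what “calculating the exact position of the pupils in real time” involves.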