ARM and eyeSight optimise gesture recognition for devices
October 16, 2013
eyeSight, a producer of gesture control technology, has completed extensive work in partnership with ARM to optimise its gesture recognition solution for ARM's Mali-T600 series Graphics Processing Units. Manufacturers using ARM Mali GPUs can now benefit from eyeSight's advanced gesture control capabilities, optimised using GPU Compute for improved robustness, accuracy and energy efficiency. The improved efficiency of gesture computation on the GPU will also enable a variety of new use cases, such as face and emotion detection, long-distance finger tracking, and even 3D motion recognition (such as finger pointing for selection).
The ARM Mali-T600 GPU Compute optimised engine from eyeSight provides a solution for enabling gesture control in mobiles, tablets, TVs and a range of other devices. Products featuring ARM Mali GPUs with eyeSight's gesture solution will allow users to control user interfaces (UIs) and content such as music and movies, activate applications, play games, or browse menus with simple yet powerful hand- and fingertip-level gestures.
With eyeSight's technology, Mali devices can recognise a rich language of gestures, including directional gestures (such as up, right and wave), hand signs (such as a 'thumbs-up'), and tracking of hands and even fingertips (with mouse-cursor accuracy).
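To illustrate what recognising a directional gesture from tracked fingertip positions might involve, here is a minimal sketch in Python. The function name, coordinate convention and threshold are purely illustrative assumptions for this article; they are not eyeSight's actual API, which is proprietary and, per the announcement, runs optimised on the Mali GPU rather than in host code like this.

```python
# Hypothetical sketch: classifying a directional swipe from a sequence
# of tracked fingertip positions. All names and thresholds here are
# illustrative assumptions, not eyeSight's real interface.

def classify_swipe(points, min_distance=50):
    """Classify a fingertip path as 'up', 'down', 'left' or 'right',
    or return None if the motion is too small to count as a gesture.

    points: list of (x, y) screen coordinates over time, origin at the
    top-left corner (y grows downward, as is typical for screens).
    """
    if len(points) < 2:
        return None
    # Net displacement from first to last tracked position.
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    if max(abs(dx), abs(dy)) < min_distance:
        return None  # movement too small; ignore as jitter
    # Dominant axis decides the gesture direction.
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```

For example, a path drifting rightward across the screen, such as `classify_swipe([(0, 100), (40, 98), (120, 95)])`, classifies as "right", while sub-threshold jitter returns None.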
“ARM is excited to be working closely with eyeSight to offer our mutual customers advanced gesture recognition technology,” commented Pete Hutton, executive vice president and general manager, Media Processing Division, ARM. “The optimisation of gesture middleware solutions using Mali GPU Compute in combination with ARM Cortex-A processors using NEON technology is an industry first. It enables impressive performance, accuracy, robustness and efficiency. Developers no longer need to worry about processing or ambient limitations as they create the gesture-enabled applications of the future.”