Google’s Project Gameface Brings Facial Gesture Controls to Android

Project Gameface, Google's hands-free interface that offers accessibility through facial gestures, is now available on Android.

What started as a novel concept continues to evolve, as Google takes Project Gameface's experimental accessibility platform beyond computers to mobile. After debuting last year as a way to navigate PCs through head movements and facial gestures detected by a webcam, Project Gameface is now making the jump to Android.

On mobile, Project Gameface uses the front-facing camera to recognize 52 distinct facial expressions, such as opening the mouth, raising an eyebrow, or glancing in a particular direction. Each gesture can be assigned to a common system action: going home, opening recent apps, pulling down notifications, and more. A "drag" control lets users move the pointer simply by looking and gesturing.
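The gesture-to-action binding described above can be sketched in a few lines. This is an illustrative sketch, not Google's actual implementation: the gesture names, action names, and threshold value are assumptions, and in the real project the per-expression confidence scores come from the camera feed rather than plain numbers.

```python
# Illustrative sketch of a gesture-to-action mapper (not Google's code).
# In Project Gameface, per-expression confidence scores are produced from
# the front camera; here they are supplied as plain floats for clarity.

# User-configurable bindings: gesture name -> system action (names assumed).
BINDINGS = {
    "mouth_open": "GO_HOME",
    "eyebrow_raise": "OPEN_RECENTS",
    "look_left": "OPEN_NOTIFICATIONS",
}

# Confidence a gesture must reach before it fires, to avoid accidental input.
THRESHOLD = 0.7

def actions_for_frame(scores: dict) -> list:
    """Return the system actions triggered by one frame of gesture scores."""
    return [
        action
        for gesture, action in BINDINGS.items()
        if scores.get(gesture, 0.0) >= THRESHOLD
    ]
```

For example, a frame where the mouth-open score is 0.9 but the look-left score is only 0.3 would trigger just the "go home" action. Keeping the threshold per-frame and user-tunable is what lets everyday expressions coexist with deliberate commands.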

Open Source Potential


In a demonstration of its collaborative spirit, Google is making the underlying code for Project Gameface fully open source. This allows developers to freely build upon the work and integrate gesture controls into their own applications, unlocking whole new interfaces that don’t require physically touching the device. All that’s needed is a front camera and the user’s expressive face.

By expanding Project Gameface to Android, Google extends the benefits of its accessibility work to a much broader range of devices. With no extra hardware required, a software update alone brings intuitive new controls to mobile. It marks an encouraging step for inclusive design, empowering more users through the gestures of everyday expression.
