Researchers at Cornell University have developed two technologies: one that tracks a person’s gaze and another that captures facial expressions using sonar-like acoustic sensing.

These technologies are reportedly small enough to fit on commercial smart glasses as well as on virtual reality (VR) or augmented reality (AR) headsets, while consuming far less power than existing camera-based tools.

Source: Cornell University

The researchers added that both technologies use speakers and microphones mounted on an eyeglass frame to bounce inaudible soundwaves off the face and then capture the reflected signals produced by face and eye movements.

One device, dubbed GazeTrak, is considered by the researchers to be the first eye-tracking system that relies on acoustic signals. The second device, dubbed EyeEcho, is reportedly the first eyeglass-based device to continuously and accurately detect facial expressions and recreate them in real time through an avatar.

The developers of both GazeTrak and EyeEcho suggest that the devices can last for several hours on a smart-glasses battery and more than one day on a VR headset.

To create the GazeTrak technology, the researchers outfitted a pair of glasses with one speaker and four microphones around the inside of each eye frame, bouncing soundwaves off the eyeball and the surrounding area and capturing the reflections. Those sound signals were then fed into a customized deep-learning pipeline that uses artificial intelligence (AI) to continuously infer the direction of the person's gaze.
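The article does not detail GazeTrak's signal processing, but the general shape of such an acoustic-sensing pipeline can be sketched as follows. This is a minimal illustration, not the authors' method: the probe frequencies, cross-correlation features, and the linear stand-in for their customized deep-learning model are all assumptions.

```python
import numpy as np

FS = 50_000      # sample rate in Hz (assumed)
CHIRP_LEN = 600  # samples per inaudible probe chirp (assumed)
N_MICS = 4       # four microphones per eye frame, as described

def make_chirp(f0=18_000.0, f1=21_000.0, n=CHIRP_LEN, fs=FS):
    """Inaudible frequency sweep used as the probe signal
    (the 18-21 kHz band is an assumption, not from the paper)."""
    t = np.arange(n) / fs
    phase = 2 * np.pi * (f0 * t + (f1 - f0) * t**2 / (2 * t[-1]))
    return np.sin(phase)

def echo_profile(recording, chirp):
    """Cross-correlate each mic channel with the emitted chirp,
    yielding an 'echo profile' of how reflections arrive over time
    -- a common acoustic-sensing feature, possibly unlike the paper's."""
    return np.stack([np.correlate(ch, chirp, mode="valid")
                     for ch in recording])

def predict_gaze(profile, weights, bias):
    """Stand-in for the customized deep-learning model: a linear map
    from flattened echo features to (yaw, pitch) gaze angles."""
    return profile.ravel() @ weights + bias
```

In practice the final step would be a trained neural network rather than a fixed linear map; the sketch only shows where the learned model sits in the chain from emitted sound to gaze estimate.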

To develop EyeEcho, the research team mounted one speaker and one microphone next to the glasses' hinges, pointing downward so they can capture skin movement as facial expressions change. Those reflected signals are also interpreted using AI, the team added.
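For the expression side, a hedged sketch of how reflected signals might be turned into avatar parameters is shown below. The spectral features, the sigmoid-squashed linear model standing in for the AI interpreter, and the ARKit-style blendshape output are all illustrative assumptions, not details from the article.

```python
import numpy as np

N_BLENDSHAPES = 52  # ARKit-style avatar blendshape count (an assumption)

def features_from_echo(channel, frame=256):
    """Split one mic's reflected signal into frames and take magnitude
    spectra -- a generic acoustic feature, not necessarily the paper's."""
    n = len(channel) // frame
    frames = channel[: n * frame].reshape(n, frame)
    return np.abs(np.fft.rfft(frames, axis=1))

def predict_blendshapes(feats, weights, bias):
    """Stand-in for the AI model: map time-averaged echo features to
    blendshape coefficients in [0, 1] that drive a real-time avatar."""
    pooled = feats.mean(axis=0)
    raw = pooled @ weights + bias
    return 1.0 / (1.0 + np.exp(-raw))  # squash into valid [0, 1] range
```

A real system would run this per frame to animate the avatar continuously; the sketch only shows the mapping from one window of reflected sound to one set of expression coefficients.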

The developers suggest that the technologies can enable users to have hands-free video calls and enhance a user’s VR experience, while GazeTrak could be paired with screen readers to read out portions of text for people with low vision.

Further, both technologies could also potentially be used to help diagnose or monitor neurodegenerative diseases, like Alzheimer's and Parkinson's, which often present with abnormal eye movements and less expressive faces.

The findings are detailed in the article, “GazeTrak: Exploring Acoustic-based Eye Tracking on a Glass Frame,” which appears on the preprint server arXiv.
