Test person with EarFS prototype controlling the mobile phone using facial gestures. Image credit: Fraunhofer IGD

As anyone with a mobile phone will tell you, it is sometimes impossible to reach for your phone when your hands are occupied carrying items or doing household chores. Researchers at the Fraunhofer Institute for Computer Graphics Research IGD in Rostock, Germany, have recently developed a potential solution to the problem.

Current mobile devices are typically operated using a touch screen. However, a touch screen is nearly impossible to operate when you are, for example, wearing gloves or washing dishes.

In response, the researchers developed EarFieldSensing (EarFS), a device that reads facial gestures by means of an ear plug. The ear plug sensor detects even the slightest gestures via the changes in the shape of the ear canal that facial movements bring about.

"The challenge was that these currents and movements are sometimes very small and need to be intensified," explains Denys Matthies, scientist at the Fraunhofer IGD. "In addition, the sensors cannot be interfered with by other movements of the body, such as vibrations during walking or external interferences. To solve this problem, an additional reference electrode was applied to the earlobe which records the signals coming from outside."

In addition to the gestures that signal whether to accept or reject a call, EarFS can also evaluate facial movements to determine the emotional state of a user. Consequently, the device could have applications beyond handling phone calls; for example, it could be used to alert fatigued drivers.