Emotional analysis via temporal-frequency domain disruptions

Marie Donlon | September 19, 2019
Researchers from China's Hefei University and Japanese universities developed an electronic system for recognizing a person's emotions based on their gestures and movements.
The system, dubbed EmoSense, reads a person's expressions and gestures to infer their mood, according to the team.
In developing EmoSense, the researchers determined that, when antennas are used to sense behavior, gestures and movements disturb wireless signals through shadowing and multipath effects. These disturbances, according to the researchers, usually create distinctive patterns in the temporal-frequency domain for different gestures.
This inspired the team to create a system that identifies such patterns, and the emotions behind them, from a person's physical movements and gestures. The system mines wireless channel responses for these patterns and uses the gestures they correspond to as the basis for detecting emotion.
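The article does not describe EmoSense's actual pipeline, so the following is only a toy sketch of the general idea: turn a stream of wireless channel amplitudes into a temporal-frequency fingerprint (a short-time FFT), then match it against labeled gesture templates with a nearest-neighbor comparison. All function names, parameters, and the nearest-neighbor matcher are assumptions for illustration, not the team's method.

```python
import numpy as np

def spectrogram_features(signal, win=64, hop=32):
    """Short-time FFT magnitudes: a simple temporal-frequency fingerprint.

    Slides a Hann-windowed frame along the signal and keeps the
    magnitude spectrum of each frame (positive frequencies only).
    """
    frames = [signal[i:i + win] for i in range(0, len(signal) - win + 1, hop)]
    window = np.hanning(win)
    return np.array([np.abs(np.fft.rfft(f * window)) for f in frames])

def classify(sample, templates):
    """Hypothetical matcher: pick the labeled template whose
    temporal-frequency fingerprint is closest to the sample's."""
    feats = spectrogram_features(sample)
    best_label, best_dist = None, np.inf
    for label, template in templates.items():
        dist = np.linalg.norm(feats - spectrogram_features(template))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label
```

As a usage illustration, two synthetic "channel amplitude" streams with different dominant frequencies can stand in for two gestures; a noisy copy of one stream is then matched back to its template. A real system would of course work on measured channel data and a trained classifier rather than raw templates.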
Once the system was developed, the team measured EmoSense's performance on roughly 3,360 test cases, comparing it with sensor- and vision-based emotion recognition tools already in use. EmoSense performed on par with those methods without requiring costly dedicated hardware.
The team envisions a variety of eventual applications for the system, such as gauging an audience's emotional response during the rehearsal of a play or comedy show.
However, Yantong Wang, one of the researchers behind the study, notes that EmoSense has limitations. Most notable among them is that the system is data-driven: it relies primarily on quantitative observations and does not account for potentially complex psychological factors.
"As psychology knowledge is also very important for understanding human emotion, it might be more reasonable to couple both data and psychology knowledge in order to attain more reliable and accurate emotion recognition," said Wang. "In addition, the physical expression of emotion is affected by many congenital and acquired factors, some of which are totally out of control. Therefore, it is important to clarify the potential scenarios before we actually deploy the system."
The system is detailed in a paper posted to the arXiv preprint repository.