A team of researchers from Cornell University has created a robotic feeding system for safely feeding people with substantial mobility limitations, including those with spinal cord injuries, cerebral palsy and multiple sclerosis.

According to the team, the robotic feeding system relies on a combination of computer vision, machine learning and multimodal sensing to safely feed people who cannot lean forward and thus require food to be placed directly in their mouths.

Source: Cornell University

Some care recipients have limited mouth openings, in some cases measuring less than 2 cm, while others can experience involuntary muscle spasms even when the utensil is inside their mouth. To accommodate this, the team developed a robot that features both real-time mouth tracking, which adjusts to users' movements, and a response mechanism that lets the robot identify the nature of physical interactions as they occur and react appropriately.

The team suggested that this allows the system to differentiate among sudden spasms, intentional bites and user efforts to manipulate the utensil within their mouths.
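The published article does not spell out how this differentiation is implemented, but the idea of labeling a force event on the utensil can be illustrated with a minimal sketch. The function name, thresholds and categories below are illustrative assumptions, not the authors' actual model:

```python
# Hypothetical sketch: classifying in-mouth physical interactions
# from utensil force readings. All thresholds are made up for
# illustration and are not from the Cornell system.

def classify_interaction(force_newtons: float, duration_s: float) -> str:
    """Label a force event sensed at the utensil tip.

    A brief, high-magnitude event is treated as an involuntary
    spasm; a sustained moderate force as an intentional bite;
    anything gentler as the user repositioning the utensil.
    """
    if force_newtons > 8.0 and duration_s < 0.2:
        return "spasm"
    if 2.0 <= force_newtons <= 8.0 and duration_s >= 0.3:
        return "bite"
    return "manipulation"

print(classify_interaction(10.0, 0.1))  # brief, strong -> spasm
print(classify_interaction(4.0, 0.5))   # sustained, moderate -> bite
```

The point of the sketch is the decision structure: the robot's response (retreat, hold steady or yield) can then be chosen per label.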

During trials, the robotic system, which features a multi-jointed arm holding a custom utensil that can sense the forces applied to it, successfully fed 13 individuals with a range of medical conditions.

To enable the robotic system to precisely locate the mouth of the person being fed, even when the utensil partially blocks the cameras' view, the team's mouth tracking approach, trained on thousands of images featuring different participants' head poses and facial expressions, combines data captured from two cameras located above and below the utensil.
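One reason to combine views from above and below the utensil is that when the utensil occludes one camera, the other can still see the mouth. The weighted-average fusion below is a minimal sketch of that idea under assumed per-camera confidence scores; it is not the paper's actual perception pipeline:

```python
# Hypothetical sketch: fusing mouth-position estimates from two
# cameras (above and below the utensil). Confidence weighting is
# an illustrative assumption.

def fuse_mouth_estimates(top, bottom):
    """Each estimate is (x, y, confidence); return a weighted (x, y).

    A camera whose view is blocked by the utensil reports low
    confidence, so the fused estimate leans on the clearer view.
    Returns None if neither camera sees the mouth.
    """
    (x1, y1, c1), (x2, y2, c2) = top, bottom
    total = c1 + c2
    if total == 0:
        return None
    return ((x1 * c1 + x2 * c2) / total, (y1 * c1 + y2 * c2) / total)

# Utensil blocks the lower camera: the fused point stays close to
# the top camera's estimate.
print(fuse_mouth_estimates((100.0, 50.0, 0.9), (120.0, 60.0, 0.1)))
```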

Further, the physical interaction-aware response mechanism used both visual and force sensing to perceive how users interacted with the robot, the researchers added.

"We're empowering individuals to control a 20-pound robot with just their tongue," the team suggested.

An article detailing the findings, “Feel the Bite: Robot-Assisted Inside-Mouth Bite Transfer using Robust Mouth Perception and Physical Interaction-Aware Control,” was presented at the 2024 ACM/IEEE International Conference on Human-Robot Interaction and appears in its proceedings.

To contact the author of this article, email mdonlon@globalspec.com