Prosthetic hand combines user and robotic control
Marie Donlon | September 12, 2019
Scientists at the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland are improving hand prosthetics by combining individual finger control with automation to improve grasping and manipulation.
The team first set out to decode intended finger movements from the muscular activity in the amputated limb, potentially enabling individual finger control of the prosthetic hand. That decoding, coupled with robotic automation, allows the prosthetic hand to grasp an object, hold it and tighten its grip when needed.
"When you hold an object in your hand, and it starts to slip, you only have a couple of milliseconds to react," explained Aude Billard who leads EPFL's Learning Algorithms and Systems Laboratory. "The robotic hand has the ability to react within 400 milliseconds. Equipped with pressure sensors all along the fingers, it can react and stabilize the object before the brain can actually perceive that the object is slipping."
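The article does not publish the controller, but the slip response Billard describes can be sketched as a simple feedback loop over the finger pressure sensors. All names, thresholds and the sensor interface below are illustrative assumptions, not EPFL's actual code:

```python
# Hypothetical sketch of the slip-stabilization loop described above.
# Thresholds and force values are made up for illustration.

def stabilize_grip(pressures, prev_pressures, grip_force,
                   slip_threshold=0.2, force_step=0.5, max_force=10.0):
    """Increase grip force when finger pressure drops sharply (a slip cue).

    pressures / prev_pressures: current and previous readings from the
    pressure sensors along the fingers; grip_force: commanded grip force.
    """
    # A rapid drop in contact pressure on any finger suggests the object
    # is slipping out of the grasp.
    slipping = any(prev - cur > slip_threshold
                   for prev, cur in zip(prev_pressures, pressures))
    if slipping:
        grip_force = min(grip_force + force_step, max_force)
    return grip_force, slipping


# Example: the second finger's pressure drops sharply between samples,
# so the controller raises the commanded grip force.
force, slipped = stabilize_grip([1.0, 0.4, 1.1], [1.0, 0.9, 1.1],
                                grip_force=2.0)
```

Run at the hand's sensor rate, a loop like this reacts within a few control cycles, which is how a robotic controller can beat the several-hundred-millisecond human reaction time the article cites.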
To devise a system for sharing control, an algorithm translates user intention into finger movement on the prosthetic hand. To train the algorithm, the amputee performs an assortment of hand movements and gestures. Sensors placed on the amputee's residual limb detect the muscular activity, and the algorithm learns to map that activity to hand movements. Once the algorithm can predict the intended finger movements, that data can be applied to controlling individual fingers on the prosthetic.
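The training step above amounts to learning a mapping from muscle-activity (EMG) feature vectors to intended finger movements. The researchers' actual decoder is not published here; as a minimal stand-in, this sketch uses nearest-neighbor lookup over recorded gestures, with entirely hypothetical feature values and gesture names:

```python
# Illustrative sketch: decode intended hand movement from EMG features
# by nearest-neighbor matching against recorded training gestures.
# A real decoder would use a learned regression or classification model.
import math


def train(samples):
    """samples: list of (emg_features, gesture) pairs recorded while the
    user performs an assortment of hand movements and gestures."""
    return list(samples)  # the "model" here is just the stored examples


def decode(model, emg):
    """Return the gesture whose training EMG pattern is closest."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    _, gesture = min(model, key=lambda sample: dist(sample[0], emg))
    return gesture


# Hypothetical training data: two gestures with 3-channel EMG features.
model = train([
    ([0.1, 0.9, 0.2], "open_hand"),
    ([0.8, 0.1, 0.7], "index_point"),
])
decode(model, [0.75, 0.2, 0.6])  # -> "index_point"
```

The decoded gesture can then be translated into per-finger position commands for the prosthetic.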
To automate grasping, the scientists enhanced the algorithm so that robotic automation takes over once the user attempts to handle an object. The algorithm instructs the prosthesis to wrap its fingers around the object as soon as the object contacts the pressure sensors on the prosthetic.
The algorithm is still undergoing testing, but the researchers see future applications, particularly in bionic hand prostheses and brain-to-machine interfaces.
The research appears in the journal Nature Machine Intelligence.