Video: Artificial Vision for Artificial Hands
S. Himmelstein | May 04, 2017
Current prosthetic hands are controlled by myoelectric signals from the muscles in the stump, a process that requires patience, concentration and time to master. Researchers from Newcastle University, UK, have used computer vision and neural-network technology to improve the responsiveness of these prosthetics. A bionic hand with computer vision ‘sees’ objects and reacts automatically, enabling the user to grasp objects 10 times faster than with any other limb on the market.
The bionic hand is fitted with an off-the-shelf camera that takes a picture of the object in front of it, assesses its shape and size, and triggers a series of movements in the hand. The researchers trained the system using neural networks, showing it numerous pictures of various objects from multiple angles and in different light conditions. The deep-learning system learned which grasp pattern to use for different objects according to their shape, but without measuring specific dimensions or explicitly identifying them. The prosthetic is capable of four different grasps: pinch, tripod, palm wrist neutral and palm wrist pronated.
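The pipeline the article describes (camera image in, grasp pattern out) can be sketched in a few lines. This is a toy illustration only, not the Newcastle system: the real device uses a deep convolutional network trained on photographs of real objects, whereas here a simple nearest-centroid classifier over tiny synthetic "images" stands in for it, and all names and data are invented for the example.

```python
import numpy as np

# Illustrative stand-in for the system described in the article: map an
# image of an object to one of the four grasp patterns. Synthetic data
# and a nearest-centroid classifier replace the real camera and deep
# network; everything below is invented for the sketch.

GRASPS = ["pinch", "tripod", "palm wrist neutral", "palm wrist pronated"]
rng = np.random.default_rng(0)

def make_image(grasp_idx, n=8):
    """Synthetic n x n image whose appearance depends on the object class."""
    return rng.normal(loc=grasp_idx, scale=0.5, size=(n, n)).ravel()

# "Training": many views per class, mimicking photos taken from multiple
# angles and in different lighting, summarized as one centroid per grasp.
centroids = np.stack([
    np.mean([make_image(c) for _ in range(50)], axis=0) for c in range(4)
])

def choose_grasp(image):
    """'See' the object and trigger the closest-matching grasp pattern."""
    dists = np.linalg.norm(centroids - image, axis=1)
    return GRASPS[int(np.argmin(dists))]

print(choose_grasp(make_image(0)))  # -> "pinch"
```

The point of the sketch is the shape-based shortcut the article mentions: the classifier never measures the object's dimensions or identifies what it is, it only picks whichever learned grasp prototype the new image most resembles.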
A small number of amputees have already trialled the new technology, and the Newcastle University team is now working with experts at Newcastle upon Tyne Hospitals NHS Foundation Trust to offer the ‘hands with eyes’ to patients at Newcastle’s Freeman Hospital.
The ultimate goal is to develop a bionic hand that can sense pressure and temperature and transmit the information back to the brain.