Prosthetic hand combines user and robotic control
Marie Donlon | September 12, 2019

Scientists from the Ecole Polytechnique Federale de Lausanne (EPFL) in Switzerland are improving hand prosthetics by combining individual finger control with automation to improve manipulation and grasping.
The team first set out to decipher intended finger movements from the muscular activity in the amputated limb, potentially enabling individual finger control of the prosthetic hand. Coupled with robotic automation, this lets the hand grasp an object, hold it and strengthen its grip.
"When you hold an object in your hand, and it starts to slip, you only have a couple of milliseconds to react," explained Aude Billard who leads EPFL's Learning Algorithms and Systems Laboratory. "The robotic hand has the ability to react within 400 milliseconds. Equipped with pressure sensors all along the fingers, it can react and stabilize the object before the brain can actually perceive that the object is slipping."
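The slip-and-stabilize behavior described above can be sketched as a simple control loop. This is an illustrative sketch only: the sensor values, thresholds and grip increments below are assumptions, not the published EPFL controller.

```python
# Hypothetical sketch: detect a slip from a sudden drop in fingertip
# pressure and tighten the grip. All constants are illustrative.

SLIP_DROP_THRESHOLD = 0.15   # fractional pressure drop that signals a slip
GRIP_INCREMENT = 0.05        # how much to tighten the grip per reaction

def detect_slip(prev_pressure, curr_pressure, threshold=SLIP_DROP_THRESHOLD):
    """Return True if pressure fell fast enough to indicate the object slipping."""
    if prev_pressure <= 0:
        return False
    drop = (prev_pressure - curr_pressure) / prev_pressure
    return drop > threshold

def stabilize(grip_force, slipping, increment=GRIP_INCREMENT):
    """Tighten the grip when a slip is detected; otherwise hold force steady."""
    return grip_force + increment if slipping else grip_force

# Simulated control ticks; each tick is far shorter than the 400 ms window,
# so the loop can react before the user perceives the slip.
grip = 0.4
pressures = [1.0, 0.98, 0.75, 0.9]   # sudden drop at the third reading = slip
for prev, curr in zip(pressures, pressures[1:]):
    grip = stabilize(grip, detect_slip(prev, curr))
```

The point of the sketch is the timing argument: because the reaction is a cheap local comparison on pressure readings, it can run well inside the 400 ms window quoted above.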
To devise a system for sharing control, an algorithm translates user intention into finger movement on the prosthetic hand. To train the algorithm, the amputee performs an assortment of hand movements and gestures. Sensors located on the amputee's stump record the muscular activity, and the algorithm learns to map that activity to the intended hand movements. Once the algorithm understands the intended finger movements, that data can be applied to controlling individual fingers on the prosthetic.
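The train-then-decode idea can be illustrated with a minimal nearest-centroid classifier over synthetic "EMG" feature vectors. The feature layout, gesture names and classifier choice are all assumptions for illustration; the actual EPFL decoder is not described in the article.

```python
# Toy gesture decoder: average the feature vectors recorded for each
# gesture during calibration, then decode new activity by picking the
# nearest centroid. Purely illustrative, stdlib only.
import math

def centroid(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(samples):
    """samples: {gesture_name: [feature_vector, ...]} -> gesture centroids."""
    return {g: centroid(vs) for g, vs in samples.items()}

def decode(model, features):
    """Pick the gesture whose centroid is closest to the new EMG features."""
    return min(model, key=lambda g: math.dist(model[g], features))

# Toy calibration session: two gestures, two channels of muscle activity
samples = {
    "index_flex": [[0.9, 0.1], [0.8, 0.2]],
    "pinky_flex": [[0.1, 0.9], [0.2, 0.8]],
}
model = train(samples)
```

A decoded gesture label would then be mapped to per-finger commands on the prosthetic; that mapping is the "data applied to controlling individual fingers" in the paragraph above.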
To automate grasping, the scientists enhanced the algorithm so that robotic automation takes over once the user attempts to handle an object. The algorithm instructs the prosthesis to close its fingers around the object once the object contacts the sensors on the prosthetic.
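The handover rule is simple to state in code: the user's decoded finger command drives the hand until contact is sensed, then the automation closes the fingers. This is an assumed, simplified version of the shared-control logic, with illustrative posture values.

```python
# Illustrative shared-control rule, not the published controller:
# user intent drives the fingers until contact, then auto-enclose.

def shared_control(user_command, contact_sensed, closed_posture):
    """Return the finger command: user intent, or auto-enclose on contact."""
    return closed_posture if contact_sensed else user_command

user_cmd = [0.3, 0.3, 0.3, 0.3, 0.3]   # partially flexed, one value per finger
enclose  = [0.9, 0.9, 0.9, 0.9, 0.9]   # grasp posture (assumed values)
```

In a real controller the transition would be ramped rather than switched, but the division of labor is the same: intent decoding before contact, automation after.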
The algorithm is undergoing additional testing. However, future potential applications for the algorithm exist, particularly in bionic hand prostheses and brain-to-machine interfaces.
The research appears in the journal Nature Machine Intelligence.
Around 25 years ago I designed and prototyped a flexible linear encoder intended for use in a talking glove being designed by Dr. James Kramer for the hearing/speech impaired. At the time he was using strain gages with resolution to 16 places and still could not determine all of the letters in the finger-spelling positions.
My encoder could be made with very fine digital resolution. I felt that a count of 0 to 3 for each joint would work. It took 21 encoders to read all positions of a completely dexterous hand.
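The counting scheme described above works out to a compact hand pose: 21 joints at 2 bits each is a 42-bit state. A small sketch of that packing, with the bit ordering as an assumption since the original hardware is not specified:

```python
# Pack 21 per-joint counts (each 0-3, i.e. 2 bits) into one 42-bit integer,
# and unpack them again. Bit ordering is an illustrative assumption.

NUM_JOINTS = 21
BITS_PER_JOINT = 2

def pack_pose(counts):
    """Pack the 21 joint counts into a single integer pose."""
    assert len(counts) == NUM_JOINTS and all(0 <= c <= 3 for c in counts)
    pose = 0
    for i, c in enumerate(counts):
        pose |= c << (i * BITS_PER_JOINT)
    return pose

def unpack_pose(pose):
    """Recover the per-joint counts from a packed pose."""
    return [(pose >> (i * BITS_PER_JOINT)) & 0b11 for i in range(NUM_JOINTS)]
```

Even at this coarse 0-to-3 resolution the scheme distinguishes 4^21 (about 4.4 trillion) hand poses, far more than the finger-spelling alphabet requires.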
The biggest problem was getting the lines to stay on the transparent material. I tried making a drawing and photographing it. I sent the rolls of black and white film off to be developed with instructions not to print. I still envision a third-shift worker putting the roll on a larger roll to auto-develop and print everyone's pictures for that day. It must have sounded like a machine gun going off when their automatic developer tried to print those black and white lines, which it was made to see as separation lines between the pictures. I received the unusable negatives and some scraps of the printing.
I did come up with an inexpensive and simple method to fix the lines.
I continued and designed a mechanical hand that used the encoders, to be driven by a machine: a person would type on a normal keyboard and the hand would finger-spell the letters.
I found a college professor trying to do the same thing. He did not want it to work because he would lose his grant.
I did try a business developer to get started. He had me go with a group of other people to a first-ever "Shark Tank" to present ideas to investors. They preferred the hubcap for airplanes that makes the wheels start spinning before landing to reduce wear. Next he took me to a prototyping shop that did everything except use a jeweler's loupe to find the part numbers of the photocells and LEDs. The numbers were not printed on the parts. They were free samples of new products and very nice to use.
I talked to many people on the internet at the time. A robotics scientist in one of the northern countries was interested and asked for some pictures. I sent him some, and two weeks later everyone in the northern hemisphere with the then-new 3D printers was printing them out.
I did attempt to patent the encoders. It was about the time of the mail strike. The patent office wanted more information sent to another address. This went on several times until they had everything except the part numbers of the LEDs and photocells. They had complete drawings in primary colors. This was a time before color scanners. They said they could not patent the encoder and said they had returned all my material. The drawings came back as big black Xerox blobs. I had to go through my congressman's office to get my originals returned.
The big company that made the LEDs and photocells sold their designs to a smaller company that decided to produce only the LEDs in that packaging. So now, if I were going to make them, I would need to make them smaller.
I still consider the encoder very useful in making a robotic/prosthetic manipulator function as a human hand. Changing operators would be as simple as grasping a known-sized ball and laying the hand flat on a table to calibrate.
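The two-pose calibration idea above can be sketched directly: the flat-on-the-table pose and the known-ball grasp give two reference readings per joint, from which a linear count-to-angle map can be fit. The angle values are illustrative assumptions.

```python
# Hedged sketch of two-pose calibration: fit angle = a * count + b per
# joint from the flat-hand and known-ball reference readings.
# Reference angles are illustrative, not measured values.

def calibrate_joint(flat_count, ball_count, flat_angle=0.0, ball_angle=60.0):
    """Return a function mapping raw encoder counts to joint angle (degrees)."""
    a = (ball_angle - flat_angle) / (ball_count - flat_count)
    b = flat_angle - a * flat_count
    return lambda count: a * count + b

# Example: one joint reads 0 when flat and 3 on the calibration ball
to_angle = calibrate_joint(flat_count=0, ball_count=3)
```

Repeating this per joint would let a new operator calibrate the whole hand with just the two reference poses, as described above.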