Researchers from the Swiss Federal Institute of Technology in Lausanne (EPFL) have developed a prototype system that allows a person without a hand to hold objects securely with a bionic prosthesis and to control each finger individually. To do this, they developed a machine-learning algorithm that helps the user control the prosthesis. Details of the Swiss team's work were published in Nature Machine Intelligence and are briefly described in a press release from EPFL.
Some modern bionic prostheses work by reading myoelectric signals from the skin surface of the stump. The person mentally commands an action, which triggers a muscle response in the stump and produces electrical potentials that are recorded, interpreted, and executed by the bionic prosthesis. The difficulty in controlling such a prosthesis lies in the large amount of "noise": stray electrical potentials recorded on the skin surface of the stump that are unrelated to the muscle activity of the intended movement.
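To illustrate why this noise matters, the sketch below shows a typical surface-EMG preprocessing step: band-pass filtering to suppress low-frequency motion artifacts and high-frequency interference, followed by rectification and smoothing into an amplitude envelope. This is a generic minimal sketch with assumed filter settings, not the EPFL team's actual pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def emg_envelope(raw, fs=2000.0, band=(20.0, 450.0), win_ms=100.0):
    """Band-pass filter raw surface EMG, rectify, and smooth into an
    amplitude envelope. Cutoffs and window are assumed typical values."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, raw)      # suppress drift and out-of-band noise
    rectified = np.abs(filtered)        # full-wave rectification
    win = int(fs * win_ms / 1000)
    kernel = np.ones(win) / win
    return np.convolve(rectified, kernel, mode="same")  # moving-average envelope

# Synthetic example: slow motion artifact plus a brief "muscle burst" at t = 1 s.
np.random.seed(0)
fs = 2000.0
t = np.arange(0, 2.0, 1 / fs)
drift = 0.5 * np.sin(2 * np.pi * 1.0 * t)                # ~1 Hz artifact
burst = (np.abs(t - 1.0) < 0.2) * np.random.randn(t.size) * 0.8
env = emg_envelope(drift + burst, fs=fs)
# After filtering, the envelope peaks where the muscle burst is,
# not where the slow artifact is.
```

The same idea underlies practical myoelectric control: commands are decoded from the filtered envelope rather than from the raw, noisy potentials.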
The algorithm developed by the EPFL researchers is based on a multilayer perceptron, a neural network in which a signal passes through several layers of neurons. The network was trained on ten volunteers: three people with amputated limbs and seven with intact limbs. Training took about 10 minutes per participant.
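A multilayer perceptron for this task can be pictured as a classifier mapping a window of EMG features to a grasp command. The sketch below is a minimal illustration under assumed conditions (electrode count, feature choice, class names, and network size are all assumptions, and the training data here is synthetic), not the authors' architecture.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Hypothetical setup: each sample is a feature vector (e.g. RMS amplitude per
# electrode) from a short EMG window; labels are grasp/finger classes.
rng = np.random.default_rng(0)
n_channels = 8                      # assumed electrode count
classes = ["rest", "index", "middle", "ring", "little",
           "thumb_flex", "fist", "pinch"]

# Synthetic training data: each class gets its own mean activation pattern.
patterns = rng.uniform(0.2, 1.0, size=(len(classes), n_channels))
X = np.vstack([p + 0.05 * rng.standard_normal((40, n_channels))
               for p in patterns])
y = np.repeat(np.arange(len(classes)), 40)

# Two hidden layers of 32 neurons each (sizes assumed for illustration).
clf = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=1000, random_state=0)
clf.fit(X, y)

# A new window of EMG features is mapped to a predicted grasp command.
probe = patterns[classes.index("pinch")] + 0.05 * rng.standard_normal(n_channels)
print(classes[clf.predict(probe.reshape(1, -1))[0]])
```

In a real system the short per-participant training session would collect labeled windows like these while the user attempts each movement.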
The multilayer perceptron learned to recognize attempts to flex each finger individually, to flex the index and middle fingers or the ring and little fingers together, to clench a fist, to perform a pinch grip, and to flex the thumb. The trained algorithm then controlled an Allegro Hand robotic hand, which consists of a thumb and three ordinary fingers, mounted on a Kuka IIWA 7 industrial manipulator. Together, this setup imitated a bionic prosthesis of the right hand.
During the experiments, problems emerged with the third finger of the robotic hand, and it had to be disabled entirely; as a result, the hand could operate only with its thumb and two of the ordinary fingers. Using the robotic arm, volunteers were able to pick up plastic bottles and perform various actions with them, including pouring their contents into a glass. The developers also built a modified object-holding routine into the robot's control algorithm: if a bottle held by the robotic hand began to slip, the hand increased its grip force. The response to slippage took no more than 400 milliseconds.
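The slip reflex can be sketched as a simple fixed-rate control loop: whenever measured slip exceeds a threshold, grip force is stepped up. The sensor and actuator interfaces, thresholds, and force values below are all assumptions for illustration; the article only states that the reaction takes at most 400 milliseconds.

```python
import time

SLIP_THRESHOLD = 0.005   # m/s, assumed
FORCE_STEP = 0.5         # N per correction, assumed
MAX_FORCE = 15.0         # N, assumed safety cap

def grip_controller(read_slip_velocity, set_grip_force, initial_force=2.0,
                    cycle_s=0.01, duration_s=1.0):
    """Run a fixed-rate loop; raise grip force whenever slip is detected.
    At a 10 ms cycle, detection-to-correction latency stays well under
    the 400 ms response reported in the article."""
    force = initial_force
    for _ in range(int(duration_s / cycle_s)):
        if read_slip_velocity() > SLIP_THRESHOLD:
            force = min(force + FORCE_STEP, MAX_FORCE)
            set_grip_force(force)
        time.sleep(cycle_s)
    return force

# Simulated run: the object slips for the first 20 cycles, then holds steady.
slips = iter([0.01] * 20 + [0.0] * 80)
applied = []
final = grip_controller(lambda: next(slips), applied.append)
print(final)  # 2.0 N + 20 corrections of 0.5 N = 12.0 N
```

Here `read_slip_velocity` and `set_grip_force` stand in for whatever tactile sensing and finger-torque interfaces the real hand exposes.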
Earlier, developers from the University of Utah announced the completion of the LUKE arm prosthesis, named after Luke Skywalker, the protagonist of Star Wars. They equipped the device with biofeedback: with the help of electrodes implanted in peripheral nerves and muscles, the wearer can now feel the touch of objects, vibration, and even pain. The improvements also allow the user to control the prosthesis more precisely.