Artificial intelligence makes gripping more intuitive


Dec 06, 2023 (Nanowerk News) Artificial hands can be operated via an app or with sensors placed in the muscles of the forearm. New research at the Technical University of Munich (TUM) shows that a better understanding of muscle activity patterns in the forearm supports more intuitive and natural control of artificial limbs. This requires a network of 128 sensors and techniques based on artificial intelligence.

Key Takeaways

  • Advanced artificial hands can now be controlled more intuitively with AI and a network of 128 sensors, making movement feel more natural for users.
  • The ‘synergy principle’ in neuroscience, utilized in these prostheses, allows for synchronized and adaptive finger movements, mimicking natural hand actions.
  • Machine learning algorithms are key in personalizing control adaptability, improving the learning process for individual patients.
  • Research reveals that the majority of users prefer and are more efficient with this intuitive control method, though some excel in less intuitive techniques.
  • Despite advancements, challenges remain in sensor alignment, signal noise filtering, and customizing the learning algorithm for each new user.

  A hand prosthesis grips an orange with the help of muscle signals. (Image: TUM)

    The Research

    Different types of grasps and bionic design: technological developments in recent decades have already led to advanced artificial hands. They can enable amputees who have lost a hand through accident or illness to regain some movement. Some of these modern prostheses allow independent finger movements and wrist rotation. These movements can be selected via a smartphone app or by using muscle signals from the forearm, typically detected by two sensors. For instance, the activation of the wrist flexor muscles can be used to close the fingers around a pen. If the wrist extensor muscles are contracted, the fingers re-open and the hand releases the pen. The same approach also makes it possible to select among different finger movements through the simultaneous activation of both flexor and extensor muscle groups. “These are movements that the patient has to learn during rehabilitation,” says Cristina Piazza, a professor of rehabilitation and assistive robotics at TUM. Now, Prof. Piazza’s research team has shown that artificial intelligence can enable patients to control advanced hand prostheses more intuitively by using the “synergy principle” and with the help of 128 sensors on the forearm.
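
    The conventional two-sensor scheme described above is, at its core, a small state machine: flexor activity closes the hand, extensor activity opens it, and co-contraction of both switches the selected grip. The following minimal sketch illustrates that logic only; it is not the TUM system, and the threshold, grip list and signal names are illustrative assumptions.

```python
# Minimal sketch of a conventional two-sensor myoelectric control scheme.
# Not the TUM implementation; the threshold, grip list and signal names
# are illustrative assumptions.

GRIPS = ["pinch", "power", "lateral"]   # hypothetical selectable grip types
THRESHOLD = 0.2                         # hypothetical activation threshold (normalised EMG, 0..1)

def update(flexor: float, extensor: float, state: dict) -> str:
    """Map two normalised EMG envelopes to a hand command."""
    flex_on = flexor > THRESHOLD
    ext_on = extensor > THRESHOLD

    if flex_on and ext_on:
        # Co-contraction: switch to the next grip type instead of moving the hand.
        state["grip"] = (state["grip"] + 1) % len(GRIPS)
        return f"switch grip -> {GRIPS[state['grip']]}"
    if flex_on:
        return "close fingers"          # e.g. close around a pen
    if ext_on:
        return "open fingers"           # release the pen
    return "hold position"

state = {"grip": 0}
print(update(0.5, 0.05, state))   # -> close fingers
print(update(0.4, 0.45, state))   # -> switch grip -> power
```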

    The synergy principle: the brain activates a pool of muscle cells

    What is the synergy principle? “It is known from neuroscientific studies that repetitive patterns are observed in experimental sessions, both in kinematics and muscle activation,” says Prof. Piazza. These patterns can be interpreted as the way in which the human brain copes with the complexity of the biological system: the brain activates pools of muscle cells together, including in the forearm. The professor adds: “When we use our hands to grasp an object, for example a ball, we move our fingers in a synchronized way and adapt to the shape of the object when contact occurs.” The researchers are now using this principle to design and control artificial hands by creating new learning algorithms. This is necessary for intuitive movement: when controlling an artificial hand to grasp a pen, for example, multiple steps take place. First, the patient orients the artificial hand towards the grasping location, slowly moves the fingers together, and then grabs the pen. The goal is to make these movements increasingly fluid, so that it is hardly noticeable that the overall action is made up of numerous separate movements. “With the help of machine learning, we can understand the variations among subjects and improve the control adaptability over time and the learning process,” concludes Patricia Capsi Morales, the senior scientist in Prof. Piazza’s team.
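
    The synergy principle is often modelled by factorizing the activity of many muscles into a small number of shared activation patterns. The article does not name the exact algorithm used in Prof. Piazza’s lab; the sketch below uses non-negative matrix factorization, a common choice in the muscle-synergy literature, applied to simulated data only.

```python
# Sketch: extracting muscle synergies from multichannel EMG envelopes with
# non-negative matrix factorization (NMF). The exact method used at TUM is
# not stated in the article; NMF is shown here as a common stand-in.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)

n_channels, n_samples, n_synergies = 128, 2000, 4   # assumed dimensions
# Simulated ground truth: each channel is a non-negative mixture of a few synergies.
true_weights = rng.random((n_channels, n_synergies))
true_activations = np.abs(rng.standard_normal((n_synergies, n_samples)))
emg_envelopes = true_weights @ true_activations + 0.01 * rng.random((n_channels, n_samples))

# Factorize: EMG ≈ W (channel-to-synergy weights) x H (synergy activations over time).
model = NMF(n_components=n_synergies, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(emg_envelopes)   # shape (128, 4): how strongly each channel loads on each synergy
H = model.components_                    # shape (4, 2000): when each synergy is active

print("reconstruction error:", model.reconstruction_err_)
```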

    Discovering patterns from 128 signal channels

    Experiments with the new approach already indicate that conventional control methods could soon be enhanced by more advanced strategies. To study what is happening at the level of the central nervous system, the researchers are working with two sensor films: one for the inside and one for the outside of the forearm. Each film contains up to 64 sensors to detect muscle activation. The method also estimates which electrical signals the spinal motor neurons have transmitted. “The more sensors we use, the better we can record information from different muscle groups and find out which muscle activations are responsible for which hand movements,” explains Prof. Piazza. Depending on whether a person intends to make a fist, grip a pen or open a jam jar, “characteristic features of muscle signals” result, according to Dr. Capsi Morales – a prerequisite for intuitive movements.
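
    A generic way to turn such high-density recordings into movement predictions is to compute simple features, for example the root-mean-square amplitude of each channel over a short window, and feed them to a classifier calibrated on the characteristic pattern of each intended movement. The snippet below illustrates that kind of pipeline on synthetic data; it is not the algorithm from the paper, and the window length, feature and classifier choices are assumptions.

```python
# Generic sketch: window-based features from 128 EMG channels feeding a classifier.
# Not the algorithm from the TUM paper; window length, feature (RMS) and classifier
# (LDA) are common defaults used here as assumptions, on synthetic data.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
FS = 2000                    # assumed sampling rate in Hz
WINDOW = int(0.150 * FS)     # 150 ms analysis window

def rms_features(window_data: np.ndarray) -> np.ndarray:
    """Root-mean-square amplitude per channel for one window, shape (channels,)."""
    return np.sqrt(np.mean(window_data ** 2, axis=1))

# Synthetic calibration data: three intended movements, 60 windows each,
# 128 channels x WINDOW samples per window.
movements = ["fist", "pen_grip", "jar_open"]
X, y = [], []
for label, movement in enumerate(movements):
    for _ in range(60):
        emg = rng.standard_normal((128, WINDOW)) * (1.0 + 0.3 * label)  # crude class difference
        X.append(rms_features(emg))
        y.append(label)
X, y = np.array(X), np.array(y)

clf = LinearDiscriminantAnalysis().fit(X, y)
new_window = rng.standard_normal((128, WINDOW)) * 1.6
print("predicted intention:", movements[clf.predict([rms_features(new_window)])[0]])
```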

    Wrist and hand movement: Eight out of ten people prefer the intuitive way

    Current research concentrates on the movement of the wrist and the whole hand. It shows that most people (eight out of ten) prefer the intuitive way of moving the wrist and hand, which is also the more efficient way. But two out of ten learn to handle the less intuitive method and, in the end, become even more precise with it. “Our goal is to investigate the learning effect and find the right solution for each patient,” Dr. Capsi Morales explains. “This is a step in the right direction,” says Prof. Piazza, who emphasizes that each system involves the individual mechanics and properties of the hand, special training with patients, interpretation and analysis, and machine learning.

    Current challenges of advanced control of artificial hands

    There are still some challenges to address: the learning algorithm, which is based on the information from the sensors, has to be retrained every time the sensor film slips or is removed. In addition, the sensors must be prepared with a gel to guarantee the conductivity needed to record the muscle signals precisely. “We use signal processing techniques to filter out the noise and get usable signals,” explains Dr. Capsi Morales. Every time a new patient wears the cuff with the many sensors over their forearm, the algorithm must first identify the activation patterns for each movement sequence in order to later detect the user’s intention and translate it into commands for the artificial hand. The research was presented at the IEEE International Conference on Rehabilitation Robotics 2023 (“Exploring Muscle Synergies for Performance Enhancement and Learning in Myoelectric Control Maps”).
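
    The noise filtering Dr. Capsi Morales mentions corresponds to standard surface-EMG signal conditioning. As an illustration only (the article does not detail the exact pipeline used at TUM), raw signals are typically band-pass filtered and power-line interference is removed with a notch filter:

```python
# Illustrative EMG conditioning: band-pass plus power-line notch filtering.
# The article does not detail the exact pipeline used at TUM; cutoff frequencies,
# sampling rate and filter orders below are typical textbook values, assumed here.
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

FS = 2000  # assumed sampling rate in Hz

def condition_emg(raw: np.ndarray) -> np.ndarray:
    """Band-pass 20-450 Hz and notch out 50 Hz mains interference (zero-phase)."""
    b_bp, a_bp = butter(4, [20, 450], btype="bandpass", fs=FS)
    b_notch, a_notch = iirnotch(w0=50, Q=30, fs=FS)
    bandpassed = filtfilt(b_bp, a_bp, raw, axis=-1)
    return filtfilt(b_notch, a_notch, bandpassed, axis=-1)

# Example: one second of simulated noisy EMG on 128 channels with 50 Hz interference.
rng = np.random.default_rng(2)
raw = rng.standard_normal((128, FS)) + 0.5 * np.sin(2 * np.pi * 50 * np.arange(FS) / FS)
clean = condition_emg(raw)
print(clean.shape)  # (128, 2000)
```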
