NSU students are creating a prototype body-signal reading system that would let a prosthetic hand be used as if it were the wearer's own.

Translation. Region: Russian Federation

Source: Novosibirsk State University

An important disclaimer is at the bottom of this article.

A prototype system that collects and combines signals obtained from the human body via electromyography (EMG) and electroencephalography (EEG), and uses these signals to recognize what movement the person wants to perform, is being developed by master's students at the Faculty of Information Technology of Novosibirsk State University (FIT NSU), Alexander Sartakov and Pavel Bortnikov, under the scientific supervision of Ivan Brak, a leading analyst at the Inzhevika Scientific and Technical Design Laboratory and a candidate of biological sciences. Their development is intended to make control of a prosthetic hand more natural, faster, more intuitive, and closer to that of one's own hand than existing high-tech bionic devices allow.

— Most modern commercial prostheses are controlled quite simply: the user tenses one or two forearm muscles, and the prosthesis responds to these actions according to a predetermined pattern. Essentially, this is not a full-fledged movement like that of a natural arm, but rather a switching of modes through muscle contraction and relaxation. However, the human body generates far more information about movement intention than such systems use. Current real-world devices utilize only a small portion of the rich signal generated by muscles, neural activity, and limb movement. They operate by reading the electrical impulses (EMG signals) produced when the antagonist muscles of the remaining arm are tensed. This is typically done over two channels: one sensor reads limb flexion, the other extension. We want to increase the number of channels and explore options for capturing other data from the human body using EEG. We believe this is important, since the very idea of performing a particular movement originates in the brain. In any case, the more channels involved in signal transmission, the more data will be received and the more opportunities for interpreting it will appear, said Alexander Sartakov.
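The two-channel scheme described above can be sketched in a few lines. This is an illustrative example only, not the team's code; the function name, millivolt units, and threshold value are hypothetical.

```python
# Illustrative sketch of classic two-channel EMG control (not the team's code):
# one channel reads the flexor muscle, the other the extensor, and whichever
# rectified amplitude crosses a threshold triggers the matching mode.

def classify_two_channel(flexor_mv, extensor_mv, threshold_mv=50.0):
    """Map two rectified EMG amplitudes (hypothetical millivolt units)
    to a discrete prosthesis command."""
    flex_active = flexor_mv > threshold_mv
    ext_active = extensor_mv > threshold_mv
    if flex_active and not ext_active:
        return "close_hand"
    if ext_active and not flex_active:
        return "open_hand"
    # Both channels active (co-contraction) or neither: hold position.
    return "hold"

print(classify_two_channel(80.0, 10.0))  # close_hand
```

Note how coarse this is: each reading collapses to one of three modes, which is exactly the limitation the students aim to overcome by adding channels.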

The young scientists intend to use 6 to 18 channels transmitting signals from the prosthesis user's arm. This will capture not only which antagonist muscles are tensed but also the strength (intensity) of that tension, which will affect the device's actions: for example, partial flexion or extension of the fingers will be possible, as the user desires. Currently, because of the limited range of data received from a small number of sensors, most devices allow only complete actions (full flexion or extension), but with more data available, the device will be able to interpret it more broadly, and prostheses will gain new capabilities: as "artificial limbs," they will be able to assume intermediate states.

The use of sensors that read brain signals using EEG will also contribute to the improvement of bionic prostheses.

"We reviewed existing studies on the correlation between signals generated in the brain and limb movement, since the initial intention for movement originates in the central nervous system. Scientists have identified a direct correlation between the impulse and the movement of a specific body part. A complete picture of movement intention was obtained. Knowing this, we can calculate the impulse in the user's brain directed, for example, to move one finger of a prosthetic hand. We based our further research on this," explained Pavel Bortnikov.

Capturing signals using both methods and converting them into movement will expand the capabilities of the system being developed by young scientists. Using EMG, the signal from the arm muscles will be read after it has passed through the body from the brain, while EEG will allow the signal to be "read" directly from the brain instantly—even before it reaches the muscle in the limb. Commercial bionic hand prostheses rarely use EEG. Over the past ten years, few studies have been conducted to interpret the data obtained using this method, but in those cases, signals from the brain were captured using chips implanted in the user's head. The user could control the movement of a three-axis bionic prosthesis placed on a tabletop using thought alone.
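One common way to combine two signal sources like this is late fusion: each modality's decoder produces per-movement scores, and the scores are blended. The sketch below is a hypothetical illustration of that general idea, not the students' system; the weights and movement labels are assumptions.

```python
# Illustrative late-fusion sketch (assumed weights and labels): combine
# per-movement scores from an EEG decoder (early intention, read before
# the muscle acts) and an EMG decoder (muscle execution), then pick the
# highest-scoring movement.

def fuse_scores(eeg_scores, emg_scores, w_eeg=0.4, w_emg=0.6):
    """Weighted combination of two score dictionaries keyed by movement."""
    fused = {}
    for move in eeg_scores:
        fused[move] = w_eeg * eeg_scores[move] + w_emg * emg_scores[move]
    return max(fused, key=fused.get)

eeg = {"open_hand": 0.7, "close_hand": 0.3}
emg = {"open_hand": 0.2, "close_hand": 0.8}
print(fuse_scores(eeg, emg))  # close_hand
```

A practical system would also have to align the two streams in time, since the EEG intention signal arrives earlier than the EMG execution signal.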

"By supplementing the system with surface sensors installed on the same limb, the accuracy of signal interpretation will dramatically improve. Our goal was to create prosthetics that not only functioned like a real hand but were also comfortable. It was important that the entire sensor system be lightweight and user-friendly, with no wires entangling the user's body, as movement signals would be read from the user's head. This was a challenging task, as the sensor array would be bulky. We needed to make it lightweight and easy to use. Therefore, from an engineering perspective, we explored the possibility of wirelessly transmitting signals from the sensors to the prosthesis with minimal latency. Ultimately, we decided to create individual modules that would wirelessly transmit data to a computing module," said Alexander Sartakov.

The developers envision a device for recording EEG readings, consisting of a cap containing dry electrodes. Another set of sensors, for recording EMG signals directly from the limb muscles, will be attached to a tightly fitting elastic fabric that fits around the arm like a wide bracelet. These two components of the signal-reading system will collect information and transmit it to a computing module. It is assumed that the computing module in the current concept could be a device capable of processing neural networks at a sufficiently high speed. The creators of the device are considering the possibility of integrating this computing unit into a smartphone.

From there, the final control signal will be sent to the prosthesis itself. As with existing systems, the new development will operate using pre-defined algorithms, but it should now be more precise and functional due to the greater volume of information collected by the sensors.

The project is currently in the feasibility-study phase. The young researchers have interpreted and analyzed openly available data from the internet and now plan to collect as many signals as possible from a real person and then combine them in a specific way. Signal-processing and noise-removal sequences have been selected, and a basic reinforcement-learning (RL) neural network has been written for this purpose.

The creators of the new signal reading system face a major challenge: adapting it to urban environments. Laboratory conditions are ideal for signal reading because they avoid noise and interference. In open spaces, extraneous sounds and signals, including those from nearby areas, are added. Even noise from clothing can affect signal interpretation.

"It's impossible to statically separate a signal from extraneous noise and interference in order to interpret it. Therefore, to eliminate unwanted elements, we plan to use mathematical extraction. Rather than taking a pure signal, we transform the "contaminated" signal into a numerical array using specific methods and then feed it into a neural network. RL is a reinforcement learning model that can adjust weights during operation to better interpret the data in a specific environment. We've done this for both sensors reading signals from the upper limb muscles and for reading signals coming from the brain," explained Alexander Sartakov.

It will take two to three years to create a prototype system, but first the young researchers need to determine whether it is suitable for use in open environments rather than only under laboratory conditions. If the results are positive, collaboration is planned with the Russian cybermedical company Motorika, which specializes in the creation of general-purpose prosthetics. It was this company that initiated the project.

Material prepared by: Elena Panfilo, NSU press service

Please note: This information is raw content obtained directly from the source. It represents an accurate account of the source's assertions and does not necessarily reflect the position of MIL-OSI or its clients.