Most people probably underestimate how much our sense of touch helps us navigate the world around us. New research has made its importance crystal clear: a robotic arm with the ability to feel halved the time its user needed to complete tasks.
In recent years, rapid advances in both robotics and neural interfaces have brought the dream of bionic limbs (like the one sported by Luke Skywalker in the Star Wars movies) within touching distance. In 2019, researchers even unveiled a robotic prosthetic arm with a sense of touch that the user could control with their thoughts alone.
But so far, these devices have typically relied on connecting to nerves and muscles in the patient’s residual upper arm. That has meant the devices don’t work for those who have been paralyzed or whose injuries have caused too much damage to those tissues.
That may be about to change, though. For the first time, researchers have allowed a patient to control a robotic arm using a direct connection to their brain while simultaneously receiving sensory information from the device. And by closing the loop, the patient was able to complete tasks in half the time compared to controlling the arm without any feedback.
“The control is so intuitive that I’m basically just thinking about things as if I were moving my own arm,” patient Nathan Copeland, who has been working with researchers at the University of Pittsburgh for six years, told NPR.
The results, reported in Science, build on previous work from the same team that showed they could use implants in Copeland’s somatosensory cortex to trigger sensations localized to regions of his hand, despite him having lost feeling and control thanks to a spinal cord injury.
The 28-year-old had also previously controlled an external robotic arm using a neural interface wired up to his motor cortex, but in the latest experiment the researchers combined the two strands of research, with impressive results.
In a series of tasks designed to test dexterity, including moving objects of different shapes and sizes and pouring from one cup to another, Copeland was able to reduce the time he took to complete these tasks from a median of 20 seconds to just 10, and his performance was often equivalent to that of an able-bodied person.
The sensory information that Copeland receives from the arm is still fairly rudimentary. Sensors measure torque in the joints at the base of the robotic fingers, which is then translated into electrical signals and transmitted to the brain. He reported that the feedback didn’t feel natural, but more like pressure or a gentle tingling.
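As a rough illustration of that translation step, here is a minimal sketch of mapping a torque reading to a stimulation amplitude. The thresholds, ranges, and linear mapping are invented for illustration; they are not the Pittsburgh team's actual algorithm or parameters.

```python
def torque_to_stimulation(torque_nm, threshold=0.05, max_torque=1.0,
                          min_amp_ua=10.0, max_amp_ua=80.0):
    """Map a joint-torque reading (N·m) to a stimulation amplitude (µA).

    Hypothetical values: readings below the threshold produce no
    stimulation; above it, the amplitude scales linearly up to a
    safety-capped maximum.
    """
    if torque_nm < threshold:
        return 0.0
    # Normalize torque into [0, 1] over the usable range, capped at 1.
    frac = min((torque_nm - threshold) / (max_torque - threshold), 1.0)
    return min_amp_ua + frac * (max_amp_ua - min_amp_ua)
```

In a real system, a mapping like this would run continuously for each instrumented finger joint, with the resulting amplitudes driving the electrodes implanted in the somatosensory cortex.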
But that’s still a lot more information than can be gleaned from simply watching the hand’s motions, which is all he had to go on before. And the approach required almost no training, unlike other popular approaches based on sensory substitution that stimulate a patch of skin or provide visual or audio cues that the patient has to learn to associate with tactile sensations.
“We still have a long way to go in terms of making the sensations more realistic and bringing this technology to people’s homes, but the closer we can get to recreating the normal inputs to the brain, the better off we will be,” Robert Gaunt, a co-author of the paper, said in a press release.
“When even limited and imperfect sensation is restored, the person’s performance improved in a pretty significant way.”
An external robotic arm is still a long way from a properly integrated prosthetic, and it will likely require significant work to squeeze all the required technology into a more portable package. But Bolu Ajiboye, a neural engineer from Case Western Reserve University, told Wired that providing realistic sensory signals directly to the brain, and in particular ones that are relayed in real time, is a significant advance.
In a related perspective in Science, Aldo Faisal of Imperial College London said that the integration of a sense of touch may not only boost the performance of prosthetics, but also give patients a greater sense of ownership over their replacement limbs.
The breakthrough, he added, also opens up a host of interesting lines of scientific inquiry, including whether similar approaches could help advance robotics or be used to augment human perception with non-biological sensors.
Image Credit: RAEng_Publications from Pixabay
* This article was originally published at Singularity Hub