When patient T5 suffered a spinal cord injury that left him paralyzed, his dream of flying a drone seemed forever out of reach.
Now, thanks to a brain implant, he’s experienced the thrill in a simulation. By picturing finger movements in his mind, the 69-year-old flew a virtual drone in a video game, with the quadcopter dodging obstacles and whizzing through randomly appearing rings in real time.
T5 is part of the BrainGate2 Neural Interface System clinical trial, which launched in 2009 to help paralyzed people control computer cursors, robotic arms, and other devices by decoding electrical activity in their brains. It’s not just for gaming. Having the ability to move and click a cursor gets them back online. Googling, emailing, streaming shows, scrolling through social media posts—what able-bodied people spend hours on every day—are now again part of their lives.
But cursors can only do so much. Popular gaming consoles—PlayStation, Xbox, Nintendo Switch—require players to move their fingers, especially their thumbs, quickly, precisely, and in multiple directions.
Current brain implants often take a bird’s-eye view of the entire hand, decoding it as a single unit. The new study, published in Nature Medicine, instead separated the fingers into three groups: the thumb; the index and middle finger; and the ring finger and pinky. After training, T5 could move each finger group independently with unprecedented finesse. His brain implant also picked up intentions to stretch, curl, or move his thumb side to side, letting him pilot the drone as if using a video game controller.
Calling his gaming sessions “stick time,” T5 enthusiastically said that piloting the drone allowed him to mentally “rise up” from his bed or chair for the first time since his injury. Like other gamers, he asked the research team to record his best runs and share the videos with friends.
Brain-computer mind-melds are “expanding from functional to recreational applications,” wrote Nick Ramsey and Mariska Vansteensel at the University Medical Center Utrecht, who were not involved in the study.
Mind Control
Linking brains to machines has gone from science fiction to reality in the past two decades, and it’s been life-changing for people paralyzed from spinal cord injuries.
These injuries, whether from accident or degeneration, sever the nerve highways between the brain and muscles. Scientists have long sought to restore these connections. Some have worked to regenerate broken nerve endings inside the body, with mixed results. Others are building artificial “bridges” over the gap: implants that record signals from the brain, decode the intention to move, and stimulate nerves below the injury site so muscles contract or relax. Thanks to such systems, paralyzed people have been able to walk again, often with assistance, over long distances and after minimal training.
Other efforts have done without muscles altogether, instead tapping directly into the brain’s electrical signals to hook the mind to a digital universe. Previous studies have found that watching or imagining movements—like, say, asking a patient to picture moving a cursor around a browser—generates brain patterns similar to those produced by physically performing the movements. By recording these “brain signatures” from individual people, scientists can then decode their intention to move.
Noland Arbaugh, the first person to receive a brain implant from Elon Musk’s Neuralink, is perhaps the most well-known success. Late last year, the young man livestreamed his life for three days, sharing his view while moving a cursor and playing a video game in bed.
Decoding individual finger movements, however, is a bigger challenge. Our hands are especially dexterous and flexible, making it easy to type, play musical instruments, grab a cup of coffee, or twiddle our thumbs. Each finger is controlled by intricate networks of brain activity working together under the hood to generate complex movements.
Fingers curl, wiggle, and stretch apart. Deciphering the brain patterns that let them work both individually and in concert has stymied researchers. “In humans, finger decoding has only been demonstrated in prediction in offline analyses or classification from recorded neural activity,” wrote the authors. In other words, brain signals had never before been used to control fingers in real time. Even in monkeys, brain implants have only been able to separate the fingers into two independently moving groups, limiting the hands’ overall flexibility.
A Virtual Flex
In 2016, T5 had two tiny implants inserted into the hand “knob” of his motor cortex, the region that controls hand and finger movements. Each implant, the size of a baby aspirin, had 96 microelectrode channels that quietly captured his brain activity as he went through a series of training tasks. At the time of surgery, T5 could only twitch his hands and feet randomly.
The team first designed a hand avatar. It didn’t fully capture the dexterity of a human hand: the index and middle finger moved together as a group, as did the ring finger and pinky. The thumb, meanwhile, could stretch, curl, and move side to side.
For training, T5 watched the hand avatar move and imagined moving his fingers in sync. The team then trained an artificial neural network that specializes in decoding signals across time to decipher T5’s brain activity, correlating each pattern with a different type of finger movement. The resulting “decoder” translated his intentions into actual movements of the hand avatar on the computer screen.
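To make that pipeline concrete, here is a minimal Python sketch of this kind of temporal decoder. It is hypothetical, not the study’s actual code: the 192-channel input matches the two 96-channel arrays described above, but the architecture, bin sizes, and layer widths are all assumptions.

```python
# Hypothetical sketch of a temporal neural decoder (not the study's code):
# a GRU maps binned multielectrode features to continuous velocities for
# the three finger groups. The 192-channel input reflects two 96-channel
# arrays; hidden size and output layout are illustrative assumptions.
import torch
import torch.nn as nn

class FingerDecoder(nn.Module):
    def __init__(self, n_channels=192, hidden=256, n_outputs=4):
        # n_outputs = 4: thumb flex/extend, thumb side-to-side,
        # index+middle group, ring+pinky group
        super().__init__()
        self.rnn = nn.GRU(n_channels, hidden, batch_first=True)
        self.readout = nn.Linear(hidden, n_outputs)

    def forward(self, x):
        # x: (batch, time_bins, n_channels) of binned neural features
        h, _ = self.rnn(x)
        return self.readout(h)  # per-bin finger-group velocities

# Training pairs imagined-movement neural data with the avatar's cued
# trajectories; random tensors stand in for real recordings here.
decoder = FingerDecoder()
neural = torch.randn(8, 100, 192)          # 8 trials, 100 time bins
cued_velocities = torch.randn(8, 100, 4)   # avatar targets during training
loss = nn.functional.mse_loss(decoder(neural), cued_velocities)
loss.backward()
```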
In an initial test that only allowed the thumb to extend and curl—what the researchers call “2D”—T5 moved his finger groups onto virtual targets with over 98 percent accuracy. Each attempt took only a bit more than a second.
Adding side-to-side movement of the thumb kept a similar success rate but doubled the time per attempt (though he got faster as he became familiar with the task). Overall, T5 could mind-control his virtual hand to reach around 76 targets a minute, far faster than in previous attempts. The training “wasn’t tedious,” he said.
Each finger group movement was then mapped onto a virtual drone. Like moving joysticks and pressing buttons on a video game controller, the finger movements steered the quadcopter at will. The system kept the virtual hand in a relaxed, neutral pose unless T5 decided to move any of the finger groups.
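Conceptually, the mapping works like reading joystick axes, with a deadzone so the drone holds steady while the hand rests in its neutral pose. The Python sketch below illustrates one plausible scheme; the axis assignments, threshold, and function names are illustrative assumptions, not details from the study.

```python
# Hypothetical controller-style mapping: each decoded finger-group value
# acts like a joystick axis, and a deadzone keeps the drone (and virtual
# hand) at rest until the user deliberately moves a group.
DEADZONE = 0.15  # ignore small decoder outputs (relaxed, neutral pose)

def axis(value, deadzone=DEADZONE):
    """Treat a decoded finger-group value in [-1, 1] as a joystick axis."""
    return 0.0 if abs(value) < deadzone else value

def drone_command(thumb_flex, thumb_yaw, index_middle, ring_pinky):
    """Map the four decoded dimensions to quadcopter velocity commands."""
    return {
        "throttle": axis(thumb_flex),    # thumb curl/extend -> up/down
        "yaw":      axis(thumb_yaw),     # thumb side-to-side -> rotate
        "pitch":    axis(index_middle),  # index+middle group -> forward/back
        "roll":     axis(ring_pinky),    # ring+pinky group -> left/right
    }

print(drone_command(0.8, 0.05, -0.4, 0.0))
# {'throttle': 0.8, 'yaw': 0.0, 'pitch': -0.4, 'roll': 0.0}
```

The deadzone is what lets small, unintended decoder wobbles pass without moving the drone, mirroring how the study’s system held the hand in its neutral pose until T5 meant to move.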
In a day of testing, he flew the drone a dozen times across multiple obstacle courses. Each course required him to use one of the finger group movements to successfully navigate randomly appearing rings and other hurdles. One challenge, for example, had him fly figure eights across multiple rings without hitting them. The system was roughly six times better than prior systems.
Although his virtual fingers and their movements were shown on the computer screen while playing, the visuals weren’t necessary.
“When the drone is moving and the fingers are moving, it’s easier and faster to just look at the drone,” he said. Piloting it was intuitive, “like riding your bicycle on your way to work, [thinking] ‘what am I going to do at work today’, and you’re still shifting gears on your bike and moving right along.”
Adapting from simple training exercises to more complicated movements was also easy. “It’s like if you’re a clarinet player, and you pick up someone else’s clarinet. You know the difference instantly, and there is a little learning curve involved, but that’s based on you [having] an implied competency with your clarinet,” he said. To control the drone, you just have to “tickle it a direction,” he added.
The system is still far from commercial use, and it will have to be tested on more people. New brain implant hardware with more channels could further boost performance. But it’s a first step that opens up multiplayer online gaming—and potentially, better control of other computer programs and sophisticated robotic hands—to people with paralysis, enriching their social lives and overall wellbeing.