UC San Francisco researchers have developed an AI-powered brain-computer interface that enabled a paralyzed man to control a robotic arm with his thoughts for a record seven months, marking a breakthrough in adaptive neurotechnology.
With a brain-computer interface (BCI) linking his thoughts to the robotic arm, the man maintained control for a record seven months, far longer than previous attempts, which typically failed within one to two days. Thanks to a new AI model, he can grasp, move, and drop objects simply by imagining himself performing the actions.
Modeling Human Movement
When humans repeat a motion, small changes occur in their brains. Over time, these minor alterations throw off traditional decoding programs, which is why brain-computer interfaces have required frequent recalibration to maintain fine motor control. The AI integrated into the new BCI system overcomes this issue by continuously detecting and adjusting to these subtle changes. Identifying how these neural patterns evolved provided the key to programming the AI.
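The general idea can be illustrated with a toy simulation (entirely hypothetical; it does not reproduce the UCSF model, whose details are in the Cell paper). Here, two imagined movements are stable patterns whose neural representation slowly rotates from day to day. A decoder frozen on day one gradually loses accuracy, while one that keeps nudging its reference patterns toward each day's observations continues to classify correctly:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two imagined movements, each a stable 2-D neural "pattern" (toy example).
PATTERNS = {"grasp": np.array([1.0, 0.0]), "release": np.array([0.0, 1.0])}

def rotate(v, deg):
    """Rotate a 2-D vector by `deg` degrees (stands in for neural drift)."""
    t = np.deg2rad(deg)
    return np.array([np.cos(t) * v[0] - np.sin(t) * v[1],
                     np.sin(t) * v[0] + np.cos(t) * v[1]])

def nearest(x, refs):
    """Nearest-template decoder: pick the closest reference pattern."""
    return min(refs, key=lambda m: np.linalg.norm(x - refs[m]))

static_refs = {m: p.copy() for m, p in PATTERNS.items()}
adaptive_refs = {m: p.copy() for m, p in PATTERNS.items()}
static_hits = adaptive_hits = trials = 0

for day in range(200):
    drift_deg = 2.0 * day  # the representation rotates a little each day
    for move, pattern in PATTERNS.items():
        x = rotate(pattern, drift_deg) + 0.05 * rng.normal(size=2)
        static_hits += nearest(x, static_refs) == move
        pred = nearest(x, adaptive_refs)
        adaptive_hits += pred == move
        # Pull the matched reference toward today's observation,
        # so the decoder tracks the drift instead of fighting it.
        adaptive_refs[pred] += 0.5 * (x - adaptive_refs[pred])
        trials += 1

print(f"fixed decoder accuracy:    {static_hits / trials:.2f}")
print(f"adaptive decoder accuracy: {adaptive_hits / trials:.2f}")
```

In this sketch the fixed decoder falls toward chance as the drift accumulates, while the adaptive one stays near-perfect; the real system faces the same problem in far higher-dimensional neural data.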
“This blending of learning between humans and AI is the next phase for these brain-computer interfaces,” said neurologist Karunesh Ganguly, MD, PhD, a professor of neurology and a member of the UCSF Weill Institute for Neurosciences. “It’s what we need to achieve sophisticated, lifelike function.”
Finding Movement in the Brain
Ganguly’s breakthrough stemmed from studying brain activity in animals. He examined how neural patterns corresponded to specific movements and how learning modified these patterns over time. He hypothesized that these gradual changes explained why earlier BCIs quickly lost track of human brain patterns.
To test this theory, Ganguly partnered with Nikhilesh Natraj, PhD, a neurology researcher. They worked with a man who had lost the ability to move or speak after a stroke years earlier. The researchers implanted tiny sensors on the surface of his brain to monitor activity when he imagined movement.
They then asked the participant to visualize moving various body parts—his hands, feet, and head—while tracking how his neural signals shifted. Even though he was physically incapable of movement, merely imagining it produced control signals in his brain. After analyzing his brain activity, the researchers discovered that while the brain retained the same patterns for specific movements, their exact locations shifted slightly over time.
Artificial Intelligence for Artificial Movement
Recognizing this shifting neural activity, Ganguly trained an AI model to adapt in real time. He asked the subject to visualize simple hand movements while sensors fed data into the AI. Though the AI successfully controlled the robotic arm, the movements were initially uncoordinated.
Ganguly introduced a virtual robotic arm to refine the process, providing the participant with real-time visual feedback. This feedback loop helped the participant synchronize his thoughts with the AI, improving precision. After a series of practice sessions, these skills seamlessly transferred to controlling the real-world robotic arm.
Controlling the Robotic Limb
With the optimized BCI, the participant achieved several practical feats. He successfully picked up, turned, and repositioned blocks. In more complex tasks, he opened a cabinet, retrieved a cup, and placed it under a water dispenser. Remarkably, the system remained stable for seven months, needing only a brief 15-minute recalibration to account for neural drift.
Ganguly continues refining the AI models behind the advanced BCI to enable faster, smoother movements in real-world home environments. According to him, life-changing mobility and independence may soon be within reach for individuals with paralysis.
“I’m very confident that we’ve learned how to build the system now, and that we can make this work,” Ganguly said.
The paper “Sampling Across-day Drift within a Mesoscale Manifold Enables Long-term Neuroprosthetic Control” appeared on March 6, 2025, in Cell.
Ryan Whalen covers science and technology for The Debrief. He holds an MA in History and a Master of Library and Information Science with a certificate in Data Science. He can be contacted at ryan@thedebrief.org and followed on Twitter @mdntwvlf.
