Image: JHU Applied Physics Lab/UrTechPro

Brain Implants Allow Quadriplegic Man To Feed Himself

AI, robotic limbs, and brain implants allow a paralyzed man to eat a Twinkie.

In a recently announced breakthrough, researchers at the Johns Hopkins Applied Physics Laboratory have helped a quadriplegic man, Robert “Buz” Chmielewski, use brain implants and his thoughts to operate a pair of AI-controlled mechanical arms and feed himself.

It is a massive step for anyone in Mr. Chmielewski’s condition and an undeniable first in the fields of robotics, artificial intelligence, and brain-machine interfaces.

 

BACKGROUND: How Brain Implants Helped Buz

In a written reply to The Debrief, JHAPL’s Paulette Campbell explained that “while Buz is quadriplegic, he does retain the ability to move his arms, plus his hands a little bit. He also has a great deal of retained sensation in his hands and fingers.” 

She further clarified that while this is undoubtedly a big step for him personally, “Buz is engaged in this research out of a desire to help others who are more impaired than himself.”

While undoubtedly significant, this unprecedented success is just the latest leap in an over two-year journey for Buz, as researchers refer to their star patient, a man who has been in a wheelchair since his teens. 

His first milestone came nearly two years earlier, when a team at Johns Hopkins operating under a clinical trial spearheaded by the Defense Advanced Research Projects Agency (DARPA) implanted a series of six electrodes directly into both sides of his brain, specifically in the two regions that control movement and touch sensation. 

That successful surgery allowed researchers to connect his brain to a cutting-edge “AI-controlled” arm and essentially let him issue it commands as though it were his own.

“Ultimately, because this is the world’s first bilateral implant (both sides), we want to be able to execute motions that require both arms and allow the user to perceive interactions with the environment as though they were coming from his own hands,” said Dr. Francesco Tenore, APL’s project manager for this effort, at the time. “Our team will continue training with our participant to develop motor and sensory capabilities, as well as to explore the potential for control of other devices that could be used to expand a user’s personal or professional capabilities.”

A few months later, Buz moved the arms with his mind, leading the research group to get even more aggressive: they began exploring a “smart machines” interface that would not only allow the patient to experience sensory feedback while moving his robotic arms but also let him take on a task as complex and intricate as slicing food, scooping it up, and feeding himself. With this latest success, Buz and the team have brought that goal to reality.

ANALYSIS: Brain Implants, AI, and the Future of Medical Care

Explaining the technical details of this landmark event to The Debrief, Dr. Luke Osborn, a postdoctoral fellow at Johns Hopkins and an expert in prostheses using sensory augmentation, pointed out that “although Buz wasn’t receiving any tactile sensory feedback during this particular self-feeding task, we are able to create sensations of touch in his hands through electrical stimulation of the somatosensory region of his brain. Generally, these sensations are described as a slight pressure, as if someone were pressing, in different parts of the hand.”

As for how much of the actual work was done by Buz and how much was done by the AI-controlled arms, senior roboticist Dave Handelman told The Debrief, “The current setup is a form of shared control. The robot AI goes through a series of steps to perform the feeding task. Buz contributes to a subset of those steps in order to, ultimately, decide which food on the plate to eat, where to cut the food, and how big the cut piece should be.”

 

So this isn’t just a case of the patient telling the arms “feed me” and the AI doing the rest; the operator has to periodically direct and redirect the arms at various points along the way. This is where the brain implants help.

“Buz controls one or two variables of each robot hand by thinking of moving his hands, and those inputs are combined with the AI to produce total robot motion,” Handelman explained of the coordinated interplay between man and device. “The goal is to enable much of the task to be done automatically by the robot, but enable Buz to imprint his intent on it and customize the task to his liking.”
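
To make that division of labor concrete, below is a minimal, hypothetical sketch of how such a shared-control blend might look in code: the robot’s planner produces a full motion command, and the one or two variables the user steers are layered on top of it. The function names, the blending gain, and the joint indices here are illustrative assumptions, not the APL team’s actual software.

```python
# Hypothetical sketch of "shared control": an autonomous planner handles
# most of the motion, while a small number of degrees of freedom (DoFs)
# are nudged by values decoded from the user's brain signals.

import numpy as np

def shared_control_step(autonomous_cmd: np.ndarray,
                        user_input: np.ndarray,
                        user_dofs: list[int],
                        gain: float = 0.5) -> np.ndarray:
    """Blend an autonomous command with user-controlled degrees of freedom.

    autonomous_cmd : full command vector produced by the robot's planner
    user_input     : values decoded from neural activity (one per user DoF)
    user_dofs      : indices of the command vector the user is allowed to steer
    gain           : how strongly the user's input shifts the planned motion
    """
    blended = autonomous_cmd.copy()
    for value, dof in zip(user_input, user_dofs):
        # The decoded intent is added on top of the planned motion, so the
        # task still completes even when the user's inputs are small or noisy.
        blended[dof] += gain * value
    return blended

# Example: a 7-DoF arm command where the user steers only two variables.
planned = np.zeros(7)                        # motion planned by the robot AI
decoded = np.array([0.2, -0.1])              # intent decoded from brain signals
command = shared_control_step(planned, decoded, user_dofs=[0, 1])
```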

OUTLOOK: Buz’s Future Is Our Future

As explained to The Debrief by Dr. Mathew Fifer, the technical lead on the project:

“The goal of this system is to be extensible to individuals based on their abilities. Someday, brain signals may allow brain-computer interface users to do extremely dexterous movements with their hands and fingers, to where only smoothing out motions might be necessary. But we also want to be able to provide assistance to folks who can only transmit a simple click with their brain signals, where the intelligence in the prosthetic system can carry the load from there.”

It’s an impressive goal, and given the steps they’ve taken along the way, a seemingly achievable one at that. As for Buz’s personal reaction to this unprecedented achievement? “It’s pretty cool,” he said of using the limbs to cut and feed himself a piece of sponge cake. “I wanted to be able to do more of it.”

Fortunately, Buz’s work alongside the Johns Hopkins team may well lead to many more individuals having these types of “bionics” in the future.