18 years after he was paralysed, brain implants are letting this man communicate again

Cutting-edge tech is giving a voice to a man doctors said had ‘zero chance’ of getting better

Dr Eddie Chang helps Pancho speak through an implant in his brain. Photograph: Mike Kai Chen/New York Times

He has not been able to speak since 2003, when he was paralysed at age 20 by a severe stroke after a terrible car crash.

Now, in a scientific milestone, researchers have tapped into the speech areas of his brain – allowing him to produce comprehensible words and sentences simply by trying to say them. When the man, known by his nickname, Pancho, tries to speak, electrodes implanted in his brain transmit signals to a computer, which translates them into words displayed on a screen.

His first recognisable sentence, researchers say, was, “My family is outside.”

The achievement, published a few days ago in the New England Journal of Medicine, could eventually help many patients with conditions that steal their ability to talk. "This is farther than we've ever imagined we could go," says Melanie Fried-Oken, a professor of neurology and paediatrics at Oregon Health and Science University, who was not involved in the project.

Three years ago, when Pancho, who is now 38, agreed to work with neuroscience researchers, they were unsure if his brain had even retained the mechanisms for speech. "That part of his brain might have been dormant, and we just didn't know if it would ever really wake up in order for him to speak again," says Dr Edward Chang, chairman of neurological surgery at University of California, San Francisco, who led the research.

The team implanted a rectangular sheet of 128 electrodes, designed to detect signals from speech-related sensory and motor processes linked to the mouth, lips, jaw, tongue and larynx. In 50 sessions over 81 weeks, they connected the implant to a computer by a cable attached to a port in Pancho’s head, and asked him to try to say words from a list of 50 common ones he helped suggest, including “hungry,” “music” and “computer”.

As he did, electrodes transmitted signals through a form of artificial intelligence that tried to recognise the intended words. "Our system translates the brain activity that would have normally controlled his vocal tract directly into words and sentences," says David Moses, a postdoctoral engineer who developed the system with Sean Metzger and Jessie R Liu, graduate students. The three are lead authors of the study.
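
The article describes the decoder only as “a form of artificial intelligence” trained on Pancho’s recordings, so the Python sketch below is purely a hypothetical illustration, with invented shapes and synthetic numbers: it shows the basic shape of the task, turning a short window of activity from the 128 electrodes into one of the 50 candidate words.

```python
import numpy as np

# Hypothetical setup: 128 electrode channels, a short window of samples per
# attempted word, and a 50-word vocabulary. Shapes and data are invented for
# illustration; the real system uses machine-learning models trained on
# Pancho's recordings, not this toy nearest-centroid classifier.
N_CHANNELS, N_SAMPLES, N_WORDS = 128, 200, 50
rng = np.random.default_rng(0)

VOCAB = [f"word_{i}" for i in range(N_WORDS)]  # stand-in for the 50-word list

# Synthetic "learned" pattern: one average channel profile per word.
centroids = rng.normal(size=(N_WORDS, N_CHANNELS))

def decode_word(window: np.ndarray) -> str:
    """Average the window over time, compare the resulting channel profile to
    each word's stored pattern, and return the closest vocabulary entry."""
    features = window.mean(axis=1)                        # (N_CHANNELS,)
    dists = np.linalg.norm(centroids - features, axis=1)  # distance to each word
    return VOCAB[int(np.argmin(dists))]

# One simulated attempt: a noisy version of word 7's pattern.
window = centroids[7][:, None] + 0.1 * rng.normal(size=(N_CHANNELS, N_SAMPLES))
print(decode_word(window))  # most likely "word_7"
```

Real cortical signals are far noisier and unfold over time, which is why the researchers needed many repetitions of each word across the 50 sessions to train their models.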

Pancho (who asked to be identified only by his nickname to protect his privacy) also tried to say the 50 words in 50 distinct sentences like “My nurse is right outside” and “Bring my glasses, please” and in response to questions like “How are you today?”

His answer, displayed on screen: “I am very good.”

In nearly half of the 9,000 times Pancho tried to say single words, the algorithm got it right. When he tried saying sentences written on the screen, it did even better.

Dr Eddie Chang prepares to connect Pancho’s brain implant to a computer, which uses a form of artificial intelligence to recognize the words he intends to say, in San Francisco on July 5th, 2021. Photograph: Mike Kai Chen/The New York Times

By funnelling algorithm results through a kind of autocorrect language-prediction system, the computer correctly recognised individual words in the sentences nearly three-quarters of the time and perfectly decoded entire sentences more than half the time. “To prove that you can decipher speech from the electrical signals in the speech motor area of your brain is groundbreaking,” says Fried-Oken, whose own research involves trying to detect signals using electrodes in a cap placed on the head, not implanted.
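
As another hypothetical sketch (all probabilities invented), the snippet below shows the general idea behind that “autocorrect” step: add the classifier’s per-word scores to a simple bigram language model’s scores, so that a plausible sequence such as “hello how are you” outranks an implausible one such as “hungry how am you”.

```python
import math

# Invented per-position word probabilities from the classifier for one
# attempted sentence; on their own, the top guesses read "hungry how am you".
classifier_probs = [
    {"hello": 0.4, "hungry": 0.6},
    {"how": 0.9, "go": 0.1},
    {"are": 0.3, "am": 0.7},
    {"you": 0.8, "thirsty": 0.2},
]

# Invented bigram language model: P(next word | previous word); unseen pairs
# fall back to a small floor probability.
bigram = {
    ("<s>", "hello"): 0.5, ("<s>", "hungry"): 0.1,
    ("hello", "how"): 0.6, ("hungry", "how"): 0.2,
    ("how", "are"): 0.7, ("how", "am"): 0.05,
    ("are", "you"): 0.8, ("am", "you"): 0.1,
}

def rescore(probs, lm, floor=1e-3):
    """Exhaustively score every candidate word sequence, adding classifier and
    language-model log-probabilities, and return the best-scoring sentence."""
    sequences = {("<s>",): 0.0}
    for step in probs:
        extended = {}
        for prefix, score in sequences.items():
            for word, p in step.items():
                lm_p = lm.get((prefix[-1], word), floor)
                extended[prefix + (word,)] = score + math.log(p) + math.log(lm_p)
        sequences = extended
    best = max(sequences, key=sequences.get)
    return " ".join(best[1:])

print(rescore(classifier_probs, bigram))  # -> "hello how are you"
```

The scores are added in log space so that products of small probabilities do not vanish; a full system would also prune the search rather than enumerate every sequence.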

After a recent session, Pancho, wearing a black fedora over a white knit hat to cover the port, smiled and tilted his head slightly with the limited movement he has. In bursts of gravelly sound, he demonstrated a sentence composed of words in the study: “No, I am not thirsty.”

In interviews over several weeks for this article, he communicated through email exchanges using a head-controlled mouse to painstakingly type key-by-key, the method he usually relies on.

The brain implant’s recognition of his spoken words is “a life-changing experience,” he says. “I just want to, I don’t know, get something good, because I always was told by doctors that I had zero chance to get better,” Pancho types during a video chat.

Later, he emails: “Not to be able to communicate with anyone, to have a normal conversation and express yourself in any way, it’s devastating, very hard to live with.”

During research sessions with the electrodes, he writes, “It’s very much like getting a second chance to talk again.”

Pancho was a healthy field worker in California’s vineyards until a car crash after a soccer game one summer Sunday, he says. After surgery for serious damage to his stomach, he was discharged from the hospital, walking, talking and thinking he was on the road to recovery.

But the next morning, he was “throwing up and unable to hold myself up,” he writes. Doctors say he experienced a brainstem stroke, apparently caused by a post-surgery blood clot.

A week later, he woke up from a coma in a small, dark room. “I tried to move, but I couldn’t lift a finger, and I tried to talk, but I couldn’t spit out a word,” he writes. “So, I started to cry, but as I couldn’t make any sound, all I made were some ugly gestures.”

It was terrifying. “I wished I didn’t ever come back from the coma I was in,” he writes.

The new approach, called a speech neuroprosthesis, is part of a surge of innovation aimed at helping tens of thousands of people who lack the ability to talk, but whose brains contain neural pathways for speech, says Dr Leigh Hochberg, a neurologist with Massachusetts General Hospital, Brown University and the Department of Veterans Affairs, who was not involved in the study but co-wrote an editorial about it.

That could include people with brain injuries or conditions such as amyotrophic lateral sclerosis (also known as Lou Gehrig’s disease) or cerebral palsy, in which patients have insufficient muscle control to speak. “The urgency can’t be overstated,” says Hochberg, who directs a project called BrainGate that implants tinier electrodes to read signals from individual neurons; it recently decoded a paralysed patient’s attempted handwriting motions. “It’s now only a matter of years,” he says, “before there will be a clinically useful system that will allow for the restoration of communication.”

For years, Pancho communicated by spelling out words on a computer using a pointer attached to a baseball cap, an arduous method that allowed him to type about five correct words a minute. “I had to bend/lean my head forward, down, and poke a key letter one-by-one to write,” he emails.

Last year, the researchers gave him another device involving a head-controlled mouse, but it is still not nearly as fast as the brain electrodes in the research sessions.

Through the electrodes, Pancho communicated at 15 to 18 words a minute. That was the maximum rate the study allowed because the computer waited between prompts. Chang says faster decoding is possible, although it’s unclear if it will approach the pace of typical conversational speech: about 150 words a minute. Speed is a key reason the project focuses on speaking, tapping directly into the brain’s word-production system rather than the hand movements involved in typing or writing. “It’s the most natural way for people to communicate,” he says.

Pancho’s buoyant personality has helped the researchers navigate challenges, but also occasionally makes speech recognition uneven. “I sometimes can’t control my emotions and laugh a lot and don’t do too good with the experiment,” he emails.

Chang recalls times when, after the algorithm successfully identified a sentence, “you could see him visibly shaking and it looked like he was kind of giggling”. When that happened or when, during the repetitive tasks, he’d yawn or get distracted, “it didn’t work very well because he wasn’t really focused on getting those words. So, we’ve got some things to work on because we obviously want it to work all the time”.

The algorithm sometimes confused words with similar phonetic sounds, identifying “going” as “bring,” “do” as “you,” and words beginning with “F” — “faith,” “family,” “feel” — as a V-word, “very.”

Longer sentences needed more help from the language-prediction system. Without it, “How do you like my music?” was decoded as “How do you like bad bring?” and “Hello how are you?” became “Hungry how am you?”

But in sessions that the pandemic interrupted for months, accuracy improved, Chang says, both because the algorithm learned from Pancho’s efforts and because “there’s definitely things that are changing in his brain,” helping it “light up and show us the signals that we needed to get these words out”.

On a video call, Pancho communicates using a painstaking method involving a head-controlled mouse that he directs to type out letters one-by-one in Sonoma County, California, on July 5th, 2021. Photograph: Mike Kai Chen/The New York Times

Before his stroke, Pancho had attended school only up to sixth grade in his native Mexico. With remarkable determination, he has since earned a high school diploma, taken college classes, received a web developer certificate and begun studying French. "I think the car wreck got me to be a better person, and smarter too," he emails.

With his restricted wrist movement, Pancho can manoeuvre an electric wheelchair, pressing the joystick with a stuffed sock tied around his hand with rubber bands. At stores, he’ll hover near something until cashiers decipher what he wants, like a cup of coffee.

“They place it in my wheelchair, and I bring it back to my home so I can get help drinking it,” he says. “The people here at the facility find themselves surprised, they always asked me, ‘how did you buy that, and how did you tell them what you wanted!?’”

He also works with other researchers using the electrodes to help him manipulate a robotic arm. His twice-weekly speech sessions can be difficult and exhausting, but he is always “looking forward to wake up and get out of bed every day, and wait for my UCSF people to arrive.”

The speech study is the culmination of more than a decade of research, in which Chang’s team mapped brain activity for all vowel and consonant sounds and tapped into the brains of healthy people to produce computerised speech.

Researchers emphasise that the electrodes are not reading Pancho’s mind, but detecting brain signals corresponding to each word he tries to say. “He is thinking the word,” Fried-Oken says. “It’s not random thoughts that the computer is picking up.”

Chang says “in the future, we might be able to do what people are thinking,” which raises “some really important questions about the ethics of this kind of technology.” But this, he says, “is really just about restoring the individual’s voice.”

In newer tasks, Pancho mimes words silently and spells out less common words using the military alphabet: “delta” for “d,” “foxtrot” for “f.”

“He is truly a pioneer,” Moses says.

The team also wants to engineer more sensitive implants and to make them wireless and fully implantable to avoid infection, Chang says.

As more patients participate, scientists might find individual brain variations, Fried-Oken says, adding that if patients are tired or ill, the intensity or timing of their brain signals might change.

“I just wanted to somehow be able to do something for myself, even a tiny bit,” Pancho says, “but now I know, I’m not doing it just for myself”. – New York Times