The NSF-funded ACTiVATE® program helps convert discoveries made through federally funded scientific research into innovations and marketable devices that benefit the economy and society. Kris Appel joined ACTiVATE® in 2006 and was quickly matched with a group of scientists and medical doctors at the University of Maryland Medical School who developed a device to help people with stroke-induced arm paralysis. Learn more in this Discovery.
Credit: Kris Appel, Encore Path, Inc.
Jose Contreras-Vidal, an associate professor of kinesiology at the University of Maryland, and his team have created a non-invasive, sensor-lined cap that forms a "brain computer interface" that one day could control computers, robotic prosthetic limbs, motorized wheelchairs, and even digital avatars. Learn more in this Discovery.
Credit: John Consoli, University of Maryland
Researchers have developed an experimental tongue-based system that may allow individuals with debilitating disabilities to control wheelchairs, computers and other devices with relative ease. Read more in this news release.
Credit: Georgia Tech Photo: Gary Meek
Computer scientist Tom Mitchell and cognitive neuroscientist Marcel Just, both of Carnegie Mellon University, are closer to knowing how specific thoughts activate our brains. Their findings demonstrate the power of computational modeling to improve our understanding of how the brain processes information and thoughts. Read more in this news release.
Credit: Courtesy of Science
The Office of Multidisciplinary Activities (SMA) of the Directorate for Social, Behavioral & Economic Sciences provides a focal point for programmatic activities that cut across NSF and SBE boundaries and is SBE's broadest mechanism for contributing to Administration and NSF priorities. SMA assists with seeding multidisciplinary activities for the future and plays a critical role in the development of infrastructure to support interdisciplinary activities.
A University of Minnesota research team has developed a unique brain-computer interface (BCI) that allows humans to use thoughts to control the flight of a virtual helicopter in real time. The experience takes place in three dimensions and uses electrical signals from the scalp to control the helicopter's movements.
In an early step toward letting severely paralyzed people speak with their thoughts, University of Utah researchers translated brain signals into words using two grids of 16 microelectrodes implanted beneath the skull but atop the brain.
October 17, 2011
Mind Reading Computer System May Help People With Locked-in Syndrome
Totally paralyzed people could communicate and control robots
Imagine living a life in which you are completely aware of the world around you but you're prevented from engaging in it because you are completely paralyzed. Even speaking is impossible. For an estimated 50,000 Americans, this is a harsh reality. It's called locked-in syndrome, a condition in which people with normal cognitive brain activity suffer severe paralysis, often from injuries or an illness such as Lou Gehrig's disease.
"Locked-in people are unable to move at all except possibly their eyes, and so they're left with no means of communication but they are fully conscious," says Boston University neuroscientist Frank Guenther.
Guenther works with the National Science Foundation's (NSF) Center of Excellence for Learning in Education, Science and Technology (CELEST), which is made up of eight private and public institutions, mostly in the Boston area. Its purpose is to synthesize experimental, modeling and technological approaches to research in order to understand how the brain learns as a whole system. Guenther's research in particular examines how brain regions interact, with the hope of melding mind and machine, and ultimately making life much better for people with locked-in syndrome.
"People who have no other means of communication can start to control a computer that can produce words for them or they can manipulate what happens in a robot and allow them to interact with the world," Guenther says about his research.
His team demonstrated two experiments on the day Science Nation stopped by. In one experiment, run by assistant research professor Jonathan Brumberg, a volunteer shows how she uses a speech synthesizer to make vowel sounds just by thinking about moving a hand or foot. She never moves her body or says anything.
"We use an EEG cap to read the signals coming from her brain through her scalp," explains Brumberg, who tracks the brainwaves with a computer. "Depending on what body part she imagines moving, the cursor moves in different directions on the screen." Brumberg explains that he is able to "translate those brain activities into audio signals that can be used to drive a voice synthesizer. We've mapped the 'uw' sound to a left hand movement, the 'aa' sound to a right hand movement, and the 'iy' sound to a foot movement."
Each of the three vowel sounds is represented by a circle on a computer screen. As the subject sits perfectly still, the cursor begins to move across the screen; to make the synthesizer produce a given vowel, she must steer the cursor into the center of the corresponding circle.
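The decoding pipeline Brumberg describes can be sketched in a few lines: classify which movement the subject imagined, then map that class to a vowel and its synthesizer parameters. The class names, power values and formant frequencies below are illustrative assumptions, not figures from the study; this is a minimal sketch, not the team's actual implementation.

```python
# Hypothetical sketch: mapping decoded motor imagery to vowel formants.
# Formant frequencies (F1, F2 in Hz) are typical textbook values for
# these vowels, used here only for illustration.
VOWEL_FORMANTS = {
    "uw": (300, 870),   # as in "boot"
    "aa": (730, 1090),  # as in "father"
    "iy": (270, 2290),  # as in "beet"
}

# The movement-to-vowel mapping described in the article.
MOVEMENT_TO_VOWEL = {
    "left_hand": "uw",
    "right_hand": "aa",
    "foot": "iy",
}

def decode_imagined_movement(band_power):
    """Pick the movement class showing the strongest event-related
    desynchronization, i.e. the largest drop in mu-band EEG power.
    `band_power` maps each movement class to its power change
    (negative values = a drop in power)."""
    return min(band_power, key=band_power.get)

def movement_to_formants(band_power):
    """Decode the imagined movement, then return the vowel it maps to
    along with the (F1, F2) formant pair a synthesizer would need."""
    movement = decode_imagined_movement(band_power)
    vowel = MOVEMENT_TO_VOWEL[movement]
    return vowel, VOWEL_FORMANTS[vowel]

# Example: the strongest power drop is associated with imagined
# left-hand movement, so the decoder selects the "uw" vowel.
vowel, (f1, f2) = movement_to_formants(
    {"left_hand": -0.8, "right_hand": -0.1, "foot": -0.2}
)
print(vowel, f1, f2)  # uw 300 870
```

In a real system the band-power values would come from spatially filtered EEG channels over motor cortex, and the formant pair would drive the voice synthesizer continuously rather than as a one-shot decision.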
We watch as the subject imagines moving her left hand to get the cursor to move right into the center of the "uw" circle, and we hear a synthetic "uw" droning from the synthesizer. Brumberg has experimented on locked-in patients, too, and the results have been startling.
"We started with helping a locked-in patient regain an ability to make certain vowel sounds and that was amazing. He hasn't been able to talk in years and the first time he made a movement with our formant synthesizer, he nearly, you know, jumped out of his chair with excitement," says Brumberg. "Although the patient has no actual voluntary movement, involuntary motor actions are often seen when the patient gets excited."
Guenther says this technology holds great promise not just for locked-in patients. "We hope these technologies would be applied to people who have other communication disorders that cause them to be unable to speak," he says. "This sort of thing would allow them to produce synthetic speech, which could be used to talk to the people around them and communicate their needs."
In another experiment, graduate student Sean Lorenz takes a robot out for a spin using only brainwaves. The checkerboards on the sides of the screen flash at slightly different frequencies. To the naked eye, the differences are subtle. "But the neurons in his visual cortex start firing in synchrony with the checkerboard he's looking at, and so we can pick up the frequency and, from that, determine which choice he was trying to make: left, right, forward or backward, for example," explains Guenther.
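The principle Guenther describes is the basis of steady-state visual evoked potential (SSVEP) decoding: measure how much power the EEG carries at each checkerboard's flicker frequency, and pick the strongest. The sketch below is a hypothetical illustration; the sampling rate, flicker frequencies and command mapping are assumptions, not the experiment's actual parameters.

```python
# Hypothetical SSVEP-style decoder: find which flicker frequency
# dominates an EEG trace from visual cortex, using a single-bin
# discrete Fourier transform in plain Python.
import math

FS = 256  # sampling rate in Hz (assumed)

# Each command's checkerboard flickers at a slightly different rate
# (frequencies chosen for illustration only).
COMMAND_FREQS = {"left": 8.0, "right": 10.0, "forward": 12.0, "backward": 15.0}

def power_at(signal, freq, fs=FS):
    """Power of `signal` at `freq`, computed from one DFT bin."""
    re = sum(s * math.cos(2 * math.pi * freq * i / fs)
             for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq * i / fs)
             for i, s in enumerate(signal))
    return (re * re + im * im) / len(signal)

def decode_command(signal):
    """Return the command whose flicker frequency carries the most power."""
    return max(COMMAND_FREQS, key=lambda c: power_at(signal, COMMAND_FREQS[c]))

# Simulate one second of EEG dominated by the 12 Hz ("forward") flicker,
# with a weaker 8 Hz component mixed in as interference.
t = [i / FS for i in range(FS)]
eeg = [math.sin(2 * math.pi * 12.0 * x) + 0.3 * math.sin(2 * math.pi * 8.0 * x)
       for x in t]
print(decode_command(eeg))  # forward
```

A practical system would average over multiple electrodes and longer windows, and often checks harmonics of each flicker frequency as well, but the core decision rule is the same comparison of spectral power.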
For locked-in patients, Guenther adds, "If they're pointing their eyes at a visual screen, they can focus their attention on one of the different frequencies and they can manipulate what happens in a robot or in a computer."
According to Guenther, it's just a matter of time before these technologies are commercially available. It's all part of a vision that pairs biology with technology to find a way out for those who are locked in.
Any opinions, findings, conclusions or recommendations presented in this material are only those of the presenter grantee/researcher, author, or agency employee; and do not necessarily reflect the views of the National Science Foundation.