Tuesday, June 24, 2008
University of Florida researchers develop neural implant that learns with the brain

University of Florida researchers Jack DiGiovanna, left, and Justin Sanchez
Devices known as brain-machine interfaces could someday be used routinely to help paralyzed patients and amputees control prosthetic limbs with just their thoughts. Now, University of Florida researchers have taken the concept a step further, devising a way for computerized devices not only to translate brain signals into movement but also to evolve with the brain as it learns.
Instead of simply interpreting brain signals and routing them to a robotic hand or leg, this type of brain-machine interface would adapt to a person's behavior over time and use that knowledge to help complete a task more efficiently, much like an assistant, say the UF College of Medicine and College of Engineering researchers who developed a model system and tested it in rats.
Until now, brain-machine interfaces have been designed as one-way conversations between the brain and a computer, with the brain doing all the talking and the computer following commands. The system UF engineers created actually allows the computer to have a say in that conversation, too, according to findings published this month online in the Institute of Electrical and Electronics Engineers journal IEEE Transactions on Biomedical Engineering.
"In the grand scheme of brain-machine interfaces, this is a complete paradigm change," said Justin C. Sanchez, Ph.D., a UF assistant professor of pediatric neurology and the study's senior author. "This idea opens up all kinds of possibilities for how we interact with devices. It's not just about giving instructions but about those devices assisting us in a common goal. You know the goal, the computer knows the goal and you work together to solve the task."
Scientists at UF and other institutions have been studying and refining brain-machine interfaces for years, developing and testing numerous variations of the technology with the goal of creating implantable, computer-chip-sized devices capable of controlling limbs or treating diseases.
The devices are programmed with complex algorithms that interpret thoughts. But the algorithms, or code, used in current brain-machine interfaces don't adapt to change, Sanchez said.
"The status quo of brain-machine interfaces that are out there have static and fixed decoding algorithms, which assume a person thinks one way for all time," he said. "We learn throughout our lives and come into different scenarios, so you need to develop a paradigm that allows interaction and growth."
To create this type of brain-machine interface, Sanchez and his colleagues developed a system based on setting goals and giving rewards.
Fitted with tiny electrodes in their brains to capture signals for the computer to unravel, three rats were taught to move a robotic arm toward a target with just their thoughts. Each time they succeeded, the rats were rewarded with a drop of water.
The computer's goal, on the other hand, was to earn as many points as possible, Sanchez said. The closer a rat moved the arm to the target, the more points the computer received, giving it an incentive to determine which brain signals led to the most rewards and making the process more efficient for the rat. The researchers conducted several tests, requiring the rats to hit targets that were progressively farther away. Despite the increasing difficulty, the rats completed the tasks more efficiently over time and did so at a significantly higher rate than would be expected by chance, Sanchez said.
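The goal-and-points scheme described above is essentially a reinforcement-learning loop. The minimal sketch below captures that idea with a simple tabular Q-learning update, in which the decoder earns more reward the closer its chosen movement brings the arm to the target. The state discretization, action set, and parameters are hypothetical stand-ins, not the researchers' published algorithm.

```python
import numpy as np

# Minimal reward-driven decoder sketch (illustrative only, not the UF system).
# States are discretized neural-activity patterns; actions are small arm movements.
# The decoder earns more "points" the closer its chosen movement brings the arm
# to the target, and uses that reward to prefer signal-to-movement mappings
# that pay off, in the spirit of a simple tabular Q-learning update.

N_STATES = 8                             # hypothetical number of neural patterns
ACTIONS = np.array([-1.0, 0.0, 1.0])     # move arm left, hold, move right (1-D toy)
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1    # learning rate, discount, exploration

Q = np.zeros((N_STATES, len(ACTIONS)))   # value of each action in each neural state

def reward(arm_pos, target_pos):
    """More points the closer the arm ends up to the target."""
    return -abs(arm_pos - target_pos)

def step(state, arm_pos, target_pos, rng):
    # epsilon-greedy choice: usually exploit the best-known action, sometimes explore
    if rng.random() < EPSILON:
        a = rng.integers(len(ACTIONS))
    else:
        a = int(np.argmax(Q[state]))
    new_pos = arm_pos + ACTIONS[a]
    r = reward(new_pos, target_pos)
    # the next neural state is unknown in this toy, so it is simply resampled at random
    next_state = rng.integers(N_STATES)
    # Q-learning update: nudge the estimate toward reward plus discounted future value
    Q[state, a] += ALPHA * (r + GAMMA * Q[next_state].max() - Q[state, a])
    return next_state, new_pos

rng = np.random.default_rng(0)
state, arm, target = rng.integers(N_STATES), 0.0, 3.0
for _ in range(200):
    state, arm = step(state, arm, target, rng)
```

Over many trials the table comes to favor whichever signal-to-movement mappings earn the most points, which is the sense in which the computer "has a say" in the conversation.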
"We think this dialogue with a goal is how we can make these systems evolve over time," Sanchez said. "We want these devices to grow with the user. (Also) we want users to be able to experience new scenarios and be able to control the device."
Dawn Taylor, Ph.D., an assistant professor of biomedical engineering at Case Western Reserve University, said the results of the study add a new dimension to brain-machine interface research. That UF researchers were able to train rats to use the robotic arm and then obtain significant results from animals lacking the mental prowess of primates or humans is also impressive, she said.
"It's a clear demonstration of a methodology that will work in situations when other implementations would fall apart," Taylor said.
To develop and test this brain-machine interface system, Sanchez collaborated with engineering professors Jose Principe, Ph.D., and Jose Fortes, Ph.D., and engineering doctoral students Jack DiGiovanna and Babak Mahmoudi.
The researchers received funding for the study from the National Science Foundation, the Children's Miracle Network and the UF Alumni Association.