Engineers, neuroscientists, and computer scientists in the U.S. and U.K. designed an algorithm offering more sensitive and accurate thought-based control of a computer display cursor. The team, led by Krishna Shenoy, Stanford University professor of neurobiology and engineering, published its findings online yesterday in the journal Nature Neuroscience (paid subscription required), and aims to include the discovery in a clinical trial of neural-implanted devices for people with severe disabilities.
The researchers developed the algorithm, named Recalibrated Feedback Intention–Trained Kalman Filter, or ReFIT, to work with a silicon chip implanted in the brain that records action potentials, the electrical signals generated when a person intends to move a limb. From an array of sensors measuring neural activity, the system filters and records the action potentials carrying information about the direction and speed of the user’s intended movement.
With current algorithms, the system transfers these data to a computer, where they are processed into instructions for a prosthetic device. Shenoy’s team devised a mathematical routine that analyzes and directs visual feedback in real time, in this case to control the actions of a display cursor interacting with an onscreen target.
The algorithm makes adjustments at the same time as it guides a display cursor to a target, just as a hand and eye work in tandem to move a mouse cursor onto an icon on a computer desktop. Movements that stray too far in one direction are corrected, and the system learns from these corrections so that later movements are more accurate. The algorithm also combines neural information about the position and velocity of the cursor into a single set of instructions, where earlier algorithms processed the two separately.
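The idea of tracking position and velocity jointly can be sketched as a standard Kalman filter whose state vector holds both quantities, so one predict/update cycle refines them together. This is a minimal illustration of that general technique, not the published decoder: the time step, noise levels, and the identity observation model below are hypothetical placeholders.

```python
import numpy as np

dt = 0.1  # hypothetical decode interval in seconds

# State: [px, py, vx, vy] -- cursor position and velocity kept in ONE
# state vector, so a single Kalman update handles both jointly.
A = np.array([[1, 0, dt, 0],   # position advances by velocity * dt
              [0, 1, 0, dt],
              [0, 0, 1,  0],   # velocity carries over between steps
              [0, 0, 0,  1]], dtype=float)

W = 0.01 * np.eye(4)  # process-noise covariance (illustrative)
C = np.eye(4)         # observation model: neural features ~ linear in state
Q = 0.1 * np.eye(4)   # observation-noise covariance (illustrative)

def kalman_step(x, P, y):
    """One predict/update cycle: predict the next cursor state,
    then correct it with the latest neural observation y."""
    # Predict forward in time
    x_pred = A @ x
    P_pred = A @ P @ A.T + W
    # Blend prediction with observation, weighted by the Kalman gain
    K = P_pred @ C.T @ np.linalg.inv(C @ P_pred @ C.T + Q)
    x_new = x_pred + K @ (y - C @ x_pred)
    P_new = (np.eye(4) - K @ C) @ P_pred
    return x_new, P_new
```

Because position and velocity sit in the same state vector, the filter's gain matrix can exploit correlations between them in one update, rather than decoding each separately and merging the results afterward.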
The researchers tested the algorithm with two rhesus monkeys that directed a display cursor to a target dot on a computer screen and held the cursor on the target for half a second. The results showed measurable improvements in speed and accuracy over earlier algorithms: the cursor traveled a straighter path from starting point to target and reached the target twice as quickly, up to 85 percent as fast as a monkey using its real arm.
When a test monkey first moved the cursor with its real arm, the ReFIT algorithm recorded the neural signals of that activity, then translated those signals into screen movements when the monkey merely thought about moving the cursor. The algorithm then used the monkey’s brain activity during these thought-controlled sessions to refine its instructions, further increasing its accuracy.
The researchers say they took a different approach to the problem, tracking the activity of groups of neurons rather than isolating the actions of individual neurons. Stanford research associate Vikash Gilja, the article’s first author, explains that the reason for the change was to get results that could better translate into medical advances.
“The core engineering goal is to achieve highest possible performance and robustness for a potential clinical device,” says Gilja. “From an engineering perspective, the process of isolating single neurons is difficult, due to minute physical movements between the electrode and nearby neurons, making it error-prone.” Gilja adds that the algorithm is still working after four years, another result he attributes to this different approach.
Shenoy says the results have potential applicability with clinical trials of implanted neural devices now underway. “These findings could lead to greatly improved prosthetic system performance and robustness in paralyzed people, which we are actively pursuing as part of the FDA Phase-I BrainGate2 clinical trial here at Stanford,” notes Shenoy. Science Business reported on the BrainGate project in May.
The following short video by Vikash Gilja shows a side-by-side comparison of the test monkeys using the earlier (Velocity Kalman) filter on the left and the new ReFIT filter on the right. The ReFIT algorithm enabled the monkey to hit 21 targets in 21 seconds, compared to 10 targets in 21 seconds with the earlier filter.
- Robotic Legs Developed with Human Walking Motion
- Two Paralyzed People Use Brain-Controlled Robotic Arms
- Robotic Hand Demonstrates Firm Grip and Gentle Touch
- More Efficient Algorithms Devised for Robotic Motions
* * *