A neuroscience and engineering team at the University of Maryland, College Park, has developed headgear resembling a swim cap, fitted with sensors that read brain signals and let the wearer control electronic devices. The Maryland team’s findings appear in the current issue of the Journal of Neurophysiology (paid subscription required).
The non-invasive cap, lined with sensors that connect to neural-interface software, was developed in the Neural Engineering and Smart Prosthetics Lab headed by kinesiology professor José ‘Pepe’ Contreras-Vidal. The sensors in the cap use electroencephalography (EEG) to read the wearer’s brain waves and translate them into movement commands for computers and other electronic devices.
In the current paper, Contreras-Vidal and colleagues use EEG brain signals to reconstruct the complex 3-D movements of the ankle, knee, and hip joints during human treadmill walking. The study has direct implications for the development of brain-controlled prosthetic devices.
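Reconstructing limb trajectories from scalp EEG is typically framed as a decoding problem: a model maps recent EEG activity to the current joint position. The sketch below illustrates that general idea with a time-lagged linear decoder on synthetic data; the array shapes and the least-squares fit are illustrative assumptions, not the Maryland team’s actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: 32 EEG channels over 1000 time points, and one
# kinematic trace (e.g., an ankle joint angle) to reconstruct.
n_samples, n_channels, n_lags = 1000, 32, 10
eeg = rng.standard_normal((n_samples, n_channels))

# Design matrix of time-lagged EEG features: at each time step, concatenate
# the channel readings from the preceding n_lags samples.
X = np.hstack([eeg[lag:n_samples - n_lags + lag] for lag in range(n_lags)])

# Fake kinematics generated from the EEG plus noise, so the decoder has a
# real (known) linear relationship to recover.
true_w = rng.standard_normal(X.shape[1])
y = X @ true_w + 0.1 * rng.standard_normal(X.shape[0])

# Fit the linear decoder by ordinary least squares (regularized variants
# such as ridge regression are common in practice).
w, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ w

# Agreement between the reconstructed and actual trajectory.
r = np.corrcoef(y, y_hat)[0, 1]
print(round(r, 3))
```

On real EEG the correlation between decoded and measured joint angles is far from perfect, but the same lag-then-regress structure underlies many noninvasive decoding studies.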
Contreras-Vidal notes that other technologies under development to harness brain signals for device control either require electrodes implanted directly in the brain or, if noninvasive, demand far more user training than his lab’s EEG-based brain-cap technology.
In a new, related project, Contreras-Vidal’s lab is partnering with colleagues at Rice University in Houston, the University of Michigan, and Drexel University in Philadelphia to develop a prosthetic arm that amputees can control directly with their brains and that will let them feel what they touch. The National Science Foundation’s Human-Centered Computing program has awarded the consortium a $1.2 million grant to fund the project.
The four-university team plans to incorporate technology that relays both tactile information from the prosthetic fingertips and grasping-force information from the prosthetic hand back to the wearer, via a robotic exoskeleton and touch pads that vibrate, stretch, and squeeze the skin where the prosthesis attaches to the body.
* * *