Mobile Low-Power Gesture-Recognition System Developed

AllSee prototype attached to smartphone (University of Washington)

27 February 2014. Computer scientists and engineers at the University of Washington in Seattle developed an inexpensive gesture-recognition system for mobile devices that consumes minimal power, with potential applications in robotics and “Internet-of-things” computing. The team, led by Shyam Gollakota, director of the university’s Networks and Wireless Lab, presents its work on 3 April at the USENIX Symposium on Networked Systems Design and Implementation in Seattle.

The system, known as AllSee, uses a small sensor with a receiver requiring minimal power. The sensor and receiver harness ambient wireless signals already in the environment, such as television transmissions. Hand gestures made near the sensor change the amplitude of those signals, and the AllSee system captures and recognizes these amplitude changes, translating them into commands for the host device.
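The article does not give AllSee’s actual classification rules, but the idea of mapping amplitude changes to commands can be illustrated with a minimal sketch. The function name, thresholds, gesture labels, and the rising-versus-falling interpretation below are all illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch of amplitude-change gesture classification, in the
# spirit of AllSee's approach. Thresholds and labels are assumptions.

def classify_gesture(samples, threshold=0.2):
    """Map an amplitude envelope (a list of floats) to a command.

    In this sketch a rising envelope is read as a hand moving toward
    the sensor ('zoom_in') and a falling envelope as a hand moving
    away ('zoom_out'); small changes are ignored as noise.
    """
    delta = samples[-1] - samples[0]
    if delta > threshold:
        return "zoom_in"    # amplitude rose over the window
    if delta < -threshold:
        return "zoom_out"   # amplitude fell over the window
    return None             # change too small to count as a gesture

print(classify_gesture([0.1, 0.2, 0.4]))   # rising envelope
print(classify_gesture([0.5, 0.3, 0.1]))   # falling envelope
```

A real system would of course classify richer amplitude patterns than a single endpoint difference; this only shows the overall shape of the signal-to-command mapping.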

AllSee sensors, say the researchers, use three to four times less power than current gesture-recognition systems, such as the one developed for the Samsung Galaxy S4 smartphone. Because of their higher power needs, current systems can quickly drain a smartphone’s battery. They must also be switched on manually and kept in direct line of sight of their sensors.

The AllSee system, on the other hand, consumes minimal power, tens of microwatts according to the developers, and as a result can be left on. In addition, AllSee can sense changes in wireless signal amplitude even when the attached device is sitting in a pocket or purse. Thus a user can, for example, change audio levels on the phone with a hand gesture without reaching into the pocket or purse.

Gollakota and colleagues tested a prototype AllSee system, with eight different hand gestures such as pushing or pulling to zoom in and out, on smartphones and battery-free sensors. They found the prototype could identify the gestures more than 90 percent of the time, from as far as two feet away.

The researchers report a response time of less than 80 microseconds, about 1,000 times faster than a blinking eye. They also devised a procedure using a finger-flick motion as a starting sequence to distinguish AllSee commands from ordinary hand movements.
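The starting-sequence idea can be sketched as a simple gate that ignores gestures until the designated wake gesture arrives. The class, gesture labels, and one-command-per-wake behavior below are illustrative assumptions, not the researchers’ actual protocol.

```python
# Hypothetical sketch of a wake-gesture gate, echoing AllSee's use of a
# finger flick to separate intentional commands from ordinary motion.

class GestureGate:
    def __init__(self, wake_gesture="flick"):
        self.wake_gesture = wake_gesture
        self.armed = False

    def feed(self, gesture):
        """Return the gesture as a command if the gate is armed;
        otherwise arm the gate only on the wake gesture."""
        if not self.armed:
            self.armed = (gesture == self.wake_gesture)
            return None          # the wake gesture itself issues no command
        self.armed = False       # one command per wake sequence (assumption)
        return gesture

gate = GestureGate()
print(gate.feed("push"))   # not armed: ignored
print(gate.feed("flick"))  # arms the gate
print(gate.feed("push"))   # now acted on as a command
```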

“This is the first gesture recognition system that can be implemented for less than a dollar and doesn’t require a battery,” says Gollakota in a university statement. The researchers say the technology can also be applied beyond mobile devices to robotic systems and to Internet-enabled household devices and monitors — i.e., the Internet of things — where hand gestures can control their actions and movements without a large added power drain.

The following video demonstrates the AllSee prototype.

*     *     *