
System Creates Ad Hoc Touch-Based Interfaces on Surfaces

Wall surface with touch interface created by WorldKit (chrisharrison.net)

Computer scientists at Carnegie Mellon University in Pittsburgh developed a system that can project images to control computer devices on everyday surfaces almost at will. The team of doctoral candidates Robert Xiao and Chris Harrison, with professor Scott Hudson, will discuss their WorldKit system next week at the ACM SIGCHI Conference on Human Factors in Computing Systems meeting in Paris.

Xiao, Harrison, and Hudson developed WorldKit in Carnegie Mellon’s Human-Computer Interaction Institute. The system combines a ceiling-mounted camera with sensors and a projector to record the configuration of the room, capture hand movements and gestures, and project images on desired surfaces in the room. WorldKit enables individuals to create images projected on room surfaces, on the fly and as needed, containing elements that interact with computer systems.

For example, a person can create a TV remote control on the arm of a sofa, or post a calendar on the wall. People in the room can then interact with the calendar, expanding or modifying individual events.

The researchers are building on earlier work by Harrison — who joins the Carnegie Mellon faculty this summer — that harnesses the capabilities of a depth-sensing camera, like the one in the Microsoft Kinect for Xbox, to track the user’s fingers on remote flat surfaces. This ability to project an interactive touch screen lets users control applications by tapping or dragging their fingers on those projected surfaces.
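The article does not spell out how finger contact is detected, but depth-camera touch sensing of this general kind is often done by comparing each depth frame against a captured background model of the surface: pixels sitting within a narrow height band just above the surface are treated as touches. The sketch below illustrates that idea only; the function name, array shapes, and millimeter thresholds are illustrative assumptions, not WorldKit's actual implementation.

```python
import numpy as np

def detect_touches(depth_frame, surface_depth, touch_min=3.0, touch_max=15.0):
    """Flag pixels whose depth sits just above the background surface.

    depth_frame and surface_depth are 2-D arrays of distances (in mm)
    from the sensor. A fingertip resting on the surface appears slightly
    closer to the camera than the surface itself, so its "height" above
    the surface falls inside a small band; a hovering hand falls outside.
    (Thresholds here are illustrative, not from the WorldKit paper.)
    """
    height = surface_depth - depth_frame  # positive = closer than surface
    return (height >= touch_min) & (height <= touch_max)

# Toy example: a flat surface 1000 mm away, one touching fingertip,
# one hovering hand.
surface = np.full((4, 4), 1000.0)
frame = surface.copy()
frame[1, 1] = 992.0   # 8 mm above the surface -> counts as a touch
frame[2, 2] = 960.0   # 40 mm above the surface -> hovering, not a touch
touches = detect_touches(frame, surface)
```

In this toy frame only the fingertip pixel at (1, 1) is flagged; the hovering pixel at (2, 2) is rejected because it lies above the upper threshold.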

WorldKit, however, does not require prior calibration, and can automatically adjust its sensing and image projection to the orientation of the chosen surface. Individuals can select from a menu and project control-images such as switches, message boards, and indicator lights. Xiao says the system takes advantage of new and improved hardware, noting “Depth sensors are getting better and projectors just keep getting smaller.”

The WorldKit developers aim to refine the system so that users can custom-design interfaces with hand gestures in free space, rather than selecting from a menu. In addition, higher-resolution depth cameras could enable the system to sense detailed finger gestures, and voice sensors could let it respond to spoken commands.

“People have talked about creating smart environments, where sensors, displays and computers are interwoven,” says Harrison. “With WorldKit, we say forget touchscreens and go straight to projectors, which can make the room truly interactive.”


*     *     *

