
Augmented Reality Applications Enhanced for Mobile Devices

Matthew Turk, left, and Tobias Höllerer (UC Santa Barbara)


Computer scientists at the University of California, Santa Barbara are developing augmented reality applications for mobile devices that offer more stable, realistic, and current images than those available today. The lab of computer science professors Matthew Turk and Tobias Höllerer (pictured) recently received a $300,000 grant from the U.S. Office of Naval Research to create an optimal user interface for augmented reality based on user experiences, a project conducted with Virginia Tech colleague Doug Bowman.

Augmented reality is a representation of the real world enhanced by computer-generated sensory input such as sound, video, graphics, or GPS data. Unlike virtual reality, which offers a complete environmental simulation, augmented reality adds data in context to the real-world representation, such as aiming a tablet’s camera at a baseball player facing a left-handed pitcher with runners on base in the late innings of a game, and displaying the player’s lifetime batting average in that situation.

Rapid advancements in mobile computing devices in recent years, both in smartphones and tablets, are expected to accelerate augmented reality (AR) technology. Turk notes, “The applications for mobile, real-time augmented reality can have a major impact on health, education, entertainment, and many other areas.”

Turk and Höllerer aim to improve the current state of augmented reality, making it more stable, realistic, and dynamically updated by users. Their lab, called the Four Eyes Lab (for the “four I’s” of Imaging, Interaction, and Innovative Interfaces), is conducting research on combining mobile computer vision capture with crowdsourced user data to determine whether the object displayed in the app matches the object in reality. They call it “anywhere” augmented reality.

“Our research employs real-time computer vision for more stable presentation of 3D computer graphics that appear as if they are truly part of the physical world,” says Höllerer. “A tourist at an archaeological site,” Höllerer adds, “could explore the reconstruction of an ancient temple where it once stood.”

The system being developed by the Four Eyes lab closes the sensor loop to get more accurate overlays using crowdsourced data. “So the next time I’m standing in front of a restaurant using an AR app, and the façade has recently changed,” Höllerer explains, “I can update that information just by virtue of looking at it. The use of the app improves the experience of the next user.” Achieving that state of anywhere augmented reality requires a more sophisticated user interface, which is the goal of the Office of Naval Research grant.
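The crowdsourced update loop Höllerer describes can be illustrated with a minimal sketch. This is not the Four Eyes Lab's actual system; all names and the string stand-in for image features are hypothetical. The idea shown is only that a user's live observation replaces a stale stored scene record, so the next user's overlay reflects the current appearance.

```python
from dataclasses import dataclass

# Hypothetical sketch of closing the sensor loop with crowdsourced data.
# A shared store keeps a scene descriptor per location; when a user's
# camera view disagrees with the stored descriptor (e.g., a renovated
# restaurant facade), the observation replaces it for subsequent users.

@dataclass
class SceneRecord:
    descriptor: str          # stand-in for image features of the scene
    version: int = 0

class CrowdsourcedMap:
    def __init__(self):
        self._records: dict[str, SceneRecord] = {}

    def lookup(self, place_id: str):
        return self._records.get(place_id)

    def report_observation(self, place_id: str, observed: str) -> bool:
        """Update the shared record if the live view no longer matches."""
        record = self._records.get(place_id)
        if record is None or record.descriptor != observed:
            version = record.version + 1 if record else 1
            self._records[place_id] = SceneRecord(observed, version)
            return True   # map updated for the next user
        return False      # stored model still matches reality

# First user registers the facade; a later user sees it has changed.
shared = CrowdsourcedMap()
shared.report_observation("restaurant-42", "old facade")
updated = shared.report_observation("restaurant-42", "new facade")
```

The point of the design is that updating and consuming the map are the same action: simply looking at the scene through the app both corrects the shared model and benefits from earlier corrections.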

Another recent award, this one from the National Science Foundation, supports research on computer vision-based tracking and augmented reality to enhance remote collaboration in physical spaces. The three-year, $500,000 project allows users in different locations to see and annotate data within a target scene, extending two-dimensional tracking to a real-time three-dimensional scenario. A camera tracking and mapping system developed in the lab received the best paper prize last month at an augmented-reality industry conference.

“You can point to and annotate an object in a target environment through your screen,” says Turk, “and the annotation will ‘stick’ to the object even when the camera moves, and it will be visible to all users.” Their research has led to a prototype, which tests show allows users, instructed by a remote expert, to control a mock-up airplane cockpit using just a visual camera feed.
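The "sticking" behavior Turk describes can be sketched with a standard pinhole-camera projection; this is an illustrative toy, not the lab's implementation. The annotation is stored as a 3D point in the tracked scene's world frame and re-projected for each new camera pose, so its 2D label follows the object as the camera moves, and any user sharing the world frame sees it anchored to the same spot. The focal length and principal point below are arbitrary example values.

```python
import numpy as np

def project(point_w, R, t, f=500.0, cx=320.0, cy=240.0):
    """Project a 3D world point into pixel coordinates for pose (R, t)."""
    p_cam = R @ point_w + t            # world frame -> camera frame
    x, y, z = p_cam
    return np.array([f * x / z + cx, f * y / z + cy])

# Annotation anchored to a 3D point in the scene (say, a cockpit switch).
anchor = np.array([0.0, 0.0, 5.0])

# Camera pose 1: looking straight at the anchor.
R1, t1 = np.eye(3), np.zeros(3)
# Camera pose 2: the same scene viewed after a sideways camera move.
R2, t2 = np.eye(3), np.array([0.5, 0.0, 0.0])

px1 = project(anchor, R1, t1)   # annotation's screen position for pose 1
px2 = project(anchor, R2, t2)   # shifted on screen, still on the object
```

Because the annotation lives in the scene's coordinate frame rather than in screen space, camera tracking alone is enough to keep it pinned to the object for every viewer.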

The following video describes and demonstrates the augmented reality work at UC Santa Barbara.


*     *     *
