
Team Performance Explored in Multi-Sensory Work Sites

Inside Walgreens regional distribution center, L – R: Walgreens manager Joe Wendover, Clemson engineering professor Sara Riggs, Walgreens staff member Daulten Stewart, and ClemsonLIFE student Dalton Cron (Craig Mahaffey, Clemson University)

29 March 2018. A research project underway at Clemson University in South Carolina examines how teams of workers adapt and perform in new industrial environments that demand attention to visual information as well as cues from sound and touch. The study is led by industrial engineering professor Sara Riggs and funded by a five-year, $83,600 grant from the National Science Foundation.

Riggs’s lab investigates cognitive ergonomics and systems engineering, particularly interactions between workers and the increasingly complex environments that arise as new technologies enter the workplace. Among the lab’s projects are studies of multimodal displays, which present information to workers across multiple sensory channels, such as sight, sound, and touch, and adaptive displays, which let operators adjust the mix or pace of information presented to fit individual preferences or the needs of the job.

The new research explores multimodal and adaptive information displays in the workplace, but as experienced by teams rather than individuals. In this project, Riggs and colleagues plan to develop statistical models linking individual and team situational awareness, with a work group using unmanned drones for search-and-rescue operations as a test case. The researchers plan to capture eye-gaze and team dynamics data that indicate situational awareness and workloads in real time, then derive algorithms from these data, leading to guidelines for getting the right information to the right team members at the right time.
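
The project’s algorithms are not yet published, but the underlying idea of sending an alert to the least-loaded team member over an unsaturated sensory channel can be illustrated in a few lines. The Python sketch below is purely hypothetical: the MemberState fields, the workload scores, the threshold, and the visual-to-audio-to-tactile fallback order are all assumptions, not the Clemson team’s design.

```python
from dataclasses import dataclass

@dataclass
class MemberState:
    """Real-time state estimate for one team member (illustrative fields)."""
    name: str
    visual_workload: float  # 0.0-1.0, e.g., inferred from eye-gaze fixation data
    audio_workload: float   # 0.0-1.0, e.g., from concurrent radio chatter

def route_alert(team, threshold=0.7):
    """Send an alert to the least-loaded member over an unsaturated channel."""
    # Choose the member with the lowest combined sensory load.
    target = min(team, key=lambda m: m.visual_workload + m.audio_workload)
    # Prefer the visual channel unless it is already saturated.
    if target.visual_workload < threshold:
        channel = "visual"
    elif target.audio_workload < threshold:
        channel = "audio"
    else:
        channel = "tactile"  # touch as the backup when eyes and ears are busy
    return target.name, channel

# Example: two operators flying search-and-rescue drones.
team = [MemberState("pilot", 0.9, 0.3), MemberState("spotter", 0.4, 0.5)]
print(route_alert(team))  # ('spotter', 'visual')
```

In practice the study aims to learn such rules from data rather than hard-code them, which is why the researchers are collecting eye-gaze and team dynamics measurements first.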

In addition, the researchers plan to assemble an initial set of multimodal interfaces that combine visual, audio, and tactile information channels. These interfaces will be evaluated against the statistical models to find the best mix for presenting the information workers need, while minimizing interference with the team’s visual workload.

An immediate benefit of this research is the design of information flows for workers with disabilities, who can take advantage of multimodal displays and use adaptive displays to adjust workplace settings to their individual needs. To that end, Riggs is working with a nearby regional distribution center for the drugstore chain Walgreens, as well as ClemsonLIFE, a program that helps people with intellectual disabilities adapt to the university experience.

In a university statement, Riggs gives the example of a flashing red light that indicates a box is ready for packing. “We might find that if you’re color blind, that red may not be the best color for the blinking light because you might miss that,” she notes. “There may be other ways to indicate that, either adding a sound or maybe changing the color. We’ll be developing the predictive algorithms that would adapt the displays for the user instead of having one solution that fits all.”
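
Riggs’s example suggests a simple rule-based picture of how such adaptation could work. The short sketch below assumes a per-user profile and a cue dictionary that are illustrative stand-ins, not the project’s actual predictive model.

```python
def adapt_cue(profile, default_cue):
    """Adapt a 'box ready for packing' cue to one worker's sensory profile.

    The profile keys and cue fields are illustrative assumptions.
    """
    cue = dict(default_cue)
    # A blinking red light may be missed by a color-blind worker, so
    # shift the color and add a redundant audio channel.
    if profile.get("color_blind"):
        cue["color"] = "blue"
        cue["sound"] = "chime"
    # A worker who is hard of hearing gets a tactile backup instead.
    if profile.get("hard_of_hearing"):
        cue.pop("sound", None)
        cue["vibration"] = "wristband pulse"
    return cue

default_cue = {"signal": "blinking light", "color": "red"}
print(adapt_cue({"color_blind": True}, default_cue))
# {'signal': 'blinking light', 'color': 'blue', 'sound': 'chime'}
```

The predictive algorithms Riggs describes would, in effect, learn rules like these per user rather than rely on a single one-size-fits-all display.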

ClemsonLIFE students train for future employment at the Walgreens distribution center, which plans to adopt the findings from Riggs’s study. Joe Wendover, field inclusion manager for Walgreens, says the results will help more than just people with disabilities. “You put it in place for someone with a disability,” he adds, “but it could really help everybody.”

*     *     *
