8 July 2014. A psychology research group at Brown University in Providence is using virtual reality to detect and document how individual pedestrians move when they take part in crowds. The system, developed by Brown’s Virtual Environment Navigation lab, was described last week by its director William Warren in a keynote address at a meeting of the International Society for Posture and Gait Research in Vancouver, Canada.
Warren and colleagues designed the wireless system to study how people behave as pedestrians when they encounter others in close proximity. These everyday encounters, notes Warren, involve people coordinating their movements with one another, whether they realize it or not.
“Crowds seem to behave in predictable ways,” says Warren in a university statement, “but the environment is not always built to accommodate people’s behavior.” He and colleagues in the Virtual Environment Navigation lab aim to develop a computer model of this human swarming behavior, which Warren says could be applied to urban planning, architecture, and planning for evacuations. It may also have use for visually impaired pedestrians on city streets.
The researchers adapted off-the-shelf virtual reality headsets made by Oculus VR in Irvine, California, and created a 168-square-meter space in which up to four people can interact in a controlled environment. Movement in the space is tracked with ultrasound beacons and sensors that record location, as well as microphones and video cameras. Accelerometers, like those found in mobile phones, capture finer movements, such as head position and the direction a subject is looking.
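The article does not describe how the lab processes these sensor streams, but the basic idea of reading head orientation from an accelerometer can be illustrated with a standard calculation: when the wearer is roughly still, the measured acceleration is dominated by gravity, and its direction gives head pitch and roll. The function name and axis convention below are assumptions for this sketch, not the lab’s actual pipeline.

```python
import math

def head_orientation_from_accel(ax, ay, az):
    """Estimate head pitch and roll (radians) from one accelerometer sample,
    assuming the only acceleration acting on the sensor is gravity.

    Axis convention (assumed for this sketch): x forward, y left, z up when
    the headset is level. A real system would also fuse gyroscope and
    position data to reject motion artifacts while subjects walk.
    """
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))  # nose up/down
    roll = math.atan2(ay, az)                              # tilt left/right
    return pitch, roll

# Example: headset level and at rest, gravity reads ~9.81 m/s^2 on the z axis.
print(head_orientation_from_accel(0.0, 0.0, 9.81))  # -> (0.0, 0.0)
```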
The lab’s original equipment required hard-wired cables connecting the headsets, as well as heavy control systems that subjects had to wear in backpacks. Going wireless made it possible for subjects to act more naturally. The university has filed for patents on the wireless system.
The lab uses the system to test scenarios in which individuals interact with one another in common crowd situations. At the Vancouver meeting, Warren described settings where people cross paths, as in a railway station like Grand Central in New York. Another scenario tested by the lab had two groups of people walking in opposite directions pass through each other.
“We’re finding that when people walk together they tend to match the speed and direction of their neighbors,” says Warren. “That intrinsically leads to the emergence of collective motion.” He adds that the lab is now using virtual reality to map a “coupling field,” describing how strongly individuals couple to neighbors at different distances and positions.
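Warren’s description corresponds to a simple alignment rule: each walker adjusts its speed and heading toward a weighted average of the neighbors inside its coupling field, with influence that falls off with distance. The sketch below is a toy illustration of that idea under assumed parameter values (neighborhood radius, decay constant, gain); it is not the lab’s published model.

```python
import numpy as np

def step_crowd(pos, vel, dt=0.1, radius=5.0, decay=1.3, gain=1.0):
    """One update of a toy alignment model: each walker nudges its velocity
    toward a distance-weighted average of its neighbors' velocities.

    pos, vel: (N, 2) arrays of walker positions and velocities.
    radius, decay, gain: assumed parameters, not values from the lab.
    """
    n = len(pos)
    new_vel = vel.copy()
    for i in range(n):
        d = np.linalg.norm(pos - pos[i], axis=1)      # distances to all walkers
        mask = (d > 0) & (d < radius)                  # neighbors in the coupling field
        if not mask.any():
            continue
        w = np.exp(-d[mask] / decay)                   # influence decays with distance
        avg = (w[:, None] * vel[mask]).sum(axis=0) / w.sum()
        new_vel[i] += gain * (avg - vel[i]) * dt       # match neighbors' speed and heading
    return pos + new_vel * dt, new_vel

# Tiny demo: four walkers with slightly different headings gradually align.
rng = np.random.default_rng(0)
pos = rng.uniform(0, 3, size=(4, 2))
vel = np.array([[1.0, 0.1], [1.0, -0.1], [0.9, 0.05], [1.1, 0.0]])
for _ in range(50):
    pos, vel = step_crowd(pos, vel)
print(np.round(vel, 3))  # velocities converge toward a common direction
```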
In the future, the team aims to make the system even more virtual, so subjects can take part in these group experiments without occupying the same physical space. This advance will require headsets that can track their own precise locations, without the external sensors now used in the lab.
Warren and lab colleagues demonstrate the current system in the following video.