
Virtual Reality System Developed to Track Crowd Moves

Grand Central Terminal (Sracer357/Wikimedia Commons)

8 July 2014. A psychology research group at Brown University in Providence is using virtual reality to detect and document the movement patterns of individuals taking part in crowds. The system, developed by Brown’s Virtual Environment Navigation lab, was described last week by its director, William Warren, in a keynote address at a meeting of the International Society for Gait and Posture Research in Vancouver, Canada.

Warren and colleagues designed the wireless system to study how pedestrians behave when they encounter other people in close proximity. These everyday experiences, notes Warren, involve people coordinating their movements with one another, whether they know it or not.

“Crowds seem to behave in predictable ways,” says Warren in a university statement, “but the environment is not always built to accommodate people’s behavior.” He and colleagues in the Virtual Environment Navigation lab aim to develop a computer model of this human swarming behavior, which Warren says could be applied to urban planning, architecture, and planning for evacuations. It may also help visually impaired pedestrians navigate city streets.

The researchers adapted off-the-shelf virtual reality headsets made by Oculus VR in Irvine, California, and created a 168-square-meter space where up to four people can interact in a controlled environment. Movement in the space is measured with ultrasound beacons and sensors that track location, as well as microphones and video cameras. Accelerometers, like those found in mobile phones, track fine movements, such as head positions and the direction subjects are looking.
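The article does not describe the lab's software, but the general idea behind beacon-based location tracking can be illustrated with trilateration: given distances from a subject to three fixed beacons, the position follows from simple geometry. The beacon layout and closed-form solution below are assumptions for the example, not details from the Brown system.

```python
import math

def trilaterate(beacons, dists):
    """Estimate a 2-D position from distances to three fixed beacons.

    Subtracting the first beacon's circle equation from the other two
    yields a 2x2 linear system, solved here by Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = dists
    # Linearized system: A @ [x, y] = b
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y

# Hypothetical beacons at three corners of a 13 m x 13 m space (~168 m^2)
beacons = [(0.0, 0.0), (13.0, 0.0), (0.0, 13.0)]
true_pos = (4.0, 7.0)
dists = [math.dist(true_pos, b) for b in beacons]
print(trilaterate(beacons, dists))  # ≈ (4.0, 7.0)
```

A real tracking system would fuse many noisy distance readings over time (for instance with a filter) rather than solve a single exact system, but the geometric core is the same.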

The lab’s original equipment required hard-wired cables connecting the headsets, as well as heavy control systems that subjects had to wear in backpacks. Going wireless made it possible for subjects to act more naturally. The university filed for patents on the wireless system.

The lab uses the system to test different scenarios where autonomous individuals interact with each other in common crowd situations. At the Vancouver meeting, Warren described circumstances where people literally cross paths in a railway station like Grand Central in New York. Another scenario tested by the lab had two groups of people passing through each other in opposite directions.

“We’re finding that when people walk together they tend to match the speed and direction of their neighbors,” says Warren. “That intrinsically leads to the emergence of collective motion.” He adds that the lab is now mapping a “coupling field” with virtual reality, where individuals connect with neighbors at different distances and positions.
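Warren's observation — walkers matching the speed and direction of nearby neighbors — is the alignment rule familiar from flocking models. The sketch below illustrates that rule only; it is not the lab's model, and the `radius` and `rate` parameters are assumptions for the example.

```python
import math

def alignment_step(agents, radius=2.0, rate=0.3):
    """One simulation step: each walker (x, y, vx, vy) nudges its
    velocity toward the mean velocity of neighbors within `radius`."""
    updated = []
    for i, (px, py, vx, vy) in enumerate(agents):
        sx, sy, count = 0.0, 0.0, 0
        for j, (qx, qy, wx, wy) in enumerate(agents):
            if j != i and math.hypot(qx - px, qy - py) <= radius:
                sx += wx
                sy += wy
                count += 1
        if count:  # steer partway toward the neighborhood average
            vx += rate * (sx / count - vx)
            vy += rate * (sy / count - vy)
        updated.append((px + vx, py + vy, vx, vy))
    return updated

# Two nearby walkers with slightly different headings gradually align
agents = [(0.0, 0.0, 1.0, 0.0), (0.0, 1.0, 1.0, 0.2)]
for _ in range(10):
    agents = alignment_step(agents)
print(agents[0][3], agents[1][3])  # vertical velocities nearly equal
```

Iterating this local rule over many agents produces the kind of collective motion Warren describes, without any agent following a global plan.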

In the future, the team aims to make the system even more virtual, so subjects can take part in these group experiments without occupying the same physical space. This advance will require headsets that can track precise locations on their own, without the external sensors now used in the lab.

Warren and lab colleagues demonstrate the current system in the following video.

*     *     *
