ETH Zurich uses WorldViz Projection VR System to study how urban design influences stress and navigation

client: ETH Zurich, Switzerland

research field: Applied Cognitive Science

equipment used: Vizard VR Software Toolkit, collapsible three-wall motion-tracked VizMove Projection VR System, Arrington Research Eye Tracker, Wireless PPT Wand, 6 PPT Motion Tracking cameras, BIOPAC skin-conductance sensors.

Researchers in the Behavioral Sciences division of the Department of Humanities, Social and Political Sciences at ETH Zurich, led by Professor Christoph Hoelscher, use their three-wall 3D Projection VR system to study 1) how humans interact with urban environments and 2) the relationship between these environments, stress, and navigability. The VizMove Projection VR system is fitted with integrated eye-tracking and biometric sensors and is powered by the Vizard Virtual Reality software toolkit. Vizard lets the scientists at ETH Zurich connect these devices without hassle so that they can focus on experiment design and data collection. In this case, the ETH research team can gather data on how the design of cities influences stress levels and disorientation during urban navigation without worrying about the experiment’s technological logistics.

WorldViz equipped us with a zero footprint projection system that fit my research team’s specific needs. The Vizard Virtual Reality software provided us with powerful tools to create interactive, immersive virtual environments and helped us connect them with our eye-trackers and biometric data capture devices right away.

– Prof. Hoelscher
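
To give a concrete sense of what that integration can look like, below is a minimal Vizard sketch in Python, Vizard's scripting language. It assumes a hardware layout has already been defined with WorldViz's vizconnect tool; the configuration file and model name are placeholders, not ETH Zurich's actual assets.

    # Minimal Vizard sketch: start the displays, trackers, and input devices
    # defined in a vizconnect hardware configuration, then load an environment.
    # 'vizconnect_config.py' and 'city_model.osgb' are placeholder names.
    import viz
    import vizconnect

    # Initializes the projection walls, PPT tracking, and input devices
    # described in the vizconnect configuration file.
    vizconnect.go('vizconnect_config.py')

    # Load the virtual urban environment used for the navigation experiments.
    city = viz.addChild('city_model.osgb')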

RESEARCH PROJECT EXAMPLES

Wayfinding and Stress

To better inform urban and architectural design, researchers at ETH Zurich are investigating the interaction between environmental cues, stress, and the (successful) navigation of an environment. The researchers used Vizard to display a complex virtual urban environment through which subjects navigate, while custom-written Python scripts recorded subject data from the eye-tracking and biometric systems and matched it with the subject’s virtual position and orientation. When and where subjects get lost is logged by recording each time a participant calls on a virtual arrow that points to the goal location, which can be invoked at any time by pressing the trigger button on a standard joystick. The researchers can then visualize where participants are stressed and in what direction they are looking, revealing measurable consistencies between the nature of the environment and subject disorientation and stress. The multi-sensor subject data was processed in real time by the custom Python scripts appended to Vizard, and once the experiments had been run, the data was exported to Matlab for visualization and analysis.
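
As an illustration of what such logging scripts might look like, here is a hedged Python/Vizard sketch. The file name, sampling rate, and button index are assumptions for illustration, not the ETH team’s actual code.

    # Hypothetical sketch of per-frame pose logging and a joystick-triggered
    # help request, loosely following the setup described above.
    import viz
    import vizact
    import vizjoy

    viz.go()
    joystick = vizjoy.add()

    log = open('subject_log.csv', 'w')
    log.write('time,x,y,z,yaw,pitch,roll,help_requested\n')

    help_requested = False

    def record_sample():
        # Write one timestamped sample of the viewpoint pose and help flag.
        global help_requested
        x, y, z = viz.MainView.getPosition()
        yaw, pitch, roll = viz.MainView.getEuler()
        log.write('%f,%f,%f,%f,%f,%f,%f,%d\n' %
                  (viz.tick(), x, y, z, yaw, pitch, roll, int(help_requested)))
        help_requested = False

    def on_button(e):
        # The trigger (assumed to be button 1) flags a help request, which the
        # experiment script would answer by showing the goal-pointing arrow.
        global help_requested
        if e.button == 1:
            help_requested = True

    vizact.ontimer(1.0 / 60, record_sample)           # ~60 Hz sampling (assumed)
    viz.callback(vizjoy.BUTTONDOWN_EVENT, on_button)

In a full study, these pose samples would additionally be aligned with the eye-tracking and skin-conductance streams before export to Matlab.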

Response-Learning and Spatial-Learning

In a related set of experiments, ETH researchers are investigating response-learning and spatial-learning by immersing subjects in the Projection VR system and having them complete basic navigation tasks. Participants are given two sets of tasks, designated as “response-learning” (navigating the virtual environment by following arrows) and “spatial-learning” (navigating using an overhead map of the world). The researchers are currently coupling the participants’ performance data with MRI scans of their brains to investigate the relationship between anatomical structures and performance on the two tasks. They expect to find relationships between task performance and the volume and shape of both the hippocampus and the caudate nucleus.
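
A minimal sketch of how the two conditions might be set up in Vizard follows; the model names, waypoint positions, and condition flag are illustrative assumptions, not the actual experiment code.

    # Hypothetical sketch of the two task conditions; model names, waypoint
    # positions, and the map overlay are assumptions for illustration only.
    import viz

    viz.go()
    env = viz.addChild('city_model.osgb')        # placeholder environment

    CONDITION = 'response'                       # 'response' or 'spatial'

    if CONDITION == 'response':
        # Response-learning: directional arrows placed along the route.
        for pos in [(0, 0.1, 5), (5, 0.1, 10)]:  # assumed waypoint positions
            arrow = viz.addChild('arrow.osgb')   # placeholder arrow model
            arrow.setPosition(pos)
    else:
        # Spatial-learning: an overhead map of the world shown on screen.
        map_quad = viz.addTexQuad(parent=viz.SCREEN, pos=[0.85, 0.85, 0])
        map_quad.texture(viz.addTexture('overhead_map.png'))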
