How To Enhance VR Research with SightLab VR Pro's Searchable Example Scripts

March 20, 2025

A Streamlined Approach to VR Experimentation

SightLab VR Pro for Vizard simplifies the creation of academic VR research experiments by offering both a user-friendly GUI and Python scripting options. With a growing library of over 80 example scripts and templates, researchers can efficiently set up and customize their VR experiments.

The SightLab Example Scripts Library provides a searchable collection of pre-configured experiment templates that integrate seamlessly with SightLab’s built-in features, including eye-tracking, physiological data collection, and real-time visualization.

Watch this video to see some of these example scripts in action:
https://www.youtube.com/watch?v=f-riylZvAb0 

Notable Experiment Templates

Visual Search Tasks

A suite of five tasks designed to analyze eye movements, decision-making, and search efficiency. Key features include:

  • Measuring fixations, saccades, dwell time, and pupil diameter.
  • Visualizing scan paths, heatmaps, and interactions.
  • No coding required for setup.
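To give a rough sense of the kind of metrics these templates report, here is a minimal, engine-agnostic Python sketch. It is not SightLab's actual API (SightLab computes these measures internally); it only illustrates how per-object dwell time can be derived from timestamped gaze-hit samples:

```python
# Hypothetical sketch -- NOT SightLab's API. Illustrates deriving per-object
# dwell time from a stream of (timestamp, gazed-object) samples.

def dwell_times(samples):
    """samples: list of (timestamp_sec, object_name_or_None) gaze hits,
    in chronological order. Returns total dwell time (seconds) per object."""
    totals = {}
    # Each sample "lasts" until the next sample's timestamp.
    for (t0, obj), (t1, _) in zip(samples, samples[1:]):
        if obj is not None:
            totals[obj] = totals.get(obj, 0.0) + (t1 - t0)
    return totals

# Example: six gaze samples at 100 ms intervals.
gaze = [(0.00, "target"), (0.10, "target"), (0.20, None),
        (0.30, "distractor"), (0.40, "target"), (0.50, "target")]
print(dwell_times(gaze))
```

Fixation and saccade classification in real eye-tracking pipelines is more involved (dispersion or velocity thresholds over raw gaze vectors), but the dwell-time accumulation above is the basic idea behind the per-object statistics these templates export.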

AI Intelligent Agents & Avatars

Integrate interactive, AI-driven avatars that respond to user input and can be customized through speech recognition and large language models such as GPT-4, Gemini, and Claude.

Driving Simulators

Set up immersive driving experiments using 3D models or 360° media while tracking gaze and physiological responses.

Memory & Reaction Time Tasks

SightLab includes templates for memory-based paradigms and reaction time experiments, allowing researchers to analyze cognitive performance under controlled VR conditions.
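As an illustration of the kind of summary such reaction-time paradigms produce, here is a small Python sketch. The trial format and field names are assumptions for the example, not SightLab's logging schema:

```python
# Hypothetical analysis sketch -- not SightLab's API or log format.
# Summarizes reaction time and accuracy per condition from logged trials.
from statistics import mean

def summarize(trials):
    """trials: list of dicts with 'condition', 'rt_ms', 'correct' keys.
    Returns {condition: (mean_rt_ms_correct_only, accuracy)}."""
    out = {}
    for cond in {t["condition"] for t in trials}:
        group = [t for t in trials if t["condition"] == cond]
        correct = [t for t in group if t["correct"]]
        rt = mean(t["rt_ms"] for t in correct) if correct else None
        out[cond] = (rt, len(correct) / len(group))
    return out

# Example: a recognition-memory task with "old" and "new" items.
trials = [
    {"condition": "old", "rt_ms": 640, "correct": True},
    {"condition": "old", "rt_ms": 700, "correct": True},
    {"condition": "new", "rt_ms": 910, "correct": False},
    {"condition": "new", "rt_ms": 820, "correct": True},
]
print(summarize(trials))
```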

Augmented & Mixed Reality

Use passthrough AR to overlay digital objects onto the real world, track real-world object interactions, and measure eye movements in mixed reality setups.

Biofeedback & Physiological Measurement

Synchronize VR interactions with real-time physiological data from BIOPAC AcqKnowledge, allowing biometric responses to control elements in the scene.
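As a toy illustration of the idea of a biometric signal driving a scene element (not a real BIOPAC or SightLab call), a linear mapping from heart rate to an object's scale factor might look like this; the parameter names and ranges are assumptions for the example:

```python
# Hypothetical mapping sketch -- not BIOPAC's or SightLab's API.
# Linearly maps a heart-rate reading (bpm) to an object scale factor.
def hr_to_scale(hr_bpm, hr_min=50.0, hr_max=120.0,
                scale_min=1.0, scale_max=2.0):
    """Clamp heart rate to [hr_min, hr_max], then map linearly
    to [scale_min, scale_max]."""
    frac = (hr_bpm - hr_min) / (hr_max - hr_min)
    frac = max(0.0, min(1.0, frac))  # clamp out-of-range readings
    return scale_min + frac * (scale_max - scale_min)

print(hr_to_scale(85))  # mid-range heart rate -> mid-range scale
```

In an actual experiment the returned value would be applied each frame to a scene object, so that, for example, a stimulus visibly grows as arousal increases.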

External Application Data Recorder

This tool allows researchers to record and synchronize sensor data while running external VR applications, including SteamVR games, Oculus apps, and web-based VR experiences. It supports multiple headsets, including Vive Focus Vision, Meta Quest, Varjo, and OpenXR-based devices. Sessions can be replayed with synchronized gaze tracking, and an experimental AI mode analyzes view counts.

Eye Tracker Calibration & Tests

Includes spatial accuracy tests, smooth pursuit tasks, and vestibulo-ocular reflex (VOR) evaluations for validating eye-tracking data.

Gaze-Based Interactions & Virtual Screens

  • Trigger interactive events based on gaze detection.
  • Present controlled 2D stimuli with customizable screen sizes.
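The dwell-based triggering described above can be sketched in a few lines of engine-agnostic Python. This is a simplified illustration, not SightLab's implementation: it fires an event once the angle between the gaze ray and a target direction stays under a threshold for a minimum dwell duration:

```python
# Hypothetical gaze-trigger sketch -- not SightLab's API.
import math

def angle_deg(v1, v2):
    """Angle in degrees between two 3D direction vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

def gaze_trigger(samples, target_dir, threshold_deg=2.0, min_dwell=0.5):
    """samples: chronological list of (timestamp_sec, gaze_direction).
    Returns the timestamp at which the trigger fires, or None."""
    dwell_start = None
    for t, gaze_dir in samples:
        if angle_deg(gaze_dir, target_dir) <= threshold_deg:
            if dwell_start is None:
                dwell_start = t
            if t - dwell_start >= min_dwell:
                return t  # event fires here (e.g. highlight the object)
        else:
            dwell_start = None  # gaze left the target; reset the dwell timer
    return None
```

In practice an engine like Vizard supplies the gaze ray each frame and intersection tests replace the raw angle check, but the dwell-and-threshold logic is the same.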

Phobia Exposure

Test fear responses in controlled VR environments, capturing gaze data and sending real-time signals to physiological recording systems.

And much more; see the library for the full list.

Why Use SightLab’s Example Scripts?

  • Ready-to-use templates minimize setup time.
  • Seamless integration with gaze tracking, EEG, and biometric data.
  • Multi-user support for collaborative VR research.
  • No coding required for basic customization, with Python scripting available for advanced modifications.

For more details and to explore the full list of available experiments, visit the SightLab Example Scripts Library.

Many more examples are included, and you can also build your own custom VR experiment from scratch in SightLab VR Pro with little or even no code.

For more information or to inquire about getting a demo, click here or contact sales@worldviz.com.
