SightLab VR Pro for Vizard simplifies the creation of academic VR research experiments by offering both a user-friendly GUI and Python scripting options. With a growing library of over 80 example scripts and templates, researchers can efficiently set up and customize their VR experiments.
The SightLab Example Scripts Library provides a searchable collection of pre-configured experiment templates that integrate seamlessly with SightLab’s built-in features, including eye-tracking, physiological data collection, and real-time visualization.
Watch this video to see some of these example scripts in action:
https://www.youtube.com/watch?v=f-riylZvAb0
A suite of five tasks designed to analyze eye movements, decision-making, and search efficiency.
Integrate interactive AI-driven avatars that respond to user input, customizable through speech recognition and large language models such as GPT-4, Gemini, and Claude.
Set up immersive driving experiments using 3D models or 360° media while tracking gaze and physiological responses.
SightLab includes templates for memory-based paradigms and reaction time experiments, allowing researchers to analyze cognitive performance under controlled VR conditions.
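The core bookkeeping behind a reaction-time trial is simple: record the stimulus onset time, record the response time, and store the difference. The sketch below illustrates that pattern in plain Python; the class name and structure are illustrative only and are not part of SightLab's API.

```python
import time


class ReactionTimeLogger:
    """Minimal reaction-time bookkeeping for one participant.

    Illustrative sketch only -- not SightLab's API. A clock callable is
    injected so the logger can be driven by a VR frame timer or a test.
    """

    def __init__(self, clock=time.monotonic):
        self.clock = clock
        self.trials = []       # list of reaction times in seconds
        self._onset = None     # timestamp of the current stimulus, if any

    def stimulus_onset(self):
        """Mark the moment a stimulus appears."""
        self._onset = self.clock()

    def response(self):
        """Mark the participant's response; return the reaction time."""
        if self._onset is None:
            raise RuntimeError("response recorded before stimulus onset")
        rt = self.clock() - self._onset
        self.trials.append(rt)
        self._onset = None
        return rt
```

In a real experiment the per-trial times would be written out alongside gaze and condition data for later analysis.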
Use passthrough AR to overlay digital objects onto the real world, track real-world object interactions, and measure eye movements in mixed reality setups.
Synchronize VR interactions with real-time physiological data from BIOPAC AcqKnowledge, allowing biometric responses to control elements in the scene.
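Driving a scene element from a biometric signal usually means normalizing the incoming sample into a 0-to-1 control value. The helper below is a generic illustration of that mapping, assuming nothing about SightLab or AcqKnowledge's actual interfaces; the function name and the heart-rate range are hypothetical.

```python
def map_biometric(value, lo, hi):
    """Clamp a raw biometric sample into [0, 1] relative to [lo, hi].

    Illustrative only: the resulting 0-1 value could then drive a scene
    parameter (fog density, avatar behavior, audio volume, etc.).
    """
    t = (value - lo) / (hi - lo)
    return max(0.0, min(1.0, t))


# Hypothetical example: heart rate of 70 bpm on a 60-100 bpm range
control = map_biometric(70, lo=60, hi=100)  # 0.25
```

Samples outside the calibrated range clamp to 0 or 1, so a momentary sensor spike cannot push the scene parameter out of bounds.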
This tool allows researchers to record and synchronize sensor data while running external VR applications, including SteamVR games, Oculus apps, and web-based VR experiences. It supports multiple headsets, including Vive Focus Vision, Meta Quest, Varjo, and OpenXR-based devices. Sessions can be replayed with synchronized gaze tracking, and an experimental AI mode analyzes view counts.
Includes spatial accuracy tests, smooth pursuit tasks, and vestibulo-ocular reflex (VOR) evaluations for validating eye-tracking data.
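A spatial accuracy test typically reduces to one measurement: the angle between the recorded gaze direction and the direction to a known target. The function below computes that angular error from two 3D direction vectors; it is a standalone illustration, not SightLab's validation code.

```python
import math


def angular_error_deg(gaze, target):
    """Angle in degrees between two 3D direction vectors.

    Illustrative only: in an accuracy test, `gaze` is the measured gaze
    ray and `target` is the direction from the eye to a fixation point.
    """
    dot = sum(g * t for g, t in zip(gaze, target))
    ng = math.sqrt(sum(g * g for g in gaze))
    nt = math.sqrt(sum(t * t for t in target))
    # Clamp to guard against floating-point drift outside [-1, 1].
    cos_angle = max(-1.0, min(1.0, dot / (ng * nt)))
    return math.degrees(math.acos(cos_angle))
```

Averaging this error over a grid of fixation targets gives the per-participant accuracy figure commonly reported for eye-tracking validation.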
Test fear responses in controlled VR environments, capturing gaze data and sending real-time signals to physiological recording systems.
And much more. For the full list of available experiments, visit the SightLab Example Scripts Library.
You can also create your own custom VR experiment from scratch with SightLab VR Pro, using little or even no code.
For more information, or to inquire about a demo, contact sales@worldviz.com.