Academic Research Experiment Templates and Examples in SightLab VR Pro for Vizard

July 2, 2024

SightLab VR Pro for Vizard streamlines the creation of academic VR research experiments, offering an intuitive GUI and Python scripting. With over 100 examples and templates available, this article highlights standout templates to enhance your VR research. Watch this VR experiment sample video first!

SightLab VR Pro is a versatile academic VR research experiment generator tool for Vizard, designed to streamline the creation of virtual reality (VR) experiments. It offers users the choice of an intuitive graphical user interface (GUI) or Python coding to set up their experiments with ease. With over 100 examples and templates now available, SightLab VR Pro provides a comprehensive library that can be leveraged to enhance and build upon your VR experiments. This article showcases some of the standout examples and templates, illustrating how you can utilize them to expand the functionality and capabilities of your VR research.

All of these academic VR research experiment templates let you leverage the built-in functionality of SightLab VR Pro, which includes:

  • Advanced sensor data measurement
  • Interactive session replays for visualization
  • Raw data export
  • Integration into larger VR experiments
  • Multi-user support on most of the examples

See here for a full list of experiment templates and examples.

Visual Search Templates

This suite includes five distinct Visual Search tasks. Each task can be run as-is or customized to suit your needs. Key features include:

  • Integration with SightLab VR Pro's core functionality
  • No requirement for coding when customizing
  • STIM file reading capability
  • Experiment data file creation
  • Advanced visualization options:
    • Measure fixations, saccades (amplitude, velocity, peak velocity), dwell time on objects, time to find target, eye position, and pupil diameter
    • Visualize scan paths, heatmaps, fixation spheres, interactions, and much more

More information here.


Task 1: Customizable Visual Search


In this task, participants are instructed to locate a specified target object amidst a series of objects. Upon successfully finding the target, an auditory cue is triggered and the object is highlighted. After the task, you can access raw data files and interactive session replays and, if using BIOPAC Acqknowledge, analyze physiological data. You can also analyze time to first fixation, saccades, time to find the target, and more.
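
For illustration, here is a minimal sketch of what that trial feedback could look like in Vizard code. The environment and target models, the audio file, and the is_looking_at() gaze check are all hypothetical placeholders; the actual template handles this logic for you through the GUI.

```python
# Minimal sketch of the find-target feedback loop, assuming a hypothetical
# is_looking_at(target) check supplied by your gaze/raycast setup.
import viz
import viztask

viz.go()

env = viz.addChild('gallery.osgb')          # hypothetical environment model
target = viz.addChild('basketball.osgb')    # hypothetical target object
target.setPosition(0, 1.5, 3)
found_sound = viz.addAudio('found.wav')     # hypothetical auditory cue file

def is_looking_at(node):
    # Placeholder: replace with your eye tracker / gaze intersection test
    return False

def trial():
    start = viz.tick()
    while not is_looking_at(target):
        yield viztask.waitTime(0.05)        # poll ~20 times per second
    time_to_find = viz.tick() - start
    found_sound.play()                      # auditory cue on success
    target.emissive(viz.YELLOW)             # simple highlight
    print('Target found after %.2f s' % time_to_find)

viztask.schedule(trial)
```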

Task 2: Visual Search Single Object - Experimenting with Variable Manipulation

This task offers a focused approach to demonstrating experimental control in a visual search task. It is particularly valuable for:

  • Setting up experimental conditions using a STIM file or the GUI
  • Triggering specific events, such as ending a trial when the target is found
  • Configuring constants through a configuration file
  • Recording experimental data for analysis with tools like matplotlib and pandas (see the sketch after this list)
  • Analyzing a condition-and-measurement comparison
  • Configuring and creating the entire task with no code, or via a STIM file
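
As a simple illustration of that analysis step, here is a minimal sketch using pandas and matplotlib. The file name trial_data.csv and the condition and time_to_find columns are placeholders; your actual experiment data file may use different names.

```python
# Minimal sketch of a condition comparison using pandas and matplotlib.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv('trial_data.csv')                     # hypothetical output file

# Mean time-to-find-target per experimental condition
summary = df.groupby('condition')['time_to_find'].mean()

summary.plot(kind='bar')
plt.ylabel('Mean time to find target (s)')
plt.title('Time to find target by condition')
plt.tight_layout()
plt.show()
```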

Task 3: Visual Search Multiple Objects - Analyzing Choice and Confidence

In the 'Visual Search Multiple Objects' task, participants are engaged in an environment where they must discern a target object based on size differences. This task is designed to study participant choices and confidence levels under varying conditions, and provides rich data for analysis.

  • Challenge: Among a series of objects, one object differs in size (either larger or smaller).
  • Participant Interaction:
    • Object Selection: Participants use a highlighter tool to select the object they believe is different.
    • Confidence Rating: After selection, they rate their confidence in their choice.
  • Variable Conditions: Both the number of objects presented and their sizes can be varied, offering a diverse range of experimental setups.
  • Analysis: Examine the association between the number of objects, object size, confidence level, and response accuracy (see the sketch after this list).
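
Here is a minimal sketch of that kind of analysis with pandas. The output file name and the num_objects, correct, and confidence columns are placeholders for whatever your data file actually contains.

```python
# Minimal sketch of analyzing choice accuracy and confidence against set size.
import pandas as pd

df = pd.read_csv('visual_search_choices.csv')          # hypothetical output file

# Accuracy and mean confidence for each number-of-objects condition
summary = df.groupby('num_objects').agg(
    accuracy=('correct', 'mean'),
    mean_confidence=('confidence', 'mean'),
)
print(summary)

# Correlation between confidence and accuracy across trials
print(df[['confidence', 'correct']].corr())
```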

Task 4: Visual Search Randomize - Timed and Standard Challenges

The 'Visual Search Randomize' task, initially set in an art gallery environment (modifiable as per your needs), offers a dynamic visual search experience. In each trial, participants are immersed in a virtual environment scattered with various objects. Their primary goal is to locate a specific target object. The task stands out for its randomization of object locations in each trial, introducing both consistency and variability in the search challenge.
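
As a rough illustration, randomizing object locations each trial could look something like this in Vizard. The model names and placement volume are hypothetical; the template lets you configure this without code.

```python
# Minimal sketch of per-trial randomization of object locations in Vizard.
import viz
import random

viz.go()
gallery = viz.addChild('gallery.osgb')                 # hypothetical environment

objects = [viz.addChild('vase.osgb'),                  # hypothetical objects
           viz.addChild('statue.osgb'),
           viz.addChild('painting.osgb')]

def randomize_positions():
    for obj in objects:
        x = random.uniform(-4, 4)
        z = random.uniform(1, 8)
        obj.setPosition(x, 1.2, z)                     # new location each trial

randomize_positions()   # call again at the start of every trial
```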

Task 5: Pigeon Hunt

Follow the sound of hooting pigeons and catch them before they fly away. This example shows how to keep score of targets found in a game-like setting.

Stimulus Presentation / Response

  • Present 3D models or 360 videos
  • Randomize stimulus order (see the sketch after this list)
  • Collect user feedback and responses
  • Measure implicit reactions with eye tracking, physiological sensors, or EEG
  • Drag-and-drop interactions
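
A minimal sketch of a randomized presentation loop in Vizard might look like this. The model files and the fixed presentation duration are placeholders; the template itself lets you configure these without code.

```python
# Minimal sketch of randomized stimulus presentation in Vizard.
import viz
import viztask
import random

viz.go()

stimuli = ['model_a.osgb', 'model_b.osgb', 'model_c.osgb']   # hypothetical stimuli
random.shuffle(stimuli)                                      # randomize order

def run_trials():
    for filename in stimuli:
        stim = viz.addChild(filename)
        stim.setPosition(0, 1.5, 2)
        yield viztask.waitTime(5)          # present for 5 seconds
        stim.remove()
        yield viztask.waitTime(1)          # inter-stimulus interval

viztask.schedule(run_trials)
```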

More information here.

Driving Simulators / Examples

See how to integrate vehicles into your VR experiment or VR eye tracking experiment.

  • Use either 3D models or 360 media integrated into 3D model vehicles
  • Examples include cars and a cockpit, or modify them with your own 3D model assets
  • Measure and visualize all data

More information here.

AI Intelligent Agents and Avatars

This template provides an interactive, intelligent AI agent that can be connected to various large language models such as GPT-4 and Claude Opus. You can customize the agent's personality, use speech recognition, and leverage high-quality text-to-speech models.
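
As a rough sketch of one conversational turn, here is an example using the openai Python package as one possible backend. The persona, model name, and the speech-recognition and text-to-speech steps are placeholders; the actual template wires these pieces together for you.

```python
# Minimal sketch of one conversational turn with an LLM backend.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

PERSONALITY = "You are a friendly museum guide inside a VR art gallery."   # hypothetical persona

def agent_reply(user_text):
    response = client.chat.completions.create(
        model="gpt-4o",                                   # or another supported model
        messages=[
            {"role": "system", "content": PERSONALITY},
            {"role": "user", "content": user_text},
        ],
    )
    return response.choices[0].message.content

# Replace the string with text from your speech recognizer, then pass the reply to TTS
print(agent_reply("What is the painting on my left?"))
```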

More information here.

Additionally, you can set up a point-and-click interactive application that lets you get information and ask questions about any object in your scene. This automatically uses the objects of interest, which are enabled with a simple checkbox.

More information here.

Eye Tracker Tests

A collection of tests designed to assess the accuracy of your eye tracker. 

Includes:

  • Spatial accuracy test (see the sketch after this list)
  • Smooth pursuit
  • Vestibulo-ocular reflex (VOR)
  • Save and visualize data as CSV files, charts, and more
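
For reference, the core of a spatial accuracy measurement is the angular error between the recorded gaze direction and the known direction to the fixation target. Here is a minimal sketch with NumPy, using made-up vectors.

```python
# Minimal sketch of angular error for a spatial accuracy test.
import numpy as np

def angular_error(gaze_dir, target_dir):
    """Angle in degrees between two 3D direction vectors."""
    g = np.asarray(gaze_dir, dtype=float)
    t = np.asarray(target_dir, dtype=float)
    g /= np.linalg.norm(g)
    t /= np.linalg.norm(t)
    cos_angle = np.clip(np.dot(g, t), -1.0, 1.0)
    return np.degrees(np.arccos(cos_angle))

# Example with made-up vectors
print(angular_error([0.02, 0.01, 1.0], [0.0, 0.0, 1.0]))   # ~1.3 degrees
```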

More information here.

Virtual Screen

  • Bring in a 2D screen of any size for controlled 2D stimulus exposure
  • Show videos, images, and more (see the sketch after this list)
  • PsychoPy integration coming soon for a wide selection of experiments
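
Conceptually, a virtual screen in Vizard is a textured quad showing an image or video. Here is a minimal sketch; the file names and screen size are placeholders, and the Virtual Screen template adds the surrounding experiment logic.

```python
# Minimal sketch of a virtual screen: a textured quad showing an image or video.
import viz

viz.go()

screen = viz.addTexQuad()
screen.setPosition(0, 1.8, 3)        # in front of the viewer
screen.setScale(3.2, 1.8, 1)         # roughly a 16:9 "screen"

# Show a static image...
image = viz.addTexture('stimulus.jpg')    # hypothetical image file
screen.texture(image)

# ...or play a video on the same quad
video = viz.addVideo('stimulus.mpg')      # hypothetical video file
screen.texture(video)
video.play()
```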

More information here.

External Application Data Recorder

The External Application Data Recorder allows you to record, save, and synchronize sensor data while running external VR applications from any source (a SteamVR game, Oculus, a web application, etc.). It uses the NVIDIA ShadowPlay recording feature to optionally record and save a video that can then be played back and matched up with the data recording.

More information here.

Gaze Based Interactions

Elements in the scene react to your gaze.
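
One common pattern is a dwell-based trigger, where an object reacts once gaze has rested on it for a set time. Here is a minimal sketch, assuming a hypothetical is_gazing_at() intersection test supplied by your eye tracker setup.

```python
# Minimal sketch of a dwell-based gaze trigger in Vizard.
import viz
import vizact

viz.go()
button = viz.addChild('box.wrl')           # hypothetical gaze-reactive object
button.setPosition(0, 1.5, 2)

DWELL_TIME = 1.0                           # seconds of gaze required
dwell = 0.0

def is_gazing_at(node):
    # Placeholder: replace with your gaze raycast / intersection test
    return False

def update():
    global dwell
    if is_gazing_at(button):
        dwell += viz.getFrameElapsed()
        if dwell >= DWELL_TIME:
            button.color(viz.GREEN)        # react to sustained gaze
    else:
        dwell = 0.0

vizact.ontimer(0, update)                  # check every frame
```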

More information here.

Mixed/Augmented Reality

  • Examples of using the passthrough camera to display various levels of mixed reality. Currently works through headsets with OpenXR mixed reality support (Meta Quest 3, Meta Quest Pro, Varjo XR-4)
  • Overlay objects in the real world, including 3D objects, virtual screens, multi-user avatars, data visualizations, biofeedback representations and more
  • Bring real world objects into a virtual scene
  • Measure eye tracking data on real world objects

More information here.

Virtual Mirror

This demo allows you to view your reflection in a virtual mirror in VR. It includes features such as avatar swapping, facial expression tracking, body tracking, and customizable environments. Facial expression data can also be saved to a file and analyzed afterwards.

More information here.

Hand Grabbing / Grab Events / Physics Based Interactions

Grab and interact with objects using physics. Connect to hand tracking from Meta headsets, Manus VR haptic data gloves, and more. Save flexion data from fingers and measure the distance between fingers (see the sketch below). Examples are also included for collecting grab and release events and for controlling elements in the scene using hand gestures.
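
As an illustration of the finger-distance measurement, here is a minimal sketch with NumPy. The get_joint_position() helper is a hypothetical stand-in for whatever your hand-tracking source provides.

```python
# Minimal sketch of measuring the distance between two finger tips.
import numpy as np

def get_joint_position(joint_name):
    # Placeholder: return the (x, y, z) position of a tracked joint
    positions = {'thumb_tip': (0.02, 1.10, 0.30),
                 'index_tip': (0.05, 1.12, 0.33)}
    return positions[joint_name]

def finger_distance(joint_a, joint_b):
    a = np.array(get_joint_position(joint_a))
    b = np.array(get_joint_position(joint_b))
    return float(np.linalg.norm(a - b))     # distance in meters

print('Thumb-index distance: %.3f m' % finger_distance('thumb_tip', 'index_tip'))
```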

More information here for hand tracking with physics example.

More information here for grab events example.

Phobia Exposure / Walk the Plank

Challenge a fear of heights in the Walk the Plank demo. Capture fixations and heatmaps and send signals to BIOPAC Acqknowledge. The demo can also be connected with foot tracking and run in multi-user mode.

More information here.

Charts / Graphs / Condition Comparisons

  • Generate charts and graphs based on data captured in an experiment
  • These can include:
    • Object view count
    • Walk paths (see the sketch after this list)
    • Comparisons between a manipulated variable and a measured reaction
    • Note that heatmaps, walk paths, and other visualizations are also included directly in the session replay.
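
As a simple example of the walk-path output, here is a minimal sketch that plots a top-down path with pandas and matplotlib, assuming a hypothetical tracking data file with pos_x and pos_z columns.

```python
# Minimal sketch of plotting a walk path from logged head positions.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv('tracking_data.csv')      # hypothetical output file

plt.plot(df['pos_x'], df['pos_z'])         # top-down view of the walk path
plt.xlabel('X position (m)')
plt.ylabel('Z position (m)')
plt.title('Participant walk path')
plt.axis('equal')
plt.show()
```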

More information here.

BioFeedBack / Physiological Measurement Connections

See how physiological data streaming in from BIOPAC Acqknowledge can control an object in the scene. In this example, a ball changes in size and color based on the physiological signal.
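
Conceptually, this mapping can be as simple as scaling and recoloring a node each frame from the latest signal value. Here is a minimal sketch in Vizard, with a hypothetical get_physio_value() stub standing in for the data stream from Acqknowledge.

```python
# Minimal sketch of driving an object's size and color from a physiological value.
import viz
import vizact

viz.go()
ball = viz.addChild('ball.osgb')           # hypothetical ball model
ball.setPosition(0, 1.5, 2)

def get_physio_value():
    # Placeholder: return the current signal value normalized to 0-1
    return 0.5

def update():
    value = get_physio_value()
    scale = 0.5 + value                    # grow with the signal
    ball.setScale(scale, scale, scale)
    ball.color(value, 0, 1 - value)        # shift from blue toward red

vizact.ontimer(0, update)                  # update every frame
```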

More information here.

Additional Experiment Paradigms 

Distance Perception, Memory Tasks, Choice Comparison Templates, Optic Flow, Smooth Pursuit, and more. Modify them with no code, or expand upon them with built-in modules.

There are many more examples included, or you can easily create your own custom VR experiment from scratch with little or even no code in SightLab VR Pro. For more information or to inquire about getting a demo, click here or contact sales@worldviz.com.
