Rhythmic Interaction influences Synchrony Perception in VR - Unity VR Eye Tracking Study

Overview

This repository contains the Unity code of the paper Rhythmic Interaction influences Synchrony Perception in VR. The application presents participants with rhythmic audiovisual stimuli that can be temporally offset to simulate the effects of different network latencies. The results of a user study showed that rhythmic interaction can make audio-leading audiovisual offsets appear more synchronous. Subjective synchrony judgments (ratings) and behavioral metrics (hand movements, eye movements, and pupil dilation) are recorded and stored for later analysis. The VIVE Full Face Tracker is required for this, as it provides the pupil data. The anonymized study data of the paper can be found in the corresponding data repository.

To ensure accurate control over audiovisual synchronization, the timing of the hardware (HTC VIVE XR Elite) needs to be taken into account. For this, the delay between the presentation of audio and visual stimuli through the HMD can be measured (e.g., using a SyncOne2 device), and the measured value can be entered in the application, which offsets the stimulus presentation accordingly so that stimuli with a nominal offset of n ms are actually presented n ms apart by the hardware.
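A minimal sketch of how such a compensation could look in Unity is shown below. It is not the project's actual implementation; the class and field names (OffsetCompensationSketch, intendedOffsetMs, measuredHardwareDelayMs) and the sign convention (positive values mean the audio onset comes after the visual onset) are assumptions for illustration only.

using UnityEngine;

// Illustrative sketch only: schedules the audio stimulus relative to the visual
// onset so that the offset delivered by the hardware matches the nominal one.
public class OffsetCompensationSketch : MonoBehaviour
{
    public AudioSource audioSource;   // plays the rhythmic sound
    public Renderer visualStimulus;   // flashes for the visual beat

    // Nominal audiovisual offset in ms (positive = audio onset after visual onset).
    public float intendedOffsetMs = 0f;

    // Additional delay of audio relative to visuals introduced by the HMD,
    // measured e.g. with a SyncOne2 device, in ms.
    public float measuredHardwareDelayMs = 0f;

    public void PresentBeat()
    {
        // If the hardware already delays the audio by d ms, the software only
        // needs to schedule it (intended - d) ms after the visual onset.
        float softwareDelayMs = Mathf.Max(0f, intendedOffsetMs - measuredHardwareDelayMs);

        visualStimulus.enabled = true;  // visual onset now

        double startTime = AudioSettings.dspTime + softwareDelayMs / 1000.0;
        audioSource.PlayScheduled(startTime);
    }
}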

The base structure of the project builds on the open-source VRSYS multi-user VR framework, but does not use its multi-user capabilities. To record the participants' behavioral data, the Analysis and Recording Plugin of the open-source Immersive Study Analyzer is used in addition to a CSV recorder.

Requirements

Setup

  • Open the project with Unity (run as administrator)
  • Set the build platform to Android
  • Click VIVE/Wave Installer/Install or Update latest Version and update any out-of-date packages
  • Connect the HTC VIVE XR Elite via USB
  • Build and run the project on the HTC VIVE XR Elite

Structure

  • The study code and scene is located under Assets/VR_Study
  • Input actions to advance in the study are specified in Assets/VR_Study/Ressources/InputActions/StudyActions.inputactions (see the sketch after this list)
  • The relevant scripts can be found under Assets/VR_Study/Scripts
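As a rough illustration of how an action from that asset could be used in a script, the following sketch binds a hypothetical Study/Advance action to a callback. The map and action names are assumptions and not necessarily those defined in StudyActions.inputactions.

using UnityEngine;
using UnityEngine.InputSystem;

// Illustrative sketch only: wires a hypothetical "Study/Advance" action from an
// InputActionAsset (e.g. StudyActions.inputactions) to a callback.
public class StudyAdvanceSketch : MonoBehaviour
{
    public InputActionAsset studyActions;   // assign the .inputactions asset in the inspector
    private InputAction advanceAction;

    private void OnEnable()
    {
        advanceAction = studyActions.FindAction("Study/Advance", throwIfNotFound: true);
        advanceAction.performed += OnAdvance;
        advanceAction.Enable();
    }

    private void OnDisable()
    {
        advanceAction.performed -= OnAdvance;
        advanceAction.Disable();
    }

    private void OnAdvance(InputAction.CallbackContext context)
    {
        Debug.Log("Advance to the next study step");
    }
}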

Study Flow

  • After specifying the participant ID in the UI, the audiovisual synchrony calibration using SyncOne2 can be started
  • Once the calibration has started, the screen flashes while a short sound is played back (see the calibration sketch after this list)
  • This process can be stopped using the specified input action (left-hand primary button)
  • The measured audiovisual offset of the HMD can then be entered using the UI
  • Afterwards the experiment can be started and instructions are displayed
  • The first trials are test trials only, after which the study trials begin
  • Judgments and behavioral data are recorded on the local device and have to be retrieved from there after the study (see the CSV recorder sketch below)
  • For data retrieval, the HMD needs to be connected to the PC.
    Files can be found under: This PC\VIVE XR Series\Internal shared storage\Android\data\com.vrsys.groove\files
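The calibration loop described above could look roughly like the following sketch, which alternates a full-screen flash with a simultaneous click until it is stopped; the class and field names are assumptions, not the project's actual components.

using System.Collections;
using UnityEngine;

// Illustrative sketch only: repeats a flash + click pair so that an external
// device (e.g. SyncOne2) can measure the HMD's audiovisual delay.
public class CalibrationFlashSketch : MonoBehaviour
{
    public Renderer flashPanel;     // full-screen white quad in front of the camera
    public AudioSource clickSound;  // short click played with every flash
    public float flashIntervalSeconds = 1f;
    public float flashDurationSeconds = 0.1f;

    private bool running;

    public void StartCalibration() { running = true; StartCoroutine(FlashLoop()); }
    public void StopCalibration()  { running = false; }   // bind to the left-hand primary button

    private IEnumerator FlashLoop()
    {
        while (running)
        {
            flashPanel.enabled = true;   // visual onset
            clickSound.Play();           // audio onset presented together with the flash
            yield return new WaitForSeconds(flashDurationSeconds);
            flashPanel.enabled = false;
            yield return new WaitForSeconds(flashIntervalSeconds - flashDurationSeconds);
        }
    }
}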
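For the recording itself, a minimal CSV recorder sketch is shown below, assuming it writes to Unity's Application.persistentDataPath, which on the standalone HMD resolves to the Android/data/com.vrsys.groove/files folder named above. The file name and columns are illustrative, not the project's actual output format.

using System.IO;
using UnityEngine;

// Illustrative sketch only: appends one CSV row per synchrony judgment to the
// device's persistent data folder.
public class CsvRecorderSketch : MonoBehaviour
{
    private StreamWriter writer;

    private void Start()
    {
        string path = Path.Combine(Application.persistentDataPath, "judgments.csv");
        writer = new StreamWriter(path, append: false);
        writer.WriteLine("trial;offset_ms;rating;timestamp");
    }

    public void RecordJudgment(int trial, float offsetMs, int rating)
    {
        writer.WriteLine($"{trial};{offsetMs};{rating};{Time.realtimeSinceStartup}");
        writer.Flush();  // flush immediately so data survives an unexpected shutdown
    }

    private void OnDestroy()
    {
        writer?.Dispose();
    }
}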

BibTeX Citation

If you use the code in a scientific publication, we would appreciate it if you used the following citation:

@inproceedings{Lammert2025,
    doi       = {},
    url       = {},
    year      = {2025},
    booktitle = {2025 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)},
    author    = {Lammert, Anton Benjamin and Klass, Lina and Simon, Laura and Hornecker, Eva and Ehlers, Jan and Froehlich, Bernd},
    title     = {Rhythmic Interaction influences Synchrony Perception in VR},
}
