VR Charades Research Project

Structure

  • app.apk - VR Charades application
  • experiment-scripts/ - Control scripts and web interface
  • data-analysis/ - Analysis notebook and face/hand tracking data
  • videos/ - Recorded experiment footage
  • vrcharades-unity/ - Unity source code
  • VR_Charades_Paper.pdf - Research paper
  • Einverstaendniserklaerungen.pdf - Consent forms

Installing the VR Application

Sideloading via SideQuest

Before running experiments, you need to install app.apk on your Quest Pro headsets using SideQuest:

Prerequisites

  1. Enable Developer Mode on your Quest Pro:

    • Create a Meta Developer account at https://developer.oculus.com/
    • Install the Meta Quest mobile app on your phone
    • Add your Quest Pro to your account
    • Go to Settings → Developer Mode and enable it
  2. Download and install SideQuest on your computer from https://sidequestvr.com/

Sideloading Steps

  1. Connect Quest Pro to computer via USB-C cable
  2. Put on the headset and allow USB debugging when prompted
  3. Open SideQuest on your computer
  4. Verify connection - SideQuest should show your Quest Pro as connected (green dot)
  5. Install the APK:
    • Click the "Install APK from folder" button in SideQuest
    • Navigate to and select app.apk
    • Wait for installation to complete
  6. Launch the app:
    • In VR: Go to App Library → Unknown Sources → VR Charades
    • Or use SideQuest's "Currently Installed Apps" section to launch it

Repeat for All Headsets

  • Install the APK on all Quest Pro headsets that will participate in the experiment
  • Ensure all headsets are connected to the same Wi-Fi network
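
If you prefer the command line over the SideQuest GUI, the same result can be achieved with adb (bundled with SideQuest and the Android SDK platform tools). The following Python sketch installs app.apk on every connected headset in one go; it assumes adb is on your PATH and is only an illustration, not part of the experiment scripts.

import subprocess

APK_PATH = "app.apk"  # path to the VR Charades APK

def connected_devices() -> list[str]:
    """Return the serial numbers of all devices currently visible to adb."""
    out = subprocess.run(["adb", "devices"], capture_output=True, text=True, check=True)
    lines = out.stdout.strip().splitlines()[1:]  # skip the "List of devices attached" header
    return [line.split()[0] for line in lines if line.strip().endswith("device")]

def install_on_all(apk: str) -> None:
    """Install (or reinstall) the APK on each connected headset via adb."""
    for serial in connected_devices():
        print(f"Installing {apk} on {serial} ...")
        subprocess.run(["adb", "-s", serial, "install", "-r", apk], check=True)

if __name__ == "__main__":
    install_on_all(APK_PATH)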

Running Experiments

Prerequisites

  1. Create and activate a virtual environment:
# Windows
cd experiment-scripts
python -m venv venv
venv\Scripts\activate

# Linux/Mac
cd experiment-scripts
python3 -m venv venv
source venv/bin/activate
  2. Install dependencies:
pip install -r requirements.txt

Experiment Scripts Overview

  • app.py: Main server - web interface, experiment control, and automatic tracking recorder
  • server.py: UDP relay for communication between VR clients (auto-started by app.py)
  • index.html: Web UI for configuring and running experiments
  • static/: Frontend assets (CSS, JavaScript, player display)
  • data/: Word lists (English and German)
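
Conceptually, the relay only has to do two things: remember the address of every headset that has sent it a packet, and forward each incoming packet to the other registered peers. The sketch below illustrates that idea; it is an assumption about how server.py works rather than a copy of it, and the port number is made up.

import socket

RELAY_PORT = 9000  # illustrative; the actual port is defined in server.py

def run_relay(port: int = RELAY_PORT) -> None:
    """Forward every incoming UDP packet to all previously seen peers except the sender."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    peers: set[tuple[str, int]] = set()
    while True:
        data, addr = sock.recvfrom(4096)
        peers.add(addr)                  # auto-register the sender, no manual IP setup
        for peer in peers:
            if peer != addr:
                sock.sendto(data, peer)  # relay to the other headset(s)

if __name__ == "__main__":
    run_relay()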

Setup Instructions

1. Network Setup

  • Connect all VR headsets to the same network as your computer
  • Note the IP addresses of both VR headsets
  • Note your computer's IP address (the server IP)

To find your server IP:

# Linux/Mac
hostname -I

# Windows
ipconfig

Note: The UDP relay automatically detects and forwards data between connected VR headsets - no manual IP configuration needed in server.py!
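
If you prefer not to read the hostname -I / ipconfig output, a short cross-platform Python snippet reports the address that other devices on the LAN will see. The 8.8.8.8 target is only used to pick a route; no packet is actually sent.

import socket

def local_ip() -> str:
    """Return this machine's LAN IP address as seen by other devices."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.connect(("8.8.8.8", 80))  # UDP "connect" only selects a route, nothing is transmitted
        return s.getsockname()[0]

print(local_ip())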

2. Start the Server

cd experiment-scripts
fastapi dev app.py

This single command starts the web control server (app.py) and automatically launches the UDP relay (server.py) in the background.

Navigate to http://localhost:8000 to access the control interface.
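
One common way to launch the relay automatically alongside the web server is a FastAPI lifespan hook that spawns server.py as a subprocess on startup and terminates it on shutdown. The sketch below shows that pattern; it is an assumption about the structure of app.py, not its actual contents.

# Illustrative sketch only -- not the real app.py. Run from experiment-scripts/.
import subprocess
import sys
from contextlib import asynccontextmanager

from fastapi import FastAPI

@asynccontextmanager
async def lifespan(app: FastAPI):
    relay = subprocess.Popen([sys.executable, "server.py"])  # start the UDP relay
    yield                                                    # serve requests
    relay.terminate()                                        # stop the relay on shutdown
    relay.wait()

app = FastAPI(lifespan=lifespan)

@app.get("/")
async def index():
    return {"status": "control interface running"}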

Web Interface Usage

All experiment configuration and control is done through the web interface at http://localhost:8000.

1. Configure VR Headsets

In the VR Headset Configuration section:

  • Enter Server IP Address: Your computer's IP address
  • Enter Player 1 IP Address: First VR headset IP
  • Enter Player 2 IP Address: Second VR headset IP
  • Select Experiment Mode (see Experiment Conditions below)
  • Click "Send Configuration to Headsets"

Wait for confirmation that the configuration was sent successfully.

2. Configure Experiment Session

In the Session Configuration section:

  • Group ID: Identifier for this session (used in CSV filenames)
  • Time per Word: Duration in seconds for each word (e.g., 30)
  • Total Duration: Total experiment time in minutes (0 = unlimited)

In the Network Configuration section:

  • Active Player: Select which player will be performing (Player 1 or Player 2)

3. Prepare Word List

In the Word List section:

  • Copy words from data/word-list.txt or enter your own (one word per line)
  • Click "Shuffle Words" to randomize order
  • Click "Start Experiment" when ready

4. Run the Experiment

When you click "Start Experiment":

  • The system automatically sends words to the active player
  • Tracking data recording starts automatically
  • Words advance based on the timer
  • Check the box next to each word that is guessed correctly
  • Click "Stop Experiment" to end early

5. Export Data

After the experiment:

  • "Save Results (CSV)": Downloads word results

    • Format: {group_id}_results_{timestamp}.csv
    • Contains: word, correct/incorrect, time remaining
  • "Download Tracking Data (CSV)": Downloads tracking data

    • Format: {group_id}_tracking_{timestamp}.csv
    • Contains: camera and controller positions/rotations at 60Hz
    • Includes: timestamps, current word, condition, elapsed time
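
For a quick sanity check of an exported results file, something along the following lines works; note that the file name, the column name "correct", and the way correct guesses are encoded are assumptions that need to be matched against the actual CSV header.

import pandas as pd

# File and column names are assumptions -- check the exported CSV header first.
results = pd.read_csv("group01_results_2024-01-01_12-00-00.csv")
total = len(results)
guessed = int(results["correct"].isin([True, 1, "correct", "True"]).sum())
print(f"{guessed} of {total} words guessed correctly ({guessed / total:.0%})")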

Experiment Conditions

The experiment supports six different conditions that control which body parts are tracked and how:

| Condition          | Description                                                | Settings |
| ------------------ | ---------------------------------------------------------- | -------- |
| Dynamic Face       | Real-time face tracking with expressions and eye rotation  | 1;1;1;0  |
| Dynamic Hands      | Real-time hand tracking with finger gestures               | 0;0;0;1  |
| Dynamic Hands+Face | Full tracking: face, expressions, eyes, and hands          | 1;1;1;1  |
| Static Face        | Head position tracking only (no expressions)               | 1;0;0;0  |
| Static Hands       | Controller tracking (no finger tracking)                   | 0;0;0;1  |
| Static Hands+Face  | Head position + controller tracking                        | 1;0;0;1  |

Mode format: <show_head>;<show_facial_expression>;<show_eye_rotation>;<show_hands>

Notes:

  • Dynamic modes: Use natural face/hand tracking via Quest Pro sensors
  • Static modes: Participants must use controllers for hand input
  • Select the condition in the web interface before starting the experiment
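
To avoid mixing up the flag order when configuring a condition, a small helper can build the mode string from named flags. The flag values below simply mirror the table above; whether the headsets accept anything other than these six strings is up to the app, so treat this as an illustration.

def mode_string(show_head: bool, show_facial_expression: bool,
                show_eye_rotation: bool, show_hands: bool) -> str:
    """Build the semicolon-separated mode string described above."""
    flags = (show_head, show_facial_expression, show_eye_rotation, show_hands)
    return ";".join(str(int(flag)) for flag in flags)

# The six experiment conditions expressed with the helper:
CONDITIONS = {
    "Dynamic Face":       mode_string(True,  True,  True,  False),  # 1;1;1;0
    "Dynamic Hands":      mode_string(False, False, False, True),   # 0;0;0;1
    "Dynamic Hands+Face": mode_string(True,  True,  True,  True),   # 1;1;1;1
    "Static Face":        mode_string(True,  False, False, False),  # 1;0;0;0
    "Static Hands":       mode_string(False, False, False, True),   # 0;0;0;1
    "Static Hands+Face":  mode_string(True,  False, False, True),   # 1;0;0;1
}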

Tracking Data

The system automatically records tracking data from the active player (the one performing charades) at approximately 60Hz:

Recorded data:

  • Center eye camera position (x, y, z) and rotation (w, x, y, z)
  • Left hand controller position and rotation
  • Right hand controller position and rotation
  • Current word being performed
  • Timestamps and elapsed time

Data recording:

  • Starts automatically when experiment starts
  • Stops automatically when experiment stops
  • Exports as CSV with group ID and timestamp in filename
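
As an example of working with a tracking export, the snippet below estimates the effective sampling rate and the total head movement over a session. The column names (elapsed_time, camera_pos_x, ...) and the assumption that elapsed time is in seconds are guesses and must be adapted to the real CSV header.

import pandas as pd

# Column names below are assumptions -- adapt them to the actual tracking CSV header.
df = pd.read_csv("group01_tracking_2024-01-01_12-00-00.csv")

# Effective sampling rate (should be close to 60 Hz).
duration_s = df["elapsed_time"].iloc[-1] - df["elapsed_time"].iloc[0]
print(f"~{len(df) / duration_s:.1f} samples per second")

# Total distance travelled by the centre-eye camera, in metres.
pos = df[["camera_pos_x", "camera_pos_y", "camera_pos_z"]].to_numpy()
step = ((pos[1:] - pos[:-1]) ** 2).sum(axis=1) ** 0.5
print(f"Head travelled ~{step.sum():.2f} m")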

Unity Project Details

Unity Version

  • Unity Editor: 6000.0.49f1 (Unity 6)

Key Dependencies

  • Meta XR SDK Core: 76.0.1
  • Meta XR SDK Interaction: 76.0.1
  • Meta XR SDK Movement: 76.0.1 (local file dependency)
  • Unity XR Interaction Toolkit: 3.1.1
  • Unity XR Hands: 1.5.0
  • Unity XR Management: 4.5.1
  • Unity OpenXR: 1.14.1
  • Universal Render Pipeline: 17.0.4
  • Unity Input System: 1.14.0
  • Unity Timeline: 1.8.7

Build Requirements

  • Unity 6000.0.49f1 or compatible
  • Meta XR SDK 76.0.1
  • Android Build Support module
  • OpenXR support enabled
  • Quest Pro development setup