# VR Charades Research Project
## Structure

- `app.apk` - VR Charades application
- `experiment-scripts/` - Control scripts and web interface
- `data-analysis/` - Analysis notebook and face/hand tracking data
- `videos/` - Recorded experiment footage
- `vrcharades-unity/` - Unity source code
- `VR_Charades_Paper.pdf` - Research paper
- `Einverstaendniserklaerungen.pdf` - Consent forms

## Installing the VR Application

### Sideloading via SideQuest

Before running experiments, you need to install `app.apk` on your Quest Pro headsets using SideQuest:

#### Prerequisites

1. **Enable Developer Mode** on your Quest Pro:
   - Create a Meta Developer account at https://developer.oculus.com/
   - Install the Meta Quest mobile app on your phone
   - Add your Quest Pro to your account
   - Go to Settings → Developer Mode and enable it

2. **Download and Install SideQuest**:
   - Download SideQuest from https://sidequestvr.com/
   - Install SideQuest on your computer (Windows/Mac/Linux)

#### Sideloading Steps

1. **Connect the Quest Pro to your computer** via USB-C cable
2. **Put on the headset** and allow USB debugging when prompted
3. **Open SideQuest** on your computer
4. **Verify the connection** - SideQuest should show your Quest Pro as connected (green dot)
5. **Install the APK**:
   - Click the "Install APK from folder" button in SideQuest
   - Navigate to and select `app.apk`
   - Wait for the installation to complete
6. **Launch the app**:
   - In VR: Go to App Library → Unknown Sources → VR Charades
   - Or use SideQuest's "Currently Installed Apps" section to launch it

#### Repeat for All Headsets

- Install the APK on all Quest Pro headsets that will participate in the experiment (a scripted alternative using `adb` is sketched below)
- Ensure all headsets are connected to the same Wi-Fi network

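
If you prefer the command line, the same installation can be scripted with `adb` (which SideQuest bundles). The sketch below is not part of this repository: it assumes `adb` is on your PATH and that USB debugging has already been authorized on each connected headset.

```python
#!/usr/bin/env python3
"""Illustrative sketch: install app.apk on every connected Quest headset via adb."""
import subprocess

def connected_devices() -> list[str]:
    # `adb devices` prints a header line, then "<serial>\tdevice" per authorized headset.
    out = subprocess.run(["adb", "devices"], capture_output=True, text=True, check=True)
    devices = []
    for line in out.stdout.splitlines()[1:]:
        fields = line.split()
        if len(fields) == 2 and fields[1] == "device":
            devices.append(fields[0])
    return devices

if __name__ == "__main__":
    for serial in connected_devices():
        print(f"Installing app.apk on {serial} ...")
        # -r replaces an existing installation while keeping its data.
        subprocess.run(["adb", "-s", serial, "install", "-r", "app.apk"], check=True)
```

Since `adb` only reaches headsets that are plugged in and authorized, each Quest Pro still needs to be connected over USB-C once, exactly as in the SideQuest steps above.
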
## Running Experiments

### Prerequisites

1. Create and activate a virtual environment:

```bash
# Windows
cd experiment-scripts
python -m venv venv
venv\Scripts\activate

# Linux/Mac
cd experiment-scripts
python3 -m venv venv
source venv/bin/activate
```

2. Install dependencies:

```bash
pip install -r requirements.txt
```

### Experiment Scripts Overview

- **app.py**: Main server - web interface, experiment control, and automatic tracking recorder
- **server.py**: UDP relay for communication between VR clients (auto-started by app.py)
- **index.html**: Web UI for configuring and running experiments
- **static/**: Frontend assets (CSS, JavaScript, player display)
- **data/**: Word lists (English and German)

### Setup Instructions

#### 1. Network Setup

- Connect all VR headsets to the same network as your computer
- Note the IP addresses of both VR headsets
- Note your computer's IP address (the server IP)

To find your server IP:

```bash
# Linux/Mac
hostname -I

# Windows
ipconfig
```

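If you would rather script this step, a common trick is to open a UDP socket toward an external address and read back which local address the OS selects; no packet is actually sent. This helper is an illustration only and is not part of the experiment scripts.

```python
import socket

def local_ip() -> str:
    """Return the LAN IP this machine uses to reach other hosts."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        # connect() on a UDP socket only selects a route; nothing is transmitted.
        s.connect(("8.8.8.8", 80))
        return s.getsockname()[0]
    finally:
        s.close()

if __name__ == "__main__":
    print(local_ip())
```
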
**Note**: The UDP relay automatically detects and forwards data between connected VR headsets - no manual IP configuration needed in server.py!

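For orientation, the core idea of such a relay fits in a few lines: remember the address of every headset that sends a packet and forward each packet to all other known peers. The sketch below only illustrates that concept - it is not the actual `server.py`, and the port number is made up.

```python
import socket

RELAY_PORT = 9000  # hypothetical port; the real port is defined by the project

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", RELAY_PORT))
peers: set[tuple[str, int]] = set()

while True:
    data, sender = sock.recvfrom(4096)
    peers.add(sender)          # auto-detect clients from their first packet
    for peer in peers:
        if peer != sender:     # forward to every other known headset
            sock.sendto(data, peer)
```
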
#### 2. Start the Server

```bash
cd experiment-scripts
fastapi dev app.py
```

This single command automatically starts:

- Web interface on http://localhost:8000
- UDP relay server (server.py; one way to wire up this auto-start is sketched below)
- Tracking data recorder

Navigate to `http://localhost:8000` to access the control interface.

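One plausible way to have a single `fastapi dev app.py` command bring up the relay as well is to spawn `server.py` from a FastAPI lifespan handler. This is only a sketch of the pattern, not necessarily how `app.py` implements it.

```python
from contextlib import asynccontextmanager
import subprocess
import sys

from fastapi import FastAPI

relay = None  # handle to the server.py child process

@asynccontextmanager
async def lifespan(app: FastAPI):
    global relay
    # Start the UDP relay as a child process when the web app boots ...
    relay = subprocess.Popen([sys.executable, "server.py"])
    yield
    # ... and stop it again when the web app shuts down.
    relay.terminate()
    relay.wait()

app = FastAPI(lifespan=lifespan)
```
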
### Web Interface Usage

All experiment configuration and control is done through the web interface at `http://localhost:8000`.

#### 1. Configure VR Headsets

In the **VR Headset Configuration** section:

- Enter **Server IP Address**: Your computer's IP address
- Enter **Player 1 IP Address**: First VR headset IP
- Enter **Player 2 IP Address**: Second VR headset IP
- Select **Experiment Mode** (see Experiment Conditions below)
- Click **"Send Configuration to Headsets"**

Wait for confirmation that the configuration was sent successfully.

#### 2. Configure Experiment Session

In the **Session Configuration** section:

- **Group ID**: Identifier for this session (used in CSV filenames)
- **Time per Word**: Duration in seconds for each word (e.g., 30)
- **Total Duration**: Total experiment time in minutes (0 = unlimited)

In the **Network Configuration** section:

- **Active Player**: Select which player will be performing (Player 1 or Player 2)

#### 3. Prepare Word List

In the **Word List** section:

- Copy words from `data/word-list.txt` or enter your own (one word per line); a small script for pre-shuffling a list is sketched below
- Click **"Shuffle Words"** to randomize the order
- Click **"Start Experiment"** when ready

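
If you want a reproducible word order (for example, the same shuffled list for every group), you can pre-shuffle the bundled list before pasting it into the text box. The output is simply printed; the fixed seed is an optional choice.

```python
import random

# Read the bundled word list, shuffle it, and print one word per line,
# ready to be pasted into the "Word List" text box.
with open("data/word-list.txt", encoding="utf-8") as f:
    words = [line.strip() for line in f if line.strip()]

random.seed(42)  # fix the seed to get the same order every time
random.shuffle(words)
print("\n".join(words))
```
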
#### 4. Run the Experiment

When you click **"Start Experiment"**:

- The system automatically sends words to the active player
- Tracking data recording starts automatically
- Words advance based on the timer
- Check the checkbox next to each word if it was guessed correctly
- Click **"Stop Experiment"** to end early

#### 5. Export Data

After the experiment:

- **"Save Results (CSV)"**: Downloads word results (a quick-look script is sketched after this list)
  - Format: `{group_id}_results_{timestamp}.csv`
  - Contains: word, correct/incorrect, time remaining
- **"Download Tracking Data (CSV)"**: Downloads tracking data
  - Format: `{group_id}_tracking_{timestamp}.csv`
  - Contains: camera and controller positions/rotations at 60 Hz
  - Includes: timestamps, current word, condition, elapsed time

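
To get a quick hit rate from an exported results file, something like the following works. The filename, the column names (`word`, `correct`) and the semicolon delimiter are assumptions based on the description above - check the header of your exported CSV and adjust accordingly.

```python
import csv

# Hypothetical quick look at an exported results file; adjust names to the real header.
with open("g1_results_20250101_120000.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f, delimiter=";"))

correct = sum(1 for r in rows if r["correct"].strip().lower() in ("1", "true", "yes"))
print(f"{correct}/{len(rows)} words guessed correctly ({100 * correct / len(rows):.0f}%)")
```
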
### Experiment Conditions

The experiment supports six different conditions that control which body parts are tracked and how:

| Condition | Description | Settings |
|-----------|-------------|----------|
| **Dynamic Face** | Real-time face tracking with expressions and eye rotation | 1;1;1;0 |
| **Dynamic Hands** | Real-time hand tracking with finger gestures | 0;0;0;1 |
| **Dynamic Hands+Face** | Full tracking: face, expressions, eyes, and hands | 1;1;1;1 |
| **Static Face** | Head position tracking only (no expressions) | 1;0;0;0 |
| **Static Hands** | Controller tracking (no finger tracking) | 0;0;0;1 |
| **Static Hands+Face** | Head position + controller tracking | 1;0;0;1 |

**Mode format**: `<show_head>;<show_facial_expression>;<show_eye_rotation>;<show_hands>` (see the parsing sketch after the notes below)

**Notes**:

- **Dynamic modes**: Use natural face/hand tracking via Quest Pro sensors
- **Static modes**: Participants must use controllers for hand input
- Select the condition in the web interface before starting the experiment

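
The mode string is compact enough that a small helper makes its meaning explicit. The parser below is written against the format documented above; the function name and return type are illustrative and not taken from the Unity project.

```python
def parse_mode(mode: str) -> dict[str, bool]:
    """Parse a condition string such as "1;1;1;0" into named boolean flags."""
    fields = ("show_head", "show_facial_expression", "show_eye_rotation", "show_hands")
    values = mode.split(";")
    if len(values) != len(fields):
        raise ValueError(f"expected 4 fields, got {len(values)}: {mode!r}")
    return {name: value == "1" for name, value in zip(fields, values)}

# Example: the "Dynamic Face" condition
print(parse_mode("1;1;1;0"))
# {'show_head': True, 'show_facial_expression': True, 'show_eye_rotation': True, 'show_hands': False}
```
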
### Tracking Data

The system automatically records tracking data from the **active player** (the one performing charades) at approximately 60 Hz:

**Recorded data**:

- Center eye camera position (x, y, z) and rotation (w, x, y, z)
- Left hand controller position and rotation
- Right hand controller position and rotation
- Current word being performed
- Timestamps and elapsed time

**Data recording**:

- Starts automatically when the experiment starts
- Stops automatically when the experiment stops
- Exports as CSV with the group ID and a timestamp in the filename

### Tracking Data CSV Structure

The exported tracking data CSV contains the following columns:

| Column Name | Description | Example Value |
|---|---|---|
| timestamp | Unix timestamp (seconds since epoch) | 1718123456.1234 |
| elapsed_time | Seconds since experiment start | 12.3456 |
| player_id | "player1" or "player2" | player1 |
| role | "mimicker" or "guesser" | mimicker |
| group_id | Experiment group identifier | g1 |
| condition | Experiment mode string | 1;1;1;1 |
| current_word | Word being performed | Applaudieren |
| word_time_remaining | Seconds left for the current word | 18.1234 |
| center_eye_pos_x | Center eye camera position X | 0.1234 |
| center_eye_pos_y | Center eye camera position Y | 1.2345 |
| center_eye_pos_z | Center eye camera position Z | -0.5678 |
| center_eye_rot_w | Center eye camera rotation W (quaternion) | 0.9876 |
| center_eye_rot_x | Center eye camera rotation X (quaternion) | 0.0123 |
| center_eye_rot_y | Center eye camera rotation Y (quaternion) | 0.0456 |
| center_eye_rot_z | Center eye camera rotation Z (quaternion) | -0.0789 |
| left_hand_pos_x | Left hand position X | 0.2345 |
| left_hand_pos_y | Left hand position Y | 1.3456 |
| left_hand_pos_z | Left hand position Z | -0.6789 |
| left_hand_rot_w | Left hand rotation W (quaternion) | 0.8765 |
| left_hand_rot_x | Left hand rotation X (quaternion) | 0.0234 |
| left_hand_rot_y | Left hand rotation Y (quaternion) | 0.0567 |
| left_hand_rot_z | Left hand rotation Z (quaternion) | -0.0890 |
| right_hand_pos_x | Right hand position X | 0.3456 |
| right_hand_pos_y | Right hand position Y | 1.4567 |
| right_hand_pos_z | Right hand position Z | -0.7890 |
| right_hand_rot_w | Right hand rotation W (quaternion) | 0.7654 |
| right_hand_rot_x | Right hand rotation X (quaternion) | 0.0345 |
| right_hand_rot_y | Right hand rotation Y (quaternion) | 0.0678 |
| right_hand_rot_z | Right hand rotation Z (quaternion) | -0.0901 |

**All values are separated by semicolons (`;`).**

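
For a quick sanity check of a recording, the file can be loaded with pandas using the semicolon separator; the snippet below computes the head's total path length as an example. The filename is illustrative, and if the `condition` value itself contains semicolons, check how the exporter quotes that field before relying on column alignment.

```python
import numpy as np
import pandas as pd

# Load one exported tracking file (illustrative filename).
df = pd.read_csv("g1_tracking_20250101_120000.csv", sep=";")

# Total distance travelled by the head (center eye camera) over the session.
head = df[["center_eye_pos_x", "center_eye_pos_y", "center_eye_pos_z"]].to_numpy()
path_length = np.linalg.norm(np.diff(head, axis=0), axis=1).sum()

print(f"{len(df)} samples, ~{len(df) / df['elapsed_time'].iloc[-1]:.0f} Hz, "
      f"head path length: {path_length:.2f} m")
```
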
## Unity Project Details

### Unity Version

- **Unity Editor**: 6000.0.49f1 (2024.1.19f1)

### Key Dependencies

- **Meta XR SDK Core**: 76.0.1
- **Meta XR SDK Interaction**: 76.0.1
- **Meta XR SDK Movement**: 76.0.1
- **Unity XR Interaction Toolkit**: 3.1.1
- **Unity XR Hands**: 1.5.0
- **Unity XR Management**: 4.5.1
- **Unity OpenXR**: 1.14.1
- **Universal Render Pipeline**: 17.0.4
- **Unity Input System**: 1.14.0
- **Unity Timeline**: 1.8.7

### Installing Meta XR Movement SDK

To install the Meta XR Movement SDK v76.0.1 in Unity:

1. Open the Unity Package Manager (`Window` > `Package Manager`)
2. Click the **+** button in the top-left corner
3. Select **"Add package from git URL..."**
4. Enter: `https://github.com/oculus-samples/Unity-Movement.git#v76.0.1`
5. Click **Add**

For more information, see the [Unity-Movement repository](https://github.com/oculus-samples/Unity-Movement).

### Build Requirements

- Unity 6000.2.1f1 or compatible
- Meta XR SDK 76.0.1
- Android Build Support module
- OpenXR support enabled
- Quest Pro development setup