
egoEMOTION: Egocentric Vision and Physiological Signals for Emotion and Personality Recognition in Real-World Tasks (NeurIPS 2025)

Matthias Jammot*, Björn Braun*, Paul Streli, Rafael Wampfler, Christian Holz
* Equal contribution

Sensing, Interaction & Perception Lab, Department of Computer Science, ETH Zürich, Switzerland

👓🎭 egoEMOTION

egoEMOTION is the first dataset that couples egocentric visual and physiological signals with dense self-reports of emotion and personality across controlled and real-world scenarios. Participants completed emotion-elicitation tasks and naturalistic activities while self-reporting their affective state using the Circumplex Model and Mikels’ Wheel as well as their personality via the Big Five model.
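For readers unfamiliar with the Circumplex Model: it describes affect along continuous valence and arousal axes. As a generic illustration (not necessarily the labeling scheme used in the paper), self-reports on these axes can be mapped to the model's four quadrants:

# Generic sketch: map Circumplex self-reports (valence and arousal,
# here assumed on a 1-9 scale) to the model's four quadrants.
# The midpoint threshold is illustrative, not the paper's protocol.
def circumplex_quadrant(valence: float, arousal: float, midpoint: float = 5.0) -> str:
    if valence >= midpoint:
        return "high-arousal positive" if arousal >= midpoint else "low-arousal positive"
    return "high-arousal negative" if arousal >= midpoint else "low-arousal negative"

print(circumplex_quadrant(7, 8))  # -> high-arousal positive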

Figure: Overview of egoEMOTION.

🎥 egoEMOTION dataset

The egoEMOTION dataset includes over 50 hours of recordings from 43 participants, captured using Meta’s Project Aria glasses. Each session provides synchronized eye-tracking video, head-mounted photoplethysmography, inertial motion data, and physiological baselines for reference.

To download the dataset, please visit the following link: egoEMOTION Dataset.

To access the dataset, you must sign a Data Transfer and Use Agreement (DTUA) agreeing to our terms of use. Please note that only members of an institution (e.g., a PI or professor) can sign this DTUA. After you have signed it, you will receive a download link via email. The dataset is around 380 GB in size and may only be used for non-commercial, academic research purposes.

Figure: Sensors.

🔧 Setup

To create the environment that we used for our paper, simply run:

conda env create -f environment.yml
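Then activate the environment with conda activate followed by the environment name defined in environment.yml.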

📁 Code structure

Everything runs through the main.py file and the config files in the configs folder. The code is structured to allow easy experimentation with different model architectures, feature extraction methods, and input modalities. You can modify:

  1. Signal processing-based approaches: different feature selection and feature scaling methods, and different classifiers.
  2. Deep learning-based approaches: two different architectures (a classical CNN and a transformer-based architecture).
  3. The prediction target: continuous affect (arousal, valence, dominance), discrete emotions, or personality.
  4. The input modalities: ECG, EDA, RR, pupils, head IMU, pixel intensity, Fisherfaces, micro-expressions, gaze, and nose PPG. These modalities can be flexibly combined, e.g., as in the sketch below.
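
As a minimal sketch of scripting such variations with PyYAML (the keys target and modalities below are hypothetical placeholders; the actual schema is defined by the YAML files in the configs folder):

import yaml  # PyYAML

# Load an existing experiment config, override two (hypothetical) fields,
# and write the result out as a new experiment config.
with open("configs/ml/egoemotion_personality_RF.yaml") as f:
    cfg = yaml.safe_load(f)

cfg["target"] = "personality"                    # hypothetical key
cfg["modalities"] = ["gaze", "nose_ppg", "imu"]  # hypothetical key

with open("configs/ml/my_experiment.yaml", "w") as f:
    yaml.safe_dump(cfg, f)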

⚡ Training and inference

To run the main.py file, use the following command. The first argument is the dataset configuration file (here, for egoEMOTION) and the second is the desired experiment configuration file.

python -m main ./configs/config_egoemotion.yaml ./configs/ml/egoemotion_personality_RF.yaml

Important: Make sure to adjust all folder paths in the config files to your local setup.
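
To run several experiments back to back, here is a minimal sketch that relies only on the command-line interface shown above:

import subprocess

# Run each experiment config against the same dataset config,
# reusing the command-line interface from above.
dataset_cfg = "./configs/config_egoemotion.yaml"
experiment_cfgs = [
    "./configs/ml/egoemotion_personality_RF.yaml",
    # add further experiment configs here
]

for exp_cfg in experiment_cfgs:
    subprocess.run(["python", "-m", "main", dataset_cfg, exp_cfg], check=True)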

📊 Results for egoEMOTION

| Benchmark | Model | Wearable Devices | Egocentric Glasses | All |
| --- | --- | --- | --- | --- |
| Continuous Affect | Classical | 0.70 ± 0.14 | 0.74 ± 0.13 | 0.75 ± 0.13 |
| Continuous Affect | DCNN [1] | 0.63 ± 0.05 | 0.68 ± 0.05 | 0.68 ± 0.07 |
| Continuous Affect | WER [2] | 0.49 ± 0.21 | 0.65 ± 0.11 | 0.60 ± 0.16 |
| Discrete Emotions | Classical | 0.24 ± 0.08 | 0.46 ± 0.18 | 0.46 ± 0.17 |
| Discrete Emotions | DCNN [1] | 0.12 ± 0.01 | 0.23 ± 0.03 | 0.22 ± 0.02 |
| Discrete Emotions | WER [2] | 0.13 ± 0.02 | 0.22 ± 0.03 | 0.21 ± 0.04 |
| Personality Traits | Classical | 0.50 ± 0.48 | 0.57 ± 0.49 | 0.59 ± 0.49 |
| Personality Traits | DCNN [1] | 0.43 ± 0.26 | 0.42 ± 0.20 | 0.41 ± 0.25 |
| Personality Traits | WER [2] | 0.38 ± 0.28 | 0.47 ± 0.24 | 0.44 ± 0.28 |

Table: Performance comparison between classical and deep learning approaches on the egoEMOTION dataset.

📜 Citation

If you find our paper, code or dataset useful for your research, please cite our work.

@article{jammot2025egoemotion,
  title={egoEMOTION: Egocentric Vision and Physiological Signals for Emotion and Personality Recognition in Real-World Tasks},
  author={Jammot, Matthias and Braun, Bj{\"o}rn and Streli, Paul and Wampfler, Rafael and Holz, Christian},
  journal={arXiv preprint arXiv:2510.22129},
  year={2025}
}

👓💓 egoPPG

Make sure to also check out our work egoPPG, a novel vision task for egocentric systems: recovering a person's cardiac activity solely from their eye-tracking videos. We demonstrate egoPPG's downstream benefit for a key task on EgoExo4D, an existing egocentric dataset, where PulseFormer's heart rate (HR) estimates improve proficiency estimation by 14%.
