EmoShortcuts

Introduction

Authors: HyunA Seo, Youngki Lee, Rajesh Balan, Thivya Kandappu

Publication: ACM UIST, Sept 2025

Paper: EmoShortcuts: Emotionally Expressive Body Augmentation for Social Mixed Reality Avatars

Research project description

We present EmoShortcuts, a novel social Mixed Reality (MR) framework that enhances emotional expression by dynamically augmenting avatar body gestures to reflect users’ emotional states. While social MR enables immersive remote interactions through avatars, conveying emotions remains challenging due to limitations in head-mounted display (HMD) tracking (e.g., missing lower-body movements such as stomping or defensive postures) and users’ tendency to deprioritize nonverbal expressions during multitasking. EmoShortcuts addresses these challenges by introducing an augmentation framework that generates expressive body gestures even when users’ physical movements are restricted. We identified key challenges in emotional expression and explored user preferences for AI-assisted gesture augmentation. Based on these insights, we designed an interface that enables adaptive gesture augmentation, allowing for both preset and real-time user control.

Hardware Requirements

  • Meta Quest Pro for facial expression tracking

Installation and runtime requirements

  • Unity Editor: tested on 2022.3.55f1 (2022 LTS).
  • Required SDKs (tested versions):
    • Meta XR All-in-One SDK v65 (platform + core XR features).
    • Meta XR Movement SDK v65 (body tracking/IK).
    • Meta XR Avatar SDK v24 (Avatars 2.0 pipeline).
    • Unity XR Interaction Toolkit (for Unity 2022 LTS), TextMeshPro.

Emotion recognition server guide

  • Output format: send JSON strings over WebSocket (see the sketch after this list), e.g.
    • {"emotion":"happy","confidence":0.87,"timestamp":1700000000}
    • Fields: emotion (string label), confidence (float 0–1), timestamp (Unix epoch seconds).
  • Send rate: 5–20 Hz depending on model latency and desired responsiveness.
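
The sketch below illustrates the message format in Python: a single payload carrying the three documented fields, serialized with json.dumps. The concrete label and confidence values are illustrative placeholders, not output of the project's recognition model.

  import json
  import time

  # One outgoing message with the three documented fields.
  message = json.dumps({
      "emotion": "happy",             # string label
      "confidence": 0.87,             # float in the range 0-1
      "timestamp": int(time.time()),  # Unix epoch seconds
  })
  # e.g. '{"emotion": "happy", "confidence": 0.87, "timestamp": 1700000000}'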

WebSocket server quick start

  1. Install Python deps (use a venv): pip install websockets uvicorn fastapi.
  2. Run your server script: python server.py → ensure it exposes ws://localhost:8000/ws (a minimal example sketch follows this list).
  3. In Unity, connect to that WebSocket URL and parse the incoming JSON to drive avatar gesture mappings.
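
As a reference point, here is a minimal server.py sketch built with FastAPI and uvicorn (the packages installed in step 1). It is an illustrative stand-in rather than the repository's actual server: dummy_emotion() is a placeholder where a real emotion recognition model would go, and the endpoint path, host, and port match the ws://localhost:8000/ws URL used above.

  import asyncio
  import json
  import random
  import time

  import uvicorn
  from fastapi import FastAPI, WebSocket, WebSocketDisconnect

  app = FastAPI()
  LABELS = ["happy", "sad", "angry", "surprised", "neutral"]

  def dummy_emotion():
      # Placeholder for a real emotion recognition model.
      return random.choice(LABELS), random.random()

  @app.websocket("/ws")
  async def ws_endpoint(ws: WebSocket):
      await ws.accept()
      try:
          while True:
              label, confidence = dummy_emotion()
              await ws.send_text(json.dumps({
                  "emotion": label,
                  "confidence": round(confidence, 2),
                  "timestamp": int(time.time()),
              }))
              await asyncio.sleep(0.1)  # ~10 Hz, within the suggested 5-20 Hz range
      except WebSocketDisconnect:
          pass

  if __name__ == "__main__":
      uvicorn.run(app, host="0.0.0.0", port=8000)

Running python server.py then makes the endpoint available at ws://localhost:8000/ws for the Unity client described in step 3.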

Citation

If you find our work helpful, please consider citing:

@inproceedings{seo2025emoshortcuts,
  title={EmoShortcuts: Emotionally Expressive Body Augmentation for Social Mixed Reality Avatars},
  author={Seo, HyunA and Lee, Youngki and Balan, Rajesh and Kandappu, Thivya},
  booktitle={Proceedings of the 38th Annual ACM Symposium on User Interface Software and Technology},
  pages={1--16},
  year={2025}
}
