Authors: HyunA Seo, Youngki Lee, Rajesh Balan, Thivya Kandappu
Publication: ACM UIST, Sept 2025
Paper: EmoShortcuts: Emotionally Expressive Body Augmentation for Social Mixed Reality Avatars
We present EmoShortcuts, a novel social Mixed Reality (MR) framework that enhances emotional expression by dynamically augmenting avatar body gestures to reflect users’ emotional states. While social MR enables immersive remote interactions through avatars, conveying emotions remains challenging due to limitations in head-mounted display (HMD) tracking (e.g., missing lower-body movements such as stomping or defensive postures) and users’ tendency to deprioritize nonverbal expressions during multitasking. EmoShortcuts addresses these challenges by introducing an augmentation framework that generates expressive body gestures even when users’ physical movements are restricted. We identified key challenges in emotional expression and explored user preferences for AI-assisted gesture augmentation. Based on these insights, we designed an interface that enables adaptive gesture augmentation, allowing for both preset and real-time user control.
- Meta Quest Pro for facial expression tracking
- Unity Editor: tested on 2022.3.55f1 (2022 LTS).
- Required SDKs (tested versions):
  - Meta XR All-in-One SDK v65 (platform + core XR features).
  - Meta XR Movement SDK v65 (body tracking/IK).
  - Meta XR Avatar SDK v24 (Avatars 2.0 pipeline).
  - Unity XR Interaction Toolkit (for Unity 2022 LTS), TextMeshPro.
- Output format: JSON strings sent over WebSocket, e.g. `{"emotion":"happy","confidence":0.87,"timestamp":1700000000}`
- Fields: `emotion` (string label), `confidence` (float 0–1), `timestamp` (Unix epoch seconds).
- Send rate: 5–20 Hz depending on model latency and desired responsiveness.
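The message schema above can be sketched as a small serialization helper. A minimal sketch; the function name `make_emotion_message` is ours, not part of the released code:

```python
import json
import time

def make_emotion_message(emotion: str, confidence: float) -> str:
    """Serialize one emotion classification into the JSON wire format:
    {"emotion": <string label>, "confidence": <float 0-1>, "timestamp": <Unix seconds>}.
    """
    if not 0.0 <= confidence <= 1.0:
        raise ValueError("confidence must be in [0, 1]")
    return json.dumps({
        "emotion": emotion,
        "confidence": confidence,
        "timestamp": int(time.time()),  # Unix epoch seconds
    })
```

Whatever produces these strings, keep the field names and types exactly as listed so the Unity-side parser stays in sync.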
- Install Python deps (use a venv): `pip install websockets uvicorn fastapi`
- Run your server script: `python server.py` (example) → ensure it exposes `ws://localhost:8000/ws`.
- In Unity, connect to that WebSocket URL and parse the incoming JSON to drive avatar gesture mappings.
If you find our work helpful, please consider citing:
@inproceedings{seo2025emoshortcuts,
title={EmoShortcuts: Emotionally Expressive Body Augmentation for Social Mixed Reality Avatars},
author={Seo, HyunA and Lee, Youngki and Balan, Rajesh and Kandappu, Thivya},
booktitle={Proceedings of the 38th Annual ACM Symposium on User Interface Software and Technology},
pages={1--16},
year={2025}
}