Real-time webcam-based face tracking for MetaHuman characters in Unreal Engine 5 using MediaPipe4U plugin.
This project provides a complete solution for driving MetaHuman facial animation from webcam-based face tracking via MediaPipe4U and LiveLink. The implementation takes a code-first approach: C++ classes initialize and configure everything automatically, minimizing manual editor setup.
## Features

- **Automatic Setup:** C++ classes spawn automatically via a custom GameMode
- **Webcam Face Tracking:** Real-time capture of 52 ARKit blendshapes via MediaPipe4U
- **LiveLink Integration:** Seamless connection to MetaHuman face rigs
- **Python Automation:** Scripts for programmatic blueprint and actor setup
- **One-Time Configuration:** Only the MetaHuman LiveLink connection requires manual setup
## Tech Stack

- **Engine:** Unreal Engine 5.4+
- **Language:** C++ (C++17)
- **Plugin:** MediaPipe4U (MediaPipeHolistic, MediaPipe4ULiveLink)
- **Camera:** Webcam input (device index 0)
- **Output:** LiveLink ARKit blendshapes (52 shapes)
- **Automation:** Python Editor Scripting, PowerShell
## Project Structure

```
routed-collar/
├── .gitignore                      # Git ignore patterns (Python, UE5)
├── README.md                       # This file
├── .md/                            # Local-only documentation
│   ├── AGENTS.md                   # AI agent rules
│   ├── CONTEXT.md                  # Project context
│   ├── GUIDE.md                    # Human workflow guide
│   ├── LOG.md                      # Development log
│   └── TASKS.md                    # Task management
├── build_and_run.ps1               # Build automation script
├── monitor_logs.ps1                # Log monitoring utility
├── project/
│   ├── create_mediapipe_setup.py   # Python automation for blueprints
│   ├── run_mediapipe_setup.ps1     # PowerShell wrapper for automation
│   ├── AUTOMATION_README.md        # Automation documentation
│   ├── build_instructions.md       # Build guide
│   ├── implementation_plan.md      # Technical implementation details
│   ├── SETUP_INSTRUCTIONS.md       # Manual setup guide
│   ├── walkthrough.md              # Complete walkthrough
│   ├── task.md                     # Current tasks
│   └── tracking/                   # Unreal Engine 5 project
│       ├── tracking.uproject       # Project file
│       ├── Config/                 # Engine configuration
│       ├── Content/                # Assets (maps, materials, etc.)
│       └── Source/                 # C++ source code
│           └── tracking/
│               ├── Public/
│               │   ├── MediaPipe4UFaceTrackerActor.h
│               │   └── AutoSpawnGameMode.h
│               └── Private/
│                   ├── MediaPipe4UFaceTrackerActor.cpp
│                   └── AutoSpawnGameMode.cpp
├── src/                            # Empty (reserved for future)
└── tests/                          # Empty (reserved for future)
```
## Prerequisites

- Unreal Engine 5.4 or later
- MediaPipe4U plugin (with a valid license)
- A MetaHuman character in your project
- Webcam (works best with good lighting)
## Installation

1. **Clone the repository:**

   ```bash
   git clone <repository-url>
   cd routed-collar
   ```
2. **Enable MediaPipe4U plugins:**
* Open `project/tracking/tracking.uproject` in Unreal Editor
* Edit > Plugins > Search "MediaPipe"
* Enable: `MediaPipe4U`, `MediaPipe4ULiveLink`, `MediaPipeHolistic`
* Restart editor
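Enabling plugins through the editor UI updates the `"Plugins"` array in `tracking.uproject`, which is plain JSON. As an illustrative stdlib-only sketch (the helper name `find_missing_plugins` is ours, not part of the repo's tooling), you can verify all three plugins are enabled before restarting:

```python
import json

# The three plugins this project requires (from the step above)
REQUIRED = {"MediaPipe4U", "MediaPipe4ULiveLink", "MediaPipeHolistic"}

def find_missing_plugins(uproject_text: str) -> set:
    """Return required plugins that are absent or disabled in a .uproject."""
    data = json.loads(uproject_text)
    enabled = {
        p["Name"]
        for p in data.get("Plugins", [])
        if p.get("Enabled", False)
    }
    return REQUIRED - enabled

# Example .uproject fragment with one plugin still disabled
sample = json.dumps({
    "EngineAssociation": "5.4",
    "Plugins": [
        {"Name": "MediaPipe4U", "Enabled": True},
        {"Name": "MediaPipe4ULiveLink", "Enabled": True},
        {"Name": "MediaPipeHolistic", "Enabled": False},
    ],
})
print(find_missing_plugins(sample))  # {'MediaPipeHolistic'}
```

Pointing this at `project/tracking/tracking.uproject` is a quick sanity check when the tracker fails to start after an editor update.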
### Setup Options
#### Option A: Automated Setup (Recommended)
Run the PowerShell automation script:
```powershell
cd project
.\run_mediapipe_setup.ps1
```

This creates:

- `BP_MediaPipeHolistic` blueprint with the tracking component
- `BP_FaceLink` blueprint for the LiveLink connection
- Actors automatically placed in the current level
#### Option B: Manual Setup

1. **Regenerate project files:**
   - Right-click `project/tracking/tracking.uproject`
   - Select "Generate Visual Studio project files"
2. **Build the project:**
   - Open `project/tracking/tracking.sln` in Visual Studio
   - Build > Build Solution (Ctrl+Shift+B)
3. **Configure MetaHuman:**
   - Open the level in the UE Editor
   - Select your MetaHuman
   - Details > Face component > LiveLink Subject Name = `M4UFace`
   - Save the level
#### Option C: Python Console

Run the automation script directly from the editor's Python console:

- Open the UE Editor with the project
- Window > Developer Tools > Output Log > Python
- Run:

  ```python
  exec(open(r'C:\path\to\project\create_mediapipe_setup.py').read())
  ```

## Verifying the Setup

- Open your level in Unreal Editor
- Press Play (Alt+P)
- Check Output Log for initialization messages:
  ```
  LogTemp: MediaPipe4U Face Tracking - Initializing
  LogTemp: MediaPipe4U: Camera Device = 0
  LogTemp: MediaPipe4U: LiveLink Subject Name = M4UFace
  LogMediaPipe4U: Webcam device 0 opened successfully
  LogLiveLink: LiveLink source 'M4UFace' connected
  ```
- Your webcam light should turn on
- MetaHuman face should mirror your expressions
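The repo's `monitor_logs.ps1` handles log watching on Windows. As an illustration of the same idea, here is a minimal stdlib Python sketch that scans a saved Output Log for the startup lines above (the log categories are taken from the sample output; the helper name `scan_log` is ours):

```python
import re
from pathlib import Path

# Log categories worth watching during startup (from the sample output above)
PATTERN = re.compile(r"^(LogTemp: MediaPipe4U|LogMediaPipe4U|LogLiveLink)")

def scan_log(path: Path) -> list:
    """Return tracking-related lines from a saved UE Output Log."""
    lines = path.read_text(encoding="utf-8", errors="ignore").splitlines()
    return [ln for ln in lines if PATTERN.match(ln)]

# Demo on a synthetic log file
demo = Path("demo_output.log")
demo.write_text(
    "LogInit: Engine started\n"
    "LogMediaPipe4U: Webcam device 0 opened successfully\n"
    "LogLiveLink: LiveLink source 'M4UFace' connected\n"
)
for line in scan_log(demo):
    print(line)
demo.unlink()
```

In practice you would point it at the project's `Saved/Logs/tracking.log` (the standard UE5 log location) instead of the demo file.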
## Configuration

All settings are configurable in C++.

Change the webcam device (`MediaPipe4UFaceTrackerActor.h`):

```cpp
int32 CameraDeviceIndex = 0; // Change to 1, 2, etc.
```

Change the LiveLink subject name:

```cpp
FString LiveLinkSubjectName = TEXT("M4UFace");
```

Disable auto-start:

```cpp
bool bAutoStart = false;
```
## System Flow

```
[PIE/Game Start]
        |
        v
[AutoSpawnGameMode] -----------> Spawns MediaPipe4UFaceTrackerActor
        |
        v
[UM4UFaceLandmarkerComponent] -> MediaPipe4U plugin
        |
        v
[Webcam Capture] --------------> Face detection (52 ARKit blendshapes)
        |
        v
[LiveLink Source "M4UFace"] ---> Real-time streaming
        |
        v
[MetaHuman Face Component] ----> Receives via LiveLink
        |
        v
[Character Animation] ---------> Facial expression mirroring
```
## Documentation

| Document | Description |
|---|---|
| `project/AUTOMATION_README.md` | Python automation details |
| `project/SETUP_INSTRUCTIONS.md` | Manual setup steps |
| `project/build_instructions.md` | Build and troubleshooting |
| `project/implementation_plan.md` | Technical implementation |
| `project/walkthrough.md` | Complete walkthrough |
## Troubleshooting

### Webcam Not Detected

- Check Windows Settings > Privacy > Camera > Allow apps access
- Verify no other app is using the camera
- Try a different `CameraDeviceIndex` (0, 1, 2)
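Rather than changing the index by trial and error, you can probe candidates in order. A sketch with the capture backend injected as a callable so it stays testable — in real use `open_fn` would wrap something like OpenCV's `cv2.VideoCapture(index).isOpened()` (our assumption for illustration; MediaPipe4U itself only needs the index):

```python
from typing import Callable, Optional

def first_working_camera(open_fn: Callable[[int], bool],
                         max_index: int = 4) -> Optional[int]:
    """Return the first device index for which open_fn succeeds, else None."""
    for index in range(max_index + 1):
        if open_fn(index):
            return index
    return None

# Demo with a fake backend where only device 2 exists;
# swap in a real capture check (e.g. OpenCV) on your machine.
fake_backend = lambda index: index == 2
print(first_working_camera(fake_backend))  # 2
```

Whatever index this reports is the value to put in `CameraDeviceIndex`.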
### Plugin Errors

- Verify the MediaPipe4U license is valid
- Check that all three MediaPipe4U plugins are enabled
- Clean and rebuild the project: Build > Clean Solution, then rebuild
### MetaHuman Not Animating

- Verify the LiveLink Subject Name matches exactly: `M4UFace`
- Check the LiveLink window (Window > LiveLink) for an active source
- Ensure the MetaHuman Face component has LiveLink integration enabled
### Build Errors

- Regenerate the Visual Studio project files
- Clean the `Intermediate/` and `Binaries/` folders
- Delete the `.vs` folder and reopen the solution
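The cleanup steps above can be scripted. A cautious stdlib sketch, illustrative rather than part of the repo's tooling (the folder names come from the steps above; UE and Visual Studio regenerate all three):

```python
import shutil
from pathlib import Path

# Build artifacts that are safe to delete; UE/VS regenerate them
STALE_DIRS = ["Intermediate", "Binaries", ".vs"]

def clean_build_artifacts(project_dir: Path) -> list:
    """Delete stale build folders under project_dir; return what was removed."""
    removed = []
    for name in STALE_DIRS:
        target = project_dir / name
        if target.is_dir():
            shutil.rmtree(target)
            removed.append(name)
    return removed

# Demo on a throwaway directory tree (real use: the folder
# containing tracking.uproject)
demo = Path("demo_proj")
(demo / "Intermediate").mkdir(parents=True, exist_ok=True)
(demo / "Source").mkdir(exist_ok=True)
print(clean_build_artifacts(demo))  # ['Intermediate']
shutil.rmtree(demo)
```

Only generated folders are touched; `Source/`, `Config/`, and `Content/` are never in the list.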
## Code Constraints

Per project constraints (max 200 lines per file):

- All source files are currently under the limit
- Refactor when a file exceeds it
## License

See the MediaPipe4U plugin documentation for licensing terms.
See `.md/TASKS.md` for active development tasks and `.md/LOG.md` for development history.
## Contributing

- Read `.md/AGENTS.md` for AI agent workflow rules
- Follow the atomic commit pattern: one logical change per commit
- Use conventional commits: `feat:`, `fix:`, `docs:`, `chore:`
- Update `.md/LOG.md` after session completion
**Status:** Implementation complete, ready for testing and production use.