
MediaPipe4U MetaHuman Face Tracking

Real-time webcam-based face tracking for MetaHuman characters in Unreal Engine 5 using the MediaPipe4U plugin.

Overview

This project provides a complete solution for driving MetaHuman facial animation from webcam-based face tracking through MediaPipe4U and LiveLink. The implementation takes a code-first approach: C++ classes automatically initialize and configure everything, minimizing manual editor setup.
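
As an illustration of that pattern, the sketch below shows a GameMode that spawns the tracker actor when play begins. This is a minimal sketch rather than the repository's actual AutoSpawnGameMode files; the TRACKING_API macro and the spawn parameters are assumptions.

```cpp
// Simplified sketch of the auto-spawn pattern (see Source/tracking for the actual
// files). The TRACKING_API macro assumes the game module is named "tracking".

// AutoSpawnGameMode.h
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/GameModeBase.h"
#include "AutoSpawnGameMode.generated.h"

UCLASS()
class TRACKING_API AAutoSpawnGameMode : public AGameModeBase
{
    GENERATED_BODY()

protected:
    virtual void BeginPlay() override;
};

// AutoSpawnGameMode.cpp
#include "AutoSpawnGameMode.h"
#include "MediaPipe4UFaceTrackerActor.h"

void AAutoSpawnGameMode::BeginPlay()
{
    Super::BeginPlay();

    // Spawn the face tracker so nothing has to be placed in the level by hand.
    FActorSpawnParameters SpawnParams;
    SpawnParams.SpawnCollisionHandlingOverride = ESpawnActorCollisionHandlingMethod::AlwaysSpawn;

    GetWorld()->SpawnActor<AMediaPipe4UFaceTrackerActor>(
        AMediaPipe4UFaceTrackerActor::StaticClass(),
        FVector::ZeroVector, FRotator::ZeroRotator, SpawnParams);
}
```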

Key Features

  • Automatic Setup: C++ classes spawn automatically via custom GameMode
  • Webcam Face Tracking: Real-time 52 ARKit blendshape capture via MediaPipe4U
  • LiveLink Integration: Seamless connection to MetaHuman face rigs
  • Python Automation: Scripts for programmatic blueprint and actor setup
  • One-Time Configuration: Only MetaHuman LiveLink connection requires manual setup

Tech Stack

  • Engine: Unreal Engine 5.4+
  • Language: C++ (C++17)
  • Plugin: MediaPipe4U (MediaPipeHolistic, MediaPipe4ULiveLink)
  • Camera: Webcam input (device index 0)
  • Output: LiveLink ARKit blendshapes (52 shapes; an illustrative subset is listed below)
  • Automation: Python Editor Scripting, PowerShell
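
For context, the 52 ARKit blendshapes are the standard facial coefficients popularized by Apple's ARKit face tracking, each streamed over LiveLink as a per-frame float. The snippet below lists a small illustrative subset written for this README; exact curve naming can differ between the MediaPipe, ARKit, and LiveLink conventions.

```cpp
// Illustrative subset of the 52 blendshape curves (written for this README,
// not taken from the project). Each value streams as a float in the 0..1 range.
#include "CoreMinimal.h"

static const FName ExampleARKitShapes[] = {
    TEXT("eyeBlinkLeft"),    TEXT("eyeBlinkRight"),
    TEXT("jawOpen"),         TEXT("mouthSmileLeft"),
    TEXT("mouthSmileRight"), TEXT("browInnerUp"),
    TEXT("cheekPuff"),       TEXT("tongueOut")
    // ...the remaining shapes complete the 52-shape ARKit set.
};
```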

Project Structure


routed-collar/
  .gitignore                      # Git ignore patterns (Python, UE5)
  README.md                       # This file
  .md/                            # Local-only documentation
    AGENTS.md                     # AI agent rules
    CONTEXT.md                    # Project context
    GUIDE.md                      # Human workflow guide
    LOG.md                        # Development log
    TASKS.md                      # Task management
  build_and_run.ps1               # Build automation script
  monitor_logs.ps1                # Log monitoring utility
  project/
    create_mediapipe_setup.py     # Python automation for blueprints
    run_mediapipe_setup.ps1       # PowerShell wrapper for automation
    AUTOMATION_README.md          # Automation documentation
    build_instructions.md         # Build guide
    implementation_plan.md        # Technical implementation details
    SETUP_INSTRUCTIONS.md         # Manual setup guide
    walkthrough.md                # Complete walkthrough
    task.md                       # Current tasks
    tracking/                     # Unreal Engine 5 project
      tracking.uproject           # Project file
      Config/                     # Engine configuration
      Content/                    # Assets (maps, materials, etc.)
      Source/                     # C++ source code
        tracking/
          Public/
            MediaPipe4UFaceTrackerActor.h
            AutoSpawnGameMode.h
          Private/
            MediaPipe4UFaceTrackerActor.cpp
            AutoSpawnGameMode.cpp
  src/                            # Empty (reserved for future)
  tests/                          # Empty (reserved for future)
Quick Start

Prerequisites

  • Unreal Engine 5.4 or later
  • MediaPipe4U plugin (with valid license)
  • MetaHuman character in your project
  • Webcam (works best with decent lighting)

Installation

  1. Clone the repository:
    git clone <repository-url>
    cd routed-collar
    

  2. Enable MediaPipe4U plugins:
  • Open project/tracking/tracking.uproject in Unreal Editor
  • Edit > Plugins > Search "MediaPipe"
  • Enable: MediaPipe4U, MediaPipe4ULiveLink, MediaPipeHolistic
  • Restart editor

Setup Options

Option A: Automated Setup (Recommended)

Run the PowerShell automation script:

```powershell
cd project
.\run_mediapipe_setup.ps1
```

This creates:

  • BP_MediaPipeHolistic blueprint with tracking component
  • BP_FaceLink blueprint for LiveLink connection
  • Actors automatically placed in current level

Option B: Manual C++ Setup

  1. Regenerate project files:
  • Right-click project/tracking/tracking.uproject
  • Select "Generate Visual Studio project files"
  2. Build the project:
  • Open project/tracking/tracking.sln in Visual Studio
  • Build > Build Solution (Ctrl+Shift+B)
  3. Configure MetaHuman:
  • Open level in UE Editor
  • Select your MetaHuman
  • Details > Face component > LiveLink Subject Name = M4UFace
  • Save level

Option C: Python Script in Editor

  1. Open UE Editor with the project
  2. Window > Developer Tools > Output Log > Python
  3. Run:
exec(open(r'C:\path\to\project\create_mediapipe_setup.py').read())

Usage

Testing in Editor

  1. Open your level in Unreal Editor
  2. Press Play (Alt+P)
  3. Check Output Log for initialization messages:
LogTemp: MediaPipe4U Face Tracking - Initializing
LogTemp: MediaPipe4U: Camera Device = 0
LogTemp: MediaPipe4U: LiveLink Subject Name = M4UFace
LogMediaPipe4U: Webcam device 0 opened successfully
LogLiveLink: LiveLink source 'M4UFace' connected

  4. Your webcam light should turn on
  5. MetaHuman face should mirror your expressions (if it does not, see the LiveLink check below)
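
If the log looks correct but the face does not move, the LiveLink subject can also be checked from code. The helper below is a debugging sketch written for this README rather than part of the repository; it relies on the engine's ILiveLinkClient modular feature, whose signatures can vary slightly between engine versions.

```cpp
// Debug helper (not part of the repo): confirms a LiveLink subject named
// "M4UFace" is registered. Requires a dependency on the LiveLinkInterface module.
#include "Features/IModularFeatures.h"
#include "ILiveLinkClient.h"
#include "LiveLinkTypes.h"

static bool IsM4UFaceSubjectPresent()
{
    IModularFeatures& Features = IModularFeatures::Get();
    if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
    {
        return false; // LiveLink is not running
    }

    ILiveLinkClient& Client =
        Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

    // Enumerate enabled subjects (including virtual ones) and look for M4UFace.
    for (const FLiveLinkSubjectKey& Key : Client.GetSubjects(/*bIncludeDisabledSubject*/ false,
                                                             /*bIncludeVirtualSubject*/ true))
    {
        if (Key.SubjectName.Name == FName(TEXT("M4UFace")))
        {
            return true;
        }
    }
    return false;
}
```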

Customization

All settings are configurable in C++ (a consolidated header sketch follows these snippets):

Change webcam device (MediaPipe4UFaceTrackerActor.h):

int32 CameraDeviceIndex = 0;  // Change to 1, 2, etc.

Change LiveLink subject name:

FString LiveLinkSubjectName = TEXT("M4UFace");

Disable auto-start:

bool bAutoStart = false;
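
For orientation, the sketch below shows how these properties would typically be declared in MediaPipe4UFaceTrackerActor.h. It is illustrative only; the UPROPERTY specifiers, category name, and API macro are assumptions, not a copy of the repository's header.

```cpp
// MediaPipe4UFaceTrackerActor.h -- excerpt-style sketch, not the repository header.
// The UPROPERTY specifiers, category, and TRACKING_API macro are illustrative.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "MediaPipe4UFaceTrackerActor.generated.h"

UCLASS()
class TRACKING_API AMediaPipe4UFaceTrackerActor : public AActor
{
    GENERATED_BODY()

public:
    // Which webcam to open (0 = first device reported by the OS).
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Face Tracking")
    int32 CameraDeviceIndex = 0;

    // LiveLink subject name the MetaHuman Face component subscribes to.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Face Tracking")
    FString LiveLinkSubjectName = TEXT("M4UFace");

    // Begin capturing as soon as the actor starts (set false to start manually).
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Face Tracking")
    bool bAutoStart = true;

protected:
    virtual void BeginPlay() override;
};
```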

How It Works

    System Flow

    [PIE/Game Start]
            |
            v
    [AutoSpawnGameMode] ---------------> Spawns MediaPipe4UFaceTrackerActor
            |
            v
    [UM4UFaceLandmarkerComponent] -----> MediaPipe4U plugin
            |
            v
    [Webcam Capture] ------------------> Face detection (52 ARKit blendshapes)
            |
            v
    [LiveLink Source "M4UFace"] -------> Real-time streaming
            |
            v
    [MetaHuman Face Component] --------> Receives via LiveLink
            |
            v
    [Character Animation] -------------> Facial expression mirroring
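
In code, that flow comes down to the tracker actor's BeginPlay. The sketch below mirrors the log lines shown under Testing in Editor; the call that actually starts the MediaPipe4U component is plugin-specific and is left as a placeholder comment rather than an invented API.

```cpp
// Sketch of the tracker actor's BeginPlay (simplified; see Source/tracking for
// the real implementation).
void AMediaPipe4UFaceTrackerActor::BeginPlay()
{
    Super::BeginPlay();

    // These match the initialization messages listed under "Testing in Editor".
    UE_LOG(LogTemp, Log, TEXT("MediaPipe4U Face Tracking - Initializing"));
    UE_LOG(LogTemp, Log, TEXT("MediaPipe4U: Camera Device = %d"), CameraDeviceIndex);
    UE_LOG(LogTemp, Log, TEXT("MediaPipe4U: LiveLink Subject Name = %s"), *LiveLinkSubjectName);

    if (bAutoStart)
    {
        // Placeholder: start the UM4UFaceLandmarkerComponent here via the
        // MediaPipe4U plugin API (camera index + LiveLink subject name).
        // The exact calls are defined by the plugin and are not shown in this sketch.
    }
}
```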

Documentation

Document                          Description
project/AUTOMATION_README.md      Python automation details
project/SETUP_INSTRUCTIONS.md     Manual setup steps
project/build_instructions.md     Build and troubleshooting
project/implementation_plan.md    Technical implementation
project/walkthrough.md            Complete walkthrough

Troubleshooting

Webcam not detected

  • Check Windows Settings > Privacy > Camera > Allow apps access
  • Verify no other app is using the camera
  • Try different CameraDeviceIndex (0, 1, 2)

MediaPipe4U plugin errors

  • Verify license is valid
  • Check all three MediaPipe4U plugins are enabled
  • Clean/rebuild project: Build > Clean Solution, then rebuild

MetaHuman not responding

  • Verify LiveLink Subject Name matches exactly: M4UFace
  • Check LiveLink window (Window > LiveLink) for active source
  • Ensure MetaHuman Face component has LiveLink integration enabled

Build errors

  • Regenerate Visual Studio project files
  • Clean Intermediate/Binaries folders
  • Delete .vs folder and reopen solution

File Length Compliance

Per project constraints (maximum 200 lines per file):

  • All source files are currently under the limit
  • Refactor any file that grows beyond it

License

See MediaPipe4U plugin documentation for licensing terms.

Development

Task Management

See .md/TASKS.md for active development tasks and .md/LOG.md for development history.

Contributing

  1. Read .md/AGENTS.md for AI agent workflow rules
  2. Follow atomic commit pattern: one logical change per commit
  3. Use conventional commits: feat:, fix:, docs:, chore:
  4. Update .md/LOG.md after session completion

Status: Implementation complete, ready for testing and production use.
