On-device emotion inference from biosignals (HR/RR) for Dart, Python, Kotlin, and Swift applications
Synheart Emotion is a comprehensive SDK ecosystem for inferring momentary emotions from biosignals (heart rate and RR intervals) directly on device, ensuring privacy and real-time performance.
- 📱 Multi-Platform: Dart/Flutter, Python, Kotlin, Swift
- 📈 Real-Time Inference: Live emotion detection from heart rate and RR intervals
- 🧠 On-Device Processing: All computations happen locally for privacy
- 🔗 Unified API: Consistent API across all platforms
- 🔒 Privacy-First: No raw biometric data leaves your device
- ⚡ High Performance: < 5 ms inference latency on mid-range devices
- 📊 Research-Based: Models trained on the WESAD dataset with 78% accuracy
- 🧪 Thread-Safe: Concurrent data ingestion supported on all platforms (see the sketch after this list)
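As a quick illustration of the concurrency claim, here is a minimal Python sketch of two producer threads feeding one engine. It assumes the `EmotionEngine`/`EmotionConfig` API shown in the Python example later in this README; the sample data is synthetic:

```python
import threading
from datetime import datetime
from synheart_emotion import EmotionEngine, EmotionConfig

engine = EmotionEngine.from_pretrained(EmotionConfig())

def ingest(samples):
    # push() is documented as thread-safe, so multiple producers
    # may feed a single engine concurrently.
    for hr, rrs in samples:
        engine.push(hr=hr, rr_intervals_ms=rrs, timestamp=datetime.now())

# Two hypothetical producers (e.g., replayed sensor streams)
t1 = threading.Thread(target=ingest, args=([(72.0, [850.0, 820.0])],))
t2 = threading.Thread(target=ingest, args=([(75.0, [810.0, 805.0])],))
t1.start(); t2.start()
t1.join(); t2.join()

for result in engine.consume_ready():
    print(result.emotion, result.confidence)
```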
All SDKs provide identical functionality with platform-idiomatic APIs. Each SDK is maintained in its own repository:
Dart/Flutter (pubspec.yaml):
```yaml
dependencies:
  synheart_emotion: ^0.2.1
```
📦 Repository: synheart-emotion-dart

Python (pip):
```bash
pip install synheart-emotion
```
📦 Repository: synheart-emotion-python

Kotlin (Gradle):
```kotlin
dependencies {
    implementation("ai.synheart:emotion:0.1.0")
}
```
📦 Repository: synheart-emotion-kotlin

Swift Package Manager:
```swift
dependencies: [
    .package(url: "https://github.com/synheart-ai/synheart-emotion-swift.git", from: "0.1.0")
]
```
CocoaPods:
```ruby
pod 'SynheartEmotion', '~> 0.1.0'
```
📦 Repository: synheart-emotion-swift
This repository serves as the source of truth for shared resources across all SDK implementations:
```
synheart-emotion/                     # Source of truth repository
├── models/                           # ML model definitions and assets
│   ├── wesad_emotion_v1_0.json      # Model configuration
│   └── *.onnx                       # Pre-trained model weights
│
├── docs/                             # Technical documentation
│   ├── RFC-E1.1.md                  # Complete technical specification
│   └── MODEL_CARD.md                # Model details and performance
│
├── tools/                            # Development tools
│   ├── synthetic-data-generator/    # Generate test biosignal data
│   └── wesad-reference-models/      # Research artifacts (14 ML models)
│
├── examples/                         # Cross-platform example applications
├── scripts/                          # Build and deployment scripts
└── CONTRIBUTING.md                   # Contribution guidelines for all SDKs
```
Platform-specific SDK repositories (maintained separately):
- synheart-emotion-dart - Dart/Flutter SDK
- synheart-emotion-python - Python SDK
- synheart-emotion-kotlin - Kotlin SDK
- synheart-emotion-swift - Swift SDK
Python:
```python
from datetime import datetime
from synheart_emotion import EmotionEngine, EmotionConfig

# Initialize engine
config = EmotionConfig()
engine = EmotionEngine.from_pretrained(config)

# Push biosignal data
engine.push(
    hr=72.0,
    rr_intervals_ms=[850.0, 820.0, 830.0, 845.0, 825.0],
    timestamp=datetime.now(),
)

# Get inference results
results = engine.consume_ready()
for result in results:
    print(f"Emotion: {result.emotion} ({result.confidence:.1%})")
```

Dart:
```dart
import 'package:synheart_emotion/synheart_emotion.dart';

// Initialize the emotion engine
final engine = EmotionEngine.fromPretrained(
  const EmotionConfig(
    window: Duration(seconds: 60),
    step: Duration(seconds: 5),
  ),
);

// Push biometric data
engine.push(
  hr: 72.0,
  rrIntervalsMs: [850.0, 820.0, 830.0, 845.0, 825.0],
  timestamp: DateTime.now(),
);

// Get results
final results = engine.consumeReady();
for (final result in results) {
  print('Emotion: ${result.emotion} (${result.confidence})');
}
```

Kotlin:
```kotlin
import ai.synheart.emotion.*
import java.util.Date

val config = EmotionConfig()
val engine = EmotionEngine.fromPretrained(config)

engine.push(
    hr = 72.0,
    rrIntervalsMs = listOf(850.0, 820.0, 830.0, 845.0, 825.0),
    timestamp = Date()
)

val results = engine.consumeReady()
results.forEach { result ->
    println("Emotion: ${result.emotion} (${result.confidence})")
}
```

Swift:
```swift
import Foundation
import SynheartEmotion

let config = EmotionConfig()
let engine = try! EmotionEngine.fromPretrained(config: config)

engine.push(
    hr: 72.0,
    rrIntervalsMs: [850.0, 820.0, 830.0, 845.0, 825.0],
    timestamp: Date()
)

let results = engine.consumeReady()
results.forEach { result in
    print("Emotion: \(result.emotion) (\(result.confidence))")
}
```

The library currently supports three emotion categories:
- 😄 Amused: Positive, engaged emotional state
- 😌 Calm: Relaxed, peaceful emotional state
- 😰 Stressed: Anxious, tense emotional state
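A minimal sketch of dispatching on these three labels in application code; the handler functions are hypothetical, and the label strings are assumed to match the names above:

```python
def suggest_breathing_exercise():
    print("Try a 60-second breathing exercise")

def log_positive_moment(confidence):
    print(f"Logged an amused moment ({confidence:.1%})")

def on_emotion(result):
    # Dispatch on the three class labels; the string values are
    # assumed to be "Amused", "Calm", and "Stressed".
    if result.emotion == "Stressed":
        suggest_breathing_exercise()
    elif result.emotion == "Amused":
        log_positive_moment(result.confidence)
    # "Calm" is treated as the neutral baseline and not acted on
```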
Generate realistic biosignal data for testing all SDKs:

```bash
cd tools/synthetic-data-generator

# Generate test data
python cli.py --emotion Calm --duration 60 --output ./data

# Generate a session with transitions
python cli.py --session Calm Stressed Amused --transitions --output ./data
```

Exports to: CSV, JSON, Python, Kotlin, Swift
📖 Data Generator Documentation
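As a rough sketch, generated CSV output can be replayed through the Python SDK. The file name and column layout below (`timestamp`, `hr`, and `;`-separated `rr_intervals_ms`) are assumptions about the generator's export, not a documented schema:

```python
import csv
from datetime import datetime
from synheart_emotion import EmotionEngine, EmotionConfig

engine = EmotionEngine.from_pretrained(EmotionConfig())

with open("data/calm_60s.csv") as f:  # hypothetical output file
    for row in csv.DictReader(f):
        engine.push(
            hr=float(row["hr"]),  # assumed column name
            rr_intervals_ms=[float(x) for x in row["rr_intervals_ms"].split(";")],  # assumed encoding
            timestamp=datetime.fromisoformat(row["timestamp"]),
        )

for result in engine.consume_ready():
    print(result.emotion, f"{result.confidence:.1%}")
```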
Research artifacts with 14 pre-trained ML models from WESAD dataset:
- XGBoost, RandomForest, ExtraTrees, KNN, LDA, SVM, etc.
- For research and model comparison only
- Not for production use (use SDKs instead)
📖 Research Models Documentation
All SDKs implement the same architecture:
```
Wearable / Sensor
  ──(HR bpm, RR ms)──►  Your App
                            │
                            ▼
                  Synheart Emotion SDK
  [Ring Buffer] → [Feature Extraction] → [Normalization]
                            ↓
                         [Model]
                            ↓
                      EmotionResult
```
Components:
- Ring Buffer: Holds last 60s of HR/RR data (configurable)
- Feature Extractor: Computes HR mean, SDNN, RMSSD (sketched after this list)
- Scaler: Standardizes features using the training μ/σ
- Model: Linear SVM (One-vs-Rest) with softmax
- Emitter: Throttles outputs (default: every 5s)
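The feature stage computes standard HRV statistics, so it can be illustrated in a few lines of plain Python. This is an illustration of the math, not the SDK's internal code; SDNN is shown as a population standard deviation, and the SDK may use the sample form:

```python
import math

def extract_features(hr_samples, rr_ms):
    """Compute [hr_mean, sdnn, rmssd] over one window."""
    hr_mean = sum(hr_samples) / len(hr_samples)
    # SDNN: standard deviation of the RR intervals (ms)
    mean_rr = sum(rr_ms) / len(rr_ms)
    sdnn = math.sqrt(sum((r - mean_rr) ** 2 for r in rr_ms) / len(rr_ms))
    # RMSSD: root mean square of successive RR differences (ms)
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return [hr_mean, sdnn, rmssd]

print(extract_features([72.0, 73.0, 71.0], [850.0, 820.0, 830.0, 845.0, 825.0]))
```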
All SDKs expose identical functionality:
| Feature | Python | Kotlin | Swift | Dart |
|---|---|---|---|---|
| EmotionConfig | ✅ | ✅ | ✅ | ✅ |
| EmotionEngine | ✅ | ✅ | ✅ | ✅ |
| EmotionResult | ✅ | ✅ | ✅ | ✅ |
| EmotionError | ✅ | ✅ | ✅ | ✅ |
| Feature Extraction | ✅ | ✅ | ✅ | ✅ |
| Linear SVM Model | ✅ | ✅ | ✅ | ✅ |
| Thread-Safe | ✅ | ✅ | ✅ | ✅ |
| Sliding Window | ✅ | ✅ | ✅ | ✅ |
- ✅ 16/16 tests passing (100%)
- ✅ All examples working
- ✅ CLI demo functional
- ✅ All modules compile successfully
- ✅ 6 Kotlin source files
- ✅ API parity verified
- ✅ Gradle build and tests passing
- ✅ Swift build successful
- ✅ 6 Swift source files
- ✅ Multi-platform support (iOS, macOS, watchOS, tvOS)
- ✅ Swift Package Manager integration
Model Type: Linear SVM (One-vs-Rest)
Task: Momentary emotion recognition from HR/RR
Input Features: [hr_mean, sdnn, rmssd] over a 60s rolling window
Performance:
- Accuracy: ~78%
- Macro-F1: ~72%
- Latency: < 5ms on modern mid-range devices
The model is trained on a WESAD-derived three-class subset with artifact rejection and normalization.
📖 Model Card | RFC E1.1
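To make the pipeline concrete, here is a minimal sketch of the classification step as described above: three one-vs-rest linear scores over the standardized features, followed by a softmax. The weights and biases are illustrative placeholders, not the shipped model parameters:

```python
import math

# Placeholder per-class weights/biases for [hr_mean, sdnn, rmssd]
WEIGHTS = {
    "Amused": [0.8, -0.2, -0.5],
    "Calm": [-0.6, 0.4, 0.7],
    "Stressed": [0.5, -0.9, -0.8],
}
BIASES = {"Amused": 0.1, "Calm": 0.0, "Stressed": -0.1}

def classify(z):
    # z: standardized features, (x - mu) / sigma using training statistics
    scores = {c: sum(w * f for w, f in zip(WEIGHTS[c], z)) + BIASES[c] for c in WEIGHTS}
    # Numerically stable softmax over the three decision scores
    m = max(scores.values())
    exp = {c: math.exp(s - m) for c, s in scores.items()}
    total = sum(exp.values())
    probs = {c: e / total for c, e in exp.items()}
    label = max(probs, key=probs.get)
    return label, probs[label]

print(classify([0.3, -1.1, -0.7]))
```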
- On-Device Processing: All emotion inference happens locally
- No Data Retention: Raw biometric data is not retained after processing
- No Network Calls: No data is sent to external servers
- Privacy-First Design: No built-in storage; you control what gets persisted (see the sketch after this list)
- Not a Medical Device: This library is for wellness and research purposes only
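Because the engine keeps nothing on disk, persistence is an explicit opt-in by the host app. A minimal Python sketch of local logging; the file name and record shape are illustrative:

```python
import json
from datetime import datetime, timezone

def persist_result(result, path="emotion_log.jsonl"):
    # Store only derived labels, never the raw HR/RR samples
    record = {
        "at": datetime.now(timezone.utc).isoformat(),
        "emotion": result.emotion,
        "confidence": result.confidence,
    }
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")
```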
- Dart SDK - Dart/Flutter implementation
- Python SDK - Python implementation
- Kotlin SDK - Kotlin implementation
- Swift SDK - Swift implementation
- Synthetic Data Generator - Test data generation
- WESAD Reference Models - Research artifacts
- RFC E1.1 - Complete technical specification
- Model Card - Model details and performance
- Contributing Guide - How to contribute (covers all SDKs)
- Changelog - Version history for all SDKs
- Dart SDK: Flutter >= 3.10.0, Dart >= 3.0.0
- Python SDK: Python >= 3.8
- Kotlin SDK: Kotlin 1.8+, Android API 21+ (if targeting Android)
- Swift SDK: Swift 5.9+, iOS 13+ / macOS 11+ (if targeting Apple platforms)
For SDK-specific tests, see the individual SDK repositories.
Generate test data for all SDKs:

```bash
cd tools/synthetic-data-generator
python cli.py --emotion Calm --duration 60 --output ./test_data
```

```python
# Python example
from synheart_emotion import EmotionEngine, EmotionConfig
from your_sensor import get_biosignal_stream

engine = EmotionEngine.from_pretrained(EmotionConfig())

for data_point in get_biosignal_stream():
    engine.push(
        hr=data_point.heart_rate,
        rr_intervals_ms=data_point.rr_intervals,
        timestamp=data_point.timestamp,
    )
    results = engine.consume_ready()
    if results:
        print(f"Current emotion: {results[0].emotion}")
```

See Swift SDK Examples for HealthKit integration.
Target Performance (mid-range device):
- Latency: < 5ms per inference
- Model Size: < 100 KB
- CPU Usage: < 2% during active streaming
- Memory: < 3 MB (engine + buffers)
- Accuracy: 78% on WESAD dataset (3-class emotion recognition)
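To sanity-check the latency target on your own hardware, here is a rough sketch using `time.perf_counter` over synthetic input; it is not a rigorous benchmark, and whether `consume_ready()` emits a result depends on the configured window and step:

```python
import time
from datetime import datetime, timedelta
from synheart_emotion import EmotionEngine, EmotionConfig

engine = EmotionEngine.from_pretrained(EmotionConfig())

# Fill the 60 s window with synthetic samples, one per second
start = datetime.now() - timedelta(seconds=60)
for i in range(60):
    engine.push(hr=72.0, rr_intervals_ms=[830.0], timestamp=start + timedelta(seconds=i))

t0 = time.perf_counter()
results = engine.consume_ready()
elapsed_ms = (time.perf_counter() - t0) * 1000
print(f"{len(results)} result(s) in {elapsed_ms:.2f} ms")
```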
We welcome contributions! Please see our Contributing Guidelines for details on:
- Code style and conventions
- Testing requirements
- Pull request process
- Development setup
This project is licensed under the MIT License - see the LICENSE file for details.
- Synheart AI: synheart.ai
- Documentation: Full Documentation
- Issues: GitHub Issues
- Discussions: GitHub Discussions
If you use this SDK in your research:

```bibtex
@software{synheart_emotion,
  title   = {Synheart Emotion: Multi-platform SDK for on-device emotion inference from biosignals},
  author  = {{Synheart AI Team}},
  year    = {2025},
  version = {0.1.0},
  url     = {https://github.com/synheart-ai/synheart-emotion}
}
```

WESAD Dataset:

```bibtex
@inproceedings{schmidt2018introducing,
  title     = {Introducing WESAD, a multimodal dataset for wearable stress and affect detection},
  author    = {Schmidt, Philip and Reiss, Attila and Duerichen, Robert and Marberger, Claus and Van Laerhoven, Kristof},
  booktitle = {Proceedings of the 20th ACM International Conference on Multimodal Interaction (ICMI)},
  year      = {2018}
}
```

- Israel Goytom - Initial work, RFC Design & Architecture
- Synheart AI Team - Development & Research
Made with ❤️ by the Synheart AI Team
Technology with a heartbeat.