On-device emotion inference from biosignals (HR/RR) for Flutter applications
- 📱 Cross-Platform: Works on iOS and Android
- 🔄 Real-Time Inference: Live emotion detection from heart rate and RR intervals
- 🧠 On-Device Processing: All computations happen locally for privacy
- 📊 Unified Output: Consistent emotion labels with confidence scores
- 🔒 Privacy-First: No raw biometric data leaves your device
- ⚡ High Performance: < 5ms inference latency on mid-range devices
Add synheart_emotion to your pubspec.yaml:
```yaml
dependencies:
  synheart_emotion: ^0.2.2
```

Then run:

```bash
flutter pub get
```

Basic usage:

```dart
import 'package:synheart_emotion/synheart_emotion.dart';
void main() async {
  // Initialize the emotion engine
  final engine = EmotionEngine.fromPretrained(
    const EmotionConfig(
      window: Duration(seconds: 60),
      step: Duration(seconds: 5),
    ),
  );

  // Push biometric data
  engine.push(
    hr: 72.0,
    rrIntervalsMs: [823, 810, 798, 815, 820],
    timestamp: DateTime.now().toUtc(),
  );

  // Get emotion results (synchronous - no await needed)
  final results = engine.consumeReady();
  for (final result in results) {
    print('Emotion: ${result.emotion} (${(result.confidence * 100).toStringAsFixed(1)}%)');
  }
}
```
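If you drive the engine from your own sensor callback, polling works too, since `consumeReady()` is synchronous. A minimal sketch, assuming only the `push`/`consumeReady` API shown above; the `readSample` callback is a hypothetical stand-in for your sensor read:

```dart
import 'dart:async';

import 'package:synheart_emotion/synheart_emotion.dart';

/// Pushes one sample per second and drains any ready results.
/// `readSample` is hypothetical - supply your own sensor read.
Timer startPolling(
  EmotionEngine engine,
  ({double hr, List<double> rrMs}) Function() readSample,
) {
  return Timer.periodic(const Duration(seconds: 1), (_) {
    final sample = readSample();
    engine.push(
      hr: sample.hr,
      rrIntervalsMs: sample.rrMs,
      timestamp: DateTime.now().toUtc(),
    );
    // consumeReady() is synchronous; results appear once the
    // rolling window has accumulated enough RR intervals.
    for (final result in engine.consumeReady()) {
      print('Emotion: ${result.emotion}');
    }
  });
}
```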
You can also consume results as a stream:

```dart
// Stream emotion results
final emotionStream = EmotionStream.emotionStream(
  engine,
  tickStream, // Your biometric data stream
);

await for (final result in emotionStream) {
  print('Current emotion: ${result.emotion}');
  print('Probabilities: ${result.probabilities}');
}
```

synheart_emotion works independently but integrates seamlessly with synheart-wear for real wearable data.
First, add both to your pubspec.yaml:
```yaml
dependencies:
  synheart_wear: ^0.1.0     # For wearable data
  synheart_emotion: ^0.2.2  # For emotion inference
```

Then integrate in your app:

```dart
import 'package:synheart_wear/synheart_wear.dart';
import 'package:synheart_emotion/synheart_emotion.dart';
// Initialize both SDKs
final wear = SynheartWear();
final emotionEngine = EmotionEngine.fromPretrained(
  const EmotionConfig(window: Duration(seconds: 60)),
);

await wear.initialize();

// Stream wearable data to emotion engine
wear.streamHR(interval: Duration(seconds: 1)).listen((metrics) {
  emotionEngine.push(
    hr: metrics.getMetric(MetricType.hr),
    rrIntervalsMs: metrics.getMetric(MetricType.rrIntervals),
    timestamp: DateTime.now().toUtc(),
  );

  // Get emotion results (synchronous - no await needed)
  final emotions = emotionEngine.consumeReady();
  for (final emotion in emotions) {
    // Use emotion data in your app
    updateUI(emotion);
  }
});
```
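One practical note on the `listen()` call above: keep the returned `StreamSubscription` and cancel it when emotion updates are no longer needed. This is plain Dart stream handling, not a synheart_emotion API; assign the result of `wear.streamHR(...).listen(...)` to the field below:

```dart
import 'dart:async';

StreamSubscription<dynamic>? hrSubscription;

void stopStreaming() {
  // Call from dispose() (or equivalent) so sensor callbacks stop
  // and the emotion engine no longer receives pushes.
  hrSubscription?.cancel();
  hrSubscription = null;
}
```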
See examples/lib/integration_example.dart for complete integration examples.

The library currently supports three emotion categories:
- 😊 Amused: Positive, engaged emotional state
- 😌 Calm: Relaxed, peaceful emotional state
- 😰 Stressed: Anxious, tense emotional state
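Because `EmotionResult.emotion` is a plain string, a small mapper keeps UI code tidy. A sketch; note the label strings are assumed from the list above, so verify them against what your engine actually emits:

```dart
/// Maps an emotion label to a display emoji. The label casing is an
/// assumption - check the strings your engine actually produces.
String emotionEmoji(String emotion) {
  switch (emotion.toLowerCase()) {
    case 'amused':
      return '😊';
    case 'calm':
      return '😌';
    case 'stressed':
      return '😰';
    default:
      return '❓'; // Unknown label: surface it rather than crash.
  }
}
```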
The main class for emotion inference:

```dart
class EmotionEngine {
  // Create engine with pretrained model
  factory EmotionEngine.fromPretrained(
    EmotionConfig config, {
    LinearSvmModel? model,
    void Function(String level, String message, {Map<String, Object?>? context})? onLog,
  });

  // Push new biometric data
  void push({
    required double hr,
    required List<double> rrIntervalsMs,
    required DateTime timestamp,
    Map<String, double>? motion,
  });

  // Get ready emotion results (synchronous)
  List<EmotionResult> consumeReady();

  // Get buffer statistics
  Map<String, dynamic> getBufferStats();

  // Clear all buffered data
  void clear();
}
```
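Beyond `push` and `consumeReady`, the two housekeeping members above are useful between sessions. A short sketch:

```dart
// Inspect buffered data (the exact keys are implementation-defined).
final stats = engine.getBufferStats();
print('Buffer stats: $stats');

// Reset the rolling window, e.g. when a new session starts or the
// device was off-wrist long enough that buffered RR data is stale.
engine.clear();
```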
Configuration for the emotion engine:

```dart
class EmotionConfig {
  final String modelId;              // Model identifier
  final Duration window;             // Rolling window size (default: 60s)
  final Duration step;               // Emission cadence (default: 5s)
  final int minRrCount;              // Min RR intervals needed (default: 30)
  final bool returnAllProbas;        // Return all probabilities (default: true)
  final double? hrBaseline;          // Optional HR personalization
  final Map<String, double>? priors; // Optional label priors
}
```
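A sketch of a personalized configuration using the optional fields above. It assumes each field is exposed as a named constructor parameter, and the baseline/prior values are illustrative only:

```dart
final config = EmotionConfig(
  window: const Duration(seconds: 60),
  step: const Duration(seconds: 5),
  minRrCount: 30,
  hrBaseline: 62.0, // Assumed: this user's resting HR, if known.
  priors: {
    // Assumed label keys; the values here are illustrative.
    'Calm': 0.5,
    'Amused': 0.25,
    'Stressed': 0.25,
  },
);
final engine = EmotionEngine.fromPretrained(config);
```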
Result of emotion inference:

```dart
class EmotionResult {
  final DateTime timestamp;                // When inference was performed
  final String emotion;                    // Predicted emotion (top-1)
  final double confidence;                 // Confidence score (0.0-1.0)
  final Map<String, double> probabilities; // All label probabilities
  final Map<String, double> features;      // Extracted features
  final Map<String, dynamic> model;        // Model metadata
}
```
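A sketch of consuming a result, touching only the fields declared above:

```dart
void handleResult(EmotionResult result) {
  // Top-1 label and its confidence.
  print('${result.emotion} @ ${(result.confidence * 100).toStringAsFixed(1)}%');

  // Full distribution over all labels (requires returnAllProbas: true).
  result.probabilities.forEach((label, p) {
    print('  $label: ${p.toStringAsFixed(3)}');
  });

  // Extracted features and model metadata, useful for debugging.
  print('Features: ${result.features}');
  print('Model: ${result.model}');
}
```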
- On-Device Processing: All emotion inference happens locally
- No Data Retention: Raw biometric data is not retained after processing
- No Network Calls: No data is sent to external servers
- Privacy-First Design: No built-in storage - you control what gets persisted
- Real Trained Models: Uses WESAD-trained models with 78% accuracy
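Since there is no built-in storage, persisting anything is your explicit choice. A sketch that serializes only the derived result, so raw HR/RR never touches disk (no toJson method is assumed on EmotionResult; the map is built by hand):

```dart
import 'dart:convert';

/// Encodes only derived outputs; raw HR/RR samples are never stored.
String encodeResult(EmotionResult result) => jsonEncode({
      'timestamp': result.timestamp.toIso8601String(),
      'emotion': result.emotion,
      'confidence': result.confidence,
      'probabilities': result.probabilities,
    });
```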
Check out the complete examples in the synheart-emotion repository:
```bash
# Clone the main repository for examples
git clone https://github.com/synheart-ai/synheart-emotion.git
cd synheart-emotion/examples
flutter pub get
flutter run
```

The example demonstrates:
- Real-time emotion detection
- Probability visualization
- Buffer management
- Logging system (see the onLog sketch below)
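The logging shown in the example app hangs off the `onLog` hook in `EmotionEngine.fromPretrained` (see the API above). A minimal sketch that forwards engine logs to `print`:

```dart
final engine = EmotionEngine.fromPretrained(
  const EmotionConfig(window: Duration(seconds: 60)),
  onLog: (String level, String message, {Map<String, Object?>? context}) {
    // Replace print with your logger of choice.
    print('[$level] $message${context == null ? '' : ' $context'}');
  },
);
```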
Run the test suite:

```bash
flutter test
```

Run benchmarks:

```bash
flutter test test/benchmarks_test.dart
```

Tests cover:
- Feature extraction accuracy
- Model inference performance
- Edge case handling
- Memory usage patterns
Target Performance (mid-range phone):
- Latency: < 5ms per inference
- Model Size: < 100 KB
- CPU Usage: < 2% during active streaming
- Memory: < 3 MB (engine + buffers)
- Accuracy: 78% on WESAD dataset (3-class emotion recognition)
Benchmarks:
- HR mean calculation: < 1ms
- SDNN/RMSSD calculation: < 2ms
- Model inference: < 1ms
- Full pipeline: < 5ms
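To sanity-check these targets on your own hardware, here is a throwaway micro-benchmark using dart:core's Stopwatch with synthetic RR values; the repository's suite in test/benchmarks_test.dart remains the authoritative measurement:

```dart
void benchmarkPipeline(EmotionEngine engine) {
  const iterations = 100;
  // Synthetic RR intervals around 800 ms, enough to satisfy minRrCount.
  final rr = List<double>.generate(40, (i) => 780.0 + (i % 50));
  final sw = Stopwatch()..start();
  for (var i = 0; i < iterations; i++) {
    engine.push(
      hr: 70.0 + (i % 8),
      rrIntervalsMs: rr,
      timestamp: DateTime.now().toUtc(),
    );
    engine.consumeReady();
  }
  sw.stop();
  print('Avg push+consume: ${(sw.elapsedMicroseconds / iterations).toStringAsFixed(1)} µs');
}
```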
Data flows through the engine as follows:

```
Biometric Data (HR, RR)
        │
        ▼
┌─────────────────────┐
│    EmotionEngine    │
│  [RingBuffer]       │
│  [FeatureExtractor] │
│  [Model Inference]  │
└─────────────────────┘
        │
        ▼
  EmotionResult
        │
        ▼
      Your App
```
Perfect integration with the Synheart Wear SDK for real wearable data:
```dart
// Stream from Apple Watch, Fitbit, etc.
final wearStream = synheartWear.streamHR();
final emotionStream = EmotionStream.emotionStream(engine, wearStream);
```

Feed emotion results into the SWIP impact measurement system:

```dart
for (final emotion in emotionResults) {
  swipCore.ingestEmotion(emotion);
}
```

This project is licensed under the MIT License - see the LICENSE file for details.
We welcome contributions! See our Contributing Guidelines for details.
- Main Repository: synheart-emotion (Source of Truth)
- Documentation: RFC E1.1
- Model Card: Model Card
- Examples: Examples
- Models: Pre-trained Models
- Tools: Development Tools
- Synheart Wear: synheart-wear
- Synheart AI: synheart.ai
- Issues: GitHub Issues
- Synheart AI Team - Initial work, RFC Design & Architecture
Made with ❤️ by the Synheart AI Team
Technology with a heartbeat.