A tiny end-to-end demo: a webcam app that infers facial emotion in real time using a CNN exported to ONNX.
- `app/` - Windows desktop demo that loads the ONNX model and shows live emotion traces and an assistant hint.
- `cnn-model/` - Jupyter notebook, trained Keras model, ONNX export, and train/validate/test images.
- `screen-shots/` - all images referenced by this README (UI, training curves, confusion matrix, etc.).
The app mocks a two-sided video call. The "Sales Guy" panel shows a static photo and an assistant message ("Steady as she goes", "Tell a joke", etc.).
The "Client" panel uses your webcam to infer Happy, Sad, Neutral, or Surprised and plots the probabilities over time.
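The "Client" panel's trace boils down to keeping a fixed-length rolling window of per-class probabilities, one vector per frame. A minimal sketch (the window size, frame rate, and softmax helper are assumptions, not taken from the app's source):

```python
# Rolling probability trace for the "Client" panel, sketched with a deque.
# Class names come from the README; WINDOW and the 30 fps figure are assumed.
from collections import deque

import numpy as np

CLASSES = ["Happy", "Sad", "Neutral", "Surprised"]
WINDOW = 120  # ~4 s of history at an assumed 30 fps

def softmax(logits: np.ndarray) -> np.ndarray:
    """Turn raw model outputs into probabilities."""
    z = logits - logits.max()
    e = np.exp(z)
    return e / e.sum()

trace = deque(maxlen=WINDOW)  # each entry: one probability vector per frame

for _ in range(300):  # stand-in for the webcam loop
    logits = np.random.randn(len(CLASSES))  # stand-in for model output
    trace.append(softmax(logits))

history = np.array(trace)  # shape (WINDOW, 4), ready to plot as four curves
print(history.shape)
```

The `deque(maxlen=...)` drops the oldest frame automatically, so the plot window never grows unbounded.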
ONNX runtime: the app loads the exported `.onnx` model for fast inference.
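In Python terms, loading and running the export looks roughly like the sketch below. The model path, input name, and the 48x48 grayscale NHWC input shape are assumptions; verify them against the real export with `session.get_inputs()` (or a viewer such as Netron):

```python
# Hedged sketch of running the exported model with ONNX Runtime.
# The 48x48x1 input layout is an assumption about this particular export.
import numpy as np

def preprocess(frame: np.ndarray) -> np.ndarray:
    """Normalize a grayscale face crop to an assumed NHWC float32 batch."""
    x = frame.astype(np.float32) / 255.0
    return x.reshape(1, 48, 48, 1)

def infer(model_path: str, frame: np.ndarray) -> np.ndarray:
    import onnxruntime as ort  # imported lazily so the sketch runs without it
    session = ort.InferenceSession(model_path)
    input_name = session.get_inputs()[0].name
    (probs,) = session.run(None, {input_name: preprocess(frame)})
    return probs[0]

batch = preprocess(np.zeros((48, 48), dtype=np.uint8))
print(batch.shape, batch.dtype)
```

The session is created once and reused per frame in a real loop; recreating it per call would dominate the inference cost.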
*Emotions captured in real time*
- Notebook: `Model6.ipynb`
- Saved models: `final_best_model.keras` (Keras) and `final_best_model.onnx` (used by the app)
- Data: `images/{train,valid,test}/<class>/...`
- Architecture: compact CNN tuned for 4 classes (Happy, Sad, Neutral, Surprised).
- Pipeline: image loading, augmentation, class balancing, training/validation split.
- Export: best Keras checkpoint -> ONNX for runtime use in `app/`.
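The `images/{train,valid,test}/<class>/` layout can be enumerated with nothing but `pathlib`; a tiny stand-in tree is built here so the sketch runs without the real data (file names and the `.png` extension are illustrative):

```python
# Discover (image, label) pairs from a images/{split}/{class}/ directory layout.
import tempfile
from pathlib import Path

def list_samples(split_dir: Path):
    """Yield (image_path, class_name) pairs from one split directory."""
    for class_dir in sorted(p for p in split_dir.iterdir() if p.is_dir()):
        for img in sorted(class_dir.glob("*.png")):
            yield img, class_dir.name

# Build a tiny stand-in tree so the sketch is runnable without the real data.
root = Path(tempfile.mkdtemp())
for split in ("train", "valid", "test"):
    for cls in ("Happy", "Sad", "Neutral", "Surprised"):
        d = root / "images" / split / cls
        d.mkdir(parents=True)
        (d / "face0.png").touch()

samples = list(list_samples(root / "images" / "train"))
classes = sorted({c for _, c in samples})
print(classes)
```

Deriving labels from directory names is the same convention Keras's `image_dataset_from_directory` follows, so the on-disk layout doubles as the label map.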
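One common way to implement the class-balancing step in the pipeline is inverse-frequency class weights, which most trainers (including Keras's `fit(class_weight=...)`) accept. The counts below are made up for illustration; this is a sketch of the technique, not the notebook's exact code:

```python
# Inverse-frequency class weights: rare classes get proportionally larger weights.

def balanced_class_weights(counts: dict) -> dict:
    """weight_c = total / (n_classes * count_c)."""
    total = sum(counts.values())
    n = len(counts)
    return {c: total / (n * k) for c, k in counts.items()}

counts = {"Happy": 400, "Sad": 200, "Neutral": 400, "Surprised": 100}
weights = balanced_class_weights(counts)
print(weights)
```

With these counts the rarest class (Surprised) ends up weighted 4x heavier than the most common ones, which is exactly the corrective pressure balancing is meant to apply.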
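For the export step, `tf2onnx` is one common tool for converting Keras checkpoints to ONNX. Whether it accepts a `.keras` archive directly depends on your Keras and tf2onnx versions (older tf2onnx expects HDF5 or a SavedModel), so treat the flags below as assumptions to check against your environment; the sketch only assembles the CLI invocation:

```python
# Assemble a hypothetical tf2onnx CLI call for the Keras -> ONNX export.
# Paths match the repo layout; the --keras flag and opset are assumptions.
import sys

def export_cmd(keras_path: str, onnx_path: str, opset: int = 13) -> list:
    return [
        sys.executable, "-m", "tf2onnx.convert",
        "--keras", keras_path,
        "--output", onnx_path,
        "--opset", str(opset),
    ]

cmd = export_cmd("cnn-model/final_best_model.keras",
                 "cnn-model/final_best_model.onnx")
print(" ".join(cmd[1:]))
```

Pinning the opset explicitly keeps the export reproducible even when the converter's default changes between releases.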