Facial Emotion Recognition on the fer2013 dataset using TensorFlow! (Accuracy ~65%)
-
- Download Data Set: `fer2013.bin` (63M) and `test_batch.bin` (7.9M) from https://goo.gl/ffmy2h
  - Image Properties: size of an image - 48 x 48 pixels (2304 bytes); size of a label - a number in (0..6) (1 byte) (0=Angry, 1=Fear, 2=Happy, 3=Sad, 4=Disgust, 5=Surprise, 6=Neutral).
  - Data Set Format: the 1st byte is the label number and the next 2304 bytes are the image pixels.
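
For reference, a single record can be decoded with plain Python/NumPy. This is a minimal sketch based only on the format above (1 label byte followed by 2304 pixel bytes); the path used is the training-data path from the steps below.

```python
import numpy as np

RECORD_BYTES = 1 + 48 * 48  # 1 label byte + 2304 pixel bytes per image

def read_record(path, index=0):
    """Return (label, image) for the record at position `index` in a .bin file."""
    with open(path, "rb") as f:
        f.seek(index * RECORD_BYTES)
        record = np.frombuffer(f.read(RECORD_BYTES), dtype=np.uint8)
    label = int(record[0])              # emotion label in 0..6
    image = record[1:].reshape(48, 48)  # 48 x 48 grayscale pixels
    return label, image

# label, image = read_record("/tmp/fer2013_data/fer2013-batches-bin/fer2013.bin")
```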
- Create a data directory in your system: `/tmp/fer2013_data/`
- Put the training data set (28,709 images) in: `/tmp/fer2013_data/fer2013-batches-bin/fer2013.bin`
- Put the testing data set (3,589 images) in: `/tmp/fer2013_data/fer2013-batches-bin/test_batch.bin`
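
As a quick sanity check of the copied files: with 2305 bytes per record, 28,709 training images come to about 63 MB and 3,589 test images to about 7.9 MB, matching the download sizes above. A small sketch, assuming each file holds exactly those records and nothing else:

```python
import os

RECORD_BYTES = 2305  # 1 label byte + 2304 pixel bytes

for path, num_images in [
    ("/tmp/fer2013_data/fer2013-batches-bin/fer2013.bin", 28709),
    ("/tmp/fer2013_data/fer2013-batches-bin/test_batch.bin", 3589),
]:
    expected = num_images * RECORD_BYTES
    actual = os.path.getsize(path)
    print("%s: %d bytes (expected %d)" % (path, actual, expected))
```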
- Install TensorFlow: https://www.tensorflow.org/versions/r0.7/get_started/os_setup.html#pip-installation
- Run `python fer2013_train.py`
- Run `python fer2013_eval.py` on the `fer2013.bin` data (Training Precision)
- Run `python fer2013_eval.py` on the `test_batch.bin` data (Evaluation Precision)
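
The "precision" reported by these eval runs is presumably precision @ 1, i.e. the fraction of images whose predicted label matches the true label (the convention in the TensorFlow tutorial-style eval scripts this project appears to follow). Purely illustrative, not the repo's code:

```python
import numpy as np

def precision_at_1(logits, labels):
    """Fraction of examples whose arg-max prediction equals the true label.

    logits: (num_examples, 7) array of per-class scores.
    labels: (num_examples,) array of true labels in 0..6.
    """
    predictions = np.argmax(logits, axis=1)
    return float(np.mean(predictions == labels))
```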
- From https://goo.gl/ffmy2h, download the checkpoint files `checkpoint`, `model.ckpt-6000`, and `model.ckpt-6000.meta` located in the `65acc-checkpoint` dir.
- Copy these files into `/tmp/fer2013_train/`
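
For reference, this is roughly how TensorFlow restores a model from such a checkpoint directory with `tf.train.Saver` (a sketch only; building the model graph is omitted, and the eval/demo scripts presumably do the equivalent internally):

```python
import tensorflow as tf

# Build the same inference graph as in training first: the model variables
# must exist before a Saver can restore saved values into them.
saver = tf.train.Saver()
with tf.Session() as sess:
    ckpt = tf.train.get_checkpoint_state("/tmp/fer2013_train")
    if ckpt and ckpt.model_checkpoint_path:
        saver.restore(sess, ckpt.model_checkpoint_path)  # e.g. model.ckpt-6000
    # ... run inference with the restored weights ...
```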
- Inside the demo folder of the emotion-recognition project, run `./demo.sh IMG#`. Provide an `IMG#`, which is the row number in `private-test-150.csv`, where each row corresponds to an image. There are 150 such rows/images.
- Executing `./demo.sh` outputs the label predicted by the trained model. This can be cross-checked against the first value in that row of the csv file.
- To actually view the image and visually cross-check the emotion, run the `uint8-to-image.py` script on `private-test-150.csv`. This generates 150 .png image files with the appropriate `IMG#` in the image file name (a conversion sketch follows below).
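
The conversion itself is easy to reproduce. A hedged sketch, assuming each csv row has no header, starts with the label, and carries the 2304 pixel values either as separate comma-separated fields or (as in the original Kaggle csv) as one space-separated field; the output file naming here is hypothetical, not the scheme `uint8-to-image.py` actually uses:

```python
import csv
import numpy as np
from PIL import Image

def csv_rows_to_pngs(csv_path="private-test-150.csv"):
    with open(csv_path) as f:
        for img_no, row in enumerate(csv.reader(f), start=1):
            label = int(row[0])          # first value in the row: true label (0..6)
            values = row[1:]
            if len(values) == 1:         # pixels packed into one space-separated field
                values = values[0].split()
            pixels = np.array([int(v) for v in values], dtype=np.uint8).reshape(48, 48)
            Image.fromarray(pixels).save("img%d-label%d.png" % (img_no, label))

csv_rows_to_pngs()
```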
- Run `tensorboard --logdir "/tmp"`
- Go to http://0.0.0.0:6006/
- This displays events, images, graphs and histograms for the train and eval runs on the model.
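
Those tabs are populated by summary ops written during the train/eval runs. In the TF 0.x-era API matching the r0.7 install link above, that looks roughly like the sketch below (illustrative only, not the project's actual summary code; the tensor names are made up):

```python
import tensorflow as tf

# Placeholders standing in for the real training tensors.
images = tf.placeholder(tf.float32, [None, 48, 48, 1], name="images")
loss = tf.placeholder(tf.float32, name="loss")

tf.scalar_summary("loss", loss)                   # scalar events
tf.image_summary("inputs", images)                # "images" tab
tf.histogram_summary("inputs_histogram", images)  # "histograms" tab
summary_op = tf.merge_all_summaries()

sess = tf.Session()
writer = tf.train.SummaryWriter("/tmp/fer2013_train", sess.graph_def)
# In the training loop: writer.add_summary(sess.run(summary_op, feed_dict=...), step)
```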
- Code references and examples from https://www.tensorflow.org
- Data Set from https://www.kaggle.com/c/challenges-in-representation-learning-facial-expression-recognition-challenge/data