(pronounced AN-i-moh-eye; derived from Animation-Emotion-Eye)
The project is an innovative approach to integrating emotional intelligence into robotics. It features a dynamic robot eye animation, displayed on a 0.96" OLED screen, that changes based on the sentiment of the user's input. The project is powered by the CAP10 Pratham, a made-in-India microcontroller board.
- Real-Time Sentiment Analysis: The system converts the user's speech to text and analyzes the sentiment of the transcript in real time.
- Emotional Intelligence: Enhances user interaction by allowing the robot to visually express emotions.
- Compact Design: Utilizes a 0.96" OLED screen for a minimalistic yet effective display.
- Cost-Effective: Built using the CAP10 Pratham board, making it an affordable solution for integrating emotional responses into robotics.
- System/Edge Computing Device: Any system capable of running the sentiment analysis program.
- CAP10 Pratham: A programmable microcontroller board that controls the OLED screen and handles requests from the master system.
- 0.96" OLED Display: A compact screen to display the animated robot eye.
- Power Supply: A stable power source for consistent operation.
- Python: For sentiment analysis using LSTM (Long Short-Term Memory) models.
- Arduino IDE: To program the CAP10 Pratham board and control the OLED display.
- Please check out requirements.txt for the Python libraries.
- ESP32 board manager for the Arduino IDE, for programming the CAP10 Pratham.
- Adafruit_SSD1306 & Adafruit_GFX by Adafruit, for handling the OLED display and graphics functions.
- Train Model (Optional): Users can train the model using the dataset available in the data directory, or with their own data in the same format. Training produces the model used for making predictions (see the training sketch after this list).
- Sentiment Analysis: The user provides speech input, which is transcribed into text by the WhisperX model; the sentiment of the text is then analyzed using the LSTM model trained in the previous step (see the prediction sketch below).
- Animation Update: The CAP10 Pratham board receives the sentiment result via an API call and updates the OLED display with the corresponding eye animation (e.g., happy, sad, surprised).
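For concreteness, here is a minimal training sketch, assuming a TensorFlow/Keras setup; the dataset file name, column names, label encoding, and hyperparameters are illustrative assumptions, not the repository's actual configuration:

```python
import pickle

import pandas as pd
from tensorflow.keras.layers import LSTM, Dense, Embedding
from tensorflow.keras.models import Sequential
from tensorflow.keras.preprocessing.sequence import pad_sequences
from tensorflow.keras.preprocessing.text import Tokenizer

# Hypothetical file and column names; follow the format used in the data directory.
df = pd.read_csv("data/sentiment.csv")  # columns: "text", "label"

MAX_WORDS, MAX_LEN = 10_000, 50
tokenizer = Tokenizer(num_words=MAX_WORDS, oov_token="<OOV>")
tokenizer.fit_on_texts(df["text"])
X = pad_sequences(tokenizer.texts_to_sequences(df["text"]), maxlen=MAX_LEN)
y = df["label"]  # e.g. integer classes: 0 = sad, 1 = happy, 2 = surprised

model = Sequential([
    Embedding(MAX_WORDS, 64),
    LSTM(64),
    Dense(3, activation="softmax"),  # one output per emotion class
])
model.compile(loss="sparse_categorical_crossentropy",
              optimizer="adam", metrics=["accuracy"])
model.fit(X, y, epochs=10, validation_split=0.2)

# Persist both artifacts so the prediction notebook can reuse them.
model.save("sentiment_lstm.h5")
with open("tokenizer.pkl", "wb") as f:
    pickle.dump(tokenizer, f)
```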
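And a sketch of the prediction side under the same assumptions; the WhisperX calls follow that library's documented usage, while the model size, artifact file names, and audio file are placeholders:

```python
import pickle

import numpy as np
import whisperx
from tensorflow.keras.models import load_model
from tensorflow.keras.preprocessing.sequence import pad_sequences

# Transcribe speech to text with WhisperX.
asr = whisperx.load_model("base", device="cpu", compute_type="int8")
audio = whisperx.load_audio("speech.wav")  # placeholder recording
text = " ".join(seg["text"] for seg in asr.transcribe(audio)["segments"])

# Classify the transcript with the LSTM trained above.
lstm = load_model("sentiment_lstm.h5")
with open("tokenizer.pkl", "rb") as f:
    tokenizer = pickle.load(f)
seq = pad_sequences(tokenizer.texts_to_sequences([text]), maxlen=50)
label = int(np.argmax(lstm.predict(seq), axis=1)[0])
```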
- Software Setup
- Ensure Git LFS is installed if it hasn't been set up previously:
git lfs install
- Clone this repository:
git clone https://github.com/AsutoshPati/An-Emo-Eye.git
- Install the required Python packages:
pip install -r requirements.txt
- Upload the Arduino code available in the Display_Emotion directory to the CAP10 Pratham board using the Arduino IDE, selecting ESP32 Dev Module as the board.
- Hardware Setup
- Connect the 0.96" OLED display to the CAP10 Pratham board. as mentioned in the circuit diagram
- Ensure all components are securely connected and powered.
- Connect the 0.96" OLED display to the CAP10 Pratham board. as mentioned in the circuit diagram
- Usage
- (Optional) To train the model, run the training notebook Sentiment Analysis.ipynb in Jupyter Notebook.
- If the Arduino code is successfully uploaded to the CAP10 Pratham and the device is properly powered, a Wi-Fi Access Point with the SSID CAP10 Eye will become visible. To proceed, connect to this SSID from the master system (where the sentiment analysis script will be executed) using the password 12345678.
- Run the sentiment analysis notebook Prediction and Integration.ipynb in Jupyter Notebook.
- Provide text input to the system and watch the OLED display as the robot eye changes based on the sentiment of the input.
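For reference, the hand-off from the master system to the board might look like the sketch below. The 192.168.4.1 address is the usual default for an ESP32 soft AP, but the endpoint path and query parameter are assumptions; check the Display_Emotion code for the actual route.

```python
import requests

EMOTIONS = {0: "sad", 1: "happy", 2: "surprised"}  # illustrative label map

def send_emotion(label: int) -> None:
    """Push the predicted emotion to the CAP10 Pratham over its Wi-Fi AP."""
    resp = requests.get(
        "http://192.168.4.1/emotion",       # hypothetical endpoint path
        params={"state": EMOTIONS[label]},  # hypothetical parameter name
        timeout=5,
    )
    resp.raise_for_status()

send_emotion(1)  # the eye animation should switch to "happy"
```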
- Interactive Robotics: Enhance user interaction by allowing robots to express emotions visually.
- Educational Projects: Ideal for teaching concepts related to emotional intelligence in robotics.
- Prototype Development: Can be used in developing more complex emotionally aware robotic systems.
Special thanks to my friend Santosh for significant contributions to this project.
This project is licensed under the MIT License - see the LICENSE file for details.
Don't forget to star the repository and connect with me if you find it helpful. Feel free to fork this repository and take your innovations to the next level.