- Project Overview
- Data Preprocessing
- Modeling
- Evaluation
Recurrent Neural Networks can remember previous inputs because they feed their hidden state back into themselves through a loop, which makes them well suited to large amounts of sequential data.
These loops make recurrent neural networks seem somewhat mysterious. However, if you think about it a bit more, they are not all that different from a normal neural network: a recurrent neural network can be thought of as multiple copies of the same network, each passing a message to a successor.
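To make the "copies passing a message" picture concrete, here is a minimal NumPy sketch of a vanilla RNN cell unrolled over a sequence. The weight names (`W_xh`, `W_hh`, `b_h`) and sizes are illustrative assumptions, not taken from this project's code.

```python
import numpy as np

rng = np.random.default_rng(0)
input_size, hidden_size, seq_len = 8, 16, 5

W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden
b_h = np.zeros(hidden_size)

def rnn_forward(inputs, h0):
    """Apply the same cell at every time step, passing the hidden
    state (the 'message') on to the next copy of the network."""
    h = h0
    states = []
    for x_t in inputs:  # one time step per input
        h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)
        states.append(h)
    return states

sequence = [rng.normal(size=input_size) for _ in range(seq_len)]
hidden_states = rnn_forward(sequence, h0=np.zeros(hidden_size))
print(len(hidden_states), hidden_states[-1].shape)  # 5 (16,)
```

Note that the same weights are reused at every step; only the hidden state changes as it is handed from one "copy" of the network to the next.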
Different types of Recurrent Neural Networks:
- Fixed-size input and output (e.g. image classification, the non-sequence baseline).
- Sequence output (e.g. image captioning takes an image and outputs a sentence of words).
- Sequence input (e.g. sentiment analysis where a given sentence is classified as expressing a positive or negative sentiment; see the sketch after this list).
- Sequence input and sequence output (e.g. Machine Translation: an RNN reads a sentence in English and then outputs a sentence in French).
- Synced sequence input and output (e.g. video classification where we wish to label each frame of the video)
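As a concrete example of the "sequence input" (many-to-one) case above, here is a hedged sketch of a sentiment classifier. It assumes TensorFlow/Keras is available; the vocabulary size, embedding size, and unit counts are placeholders rather than values from this project.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Many-to-one: read a whole token sequence, emit a single sentiment score.
model = keras.Sequential([
    layers.Embedding(input_dim=10_000, output_dim=32),  # token ids -> vectors
    layers.SimpleRNN(64),                               # consumes the full sequence
    layers.Dense(1, activation="sigmoid"),              # positive / negative
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Dummy batch: 4 sequences of 20 token ids each (illustrative data only).
x = np.random.randint(0, 10_000, size=(4, 20))
y = np.random.randint(0, 2, size=(4, 1))
model.fit(x, y, epochs=1, verbose=0)
print(model.predict(x, verbose=0).shape)  # (4, 1)
```

The other cases differ mainly in which time steps produce outputs: returning the RNN's output at every step (e.g. `return_sequences=True` in Keras) gives the synced sequence-to-sequence setup, while an encoder-decoder pair gives the translation-style setup.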
This project is licensed under the MIT License - see the LICENSE.md file for details
By: Shahab Rahnama