
Notebook Projects

Deep learning notebook projects to understand neural network concepts, implement neural networks, and improve the performance of our models.

  1. Initialization

    In this notebook project we will:

    • Understand different initialization methods and their impact on our model's performance,

    • Implement zero initialization and see that it fails to "break symmetry",

    • Recognize that random initialization "breaks symmetry" and yields more efficient models,

    • Understand that we can use both random initialization and scaling to get even better training performance (see the sketch after this list).
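
    A minimal NumPy sketch of the three initialization schemes the notebook compares; the function name `initialize_parameters`, the layer sizes, and the exact scaling constants are illustrative assumptions, not code from the repository.

```python
import numpy as np

def initialize_parameters(layer_dims, method="he", seed=3):
    """Illustrative initializers for a fully connected network (hypothetical helper)."""
    np.random.seed(seed)
    params = {}
    for l in range(1, len(layer_dims)):
        n_prev, n_curr = layer_dims[l - 1], layer_dims[l]
        if method == "zeros":
            # All units compute the same output, so symmetry is never broken.
            W = np.zeros((n_curr, n_prev))
        elif method == "random":
            # Breaks symmetry, but large unscaled weights can slow or destabilize training.
            W = np.random.randn(n_curr, n_prev) * 10
        else:
            # "he": random values scaled by sqrt(2 / n_prev), well suited to ReLU layers.
            W = np.random.randn(n_curr, n_prev) * np.sqrt(2.0 / n_prev)
        params["W" + str(l)] = W
        params["b" + str(l)] = np.zeros((n_curr, 1))  # biases may safely start at zero
    return params

params = initialize_parameters([2, 10, 5, 1], method="he")
print(params["W1"].shape)  # (10, 2)
```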

  2. Regularization

    In this notebook project we will:

    • Understand different regularization methods that could help our model,

    • Implement dropout and see it work on data.

    • Recognize that a model without regularization gives better accuracy on the training set but not necessarily on the test set,

    • Understand that we can use both dropout and L2 regularization on our model (see the sketch after this list).
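
    A minimal sketch of an inverted-dropout forward step in NumPy; the function name, shapes, and keep_prob value are illustrative assumptions rather than code from the repository.

```python
import numpy as np

def forward_with_dropout(A_prev, W, b, keep_prob=0.8, seed=1):
    """One forward step through a ReLU layer with inverted dropout (illustrative)."""
    np.random.seed(seed)
    Z = W @ A_prev + b
    A = np.maximum(0, Z)                          # ReLU activation
    D = np.random.rand(*A.shape) < keep_prob      # random mask: keep each unit with probability keep_prob
    A = A * D / keep_prob                         # shut down dropped units and rescale (inverted dropout)
    return A, D                                   # the mask D is reused in backprop

A_prev = np.random.randn(3, 5)   # 3 features, 5 examples
W, b = np.random.randn(4, 3), np.zeros((4, 1))
A, D = forward_with_dropout(A_prev, W, b)
```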

  3. Optimization

    In this notebook project we will:

    • Understand the intuition behind Adam and RMSProp

    • Recognize the importance of mini-batch gradient descent

    • Learn the effects of momentum on the overall performance of our model (see the sketch after this list)
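
    A minimal sketch of the momentum and Adam update rules in NumPy; the function names, the dictionary layout (params["W1"], grads["dW1"], ...), and the hyperparameter values are illustrative assumptions.

```python
import numpy as np

def update_with_momentum(params, grads, v, beta=0.9, lr=0.01):
    """One momentum step: move along an exponentially weighted average of past gradients."""
    for key in params:
        v[key] = beta * v[key] + (1 - beta) * grads["d" + key]
        params[key] -= lr * v[key]
    return params, v

def update_with_adam(params, grads, v, s, t, beta1=0.9, beta2=0.999, lr=0.01, eps=1e-8):
    """One Adam step: momentum-style first moment plus RMSProp-style second moment."""
    for key in params:
        v[key] = beta1 * v[key] + (1 - beta1) * grads["d" + key]       # first moment (momentum)
        s[key] = beta2 * s[key] + (1 - beta2) * grads["d" + key] ** 2  # second moment (RMSProp)
        v_hat = v[key] / (1 - beta1 ** t)                              # bias correction
        s_hat = s[key] / (1 - beta2 ** t)
        params[key] -= lr * v_hat / (np.sqrt(s_hat) + eps)
    return params, v, s

# Toy usage on a single mini-batch step (t counts the update number for bias correction).
params = {"W1": np.random.randn(2, 2), "b1": np.zeros((2, 1))}
grads = {"dW1": np.random.randn(2, 2), "db1": np.random.randn(2, 1)}
v = {k: np.zeros_like(p) for k, p in params.items()}
s = {k: np.zeros_like(p) for k, p in params.items()}
params, v, s = update_with_adam(params, grads, v, s, t=1)
```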

  4. Gradient Checking

    In this notebook project we will:

    • Implement gradient checking from scratch.

    • Understand how to use the difference formula to check our backpropagation implementation.

    • Recognize that our backpropagation algorithm should give results similar to the ones obtained with the difference formula.

    • Learn how to identify which parameter's gradient was computed incorrectly (see the sketch after this list).
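
    A minimal sketch of gradient checking with the two-sided difference formula; `f`, `grad_f`, and `theta` are hypothetical stand-ins for the cost function, its backprop gradient, and the flattened parameter vector.

```python
import numpy as np

def gradient_check(f, grad_f, theta, eps=1e-7):
    """Compare an analytic gradient with a numerical one, component by component."""
    grad_approx = np.zeros_like(theta)
    for i in range(theta.size):
        theta_plus, theta_minus = theta.copy(), theta.copy()
        theta_plus[i] += eps
        theta_minus[i] -= eps
        grad_approx[i] = (f(theta_plus) - f(theta_minus)) / (2 * eps)   # centered difference
    grad = grad_f(theta)
    # Relative difference; values around 1e-7 or smaller suggest backprop is correct.
    diff = np.linalg.norm(grad - grad_approx) / (np.linalg.norm(grad) + np.linalg.norm(grad_approx))
    return diff, np.abs(grad - grad_approx)  # large per-component gaps point at the wrongly computed gradient

# Toy check: J(theta) = sum(theta^2) has gradient 2 * theta.
theta = np.random.randn(5)
diff, per_param = gradient_check(lambda t: np.sum(t ** 2), lambda t: 2 * t, theta)
print(diff)
```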

  5. TensorFlow

    In this notebook project we will:

    • Learn all the basics of TensorFlow,

    • Implement useful functions and draw the parallel with what we did using NumPy,

    • Understand what Tensors and operations are, as well as how to execute them in a computation graph (see the sketch after this list).
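
    A minimal sketch of tensors, operations, and graph execution, written against TensorFlow 2.x with tf.function (the notebook itself may use the older 1.x Session API); the variable names and shapes are illustrative.

```python
import numpy as np
import tensorflow as tf

# Tensors, analogous to the NumPy arrays used in the earlier notebooks.
W = tf.Variable(np.random.randn(4, 3), name="W")
b = tf.Variable(np.zeros((4, 1)), name="b")
X = tf.constant(np.random.randn(3, 1), name="X")

@tf.function  # traces the Python function into a TensorFlow computation graph
def linear(X):
    return tf.matmul(W, X) + b   # Z = WX + b, the counterpart of np.dot(W, X) + b

print(linear(X).numpy())   # executing the op returns a concrete value
```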