---
layout: default
---
## Lecture 1 -- Deep Learning Challenge. Is There Theory?

### Readings

  1. Deep Deep Trouble
  2. Why 2016 is The Global Tipping Point...
  3. Are AI and ML Killing Analytics...
  4. The Dark Secret at The Heart of AI
  5. AI Robots Learning Racism...
  6. FaceApp Forced to Pull 'Racist' Filters...
  7. Losing a Whole Generation of Young Men to Video Games

## Lecture 2 -- Overview of Deep Learning From a Practical Point of View

### Readings

  1. Emergence of simple cell
  2. ImageNet Classification with Deep Convolutional Neural Networks (AlexNet)
  3. Very Deep Convolutional Networks for Large-Scale Image Recognition (VGG)
  4. Going Deeper with Convolutions (GoogLeNet)
  5. Deep Residual Learning for Image Recognition (ResNet)
  6. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
  7. Visualizing and Understanding Convolutional Networks

### Blogs

  1. An Intuitive Guide to Deep Network Architectures
  2. Neural Network Architectures

### Videos

  1. Deep Visualization Toolbox

## Lecture 3

### Readings

  1. A Mathematical Theory of Deep Convolutional Neural Networks for Feature Extraction
  2. Energy Propagation in Deep Convolutional Neural Networks
  3. Discrete Deep Feature Extraction: A Theory and New Architectures
  4. Topology Reduction in Deep Convolutional Feature Extraction Networks

## Lecture 4

### Readings

  1. A Probabilistic Framework for Deep Learning
  2. Semi-Supervised Learning with the Deep Rendering Mixture Model
  3. A Probabilistic Theory of Deep Learning

## Lecture 5

### Readings

  1. Why and When Can Deep-but Not Shallow-networks Avoid the Curse of Dimensionality: A Review
  2. Learning Functions: When is Deep Better Than Shallow

## Lecture 6

### Readings

  1. Convolutional Patch Representations for Image Retrieval: an Unsupervised Approach
  2. Convolutional Kernel Networks
  3. Kernel Descriptors for Visual Recognition
  4. End-to-End Kernel Learning with Supervised Convolutional Kernel Networks
  5. Learning with Kernels
  6. Kernel Based Methods for Hypothesis Testing

## Lecture 7

### Readings

  1. Geometry of Neural Network Loss Surfaces via Random Matrix Theory
  2. Resurrecting the sigmoid in deep learning through dynamical isometry: theory and practice
  3. Nonlinear random matrix theory for deep learning

## Lecture 8

### Readings

  1. Deep Learning without Poor Local Minima
  2. Topology and Geometry of Half-Rectified Network Optimization
  3. Convexified Convolutional Neural Networks
  4. Implicit Regularization in Matrix Factorization

## Lecture 9

### Readings

  1. Neocognitron: A self-organizing neural network model for a mechanism of pattern recognition unaffected by shift in position
  2. Perception as an inference problem
  3. A Neurobiological Model of Visual Attention and Invariant Pattern Recognition Based on Dynamic Routing of Information

## Lecture 10

### Readings

  1. Working Locally Thinking Globally: Theoretical Guarantees for Convolutional Sparse Coding
  2. Convolutional Neural Networks Analyzed via Convolutional Sparse Coding
  3. Multi-Layer Convolutional Sparse Modeling: Pursuit and Dictionary Learning
  4. Convolutional Dictionary Learning via Local Processing

## To be discussed and extra
