ML Studio

Link to YouTube channel: Machine Learning Studio

Deep Learning Series playlist

| Video | Description |
|---|---|
| Video 1: Top 10 Activation Functions | A review of activation functions |
| Video 2: A Review of Top NN Optimizers | A review of the top 16 optimization algorithms for training neural networks |

Attention mechanism & Transformers playlist

| Video | Description |
|---|---|
| Video 1: Matrix Multiplication Concept Explained | Linear algebra concepts (prerequisite to the attention mechanism) |
| Video 2: Self-Attention Using Scaled Dot-Product Approach | Understanding the self-attention mechanism, and an intro to scaled dot-product attention (see the sketch after this table) |
| Video 3: A Dive Into Multihead Attention, Self-Attention and Cross-Attention | Multihead attention |
| Video 4: Transformer Architecture | The Transformer architecture |
| Video 5: PostLN, PreLN and ResiDual Transformers | LayerNorm placement in the Transformer |
| Video 6: Variants of Multi-head Attention: MQA and GQA | MQA and GQA |
| Video 7: Efficient Self-Attention | Reducing the complexity of self-attention |
| Video 8: Implementing Linear-Complexity Attention | Implementing linear attention in PyTorch |
| Video 9: Introducing a new series on Vision Transformers | Introduction and outline of the Vision Transformers series |
| Video 10: Self Attention in Image Domain | Self-attention in the image domain: the Non-Local Module |
| Video 11: Relative Self-Attention Explained | Mechanics of relative self-attention |
| Video 12: Evolution of Self-Attention in Vision | Attention-Augmented Convolution (AANet), Stand-Alone Self-Attention (SASA), and Stand-Alone Axial Attention (SAAA) |
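
For quick reference, the scaled dot-product attention covered in Video 2 can be sketched in a few lines of PyTorch. This is a minimal illustrative sketch, not code taken from the videos; the function name, tensor shapes, and the optional mask argument are assumptions made for the example.

```python
import math
import torch

def scaled_dot_product_attention(query, key, value, mask=None):
    """Minimal scaled dot-product attention (illustrative sketch).

    query, key, value: tensors of shape (batch, seq_len, d_model).
    mask: optional boolean tensor broadcastable to (batch, seq_len, seq_len),
          True where attention is allowed.
    """
    d_k = query.size(-1)
    # Similarity scores between every query and key, scaled by sqrt(d_k)
    scores = query @ key.transpose(-2, -1) / math.sqrt(d_k)
    if mask is not None:
        scores = scores.masked_fill(~mask, float("-inf"))
    # Softmax over the key dimension gives the attention weights
    weights = torch.softmax(scores, dim=-1)
    # Output is the attention-weighted sum of the values
    return weights @ value

# Example: self-attention over a batch of 2 sequences of length 5, d_model = 8
x = torch.randn(2, 5, 8)
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # torch.Size([2, 5, 8])
```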
