This repository contains my personal notes on Machine Learning, collected from my studies at Ca' Foscari University (Data Science program) and from independent learning. The notes are organized into different topics and stored in Markdown notebook files. They are a work in progress and will be updated as I continue studying and exploring new concepts.
Note: These notes are intended for personal learning and reference. They may contain errors or incomplete sections.
- Supervised and Unsupervised Learning
  - Overview of supervised and unsupervised methods
  - Key algorithms, use cases, and examples
- Fundamentals of Neural Networks
  - Core concepts of neural networks, including architecture, activation functions, and optimization
  - Model evaluation, overfitting, underfitting, and regularization techniques
- NLP and LLMs
  - Core concepts: tokenization, embeddings, language modeling, attention, seq2seq models, transformers
  - NLP tasks: text classification, sentiment analysis, NER, POS tagging, translation, summarization, question answering, and text generation
  - Large Language Models (LLMs), generative models, and applications
  - Tools and frameworks for NLP and LLMs
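As a small taste of the embedding-related concepts covered in the NLP notes, here is a minimal sketch of cosine similarity between word vectors. The three-dimensional "embeddings" below are made-up numbers purely for illustration; real embeddings are learned and have hundreds of dimensions.

```python
import math

def cosine_similarity(a, b):
    # cos(theta) = (a . b) / (||a|| * ||b||), using the dot product and L2 norms
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" (hypothetical values chosen so that cat and dog are close)
v_cat = [0.9, 0.1, 0.0]
v_dog = [0.8, 0.2, 0.1]
v_car = [0.0, 0.1, 0.9]

print(cosine_similarity(v_cat, v_dog))  # high: semantically related words
print(cosine_similarity(v_cat, v_car))  # low: unrelated words
```

Similarity scores like these underlie nearest-neighbor lookups in vector stores, which the RAG material builds on.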
- Generative AI
  - Types of generative models: GANs, VAEs, diffusion models, autoregressive models
  - Neural networks and deep learning concepts for generative models
  - Training, fine-tuning, and prompt engineering
  - Large Language Models (LLMs) and multi-modal AI
  - Retrieval-Augmented Generation (RAG), vector storage, and chunking
  - Tools, frameworks, and platforms for GenAI
To fully understand these notes, a solid grasp of the following mathematical and statistical concepts is recommended:
- Linear algebra
  - Vectors, matrices, and operations (addition, multiplication)
  - Dot product, cross product
  - Eigenvalues and eigenvectors
  - Matrix decomposition (SVD, LU)
- Calculus
  - Derivatives and gradients
  - Partial derivatives
  - Chain rule
  - Gradient descent and optimization basics
- Probability and statistics
  - Probability distributions (normal, binomial, Poisson)
  - Expectation, variance, covariance
  - Conditional probability and Bayes’ theorem
  - Sampling, estimation, hypothesis testing
- Other foundations
  - Logarithms and exponentials
  - Norms (L1, L2)
  - Basic combinatorics (for probabilistic reasoning)
These concepts are not explained in the notes.
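To illustrate how a few of these prerequisites fit together (gradients, gradient descent, and the squared L2 norm), here is a minimal sketch that minimizes a toy objective f(w) = ||w - t||² by following its gradient 2(w - t). The target vector and learning rate are arbitrary values chosen for the example.

```python
# Toy gradient descent on f(w) = ||w - target||^2 (hypothetical objective).
# The gradient of f at w is 2 * (w - target), so each step moves w toward target.
target = [3.0, -1.0]   # minimizer of f (arbitrary example values)
w = [0.0, 0.0]         # starting point
lr = 0.1               # learning rate (step size)

for _ in range(100):
    grad = [2 * (wi - ti) for wi, ti in zip(w, target)]   # gradient of f at w
    w = [wi - lr * gi for wi, gi in zip(w, grad)]          # descent step

print(w)  # close to [3.0, -1.0]
```

Each step shrinks the distance to the target by a constant factor (here 0.8), so the iterates converge geometrically to the minimizer.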
- Each Markdown file is self-contained and can be read independently.
- These notebooks are also available on my Kaggle profile.
- These notes are personal and intended as a study guide for ML concepts.
- They are a work in progress and will be updated as I continue learning.