# MFML Lecture

| Week | MFML Lecture Focus | MFML Exercise (Python, 90 min) | Pedagogical Purpose |
|------|--------------------|--------------------------------|---------------------|
| 1 | Learning vs. data analysis; loss functions | NumPy refresher: vectors, dot products, simple loss functions (MSE) | Shift mindset from "data analysis" to "learning" |
| 2 | Linear algebra refresher; PCA/SVD (R) | PCA refresher on a known dataset; visualize variance directions | Align notation & geometry, no novelty overload |
| 3 | Regression as loss minimization | Linear regression from scratch via loss minimization | Bridge known regression → learning viewpoint |
| 4 | Neural networks: neuron & activations | Single-neuron model: forward pass + activation functions | First NN contact, zero frameworks |
| 5 | Backpropagation & gradients | Manual backprop for a 1–2 layer network | Demystify training mechanics early |
| 6 | Loss landscapes & optimization behavior | Gradient descent experiments: learning rate, conditioning | Understand why training fails or succeeds |
| 7 | Generalization, bias–variance | Overfitting demo: polynomial vs. NN models | Make generalization tangible |
| 8 | Probabilistic view of learning | Noise injection; likelihood vs. MSE comparison | Connect probability to physical data |
| 9 | Representation learning | Feature learning vs. hand-crafted features (simple NN) | Prepare Materials Genomics concepts |
| 10 | Latent spaces & autoencoders | Autoencoder with a framework (PyTorch/Keras) | First latent-space construction |
| 11 | Unsupervised objectives revisited | Clustering vs. autoencoder embeddings | Reframe known clustering methods |
| 12 | Uncertainty in predictions | Predictive uncertainty via ensembles / dropout | Teach model trust, not accuracy |
| 13 | Physics-informed learning | Simple constrained NN (penalty-based) | Bridge MFML → ML-PC PINNs |
| 14 | Explainability & limits | Sensitivity analysis & failure cases | Scientific responsibility closure |
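
## Exercise sketches

The sketches below illustrate the flavor of several of the weekly exercises. They are minimal, self-contained toy examples with made-up data and our own function names, not the official exercise notebooks or solutions.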
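
**Week 1.** A possible shape of the MSE part of the NumPy refresher; the function name and toy arrays are our own:

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean squared error: the average squared residual."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return np.mean((y_true - y_pred) ** 2)

y = np.array([1.0, 2.0, 3.0])
y_hat = np.array([1.1, 1.9, 3.2])
print(mse(y, y_hat))  # 0.02; zero would mean a perfect fit
```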
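
**Week 2.** PCA via SVD on a synthetic 2-D point cloud; the rows of `Vt` are the variance directions the exercise asks students to visualize. The dataset here is a placeholder, not the "known dataset" from the course:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0], [1.0, 0.5]])  # correlated cloud
Xc = X - X.mean(axis=0)                       # center the data first

U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained_var = S**2 / (len(Xc) - 1)          # variance along each principal direction
print("principal directions (rows):\n", Vt)
print("explained variance:", explained_var)
```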
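
**Week 3.** Linear regression "from scratch" as gradient descent on the MSE loss, one way to read the exercise description; hyperparameters and synthetic data are our own:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, size=100)
y = 2.0 * x + 0.5 + rng.normal(scale=0.1, size=100)  # ground truth: w=2, b=0.5

w, b, lr = 0.0, 0.0, 0.1
for _ in range(500):
    y_hat = w * x + b
    grad_w = 2 * np.mean((y_hat - y) * x)  # dL/dw for L = mean((y_hat - y)^2)
    grad_b = 2 * np.mean(y_hat - y)        # dL/db
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # should approach 2.0 and 0.5
```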
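
**Week 4.** A single-neuron forward pass with three common activations, framework-free as the table specifies; the weights and input are arbitrary:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    return np.maximum(0.0, z)

x = np.array([0.5, -1.2, 3.0])   # one input vector
w = np.array([0.1, 0.4, -0.2])   # weights
b = 0.05                         # bias

z = w @ x + b                    # pre-activation
print("sigmoid:", sigmoid(z), "relu:", relu(z), "tanh:", np.tanh(z))
```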
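
**Week 5.** Manual backprop for a one-hidden-layer network on a single example, applying the chain rule layer by layer; shapes, activation choice, and names are our own:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=(3,))            # input
y = 1.0                              # scalar target
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)

# forward pass
z1 = W1 @ x + b1
h = np.tanh(z1)
y_hat = (W2 @ h + b2)[0]
loss = (y_hat - y) ** 2

# backward pass: chain rule, output layer first
dy = 2 * (y_hat - y)                 # dL/dy_hat
dW2 = dy * h[None, :]                # dL/dW2
db2 = np.array([dy])
dh = dy * W2[0]                      # dL/dh
dz1 = dh * (1 - np.tanh(z1) ** 2)    # through the tanh nonlinearity
dW1 = np.outer(dz1, x)
db1 = dz1

print("loss:", loss, "||dW1||:", np.linalg.norm(dW1))
```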
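
**Week 6.** A learning-rate/conditioning experiment on an ill-conditioned quadratic: gradient descent on `f(x) = 0.5 xᵀAx` is stable only when the step size is below 2 divided by the largest eigenvalue of `A`. The matrix and learning rates are our own choices:

```python
import numpy as np

A = np.diag([1.0, 100.0])        # condition number 100

def grad(x):                     # gradient of f(x) = 0.5 * x^T A x
    return A @ x

for lr in (0.001, 0.019, 0.021):
    x = np.array([1.0, 1.0])
    for _ in range(200):
        x = x - lr * grad(x)
    print(f"lr={lr}: final point {x}")
# lr above 2/100 = 0.02 diverges along the stiff direction; a tiny lr crawls.
```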
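
**Week 7.** The polynomial half of the overfitting demo: train error falls with degree while held-out error typically rises. The target function, noise level, and degrees are placeholders:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.uniform(-1, 1, size=30)
y = np.sin(np.pi * x) + rng.normal(scale=0.2, size=30)
x_tr, y_tr, x_te, y_te = x[:20], y[:20], x[20:], y[20:]

for degree in (1, 3, 12):
    coeffs = np.polyfit(x_tr, y_tr, degree)  # least-squares polynomial fit
    err_tr = np.mean((np.polyval(coeffs, x_tr) - y_tr) ** 2)
    err_te = np.mean((np.polyval(coeffs, x_te) - y_te) ** 2)
    print(f"degree {degree:2d}: train MSE {err_tr:.3f}, test MSE {err_te:.3f}")
```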
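
**Week 8.** One way to show the likelihood-vs-MSE connection: under Gaussian noise, the maximum-likelihood estimate and the MSE minimizer coincide. The measurement setup is a toy stand-in for "physical data":

```python
import numpy as np

rng = np.random.default_rng(4)
data = rng.normal(loc=2.0, scale=0.5, size=1000)  # noisy measurements of mu = 2

mus = np.linspace(1.0, 3.0, 401)
mse = np.array([np.mean((data - m) ** 2) for m in mus])
nll = np.array([np.sum(0.5 * ((data - m) / 0.5) ** 2) for m in mus])  # Gaussian NLL up to constants

print("argmin MSE:", mus[np.argmin(mse)])
print("argmin NLL:", mus[np.argmin(nll)])  # same minimizer
```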
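
**Week 10.** A tiny fully connected autoencoder in PyTorch (the table also allows Keras); dimensions and the random "data" are placeholders, not the course dataset:

```python
import torch
import torch.nn as nn

class AutoEncoder(nn.Module):
    def __init__(self, dim_in=20, dim_latent=2):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(dim_in, 8), nn.ReLU(), nn.Linear(8, dim_latent))
        self.decoder = nn.Sequential(nn.Linear(dim_latent, 8), nn.ReLU(), nn.Linear(8, dim_in))

    def forward(self, x):
        z = self.encoder(x)      # latent representation
        return self.decoder(z)

model = AutoEncoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
X = torch.randn(256, 20)         # placeholder data

for _ in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(X), X)  # reconstruction loss
    loss.backward()
    opt.step()
print("final reconstruction MSE:", loss.item())
```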
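
**Week 12.** The dropout route to predictive uncertainty: keep dropout active at prediction time and read the spread of repeated stochastic forward passes as uncertainty. (An ensemble would instead average several independently trained networks.) The network here is untrained, so only the mechanics are shown:

```python
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Dropout(p=0.2), nn.Linear(64, 1))
net.train()                      # keep dropout active at prediction time

x = torch.linspace(-2, 2, 50).unsqueeze(1)
with torch.no_grad():
    samples = torch.stack([net(x) for _ in range(100)])  # 100 stochastic passes

mean, std = samples.mean(dim=0), samples.std(dim=0)
print("predictive std (first 5 points):", std[:5].squeeze())
```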
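
**Week 13.** A penalty-based physics constraint in miniature: fit `u(t)` to data while penalizing the residual of `du/dt = -u` at collocation points. The toy ODE and all names are our own; the course's constrained-NN exercise may use a different constraint:

```python
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

t_data = torch.tensor([[0.0], [1.0]])
u_data = torch.exp(-t_data)       # observations of the exact solution u = exp(-t)
t_col = torch.linspace(0, 2, 50).unsqueeze(1).requires_grad_(True)  # collocation points

for _ in range(1000):
    opt.zero_grad()
    u = net(t_col)
    du_dt = torch.autograd.grad(u.sum(), t_col, create_graph=True)[0]
    loss_phys = ((du_dt + u) ** 2).mean()                 # residual of du/dt = -u
    loss_data = ((net(t_data) - u_data) ** 2).mean()
    (loss_data + loss_phys).backward()
    opt.step()

print("u(0.5):", net(torch.tensor([[0.5]])).item(), "(exact exp(-0.5) ~ 0.607)")
```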
