mlcourse.ai – Open Machine Learning Course


🇷🇺 Russian version 🇷🇺

❗ The current session launched on October 1, 2018. Fill in this form to participate; you can still join ❗

Mirrors (🇬🇧 only): mlcourse.ai (main site), Kaggle Dataset (same notebooks as Kernels)

Outline

This is the list of published articles on medium.com 🇬🇧, habr.com 🇷🇺, and jqr.com 🇨🇳. Icons are clickable. Links to Kaggle Kernels (in English) are also given, so one can reproduce everything without installing a single package.

  1. Exploratory Data Analysis with Pandas 🇬🇧 🇷🇺 🇨🇳, Kaggle Kernel
  2. Visual Data Analysis with Python 🇬🇧 🇷🇺 🇨🇳, Kaggle Kernels: part1, part2
  3. Classification, Decision Trees and k Nearest Neighbors 🇬🇧 🇷🇺 🇨🇳, Kaggle Kernel
  4. Linear Classification and Regression 🇬🇧 🇷🇺 🇨🇳, Kaggle Kernels: part1, part2, part3, part4, part5
  5. Bagging and Random Forest 🇬🇧 🇷🇺 🇨🇳, Kaggle Kernels: part1, part2, part3
  6. Feature Engineering and Feature Selection 🇬🇧 🇷🇺 🇨🇳, Kaggle Kernel
  7. Unsupervised Learning: Principal Component Analysis and Clustering 🇬🇧 🇷🇺 🇨🇳, Kaggle Kernel
  8. Vowpal Wabbit: Learning with Gigabytes of Data 🇬🇧 🇷🇺 🇨🇳, Kaggle Kernel
  9. Time Series Analysis with Python, part 1 🇬🇧 🇷🇺 🇨🇳. Predicting the future with Facebook Prophet, part 2 🇬🇧 🇨🇳. Kaggle Kernels: part1, part2
  10. Gradient Boosting 🇬🇧 🇷🇺 🇨🇳, Kaggle Kernel
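
For a quick taste of the style of topic 1, here is a minimal, illustrative Pandas sketch. The file name `telecom_churn.csv` and the `Churn` column are assumptions made for this example; the actual course notebooks come with their own data files, and the Kaggle Kernels have the datasets pre-attached.

```python
# Illustrative sketch only; the file name and the "Churn" column are assumptions.
import pandas as pd

df = pd.read_csv("telecom_churn.csv")

print(df.shape)                                  # number of rows and columns
df.info()                                        # column dtypes and missing values
print(df.describe())                             # summary stats for numeric features
print(df["Churn"].value_counts(normalize=True))  # class balance of the target
```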

Lectures

Video lectures are uploaded to this YouTube playlist.

Introduction, video, slides

  1. Exploratory data analysis with Pandas, video
  2. Visualization, main plots for EDA, video
  3. Decision trees: theory and practical part
  4. Logistic regression: theoretical foundations, practical part (baselines in the "Alice" competition)
  5. Ensembles and Random Forest – part 1. Classification metrics – part 2. Example of a business task, predicting a customer payment – part 3
  6. Linear regression and regularization – theory, LASSO & Ridge, LTV prediction – practice
  7. Unsupervised learning - Principal Component Analysis and Clustering

Assignments

  1. Exploratory Data Analysis of Olympic games with Pandas. nbviewer. Deadline: October 14, 21:59 UTC+2
  2. Exploratory Data Analysis of US flights. nbviewer. Deadline: October 21, 21:59 UTC+2
  3. Decision trees. nbviewer. Deadline: October 28, 21:59 UTC+2. Optional: implementing a decision tree algorithm, nbviewer (no web forms and no credits, same deadline)
  4. Logistic regression. nbviewer. Deadline: November 4, 21:59 UTC+2
  5. Random Forest and Logistic Regression in credit scoring and movie reviews classification. nbviewer. Deadline: November 11, 21:59 UTC+2
  6. Beating baselines in "How good is your Medium article?". nbviewer. Deadline: November 18, 21:59 UTC+2
  7. PCA and clustering. nbviewer. Deadline: November 25, 21:59 UTC+2
  8. StackOverflow questions tagging with logistic regression. nbviewer. Deadline: December 2, 21:59 UTC+2

The following are demo versions. They are just for practice and do not affect the rating.

  1. Exploratory data analysis with Pandas, nbviewer, Kaggle Kernel
  2. Analyzing cardiovascular disease data, nbviewer, Kaggle Kernel
  3. Decision trees with a toy task and the UCI Adult dataset, nbviewer, Kaggle Kernel
  4. Linear Regression as an optimization problem, nbviewer, Kaggle Kernel
  5. Logistic Regression and Random Forest in the credit scoring problem, nbviewer, Kaggle Kernel
  6. Exploring OLS, Lasso and Random Forest in a regression task, nbviewer, Kaggle Kernel
  7. Unsupervised learning, nbviewer, Kaggle Kernel
  8. Implementing online regressor, nbviewer, Kaggle Kernel
  9. Time series analysis, nbviewer, Kaggle Kernel
  10. Gradient boosting and flight delays, nbviewer, Kaggle Kernel
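
For a flavor of what the demo notebooks look like, here is a minimal scikit-learn decision tree sketch in the spirit of demo 3. The synthetic two-feature dataset below is made up purely for illustration and is not part of any assignment.

```python
# Illustrative sketch only; the data is synthetic, not a course dataset.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

rng = np.random.RandomState(17)
X = rng.rand(300, 2)                     # two random features in [0, 1)
y = (X[:, 0] + X[:, 1] > 1).astype(int)  # simple, nearly separable target

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=17
)

tree = DecisionTreeClassifier(max_depth=3, random_state=17)
tree.fit(X_train, y_train)
print(accuracy_score(y_test, tree.predict(X_test)))  # held-out accuracy
```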

Kaggle competitions

  1. Catch Me If You Can: Intruder Detection through Webpage Session Tracking. Kaggle Inclass
  2. How good is your Medium article? Kaggle Inclass

Rating

Throughout the course, we maintain a student rating that takes into account credits scored in assignments and Kaggle competitions. Top students (according to the final rating) will be listed on a special Wiki page.

Community

Discussions between students are held in the #mlcourse_ai channel of the OpenDataScience Slack team. Fill in this form to get an invitation. The form will also ask you some personal questions; don't hesitate 👋

More info

Go to mlcourse.ai

The course is free, but you can support the organizers by making a pledge on Patreon.