Notebooks for Kaggle competition
Leveraging machine learning 🚀 to predict math scores 📈 based on key features, enabling data-driven insights for better academic support and improvement strategies! 🚀
Accident damage prediction using a CatBoost regressor
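A minimal sketch of what a CatBoost regression setup for a problem like this could look like; the file and column names ("accidents.csv", "damage_cost") are hypothetical placeholders, not taken from the repository.

```python
# Sketch: CatBoost regressor on a tabular dataset with mixed column types.
# File name and column names below are assumptions for illustration only.
import pandas as pd
from catboost import CatBoostRegressor
from sklearn.model_selection import train_test_split

df = pd.read_csv("accidents.csv")                       # hypothetical input file
X, y = df.drop(columns=["damage_cost"]), df["damage_cost"]
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=42)

model = CatBoostRegressor(
    iterations=1000,
    learning_rate=0.05,
    depth=6,
    loss_function="RMSE",
    verbose=100,
)

# CatBoost handles categorical columns natively via cat_features,
# so no one-hot encoding is needed for string-typed columns.
cat_cols = [c for c in X.columns if X[c].dtype == "object"]
model.fit(X_train, y_train, cat_features=cat_cols, eval_set=(X_val, y_val))
preds = model.predict(X_val)
```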
This repository contains a project I completed for an NTU course titled CB4247 Statistics & Computational Inference to Big Data. In this project, I applied regression and machine learning techniques to predict house prices in India.
Detecting brain age from MRI scan data.
Linking Writing Processes to Writing Quality
Dynamically adjusting ride costs in response to changing factors
Math Score Predictor
A lightweight Kaggle challenge to predict crab age
Estimating abalone rings (age) based on their physical characteristics, such as gender, length, height, diameter, weight, etc.
Predicting house prices using advanced regression algorithms
This project aims to predict flight arrival delays using various machine learning algorithms. It involves EDA, feature engineering, and model tuning with XGBoost, LightGBM, CatBoost, SVM, Lasso, Ridge, Decision Tree, and Random Forest Regressors. The goal is to identify the best model for accurate predictions.
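A hedged sketch of the kind of model shoot-out that description suggests, comparing several regressors by cross-validated RMSE. Synthetic data stands in for the flight-delay features, and the hyperparameters are illustrative, not the repository's settings.

```python
# Sketch: compare several regressors with 5-fold cross-validated RMSE.
# Synthetic data replaces the real flight-delay features for a runnable example.
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import Lasso, Ridge
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR
from xgboost import XGBRegressor
from lightgbm import LGBMRegressor
from catboost import CatBoostRegressor

X, y = make_regression(n_samples=2000, n_features=20, noise=10.0, random_state=0)

models = {
    "Lasso": Lasso(alpha=0.01),
    "Ridge": Ridge(alpha=1.0),
    "SVM": SVR(C=10.0),
    "DecisionTree": DecisionTreeRegressor(max_depth=8),
    "RandomForest": RandomForestRegressor(n_estimators=300, n_jobs=-1),
    "XGBoost": XGBRegressor(n_estimators=500, learning_rate=0.05),
    "LightGBM": LGBMRegressor(n_estimators=500, learning_rate=0.05),
    "CatBoost": CatBoostRegressor(iterations=500, verbose=0),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="neg_root_mean_squared_error")
    print(f"{name:>12}: RMSE = {-scores.mean():.3f} ± {scores.std():.3f}")
```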
Predicting the sale prices of homes for sale and the prices of rental homes in Barcelona.
House Price Prediction
Development of a project for the thesis "AI in the reduction of cognitive biases (Anchoring Bias) in Brazilian consumption", built with Python and machine learning.
Predicting house prices using advanced regression techniques (LightGBM, XGBoost, CatBoost, stacking) on Kaggle’s Ames dataset.
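A rough sketch of a stacked ensemble in the spirit of that description, with LightGBM, XGBoost, and CatBoost as base learners and a ridge meta-learner on top; preprocessing of the Ames features is omitted and the settings are illustrative assumptions, not the repository's configuration.

```python
# Sketch: stacking LightGBM, XGBoost and CatBoost under a ridge meta-learner.
from sklearn.ensemble import StackingRegressor
from sklearn.linear_model import RidgeCV
from lightgbm import LGBMRegressor
from xgboost import XGBRegressor
from catboost import CatBoostRegressor

stack = StackingRegressor(
    estimators=[
        ("lgbm", LGBMRegressor(n_estimators=1000, learning_rate=0.03)),
        ("xgb", XGBRegressor(n_estimators=1000, learning_rate=0.03)),
        ("cat", CatBoostRegressor(iterations=1000, learning_rate=0.03, verbose=0)),
    ],
    final_estimator=RidgeCV(),
    cv=5,
)

# Typical usage on Ames-style data (X_train, y_train not defined here):
# stack.fit(X_train, np.log1p(y_train))   # log-transforming SalePrice is common
# preds = np.expm1(stack.predict(X_test))
```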
Code for the Kaggle single-cell competition (bronze medal)