hrry121/Hyperparameter_Tuning

Folders and files

NameName
Last commit message
Last commit date

Latest commit

Β 

History

12 Commits
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 
Β 

Repository files navigation

πŸ” Hyperparameter Tuning with Cross-Validation & MLflow Tracking

This project demonstrates hyperparameter tuning on a regression model using Scikit-Learn's GridSearchCV, evaluated with Mean Squared Error (MSE), and tracked using MLflow for experiment management.


πŸ“Œ Project Highlights

  • Model Optimization via Grid Search over a defined hyperparameter space
  • Cross-Validation with cv=3 to estimate generalization performance
  • Metric Used: Mean Squared Error (MSE) for evaluation
  • Experiment Tracking with MLflow (parameters, metrics, artifacts, and models)
  • Best Model Selection based on the lowest average MSE across folds
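
The tuning step above can be sketched as follows. This is a hypothetical illustration (the actual `house_prediction.py` may use a different model, dataset, and grid): `GridSearchCV` with 3-fold CV, scored by negative MSE, here shown with a `Ridge` model on synthetic data.

```python
# Hypothetical sketch of the tuning step (model, data, and grid are
# illustrative, not taken from house_prediction.py).
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV

X, y = make_regression(n_samples=200, n_features=10, noise=0.1, random_state=42)

# Defined hyperparameter space; the real project may tune different parameters.
param_grid = {"alpha": [0.1, 1.0, 10.0]}

# scoring="neg_mean_squared_error": scikit-learn always maximizes the score,
# so MSE is negated; the best model is the one with the lowest average MSE
# across the 3 folds.
search = GridSearchCV(Ridge(), param_grid, cv=3,
                      scoring="neg_mean_squared_error")
search.fit(X, y)

best_mse = -search.best_score_  # convert back to a positive MSE
print(search.best_params_, best_mse)
```

Note that `search.best_estimator_` is automatically refit on the full training set with the winning parameters.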

πŸ› οΈ Tech Stack

  • Python
  • Scikit-Learn
  • MLflow
  • Pandas
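
A minimal `requirements.txt` consistent with this stack might look like the following (versions omitted; the repository's own file is authoritative):

```text
scikit-learn
mlflow
pandas
```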

πŸ“‚ Project Structure

β”œβ”€β”€ mlruns/              # MLflow logs (auto-generated)
β”œβ”€β”€ house_prediction.py  # Script to run tuning + tracking
β”œβ”€β”€ requirements.txt     # Required packages
└── README.md            # You're here!


