End-to-End Kidney Disease Classification model using MLflow, DVC, DagsHub, AWS, GitHub Actions, and Docker
- Update config.yaml
- Update secrets.yaml [Optional]
- Update params.yaml
- Update the entity
- Update the configuration manager in src config
- Update the components
- Update the pipeline
- Update the main.py
- Update the dvc.yaml
- Update the app.py
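As a rough illustration of the kind of values params.yaml might hold for this project (the key names below are assumptions for a VGG16 transfer-learning setup, not necessarily the repo's actual schema):

```yaml
# params.yaml -- hypothetical keys; adjust to the repo's actual schema
AUGMENTATION: True
IMAGE_SIZE: [224, 224, 3]   # VGG16 default input size
BATCH_SIZE: 16
INCLUDE_TOP: False
EPOCHS: 10
CLASSES: 2
WEIGHTS: imagenet
LEARNING_RATE: 0.01
```

Keeping hyperparameters here (rather than hard-coded in the components) is what lets DVC detect parameter changes and re-run only the affected pipeline stages.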
Clone the repository
https://github.com/Jkanishkha0305/Kidney_Disease_Classification
conda create -n kidney python=3.8 -y
conda activate kidney
pip install -r requirements.txt
Dataset : Kaggle [CT KIDNEY DATASET: Normal-Cyst-Tumor and Stone]
Dataset Link : https://www.kaggle.com/datasets/nazmul0087/ct-kidney-dataset-normal-cyst-tumor-and-stone
Size : 1.66 GB
Files : 12.4k Files
Keras API link : https://keras.io/api/applications/
Model Chosen : VGG16
Model link : https://keras.io/api/applications/vgg/#vgg16-function
Image size : 224x224x3
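One way to sanity-check that input size: VGG16's fully connected head expects a 7x7x512 feature map, which is exactly what a 224x224x3 input produces after five 2x2 poolings. A quick back-of-the-envelope parameter count for the standard VGG16 architecture (illustrative arithmetic, not code from this repo):

```python
# Parameter count for the standard VGG16 architecture (3x3 convs + 3 FC layers).
# 'M' marks a 2x2 max-pool, which halves the spatial size.
cfg = [64, 64, 'M', 128, 128, 'M', 256, 256, 256, 'M',
       512, 512, 512, 'M', 512, 512, 512, 'M']

params, in_ch, size = 0, 3, 224
for v in cfg:
    if v == 'M':
        size //= 2                         # 224 -> 112 -> 56 -> 28 -> 14 -> 7
    else:
        params += v * (in_ch * 3 * 3 + 1)  # 3x3 weights + bias per filter
        in_ch = v

flat = in_ch * size * size                 # 512 * 7 * 7 = 25088 inputs to the head
for out in (4096, 4096, 1000):             # two FC layers + ImageNet classifier
    params += flat * out + out
    flat = out

print(size, params)  # 7 138357544
```

The total matches the well-known ~138M parameters of VGG16; with `include_top=False` for transfer learning, the large FC head is dropped and replaced with a small classifier for the kidney classes.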
# First, run the training pipeline
python main.py
# Then, run the web app
python app.py
Now,
open up your localhost and port
- mlflow ui
import dagshub
dagshub.init(repo_owner='Jkanishkha0305', repo_name='Kidney_Disease_Classification', mlflow=True)

import mlflow
with mlflow.start_run():
    mlflow.log_param('parameter name', 'value')
    mlflow.log_metric('metric name', 1)
MLFLOW_TRACKING_URI=https://dagshub.com/Jkanishkha0305/Kidney_Disease_Classification.mlflow \
MLFLOW_TRACKING_USERNAME=Jkanishkha0305 \
MLFLOW_TRACKING_PASSWORD=6bd16d7f5ee713eba8329fb353c637dc8de93b55 \
python script.py
Or run this to export them as environment variables:
export MLFLOW_TRACKING_URI=https://dagshub.com/Jkanishkha0305/Kidney_Disease_Classification.mlflow
export MLFLOW_TRACKING_USERNAME=Jkanishkha0305
export MLFLOW_TRACKING_PASSWORD=6bd16d7f5ee713eba8329fb353c637dc8de93b55
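The MLflow client reads these variables from the environment at run time, so they can also be set from Python before any runs are started. A minimal sketch mirroring the exports above (keep the actual token out of source code):

```python
import os

# Equivalent to the shell exports above: MLflow's tracking client picks these
# environment variables up when no tracking URI is set explicitly in code.
os.environ["MLFLOW_TRACKING_URI"] = (
    "https://dagshub.com/Jkanishkha0305/Kidney_Disease_Classification.mlflow"
)
os.environ["MLFLOW_TRACKING_USERNAME"] = "Jkanishkha0305"
# os.environ["MLFLOW_TRACKING_PASSWORD"] = "<token>"  # load from a secret store

print(os.environ["MLFLOW_TRACKING_URI"].endswith(".mlflow"))  # True
```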
- dvc init
- dvc repro
- dvc dag
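For reference, a stage in dvc.yaml has roughly this shape; `dvc repro` re-runs a stage only when its `deps` or `params` change. The stage name and paths below are assumptions for illustration, not necessarily the repo's actual pipeline:

```yaml
stages:
  data_ingestion:
    cmd: python src/cnnClassifier/pipeline/stage_01_data_ingestion.py
    deps:
      - src/cnnClassifier/pipeline/stage_01_data_ingestion.py
      - config/config.yaml
    outs:
      - artifacts/data_ingestion
```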
MLflow
- It's production-grade
- Tracks all of your experiments
- Logs & tags your models
DVC
- Very lightweight, suited for POCs
- Lightweight experiment tracker
- Can perform orchestration (creating pipelines)
# With specific access
1. EC2 access: EC2 is a virtual machine
2. ECR access: Elastic Container Registry, to store your Docker images in AWS
#Description: About the deployment
1. Build the Docker image of the source code
2. Push your Docker image to ECR
3. Launch your EC2 instance
4. Pull your image from ECR in EC2
5. Launch your Docker image in EC2
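From the command line, the build-and-push steps look roughly like this (account ID, region, and repo name taken from the ECR URI saved below; illustrative commands, to be adapted to your account):

```bash
# Authenticate Docker to ECR, then build, tag, and push the image
aws ecr get-login-password --region us-east-1 \
  | docker login --username AWS --password-stdin 729852511714.dkr.ecr.us-east-1.amazonaws.com
docker build -t kidney .
docker tag kidney:latest 729852511714.dkr.ecr.us-east-1.amazonaws.com/kidney:latest
docker push 729852511714.dkr.ecr.us-east-1.amazonaws.com/kidney:latest
```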
#Policy:
1. AmazonEC2ContainerRegistryFullAccess
2. AmazonEC2FullAccess
- Save the URI: 729852511714.dkr.ecr.us-east-1.amazonaws.com/kidney
# Optional
sudo apt-get update -y
sudo apt-get upgrade
# Required
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
sudo usermod -aG docker ubuntu
newgrp docker
Settings > Actions > Runners > New self-hosted runner > choose OS > then run the commands one by one
Settings > Secrets and variables > Actions > New repository secret
(values come from the access key .csv that was downloaded)
AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=
AWS_REGION=us-east-1
AWS_ECR_LOGIN_URI=729852511714.dkr.ecr.us-east-1.amazonaws.com
ECR_REPOSITORY_NAME=kidney
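These secrets are then consumed by the CI/CD workflow. A skeletal .github/workflows file might look like this (job and step names are illustrative, not the repo's exact file):

```yaml
name: CI/CD

on:
  push:
    branches: [main]

jobs:
  build-and-push:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ${{ secrets.AWS_REGION }}
      - name: Build, tag, and push image to ECR
        run: |
          docker build -t ${{ secrets.ECR_REPOSITORY_NAME }} .
          docker tag ${{ secrets.ECR_REPOSITORY_NAME }}:latest ${{ secrets.AWS_ECR_LOGIN_URI }}/${{ secrets.ECR_REPOSITORY_NAME }}:latest
          docker push ${{ secrets.AWS_ECR_LOGIN_URI }}/${{ secrets.ECR_REPOSITORY_NAME }}:latest

  deploy:
    runs-on: self-hosted   # the EC2 runner registered above
    needs: build-and-push
    steps:
      - name: Pull and run the image
        run: |
          docker pull ${{ secrets.AWS_ECR_LOGIN_URI }}/${{ secrets.ECR_REPOSITORY_NAME }}:latest
          docker run -d -p 8080:8080 ${{ secrets.AWS_ECR_LOGIN_URI }}/${{ secrets.ECR_REPOSITORY_NAME }}:latest
```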