With increasing AI workloads and IoT data streams, centralized AI training platforms face scalability and cost problems: keeping all data and compute in one place drives up costs, overloads central servers, and congests the network.
We introduce an AI training and analytics platform that utilises edge resources for running AI and data workloads. The platform provides a local Streamlit application through which users collaborate and train AI models locally, with zero coding, while utilising their own resources. Because datasets never move through the network, the load on central servers and the network is reduced. The platform also stores IoT data and IoT-related AI predictions on edge servers, and pre-trained public models are cached on edge servers instead of central servers for faster delivery and lower network congestion.
User model training is done as federated learning through the Flower framework.
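The platform's Flower client and server code is not shown here, but the core aggregation step Flower performs by default (FedAvg) can be sketched in plain Python. The function name and toy weight lists below are illustrative, not from the codebase:

```python
# Sketch of federated averaging (FedAvg), the default aggregation
# strategy in Flower: the server combines client model weights as a
# mean weighted by each client's local dataset size. Names here are
# illustrative, not from this project's code.

def fed_avg(client_updates):
    """client_updates: list of (weights, num_examples) tuples,
    where weights is a flat list of floats."""
    total = sum(n for _, n in client_updates)
    dim = len(client_updates[0][0])
    return [
        sum(w[i] * n for w, n in client_updates) / total
        for i in range(dim)
    ]

# Two clients with different amounts of local data:
clients = [([1.0, 2.0], 100), ([3.0, 4.0], 300)]
print(fed_avg(clients))  # [2.5, 3.5]
```

Only model weights cross the network; raw data stays with each client, which is what keeps sensitive data local.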
- Users can train and use AI models through a user-friendly UI without any programming knowledge.
- Multiple users can train models collaboratively while keeping sensitive data locally, which ensures data privacy and reduces network traffic.
- Users can train a variety of custom AI models related to classification, regression, forecasting, anomaly detection, etc.
- The system is capable of processing live sensor/IoT data at the edge servers to collect data for model training, inference, and providing live insights.
- Collection of pre-trained models is cached and delivered through edge servers to users with minimal latency.
- Users can view real-time analytics and predictions related to sensor/IoT data in a centralized dashboard.
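The live-insight path above (processing sensor/IoT data at the edge and surfacing real-time analytics) reduces to scoring each incoming reading against recent history. A minimal sketch, assuming a simple rolling z-score detector; the actual models behind the edge prediction services may differ:

```python
from collections import deque
from statistics import mean, stdev

class RollingAnomalyDetector:
    """Flag a reading as anomalous if it lies more than `threshold`
    standard deviations from the rolling-window mean.
    Hypothetical illustration, not the project's detector."""
    def __init__(self, window=20, threshold=3.0):
        self.values = deque(maxlen=window)
        self.threshold = threshold

    def check(self, x):
        anomalous = False
        if len(self.values) >= 5:  # need some history first
            mu, sigma = mean(self.values), stdev(self.values)
            anomalous = sigma > 0 and abs(x - mu) > self.threshold * sigma
        self.values.append(x)
        return anomalous

det = RollingAnomalyDetector()
readings = [20.1, 20.3, 19.9, 20.0, 20.2, 20.1, 45.0]
flags = [det.check(r) for r in readings]
print(flags)  # only the 45.0 spike is flagged
```

Running a detector like this on the edge server means only the flagged events, not the full stream, need to reach the central dashboard.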
username: admin
password: fyp12345

username: navod
password: fyp12345

organization: fyp
token: 3wvWUxmtdBM03hm9YgTEa91s6ofQ73G4gQ54uNR0Ek59zpJNMGOagj1UR1GKw3D1f5Elw-zS78rEwY7akZGmOw==
url: http://localhost:5001/
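The organization/token/url triple above looks like an InfluxDB 2.x setup (this is an assumption; the README does not name the database). If so, sensor readings would be written in InfluxDB line protocol. A small, self-contained formatter for one reading; the measurement, tag, and field names are hypothetical:

```python
def to_line_protocol(measurement, tags, fields, ts_ns):
    """Format one sensor reading in InfluxDB line protocol:
    measurement,tag=val field=val timestamp (nanoseconds).
    Illustrative only; real writers should also escape special chars."""
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    return f"{measurement},{tag_str} {field_str} {ts_ns}"

line = to_line_protocol(
    "weather", {"station": "colombo"}, {"temp_c": 29.4}, 1700000000000000000
)
print(line)  # weather,station=colombo temp_c=29.4 1700000000000000000
```

A string like this would be POSTed to the write endpoint at the URL above, authenticated with the token.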
```shell
# build the extended Airflow image
cd edge_server/airflow_env
docker build . --tag extended_airflow:latest

# in the project root, initialise Airflow, then start the services
docker compose up airflow-init
docker-compose up -d
```
Run the temporary Python files in the mlflow folder.
```shell
# central server API (separate terminal)
cd central_server
uvicorn api:app --host 0.0.0.0 --port 8000 --reload
```

```shell
# central server event ingester (separate terminal)
cd central_server/event_ingester
python events_ingest.py
```

```shell
# edge server API (separate terminal)
cd edge_server
uvicorn api:app --host 0.0.0.0 --port 8001 --reload
```
```shell
# sensor simulators — run each from the project root, in its own terminal
cd sensors/common_cameras
python camera.py

cd sensors/power_plants
python plants.py

cd sensors/traffic_cameras
python camera.py

cd sensors/weather_sensors
python weather_stream.py
```
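The sensor scripts above simulate live IoT streams. Their internals are not shown in this README, but a generator like the following captures the pattern: emit timestamped readings, one per interval, with values that drift realistically. The function name and random-walk parameters are hypothetical:

```python
import random
import time

def weather_stream(n, interval_s=0.0, seed=42):
    """Yield n simulated temperature readings as (timestamp, value).
    A small random walk keeps consecutive readings plausible.
    Illustrative sketch, not the project's weather_stream.py."""
    rng = random.Random(seed)
    temp = 28.0
    for _ in range(n):
        temp += rng.uniform(-0.5, 0.5)
        yield (time.time(), round(temp, 2))
        if interval_s:
            time.sleep(interval_s)

readings = list(weather_stream(5))
print(len(readings))  # 5
```

Each reading would then be pushed to the edge server (e.g. as line protocol or via its API) rather than streamed to the central server.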
```shell
# edge-side real-time prediction services — run each from the project root,
# in its own terminal
cd edge_server/power_gen_pred
python realtime_pred.py

cd edge_server/weather_pred
python realtime_pred.py

cd edge_server/traffic_pred
python traffic_pred.py

cd edge_server/crowd_pred
python crowd_pred.py
```
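Each realtime_pred script above pairs an incoming stream with a model. The plumbing can be sketched independently of the trained models: keep a window of recent readings, predict the next one, repeat. A minimal stand-in predictor, assuming a simple moving-average forecast (the deployed models are presumably learned, not this heuristic):

```python
from collections import deque

class MovingAverageForecaster:
    """Predict the next reading as the mean of the last `window` readings.
    Hypothetical stand-in for the project's trained forecasters."""
    def __init__(self, window=3):
        self.history = deque(maxlen=window)

    def update(self, value):
        self.history.append(value)

    def predict_next(self):
        return sum(self.history) / len(self.history)

f = MovingAverageForecaster(window=3)
for v in [10.0, 12.0, 14.0]:
    f.update(v)
print(f.predict_next())  # 12.0
```

In the real services the prediction would be written back to edge storage so the dashboard can display it alongside the live stream.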
```shell
# desktop application
cd desktop_app
streamlit run app.py
```
