Standardized Serverless ML Inference Platform on Kubernetes
This project showcases an LLMOps pipeline that fine-tunes a small LLM to serve as a fallback during outages of a hosted LLM service.
Purpose-built OS for Kubernetes, fully managed by Kubernetes.
Minimal and free Kubernetes distribution with Terraform
Tools to process books in a cloud-based pipeline system
A small form factor OpenShift/Kubernetes optimized for edge computing