
MLflow-server

This repo provides an MLflow container intended for robust self-hosted deployment: the container serves as the tracking server, with (optional but recommended) external Artifact and Backend Stores. This README details the recommended configuration.

⚠️ Deprecation Notice: No longer updating DockerHub repository ⚠️

Due to the March 2023 removal of Docker's free Team organization tier and Docker's history of price changes, images are no longer pushed to DockerHub. Please use ghcr.io/ninerealmlabs/mlflow-server:<tag> instead.

Quickstart

The quickstart directory provides a sample Docker Compose deployment demonstrating the MLflow Server and the MLflow AI Gateway, using S3 as the Artifact Store and Postgres as the Backend Store.

Experiment Tracking Server

The MLflow Tracking Server hosts the MLflow UI and proxies connections to the Artifact and Backend Stores.

Artifact Store

The Artifact Store is the storage location for the artifacts created by each run: model weights, images (.jpeg, .png), and model and data files. MLflow supports a variety of object storage solutions (S3, Azure Blob Storage, network-accessible storage, etc.).
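For example, with an S3-compatible store, the destination can be supplied via an environment variable (the bucket name and credentials below are placeholders):

```shell
# Hypothetical bucket; equivalent to the --artifacts-destination CLI option.
export MLFLOW_ARTIFACTS_DESTINATION="s3://mlflow-artifacts"
# Credentials for the bucket are supplied the usual way (placeholders shown):
export AWS_ACCESS_KEY_ID="changeme"
export AWS_SECRET_ACCESS_KEY="changeme"
```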

Backend Store

The Backend Store persists MLflow entities and metadata (runs, parameters, metrics, tags, notes, etc.) to a relational database.
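For a Postgres backend, the store is identified by a SQLAlchemy-style URI (host and credentials below are illustrative):

```shell
# Hypothetical host/credentials; equivalent to the --backend-store-uri CLI option.
export MLFLOW_BACKEND_STORE_URI="postgresql://mlflow:CHANGEME@postgres:5432/mlflow"
```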

Configuration

Configuration is done by setting environment variables. Generally, each environment variable corresponds to an mlflow server CLI option: prefix with MLFLOW, upper-case, and replace hyphens with underscores (e.g., --serve-artifacts becomes MLFLOW_SERVE_ARTIFACTS).
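The naming rule can be sketched in a line of shell (an illustration only, not an official tool):

```shell
# Sketch of the flag-to-variable naming convention described above:
# strip the leading "--", upper-case, and map hyphens to underscores.
flag="--serve-artifacts"
var="MLFLOW_$(printf '%s' "${flag#--}" | tr 'a-z-' 'A-Z_')"
echo "$var"   # MLFLOW_SERVE_ARTIFACTS
```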

Options for the stores are specified in the tracking server documentation.

Network security middleware

Review the network hardening guide and configure the container with explicit settings. The compose example accepts overrides via .env:

MLFLOW_SERVER_ALLOWED_HOSTS=mlflow.example.com,localhost:5555
MLFLOW_SERVER_CORS_ALLOWED_ORIGINS=https://app.example.com
MLFLOW_SERVER_X_FRAME_OPTIONS=DENY
...

Adjust these to match the domains that should reach the tracking server. Set MLFLOW_SERVER_X_FRAME_OPTIONS=NONE only when the UI must be embedded cross-origin, and avoid overriding MLFLOW_SERVER_ALLOWED_HOSTS with * outside of local development.

Single Sign-on

Single Sign-on (SSO) is provided by the mlflow-oidc-auth plugin.

Database migrations

When upgrading MLflow, a database migration may be required. MLflow provides a command for this; run it from the tracking server container:

mlflow db upgrade "$MLFLOW_BACKEND_STORE_URI"
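From the host, running it inside the container might look like the following (the service name mlflow-server is an assumption; match your compose file):

```shell
# Assumption: the compose service running the tracking server is named "mlflow-server".
docker compose exec mlflow-server \
  mlflow db upgrade "$MLFLOW_BACKEND_STORE_URI"
```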

Cleaning up deleted experiments
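Experiments deleted from the UI are only soft-deleted; the underlying database rows and artifacts remain. Recent MLflow releases provide an mlflow gc command to purge them permanently (a sketch; check your MLflow version for supported flags):

```shell
# Permanently remove runs/experiments in the "deleted" lifecycle stage.
# Run from the tracking server container so the backend store is reachable.
mlflow gc --backend-store-uri "$MLFLOW_BACKEND_STORE_URI"
```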

Gateway Server

This container can also be used to deploy the MLflow AI Gateway. The AI Gateway provides a central interface for deploying and managing multiple LLM providers, and supports the prompt engineering UI.

Follow the instructions in the AI Gateway Configuration documentation to create a config.yaml file with the specifications for the AI API services that will be routed through the AI Gateway.
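A minimal sketch of such a config.yaml, assuming a single OpenAI-backed chat endpoint (the endpoint name, model name, and API-key variable below are illustrative, not prescribed):

```yaml
endpoints:
  - name: chat
    endpoint_type: llm/v1/chat
    model:
      provider: openai
      name: gpt-4o-mini                  # illustrative model name
      config:
        openai_api_key: $OPENAI_API_KEY  # read from the environment
```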
