Add initial documentation (#224)
Adrian Gonzalez-Martin authored Jul 9, 2021
1 parent 9becdd1 commit eddad0d
Showing 99 changed files with 489 additions and 112 deletions.
3 changes: 3 additions & 0 deletions .gitignore
@@ -42,3 +42,6 @@ _bundle
# Conda-packed environment
old-sklearn.tar.gz
mlruns

# Sphinx documentation
docs/_build/
20 changes: 20 additions & 0 deletions .readthedocs.yaml
@@ -0,0 +1,20 @@
# .readthedocs.yaml
# Read the Docs configuration file
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details

# Required
version: 2

# Build documentation in the docs/ directory with Sphinx
sphinx:
  configuration: docs/conf.py

# Optionally build your docs in additional formats such as PDF
formats:
  - pdf

# Optionally set the version of Python and requirements required to build your docs
python:
  version: 3.7
  install:
    - requirements: docs/requirements.txt
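
With this configuration, Read the Docs should install `docs/requirements.txt` into a Python 3.7 build environment, run Sphinx against `docs/conf.py`, and publish the HTML site alongside a downloadable PDF.
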
2 changes: 1 addition & 1 deletion Makefile
@@ -62,7 +62,7 @@ lint: generate
	mypy $$_runtime; \
	done
	mypy ./benchmarking
	mypy ./examples
	mypy ./docs/examples
	# Check if something has changed after generation
	git \
		--no-pager diff \
61 changes: 29 additions & 32 deletions README.md
@@ -1,8 +1,6 @@
# MLServer

An open source inference server to serve your machine learning models.

> :warning: **This is a Work in Progress**.
An open source inference server for your machine learning models.

## Overview

@@ -32,55 +30,54 @@ pip install mlserver-sklearn
```

For further information on how to use MLServer, you can check any of the
[available examples](#Examples).
[available examples](#examples).

## Inference Runtimes

Inference runtimes allow you to define how your model should be used within
MLServer.
You can think of them as the **backend glue** between MLServer and your machine
learning framework of choice.
You can read more about [inference runtimes in their documentation
page](./docs/runtimes/index.md).

Out of the box, MLServer comes with a set of pre-packaged runtimes which let
you interact with a subset of common ML frameworks.
you interact with a subset of common frameworks.
This allows you to start serving models saved in these frameworks straight
away.

To avoid bringing in dependencies for frameworks that you don't need to use,
these runtimes are implemented as independent optional packages.
This mechanism also allows you to roll out your own custom runtimes very easily, as sketched below.
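
As a very rough sketch (simplified, and not part of this commit; the class and tensor names are illustrative), a custom runtime is a Python class that extends `mlserver.MLModel` and overrides its async `load()` and `predict()` hooks:

```python
from mlserver import MLModel
from mlserver.types import InferenceRequest, InferenceResponse, ResponseOutput


class EchoRuntime(MLModel):
    """Illustrative runtime that echoes the first input tensor back."""

    async def load(self) -> bool:
        # Load your model artifact here (weights, pickle, etc.).
        self.ready = True
        return self.ready

    async def predict(self, payload: InferenceRequest) -> InferenceResponse:
        # Read the first input tensor and return it unchanged.
        data = list(payload.inputs[0].data)
        return InferenceResponse(
            model_name=self.name,
            outputs=[
                ResponseOutput(
                    name="echo",
                    shape=[len(data)],
                    datatype="FP32",
                    data=data,
                )
            ],
        )
```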

To pick which runtime you want to use for your model, you just need to make
sure that the right package is installed, and then point to the correct runtime
class in your `model-settings.json` file.
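
For instance, a minimal `model-settings.json` for the Scikit-Learn runtime could look roughly like the following (the model name and artifact path are placeholders):

```json
{
  "name": "my-sklearn-model",
  "implementation": "mlserver_sklearn.SKLearnModel",
  "parameters": {
    "uri": "./model.joblib"
  }
}
```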

The included runtimes are:
Out of the box, MLServer provides support for:

| Framework | Package Name | Implementation Class | Example | Source Code |
| ------------ | ------------------- | --------------------------------- | ---------------------------------------------------- | ---------------------------------------------------------------- |
| Scikit-Learn | `mlserver-sklearn` | `mlserver_sklearn.SKLearnModel` | [Scikit-Learn example](./examples/sklearn/README.md) | [`./runtimes/sklearn`](./runtimes/sklearn) |
| XGBoost | `mlserver-xgboost` | `mlserver_xgboost.XGBoostModel` | [XGBoost example](./examples/xgboost/README.md) | [`./runtimes/xgboost`](./runtimes/xgboost) |
| Spark MLlib | `mlserver-mllib` | `mlserver_mllib.MLlibModel` | Coming Soon | [`./runtimes/mllib`](./runtimes/mllib) |
| LightGBM | `mlserver-lightgbm` | `mlserver_lightgbm.LightGBMModel` | [LightGBM example](./examples/lightgbm/README.md) | [`./runtimes/lightgbm`](./runtimes/lightgbm) |
| Tempo | `tempo` | `tempo.mlserver.InferenceRuntime` | [Tempo example](./examples/tempo/README.md) | [`github.com/SeldonIO/tempo`](https://github.com/SeldonIO/tempo) |
| MLflow | `mlserver-mlflow` | `mlserver_mlflow.MLflowRuntime` | [MLflow example](./examples/mlflow/README.md) | [`./runtimes/mlflow`](./runtimes/mlflow) |
| Framework | Supported | Documentation |
| ------------ | --------- | ---------------------------------------------------------------- |
| Scikit-Learn | :+1: | [MLServer SKLearn](./runtimes/sklearn) |
| XGBoost | :+1: | [MLServer XGBoost](./runtimes/xgboost) |
| Spark MLlib | :+1: | [MLServer MLlib](./runtimes/mllib) |
| LightGBM | :+1: | [MLServer LightGBM](./runtimes/lightgbm) |
| Tempo | :+1: | [`github.com/SeldonIO/tempo`](https://github.com/SeldonIO/tempo) |
| MLflow | :+1: | [MLServer MLflow](./runtimes/mlflow) |

## Examples

On the list below, you can find a few examples on how you can leverage
`mlserver` to start serving your machine learning models.
To see MLServer in action, check out [our full list of
examples](./docs/examples/index.md).
You can find below a few selected examples showcasing how you can leverage
MLServer to start serving your machine learning models.

- [Serving a `scikit-learn` model](./examples/sklearn/README.md)
- [Serving a `xgboost` model](./examples/xgboost/README.md)
- [Serving a `lightgbm` model](./examples/lightgbm/README.md)
- [Serving a `tempo` pipeline](./examples/tempo/README.md)
- [Serving a custom model](./examples/custom/README.md)
- [Multi-Model Serving with multiple frameworks](./examples/mms/README.md)
- [Loading / unloading models from a model repository](./examples/model-repository/README.md)
- [Serving a `scikit-learn` model](./docs/examples/sklearn/README.md)
- [Serving a `xgboost` model](./docs/examples/xgboost/README.md)
- [Serving a `lightgbm` model](./docs/examples/lightgbm/README.md)
- [Serving a `tempo` pipeline](./docs/examples/tempo/README.md)
- [Serving a custom model](./docs/examples/custom/README.md)
- [Multi-Model Serving with multiple frameworks](./docs/examples/mms/README.md)
- [Loading / unloading models from a model repository](./docs/examples/model-repository/README.md)

## Developer Guide

### Versioning

Both the main `mlserver` package and the [inference runtimes
packages](./runtimes) try to follow the same versioning schema.
packages](./docs/runtimes/index.md) try to follow the same versioning schema.
To bump the version across all of them, you can use the
[`./hack/update-version.sh`](./hack/update-version.sh) script.
For example:
23 changes: 23 additions & 0 deletions docs/Makefile
@@ -0,0 +1,23 @@
# Minimal makefile for Sphinx documentation
#

# You can set these variables from the command line, and also
# from the environment for the first two.
SPHINXOPTS ?=
SPHINXBUILD ?= sphinx-build
SOURCEDIR = .
BUILDDIR = _build

# Put it first so that "make" without argument is like "make help".
help:
	@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

.PHONY: help Makefile install-dev

# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
	@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

install-dev:
	pip install -r ./requirements.txt
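
Assuming the doc requirements are installed first (`make install-dev`), any other target such as `make html` should be routed through the catch-all rule to `sphinx-build -M`, leaving the rendered output under `_build/`.
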
20 changes: 20 additions & 0 deletions docs/_static/css/custom.css
@@ -0,0 +1,20 @@

/* Hide first elem of nav bar */
.md-tabs__list > li:first-child {
  display: none;
}

dt {
  display: table;
  margin: 6px 0;
  margin-top: 6px;
  font-size: 90%;
  line-height: normal;
  background: #e7f2fa;
  color: #2980B9;
  border-top: solid 3px #6ab0de;
  padding: 6px;
  position: relative;
}


1 change: 1 addition & 0 deletions docs/assets/architecture.svg
180 changes: 180 additions & 0 deletions docs/conf.py
@@ -0,0 +1,180 @@
# Configuration file for the Sphinx documentation builder.
#
# This file only contains a selection of the most common options. For a full
# list see the documentation:
# https://www.sphinx-doc.org/en/master/usage/configuration.html

# -- Path setup --------------------------------------------------------------

# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
# import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))


# -- Project information -----------------------------------------------------
import sphinx_material

project = "MLServer"
copyright = "2021, Seldon Technologies"
html_title = "MLServer"
author = "Seldon Technologies"

# The full version, including alpha/beta/rc tags
release = "0.4.0.dev1"


# -- General configuration ---------------------------------------------------

# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
    # "sphinx.ext.autodoc",
    # Creates .nojekyll config
    # "sphinx.ext.githubpages",
    # Converts markdown to rst
    "myst_parser",
    # "sphinx.ext.napoleon",
    # automatically generate API docs
    # see https://github.com/rtfd/readthedocs.org/issues/1139
    # "sphinxcontrib.apidoc",
]

# Add any paths that contain templates here, relative to this directory.
templates_path = ["_templates"]

# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# This pattern also affects html_static_path and html_extra_path.
exclude_patterns = ["_build", "Thumbs.db", ".DS_Store"]

# apidoc settings
apidoc_module_dir = "../mlserver"
apidoc_output_dir = "api"
apidoc_excluded_paths = ["**/*test*"]
apidoc_module_first = True
apidoc_separate_modules = True
apidoc_extra_args = ["-d 6"]

# mock imports
autodoc_mock_imports = [
    "pandas",
    "sklearn",
    "skimage",
    "requests",
    "cv2",
    "bs4",
    "keras",
    "seaborn",
    "PIL",
    "tensorflow",
    "spacy",
    "numpy",
    "tensorflow_probability",
    "scipy",
    "matplotlib",
    "creme",
    "cloudpickle",
    "fbprophet",
    "dask",
    "transformers",
]


# Napoleon settings
napoleon_google_docstring = True
napoleon_numpy_docstring = True
napoleon_include_init_with_doc = True
napoleon_include_private_with_doc = False
napoleon_include_special_with_doc = True
napoleon_use_admonition_for_examples = False
napoleon_use_admonition_for_notes = False
napoleon_use_admonition_for_references = False
napoleon_use_ivar = False
napoleon_use_param = True
napoleon_use_rtype = False


# -- Options for HTML output -------------------------------------------------

# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
#
# Chosen Themes:
# * https://github.com/bashtage/sphinx-material/
# * https://github.com/myyasuda/sphinx_materialdesign_theme
html_theme = "sphinx_material"

if html_theme == "sphinx_material":
    html_theme_options = {
        "google_analytics_account": "",
        "base_url": "https://mlserver.readthedocs.io",
        "color_primary": "teal",
        "color_accent": "light-blue",
        "repo_url": "https://github.com/SeldonIO/MLServer/",
        "repo_name": "MLServer",
        "globaltoc_depth": 2,
        "globaltoc_collapse": False,
        "globaltoc_includehidden": True,
        "repo_type": "github",
        "nav_links": [
            {
                "href": "https://docs.seldon.io",
                "internal": False,
                "title": "🚀 Our Other Projects & Products:",
            },
            {
                "href": "https://docs.seldon.io/projects/seldon-core/en/latest/",
                "internal": False,
                "title": "Seldon Core",
            },
            {
                "href": "https://docs.seldon.io/projects/alibi/en/stable/",
                "internal": False,
                "title": "Alibi Explain",
            },
            {
                "href": "https://docs.seldon.io/projects/alibi-detect/en/stable/",
                "internal": False,
                "title": "Alibi Detect",
            },
            {
                "href": "https://tempo.readthedocs.io/en/latest/",
                "internal": False,
                "title": "Tempo SDK",
            },
            {
                "href": "https://deploy.seldon.io/",
                "internal": False,
                "title": "Seldon Deploy (Enterprise)",
            },
            {
                "href": (
                    "https://github.com/SeldonIO/seldon-deploy-sdk#seldon-deploy-sdk"
                ),
                "internal": False,
                "title": "Seldon Deploy SDK (Enterprise)",
            },
        ],
    }

    extensions.append("sphinx_material")
    html_theme_path = sphinx_material.html_theme_path()
    html_context = sphinx_material.get_html_context()

html_sidebars = {
    "**": ["logo-text.html", "globaltoc.html", "localtoc.html", "searchbox.html"]
}

# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ["_static"]

html_css_files = [
    "css/custom.css",
]
File renamed without changes.
File renamed without changes.
@@ -4,10 +4,10 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# Custom environments in MLServer\n",
"# Custom Conda environments in MLServer\n",
"\n",
"It's not unusual that model runtimes require extra dependencies that are not direct dependencies of MLServer.\n",
"This is the case when we want to use [custom runtimes](../custom), but also when our model artifacts are the output of older versions of a toolkit (e.g. models trained with an older version of SKLearn).\n",
"This is the case when we want to use [custom runtimes](../custom/README), but also when our model artifacts are the output of older versions of a toolkit (e.g. models trained with an older version of SKLearn).\n",
"\n",
"In these cases, since these dependencies (or dependency versions) are not known in advance by MLServer, they **won't be included in the default `seldonio/mlserver` Docker image**.\n",
"To cover these cases, the **`seldonio/mlserver` Docker image allows you to load custom environments** before starting the server itself.\n",
@@ -76,7 +76,7 @@
"We can now train and save a Scikit-Learn model using the older version of our environment.\n",
"This model will be serialised as `model.joblib`.\n",
"\n",
"You can find more details of this process in the [Scikit-Learn example](../sklearn)."
"You can find more details of this process in the [Scikit-Learn example](../sklearn/README)."
]
},
{
@@ -257,7 +257,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.0"
"version": "3.7.8"
}
},
"nbformat": 4,