
[Docs] Add better docs for using mlflow models serve #1495

Merged — 2 commits merged into mlflow:master on Jun 26, 2019

Conversation

aarondav (Contributor)
What changes are proposed in this pull request?

Was playing around with mlflow models serve, and it was surprisingly hard to figure out how to make a request against the spawned server -- had to dig through the mlflow code. This PR improves the docs to provide examples both in the CLI and the model deployment documentation.
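For context, the kind of request the improved docs describe can be sketched as follows. This is an illustrative example, not part of the PR diff: the column names and values are made up, and the payloads simply follow the pandas "split" and "records" JSON orientations that the server accepts.

```python
import json

# Example rows to score; column names and values are hypothetical.
columns = ["alcohol", "chlorides"]
rows = [[12.8, 0.029], [11.2, 0.041]]

# pandas "split" orientation: column names listed once, data as a list of rows.
split_payload = json.dumps({"columns": columns, "data": rows})

# pandas "records" orientation: one dict per row.
records_payload = json.dumps([dict(zip(columns, row)) for row in rows])

print(split_payload)
print(records_payload)
```

With the server running, either payload would be sent as a `POST` to the `/invocations` path with a `Content-Type: application/json` header (host and port depend on how `mlflow models serve` was invoked).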

Release Notes

Is this a user-facing change?

  • No. You can skip the rest of this section.
  • Yes. Give a description of this change to be included in the release notes for MLflow users.

What component(s) does this PR affect?

  • Docs
  • Models

How should the PR be classified in the release notes? Choose one:

  • rn/breaking-change - The PR will be mentioned in the "Breaking Changes" section
  • rn/none - No description will be included. The PR will be mentioned only by the PR number in the "Small Bugfixes and Documentation Updates" section
  • rn/feature - A new user-facing feature worth mentioning in the release notes
  • rn/bug-fix - A user-facing bug fix worth mentioning in the release notes
  • rn/documentation - A user-facing documentation change worth mentioning in the release notes

@aarondav (Contributor Author)

cc @stbof

@@ -35,6 +35,19 @@ def serve(model_uri, port, host, workers, no_conda=False, install_mlflow=False):
Serve a model saved with MLflow by launching a webserver on the specified host and port. For
information about the input data formats accepted by the webserver, see the following
documentation: https://www.mlflow.org/docs/latest/models.html#model-deployment.

Requests may be made to ``POST /invocations`` in Pandas split- or record-oriented formats.

@ghost commented on Jun 24, 2019

Prefer active to passive voice, can instead of may, and the correct spelling of pandas according to https://pandas.pydata.org/. Thus:

You can make requests to POST /invocations in pandas split- or record-oriented formats.

@@ -490,7 +490,7 @@ be used to safely deploy the model to various environments such as Kubernetes.
You can deploy an MLflow model locally or generate a Docker image using the CLI interface to the
:py:mod:`mlflow.models` module.

The REST API server accepts the following data formats as inputs:
The REST API server accepts the following data formats as inputs as a POST to the ``/invocations`` path:

The REST API server accepts the following data formats as POST input to the /invocations path:

@aarondav aarondav merged commit 4270802 into mlflow:master Jun 26, 2019
@andrewmchen added the rn/none label ("List under Small Changes in Changelogs") on Jul 16, 2019
avflor pushed a commit to avflor/mlflow that referenced this pull request Aug 22, 2020
* [Docs] Add better docs for using `mlflow models serve`

* Address comments