Model Packaging and Deployment with MLflow Models

Note

This section describes MLflow features that are in Private Preview. To request access to the preview, contact your Databricks sales representative. If you are not participating in the preview, see the MLflow open-source documentation for information on how to run standalone MLflow.

An MLflow Model is a standard format for packaging machine learning models that can be used in a variety of downstream tools—for example, batch inference on Apache Spark and real-time serving through a REST API. The format defines a convention that lets you save a model in different flavors that can be understood by different model serving and inference platforms.

This topic provides examples of using a model for inference and deploying models to various deployment environments.

Quick start model inference

This notebook is part 2b of a Quick Start guide based on the MLflow tutorial. As in part 1, Quick start training, this notebook uses ElasticNet models trained on the diabetes dataset in scikit-learn. This part of the tutorial shows how to:

  • Select a model to deploy using the MLflow tracking UI
  • Load the trained model as a scikit-learn model
  • Export the model as a PySpark UDF
  • Apply the UDF to add a prediction column to a DataFrame
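The steps above can be sketched as follows. The function assumes an active `SparkSession` and a run ID you selected in the tracking UI; `run_id` and the `"model"` artifact path are placeholders for your own values.

```python
def score_with_model(spark, run_id, df):
    """Sketch: load a logged model natively and as a PySpark UDF."""
    import mlflow.pyfunc
    import mlflow.sklearn
    from pyspark.sql.functions import struct

    # Model chosen in the MLflow tracking UI.
    model_uri = f"runs:/{run_id}/model"

    # Load the trained model back as a plain scikit-learn model.
    sk_model = mlflow.sklearn.load_model(model_uri)

    # Export the same model as a PySpark UDF and apply it to add a
    # prediction column to the DataFrame.
    predict_udf = mlflow.pyfunc.spark_udf(spark, model_uri)
    scored = df.withColumn("prediction", predict_udf(struct(*df.columns)))
    return sk_model, scored
```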

Quick start model deployment

This notebook is part 2b of a Quick Start guide based on the MLflow tutorial. As in part 1, Quick start training, this notebook uses ElasticNet models trained on the diabetes dataset in scikit-learn. This part of the tutorial shows how to:

  • Select a model to deploy using the MLflow tracking UI
  • Deploy the model to SageMaker using the MLflow API
  • Query the deployed model using the sagemaker-runtime API
  • Repeat the deployment and query process for another model
  • Delete the deployment using the MLflow API

For information on how to configure AWS authentication so that you can deploy MLflow models in AWS SageMaker from Databricks, see Set up AWS Authentication for SageMaker Deployment.
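The deploy/query/delete cycle from the list above can be sketched like this. It assumes the `mlflow.sagemaker` deploy and delete APIs covered by this guide (newer MLflow releases replace these with the `mlflow.deployments` client), plus AWS credentials configured as described above; the app name, region, and request payload below are placeholders, and the exact payload shape depends on your MLflow scoring-server version.

```python
import json


def deploy_query_delete(model_uri, app_name="diabetes-demo", region="us-west-2"):
    """Sketch of the SageMaker deploy/query/delete cycle; names are placeholders."""
    import boto3
    import mlflow.sagemaker as mfs

    # Deploy the logged MLflow model as a SageMaker endpoint.
    mfs.deploy(app_name=app_name, model_uri=model_uri,
               region_name=region, mode=mfs.DEPLOYMENT_MODE_CREATE)

    # Query the deployed model through the low-level sagemaker-runtime API.
    runtime = boto3.client("sagemaker-runtime", region_name=region)
    response = runtime.invoke_endpoint(
        EndpointName=app_name,
        Body=json.dumps({"dataframe_split": {"columns": [], "data": []}}),
        ContentType="application/json",
    )
    result = json.loads(response["Body"].read())

    # Delete the deployment when finished.
    mfs.delete(app_name=app_name, region_name=region, archive=False)
    return result
```

Repeating the cycle for another model is just another call with a different `model_uri` (or the same `app_name` with a replace mode, to swap the model behind an existing endpoint).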

MLeap model deployment

The following notebook is part 2 of an example that trains a PySpark model and logs it with the MLeap flavor. Part 1 walks through training a PySpark model and saving it in MLeap format. The notebook below shows how to deploy the saved MLeap model to SageMaker.
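For reference, saving a PySpark model in MLeap format (the part 1 step) looks roughly like the sketch below. It assumes a fitted `PipelineModel` and a sample input DataFrame; note that the `mlflow.mleap` module has been deprecated and removed in recent MLflow releases.

```python
def log_pyspark_model_with_mleap(pipeline_model, sample_df):
    """Sketch: log a fitted PySpark PipelineModel with the MLeap flavor."""
    import mlflow
    import mlflow.mleap

    # MLeap requires a sample input DataFrame to infer the model's schema;
    # "mleap-model" is a placeholder artifact path.
    with mlflow.start_run() as run:
        mlflow.mleap.log_model(spark_model=pipeline_model,
                               sample_input=sample_df,
                               artifact_path="mleap-model")
        return run.info.run_id
```

The returned run ID identifies the logged model, which the deployment notebook can then push to SageMaker.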