MLflow end-to-end example¶
In this example we build a model using MLflow, then package and deploy it locally using Tempo (in Docker and in a local Kubernetes cluster). We follow the MNIST PyTorch example from MLflow; check this link for more information.
In this example we will:
Train an MNIST model with MLflow
Save the model environment with Tempo
Deploy to Docker
Deploy to Kubernetes
Prerequisites¶
This notebook needs to be run in the tempo-examples
conda environment defined below. Create it from the project root folder:
conda env create --name tempo-examples --file conda/tempo-examples.yaml
Train model¶
We train the MNIST model below:
Install prerequisites¶
!pip install mlflow 'torchvision>=0.9.1' torch==1.9.0 pytorch-lightning==1.4.0
!rm -fr /tmp/mlflow
%cd /tmp
!git clone https://github.com/mlflow/mlflow.git
Train model using mlflow¶
%cd mlflow/examples/pytorch/MNIST
!mlflow run . --no-conda
!tree -L 1 mlruns/0
Choose test image¶
from torchvision import datasets
mnist_test = datasets.MNIST('/tmp/data', train=False, download=True)
# change the index below to get a different image for testing
mnist_test = list(mnist_test)[0]
img, category = mnist_test
display(img)
print(category)
Transform test image to numpy¶
import numpy as np
img_np = np.asarray(img).reshape((1, 28*28)).astype(np.float32)
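The reshape above flattens the 28×28 grayscale image into a single (1, 784) float32 row vector, which is the input shape the served model expects. A minimal sketch of the same transform, using a synthetic array in place of the PIL image:

```python
import numpy as np

# Hypothetical stand-in for the PIL test image: any 28x28 grayscale array
fake_img = np.zeros((28, 28), dtype=np.uint8)

# Flatten to a (1, 784) float32 row vector
img_np = np.asarray(fake_img).reshape((1, 28 * 28)).astype(np.float32)

print(img_np.shape)  # (1, 784)
print(img_np.dtype)  # float32
```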
Save model environment¶
import glob
import os
files = glob.glob("mlruns/0/*/")
files.sort(key=os.path.getmtime)
ARTIFACTS_FOLDER = os.path.join(
os.getcwd(),
files[-1],
"artifacts",
"model"
)
assert os.path.exists(ARTIFACTS_FOLDER)
print(ARTIFACTS_FOLDER)
Define tempo model¶
from tempo.serve.metadata import ModelFramework
from tempo.serve.model import Model
mlflow_tag = "mlflow"
pytorch_mnist_model = Model(
name="test-pytorch-mnist",
platform=ModelFramework.MLFlow,
local_folder=ARTIFACTS_FOLDER,
# if we deploy to kube, this defines where the model artifacts are stored
uri="s3://tempo/basic/mnist",
description="A pytorch MNIST model",
)
Save model (environment) using tempo¶
Tempo hides many of the details required to save the model environment for mlserver:
Add the required runtime dependencies
Create a conda-pack archive environment.tar.gz
from tempo.serve.loader import save
save(pytorch_mnist_model)
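After save() completes, the conda-pack archive environment.tar.gz (the filename stated above) should sit in the model's local folder alongside the MLflow artifacts. A hedged sketch of a quick sanity check; the helper name is illustrative, not part of the Tempo API:

```python
import os

def has_packed_environment(artifacts_folder: str) -> bool:
    """Illustrative helper: return True if the conda-pack archive
    produced by tempo's save() is present in the artifacts folder."""
    return os.path.exists(os.path.join(artifacts_folder, "environment.tar.gz"))
```

Running `has_packed_environment(ARTIFACTS_FOLDER)` after the save step should return True before you move on to deployment.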
Deploy to Docker¶
from tempo import deploy_local
local_deployed_model = deploy_local(pytorch_mnist_model)
local_prediction = local_deployed_model.predict(img_np)
print(np.nonzero(local_prediction.flatten() == 0))
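The print above locates the positions where the flattened output equals 0. Assuming the model returns one (log-)probability per digit class, np.argmax reads off the predicted digit more directly; a sketch on a synthetic output vector (the values below are made up for illustration):

```python
import numpy as np

# Synthetic stand-in for the model output: log-probabilities for digits 0-9
fake_prediction = np.log(np.array([[0.01, 0.01, 0.90, 0.01, 0.01,
                                    0.01, 0.01, 0.02, 0.01, 0.01]]))

# The predicted digit is the index of the largest (log-)probability
digit = int(np.argmax(fake_prediction.flatten()))
print(digit)  # 2
```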
local_deployed_model.undeploy()
Deploy to Kubernetes¶
Prerequisites¶
Create a Kind Kubernetes cluster with Minio and Seldon Core installed using Ansible as described here.
%cd -0
!kubectl apply -f k8s/rbac -n production
Upload artifacts to minio¶
from tempo.examples.minio import create_minio_rclone
import os
create_minio_rclone(os.getcwd()+"/rclone.conf")
from tempo.serve.loader import upload
upload(pytorch_mnist_model)
Deploy to kind¶
from tempo.serve.metadata import SeldonCoreOptions
runtime_options = SeldonCoreOptions(**{
"remote_options": {
"namespace": "production",
"authSecretName": "minio-secret"
}
})
from tempo import deploy_remote
remote_deployed_model = deploy_remote(pytorch_mnist_model, options=runtime_options)
remote_prediction = remote_deployed_model.predict(img_np)
print(np.nonzero(remote_prediction.flatten() == 0))
remote_deployed_model.undeploy()