Easily deploy an MLflow tracking server with one command.

MinIO is used as the S3-compatible artifact store, and MySQL is used as the backend store.
- Clone (download) this repository

  ```
  git clone https://github.com/sachua/mlflow-docker-compose.git
  ```

- `cd` into the `mlflow-docker-compose` directory

- Build and run the containers with `docker-compose`

  ```
  docker-compose up -d --build
  ```

- Access the MLflow UI at http://localhost:5000

- Access the MinIO UI at http://localhost:9000 (a quick health-check sketch follows this list)
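Once the containers are up, you can sanity-check that the tracking server is reachable from Python. This is a minimal sketch assuming the stack is running locally on the default port; recent MLflow server versions expose a `/health` endpoint for exactly this purpose.

```
# Minimal health check for the local tracking server (sketch).
import requests

resp = requests.get("http://localhost:5000/health")
print(resp.status_code, resp.text)  # a healthy server responds with HTTP 200
```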
The MLflow tracking server is composed of three Docker containers:

- MLflow server
- MinIO object storage server
- MySQL database server

You can confirm all three are running with the sketch below.
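This is a hedged sketch using the Docker SDK for Python; it assumes you have installed the SDK with `pip install docker` and that the Docker daemon is reachable. `docker-compose ps` gives the same information from the shell.

```
# List running containers via the Docker SDK (assumes `pip install docker`).
import docker

client = docker.from_env()
for container in client.containers.list():
    print(container.name, container.status)
# Expect the MLflow, MinIO, and MySQL containers among the output.
```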
- Install conda

- Install MLflow with extra dependencies, including scikit-learn

  ```
  pip install mlflow[extras]
  ```

- Set the environment variables

  ```
  export MLFLOW_TRACKING_URI=http://localhost:5000
  export MLFLOW_S3_ENDPOINT_URL=http://localhost:9000
  ```

- Set the MinIO credentials

  ```
  cat <<EOF > ~/.aws/credentials
  [default]
  aws_access_key_id=minio
  aws_secret_access_key=minio123
  EOF
  ```

- Train a sample MLflow model

  ```
  mlflow run https://github.com/mlflow/mlflow-example.git -P alpha=0.42
  ```

  Note: to fix `ModuleNotFoundError: No module named 'boto3'`, install `boto3` inside the conda environment that MLflow generated

  ```
  # Switch to the generated conda env
  conda env list
  conda activate mlflow-3eee9bd7a0713cf80a17bc0a4d659bc9c549efac # replace with your own generated mlflow environment
  pip install boto3
  ```

- Serve the model (replace with your model's actual path)

  ```
  mlflow models serve -m s3://mlflow/0/98bdf6ec158145908af39f86156c347f/artifacts/model -p 1234
  ```

- You can test the served model with this command (a Python equivalent is sketched after this list)

  ```
  curl -X POST -H "Content-Type:application/json; format=pandas-split" --data '{"columns":["alcohol", "chlorides", "citric acid", "density", "fixed acidity", "free sulfur dioxide", "pH", "residual sugar", "sulphates", "total sulfur dioxide", "volatile acidity"],"data":[[12.8, 0.029, 0.48, 0.98, 6.2, 29, 3.33, 1.2, 0.39, 75, 0.66]]}' http://127.0.0.1:1234/invocations
  ```
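For convenience, here is the same smoke test written with Python's `requests` library. It mirrors the curl call above and assumes the model is being served on port 1234 and that your MLflow version accepts the `pandas-split` payload format shown here.

```
# Same smoke test as the curl command above, in Python (sketch).
import requests

payload = {
    "columns": ["alcohol", "chlorides", "citric acid", "density",
                "fixed acidity", "free sulfur dioxide", "pH",
                "residual sugar", "sulphates", "total sulfur dioxide",
                "volatile acidity"],
    "data": [[12.8, 0.029, 0.48, 0.98, 6.2, 29, 3.33, 1.2, 0.39, 75, 0.66]],
}
resp = requests.post(
    "http://127.0.0.1:1234/invocations",
    json=payload,
    # the explicit header overrides the default set by `json=`
    headers={"Content-Type": "application/json; format=pandas-split"},
)
print(resp.json())  # model prediction for the sample row
```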
This note is based on changes to the `.env` and `docker-compose.yml` files. The changes to the MinIO access key must first be made in the MinIO Console.
- Create MinIO access keys in the MinIO Console, then save the access key ID and secret access key.

- Import the environment in the notebook

  ```
  %env MLFLOW_TRACKING_URI=http://localhost:5000
  %env MLFLOW_S3_ENDPOINT_URL=http://localhost:9000
  %env AWS_ACCESS_KEY_ID=2vSrPs21nZYaUQvovgRL
  %env AWS_SECRET_ACCESS_KEY=yCqF29KU1qbykEnsceWMDRNvPelgGAVBmyD6PeU5
  ```

- Check the environment again and set the experiment name

  ```
  import os
  import mlflow

  assert "MLFLOW_TRACKING_URI" in os.environ
  assert "MLFLOW_S3_ENDPOINT_URL" in os.environ
  assert "AWS_ACCESS_KEY_ID" in os.environ
  assert "AWS_SECRET_ACCESS_KEY" in os.environ

  # You can also set the tracking URI with this method instead of using the environment variable
  mlflow.set_tracking_uri("http://localhost:5000")
  mlflow.set_experiment("nyc-taxi")
  ```

- Use a `with` statement in your trainer code (a fuller sketch follows this list)

  ```
  with mlflow.start_run():
      # your trainer code
  ```
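As a fuller illustration of what goes inside the `with` block, here is a hedged sketch of a trainer: the model, data, parameter, and metric names are hypothetical placeholders, not part of this repository.

```
# Hypothetical trainer sketch; logged params/metrics go to the backend store
# and the model artifact goes to MinIO, both via the tracking server.
import mlflow
import mlflow.sklearn
import numpy as np
from sklearn.linear_model import LinearRegression

X_train = np.array([[1.0], [2.0], [3.0]])  # placeholder data
y_train = np.array([2.0, 4.0, 6.0])

with mlflow.start_run():
    mlflow.log_param("fit_intercept", True)
    model = LinearRegression(fit_intercept=True).fit(X_train, y_train)
    mlflow.log_metric("train_score", model.score(X_train, y_train))
    mlflow.sklearn.log_model(model, "model")  # artifact stored in MinIO
```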
- Create MinIO access keys in the MinIO Console, then save the access key ID and secret access key.

- Put the environment variables in a `.env` file

  ```
  MLFLOW_TRACKING_URI=http://localhost:5000
  MLFLOW_S3_ENDPOINT_URL=http://localhost:9000
  AWS_ACCESS_KEY_ID=2vSrPs21nZYaUQvovgRL
  AWS_SECRET_ACCESS_KEY=yCqF29KU1qbykEnsceWMDRNvPelgGAVBmyD6PeU5
  ```

- Import the `.env` file in your Python code and set the experiment name

  ```
  import mlflow
  from dotenv import load_dotenv

  load_dotenv()

  # You can also set the tracking URI with this method instead of using the environment variable
  mlflow.set_tracking_uri("http://localhost:5000")
  mlflow.set_experiment("nyc-taxi")
  ```

- Use a `with` statement in your trainer code (a verification sketch follows this list)

  ```
  with mlflow.start_run():
      # your trainer code
  ```
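To confirm the run was recorded and its artifacts reached MinIO, you can query the tracking server afterwards. A minimal sketch, assuming the environment from the `.env` file above is already loaded and the `nyc-taxi` experiment has at least one run:

```
# Verify the latest run and its artifacts (sketch).
import mlflow
from mlflow.tracking import MlflowClient

client = MlflowClient()
experiment = client.get_experiment_by_name("nyc-taxi")
runs = client.search_runs([experiment.experiment_id], max_results=1)
for artifact in client.list_artifacts(runs[0].info.run_id):
    print(artifact.path)  # e.g. the "model" directory logged above
```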
- 2023-12-01 Add Adminer and Grafana services; remove the Prefect agent service (outdated)
- 2023-11-28 Migrate to Postgres DB and add Prefect server and agent services