MLflow on Cloud

MLflow on AWS

[Image: MLflow-AWS architecture diagram]

The architecture diagram above shows the AWS services we need and how they interact to host MLflow on AWS. We are going to launch MLflow using AWS Fargate, which is serverless.

First, we need to create a Dockerfile that packages MLflow and its dependencies. We define the following dependencies in requirements.txt:

[Image: MLflow on AWS requirements.txt]
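The original requirements.txt is shown only as an image. A minimal version, assuming the server needs MLflow itself plus the S3 and MySQL client libraries, might look like this:

mlflow
boto3
pymysql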

The Dockerfile installs MLflow and starts the tracking server:

[Image: MLflow Dockerfile]
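The Dockerfile itself is also shown only as an image; a sketch of what it plausibly contains follows (the base image, variable names, and port are assumptions):

FROM python:3.10-slim

# Connection details are injected as environment variables by the Fargate task definition
ENV BUCKET="" USERNAME="" PASSWORD="" HOST="" PORT=3306 DATABASE=mlflow

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

EXPOSE 5000

# Shell-form CMD so the environment variables are expanded at container start
CMD mlflow server --host 0.0.0.0 --port 5000 --default-artifact-root s3://${BUCKET} --backend-store-uri mysql+pymysql://${USERNAME}:${PASSWORD}@${HOST}:${PORT}/${DATABASE}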

The Dockerfile takes the bucket and database connection details as variables; their values will be configured as environment variables in the Fargate task definition.

Once we have the Dockerfile, run the command below to build the Docker image:

docker build -t mlflowtracker .

The next thing we need is a container registry repository in AWS. Search for Elastic Container Registry in the AWS console search bar and click the ‘Create Repository’ button. On the creation screen, select the private option and provide the repository name.

[Image: Create AWS ECR repository]
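Equivalently, the repository can be created with the AWS CLI (the repository name here is illustrative):

aws ecr create-repository --repository-name mlflowtracker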

Now we need to push the Docker image we built locally to the container registry in AWS.

# Login 
aws ecr get-login-password --region region | docker login --username AWS --password-stdin aws_account_id.dkr.ecr.region.amazonaws.com

# Tag local docker image
docker tag mlflowtracker:latest aws_account_id.dkr.ecr.region.amazonaws.com/repository_name:tag

# Push image
docker push aws_account_id.dkr.ecr.region.amazonaws.com/repository_name:tag

The AWS CLI must be installed to run the above commands.

Next, MLflow needs MySQL to store its metadata, so we are going to create an AWS RDS MySQL instance.

For the steps to create MySQL in AWS RDS, refer to the AWS documentation.
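As a sketch, a small instance can also be created from the CLI; every identifier and credential below is a placeholder:

# Illustrative values only; pick your own identifier, instance class, and credentials
aws rds create-db-instance \
    --db-instance-identifier mlflow-db \
    --db-instance-class db.t3.micro \
    --engine mysql \
    --allocated-storage 20 \
    --master-username admin \
    --master-user-password <password> \
    --no-publicly-accessible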

We need to create an S3 bucket for MLflow to store model artifacts. Go to the AWS console, search for S3, provide the bucket name, make sure you block all public access, and click ‘Create Bucket’.

[Image: Create S3 bucket]
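From the CLI, the equivalent is roughly the following (the bucket name is a placeholder and must be globally unique):

aws s3 mb s3://mlflow-artifacts-bucket
aws s3api put-public-access-block \
    --bucket mlflow-artifacts-bucket \
    --public-access-block-configuration BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true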

Next, we need to create an AWS ECS cluster with Fargate as the compute option. Go to the AWS console, search for Elastic Container Service, and click ‘Create Cluster’. On the create-cluster page, provide the cluster name and select the VPC and subnets. The Fargate option is selected by default.

[Image: Create ECS cluster]
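The CLI equivalent is a one-liner (the cluster name is illustrative):

aws ecs create-cluster --cluster-name mlflow-cluster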

In ECS, a task is the granular entity that runs the Docker image; it is comparable to a pod in Kubernetes. So we need to create a task definition that uses the Docker image we uploaded to ECR; this is also where we set the environment variable values. To create the task definition, go to Task Definitions in the ECS console and provide the task name, the image URI we uploaded to ECR, the container port, and the environment variables.

[Image: ECS task definition]

During the creation of the task definition, select the ecsTaskExecutionRole, which AWS creates automatically, and make sure you add an S3 access policy to it so the container can read and write the S3 bucket.
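For reference, a task definition along these lines can also be registered with aws ecs register-task-definition --cli-input-json file://taskdef.json; all account-specific values below are placeholders:

{
  "family": "mlflowtracker",
  "requiresCompatibilities": ["FARGATE"],
  "networkMode": "awsvpc",
  "cpu": "1024",
  "memory": "2048",
  "executionRoleArn": "arn:aws:iam::<aws_account_id>:role/ecsTaskExecutionRole",
  "containerDefinitions": [
    {
      "name": "mlflowtracker",
      "image": "<aws_account_id>.dkr.ecr.<region>.amazonaws.com/mlflowtracker:latest",
      "portMappings": [{"containerPort": 5000, "protocol": "tcp"}],
      "environment": [
        {"name": "BUCKET", "value": "mlflow-artifacts-bucket"},
        {"name": "HOST", "value": "<rds-endpoint>"},
        {"name": "PORT", "value": "3306"},
        {"name": "DATABASE", "value": "mlflow"},
        {"name": "USERNAME", "value": "admin"},
        {"name": "PASSWORD", "value": "<password>"}
      ]
    }
  ]
}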

Once the task definition is created, we need to create a service that uses it. While creating the service, we can select the minimum and maximum number of tasks we want (I have chosen just one task), so that it can autoscale. We also have the option of creating a load balancer during this step.

To create the ECS service, open the cluster we created and provide the service name, select the task definition and the revision we created earlier, enter the minimum and maximum number of tasks, select the VPC under the networking section, select Application Load Balancer in the load balancer section, and provide its name, listener port, and target group name.

[Image: ECS service for task orchestration]
[Image: ECS service load balancer configuration]
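The same service can be sketched with the CLI; the subnet, security group, and target group ARN values below are placeholders:

aws ecs create-service \
    --cluster mlflow-cluster \
    --service-name mlflow-service \
    --task-definition mlflowtracker \
    --desired-count 1 \
    --launch-type FARGATE \
    --network-configuration "awsvpcConfiguration={subnets=[subnet-xxxx],securityGroups=[sg-xxxx],assignPublicIp=ENABLED}" \
    --load-balancers "targetGroupArn=<target-group-arn>,containerName=mlflowtracker,containerPort=5000"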

Once we create the service, it spins up the required number of tasks, and we can use the load balancer URL to open the MLflow dashboard.

The load balancer uses a security group that defines the inbound and outbound network rules; you need to modify or add rules so that you can access the MLflow UI.
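For example, an inbound rule for the listener port can be added like this (the group ID and IP address are placeholders):

aws ec2 authorize-security-group-ingress \
    --group-id sg-xxxx \
    --protocol tcp \
    --port 80 \
    --cidr <your-ip>/32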

Now that we have hosted MLflow on AWS, let's run a machine learning experiment locally while pointing to the MLflow instance on the cloud.
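A minimal sketch of such an experiment, assuming scikit-learn and a hypothetical load balancer DNS name:

import mlflow
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Point the client at the MLflow server behind the load balancer (URL is a placeholder)
mlflow.set_tracking_uri("http://<load-balancer-dns>")
mlflow.set_experiment("mlflow-on-aws-demo")

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run():
    params = {"n_estimators": 100, "max_depth": 5}
    model = RandomForestRegressor(**params).fit(X_train, y_train)
    mse = mean_squared_error(y_test, model.predict(X_test))

    # Metadata goes to the MySQL backend store; the model artifact goes to S3
    mlflow.log_params(params)
    mlflow.log_metric("mse", mse)
    mlflow.sklearn.log_model(model, "model")

Note that logging the model requires AWS credentials on the local machine, because the MLflow client writes artifacts directly to the S3 bucket.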

After running the experiment, we can see in MLflow all the parameters, metrics, and the model that we logged during the experiment.

[Image: MLflow UI with experiment results on AWS]
