Deployments with MLOps Getting Started Guide


(Updated March 2021)

Deployment is a critical component for gaining real value from a model, yet unfortunately it’s also where a lot of models get stuck. DataRobot offers a comprehensive set of solutions around model deployment as well as model monitoring, management, and governance through MLOps. These solutions work well with the different personas involved in MLOps and are agnostic to your current platform and choice of model-development environment.

With MLOps, you can deploy models using a variety of solutions:

  • Easily deploy and interact with a DataRobot-built model using the REST API.
  • Deploy a model built with tools like Python or R using DataRobot Custom Inference Models.
  • Export a DataRobot-built model as a container.
  • Export a DataRobot-built model as scoring code.
  • Deploy a remote agent for an external model.


And you can monitor, manage, and govern all of these models in one place:

  • Monitor any of these models for service health, data drift, and accuracy.
  • Set up custom notifications that tell you when a deployment needs attention.
  • Manage and replace models easily while keeping a documented record of every change.
  • Establish governance roles and processes for each deployment.


To get the big picture of MLOps, see these resources, which show the entire product end to end:

Deploying Models

Before deploying a model, you first create a model package in the Model Registry.

Once you have the model you want, you just need to deploy it. You can do this all through one system, regardless of how you created the model. DataRobot gives you three options for deployment: using the REST API, using Custom Inference Models, or using Portable Prediction Servers (PPSs) with MLOps agents.

Deploy a DataRobot-built model with the REST API:

Real-time deployments
Batch deployments
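For a real-time deployment, scoring amounts to a single authenticated POST to the deployment's prediction endpoint. Here is a minimal sketch using only the Python standard library; the URL shape, header names, and JSON payload format are assumptions to be replaced with the snippet shown on your deployment's Integrations tab:

```python
import json
import urllib.request

# Placeholder values -- copy the real ones from your deployment's
# Integrations tab in DataRobot.
API_URL = "https://example.datarobot.com/predApi/v1.0/deployments/{deployment_id}/predictions"
API_KEY = "YOUR_API_KEY"
DATAROBOT_KEY = "YOUR_DATAROBOT_KEY"


def build_prediction_request(deployment_id, rows):
    """Build (but do not send) a scoring request for a list of feature dicts."""
    return urllib.request.Request(
        API_URL.format(deployment_id=deployment_id),
        data=json.dumps(rows).encode("utf-8"),
        headers={
            "Content-Type": "application/json; charset=UTF-8",
            "Authorization": "Bearer " + API_KEY,
            "DataRobot-Key": DATAROBOT_KEY,
        },
        method="POST",
    )


def score(deployment_id, rows):
    """Send the request and return the parsed JSON predictions."""
    req = build_prediction_request(deployment_id, rows)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


# Build a request locally without sending it:
req = build_prediction_request("abc123", [{"feature_1": 1.0, "feature_2": "x"}])
```

Batch deployments follow the same authentication pattern but submit a dataset as a job rather than scoring rows synchronously.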

Deploy a model built with tools like Python or R using the DataRobot Custom Inference Models solution:
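A Custom Inference Model wraps your own code behind a small set of hooks. The sketch below is dependency-free and purely illustrative: the hook names follow the custom-model template, but in a real model package `data` arrives as a pandas DataFrame (and predictions are returned as one), and the toy linear model with its `coefficients.json` artifact is a made-up example:

```python
# custom.py -- sketch of a Custom Inference Model hook file.
import json
import os


def load_model(code_dir):
    # Load the artifact shipped inside the model package.
    # The file name "coefficients.json" is an illustrative assumption.
    with open(os.path.join(code_dir, "coefficients.json")) as f:
        return json.load(f)


def score(data, model, **kwargs):
    # Toy linear model: weighted sum of each row's features.
    # (In a real custom model, DataRobot passes and expects pandas
    # DataFrames here; plain dicts keep this sketch self-contained.)
    return [
        sum(model.get(name, 0.0) * value for name, value in row.items())
        for row in data
    ]
```

Package files like this with your artifact, and DataRobot runs the hooks inside a managed prediction environment.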

Export a DataRobot-built model as a container using a Portable Prediction Server (PPS), and then monitor it with MLOps agents on cloud-provider Kubernetes services:

Exporting Models

If you use DataRobot for model development, you can export your model for scoring outside the platform. DataRobot offers two approaches: Scoring Code and DataRobot Prime.

Scoring Code
DataRobot Prime
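Exported Scoring Code is a self-contained JAR that runs anywhere a JVM is available. The sketch below simply assembles the command line from Python; the subcommand and flag names (`csv`, `--input`, `--output`) are assumptions, so check the README bundled with your export for the exact CLI:

```python
import subprocess


def build_scoring_command(jar_path, input_csv, output_csv):
    # Flag names are illustrative; verify them against your exported JAR.
    return [
        "java", "-jar", jar_path,
        "csv",
        "--input=" + input_csv,
        "--output=" + output_csv,
    ]


def score_batch(jar_path, input_csv, output_csv):
    # Requires a local Java runtime; raises if the JAR exits non-zero.
    subprocess.run(
        build_scoring_command(jar_path, input_csv, output_csv),
        check=True,
    )
```

Because the JAR has no DataRobot dependency at runtime, the same command works in a cron job, a Spark executor, or a container.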

Deploy a remote agent for monitoring external models (either custom models or DataRobot exported scoring code):
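Conceptually, the monitoring agent ships prediction records from your environment back to DataRobot. The sketch below only illustrates the kind of information such a record carries; the field names are assumptions, not the agent's actual schema (use the MLOps agent library to produce real records):

```python
import json
import time
import uuid


def make_prediction_record(deployment_id, model_id, features, predictions):
    # Illustrative shape only -- real agents use the MLOps library's schema.
    return {
        "deploymentId": deployment_id,
        "modelId": model_id,
        "timestamp": time.time(),
        "requestId": str(uuid.uuid4()),
        "features": features,
        "predictions": predictions,
    }


record = make_prediction_record(
    "ext-deploy-1", "model-42",
    features=[{"age": 31}], predictions=[0.87],
)
serialized = json.dumps(record)  # ready to hand to a spooler or queue
```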

Monitoring Models

When you deploy models using MLOps, you can then track them all from one place. MLOps is a one-stop command center for all your models.

Overview of Monitoring models
Specific changes to monitor
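One of the checks MLOps runs for you is data drift. To build intuition for what a drift metric measures, here is a generic Population Stability Index (PSI) sketch for a single numeric feature; PSI is a common drift statistic, not DataRobot's exact implementation:

```python
import math


def psi(expected, actual, bins=10):
    """Population Stability Index between a baseline and a live sample.

    Common rule of thumb: < 0.1 stable, 0.1-0.25 moderate drift,
    > 0.25 major drift.
    """
    lo, hi = min(expected), max(expected)
    span = (hi - lo) or 1.0

    def fractions(sample):
        counts = [0] * bins
        for x in sample:
            idx = int((x - lo) / span * bins)
            counts[max(0, min(idx, bins - 1))] += 1  # clamp out-of-range values
        # Small epsilon keeps the log defined for empty bins.
        return [(c + 1e-6) / (len(sample) + bins * 1e-6) for c in counts]

    e, a = fractions(expected), fractions(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))
```

Identical samples score near zero; a shifted feature scores high, which is exactly the signal that would trip a drift alert on a deployment.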

Managing and Governing your Deployment

MLOps simplifies replacing an existing model, and governance features let you easily track and manage deployments across your enterprise.
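Model replacement can be scripted as well as performed in the UI. This sketch builds (but does not send) a hypothetical replacement call with the standard library; the route, payload keys, and reason code are assumptions, so confirm them against the DataRobot API documentation (the official Python client also exposes replacement as a deployment method):

```python
import json
import urllib.request


def build_replace_model_request(host, api_token, deployment_id, new_model_id):
    # Route shape and payload keys here are illustrative assumptions.
    return urllib.request.Request(
        f"{host}/api/v2/deployments/{deployment_id}/model/",
        data=json.dumps({"modelId": new_model_id, "reason": "ACCURACY"}).encode(),
        headers={
            "Authorization": "Bearer " + api_token,
            "Content-Type": "application/json",
        },
        method="PATCH",
    )
```

Either way, the replacement is recorded in the deployment's change history, which is what keeps the documented audit trail intact.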


When you want to deploy models or use model predictions in another application, DataRobot provides integrations with many other technologies.

AWS Athena
Snowflake Integrations
Azure DB/SQL Integrations
Tableau Integrations


While this article provided an overview of some of the deployment/MLOps resources, there are also a number of MLOps-related webinars listed in our index of learning sessions.

Scripts and Code

You can find code examples in R and Python on our Community GitHub. Below are some of the more relevant repos for MLOps processes.

New in recent releases

Recent releases include many enhancements and new features for MLOps.

  • DataRobot Release 7.1: Improved Batch Predictions for Custom Models, Governance: Feature Lists, The Management Agent, Scoring Code in Snowflake, Prediction Explanations in Scoring Code, Prediction Job Definitions + Scheduler, Deployment Reports, MLOps Agent: Kafka, Baseline Revisions for External Models, Portable Batch Predictions, Batch Prediction Parquet Support, Reset Deployments Statistics
  • DataRobot Release 7.0: Remote Model Challengers, PPS (Portable Prediction Servers) for Custom Models, Pulling and Testing Models from Bitbucket, Multiclass Model Monitoring, Custom Model Testing Enhancements
  • DataRobot Release 6.3: Portable Prediction Servers, Improved Historical Prediction Data Upload, Scoring Code & Agent Integration, Group Sharing of Deployments

FAQs

Have a question about MLOps? Check FAQs: Deploying Models (MLOps) to find answers to some frequently asked questions.

Help us!

We're trying to keep this list up-to-date with all the helpful DataRobot MLOps content. If you find something that isn't on this list, please Comment here and let us know!







Version history: last updated 06-17-2021 08:23 AM