This notebook is intended for a GraphQL developer who wants to integrate with DataRobot.

In this example implementation, a GraphQL server connects to the DataRobot OpenAPI specification using GraphQL Mesh, the currently maintained option.
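As a minimal sketch, a GraphQL Mesh configuration for this kind of setup could look like the following `.meshrc.yaml`. The source name, the spec path, and the port are illustrative placeholders, not values confirmed by this page:

```yaml
# .meshrc.yaml — minimal sketch of a GraphQL Mesh OpenAPI source.
# The spec path below is a placeholder; point it at the DataRobot
# OpenAPI document you are integrating with.
sources:
  - name: DataRobot
    handler:
      openapi:
        source: ./datarobot-openapi.yaml   # local copy of the spec (placeholder path)
serve:
  port: 4000   # port for the generated GraphQL endpoint (assumed default)
```

With a configuration like this, GraphQL Mesh generates a GraphQL schema from the OpenAPI operations and serves it as a single endpoint that GraphQL clients can query.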

This accelerator outlines how to create, deploy, and monitor a custom inference model with DataRobot's Python client. You can use the Custom Model Workshop to upload a model artifact and then test and deploy the resulting custom inference model to DataRobot's centralized deployment hub.

This accelerator helps Databricks users leverage the power of DataRobot for time series modeling within their Databricks ecosystem.

Level up your end-to-end ML lifecycle on the Data Cloud by integrating DataRobot and Snowflake.

Source data from S3 or Athena, build and evaluate ML models with DataRobot, and send predictions back to S3.

Learn how data stored in Azure can be used to train a collection of models.

Import data, build and evaluate models, and deploy a model into production to make new predictions with Snowflake.

Build and refine ML models within DataRobot and deploy them to run within AWS SageMaker.

Integrate directly into your GCP environment to accelerate your use of machine learning across all of the GCP services.

Pair the power of DataRobot with the Spark-backed notebook environment provided by Databricks.

A sample solution for monitoring AWS SageMaker models with DataRobot MLOps.

Use DataRobot and the Python API to build a workflow with SAP as the remote data source.

Embed Scoring Code in a microservice and package it as a Docker container.
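As a minimal sketch, a container for an exported Scoring Code JAR might be built from a Dockerfile like the one below. The JAR name, base image, and entrypoint arguments are assumptions for illustration; the actual invocation depends on the model you export:

```dockerfile
# Minimal sketch: containerize a DataRobot Scoring Code JAR.
# Assumes the exported JAR has been downloaded as model.jar;
# the base image and entrypoint are illustrative placeholders.
FROM eclipse-temurin:11-jre
WORKDIR /app
COPY model.jar /app/model.jar
# Run the JAR; append the arguments your exported model supports
# (for example, batch scoring a CSV) when starting the container.
ENTRYPOINT ["java", "-jar", "/app/model.jar"]
```

A wrapper like this lets the scoring logic run anywhere Docker runs, which is the basis for serving it behind a microservice endpoint.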

About AI Accelerators
Discover code-first, modular building blocks for efficient model development and deployment that provide a template for kick-starting a project with DataRobot.

Check out GitHub to learn how to get started.