Native integration between DataRobot and Snowflake Snowpark


If you or your team have tried to develop and productionize machine learning models with Snowflake using Python and Snowpark, but are looking to level up your end-to-end ML lifecycle on the Data Cloud, this AI Accelerator is for you.

 

Depending on your role within the organization, your needs may vary: technical personnel may want a hosted notebook, an improved developer experience, and monitoring capabilities for models within Snowflake, while business personnel may seek guidance and insights on translating these models into actionable business insights, such as next steps for customers, sales, marketing, and more.

 

About this Accelerator

DataRobot addresses these exact needs with this accelerator. The notebook showcases the native integration between DataRobot and Snowflake's Data Cloud, leveraging DataRobot notebooks and Snowflake Snowpark (with Python and Java).
In addition, it is compatible with the Snowflake data science stack and DataRobot 9.0, offering advantages in speed, accuracy, security, and cost-effectiveness.
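
To give a concrete sense of the Snowpark side of this integration, here is a minimal sketch of opening a Snowpark for Python session from a hosted notebook. The connection parameters are placeholders you would replace with your own account details; they are not values from the accelerator.

    # A minimal sketch, assuming the snowflake-snowpark-python package is
    # installed in the notebook environment. All connection values below are
    # placeholders, not values from the accelerator.
    from snowflake.snowpark import Session

    connection_parameters = {
        "account": "<your_account_identifier>",
        "user": "<your_user>",
        "password": "<your_password>",
        "role": "<your_role>",
        "warehouse": "<your_warehouse>",
        "database": "<your_database>",
        "schema": "<your_schema>",
    }

    # Open a Snowpark session; subsequent DataFrame operations are pushed down
    # to the Snowflake warehouse rather than executed in the notebook.
    session = Session.builder.configs(connection_parameters).create()
    print(session.sql("SELECT CURRENT_WAREHOUSE(), CURRENT_DATABASE()").collect())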
 

What you will learn  

  1. Load data into Snowflake from an S3 file
  2. Acquire a training dataset from a Snowflake table using Snowpark for Python
  3. Analyze the data and engineer new features using Snowpark (steps 1-3 are sketched below)
  4. Build a new DataRobot project
  5. Analyze and evaluate model performance and explainability using DataRobot AutoML
  6. Deploy the recommended model to Snowflake using DataRobot MLOps (steps 4-6 are sketched below)
  7. Score the model via Snowpark for Java
  8. Monitor the model with MLOps
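
The following sketch illustrates what steps 1-3 can look like with Snowpark for Python, reusing the session opened above. The stage, table, and column names are hypothetical examples, not the accelerator's actual schema.

    from snowflake.snowpark.functions import col, when

    # Step 1: load an S3 file into a Snowflake table via an external stage
    # (assumes the stage MY_S3_STAGE and the table TRAINING_DATA already exist).
    session.sql(
        "COPY INTO TRAINING_DATA FROM @MY_S3_STAGE/train.csv "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    ).collect()

    # Step 2: acquire the training dataset as a Snowpark DataFrame; it is
    # evaluated lazily and executed inside Snowflake.
    df = session.table("TRAINING_DATA")

    # Step 3: simple feature engineering pushed down to Snowflake. Column names
    # are illustrative only.
    df = df.with_column(
        "BALANCE_RATIO", col("BALANCE") / (col("CREDIT_LIMIT") + 1)
    ).with_column(
        "IS_HIGH_UTILIZATION", when(col("BALANCE_RATIO") > 0.8, 1).otherwise(0)
    )

    # Materialize a pandas copy for inspection or for uploading to DataRobot.
    training_pdf = df.to_pandas()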

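And here is a minimal sketch of steps 4-6 with the DataRobot Python client (the datarobot package). The endpoint, API token, target column, and deployment label are placeholders, and the accelerator's actual deployment to Snowflake for in-database scoring via Snowpark for Java involves additional configuration that is omitted here.

    import datarobot as dr

    # Connect to DataRobot; endpoint and token are placeholders.
    dr.Client(endpoint="https://app.datarobot.com/api/v2", token="<your_api_token>")

    # Step 4: build a new project from the engineered training data and run
    # Autopilot. "TARGET" is a hypothetical target column name.
    project = dr.Project.start(
        training_pdf,
        target="TARGET",
        project_name="Snowpark accelerator sketch",
    )
    project.wait_for_autopilot()

    # Step 5: inspect the leaderboard; models are ordered by validation metric.
    best_model = project.get_models()[0]
    print(best_model, best_model.metrics)

    # Step 6: create an MLOps deployment from the chosen model. For simplicity,
    # this sketch deploys to a DataRobot prediction server.
    prediction_server = dr.PredictionServer.list()[0]
    deployment = dr.Deployment.create_from_learning_model(
        model_id=best_model.id,
        label="Snowflake scoring deployment",
        default_prediction_server_id=prediction_server.id,
    )
    print(deployment.id)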
 

Additional Resources

Watch the demo here
