Batch Scoring Jobs for AWS S3

DataRobot can execute prediction jobs on a recurring schedule and write the predictions to an AWS S3 bucket. When configuring the batch job, users can select which features from the prediction dataset to include in the output and can include up to 10 prediction explanations. This video shows how to set up a batch scoring job where both the data source and the prediction destination are AWS S3. Users can also combine S3 batch scoring with other data sources and destinations, such as Snowflake, Azure, or Google BigQuery.
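
The same setup can also be scripted. The sketch below is a minimal example using the DataRobot Python client, assuming the datarobot package is installed and an S3 credential has already been stored in DataRobot; the deployment ID, bucket URLs, column name, and credential ID are placeholders rather than values from this article.

import datarobot as dr

# Connect to DataRobot (endpoint and API token are placeholders).
dr.Client(endpoint="https://app.datarobot.com/api/v2", token="YOUR_API_TOKEN")

deployment_id = "YOUR_DEPLOYMENT_ID"        # deployment that serves the predictions
s3_credential_id = "YOUR_S3_CREDENTIAL_ID"  # stored AWS credential in DataRobot

# Score a CSV read from S3 and write the predictions back to S3.
job = dr.BatchPredictionJob.score(
    deployment_id,
    intake_settings={
        "type": "s3",
        "url": "s3://example-bucket/scoring/input.csv",        # placeholder source
        "credential_id": s3_credential_id,
    },
    output_settings={
        "type": "s3",
        "url": "s3://example-bucket/scoring/predictions.csv",  # placeholder destination
        "credential_id": s3_credential_id,
    },
    passthrough_columns=["customer_id"],  # features from the prediction dataset to include in the output
    max_explanations=10,                  # include up to 10 prediction explanations
)
job.wait_for_completion()

Running the job on a recurring schedule is configured as part of the batch job setup shown in the video.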

Learn More:

Batch prediction methods 

Amazon S3 

End-to-end ML workflow with AWS 
