Batch Scoring Jobs for BigQuery


DataRobot can run prediction jobs on a recurring schedule and write the predictions to a BigQuery table. The write process can be set to update or overwrite an existing table, and it can even create a new table for you if one does not already exist in BigQuery. When configuring the batch job, users can select which features from the prediction dataset to include in the output and can add up to 10 prediction explanations. This video shows how to set up a batch scoring job where both the data source and the prediction destination are BigQuery. Users can also combine BigQuery with other intake and output options such as Azure Synapse, Snowflake, or AWS S3.
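
The same configuration can also be set up in code. The sketch below is illustrative only, assuming the DataRobot Python client's BatchPredictionJob.score method with BigQuery-type intake and output settings; the deployment ID, dataset, table, bucket, and credential names are placeholders, and the exact adapter keys may vary by client version, so check the DataRobot documentation for your release.

```python
# Illustrative sketch: score from and to BigQuery with the DataRobot Python client.
# All IDs, dataset/table/bucket names, and credential references are placeholders.
import datarobot as dr

dr.Client(token="YOUR_API_TOKEN", endpoint="https://app.datarobot.com/api/v2")

job = dr.BatchPredictionJob.score(
    deployment="DEPLOYMENT_ID",
    intake_settings={
        "type": "bigquery",                # read scoring data from BigQuery (assumed keys)
        "dataset": "analytics",            # placeholder source dataset
        "table": "scoring_input",          # placeholder source table
        "bucket": "my-staging-bucket",     # GCS bucket used for staging data
        "credential_id": "CREDENTIAL_ID",  # stored GCP credential in DataRobot
    },
    output_settings={
        "type": "bigquery",                # write predictions back to BigQuery (assumed keys)
        "dataset": "analytics",
        "table": "scoring_output",         # destination table for predictions
        "bucket": "my-staging-bucket",
        "credential_id": "CREDENTIAL_ID",
    },
    passthrough_columns=["customer_id"],   # features from the prediction dataset to carry over
    max_explanations=10,                   # include up to 10 prediction explanations
)
job.wait_for_completion()
```

To make this run on a recurring schedule rather than once, the equivalent settings can be saved as a batch prediction job definition with a schedule, which is what the video walks through in the UI.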


Learn More:

Prediction intake options 

Make batch predictions with Google Cloud Storage
