DataRobot can execute prediction jobs on a recurring schedule and write the predictions to a Snowflake table. The write process can be set to update or overwrite the table, and it can even create a new table for you if one does not already exist in Snowflake. When configuring the batch job, users can select which features from the prediction dataset to include and can add up to 10 prediction explanations. This video shows how to set up a batch scoring job where both the data source and the prediction destination are Snowflake. Users can also combine Snowflake batch scoring with other data sources and destinations, such as Azure Synapse, BigQuery, or AWS S3.
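As a rough illustration of the options described above, the sketch below builds the configuration for such a job as plain Python dictionaries. The field names (`intake_settings`, `output_settings`, `statement_type`, `max_explanations`, and so on) follow the general shape of DataRobot's Batch Prediction API, but all identifiers and values here are placeholders and assumptions; consult your DataRobot client version's documentation for the exact parameters.

```python
# Hypothetical configuration for a batch scoring job that reads from and
# writes to Snowflake. All IDs and names are illustrative placeholders.

# Intake: pull scoring rows from a Snowflake table via a registered data store.
intake_settings = {
    "type": "jdbc",                       # Snowflake reached through a JDBC data store (assumed)
    "data_store_id": "<snowflake-data-store-id>",
    "credential_id": "<stored-credential-id>",
    "schema": "PUBLIC",
    "table": "SCORING_INPUT",
}

# Output: write predictions back to Snowflake. The write mode corresponds to
# the update / overwrite / create-table choices mentioned in the description.
output_settings = {
    "type": "jdbc",
    "data_store_id": "<snowflake-data-store-id>",
    "credential_id": "<stored-credential-id>",
    "schema": "PUBLIC",
    "table": "SCORING_OUTPUT",
    "statement_type": "create_table",     # create the table if it does not exist (assumed value)
}

# Job-level options: which passthrough features to keep, and up to 10
# prediction explanations, as noted above.
job_config = {
    "deployment_id": "<deployment-id>",
    "intake_settings": intake_settings,
    "output_settings": output_settings,
    "passthrough_columns": ["CUSTOMER_ID"],   # hypothetical feature to carry through
    "max_explanations": 10,
}

print(job_config["output_settings"]["statement_type"])
```

In a real client this configuration would be handed to the batch prediction call (and a schedule attached in the DataRobot UI or API); the dictionaries above only show the shape of the choices, not a working invocation.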