DataRobot can execute prediction jobs on a recurring schedule and write the predictions to an Azure SQL table. The write process can be set to update or overwrite existing rows, and DataRobot can even create the table for you if it does not already exist in Azure SQL. When configuring the batch job, users can select which features from the prediction dataset to include in the output and add up to 10 prediction explanations. This video shows how to set up a batch scoring job where both the data source and the prediction destination are Azure SQL. Users can also combine Azure SQL batch scoring with other data stores such as Snowflake, BigQuery, or AWS S3.
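The same setup can also be scripted with the DataRobot Python client instead of the UI. The sketch below shows how the intake and output settings for a JDBC (Azure SQL) batch scoring job might be assembled; it is a minimal illustration, and the data store ID, credential ID, deployment ID, and table names are all placeholders, not values from the video.

```python
def build_jdbc_settings(data_store_id, credential_id, source_table, dest_table):
    """Assemble intake/output settings for an Azure SQL (JDBC) batch job.

    All IDs and table names are hypothetical placeholders.
    """
    intake = {
        "type": "jdbc",
        "data_store_id": data_store_id,
        "credential_id": credential_id,
        "table": source_table,  # table to read scoring data from
    }
    output = {
        "type": "jdbc",
        "data_store_id": data_store_id,
        "credential_id": credential_id,
        "table": dest_table,  # table to write predictions to
        # Append rows to the destination table; DataRobot can also
        # create the table for you if it does not already exist.
        "statement_type": "insert",
        "create_table_if_not_exists": True,
    }
    return intake, output


# Hypothetical usage (requires a configured DataRobot client and real IDs):
# import datarobot as dr
# intake, output = build_jdbc_settings(
#     "data-store-id", "credential-id", "dbo.scoring_input", "dbo.predictions"
# )
# dr.BatchPredictionJob.score(
#     deployment="deployment-id",
#     intake_settings=intake,
#     output_settings=output,
#     max_explanations=10,                   # up to 10 prediction explanations
#     passthrough_columns=["customer_id"],   # input features to carry through
# )
```

The commented call is a sketch of the client's batch scoring interface under the assumptions above; consult the DataRobot documentation for the exact parameters supported by your account and client version.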