I am planning to develop an end-to-end solution for automated time series forecasting (mostly using deep learning models). Could you please let me know if it's possible to integrate Azure Synapse with DataRobot for analytics (ETL) instead of Azure Databricks? And do I need separate storage, such as a data lake?
It would also be helpful if you could tell me what options DataRobot offers for getting this solution into production.
DataRobot has quite powerful automated time series model building capabilities, whether predicting a single series or very many (each store and each SKU, for example), with the ability to discover cross-series relationships as well as to calculate and evaluate many different lags and aggregations of the time series data.
Models can be deployed in various ways, including being hosted on the DataRobot platform for batch and real-time use via API, as well as options to export models to run on external infrastructure. These exported models can also be monitored from DataRobot.
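To make the real-time option concrete, here is a minimal sketch of scoring rows against a deployment hosted on DataRobot. The hostname, deployment ID, keys, and feature names are placeholders I've made up for illustration; the actual endpoint URL and headers come from your deployment's integration snippet.

```python
# Hedged sketch: real-time scoring against a DataRobot-hosted deployment.
# All identifiers below (host, deployment ID, tokens, feature names) are
# placeholders, not real values.
import json

# Rows to score, using hypothetical feature names for a store/SKU series.
rows = [{"store": "S01", "sku": "A-100", "date": "2023-06-01"}]
payload = json.dumps(rows)

# With the `requests` library installed, the call would look roughly like:
#
#   import requests
#   resp = requests.post(
#       "https://<prediction-host>/predApi/v1.0/deployments/<deployment-id>/predictions",
#       data=payload,
#       headers={
#           "Content-Type": "application/json",
#           "Authorization": "Bearer <api-token>",
#           "DataRobot-Key": "<datarobot-key>",
#       },
#   )
#   predictions = resp.json()["data"]

print(payload)
```

The same deployment also accepts batch scoring jobs, so you are not locked into one serving mode.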
For scoring data from an Azure Synapse data warehouse, my default recommendation would be simply building and deploying the model on DataRobot and leveraging our Batch Prediction API. This API accepts parameters for the source and destination of the data to be scored: local CSV files, cloud object storage, and JDBC are all supported and can be mixed and matched. For example, source data could come from JDBC (via Synapse) or Azure Blob Storage, and results could be written back to Synapse via JDBC. Larger loads could be done in bulk by leveraging the Synapse COPY statement from Blob Storage.
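As a rough sketch of what that mix-and-match looks like with the `datarobot` Python client: intake and output are each described by a settings dictionary, and here both point at Synapse over JDBC. The data store ID, credential ID, table names, and deployment ID are placeholders, not real values.

```python
# Hedged sketch: batch scoring from/to Azure Synapse via JDBC with the
# DataRobot Batch Prediction API. All IDs and table names are placeholders.

# Where the rows to score come from (a JDBC connection registered in
# DataRobot that points at the Synapse data warehouse).
intake_settings = {
    "type": "jdbc",
    "data_store_id": "<synapse-data-store-id>",   # placeholder
    "credential_id": "<stored-credential-id>",    # placeholder
    "table": "dbo.sales_to_score",                # hypothetical table
}

# Where the predictions get written (back to Synapse over the same JDBC
# connection; could equally be Azure Blob Storage or a local CSV).
output_settings = {
    "type": "jdbc",
    "data_store_id": "<synapse-data-store-id>",
    "credential_id": "<stored-credential-id>",
    "table": "dbo.sales_scored",                  # hypothetical table
    "statement_type": "insert",
}

# With the `datarobot` client installed and configured, the job would be
# submitted roughly like this (placeholder deployment ID):
#
#   import datarobot as dr
#   job = dr.BatchPredictionJob.score(
#       deployment="<deployment-id>",
#       intake_settings=intake_settings,
#       output_settings=output_settings,
#   )
#   job.wait_for_completion()

print(intake_settings["type"], "->", output_settings["type"])
```

Because intake and output are independent, swapping the intake to Azure Blob Storage or a local file is just a change to one dictionary.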