I have my Snowflake integration with DataRobot running properly. In particular, I have one job that makes predictions on some test accounts fed to DataRobot from Snowflake.
My question is: is it possible to define one more prediction job in a different project without deleting the existing one? So far, whenever I need to define a new job, I delete the existing job. I would like to explore running two jobs at once, one in each project.
So you have a deployment using the DataRobot prediction environment, and you create a prediction job inside the deployment to score data from a Snowflake connection?
Is my understanding correct?
You can create multiple Job Definitions inside the Deployment if there are subtle variations you wish to incorporate.
If you have a different project and a different deployment, then you could also create a Job Definition there to score the Snowflake data.
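If you prefer to script it, a Job Definition can also be created with the DataRobot Python client. This is a minimal sketch, not a definitive implementation: the deployment, data store, and credential IDs, as well as the table names, are placeholders you would replace with your own, and the JDBC intake/output field names assume the client's batch prediction settings format.

```python
# Sketch: create a scheduled Batch Prediction Job Definition for a second
# deployment, using the DataRobot Python client (pip install datarobot).
# All IDs and table names below are hypothetical placeholders.

def hourly_schedule():
    """DataRobot schedule dict that fires at minute 0 of every hour."""
    return {
        "minute": [0],
        "hour": ["*"],
        "day_of_month": ["*"],
        "month": ["*"],
        "day_of_week": ["*"],
    }

def create_hourly_job_definition(deployment_id, data_store_id, credential_id):
    # Imported inside the function so the sketch can be read (and the
    # schedule helper used) without the datarobot package installed.
    import datarobot as dr

    dr.Client()  # reads DATAROBOT_API_TOKEN / DATAROBOT_ENDPOINT from env

    job_spec = {
        "deployment_id": deployment_id,
        "intake_settings": {
            "type": "jdbc",
            "data_store_id": data_store_id,   # Snowflake data connection
            "credential_id": credential_id,
            "table": "TEST_ACCOUNTS",         # placeholder source table
        },
        "output_settings": {
            "type": "jdbc",
            "data_store_id": data_store_id,
            "credential_id": credential_id,
            "table": "TEST_ACCOUNT_SCORES",   # placeholder destination table
            "statement_type": "insert",
        },
    }
    return dr.BatchPredictionJobDefinition.create(
        enabled=True,
        batch_prediction_job=job_spec,
        name="hourly-snowflake-scoring",
        schedule=hourly_schedule(),
    )
```

Running `create_hourly_job_definition(...)` once per deployment would give you one scheduled definition in each project, without touching the other.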
Do not hesitate to reach out if you have additional questions.
Appreciate your reply. I decided to go for the second approach: one job defined in each project, both scheduled to run on an hourly basis.
My question is: are they supposed to run one after another, since both jobs retrieve data from Snowflake? I can see that while job 1 is running, job 2 shows "Next#1 in queue".
Is this expected? Kindly advise.