AWS (Lambda, S3, EC2), Apache Spark with DataRobot Trial

Good afternoon. Can I connect DataRobot to an existing pipeline with AWS and Apache Spark in the trial version? Are there any limits on the amount of data processed? I want to try a setup like the one I read about here: https://community.datarobot.com/t5/general-knowledge-base/scoring-snowflake-data-via-datarobot-model....

Is this possible with the trial version of DataRobot?

3 Replies

At the moment I have only configured DataRobot with Tableau, and as a result the table updates by itself; that was pretty easy. I'm waiting for the front-end team to finish their tasks, and then I'll connect Spark with DataRobot. I found these articles and plan to set everything up with them; I think they will help you too. First, you should decide whether using Spark with DataRobot is really the right fit for you:

https://www.datarobot.com/blog/spark-not-enough/

If so, here are some related articles:

https://www.datarobot.com/blog/dataversity-spark/

https://www.cdata.com/kb/tech/datarobot-jdbc-apache-spark.rst
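The CData article above uses Spark's generic JDBC reader to pull data over a JDBC driver. As a hedged sketch of that pattern (the URL, table name, and driver class below are placeholders, not values from the article; substitute the ones your driver's documentation gives you):

```python
# Sketch of Spark's generic JDBC read pattern. All concrete values here
# (URL, table, driver class) are placeholders for illustration only.
def jdbc_read_options(url: str, table: str, user: str, password: str,
                      driver: str) -> dict:
    """Options for spark.read.format("jdbc").options(**opts).load()."""
    return {
        "url": url,            # JDBC connection URL for your source
        "dbtable": table,      # table (or subquery) to pull
        "user": user,
        "password": password,
        "driver": driver,      # fully-qualified JDBC driver class
    }

opts = jdbc_read_options(
    url="jdbc:datarobot:...",   # placeholder; see your driver's docs
    table="Predictions",
    user="trial_user",
    password="***",
    driver="your.jdbc.Driver",  # placeholder class name
)
# In a Spark session: df = spark.read.format("jdbc").options(**opts).load()
```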

Hope this helps.


Hi,

The model integrations are grayed out in my trial version of DataRobot. I would like to integrate Snowflake.

Can you help?

(screenshot attachment: amusatm_0-1595964074181.png)


Hey there - thanks for your question!

Our trial has some safeguards to prevent runaway usage. But these safeguards are set exceptionally high to allow experimentation and exploration. So you likely won't run into a volume limitation, but if you do, please reach out - we're happy to help.

On the other hand, the Trial does have some feature limitations, but nothing that should limit your ability to evaluate DataRobot as an AI solution. Specifically, you won't have access to scoring code within our Trial, but you can connect a data source (like Snowflake) and pull data for modeling.
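Since connecting Snowflake as a data source comes up in this thread, here is a minimal sketch of building the JDBC URL that Snowflake's driver expects (the account, database, schema, and warehouse names are invented placeholders; check Snowflake's JDBC documentation for your account identifier form):

```python
# Sketch: build a Snowflake JDBC URL in the driver's standard form.
# Account/database/schema/warehouse values are placeholders.
def snowflake_jdbc_url(account: str, db: str, schema: str,
                       warehouse: str) -> str:
    """Return a JDBC URL for the Snowflake JDBC driver."""
    return (
        f"jdbc:snowflake://{account}.snowflakecomputing.com/"
        f"?db={db}&schema={schema}&warehouse={warehouse}"
    )

url = snowflake_jdbc_url("myaccount", "DEMO_DB", "PUBLIC", "COMPUTE_WH")
```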

I hope this helps - but let me know if you have specific questions and I'll be sure to get you on your way. Thanks!