Do custom inference models' capabilities differ from vanilla?

Hi, I have been looking at custom model deployments and wanted to know whether their MLOps capabilities differ from standard deployments that use DataRobot's built-in models. Specifically, can custom models use Continuous AI features such as automatic retraining?

 

12 Replies

Hi Ira, I am from the former Algorithmia team. We were acquired by DataRobot about two months ago. The custom inference capability brought by Algorithmia differs quite a bit from the vanilla offering.

Model catalog, versioning, pipelining, governance, language and dependency management, compute orchestration, hardware selection, and monitoring are provided across models built in R, Python, Ruby, Rust, and more, whether they come from DataRobot, H2O, Dataiku, AWS SageMaker, etc. Models run as Docker containers on autoscaling Kubernetes clusters, deployed on premises or in the cloud.
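To make the custom inference side concrete, here is a minimal sketch of what a custom model's entry point can look like. It assumes the hook-based custom.py convention used by DataRobot's custom model runner (DRUM, from the datarobot-user-models project) and a pickled scikit-learn regressor shipped as model.pkl; exact hook names, file names, and expected output columns can vary by version, so treat it as illustrative rather than a definitive template.

```python
# custom.py -- illustrative sketch of a custom inference model entry point.
# Assumes the hook-based convention used by DataRobot's custom model runner (DRUM);
# the model artifact name and the expected output column are assumptions here.
import os
import pickle

import pandas as pd


def load_model(code_dir):
    """Called once at startup; loads the serialized model shipped with the deployment."""
    with open(os.path.join(code_dir, "model.pkl"), "rb") as f:
        return pickle.load(f)


def score(data, model, **kwargs):
    """Called per prediction request; returns a DataFrame of predictions.

    `data` arrives as a pandas DataFrame of feature rows; for a regression
    model a single "Predictions" column is returned.
    """
    return pd.DataFrame({"Predictions": model.predict(data)})
```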

Thank you @MLOPS DFW for the answer; I didn't realise how new the system was to DataRobot.


Thanks
