Hi. I'm using the DataRobot PPS image. According to the documentation I referred to, once the image has started I can just run a curl command and get prediction results from the model I chose, and I was able to do that.
But is it possible to put the curl command inside a Dockerfile instead? I tried, but I keep getting an 'invalid option' error. Any ideas?
Previously I was testing Deploying and Monitoring DataRobot Models on Amazon EKS using the PPS image (https://community.datarobot.com/t5/knowledge-base/deploying-and-monitoring-datarobot-models-on-aws/t...).
Now I want to test on Amazon ECS using the Fargate launch type instead. When I run the task definition, I can't figure out how to pass in the curl command. I tried putting it inside a Dockerfile (rebuilding a new image from my PPS image) and passing the curl in via the CMD instruction, but I keep getting an error. So, any idea how I can do the deployment on Amazon ECS?
For now, while testing, I'm not using a larger batch of predictions.
curl -X POST http://localhost:8080/predictions -H "Content-Type: text/csv" --data-binary @path/to/scoring.csv
When I put it inside the Dockerfile as CMD, it returns `/bin/sh is not a valid option`.
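My guess (just an assumption on my part) is that the base image defines its own ENTRYPOINT, so Docker hands my CMD array to that entrypoint script as arguments, and the script rejects `/bin/sh` as an unknown option. If that's right, something like this might avoid the error, though it also replaces the original entrypoint (which presumably starts the prediction server), so it's only a sketch:

```dockerfile
FROM datarobot-portable-prediction-api:7.1.2r0
COPY sample_data/* /sample_data/
# Overriding ENTRYPOINT means CMD runs as-is instead of being passed to the
# base image's entrypoint script -- but note the original entrypoint (and
# whatever it starts) will no longer run.
ENTRYPOINT ["/bin/sh", "-c"]
CMD ["curl -X POST http://localhost:8080/predictions -H 'Content-Type: text/csv' --data-binary @/sample_data/test_prediction_data.csv"]
```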
It's a very simple Dockerfile. The one I use for ECS doesn't contain the CMD part; I just put the command in the Container overrides > Command override section when I run a new task.
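For reference, I believe the console's Command override maps to the `--overrides` JSON of `aws ecs run-task`, so the CLI equivalent would be something like this (the container name `pps` is just a placeholder for whatever the task definition uses):

```json
{
  "containerOverrides": [
    {
      "name": "pps",
      "command": [
        "/bin/sh", "-c",
        "curl -X POST http://localhost:8080/predictions -H 'Content-Type: text/csv' --data-binary @/sample_data/test_prediction_data.csv"
      ]
    }
  ]
}
```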
```dockerfile
FROM datarobot-portable-prediction-api:7.1.2r0
COPY sample_data/* /sample_data/
#ENTRYPOINT ["sh","-c","curl -X POST http://localhost:8080/predictions -H 'Accept: text/csv' -H 'Content-Type: text/csv' --data-binary @/sample_data/test_prediction_data.csv"]
CMD ["/bin/sh","-c","curl -X POST http://localhost:8080/predictions -H 'Accept: text/csv' -H 'Content-Type: text/csv' --data-binary @/sample_data/test_prediction_data.csv"]
```
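Separately, I suspect that even a CMD that Docker accepts would race the server startup: the curl would fire before the prediction API is listening. I was thinking of a wrapper along these lines — the entrypoint path and the `/ping` endpoint are guesses on my part, not something I've verified in the image:

```shell
#!/bin/sh
# Hypothetical wrapper (untested against the real image): start the image's
# original entrypoint in the background, poll until the API answers, then
# send the scoring request.

# wait_for TRIES CMD...: run CMD once per second until it succeeds,
# giving up (exit 1) after TRIES failed attempts.
wait_for() {
    tries=$1; shift
    i=0
    while [ "$i" -lt "$tries" ]; do
        if "$@" >/dev/null 2>&1; then
            return 0
        fi
        i=$((i + 1))
        sleep 1
    done
    return 1
}

# The entrypoint path below is a guess -- check it with `docker inspect`
# on the PPS image before relying on it.
# /opt/entrypoint.sh &
#
# if wait_for 120 curl -sf http://localhost:8080/ping; then
#     curl -X POST http://localhost:8080/predictions \
#          -H 'Content-Type: text/csv' \
#          --data-binary @/sample_data/test_prediction_data.csv
# fi
```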
Or maybe, is there a hands-on tutorial for configuring the model deployment on AWS ECS? A video or something would be useful for me.