DataRobot integrates directly with your GCP environment, helping you accelerate machine learning across GCP services.
In this notebook accelerator, you can use Google Colaboratory (Colab) or another notebook environment to source data from BigQuery, build and evaluate an ML model with DataRobot, and write predictions from that model back to BigQuery for use across GCP.
1. Preparing data and ensuring connectivity: In the first section of the notebook, you will load a sample dataset for modeling into BigQuery, then connect your BigQuery data to DataRobot.
2. Building and evaluating a model: Using the DataRobot Python API, you will have DataRobot build close to 50 machine learning models and evaluate how each performs on this dataset.
3. Scoring and hosting: In the final section, you will score the entire dataset with the new model and write the prediction data back to BigQuery for use in your GCP applications.
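The data-loading half of step 1 might be sketched like this in Python, assuming the google-cloud-bigquery client library; the project, dataset, and table names are hypothetical placeholders, not values from the accelerator itself.

```python
PROJECT = "my-gcp-project"   # hypothetical placeholder
DATASET = "ml_accelerator"   # hypothetical placeholder
TABLE = "training_data"      # hypothetical placeholder

def table_id(project: str, dataset: str, table: str) -> str:
    """Fully qualified BigQuery table identifier: project.dataset.table."""
    return f"{project}.{dataset}.{table}"

def load_csv_to_bigquery(uri: str) -> None:
    """Load a CSV from Cloud Storage into BigQuery, inferring the schema."""
    # Imported here so table_id() stays usable without the BigQuery client installed.
    from google.cloud import bigquery

    client = bigquery.Client(project=PROJECT)
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,   # skip the header row
        autodetect=True,       # infer column types from the data
    )
    job = client.load_table_from_uri(
        uri, table_id(PROJECT, DATASET, TABLE), job_config=job_config
    )
    job.result()  # block until the load job finishes
```

Schema autodetection keeps the sketch short; in practice you may prefer an explicit schema so column types match what DataRobot expects.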
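Step 2 could look roughly like the following, assuming the datarobot Python package; the endpoint, project name, and helper function are illustrative, and exact SDK method names vary by client version.

```python
def rank_models(models, metric: str = "AUC", n: int = 5):
    """Pure helper: return the n models with the best validation score for metric."""
    scored = [m for m in models
              if m.metrics.get(metric, {}).get("validation") is not None]
    return sorted(scored, key=lambda m: m.metrics[metric]["validation"],
                  reverse=True)[:n]

def run_autopilot(api_token: str, df, target: str):
    """Start a DataRobot project on a DataFrame and return the top models."""
    # Imported here so rank_models() stays usable without the DataRobot SDK.
    import datarobot as dr

    dr.Client(token=api_token, endpoint="https://app.datarobot.com/api/v2")
    project = dr.Project.start(sourcedata=df, target=target,
                               project_name="bigquery-accelerator")  # kicks off Autopilot
    project.wait_for_autopilot()  # block until the leaderboard is final
    return rank_models(project.get_models())
```

Autopilot already sorts the Leaderboard by the project metric; a helper like `rank_models` is only needed when you want to re-rank by a different metric or take a fixed top-n slice.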
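And step 3, scoring and writing back, might be sketched as below, again assuming the datarobot and google-cloud-bigquery packages; the destination table and the `prediction` column name are assumptions for illustration.

```python
import pandas as pd

def attach_predictions(df: pd.DataFrame, predictions) -> pd.DataFrame:
    """Pure helper: copy the source rows and add a 'prediction' column."""
    scored = df.copy()
    scored["prediction"] = list(predictions)
    return scored

def score_and_write(project, model, df: pd.DataFrame, destination_table: str) -> None:
    """Score df with a DataRobot model and write the results to a BigQuery table."""
    # Imported here so attach_predictions() stays usable without either SDK.
    from google.cloud import bigquery

    dataset = project.upload_dataset(df)           # upload rows for scoring
    job = model.request_predictions(dataset.id)    # queue the prediction job
    preds = job.get_result_when_complete()         # DataFrame with a 'prediction' column
    scored = attach_predictions(df, preds["prediction"])

    client = bigquery.Client()
    client.load_table_from_dataframe(scored, destination_table).result()
```

Writing the source rows and predictions together keeps the BigQuery table self-describing, so downstream GCP applications can query features and scores in one place.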
Feel free to contact me here on the DataRobot Community if you have any questions or would like follow-up content on using DataRobot on GCP.