Loss cost modeling is a common and important use case for insurance companies.
Depending on the situation, one may choose a two-stage modeling approach, i.e., a frequency model (Binomial, Poisson) plus a severity model (Gamma), or a one-model approach, i.e., a pure premium (a.k.a. loss cost) model. A third approach, less common than the other two, is also two-stage: a frequency model plus a loss cost model. Here, a frequency model is built first; its predictions are then used as an input to the loss cost model, which has the advantage of capturing the potential interaction between frequency and severity.
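To make the two-stage idea concrete, here is a minimal sketch on synthetic data (the Poisson frequency and Gamma severity parameters are illustrative assumptions, not from any real book of business). It shows that the empirical pure premium per policy decomposes exactly into frequency × severity:

```python
import numpy as np

rng = np.random.default_rng(0)

n_policies = 100_000
lam = 0.1                  # assumed expected claim frequency per policy
shape, scale = 2.0, 500.0  # assumed Gamma severity; mean = shape * scale = 1000

# Simulate claim counts, then the total loss per policy
counts = rng.poisson(lam, n_policies)
losses = np.array([rng.gamma(shape, scale, c).sum() for c in counts])

# Two-stage view: pure premium = frequency * severity
freq = counts.mean()                   # claims per policy
sev = losses.sum() / counts.sum()      # average cost per claim
pure_premium_two_stage = freq * sev

# One-stage view: model the loss per policy directly
pure_premium_direct = losses.mean()

print(pure_premium_two_stage, pure_premium_direct)
```

In practice, each stage would be a fitted model (e.g., a Poisson GLM for counts and a Gamma GLM for claim size) rather than raw sample means, but the decomposition of the expected loss per exposure is the same.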
DataRobot enforces best practices by automating model setup and incorporating various preprocessing and modeling approaches. Pricing actuaries can test advanced algorithms with minimal setup, including XGBoost, GBM, ENET, and GAM. Within the same project, they can try all three approaches mentioned above.
Check out the DataRobot documentation to learn more about loss cost modeling with DataRobot.