You can sort your leaderboard of generated models by various metrics (LogLoss, AUC, RMSE, etc.). The F1 score for a binary classifier is reported per model: expand a model on your leaderboard and choose Evaluate -> ROC Curve, and you will see F1 alongside many other metrics. You can see it in the top left of this image.
The score itself is not simply a fixed attribute of the model, however; it depends on the threshold used to label each prediction as positive or negative. As a result, you will see the F1 value change as you apply different threshold values.
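To make the threshold dependence concrete, here is a minimal sketch (using scikit-learn's `f1_score`, and toy probabilities I made up for illustration; the platform computes this internally) showing how F1 shifts as the cutoff moves:

```python
# Sketch: F1 depends on the classification threshold, not the model alone.
import numpy as np
from sklearn.metrics import f1_score

# Toy predicted probabilities for the positive class, plus true labels.
y_true = np.array([0, 0, 0, 1, 1, 1, 1, 0, 1, 0])
y_prob = np.array([0.1, 0.4, 0.35, 0.8, 0.65, 0.9, 0.45, 0.5, 0.7, 0.2])

for threshold in (0.3, 0.5, 0.7):
    # Label a prediction positive when its probability clears the cutoff.
    y_pred = (y_prob >= threshold).astype(int)
    print(f"threshold={threshold:.1f}  F1={f1_score(y_true, y_pred):.3f}")
```

Sliding the threshold trades precision against recall, so F1 (their harmonic mean) rises and falls with it; that is exactly why the UI shows a different F1 at each threshold.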