The impact of disks filling up is, at best, downtime. At worst? You're spending many hours resurrecting your data with your DataRobot support gurus. DataRobot's data layer includes metadata databases and file storage, and no one likes full disks, especially databases! Trust me, you want to avoid that "Oops" moment.
Go ahead right now and check your data node to make sure your disk has space and is less than 85% full.
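If you have shell access to the data node, `df -h` will tell you in one line. Here's a minimal Python sketch of the same check; the 85% threshold is just the rule of thumb above, and the `/` mount point is an assumption, so point it at wherever your data volume is actually mounted:

```python
import shutil

def disk_usage_percent(path: str = "/") -> float:
    """Return the percentage of the filesystem at `path` that is in use."""
    usage = shutil.disk_usage(path)
    return 100.0 * usage.used / usage.total

if __name__ == "__main__":
    # Swap "/" for your data volume's mount point.
    pct = disk_usage_percent("/")
    status = "OK" if pct < 85.0 else "WARNING: time to clean up!"
    print(f"Disk at / is {pct:.1f}% full -- {status}")
```

Drop a script like this into cron and you have a poor man's monitor, though a real monitoring tool (next section) is the better long-term answer.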
Is your disk space okay? Moving forward, make sure that you continue to monitor it. Is your disk space in trouble? Now is a great time to clean things up.
Monitor like your disk space depends on it
Monitoring basic server stats is Sys Admin 101, but it may surprise you to learn that you still need to do this in the cloud. Although AWS will make sure your S3 bucket has space, your EBS volumes are your problem, and you can use CloudWatch to alert you. (If you're using another cloud environment, look for the equivalent monitoring tool. And if you're a VM customer, check with your IT folks, who likely already have a monitoring tool in place.)
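One wrinkle worth knowing: disk usage is not a built-in EC2 metric, so this sketch assumes you've installed the CloudWatch agent on the instance, which publishes `disk_used_percent` under the `CWAgent` namespace. The alarm name, instance ID, and SNS topic ARN below are placeholders; substitute your own:

```python
def build_disk_alarm_params(instance_id: str, sns_topic_arn: str) -> dict:
    """Build the keyword arguments for CloudWatch's put_metric_alarm call.

    Assumes the CloudWatch agent is running on the instance and reporting
    `disk_used_percent` (disk usage is not a built-in EC2 metric).
    """
    return {
        "AlarmName": f"data-node-disk-85pct-{instance_id}",
        "Namespace": "CWAgent",
        "MetricName": "disk_used_percent",
        "Dimensions": [
            {"Name": "InstanceId", "Value": instance_id},
            {"Name": "path", "Value": "/"},  # your data volume's mount point
        ],
        "Statistic": "Average",
        "Period": 300,                # evaluate every 5 minutes
        "EvaluationPeriods": 1,
        "Threshold": 85.0,            # the 85% rule of thumb from above
        "ComparisonOperator": "GreaterThanThreshold",
        "AlarmActions": [sns_topic_arn],  # e.g. an SNS topic that emails you
    }

# To actually create the alarm (requires boto3 and AWS credentials):
#   import boto3
#   params = build_disk_alarm_params(
#       "i-0123456789abcdef0",
#       "arn:aws:sns:us-east-1:123456789012:disk-alerts")
#   boto3.client("cloudwatch").put_metric_alarm(**params)
```

Wiring the alarm to an SNS topic means you get an email (or page) before the disk fills, rather than an outage after.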
Jettison like your disk space depends on it
Deleting unwanted projects, including all those old diabetes 10K demo projects, will help, but remember to permanently delete them as an admin. Have a look at the documentation for deleting projects.
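If you'd rather script the cleanup than click through the UI, projects can also be deleted through DataRobot's public REST API (`DELETE /api/v2/projects/{projectId}/`). Note this is the regular project delete; the permanent admin purge described above is a separate step. The sketch below only builds the request; the endpoint, project ID, and token are placeholders, and the `Bearer` auth scheme is an assumption, so check your installation's API docs:

```python
import urllib.request

def build_delete_project_request(endpoint: str, project_id: str,
                                 api_token: str) -> urllib.request.Request:
    """Build (but do not send) a DELETE request for one DataRobot project.

    `endpoint` is your API root, e.g. "https://app.datarobot.com/api/v2".
    The auth header scheme may differ by version -- check your API docs.
    """
    url = f"{endpoint.rstrip('/')}/projects/{project_id}/"
    req = urllib.request.Request(url, method="DELETE")
    req.add_header("Authorization", f"Bearer {api_token}")
    return req

# To actually send it (this really deletes the project -- be sure!):
#   req = build_delete_project_request(
#       "https://app.datarobot.com/api/v2", "your-project-id", "your-token")
#   urllib.request.urlopen(req)
```

A loop over your stale project IDs plus this request is a quick way to reclaim space in bulk, but double-check the list before sending anything.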
What if you’re a Managed AI Cloud service customer? DataRobot does all this for you. Your job then is to just keep those models coming!
I have so many more administration tips queued up and will be sharing them here. Make sure to share your favorite tips too!