I started a new job a few months ago where we have 4 environments, each containing 40+ databases.
The problem is that the environments are out of date, and some of the databases are HUGE. I want to refresh these environments on different schedules, but the sheer size and complexity of the setup scares me. For example, QA would be restored every month while DEV would be restored every 2 months...
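To illustrate the kind of refresh I mean, here is a rough per-database sketch, assuming SQL Server (we haven't pinned down the platform here, and the database name, backup share, and file paths are all placeholders, not our real setup):

```sql
-- Rough sketch of refreshing one database from the latest prod full backup.
-- SalesDB, the share path, and the file locations are placeholders.
RESTORE DATABASE SalesDB
FROM DISK = N'\\backupshare\prod\SalesDB_full.bak'
WITH REPLACE,                 -- overwrite the stale QA/DEV copy
     RECOVERY,                -- bring the database online right away
     MOVE N'SalesDB_Data' TO N'D:\Data\SalesDB.mdf',
     MOVE N'SalesDB_Log'  TO N'L:\Log\SalesDB.ldf',
     STATS = 10;              -- report progress every 10 percent
```

With 40+ databases per environment, I'd expect a scheduled job per environment to loop over a list of databases and run something like this, rather than hand-written restores for each one.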
Ideally, I would love to have a VMware snapshot of our prod environments applied every month or so over these QA/DEV environments for simplicity's sake, but that doesn't fly with our sysadmins.
I'm looking into data partitioning for the extremely large datasets, which would take away some of the pain of restoring data. My problem is that I had this set up at my last job, but I was managing far less data, and the restores ran nightly into a test environment from full backups.
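For reference, this is the shape of partitioning I have in mind (a minimal sketch, again assuming SQL Server; the table, column, and monthly boundary dates are all made up):

```sql
-- Minimal range-partitioning sketch: slice a large table by month.
-- pf_MonthlyRange, ps_MonthlyRange, and dbo.BigFactTable are hypothetical.
CREATE PARTITION FUNCTION pf_MonthlyRange (datetime2)
AS RANGE RIGHT FOR VALUES ('2024-01-01', '2024-02-01', '2024-03-01');

CREATE PARTITION SCHEME ps_MonthlyRange
AS PARTITION pf_MonthlyRange ALL TO ([PRIMARY]);

CREATE TABLE dbo.BigFactTable
(
    EventDate datetime2 NOT NULL,
    Payload   varchar(100) NULL
)
ON ps_MonthlyRange (EventDate);  -- rows land in the partition for their month
```

The idea being that individual monthly slices could be switched out or truncated instead of reloading a table's entire history on every refresh.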
I'm looking for a little guidance and some best practices on how to get this done with the least amount of pain.