How are you handling the db in automation suites?
I’m running into issues where the test DB is, by necessity, a rather weighty 900 MB, so a simple drop-and-restore from a known backup is hugely time consuming.
“If you automate a mess, you get an automated mess.” – Rod Michael
In my current role at Automattic I primarily work on end-to-end automated tests for WordPress.com. These tests run against live data (Production) no matter where our UI client (Calypso) is running (for example on localhost), so we just make sure our config points to the data that we need (test sites) and create other test data within the e2e scenarios.
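The “point the config at the data you need” idea can be sketched as below. This is a hypothetical illustration only: the variable names and values are invented for this example and are not Calypso’s actual configuration keys.

```shell
# Hypothetical sketch: the e2e suite reads its target data from environment
# config, so the same tests run whether the UI client is on localhost or
# deployed. All names/values here are invented examples.
export E2E_BASE_URL="http://localhost:3000"      # where the UI client (e.g. local Calypso) runs
export E2E_TEST_SITE="e2e-test-site.example.com" # pre-provisioned test site holding live test data
export E2E_TEST_USER="e2e_flow_user"             # account that owns the test data

# Only these values change between a local run and CI; any further data the
# scenarios need is created within the tests themselves.
```

The design point is that the tests never assume where the client or the data lives; swapping environments is a config change, not a test change.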
In previous organisations I used a scaled-down backup of production that had specific test data ‘seeded’ into it. Our DBAs had a set of scripts that would take a backup and cleanse/remove a whole heap of data (for example, archived products and orders), which resulted in a small, manageable backup that we could quickly restore into an environment. I found this a good approach as it gave us realistic data without the restore being time consuming when necessary, e.g. before a CI test run.
I also shared some other data creation techniques in a previous answer.