Written by Luther Rochester

As we’ve moved more of our infrastructure to AWS, we’ve found more use cases for stashing things in S3 that we used to store with traditional backup software. We keep our SSIS code in version control (currently TFS, grumble grumble), but we weren’t storing a copy of each database object definition there. Recently we set up a job that scripts out each object’s DDL and stores it in an S3 bucket. We already do this for our Postgres servers using pg_dump; for SQL Server we found a great little Python app called mssql-scripter that works similarly, scripting out each object for us. It runs in a batch script, along with an aws-cli command to upload the resulting files. The batch script is then called from a SQL Agent job and run on a schedule, and the S3 bucket has a lifecycle policy to handle retention for us.
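To make that concrete, here’s a minimal sketch of what such a batch script could look like. The server, database, directory, and bucket names are placeholders; the mssql-scripter flags used (--file-path, --file-per-object) are from the tool’s documented options, and the upload uses the standard aws s3 sync command.

@echo off
REM Sketch of a nightly DDL-export job. Assumes mssql-scripter and the
REM AWS CLI are on PATH; all names below are placeholders.
setlocal

set SQLSERVER=MYSQLSERVER
set DATABASE=MyDatabase
set OUTDIR=C:\ddl-export\%DATABASE%
set BUCKET=s3://my-ddl-backups/%DATABASE%/

REM Start from a clean output directory so dropped objects don't linger.
if exist "%OUTDIR%" rmdir /s /q "%OUTDIR%"
mkdir "%OUTDIR%"

REM Script each object's DDL to its own file (schema only by default).
mssql-scripter -S %SQLSERVER% -d %DATABASE% --file-path "%OUTDIR%" --file-per-object

REM Push the scripts to S3; the bucket's lifecycle policy handles retention.
aws s3 sync "%OUTDIR%" %BUCKET%

endlocal

Running this under a SQL Agent CmdExec job step on a schedule gives you a rolling, per-object DDL history without any traditional backup software in the loop.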

You’ll need a few tools

The server that will run the code will need a few things installed on it. It’s easiest if you use the same server that SQL Agent runs on.
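Concretely, that means Python (mssql-scripter is a Python app), mssql-scripter itself, and the AWS CLI. A quick sketch of the installs, assuming pip is already available on the server:

REM mssql-scripter is distributed as a Python package, installed via pip.
pip install mssql-scripter

REM The AWS CLI v1 is also pip-installable; v2 ships as an MSI from Amazon.
pip install awscli

REM Verify both tools are reachable from the account SQL Agent runs under.
mssql-scripter -h
aws --version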
