Scripting and Backing Up SQL Server Object DDL to s3

As we’ve moved more of our infrastructure to AWS, we’ve found more use cases for stashing things in s3 that we used to store with traditional backup software. We keep our SSIS code in version control (currently TFS, grumble grumble), but we weren’t storing a copy of each database object definition anywhere. Recently we set up a job that scripts out each object’s DDL and stores it in an s3 bucket. We already do this for our Postgres servers using pg_dump, and we found a great little Python tool called mssql-scripter that scripts out SQL Server objects in a similar way. It runs in a batch script, along with an aws-cli command that uploads the resulting files. The batch script is called from a SQL Agent job on a schedule, and a lifecycle policy on the bucket handles retention for us.

You’ll need a few tools

The server that will run the code will need a few things installed on it. It’s easiest if you use the same server that SQL Agent runs on:
- Python
- mssql-scripter
- AWS CLI for Windows

First, download the Windows Python installer and install it on the machine that will run the code.

Next, open an administrative CMD prompt and type:

> pip install mssql-scripter

Finally, install the AWS CLI. We used the MSI installer.
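
Before wiring anything into SQL Agent, it’s worth confirming all three tools are installed and on the PATH. From a CMD prompt, each of these should print version or package info rather than an error:

> python --version
> pip show mssql-scripter
> aws --version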

Set up an s3 bucket to receive all the DDLzz

Assuming you already have an AWS account, you can log into the web GUI to create an S3 bucket and set up its permissions and retention manually. We’re using Terraform here, so we created ours with Terraform config. We created a user who only has access to read/write to this bucket, and we set a lifecycle policy to expire files after 90 days, which gives us about 3 months’ worth of DDL files and automagically rotates out old ones.
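
We won’t reproduce our full config here, but a minimal sketch looks something like this (the bucket name matches the one the upload script uses later; the user name and rule ID are placeholders, and the lifecycle block assumes a v4+ AWS provider):

resource "aws_s3_bucket" "mssql_ddl_backup" {
  bucket = "mssql-ddl-backup"
}

# Expire objects after 90 days so old DDL files rotate out automatically.
resource "aws_s3_bucket_lifecycle_configuration" "mssql_ddl_backup" {
  bucket = aws_s3_bucket.mssql_ddl_backup.id

  rule {
    id     = "expire-ddl-after-90-days"
    status = "Enabled"
    filter {}

    expiration {
      days = 90
    }
  }
}

# A dedicated user that can only read and write this bucket.
resource "aws_iam_user" "ddl_uploader" {
  name = "mssql-ddl-uploader" # placeholder
}

resource "aws_iam_user_policy" "ddl_uploader_rw" {
  name = "mssql-ddl-backup-rw"
  user = aws_iam_user.ddl_uploader.name

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect = "Allow"
      Action = ["s3:PutObject", "s3:GetObject", "s3:ListBucket"]
      Resource = [
        aws_s3_bucket.mssql_ddl_backup.arn,
        "${aws_s3_bucket.mssql_ddl_backup.arn}/*",
      ]
    }]
  })
}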

You’ll need to generate an access key for the user that will upload to s3. In the AWS GUI, browse to IAM, find your user, and click the “Security credentials” tab. From here, click “Create access key” and download the .csv file that is created. You’ll reference the keys in this file in the next step.

Configure your user on the Windows server

On the machine that will run the script, log on to Windows as the user it will run under. We used the SQL Agent service account, as that’s the default account that operating system commands run under in SQL Agent. Once you’re logged on, open a CMD prompt and type:

> aws configure

You will be prompted for four items: the access key ID, secret access key, default region name, and default output format. For these, enter:

AWS Access Key ID [None]: LSCOHRSHW9TKSQCOSHRB (the shorter blob from your key file)
AWS Secret Access Key [None]: SPMG8fFCrJ6HORNqcRrPyfaBMnGg29P8eLGsL46l (the longer blob from your key file)
Default region name [None]: us-east-2 (or your default region)
Default output format [None]: json
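
The CLI stores these credentials under that user’s profile. To confirm they work, you can ask AWS who it thinks you are; this should return the ARN of your upload user rather than an error:

> aws sts get-caller-identity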

Grant proper SQL Server permissions

The Windows account that runs the script will need a login on each SQL Server instance being scripted, mapped to a user in each database.
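
If the account doesn’t already have a login and database user, something like this will create them (the domain, account, and database names are placeholders; substitute your own):

USE [master];
GO
CREATE LOGIN [SOMEDOMAIN\sqlagentserviceaccount] FROM WINDOWS;
GO
USE [yourdatabase];
GO
CREATE USER [SOMEDOMAIN\sqlagentserviceaccount] FOR LOGIN [SOMEDOMAIN\sqlagentserviceaccount];
GO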

Additionally, the user will need VIEW DEFINITION permission on the database it will be scripting:

USE yourdatabase;
GO
GRANT VIEW DEFINITION TO [SOMEDOMAIN\sqlagentserviceaccount];

Important! Before you test, you will need to restart the SQL Agent service so it can pick up the new PATH for its user. If you don’t do this, the account won’t be able to use the AWS CLI.
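
For a default instance, you can restart the service from an administrative CMD prompt like so (a named instance’s service will be called SQLAgent$INSTANCENAME instead):

> net stop SQLSERVERAGENT
> net start SQLSERVERAGENT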

Create your directory and script

Create a directory for your script to live in; it will also house the temporary DDL files while the script is running. Technically you could skip all the AWS steps and just store all your files locally, but that wouldn’t be as fun. We created our directory at C:\mssql_schema_backup. The SQL Agent service user will require write access to this folder.
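
Granting that write access from an administrative CMD prompt might look like this (again, substitute your own service account):

> icacls "C:\mssql_schema_backup" /grant "SOMEDOMAIN\sqlagentserviceaccount":(OI)(CI)M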

Finally, create the .bat script that SQL Agent will call to do the work. Ours looks like this:

REM NOTE: the %date% substrings below assume the default US format "Ddd MM/DD/YYYY"
REM (e.g. "Tue 05/14/2019"); adjust the offsets if your locale formats dates differently
set datepath=%date:~-4,4%\%date:~-10,2%\%date:~7,2%

REM create a directory with today's date as its name
mkdir "c:\mssql_schema_backup\sqlservername\%datepath%"

REM use mssql-scripter to write each object's DDL to its own file in the new directory
call mssql-scripter -S sqlservername -d dw --file-per-object --file-path "c:\mssql_schema_backup\sqlservername\%datepath%"

REM switch to code page 437; the aws cli is persnickety about console encoding
chcp 437

REM copy the files up to the bucket
call aws s3 cp "c:\mssql_schema_backup\sqlservername\%datepath%" s3://mssql-ddl-backup/%date:~-4,4%/%date:~-10,2%/%date:~7,2%/sqlservername --recursive

REM delete the local working files (the copies in s3 are the keepers) and party time
rd /s /q "c:\mssql_schema_backup\sqlservername\%date:~-4,4%"

If your script ran successfully, you should see your DDL files in the s3 bucket.
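
You can also check from the command line:

> aws s3 ls s3://mssql-ddl-backup/ --recursive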

Once you confirm the script runs successfully, create a SQL Agent job that runs the .bat file on a schedule. Make sure the step runs as the SQL Server Agent service account.
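
You can build the job in the SSMS GUI, or script it against msdb. A minimal sketch might look like this (the job name and .bat path are placeholders, and the schedule is left to you); note that a CmdExec step runs as the Agent service account by default, which is exactly what we want:

USE msdb;
GO
EXEC dbo.sp_add_job @job_name = N'Backup DDL to s3';
EXEC dbo.sp_add_jobstep
    @job_name  = N'Backup DDL to s3',
    @step_name = N'Run DDL backup script',
    @subsystem = N'CmdExec',
    @command   = N'c:\mssql_schema_backup\ddl_backup.bat';
EXEC dbo.sp_add_jobserver @job_name = N'Backup DDL to s3';
-- Attach a schedule with sp_add_schedule / sp_attach_schedule, or in the GUI.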

This script has been running reliably on our QA and production SQL Servers. If we ever need to investigate an issue that we suspect is related to a schema, user, view, or stored procedure change, we can go back and review previous versions of the objects. And if we wanted to spin up another environment for testing, these scripts would help us quickly recreate the database objects.


Luther Rochester
Sr. Database Administrator