
Amazon S3 Storage for SQL Server Databases

Author: Manguirish Sardesai | July 27, 2017

Amazon Simple Storage Service (S3) provides developers and IT teams with secure, durable, and highly scalable cloud storage. Amazon S3 is easy-to-use object storage with a simple web service interface that can be used to store and retrieve any amount of data from anywhere on the web.

Unlike traditional block and file storage, Amazon S3's object-based storage manages data as objects that bundle the data itself with metadata and are manipulated through an API. S3 is designed for 11 nines (99.999999999%) of object durability and four nines (99.99%) of availability per year.
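
To make the object model concrete, here is a minimal sketch using the AWS Tools for PowerShell (the same toolkit used later in this post). The bucket name, file paths, and metadata key below are hypothetical placeholders, and the example assumes the module is installed and credentials are already configured:

# Upload a file as an object, attaching custom metadata alongside the data itself
Write-S3Object -BucketName 'my-example-bucket' -File 'C:\temp\report.csv' -Key 'reports/report.csv' -Metadata @{ 'department' = 'dba' }

# Retrieve the same object by its key through the API
Read-S3Object -BucketName 'my-example-bucket' -Key 'reports/report.csv' -File 'C:\temp\report-copy.csv'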

Common use cases for Amazon S3 storage in the DBA world include:

  1. Backup and Archive for on-premises or cloud data
  2. Disaster Recovery
  3. Big Data Analytics

Amazon S3 provides a cost-effective and reliable way to store backups, with plenty of options to simplify their management. This blog post shows you how to handle SQL Server backups effectively using Amazon S3.

First step: Set up a new Amazon S3 bucket to store backups using the AWS Management Console.

My earlier article, “How to Set Up an AWS Account,” provides detailed instructions for creating an Amazon AWS account. If you don’t already have an account, it takes only a few minutes to set one up, and the setup can be done on the available free tier.

Once you have your AWS account, you must create a user and an access key pair. These are used by the AWS Tools for PowerShell to access and manipulate objects in AWS. Starting at the AWS console:

  1. Click on the Identity and Access Management (IAM) icon, which brings you to the IAM dashboard.
  2. Create new users and then generate an access key for each user.
  3. You will be prompted to download the credentials using the “Download Credentials” button. Download and save this file in a safe location for future use.
  4. Next, create an S3 bucket. A bucket is simply a container for files stored in Amazon S3. Creating a bucket is straightforward, but the name must follow the DNS naming rules and be globally unique; refer to this link for instructions. This bucket will be used to store backups, so set the retention policy to something like 30-45 days, depending on your needs. The retention policy can be set to any span, but keep in mind that the length of time will affect the cost of storage.
  5. Create a lifecycle rule for the whole bucket that permanently deletes files after, for example, 7 days, so that old backups do not accumulate and drive up your S3 bill. Adjust the time range to suit your needs, or consider transitioning backups to Glacier for cost-effective long-term storage. You can also choose from a variety of lifecycle management options and storage classes as your business needs evolve. A scripted sketch of the bucket setup follows this list.
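
If you prefer to script this part of the setup, the credential profile and the bucket can also be created with the AWS Tools for PowerShell. The following is a minimal sketch, not a definitive procedure: it assumes the module is already installed, uses the same bucket name and region as the backup script later in this post, and takes the keys from the credentials file downloaded in step 3. The lifecycle rule itself is easiest to configure in the S3 console as described above.

# Store the IAM user's keys in a local profile so later cmdlets can use them implicitly
Set-AWSCredential -AccessKey 'USER-ACCESS-KEY' -SecretKey 'USER-SECRET-KEY' -StoreAs 'backup-profile'
Initialize-AWSDefaultConfiguration -ProfileName 'backup-profile' -Region 'us-east-1'

# Create the bucket that will hold the backups (bucket names must be globally unique)
New-S3Bucket -BucketName 's3-dvbackups' -Region 'us-east-1'

# Confirm the bucket exists
Get-S3Bucket -BucketName 's3-dvbackups'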

Second step: Configure your maintenance plan using SQL native backups. Below, we describe how to take the backup and push it to the S3 bucket destination using a PowerShell script.

Below we’ve provided a simple PowerShell script that creates a backup with a timestamp, copies it to S3, and then deletes it from the local machine. Make sure the variables are updated to reference your server, database, S3 bucket name, and backup location. The script assumes Windows Authentication is used to connect to the database. If you are running the script on an EC2 instance and have attached an IAM role with the appropriate S3 policy to the instance, the AccessKey and SecretKey parameters can be removed from the Write-S3Object command.

# Requires the SqlServer (or SQLPS) module and the AWS Tools for Windows PowerShell module
Import-Module SqlServer
Import-Module AWSPowerShell

# Connection and destination settings -- update these for your environment
$server = '.'
$database = 'AWSDB'
$s3Bucket = 's3-dvbackups'
$backupPath = 'C:\dvBackup\'
$region = 'us-east-1'

# accessKey and secretKey can be removed if running on an EC2 instance and using IAM roles for security
$accessKey = 'USER-ACCESS-KEY'
$secretKey = 'USER-SECRET-KEY'

# Build a time-stamped backup file name, e.g. AWSDB-20170727093000.bak
$timestamp = Get-Date -Format yyyyMMddHHmmss
$fileName = "$database-$timestamp.bak"
$filePath = Join-Path $backupPath $fileName

# Take a full native SQL Server backup to the local folder
Backup-SqlDatabase -ServerInstance $server -Database $database -BackupFile $filePath

# Copy the backup to S3; AccessKey and SecretKey can be removed if running on an EC2 instance and using IAM roles for security
Write-S3Object -BucketName $s3Bucket -File $filePath -Key $fileName -Region $region -AccessKey $accessKey -SecretKey $secretKey

# Remove local copies of this database's backups once they are in S3
Remove-Item (Join-Path $backupPath "$database*.bak")
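
Should you ever need one of those backups back, for example in the disaster recovery scenario mentioned earlier, the same modules can pull the file down from S3 and restore it. Here is a minimal sketch that reuses the variables from the script above; the object key and the use of -ReplaceDatabase are illustrative assumptions, so adjust them to your situation:

# Download a specific backup object from S3 to the local backup folder (hypothetical key)
$restoreKey = 'AWSDB-20170727093000.bak'
$restoreFile = Join-Path $backupPath $restoreKey
Read-S3Object -BucketName $s3Bucket -Key $restoreKey -File $restoreFile -Region $region

# Restore it over the original database (or restore into a copy for verification)
Restore-SqlDatabase -ServerInstance $server -Database $database -BackupFile $restoreFile -ReplaceDatabase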

Once the backup script is in place, Windows Task Scheduler can run it on whatever schedule you need. As the script runs, check the S3 bucket to confirm the backups are arriving, because there is no notification if a backup fails.
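
As an illustration, here is a minimal sketch that registers the script with the built-in ScheduledTasks cmdlets and then spot-checks the bucket with Get-S3Object; the script path, task name, and schedule are hypothetical placeholders:

# Run the backup script every night at 2:00 AM under the SYSTEM account
$action = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -ExecutionPolicy Bypass -File C:\dvBackup\Backup-SqlToS3.ps1'
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName 'SqlBackupToS3' -Action $action -Trigger $trigger -User 'SYSTEM' -RunLevel Highest

# Spot-check the most recent objects in the bucket to confirm backups are landing
Get-S3Object -BucketName 's3-dvbackups' -Region 'us-east-1' |
    Sort-Object LastModified -Descending |
    Select-Object -First 5 Key, LastModified, Size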

Now that you’ve seen how S3 provides storage for backups that can also feed analytics and disaster recovery, you can see why Amazon S3 is a game changer for storage in hybrid IT environments and applications.

If you would like more information on Amazon S3 Storage and how it fits into your environment, please contact Datavail today. Datavail is a specialized IT services company focused on Data Management with solutions in BI/DW, analytics, database administration, custom application development, and enterprise applications. We provide both professional and managed services delivered via our global delivery model, focused on Microsoft, Oracle and other leading technologies.

For additional resources, please download our white paper: AWS for SQL Server DBAs.

Datavail Script: Terms & Conditions

By using this software script (“Script”), you are agreeing to the following terms and conditions, as a legally enforceable contract, with Datavail Corporation (“Datavail”). If you do not agree with these terms, do not download or otherwise use the Script. You (which includes any entity whom you represent or for whom you use the Script) and Datavail agree as follows:

1. CONSIDERATION. As you are aware, you did not pay a fee to Datavail for the license to the Script. Consequently, your consideration for use of the Script is your agreement to these terms, including the various waivers, releases and limitations of your rights and Datavail’s liabilities, as set forth herein.

2. LICENSE. Subject to the terms herein, the Script is provided to you as a non-exclusive, revocable license to use internally and not to transfer, sub-license, copy, or create derivative works from the Script, not to use the Script in a service bureau and not to disclose the Script to any third parties. No title or other ownership of the Script (or intellectual property rights therein) is assigned to you.

3. USE AT YOUR OWN RISK; DISCLAIMER OF WARRANTIES. You agree that your use of the Script and any impacts on your software, databases, systems, networks or other property or services are solely and exclusively at your own risk. Datavail does not make any warranties, and hereby expressly disclaims any and all warranties, implied or express, including without limitation, the following: (1) performance of or results from the Script, (2) compatibility with any other software or hardware, (3) non-infringement or violation of third party’s intellectual property or other property rights, (4) fitness for a particular purpose, or (5) merchantability.

4. LIMITATION ON LIABILITY; RELEASE. DATAVAIL SHALL HAVE NO, AND YOU WAIVE ANY, LIABILITY OR DAMAGES UNDER THIS AGREEMENT.

You hereby release Datavail from any claims, causes of action, losses, damages, costs and expenses resulting from your downloading or other use of the Script.

5. AGREEMENT. These terms and conditions constitute the complete and exclusive legal agreement between you and Datavail.
