As you may have heard, I’ve recently taken a new job post-election. I want to reiterate that one of the biggest reasons I was attracted to the position was because of our focus on building solutions in the cloud using Amazon Web Services (AWS). You can dismiss me as having drunk the kool-aid, but I do believe that the cloud is where technology is headed in the long run. And not even that long of a run. I’ve enjoyed learning and working with AWS’ services, and this is the first of what I hope will be many posts on AWS.
With that in mind, today’s T-SQL Tuesday challenge from Wayne Sheffield is to blog about PowerShell as it relates to SQL Server. I don’t have any super-whiz-bang scripts yet, but I did want to share some basic resources so you can begin to explore and utilize PowerShell and AWS in your own environment.
In order to begin using PowerShell with AWS, you’ll need to download and install the AWS SDK for .NET from Amazon. Just click on the large yellow button on the right, then install it. Easy peasy. It works with PowerShell 2.0, but I suggest upgrading to 3.0 if you can.
Once you’ve finished your setup, you’ll want to open PowerShell in Administrator mode by right-clicking on it, and selecting Run as Administrator. In the console, we’ll need to test (and possibly change) your Execution Policy, which will allow you to run PowerShell scripts including your $profile!
# Check the Execution Policy
Get-ExecutionPolicy

# If it returns Restricted, you'll need to change it
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned
After changing your Execution Policy, close and reopen PowerShell so we can create a PowerShell $profile and add some commands to it. You won’t need to Run as Administrator to complete the rest of the blog post. Here’s how you can create a new $profile.
# Test to see if your $profile exists
Test-Path $profile

# If it returns false, run...
New-Item -Path $profile -Type File -Force

# If it returns true or you created a $profile...
notepad $profile
Your new PowerShell $profile will open in Notepad. As I mentioned before, we’re going to add some commands to load the AWSPowerShell module and some default settings so they’ll automatically load each time you open a PowerShell console.
# Load the AWSPowerShell module
Import-Module AWSPowerShell

# Set a default region; mine is us-east-1
Set-DefaultAWSRegion us-east-1
Let’s run down what these do. The first command will import the AWSPowerShell module. This will allow you to utilize the AWS cmdlets, including Set-DefaultAWSRegion. The Set-DefaultAWSRegion cmdlet will store your default AWS region. I’ve chosen us-east-1, since that’s what’s connected to my AWS account, but yours may be different.
So why add these to the $profile? It’s a time saver. Importing the AWSPowerShell module will give you immediate access to the AWS cmdlets without having to remember to import it manually every time you use PowerShell. As for the region, many AWS cmdlets will require you to give them some amount of regional scope. By setting a default region, PowerShell will apply it automatically within a session to all cmdlets (if you don’t specify a different one).
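To make that concrete, here’s a quick sketch. The default region applies silently to every AWS cmdlet in the session, but any individual call can still override it with the common -Region parameter (the cmdlet here is just one example; any AWS cmdlet behaves the same way):

```powershell
# Uses the default region set in the $profile (us-east-1 in my case)
Get-EC2Instance

# Overrides the default for this one call only
Get-EC2Instance -Region us-west-2
```

The session default is never changed by the override; the next call without -Region falls back to us-east-1.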
In order to use AWS cmdlets, you’ll also need your API credentials. The credentials are made up of two strings: the Access Key ID and the Secret Key. You can find them inside of your AWS Console online by hovering over the drop-down menu (your account name), and choosing Security Credentials. You’ll be taken to a new page where you can generate a new API pair or copy-paste your current keys. Needless to say, keep these babies safe.
Once you’ve found (or generated) a new set of credentials, we can store them for ease of use. In the PowerShell console (and not your $profile), you’ll want to use the Set-AWSCredentials cmdlet in order to do this.
Set-AWSCredentials -AccessKey <your_access_key_id> -SecretKey <your_secret_key> -StoreAs firstname.lastname@example.org
You’ll pass in your specific Access Key ID and Secret Key after the corresponding parameters. As for the -StoreAs parameter, this is the alias you can use to call them when issuing a command. In my example, I’m using my email address, but you can use whatever makes sense for your situation. Additionally, you can use the Set-AWSCredentials cmdlet multiple times for each AWS account that you access. For example, I have credentials stored for my personal, work, and customer accounts.
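For instance, storing a separate alias for each account might look like the following. The key values and alias names below are placeholders, not real values; pick aliases that make sense to you:

```powershell
# Store credentials for two different AWS accounts under friendly aliases
Set-AWSCredentials -AccessKey <personal_access_key_id> -SecretKey <personal_secret_key> -StoreAs personal
Set-AWSCredentials -AccessKey <work_access_key_id> -SecretKey <work_secret_key> -StoreAs work
```

Later, you select the account on any given call by passing -StoredCredentials personal or -StoredCredentials work.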
One final note: the credentials are stored outside of your current PowerShell session. This means that you don’t need to store them in your $profile or hard-code them into any scripts that you write. Once you have run the Set-AWSCredentials cmdlet, you’ll be able to pass them to other cmdlets through the -StoredCredentials parameter, as we’ll see below. You can review (and destroy) any stored credentials with the following cmdlets.
# List all available stored credentials
Get-AWSCredentials -ListStoredCredentials

# Remove a stored credential
Clear-AWSCredentials -StoredCredentials firstname.lastname@example.org
What about SQL Server?
I didn’t forget about SQL Server! We know that it’s a good practice to keep backups on a different server than production: if that production server goes down, those backups might be hosed as well. But backing up over the network can be tricky or take a long time, so it’s common to perform the backup to a local drive, then move it once finished. In our example, we’ll begin by backing up AdventureWorks inside of SSMS, then we’ll open up PowerShell, find the backup, and move it to Amazon’s Simple Storage Service (S3).
Opening up SSMS, you can take a regular backup of AdventureWorks. I’ve included some sample T-SQL below. I’m using SQL Server 2012, but you shouldn’t have to modify the code save for changing the name of the database.
BACKUP DATABASE AdventureWorks2012
TO DISK = 'C:\Backups\AdventureWorks_FULL.bak'
WITH CHECKSUM;
Now that we’ve got our backup, you can open up PowerShell. Your $profile should import the AWSPowerShell module and your defaults automatically. Begin by changing to your backup directory.
Set-Location -Path C:\Backups
Once there, you can find the latest backup by using the following cmdlet. I’m storing the info in a variable that I’ve named $file.
$file = Get-ChildItem | Sort-Object LastWriteTime -Descending | Select-Object -First 1
We can push the file to S3 using AWS’ Write-S3Object cmdlet. You’ll need to supply the appropriate S3 bucket name. You can see where we are passing the $file variable to both the -File and -Key parameters: the -File is the path to the actual file we’re uploading, while the -Key is how it will be named in S3. Finally, we end the cmdlet by passing our default credentials to the -StoredCredentials parameter.
Write-S3Object -BucketName Matts_Bucket -File $file.FullName -Key $file.Name -StoredCredentials firstname.lastname@example.org
The great thing about the Write-S3Object cmdlet is that it allows for easy upload to S3. While S3 can store any object up to 5TB (yes, that’s terabytes), the normal upload process would require you to break down objects into 5GB chunks. Using this cmdlet circumvents that limit, and allows you to upload more seamlessly. That isn’t to say that you might not find better throughput by breaking down a large object and attempting to upload pieces concurrently. But if you are looking for a no-nonsense upload command, Write-S3Object is hard to beat.
Finally, once the upload is complete, you can use PowerShell to generate a list of all the objects in your S3 bucket.
Get-S3Object -BucketName Matts_Bucket -StoredCredentials firstname.lastname@example.org
Typically you wouldn’t want to do this by hand: you would want to automate this process. And PowerShell is wonderful for automation. Personally, I’ve integrated this technique into my backup jobs, which rely on Ola Hallengren’s Maintenance Solution because it’s so awesome. Inside of my Agent backup job, I simply include another step (as a CmdExec job type) to run the PowerShell script that will move the backup from the production server to S3. Again, you can schedule this kind of file maintenance to its own proper time slot; this just happens to work for me right now in my current situation.
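Putting the pieces together, that CmdExec job step might call a small script along these lines. This is just a sketch assembled from the examples above; the backup path, bucket name, and credential alias are assumptions you’d swap for your own:

```powershell
# MoveBackupToS3.ps1 -- sketch: upload the newest local backup to S3
Import-Module AWSPowerShell
Set-DefaultAWSRegion us-east-1

# Find the most recent backup file in the local backup folder
Set-Location -Path C:\Backups
$file = Get-ChildItem -Filter *.bak |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1

# Upload it using the stored credential alias from earlier
Write-S3Object -BucketName Matts_Bucket -File $file.FullName -Key $file.Name -StoredCredentials firstname.lastname@example.org
```

Because the credentials are pulled from the stored alias, nothing sensitive needs to be hard-coded in the script or the Agent job step.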
And there you have it! Using PowerShell and AWS to better secure your backups off site. From S3, you could download your backups to a new SQL Server for restore and disaster recovery purposes. As a final note, these AWS cmdlets are new and the online documentation isn’t always the best, so feel free to reach out for help and I’ll do the best I can.
Yikes! An engineer at Amazon reached out to me to let me know that I did a terrible job at security practices in the first cut of this blog post. This is a small note to let you know that I’ve already redrafted the post to take into account the new information I learned based on his recommendations. Also know this: Amazon is watching!