By: Tim Smith | Updated: 2018-05-21 | Comments (5) | Related: > Backup
Problem
We store our daily backups in a backup folder and keep them for four days before removing them. After a few recent disasters in our area, we've attached two shares that we will use to store backups for 60 days, and we'd like to store these as zip files before we move them to the shares. We'd like to automate this task daily after our backup job: gather all of the recent backups, zip them, and copy the zip files to the two shares. How can we do this with PowerShell?
Solution
In this tip, we'll look at automating a zipping routine with PowerShell: we zip a database backup, copy the zipped backup to two folder locations (or shares), and remove the original. We'll add a remove step in case we eventually want to remove zipped backups from the two folder locations.
Validate SQL Server Database and Backup Database
Our first step ensures that we have a valid backup file, so we'll run an integrity check before backing up the database. Once the integrity check passes, we'll back up the database and verify that the backup file is valid. In some environments, limitations prevent us from running an integrity check before the backup. In those environments, one alternative practice is to back up the database, restore the backup file on a different server, run an integrity check there and alert on failure, then drop the restored database. Integrity checks should run as often as our tolerance for data loss demands: the more important our data are, the more often we need to validate their integrity. I've seen situations where integrity checks were skipped and the company later discovered months of corrupt database backups.
---- Verify integrity
DBCC CHECKDB ('Data2017')

---- Set dynamic date string
DECLARE @savestring VARCHAR(250) = 'I:\Backups\Data2017_' + CONVERT(VARCHAR(12), GETDATE(), 112) + '.BAK'

---- Backup
BACKUP DATABASE Data2017 TO DISK = @savestring

---- Verify backup file
RESTORE VERIFYONLY FROM DISK = @savestring
The backup format we use here is for demonstration purposes only, as we may want to include options with our backup, such as encryption, compression, etc. We also want a regular schedule for restoring backups to validate them, retaining the times and dates of when we checked their validity. At a minimum, as a best practice, every database backup should be verified immediately after the file has been backed up, which we see in the final step of the above T-SQL.
Zip SQL Server Database Backup Files
In the above T-SQL, we save the date information in the format of a four-digit year, then a two-digit month, then a two-digit day. First, we want to mirror this format in the zip file name. PowerShell produces this format with the ToString() method on a date, passing "yyyyMMdd" as the format string. We'll use this to keep the zip files in a naming convention similar to the backup file or files. We can see an example of this by running the below script in a PowerShell ISE window or at the PowerShell command line:
(Get-Date).ToString("yyyyMMdd")
In the first script, the design assumes that we have backups from multiple days in the same folder or location. Since we have multiple backup files in our path and we want only the recent ones copied to the file shares, we will create a holding folder (a "file store") into which we copy the recent backup files before zipping. The second script skips this step and assumes that all our backups are stored in a folder per date, eliminating the need to copy recent backup files to a holding folder before compressing.
The zip functionality we'll use comes from the .NET System.IO.Compression.ZipFile class (loaded from the System.IO.Compression.FileSystem assembly), whose CreateFromDirectory method requires two paths: the folder that will be zipped and the full path of the zip file to create. Because we copy all recent backups to the file store, we do not want to keep them after they've been zipped and copied to the shares, so this folder will be removed with all its contents after the zip and copy.
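Before looking at the full scripts, a minimal standalone sketch shows CreateFromDirectory in isolation; the temp folder and file names here are hypothetical placeholders, not paths from the tip:

```powershell
# Minimal sketch of ZipFile.CreateFromDirectory; paths are hypothetical.
Add-Type -AssemblyName System.IO.Compression.FileSystem

$src = Join-Path ([System.IO.Path]::GetTempPath()) "zipdemo_src"
$zip = Join-Path ([System.IO.Path]::GetTempPath()) "zipdemo.zip"

# Build a small folder to compress
New-Item -ItemType Directory -Path $src -Force | Out-Null
Set-Content -Path (Join-Path $src "demo.bak") -Value "backup contents"

# CreateFromDirectory throws if the zip file already exists, so clear any prior run
if (Test-Path $zip) { Remove-Item $zip -Force }

# First argument: the folder to zip; second: the full path of the zip file to create
[System.IO.Compression.ZipFile]::CreateFromDirectory($src, $zip)

Test-Path $zip
```

Note that the method refuses to overwrite an existing archive, which is why the sketch removes any leftover zip first; the scripts below rely on the date-stamped file name being new each day.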
### Script 1
Function ZipCopy-BAKs {
    Param(
        [Parameter(Mandatory=$true)][string]$filepath,
        [Parameter(Mandatory=$true)][string]$store,
        [Parameter(Mandatory=$true)][string]$share1,
        [Parameter(Mandatory=$false)][string]$share2
    )
    Process {
        # Name the zip file after today's date, e.g. 20180521.zip
        $zip = (Get-Date).ToString("yyyyMMdd") + ".zip"

        # Start with an empty holding folder (the file store)
        if (Test-Path $store) { Remove-Item $store -Recurse -Force }
        New-Item -ItemType Directory -Path $store | Out-Null

        # Copy only backups created within the last 12 hours to the holding folder
        $allbakfiles = Get-ChildItem $filepath -Filter *.Bak | Where-Object { $_.CreationTime -gt (Get-Date).AddHours(-12) }
        foreach ($bakfile in $allbakfiles) {
            Copy-Item -Path $bakfile.FullName -Destination $store
        }

        # Zip the holding folder to the first share, and to the second if one was supplied
        Add-Type -AssemblyName System.IO.Compression.FileSystem
        [System.IO.Compression.ZipFile]::CreateFromDirectory($store, "$share1$zip")
        if ($share2) { [System.IO.Compression.ZipFile]::CreateFromDirectory($store, "$share2$zip") }

        # Remove the holding folder and all of its contents
        Remove-Item $store -Recurse -Force
    }
}

ZipCopy-BAKs -filepath "I:\backups\" -store "I:\backups\store\" -share1 "H:\backups\bakshare\" -share2 "G:\backups\bakshare\"
### Script 2
Function ZipCopy-BAKs {
    Param(
        [Parameter(Mandatory=$true)][string]$filepath,
        [Parameter(Mandatory=$true)][string]$share1,
        [Parameter(Mandatory=$false)][string]$share2
    )
    Process {
        # Name the zip file after today's date, e.g. 20180521.zip
        $zip = (Get-Date).ToString("yyyyMMdd") + ".zip"

        # Zip the backup folder directly to the first share, and to the second if one was supplied
        Add-Type -AssemblyName System.IO.Compression.FileSystem
        [System.IO.Compression.ZipFile]::CreateFromDirectory($filepath, "$share1$zip")
        if ($share2) { [System.IO.Compression.ZipFile]::CreateFromDirectory($filepath, "$share2$zip") }
    }
}

ZipCopy-BAKs -filepath "I:\backups\" -share1 "H:\backups\bakshare\" -share2 "G:\backups\bakshare\"
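As an aside not covered in the scripts above, PowerShell 5.0 and later also ship a built-in Compress-Archive cmdlet that can stand in for the direct .NET call in simple cases; the temp folder and file names below are hypothetical:

```powershell
# Alternative to the .NET ZipFile call, using the built-in Compress-Archive cmdlet
# (PowerShell 5.0+). Paths and names here are hypothetical placeholders.
$stamp = (Get-Date).ToString("yyyyMMdd")
$src   = Join-Path ([System.IO.Path]::GetTempPath()) "bakdemo"
$zip   = Join-Path ([System.IO.Path]::GetTempPath()) "$stamp.zip"

New-Item -ItemType Directory -Path $src -Force | Out-Null
Set-Content -Path (Join-Path $src "demo.bak") -Value "backup contents"

# -Force overwrites an existing archive, unlike ZipFile.CreateFromDirectory
Compress-Archive -Path (Join-Path $src "*") -DestinationPath $zip -Force

Test-Path $zip
```

Compress-Archive is simpler for basic folder-to-zip jobs, while the ZipFile class works on older PowerShell versions (once the assembly is loaded) and offers finer control.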
In the first script, we filter on the files' creation dates to ensure that we only pick up recent backup files. If this zip-and-copy step follows a backup job (in a SQL Server Agent job, as an example - see the image below this), this ensures that we only get the backups from today. If we need to extend or shorten the time window, we can add or subtract hours based on what fits our environment:
$allbakfiles = Get-ChildItem $filepath -Filter *.Bak | Where-Object { $_.CreationTime -gt (Get-Date).AddHours(-12) }
For example, if we wanted to get all files from the last six hours, we would just change this line to:
$allbakfiles = Get-ChildItem $filepath -Filter *.Bak | Where-Object { $_.CreationTime -gt (Get-Date).AddHours(-6) }
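To see the time-window filter in isolation, here is a self-contained sketch. It backdates LastWriteTime rather than CreationTime (backdating CreationTime is reliably supported only on Windows/NTFS), but the Where-Object logic is the same shape as in the scripts above; the temp folder and file names are hypothetical:

```powershell
# Demonstrates the time-window filter; folder and file names are hypothetical.
$dir = Join-Path ([System.IO.Path]::GetTempPath()) "bakfilter"
New-Item -ItemType Directory -Path $dir -Force | Out-Null
New-Item -ItemType File -Path (Join-Path $dir "new.bak") -Force | Out-Null
New-Item -ItemType File -Path (Join-Path $dir "old.bak") -Force | Out-Null

# Backdate one file so it falls outside the 12-hour window
(Get-Item (Join-Path $dir "old.bak")).LastWriteTime = (Get-Date).AddHours(-24)

# Same filter shape as the scripts above, applied to the file timestamp
$recent = Get-ChildItem $dir -Filter *.bak |
    Where-Object { $_.LastWriteTime -gt (Get-Date).AddHours(-12) }

$recent.Name   # new.bak
```

Only the file written within the window survives the filter, which is exactly how the first script limits the holding folder to today's backups.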
As long as this step follows the backup step, however we choose to run it (Task Scheduler, a SQL Server Agent job, etc.), the backups will complete before this step runs. If we do not run it immediately after our backup step - for example, running our backups in a SQL Server Agent job but running our zip-and-copy in Task Scheduler - we will want to make sure the time window covers a backup that runs longer than expected, such as a window of up to 23 hours.
Next Steps
- Organize your backup locations by folder: everything together, by the specific name of the database, by a specific time, or by a combination of these. Either of the above scripts will function in most environments, provided that the database backups are within a folder.
- While this tip covers ways in which we can have multiple copies of compressed backups, we want to still follow strong backup best practices in T-SQL by always checking integrity on our database and validating our backup file after the backup. Our environment may require other options specified during the backup process such as encrypting backups, compressing them with T-SQL, adding checksums, etc.
- PowerShell scripts can be executed within SQL Server Agents or as Windows Scheduled Tasks.