By: Aaron Nelson | Updated: 2019-04-25 | Comments (1) | Related: > Containers
Problem
Docker can be an easy way to set up SQL Server databases for a demo or to test out new features in the next version of SQL Server. However, when building these containers, you may want to persist your database's data files outside of the container itself, so that you can quickly attach the database to a different instance with minimal effort. Alternatively, you may just want to be able to access .bak files without having to copy them inside the container, where they take up room.
Solution
This guide to building Docker containers with external storage is intended for data professionals who want to use their Windows 10 machine to try out the "Official images for Microsoft SQL Server on Linux for Docker Engine". Many of these steps work exactly the same if you're running macOS; just note that if you're following along on a Mac, you will need to adjust the location of the external storage.
In the previous article, we walked through building a container which stores the database files inside of the container.
This article will focus on storing the database files outside of the container, inside a directory on the host machine which is running the docker image.
Prerequisites
- Download Docker Desktop for Windows: https://hub.docker.com/editions/community/docker-ce-desktop-windows
- Install the latest SqlServer PowerShell module from the PowerShell Gallery: (run the code below in your favorite PowerShell editor)
Install-Module SqlServer
- If you didn't step through the first article, download the AdventureWorks2016 database: https://github.com/Microsoft/sql-server-samples/releases/download/adventureworks/AdventureWorks2016.bak
Again, you can use this PowerShell code to download the AdventureWorks2016.bak file:
$BakURL = "https://github.com/Microsoft/sql-server-samples/releases/download/adventureworks/AdventureWorks2016.bak"; $BakFile = "$($Home)/Downloads/AdventureWorks2016.bak"; Invoke-WebRequest -Uri $BakURL -OutFile $BakFile;
If you're on Windows, make sure the Docker Desktop icon is visible in your system tray; that confirms Docker is up and running.
Optional – Just to make your experience match my screenshots
- Download Azure Data Studio (ADS): aka.ms/GetAzureDataStudio
- Add the PowerShell extension from the Marketplace in Azure Data Studio
- If you're on a Mac, you will need PSCore 6.2+ if you want to use the Invoke-Sqlcmd cmdlet: aka.ms/pscore6
- Download & install the Docker .VSIX extension into your ADS. Click on "Download Extension" on this page: https://marketplace.visualstudio.com/items?itemName=PeterJausovec.vscode-docker
Again, using the Docker .VSIX extension is not mandatory; you could even use a different one. I use it as a visual cue in case I forget and leave a Docker container running, or just to get a quick list of which containers are running.
Building a SQL-on-Linux Docker Container Using External Storage
The first article, with internal storage, was pretty easy, wasn't it? But you might want the option to keep your database files separate, so you can dispose of your container and build a new one on a whim, without having to back up all your databases or go inside the container and copy them all out. In this article, we'll create a Host Mount directory so that you never even have to go inside your container if you don't want to! This Host Mount directory is essentially like mapping a drive from inside the container to a folder on the hard drive of your host machine. Although, those probably aren't the terms Linux people would use, so I bet if any of them are reading this article, they probably want to strangle me right now.
This article involves a bit more code than the first article. For that reason, you may want to copy the code out of this zip file which has all the steps, instead of copying each step individually.
Steps
Step 0 - Decide which directory you want to store these database files in, and make sure the directory exists. In the subsequent steps you will likely want to swap out the directory I use (C:\SQLData\Docker\SQLDev02) for something different.
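For example, on Windows you could run New-Item -ItemType Directory C:\SQLData\Docker\SQLDev02 -Force in PowerShell. If you're following along on a Mac, a POSIX-shell sketch (using a hypothetical path under your home directory, not the one from my screenshots) looks like this:

```shell
# Create the host directory that the container will mount.
# The path here is just an example; substitute whatever directory you chose.
DATADIR="$HOME/SQLData/Docker/SQLDev02"
mkdir -p "$DATADIR"
ls -d "$DATADIR"
```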
Step 1 - Create a second container named testcontainer02, this time with some extra code to create that Host Mount directory.
docker run -d -p 10002:1433 -v C:\SQLData\Docker\SQLDev02:/sqlserver -e ACCEPT_EULA=Y -e SA_PASSWORD=Test1234 --name testcontainer02 microsoft/mssql-server-linux
At this point, you will probably receive a pop-up dialog from Docker Desktop confirming that you want to create this mapping and asking for a set of credentials which can access that folder. If so, go ahead and enter your credentials. However, depending on your setup, you may instead receive an error message that says "Error response from daemon: Drive has not been shared." If that happens, right-click the Docker icon in your system tray, choose Settings, click Shared Drives, check the box to share whichever drive you are using, and click Apply; you will then be prompted to enter your credentials.
Once you're done with that have a look at Docker Explorer in ADS again because you've already got your second container up & running!
Reminder: Since I'm planning to run multiple containers at the same time, I'm using C:\SQLData\Docker\SQLDev02 as my path. Your path doesn't have to be quite this elaborate.
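If you want to double-check the mapping before going any further, you can list the /sqlserver directory from inside the container. This is a sketch assuming the testcontainer02 container from Step 1 is up and running on your machine:

```shell
# List the mounted directory from inside the running container;
# it should show the contents of C:\SQLData\Docker\SQLDev02 on the host
# (empty for now, until we copy a .bak file in).
docker exec testcontainer02 ls -l /sqlserver
```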
Step 2 - Now, let's load that container up with some databases. Before we can do that, make sure you have copied your AdventureWorks2016.bak file into C:\SQLData\Docker\SQLDev02, or whatever directory you're using. The code below may help you; just modify it for whichever directory you're using.
Copy-Item -Path "$($Home)\Downloads\AdventureWorks2016.bak" -Destination C:\SQLData\Docker\SQLDev02
Quick troubleshooting note: Besides using the Docker Explorer extension in Azure Data Studio to check the status of your container, you can also use the docker ps command to check to see how many containers you have running. If that command doesn't return any results, you can run docker ps -a to see if your container started, but then exited with an error.
docker ps -a
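If docker ps -a shows that your container exited, the container's log usually explains why (an SA password that doesn't meet SQL Server's complexity requirements is a common culprit). Assuming the container name used above:

```shell
# Print the container's stdout/stderr, which includes SQL Server's startup messages and errors
docker logs testcontainer02
```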
If you're the untrusting type, like me, run the code below just to make sure that your second container is up & running and that there's no database just yet.
Get-SqlDatabase -ServerInstance 'localhost,10002' -Credential (Get-Credential sa)
Step 3 - Thanks to that Host Mount we can skip straight to restoring our database!
#Requires -Modules SqlServer
$RelocateData = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile('AdventureWorks2016_Data', '/sqlserver/AdventureWorks2016_Data.mdf')
$RelocateLog = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile('AdventureWorks2016_Log', '/sqlserver/AdventureWorks2016_Log.ldf')
Restore-SqlDatabase -ServerInstance 'localhost,10002' -Credential (Get-Credential sa) -BackupFile '/sqlserver/AdventureWorks2016.bak' -Database 'AdventureWorks2016' -RelocateFile @($RelocateData,$RelocateLog)
Now run that command again to check and make sure your database was in fact restored.
Get-SqlDatabase -ServerInstance 'localhost,10002' -Credential (Get-Credential sa)
You're done creating your containers. Next, go to Data Explorer (Ctrl + G in Azure Data Studio) and add a New Connection to testcontainer02. This allows you to query the container directly and use IntelliSense, instead of having to type your queries from memory in the PowerShell terminal.
After you have connected, you should be able to see your SQL-on-Linux container in an Object Explorer like experience, and be able to see the AdventureWorks2016 database you restored.
Of course, just restoring a single database isn't much fun. Next, we'll download four of the AdventureWorksDW .bak files from GitHub and restore them to your SQL-on-Linux container instance. We will leverage the -AutoRelocateFile parameter of the Restore-SqlDatabase cmdlet, which means we don't have to go to the effort of changing the path for every single data and log file of each database. As long as the data and log files of the database we are restoring do not already exist in the default data and log directories, we won't have to specify any additional information; the cmdlet relocates the files for us.
This also allows us to cover one more scenario: Where we want the database files to exist within the container, but we don't want to take up room inside the container with the .bak files. We'll still use that Host Mount to present the .bak file to the SQL Server instance.
Quick tip before you run this command: Copy the sa password into your clipboard so you can paste it in when prompted by the Get-Credential window.
Step 4 - Run the code below from the script pane in Azure Data Studio by highlighting the code, then hitting the F8 key. (After you highlight the code, you can also right-click and choose Run Selection.)
<# 4) This portion grabs the AdventureWorksDW sample database releases from GitHub,
   downloads each .bak file, then restores it to the instance specified. #>
$releases = Invoke-RestMethod https://api.github.com/repos/microsoft/sql-server-samples/releases

$BaksToDownload = ($releases |
    Where-Object { $_.name -eq 'AdventureWorks sample databases' -or $_.name -eq 'Wide World Importers sample database v1.0' }).assets |
    Where-Object { $_.name -like 'AdventureWorksDW201*bak' -and $_.name -notlike '*EXT*' } |
    Select-Object name, browser_download_url, size, updated_at

foreach ( $BakInfo in $BaksToDownload ) {
    "$($BakInfo.name)"
    Invoke-WebRequest -Uri $BakInfo.browser_download_url -OutFile "C:\SQLData\Docker\SQLDev02\$($BakInfo.name)"
    Restore-SqlDatabase -ServerInstance 'localhost,10002' -Credential (Get-Credential sa) -BackupFile "/sqlserver/$($BakInfo.name)" -Database ($BakInfo.name -replace '.bak') -AutoRelocateFile
}
After the databases have finished restoring, refresh the Databases node of TestContainer02 in Data Explorer to see them all. From here you can right-click on any of the databases and start a new query.
Of course, you may be curious how much space we're taking up with all these databases we've just restored. Not that much, actually. Run the following command to see how much space you have used up so far:
docker ps -s
In summary, we now have a way to spin up multiple SQL-on-Linux Docker containers, with multiple databases and our choice of storage option, all in a matter of minutes. And once the .bak downloads have completed, we can spin up even more containers and blow them away as often as we want, in mere seconds.
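As a sketch of that dispose-and-rebuild workflow (using the container name from the steps above): stopping and removing the container leaves the files in C:\SQLData\Docker\SQLDev02 untouched, ready to be mounted by the next container you create.

```shell
# Stop and remove the container; the host-mounted database and .bak files survive
docker stop testcontainer02
docker rm testcontainer02
```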
Next Steps
- If you haven't done so already, check out the first article in this series on using only internal storage for the Docker Container.
- Download this zip file with all of the code in this article.
- Docker Commands with examples for SQL Server DBAs.
About the author
This author pledges the content of this article is based on professional experience and not AI generated.