10 Backups

It’s hard to overstate just how important backups are to data professionals. Not only can backups help businesses recover from data loss, they can even stop a business from going under in the event of a disaster.

Data is always at risk of loss, whether through accidental deletion, intentional sabotage, natural disasters, malware, or other threats. That is why well-developed backup and backup-testing plans are arguably one of the most important aspects of business continuity.

Backups can also help us evolve and accelerate our day-to-day operations. They can be used to provide a way for developers to easily reset databases to a known state and can be integrated into DevOps practices as part of automated testing pipelines. They also help us when we perform side-by-side migrations or need to rebuild an entire server.

dbatools can help simplify managing all three types of SQL Server backups, defined in table 10.1.

Table 10.1 Backup types

Backup type    Description                                                      Example schedule
-----------    -----------                                                      ----------------
Full           Backs up the entire database.                                    Weekly
Differential   Backs up database changes since the last full backup was         Daily
               performed.
Log            Backs up the transaction log; in addition to enabling            At least hourly
               point-in-time restores, transaction log backups keep your
               log files reasonable in size.

In this chapter, you’ll learn how to easily create, manage, and test backups across your entire SQL Server estate using dbatools backup-related commands, which work with all versions of SQL Server from SQL Server 2000 all the way through Azure. You can then use these backups for automated disaster recovery, which we’ll cover in chapter 14.

You’ll learn how to create backups on-premises, in Azure, and in Docker. We’ll also cover how to easily test your backups using Test-DbaLastBackup.

10.1 Creating backups

We consider creating and testing backups to be two of the most important tasks that a DBA can perform. When we inherit a SQL Server environment, backups are the first thing we look for, and the first thing we create and schedule if they are missing or do not meet business requirements or expectations.

Scheduling and managing backups

dbatools does not provide an all-encompassing backup scheduling and management package. For that, we recommend Ola Hallengren’s Maintenance Solution, which can be found at ola.hallengren.com. You can install Ola’s scripts using Install-DbaMaintenanceSolution.
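If you’d like a head start, a call along the following lines installs Ola’s scripts; the target database and the -InstallJobs switch are choices we’ve made for this sketch rather than requirements:

PS> Install-DbaMaintenanceSolution -SqlInstance sql01 -Database master -InstallJobs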

We try to make using dbatools (and, thus, PowerShell) as simple as possible by default, while also providing the flexibility to expand the command usage to suit your needs. This allows you to ease into using dbatools as you grow your PowerShell skill set. To create backups for every database on an instance, for example, you need to know only the SQL Server instance name, as demonstrated in the next listing.

Listing 10.1 Backing up an entire instance

PS> Backup-DbaDatabase -SqlInstance sql01

The code in listing 10.1 creates full backups of all databases on sql01 and saves them to the SQL Server instance’s default backup directory. Because a backup file name was not specified, the filename for each database will be automatically generated using the database name and the current date and time.

If you’d prefer to add an explicit path to control where the backups are stored, you can use the -Path parameter, as seen in listing 10.2. This allows you to back up files to centralized storage or a larger local disk, as long as the SQL Server service account has write permissions to the specified path. We prefer to back up to centralized storage (a fileshare that stores all the other backups), then back up that centralized storage to an off-site location. This helps make your SQL Server estate more resilient to data loss and, specifically, ransomware attacks.

Listing 10.2 Backing up an entire instance to a specific path

# Note: the SQL Server service account must have write permissions
# to \\nas\sqlbackups
PS> Backup-DbaDatabase -SqlInstance sql01 -Path \\nas\sqlbackups
 
SqlInstance Database       Type TotalSize DeviceType Start
----------- --------       ---- --------- ---------- -----
SQL01       AdventureWorks Full 183.09 MB Disk       2020-10-25 14:32:47...
SQL01       master         Full 4.39 MB   Disk       2020-10-25 14:33:30...
SQL01       model          Full 2.58 MB   Disk       2020-10-25 14:33:32...
SQL01       msdb           Full 16.08 MB  Disk       2020-10-25 14:33:34...

By default, limited output is displayed by the backup command. Similar to SharePoint lists, which have a default view and do not show all of the information available, PowerShell can also show limited information by default. This helps keep command output tidy and unintimidating. The information is still available and can be used for further actions once the command completes. If you’d like to see all of the information, you can pipe the results of the command to Select-Object *.

Consider the code in our first “Try it now.” Here, we back up a single database using the -Database parameter and compress it using the -CompressBackup parameter. Then we expand the results to show all fields using Select-Object *.

Try it now 10.1

Create a single, compressed database backup using the -CompressBackup parameter, then use Select-Object * to see all available output:

Backup-DbaDatabase -SqlInstance sql01 -Database pubs -CompressBackup |
Select-Object *
 
BackupComplete        : True
BackupFile            : pubs_202112121657.bak
BackupFilesCount      : 1
BackupFolder          : S:\backup
BackupPath            : S:\backup\pubs_202112121657.bak
DatabaseName          : pubs
Notes                 :
Script                : BACKUP DATABASE [pubs] TO  DISK =
                        N'S:\backup\pubs_202112121657.bak' WITH NOFORMAT,
                        NOINIT, NOSKIP, REWIND, NOUNLOAD,  STATS = 1
Verified              : False
ComputerName          : workstation
InstanceName          : MSSQLSERVER
SqlInstance           : WORKSTATION
AvailabilityGroupName :
Database              : pubs
UserName              : adsqldba
Start                 : 12/12/2021 4:57:00 PM
End                   : 12/12/2021 4:57:00 PM
Duration              : 00:00:00
Path                  : {S:\backup\pubs_202112121657.bak}
TotalSize             : 3.77 MB
CompressedBackupSize  : 3.77 MB
CompressionRatio      : 1
Type                  : Full
BackupSetId           : 3313
DeviceType            : Disk
Software              : Microsoft SQL Server
FullName              : {S:\backup\pubs_202112121657.bak}
FileList              : {@{FileType=D; LogicalName=pubs;
                        PhysicalName=D:\mssql\pubs.mdf},
                        @{FileType=L; LogicalName=pubs_log;
                        PhysicalName=D:\mssql\pubs_log.ldf}}
Position              : 1
FirstLsn              : 42000000031200001
DatabaseBackupLsn     : 42000000019200001
CheckpointLsn         : 42000000031200001
LastLsn               : 42000000033600001
SoftwareVersionMajor  : 15
IsCopyOnly            : False
LastRecoveryForkGUID  : 1bb67654-499c-496a-aeb0-b0de6ad75291
RecoveryModel         : FULL
KeyAlgorithm          :
EncryptorThumbprint   :
EncryptorType         :

Now you have a better idea of the output that you can work with and further process if desired. For instance, you can create a script that will attempt another backup if BackupComplete is false, or you can calculate the total space used by all backups.
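As a rough sketch of that first idea, the following retries any database whose backup did not complete and then totals the space used by the rest. The property names come from the output shown above; summing the Byte property of TotalSize is an assumption about the dbatools size type.

PS> $results = Backup-DbaDatabase -SqlInstance sql01
# Retry only the databases whose first attempt did not complete
PS> $results | Where-Object BackupComplete -eq $false | ForEach-Object {
        Backup-DbaDatabase -SqlInstance $PSItem.SqlInstance -Database $PSItem.Database
    }
# Total bytes used by the backups that completed
PS> ($results | Where-Object BackupComplete).TotalSize |
     Measure-Object -Property Byte -Sum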

Working with SQL Server Express limitations

SQL Server Express does not include SQL Server Agent. This means that a different solution is required to schedule backups of important databases. Consider using Task Scheduler with dbatools. We’ve created scheduled backups of SQL Express databases using this method and have found it to be a suitable solution.
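Here’s a rough sketch of what that can look like with the built-in ScheduledTasks module; the task name, schedule, instance name, and backup path are all placeholders, not recommendations:

PS> $splatAction = @{
    Execute  = "powershell.exe"
    Argument = '-NoProfile -Command "Backup-DbaDatabase -SqlInstance localhost\SQLEXPRESS -Path D:\backups"'
}
PS> $action = New-ScheduledTaskAction @splatAction
PS> $trigger = New-ScheduledTaskTrigger -Daily -At 2am
PS> Register-ScheduledTask -TaskName "SqlExpressBackups" -Action $action -Trigger $trigger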

When setting up automatic seeding with availability groups or testing your backup performance, you may find yourself needing to back up a database to the NUL device. Backing up to NUL is supported in dbatools and can be accomplished using the command shown next. Note that the BackupFileName is NUL.

Listing 10.3 Backing up to the NUL device

PS> Backup-DbaDatabase -SqlInstance sql01 -Database pubs -BackupFileName NUL

Alternatively, if you do not want to immediately back up the database, but you do want the T-SQL code to perform backups at a later time, you can use the -OutputScriptOnly parameter. You can also use Get-DbaDatabase to find databases that match certain criteria and create backup scripts only for those databases. This can be seen in the following code listing, which creates backup scripts for databases with names that include the word “factory,” such as PastaFactory. Note that the match is case-insensitive.

Listing 10.4 Outputting backup scripts for databases including the word “factory”

PS> $bigdbs = Get-DbaDatabase -SqlInstance sql01 | Where Name -match factory
PS> $bigdbs | Backup-DbaDatabase -Path \\nas\sqlbackups -OutputScriptOnly

Remember, you can also pipe these results to clip, Set-Clipboard, or Out-File to use or store elsewhere for a later date. We also support encrypted backups, and we’ll cover that in chapter 24.
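For example, the following writes the generated T-SQL to a file you could run later; the output path is just an illustration:

PS> $bigdbs | Backup-DbaDatabase -Path \\nas\sqlbackups -OutputScriptOnly |
     Out-File -FilePath C:\temp\factory-backups.sql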

10.1.1 Azure

dbatools provides extensive support for backing up databases directly to Azure Storage using native SQL Server functionality. Azure is an ideal option for off-site storage, and off-site storage of backups is a proven solution for recovering from data loss caused by ransomware attacks.

You can use two methods for backing up directly to Azure: access keys and shared access signatures. Although we recommend shared access signatures because they are more flexible and Microsoft recommends them, we do our best to support a wide range of scenarios, and dbatools developer Stuart Moore has made it possible for dbatools to support both methods. Each method is managed by creating a corresponding SQL Server credential, then using that credential to back up your database to Azure blobs.

Shared access signatures

Shared access signatures (SAS) are time-restricted tokens that allow you to securely back up databases to Azure storage resources using native SQL Server functionality. You can read more about shared access signatures at sqlps.io/sas.

One of the most painful parts of working with shared access signatures is figuring out the exact signature required for your SQL Server credential. Microsoft doesn’t make this easy, because most documentation includes an extra leading character (“?”) that makes the SAS invalid within SQL Server. For our memory and yours, we’ll say it plainly: remove the leading question mark from any SAS before handing it to SQL Server. See the next listing for an example of an SAS.

Listing 10.5 Example shared access signature

sv=2019-10-10&st=2020-10-20T20%3A00%3A50Z&se=2030-10
 -21T20%3A00%3A00Z&sr=c&sp=racwxl&sig=REsaQ4RYVx%2F5jEckVLTtAIWNoyu7T%2Fy%2
 BhstNHAmykW0%3D
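If you script the token retrieval, a small sketch like this (assuming the raw token is already in $sasToken) strips the leading question mark and produces a SecureString you could use in place of the Get-Credential prompt shown later in listing 10.6:

# Assumption: $sasToken holds the raw token from the Az module or the portal
PS> $sasToken = $sasToken.TrimStart("?")
PS> $securePassword = ConvertTo-SecureString -String $sasToken -AsPlainText -Force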

When creating a SQL Server credential, the -Identity must be the phrase SHARED ACCESS SIGNATURE, though it is not case-sensitive, so shared access signature is also valid. In addition, the -Name must be the URL to the container in your storage account, for example, https://STORAGEACCOUNTNAME.blob.core.windows.net/CONTAINERNAME.

To perform a backup to Azure, acquire a SAS using the Az.Storage PowerShell module, Azure Storage Explorer, or the Azure Portal, then execute the code in listing 10.6. Chrissy prefers getting SASs from the Azure Storage Explorer GUI when testing, whereas Rob prefers using New-AzStorageBlobSASToken within the Az.Storage PowerShell module in production and within DevOps pipelines.

When executing the code in the following code listing, paste in the SAS when prompted by Get-Credential. This allows you to securely paste and use the password. Note that the username in the Get-Credential prompt won’t be used. What we really care about is the password. This “trick” works for passing SecureStrings to all PowerShell commands, and not just those within dbatools.

Listing 10.6 Creating a SAS credential and backing up to Azure

PS> $splatCredential = @{
    SqlInstance = "sql01"
    Name = "https://acmecorp.blob.core.windows.net/backups"
    Identity = "SHARED ACCESS SIGNATURE" 
    SecurePassword = (Get-Credential).Password
}
PS> New-DbaCredential @splatCredential
 
PS> $splatBackup = @{
    SqlInstance = "sql01" 
    AzureBaseUrl = "https://acmecorp.blob.core.windows.net/backups/"
    Database = "mydb"
    BackupFileName = "mydb.bak" 
    WithFormat = $true
}
PS> Backup-DbaDatabase @splatBackup

The example in listing 10.6 will result in a backup being created at acmecorp.blob.core.windows.net/backups/mydb.bak. You do not need to specify a credential for Backup-DbaDatabase because SQL Server associates the -AzureBaseUrl with the -Name of the credential when the -Identity is SHARED ACCESS SIGNATURE.

Access keys

Within dbatools, access keys are considered legacy credentials because SAS is the recommended method. However, because SQL Server 2012 and 2014 do not support SAS, dbatools supports access keys as well. If your SQL Server instances require you to use access keys, the next code snippet shows the approximate format of an access key.

Listing 10.7 Example access key

XYZ8dB1R4c/L7VVkqK5KloLWhBTA0EBoA6kNwPYbyCf2LtoinlQpbmt14N1lmwdgP9eyHcFgsNMHl
JZQuYBTOg==

Backing up to Azure using access keys is very similar to the method used to back up using SASs. To perform a backup to Azure using access keys, you can acquire the key using Azure Storage Explorer, the Azure Portal, or Get-AzStorageAccountKey within the Az.Storage PowerShell module, then execute the code in listing 10.8.
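If you go the Az.Storage route, retrieving the key might look like the following sketch; the resource group name and the use of the first key are assumptions for illustration. The resulting SecureString could stand in for the (Get-Credential).Password prompt in listing 10.8.

PS> $splatKey = @{
    ResourceGroupName = "acme-rg"
    Name = "acmecorp"
}
PS> $key = (Get-AzStorageAccountKey @splatKey)[0].Value
PS> $securePassword = ConvertTo-SecureString -String $key -AsPlainText -Force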

Like with SASs, you’ll need to create a SQL Server credential and then use that credential when performing the backup. Unlike with SASs, you can -Name the SQL Server credential whatever you wish, though you do need to follow a specific format for the -Identity. Specifically, you must ensure that the identity matches the storage account name, which is the first segment of your Azure Blob Storage’s base URL, as shown in table 10.2.

Table 10.2 Example credential identities

Example Azure storage URL           Corresponding credential identity
-------------------------           ---------------------------------
acmecorp.blob.core.windows.net      acmecorp
dbatools.blob.core.windows.net      dbatools
pubsinc.blob.core.windows.net       pubsinc

Once you have your access key, use this naming guideline to create a SQL Server credential, as demonstrated in the following code listing. After the credential is created, you must supply the credential name to the -AzureCredential parameter.

Listing 10.8 Creating an access key and backing up to Azure

PS> $splatCredential = @{
    SqlInstance = "sql01"
    Name = "AzureAccessKey"
    Identity = "acmecorp" 
    SecurePassword = (Get-Credential).Password
}
PS> New-DbaCredential @splatCredential
 
PS> $splatBackup = @{
    SqlInstance = "sql01" 
    AzureCredential = "AzureAccessKey"
    AzureBaseUrl = "https://acmecorp.blob.core.windows.net/backups/"
    Database = "mydb"
}
PS> Backup-DbaDatabase @splatBackup

This will create a backup within the acmecorp.blob.core.windows.net/backups/ blob storage.

10.1.2 Docker

Backups can easily be taken within SQL Server on Linux using dbatools. SQL Server on Linux is often accessed via a Docker container, which commonly publishes the container’s SQL Server port to a local port other than 1433. dbatools supports connecting to specific ports using both colons and commas, as shown in the next code sample.

Listing 10.9 Backing up databases using an alternative port

PS> Backup-DbaDatabase -SqlInstance localhost:14433 -SqlCredential sqladmin
 -BackupDirectory /tmp

dbatools automatically transforms localhost:14433 to a format that SQL Server understands: localhost,14433. This means you do not have to use quotes around the SQL Server instance name. Passing an instance name that contains a comma without quotes would result in PowerShell interpreting the instance as an array (two instances) and creating an error.
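If you do prefer the comma format, quote it so PowerShell treats it as a single string, mirroring listing 10.9:

PS> Backup-DbaDatabase -SqlInstance "localhost,14433" -SqlCredential sqladmin -BackupDirectory /tmp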

When creating a backup within Docker, remember that the path is relative to the SQL Server, not your own workstation. So, in the example in listing 10.9, the backup will be created within the container in the /tmp directory.

If you have shared a Windows folder with Docker, you can also back up to this shared folder by using the mount point within the container. Say you have shared the Windows folder S:\backups and mounted it as /shared/backups. You’d use the following code to back up to S:\backups using the /shared/backups path within Docker.

Listing 10.10 Backing up to a shared folder

PS> Backup-DbaDatabase -SqlInstance localhost:14433 -SqlCredential sqladmin
 -BackupDirectory /shared/backups

This is a behavior native to Docker, but we wanted to highlight it in case you were curious as to the support for this scenario.

10.2 Reading backup files

We also offer commands that help you manage backup files, such as Read-DbaBackupHeader.

Try it now 10.2

Find all dbatools commands with backup in the name:

Get-Command -Module dbatools -Name *backup*

If you find an old backup and want to know more about it, you can get details such as the original server name, the creation date, the database version, the file list, and more. Interestingly enough, access to a SQL Server is required to read a backup header. When we first created the command, we attempted to parse the file directly from the filesystem, but ultimately, using built-in SQL Server functionality was the safest and fastest method.

With that said, the next listing shows how you can read a header by reading the file from SQL Server, using a file path that is accessible by the SQL Server service account.

Listing 10.11 Reading detailed information from a backup

PS> Read-DbaBackupHeader -SqlInstance sql01 -Path
 \\nas\sql\backups\mydb.bak

We use this command both within other PowerShell commands and ad hoc, right at the command line, to discover details about the backup to see if it’s the one we’re hoping to use.

Another command we use a lot when working with backups is Get-DbaBackupInformation, shown in the next code snippet. This command scans a set of backup files and builds a set (think full, differential, and logs) that can be piped to Restore-DbaDatabase. This method is helpful when you have a directory full of backup files and want dbatools to figure out what will get you to the latest version of your database that is available within the backups.

Listing 10.12 Creating a backup set from a bunch of backups

PS> Get-DbaBackupInformation -SqlInstance sql01 -Path
 \\nas\sql\backups\sql01
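Chapter 11 covers restores in detail, but as a preview, the piping works roughly like this sketch; the test instance name is ours:

PS> Get-DbaBackupInformation -SqlInstance sql01 -Path \\nas\sql\backups\sql01 |
     Restore-DbaDatabase -SqlInstance sqltest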

10.3 Backup history

Each time you perform a backup, information about the backup is stored in a few tables within msdb. Over the years, we’ve found ourselves digging into these tables when we needed detailed information for diagnostics.

We use this backup information in a variety of ways within dbatools. Primarily, we use it to speed up restores, because it’s a lot faster to query a SQL table than it is to read a file header. After realizing how useful this functionality was for our tools, we took the super-long T-SQL query we were using and made it accessible using the dbatools command, Get-DbaDbBackupHistory. The command in the following code listing gets all available backup history for the pubs database.

Listing 10.13 Getting database backup history

PS> Get-DbaDbBackupHistory -SqlInstance sql01 -Database pubs

You can even use the -Last parameter to easily gather the last full, diff, and log backup chains for all databases on your server, as seen here.

Listing 10.14 Getting the last backups of all databases

PS> Get-DbaDbBackupHistory -SqlInstance sql01 -Last

In the next chapter, about database restores, we’ll see how to use this command to help recover from database corruption. How cool is that? There’s a whole lot more to this command, too. If you’d like more information, Stuart Moore wrote in depth about our database backup command at sqlps.io/history.

10.4 Pruning old backup files

Another command available in dbatools is Find-DbaBackup, which finds backup files in a directory that are older than a given retention period. This command is useful when you need to clean up old backups. Just find the backups, then pipe them to Remove-Item, as shown next.

Listing 10.15 Finding and removing old backup files

PS> Find-DbaBackup -Path \\nas\sql\backups -BackupFileExtension bak
 -RetentionPeriod 90d | Remove-Item -Verbose

If you’d like to find old transaction log backups, simply change -BackupFileExtension to trn, as shown here.

Listing 10.16 Finding and removing old transaction log backup files

PS> Find-DbaBackup -Path \\nas\sql\backups -BackupFileExtension trn
 -RetentionPeriod 90d | Remove-Item -Verbose
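If you’d like a dry run first, Remove-Item’s -WhatIf switch shows what would be deleted without removing anything:

PS> Find-DbaBackup -Path \\nas\sql\backups -BackupFileExtension bak -RetentionPeriod 90d |
     Remove-Item -WhatIf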

For more information about backing up and restoring, check out Stuart Moore’s 31-day series at sqlps.io/backup. Stuart wrote most of our backup and restore commands.

10.5 Testing your backups

For data professionals, valid backups of our data are perhaps the single most important thing to have. The two most critical tasks for a DBA are backing up their databases and regularly testing those backups. dbatools simplifies testing backups down to a single command, which we hope will encourage everyone to test their backups regularly.

This section will focus on a single command, Test-DbaLastBackup, which ensures that native SQL Server backups are restorable and that the restored data is sound. This is great not only for verifying that your backups work but also for making it easy to prove to auditors that backups are tested regularly.

dbatools supports only native SQL Server backups

If you use third-party products to back up your databases, Test-DbaLastBackup likely won’t work as expected. Although we are open to pull requests from external vendors, our core team has decided to support only native SQL Server backups, because it is the only backup format we use and recommend. Please visit sqlps.io/limitations for more information.

When we created Test-DbaLastBackup, we took a list of best practices and codified them into a routine, which follows:

  • Logs in to a SQL Server and gathers a list of all databases

  • Gets a list of the most recent full, diff, and log backups for each of the databases

  • Restores the full chain of backups to the destination server (local or remote) with a new name and new filenames (to prevent conflicts with the original database)

  • Performs a DBCC CHECKDB

  • Drops the test database, and reports success or failure
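In its simplest form, the command needs nothing more than an instance name:

PS> Test-DbaLastBackup -SqlInstance sql01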

You can perform your test restores against the same machine, but we recommend that you offload that task and its accompanying resource usage to another server. This keeps the load on your production SQL Servers down.

Dedicated SQL Server for disaster recovery

If you are using SQL Server 2019 and Software Assurance, check with your licensing representative, because you’re likely entitled to a matching SQL Server edition license for disaster recovery. We’ve even written a blog post that can help guide you when building a dedicated backup server. For more information, visit dbatools.io/dedicated-server.

Let’s say that you want to test your backups on sql01 and sql02, but you want to offload the testing to sqltest, which has two large, dedicated drives for SQL files: R: for data and L: for logs. You also want to store the results in an efficient table for processing within Power BI. This is all possible with the following three commands:

  1. Invoke-DbaQuery and some T-SQL to create the table.

  2. Test-DbaLastBackup to test the restores.

  3. Write-DbaDataTable to store the results in SQL Server.

To accomplish this task, run the commands shown here.

Listing 10.17 Creating a table, testing the backups, and writing the results

PS> $query = "CREATE TABLE dbo.lastbackuptests (
    SourceServer nvarchar(255),
    TestServer nvarchar(255),
    [Database] nvarchar(128),
    FileExists bit,
    Size bigint,
    RestoreResult nvarchar(4000),
    DbccResult nvarchar(4000),
    RestoreStart datetime,
    RestoreEnd datetime,
    RestoreElapsed nvarchar(128),
    DbccStart datetime,
    DbccEnd datetime,
    DbccElapsed nvarchar(128),
    BackupDates nvarchar(4000),
    BackupFiles nvarchar(4000))"
PS> Invoke-DbaQuery -SqlInstance sqltest -Database dbatools -Query $query
 
PS> $splatTestBackups = @{
    SqlInstance = "sql01", "sql02" 
    Destination = "sqltest"
    DataDirectory = "R:" 
    LogDirectory = "L:"
}
PS> Test-DbaLastBackup @splatTestBackups | Write-DbaDataTable -SqlInstance
 sqltest -Table dbatools.dbo.lastbackuptests

Now you can schedule the last command to run on a regular basis and then process your results using Power BI or a similar data visualization tool.

In this chapter, you’ve learned how to back up your SQL Server databases to a variety of destinations, and you’ve also learned how to test your backups. Now let’s see all that we can do with restores.

10.6 Hands-on lab

  • Back up all databases on your SQL Server to a nondefault path.

  • Back up a database, and pipe the results directly to Read-DbaBackupHeader.

  • Find transaction log (trn) backups older than 21 days using Find-DbaBackup.
