Chapter 35. The poor man’s SQL Server log shipping

Edwin Sarmiento

Let’s imagine that you’ve got a new line-of-business application up and running smoothly. You’re extending your disaster recovery plan to include this new application. For the database, you’re thinking along the lines of log shipping, until you read the documentation provided by the vendor: the installation program installs SQL Server 2005 Express Edition.

Log shipping is a high availability solution for SQL Server in which SQL Server Agent automates the process of generating transaction log backups of the source database, copying them over to a remote SQL Server instance, and restoring them in either no-recovery or read-only, standby mode. It has gained popularity as a high availability solution because of the option to use different hardware for the standby server and the licensing options available for setting it up. This feature is available in the Enterprise Edition of SQL Server 2000, whereas even the Standard and Workgroup Editions of SQL Server 2005 and 2008 include it. For those using other editions, the only option is to implement it outside of the “supported” scenarios. Many third-party applications use editions such as Express and the old Microsoft SQL Server Desktop Engine (MSDE), including products from Microsoft itself, such as Office SharePoint Server 2007 with its default installation option. This chapter will discuss how to implement log shipping even when Microsoft says it cannot be done.

It is important to understand what log shipping does so that we can come up with a process that can implement a similar mechanism. Process is more important than the technology itself. Log shipping consists of three steps:

1. Back up the transaction log within a specified interval, say every 15 minutes.
2. Copy the log backup from the primary server to the standby server.
3. Restore the copied log backup while leaving the database in either no-recovery or read-only, standby mode.

Because we’ll be dealing with SQL Server 2005 and 2008 Express Edition (or even SQL Server 2000 editions that do not support log shipping), the primary and standby servers may well be running nonserver operating systems such as Windows 2000 Professional, Windows XP, or even Windows Vista or Windows 7. Use of these terms throughout the chapter does not imply Windows server operating systems.
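The chapter’s actual implementation uses T-SQL, VBScript, and batch files; purely for illustration, the three steps can be sketched as a single orchestration routine. The Python sketch below (all names are hypothetical) wires the backup, copy, and restore steps together as injectable callables, which is essentially what the scheduled jobs described later do:

```python
from typing import Callable, List

def log_ship_cycle(backup_log: Callable[[], str],
                   copy_backup: Callable[[str], str],
                   restore_log: Callable[[str], None]) -> None:
    """One custom log shipping cycle.

    backup_log  -- runs on the primary, returns the new .trn file's path
    copy_backup -- ships the file to the standby, returns the local path
    restore_log -- restores it WITH NORECOVERY or STANDBY on the standby
    """
    trn = backup_log()        # step 1: transaction log backup
    local = copy_backup(trn)  # step 2: copy to the standby server
    restore_log(local)        # step 3: restore, keeping the db restorable

# Stub callables that just record the order of the steps
calls: List[str] = []
log_ship_cycle(
    backup_log=lambda: (calls.append("backup"), "Northwind_tlog_0001.trn")[1],
    copy_backup=lambda f: (calls.append("copy"), f)[1],
    restore_log=lambda f: calls.append("restore"),
)
```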

Understanding these steps will help you create an automated job that covers all of them. First, you need to create a full backup of the database that you will be configuring for log shipping, and restore that backup on the standby SQL Server. Make sure that the database is restored either in read-only, standby mode or with no recovery (the T-SQL for this is generated by restoreLOG.vbs, shown in listing 5). This ensures that we can restore additional transaction logs later in the process. After we restore the database backup on the standby server, we are ready to configure log shipping. The following items are needed:

  • ROBOCOPY (Robust File Copy Utility)— This is a command-line file copy tool available in the Windows Resource Kit tools. This copies files from a source folder to a destination folder and resumes interrupted copies. It also preserves Windows NT File System (NTFS) permissions and access control lists (ACLs).
  • Shared folder— You should share the folder, which will contain the log backup files, and make sure that you have at least read-only access.
  • Domain— This will have the primary and standby servers as member servers.
  • Domain account— This account has dbo permissions on the database that you will be configuring for log shipping. We’ll use it to copy and restore the log backups from the primary server to the standby server.

Now, we’re ready to configure any database for log shipping. We’ll start with the SQL Server instance that will act as the primary server. To be more structured, we’ll create folders on both the primary and standby servers that will contain the log backups, the script files, the script execution results, and the UNDO files. The scripts are dependent on the file and folder structure you create, so make sure you update your scripts accordingly.

  • E:\LogShipFolder— Location of the log backups, segregated by database name (for example, E:\LogShipFolder\Northwind for the Northwind database). This folder should be on both the primary and the standby servers.
  • E:\LogShipFolder\UNDO— Location of UNDO files that will be specified as part of the RESTORE LOG command when the choice is to restore the log backup and leave the database in the read-only, standby mode. This folder should be created on the standby server. Each UNDO file is identified by the database name used as its prefix.
  • E:\LogShipFolder\scripts— Location of all the T-SQL scripts, VBScripts, batch files, and EXE files that will be used for this process. This folder should be on both the primary and the standby servers.
  • E:\LogShipFolder\scripts\SQL— Location of the dynamically generated T-SQL script files, which will contain the RESTORE LOG command. Each SQL script file will be run by a command-line call to sqlcmd.exe (or osql.exe for SQL Server 2000 or MSDE), the command-line utility for SQL Server. This folder should be created on the standby server and will also store the query execution results file, which will be generated by the sqlcmd.exe command-line utility. The results file is kept for record keeping and troubleshooting.

Creating the T-SQL script

First, you need to create a T-SQL script that generates transaction log backups. Let’s say we’ll back up the Northwind database in our SQL Server 2005 Express instance, as shown in listing 1. We’ll name the file backupLog.sql.

Listing 1. T-SQL script (backupLog.sql) generates transaction log backups
DECLARE @strTimeStamp NVARCHAR(12) --variable for timestamp value
DECLARE @SQL VARCHAR(1000) --variable for dynamic SQL
SET @strTimeStamp=convert(char(8), getdate(), 112)
SET @strTimeStamp=@strTimeStamp + REPLACE(convert(char(8), getdate(), 108),':','')

SELECT @SQL = 'BACKUP LOG [Northwind] TO DISK =
''E:\LogShipFolder\Northwind\Northwind_tlog_' + @strTimeStamp + '.trn''
WITH INIT'
EXEC (@SQL)

When you run this backup script, make sure that no other log backup jobs are running against the database, as an out-of-band log backup would break the log chain that is necessary to restore the transaction logs.
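The timestamp suffix that listing 1 appends to each backup filename is simply yyyymmdd (style 112) plus hhmmss (style 108 with the colons stripped), truncated to 12 characters by the NVARCHAR(12) declaration, which silently drops the seconds; that’s harmless at a 15-minute interval. A Python sketch of the same naming scheme, for illustration only:

```python
from datetime import datetime

def backup_file_name(db: str, now: datetime) -> str:
    """Mimic listing 1's name: <db>_tlog_<yyyymmddhhmm>.trn.

    Listing 1 builds yyyymmdd + hhmmss, but NVARCHAR(12) truncates the
    result to 12 characters, dropping the seconds.
    """
    stamp = (now.strftime("%Y%m%d") + now.strftime("%H%M%S"))[:12]
    return f"{db}_tlog_{stamp}.trn"

print(backup_file_name("Northwind", datetime(2009, 6, 15, 12, 15, 42)))
# Northwind_tlog_200906151215.trn
```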

Creating a cleanup script

We’ll need to create a cleanup script that will delete the log backups older than a specified number of days or whatever retention period you have set as a corporate policy. The script will also log the delete activities in a file for reference. To accomplish this task, a VBScript will be used, as shown in listing 2.

Listing 2. VBScript (deleteTRN.vbs) cleans up the transaction log backups
On Error Resume Next
Dim fso, folder, files, sFolder, sFolderTarget, intDaysOld, strDatabaseName
Set fso = CreateObject("Scripting.FileSystemObject")

'The number of days would be the value passed to the script as a parameter
intDaysOld= Wscript.Arguments.Item(0)
strDatabaseName= Wscript.Arguments.Item(1)

'Location of the database backup files
sFolder = "E:\LogShipFolder\" & strDatabaseName & "\"

Set folder = fso.GetFolder(sFolder)
Set files = folder.Files
'Variable used for writing to a text file-generated report on database
'backups deleted
Const ForAppending = 8

'Create a folder named "scripts" for ease of file management &
'a file inside it named "LOG.txt" for delete activity logging
Set objFile = fso.OpenTextFile(sFolder & "scripts\LOG.txt", ForAppending)

objFile.Write "==================================" & VBCRLF & VBCRLF
objFile.Write "DATABASE BACKUP FILE REPORT " & VBCRLF
objFile.Write "DATE: " & FormatDateTime(Now(),1) & "" & VBCRLF
objFile.Write "TIME: " & FormatDateTime(Now(),3) & "" & VBCRLF & VBCRLF
objFile.Write "=======================================" & VBCRLF

'iterate thru each of the files in the database backup folder
For Each itemFiles In files
    'retrieve complete path of file for the DeleteFile method and to extract
    'file extension using the GetExtensionName method
    a=sFolder & itemFiles.Name

    'retrieve file extension
    b = fso.GetExtensionName(a)

    'check if the file extension is TRN
    If uCase(b)="TRN" Then
        'check if the database backups are older than intDaysOld days
        If DateDiff("d",itemFiles.DateCreated,Now()) >= intDaysOld Then
            'Delete any old BACKUP files to cleanup folder
            fso.DeleteFile a
            objFile.WriteLine "BACKUP FILE DELETED: " & a
        End If
    End If
Next

objFile.WriteLine "======================================="&VBCRLF & VBCRLF

objFile.Close

Set objFile = Nothing
Set fso = Nothing
Set folder = Nothing
Set files = Nothing
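For readers more comfortable outside VBScript, the retention logic of listing 2 can be sketched in Python as follows; this is illustrative only, the folder and window are parameters, and the VBScript’s DateCreated check is approximated with the file’s modification time (the closest portable equivalent):

```python
import os
import time

def delete_old_trn(folder: str, days_old: int) -> list:
    """Delete .trn files older than days_old days; return what was deleted.

    Mirrors deleteTRN.vbs: filter on the .trn extension, then compare the
    file's age against the retention window.
    """
    deleted = []
    cutoff = time.time() - days_old * 86400
    for name in os.listdir(folder):
        path = os.path.join(folder, name)
        # match the VBScript's case-insensitive "TRN" extension check
        if name.lower().endswith(".trn") and os.path.getmtime(path) < cutoff:
            os.remove(path)
            deleted.append(path)
    return deleted
```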

Creating a batch file

We need to create the batch file that will call both the T-SQL script and the VBScript file. The contents of the batch file are a simple call to sqlcmd.exe and a call to the VBScript file (using wscript.exe or by invoking the file directly), as shown in listing 3. Save the file as databaseBackupLog.cmd in the E:\LogShipFolder\scripts folder.

Listing 3. Batch file (databaseBackupLog.cmd) calls the backup and cleanup scripts
REM Run TSQL Script to backup database transaction logs
sqlcmd -S<PRIMARYINSTANCENAME> -E -i"E:\LogShipFolder\scripts\backupLog.sql"

REM Run database backup cleanup script, passing the number of days
REM old and the database name
E:\LogShipFolder\scripts\deleteTRN.vbs 2 Northwind

We’ll now create a Scheduled Task to run this batch file every 15 minutes or so, depending on the frequency of your transaction log backups. This is the alternative to SQL Server Agent, which is not available in the Express Editions. Scheduled Tasks can be found in Control Panel > Scheduled Tasks or under Start > All Programs > Accessories > System Tools > Scheduled Tasks (this is the same as the Task Scheduler if you’re running Windows Vista).

Because we are using Windows authentication to run the T-SQL script, we should use a Windows account that is a member of the db_backupoperator role of all the databases, as outlined in the list and figures that follow:

1. Launch Scheduled Tasks.
2. Click on Add Scheduled Task.
3. Browse to the E:\LogShipFolder\scripts folder and select databaseBackupLog.cmd (see figure 1).

Figure 1. Selecting a program to schedule

4. Select the frequency and time for running the backups (see figures 2 and 3).

Figure 2. Naming a task

Figure 3. Entering a start time and day for the task

5. Enter a Windows account that has at least db_backupoperator role privileges for all of the databases (see figure 4).

Figure 4. Entering the user’s name and password

After creating the scheduled task, open the properties of the job you just created. On the Schedule tab, click the Advanced button. On the Advanced Schedule Options window (figure 5), click the Repeat Task check box. Enter the values as defined by your log backup frequency, in this case 15 minutes, and specify a duration of 24 hours.

Figure 5. Using Advanced Schedule Options

Now that we have created the automated job to generate the log backups, we need to share the folder that will contain them. We’ll share E:\LogShipFolder in this scenario in case you plan to create log shipping jobs for more than one database on this instance. The share should be accessible from the standby server, either via the IP address or the DNS name.

On the standby server, copy the robocopy.exe file into the E:\LogShipFolder\scripts folder. Then create a batch file that calls the robocopy.exe utility, passing the corresponding parameters. This batch file will copy the log backups from the primary server to the standby server and should be run on the standby server. Let’s call the batch file logShipRobocopy.cmd and save it in the same folder. We add another line to the batch file to call a VBScript file that will be responsible for restoring the log backups on the standby server based on time stamps. The batch file will contain the commands shown in listing 4.

Listing 4. Batch file (logShipRobocopy.cmd) calls the replication and restore scripts
REM ROBOCOPY job to copy the transaction log backups from the primary
REM server to the standby server
ROBOCOPY \\primary-server\LogShipFolder E:\LogShipFolder /COPY:DATSO /MIR

REM Call the VBScript file that will run a RESTORE LOG command
REM The VBScript file accepts 2 parameters—the folder containing the
REM transaction log backups and the database name
E:\LogShipFolder\scripts\restoreLOG.vbs E:\LogShipFolder\ Northwind

The /COPY option of the ROBOCOPY command copies the files together with the file information specified by the following flags:

  • D—File data
  • A—File attributes
  • T—File timestamps
  • O—File ownership information
  • U—File auditing information
  • S—File security (NTFS ACLs)

Both source and destination volumes must be formatted with NTFS to allow copying of security, ownership, and auditing information. If the /COPY switch is omitted, the default is /COPY:DAT, or /COPY:DATS if /SEC is specified. The /MIR (short for mirror) switch maintains an exact mirror of the source directory tree, which ensures that the log backup folder on the standby server always matches the one on the primary server.
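As a side note, if you ever generate the ROBOCOPY command from a script rather than hard-coding it in the batch file, it helps to assemble the arguments as a list instead of one string, which avoids quoting problems when the command is handed to a process launcher. This hypothetical Python sketch mirrors the call in listing 4:

```python
def robocopy_args(source: str, dest: str, copy_flags: str = "DATSO",
                  mirror: bool = True) -> list:
    """Build a ROBOCOPY argument list like the one in listing 4.

    copy_flags is any combination of the D/A/T/S/O/U flags described
    above; /MIR keeps the standby folder an exact mirror of the source.
    """
    args = ["ROBOCOPY", source, dest, "/COPY:" + copy_flags]
    if mirror:
        args.append("/MIR")
    return args

cmd = robocopy_args(r"\\primary-server\LogShipFolder", r"E:\LogShipFolder")
# cmd[0] is "ROBOCOPY"; cmd ends with "/COPY:DATSO", "/MIR"
```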

The restoreLOG.vbs file will contain the commands shown in listing 5.

Listing 5. VBScript (restoreLOG.vbs) restores the copied transaction log backups
'This script does a custom log shipping job using ROBOCOPY and VBScript
'with FileSystemObject querying the specified folder for files created
'within the past 15 minutes or less and generates a TSQL RESTORE LOG
'command which is executed after the ROBOCOPY script
'Syntax: restoreLOG.vbs folder databaseName

On Error Resume Next

Dim sFolder, sFolderTarget, strParentFolder, strDatabaseName
Dim fso, folder, files, strFileNameResults, objShell

Set fso = CreateObject("Scripting.FileSystemObject")
Set objFSO = CreateObject("Scripting.FileSystemObject")

strParentFolder=Wscript.Arguments.Item(0)
strDatabaseName=Wscript.Arguments.Item(1)

sFolder = strParentFolder & strDatabaseName

Set folder = fso.GetFolder(sFolder)
Set files = folder.Files

SET objShell = CreateObject("Wscript.Shell")

For Each itemFiles In files
    a=sFolder & "\" & itemFiles.Name

    'retrieve file extension
    b = fso.GetExtensionName(a)

    'check if the file extension is TRN
    If uCase(b)="TRN" Then

        'check for DateCreated attribute of file and compare with current
        'date/time
        If (DateDiff("N", itemFiles.DateCreated, Now) <= 15) Then
            'Create the file to contain the script
            If (objFSO.FileExists("E:\LogShipFolder\scripts\SQL\" & strDatabaseName & ".sql")) Then
                objFSO.DeleteFile ("E:\LogShipFolder\scripts\SQL\" & strDatabaseName & ".sql")
            End If

            Set objMyFile = objFSO.CreateTextFile("E:\LogShipFolder\scripts\SQL\" & strDatabaseName & ".sql", True)

            str1="RESTORE LOG " & strDatabaseName
            str2="FROM DISK='" & a & "'"
            str3="WITH STANDBY='E:\LogShipFolder\UNDO\UNDO_" & strDatabaseName & "_ARCHIVE.DAT',"
            str4="DBO_ONLY"

            objMyFile.WriteLine (str1)
            objMyFile.WriteLine (str2)
            objMyFile.WriteLine (str3)
            objMyFile.WriteLine (str4)

            objMyFile.Close
            Set objMyFile = Nothing

            'Run a sqlcmd command that executes the RESTORE LOG, WITH STANDBY script
            objShell.Run("sqlcmd -S<STANDBYINSTANCENAME> -E -iE:\LogShipFolder\scripts\SQL\" & strDatabaseName & ".sql -oE:\LogShipFolder\scripts\SQL\" & strDatabaseName & "-results.txt")

            'Call subroutine to read the results file for RESTORE errors
            strFileNameResults="E:\LogShipFolder\scripts\SQL\" & strDatabaseName & "-results.txt"
            Call CheckforErrors(strFileNameResults, strParentFolder, strDatabaseName)
        End If
    End If
Next

SET fso = NOTHING
SET folder = NOTHING
SET files = NOTHING
SET objShell = NOTHING
SET objFSO = NOTHING
SET objMyFile = NOTHING

'SUBROUTINE - used to read the results of the SQLCMD query to check for
'errors
Sub CheckforErrors(fileNameResults,parentFolder,databaseName)

strFileNameResults=fileNameResults
strParentFolder=parentFolder
strDatabaseName=databaseName

Const ForReading = 1

SET objShell = CreateObject("Wscript.Shell")
Set objFSO = CreateObject("Scripting.FileSystemObject")
Set objFile = objFSO.OpenTextFile(strFileNameResults, ForReading)
strContents = objFile.ReadAll

'The value=0 means that it cannot find the phrase RESTORE LOG successfully
'processed and thus means an error
If Instr(strContents,"RESTORE LOG successfully processed")=0 Then
    objShell.Run(strParentFolder & "scripts\sendEmailSMTP.vbs " & strDatabaseName & " RESTORE")
End If

objFile.Close

Set objFSO=NOTHING
Set objFile=NOTHING
Set objShell=NOTHING

End Sub
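Purely as an illustration of the selection logic in listing 5, here is its core re-sketched in Python: pick the .trn files created within the freshness window and emit the corresponding RESTORE LOG commands. The paths and names are placeholders, the VBScript’s DateCreated check is approximated with the file’s modification time, and the generated script is returned rather than passed to sqlcmd:

```python
import os
import time

def build_restore_script(folder: str, database: str, undo_dir: str,
                         window_minutes: int = 15) -> str:
    """Generate RESTORE LOG commands for the .trn files created within
    the last window_minutes, mirroring restoreLOG.vbs. WITH STANDBY
    keeps the database readable between restores."""
    cutoff = time.time() - window_minutes * 60
    lines = []
    for name in sorted(os.listdir(folder)):
        path = os.path.join(folder, name)
        if name.lower().endswith(".trn") and os.path.getmtime(path) >= cutoff:
            undo_path = os.path.join(undo_dir, f"UNDO_{database}_ARCHIVE.DAT")
            lines += [
                f"RESTORE LOG {database}",
                f"FROM DISK='{path}'",
                f"WITH STANDBY='{undo_path}',",
                "DBO_ONLY",
            ]
    return "\n".join(lines)
```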

 

The script accepts two parameters: the folder containing the copied log backups and the database name. It iterates through the contents of the folder and retrieves the file extension of each file—in this case, TRN. It then checks the DateCreated attribute of each file and compares it with the system date and time. If the file was created within the last 15 minutes, the script creates a SQL file containing the RESTORE LOG command for the database you passed as a parameter. Notice how the script is dynamically generated, incorporating the filename of the retrieved log backup.

After the SQL file has been generated, a call to the sqlcmd.exe utility is made, passing the generated SQL file as input and producing an output file, which we’ll use to send email alerts should the RESTORE LOG command fail. Because the script will be executed by the Windows Task Scheduler, we need something to send us email alerts for notification, as SQLMail and Database Mail are not available in SQL Server Express. The script calls another VBScript file, sendEmailSMTP.vbs, which accepts two parameters: the database name and the job type, which in this case is a RESTORE job. Listing 6 shows the script for the sendEmailSMTP.vbs file. Make sure to change the SMTP server address and the recipient email address.

Listing 6. VBScript file (sendEmailSMTP.vbs) sends email notifications
'Accept input parameters
Dim databaseName
Dim jobType

'first parameter
databaseName= Wscript.Arguments.Item(0)
'second parameter
jobType= Wscript.Arguments.Item(1)

Set objMessage = CreateObject("CDO.Message")
objMessage.Subject = "Message Alert from STANDBY SQL Server: " & databaseName & " " & jobType & " job failed"
objMessage.From = "[email protected]"
objMessage.To = "[email protected]"
objMessage.TextBody = "The " & databaseName & " " & jobType & " job running in the STANDBY SQL Server instance failed on " & Now() & vbCrLF & vbCrLf & "Please look at this problem ASAP " & vbCrlf & vbCrlf & vbCrlf & vbCrlf & "- The SQL Server Administrator/DBA -"

'This section provides the configuration information for the remote SMTP
'server. Normally you will only change the server name or IP.
objMessage.Configuration.Fields.Item("http://schemas.microsoft.com/cdo/configuration/sendusing") = 2

'Name or IP of Remote SMTP Server
objMessage.Configuration.Fields.Item("http://schemas.microsoft.com/cdo/configuration/smtpserver") = "smtp.domain.local"

'Server port number(typically 25)
objMessage.Configuration.Fields.Item("http://schemas.microsoft.com/cdo/configuration/smtpserverport") = 25

objMessage.Configuration.Fields.Update

objMessage.Send
Set objMessage = Nothing

After all the required scripts have been created, we’ll use the same process we used on the primary server, creating a Scheduled Task that calls the logShipRobocopy.cmd batch file. This is the tricky part: you need to make sure that the batch file executes after the log backup on the primary server has been generated but before the next one starts. Monitor this after implementation, as the log backup time on the primary server may increase with transaction volume; larger backup files mean longer transfer times.

To illustrate: if we enable log backups on the primary server every 15 minutes starting at 12:00 a.m., log backups will be generated at 12:00 a.m., 12:15 a.m., 12:30 a.m., and so on. On the standby server, the scheduled task that calls the logShipRobocopy.cmd batch file should also run every 15 minutes, but offset from the primary schedule. To be safe, run it about two-thirds of the way into each interval, at 12:10 a.m., 12:25 a.m., and so on. This provides ample time for a medium-sized database with an average-to-high number of transactions to finish generating its log backup. You can find the appropriate timing by trial and error; a reasonable starting point is to allow 5 minutes for copying the log backups from the primary to the standby and restoring them.
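The offset arithmetic above can be made explicit. The following Python sketch (illustrative only; the real jobs are plain Scheduled Tasks) lists matching primary and standby run times in minutes past midnight:

```python
def standby_schedule(interval_min: int = 15, offset_min: int = 10,
                     cycles: int = 3, start_min: int = 0) -> list:
    """Return (primary, standby) run times in minutes past midnight.

    With a 15-minute interval and a 10-minute offset (two-thirds of the
    interval), the 12:00 a.m. backup is picked up at 12:10 a.m., the
    12:15 a.m. backup at 12:25 a.m., and so on.
    """
    return [(start_min + i * interval_min,
             start_min + i * interval_min + offset_min)
            for i in range(cycles)]

print(standby_schedule())
# [(0, 10), (15, 25), (30, 40)]
```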

To reiterate: when the log backup job runs on the primary server at 12:00 a.m., your Scheduled Task on the standby server should execute at 12:10 a.m. When you create the Scheduled Task, make sure that the domain account you use has the appropriate rights to copy from the shared folder, plus dbo or dbcreator privileges to restore the log backups. You also need to take password changes on the account into consideration, as a change will cause the jobs on both the primary and standby servers to fail.

Note that all the scripts used in the entire process are interdependent. Changing the frequency of the transaction log backups means changing the VBScript file that reads the DateCreated attribute of the log backup file so that it reflects the new interval, as well as the frequency of the Scheduled Task created on the standby server. Overall, understanding the entire process gives you the flexibility to modify the parameters used in all of the scripts.

Improving the log shipping process

You can improve the log shipping process in many ways, such as by using Windows PowerShell instead of VBScript. Windows PowerShell is an extensible command-line shell and associated scripting language built on the .NET Framework 2.0, and it makes writing scripts much easier. A few lines of VBScript can often be translated into a single line of PowerShell.

For example, consider the deleteTRN.vbs file, which is responsible for cleaning up the transaction log backup files. Listing 7 shows the script translated into PowerShell. This version does not log the delete activity to a text file, but it does the job as expected.

Listing 7. PowerShell translation of deleteTRN.vbs
Get-ChildItem E:\LogShipFolder\Northwind | Where {$_.CreationTime -lt (Get-Date).AddDays(-2) -and ($_.Extension -eq ".trn")} | Remove-Item

Notice how dozens of lines of VBScript code translate to a single line in PowerShell. It’s a great alternative for administering servers and workstations with scripts.

Summary

The process in this chapter gives you an opportunity to implement a custom log shipping solution on SQL Server editions, even older versions, that do not support it natively. You can improve the process in many ways, including using Windows PowerShell instead of VBScript. Custom log shipping lets you implement a high availability solution for your SQL Server Express databases even though they may be running on a Windows XP Professional machine. Have fun!

About the author

Edwin Sarmiento works as a Senior SQL Server DBA/Systems Engineer for The Pythian Group in Ottawa, Canada. He is passionate about technology but has interests in music, professional and organizational development, leadership, and management, when not working with databases. He lives up to his primary mission statement, “To help people grow and develop their full potential as God has planned for them.”
