4.3 Extracting DDL Using SMO
A misplaced index or a poorly defined
table are two examples of how changes to a database's DDL
can have serious performance implications. With this in mind, I created
the following script, which can be scheduled to run once a day; it
extracts all the DDL objects from the database and stores them on disk
in a folder named for the day on which they were extracted. If poor
performance is identified, it's easy to compare the files between days
to determine whether any DDL changes caused the performance of the
database to degrade.
#Helper function to script a DDL object to disk
function Write-DDLOutput ($filename, $object)
{
    #Create the file (and any missing parent folders)
    New-Item $filename -type file -force | Out-Null
    #Specify the filename
    $ScriptingOptions.FileName = $filename
    #Assign the scripting options to the Scripter
    $Scripter.Options = $ScriptingOptions
    #Script the object
    $Scripter.Script($object)
}
#Load the SMO assembly
[reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null
#Create all the global vars we need
$Server = New-Object ("Microsoft.SqlServer.Management.Smo.Server")
$Scripter = New-Object ("Microsoft.SqlServer.Management.Smo.Scripter")
$ScriptingOptions = New-Object ("Microsoft.SqlServer.Management.SMO.ScriptingOptions")
$Scripter.Server = $Server
#Specify the root folder that we’ll store the scripts in. This will probably become a param in future
$RootBackupFolder = "C:\SqlBackups\DDL"
#Get the day of the week so that we can create a folder for each day
$Today = [System.DateTime]::Today.DayOfWeek
#Store today’s backup folder
$DDLBackupFolder = Join-Path -Path $RootBackupFolder -ChildPath $Today
#Check if today’s folder exists
if ([System.IO.Directory]::Exists($DDLBackupFolder))
{
    #If it does, delete its contents
    Remove-Item (Join-Path -Path $DDLBackupFolder -ChildPath *) -Recurse
}
else
{
    #Otherwise create it
    [System.IO.Directory]::CreateDirectory($DDLBackupFolder) | Out-Null
}
#Set up the scripting options; FileName is set per object in Write-DDLOutput
$ScriptingOptions.AppendToFile = $true
$ScriptingOptions.ToFileOnly = $true
$ScriptingOptions.ScriptData = $false
#Loop through all the databases to script them out
foreach ($database in ($Server.Databases |
    where {$_.IsSystemObject -eq $false -and $_.IsDatabaseSnapshot -eq $false}))
{
    $databaseBackupFolder = Join-Path -Path $DDLBackupFolder -ChildPath $database.Name
    #This will be the database create script
    Write-DDLOutput (Join-Path -Path $databaseBackupFolder -ChildPath ($database.Name + ".sql")) $database
    $ProgrammabilityBackupFolder = Join-Path -Path $databaseBackupFolder -ChildPath "Programmability"
    $DefaultsBackupFolder = Join-Path -Path $ProgrammabilityBackupFolder -ChildPath "Defaults"
    foreach ($default in $database.Defaults)
    {
        #Generate a filename for the default and script it
        Write-DDLOutput (Join-Path -Path $DefaultsBackupFolder -ChildPath ($default.Schema + "." + $default.Name + ".sql")) $default
    }
    #Create folders to store the functions in
    $FunctionsBackupFolder = Join-Path -Path $ProgrammabilityBackupFolder -ChildPath "Functions"
    $ScalarFunctionsBackupFolder = Join-Path -Path $FunctionsBackupFolder -ChildPath "Scalar-valued Functions"
    $TableValuedFunctionsBackupFolder = Join-Path -Path $FunctionsBackupFolder -ChildPath "Table-valued Functions"
    foreach ($function in $database.UserDefinedFunctions | where {$_.IsSystemObject -eq $false})
    {
        #Script the functions into folders depending upon type. We’re only interested in scalar and table
        switch ($function.FunctionType)
        {
            "Scalar"
            {
                #Generate a filename for the scalar function
                $filename = Join-Path -Path $ScalarFunctionsBackupFolder -ChildPath ($function.Schema + "." + $function.Name + ".sql")
            }
            "Table"
            {
                #Generate a filename for the table-valued function
                $filename = Join-Path -Path $TableValuedFunctionsBackupFolder -ChildPath ($function.Schema + "." + $function.Name + ".sql")
            }
            #A continue here would only exit the switch, not skip the function,
            #so flag the function instead and test for it below
            default { $filename = $null }
        }
        if ($filename)
        {
            #Script the function
            Write-DDLOutput $filename $function
        }
    }
    $RulesBackupFolder = Join-Path -Path $ProgrammabilityBackupFolder -ChildPath "Rules"
    foreach ($rule in $database.Rules)
    {
        #Script the rule
        Write-DDLOutput (Join-Path -Path $RulesBackupFolder -ChildPath ($rule.Schema + "." + $rule.Name + ".sql")) $rule
    }
    #Create a folder to store the sprocs in
    $StoredProceduresBackupFolder = Join-Path -Path $ProgrammabilityBackupFolder -ChildPath "Stored Procedures"
    #Loop through the sprocs to script them out
    foreach ($storedProcedure in $database.StoredProcedures | where {$_.IsSystemObject -eq $false})
    {
        #Script the sproc
        Write-DDLOutput (Join-Path -Path $StoredProceduresBackupFolder -ChildPath ($storedProcedure.Schema + "." + $storedProcedure.Name + ".sql")) $storedProcedure
    }
    #Create folders to store the table scripts
    $TablesBackupFolder = Join-Path -Path $databaseBackupFolder -ChildPath "Tables"
    $TableIndexesBackupFolder = Join-Path -Path $TablesBackupFolder -ChildPath "Indexes"
    $TableKeysBackupFolder = Join-Path -Path $TablesBackupFolder -ChildPath "Keys"
    $TableConstraintsBackupFolder = Join-Path -Path $TablesBackupFolder -ChildPath "Constraints"
    $TableTriggersBackupFolder = Join-Path -Path $TablesBackupFolder -ChildPath "Triggers"
    #Loop through the tables to script them out
    foreach ($table in $database.Tables | where {$_.IsSystemObject -eq $false})
    {
        #Script the table
        Write-DDLOutput (Join-Path -Path $TablesBackupFolder -ChildPath ($table.Schema + "." + $table.Name + ".sql")) $table
        foreach ($Constraint in $table.Checks)
        {
            #Script the check constraint
            Write-DDLOutput (Join-Path -Path $TableConstraintsBackupFolder -ChildPath ($table.Schema + "." + $table.Name + "." + $Constraint.Name + ".sql")) $Constraint
        }
        foreach ($index in $table.Indexes)
        {
            #Generate a filename for the index; primary keys go into the Keys folder
            switch ($index.IndexKeyType)
            {
                "DriPrimaryKey"
                {
                    $filename = Join-Path -Path $TableKeysBackupFolder -ChildPath ($table.Schema + "." + $table.Name + "." + $index.Name + ".sql")
                }
                default
                {
                    $filename = Join-Path -Path $TableIndexesBackupFolder -ChildPath ($table.Schema + "." + $table.Name + "." + $index.Name + ".sql")
                }
            }
            #Script the index
            Write-DDLOutput $filename $index
        }
        foreach ($trigger in $table.Triggers)
        {
            #Script the trigger
            Write-DDLOutput (Join-Path -Path $TableTriggersBackupFolder -ChildPath ($table.Schema + "." + $table.Name + "." + $trigger.Name + ".sql")) $trigger
        }
    }
    #Create folders to store the view scripts
    $ViewsBackupFolder = Join-Path -Path $databaseBackupFolder -ChildPath "Views"
    $ViewKeysBackupFolder = Join-Path -Path $ViewsBackupFolder -ChildPath "Keys"
    $ViewIndexesBackupFolder = Join-Path -Path $ViewsBackupFolder -ChildPath "Indexes"
    $ViewTriggersBackupFolder = Join-Path -Path $ViewsBackupFolder -ChildPath "Triggers"
    #Loop through the views to script them out
    foreach ($view in $database.Views | where {$_.IsSystemObject -eq $false})
    {
        #Script the view
        Write-DDLOutput (Join-Path -Path $ViewsBackupFolder -ChildPath ($view.Schema + "." + $view.Name + ".sql")) $view
        foreach ($index in $view.Indexes)
        {
            #Generate a filename for the index; primary keys go into the Keys folder
            switch ($index.IndexKeyType)
            {
                "DriPrimaryKey"
                {
                    $filename = Join-Path -Path $ViewKeysBackupFolder -ChildPath ($view.Schema + "." + $view.Name + "." + $index.Name + ".sql")
                }
                default
                {
                    $filename = Join-Path -Path $ViewIndexesBackupFolder -ChildPath ($view.Schema + "." + $view.Name + "." + $index.Name + ".sql")
                }
            }
            #Script the index
            Write-DDLOutput $filename $index
        }
        foreach ($trigger in $view.Triggers)
        {
            #Script the trigger
            Write-DDLOutput (Join-Path -Path $ViewTriggersBackupFolder -ChildPath ($view.Schema + "." + $view.Name + "." + $trigger.Name + ".sql")) $trigger
        }
    }
}
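Once a few days' worth of scripts have accumulated, comparing two days is straightforward. The following is a minimal sketch, assuming the day-of-week folder layout the script above produces; it lists objects that exist on only one of the two days, then objects whose definition changed. (It matches files by name only, so two databases containing an identically named object would need the relative path compared instead.)

```powershell
#Compare yesterday's DDL scripts with today's to spot changes
$RootBackupFolder = "C:\SqlBackups\DDL"
$Today     = [System.DateTime]::Today.DayOfWeek
$Yesterday = [System.DateTime]::Today.AddDays(-1).DayOfWeek

$todayFiles     = Get-ChildItem (Join-Path $RootBackupFolder $Today) -Recurse -Filter *.sql
$yesterdayFiles = Get-ChildItem (Join-Path $RootBackupFolder $Yesterday) -Recurse -Filter *.sql

#Objects that exist on only one of the two days (added or dropped)
Compare-Object ($yesterdayFiles | Select-Object -ExpandProperty Name) `
               ($todayFiles | Select-Object -ExpandProperty Name)

#Objects present on both days whose definition has changed
foreach ($file in $todayFiles)
{
    $old = $yesterdayFiles | Where-Object { $_.Name -eq $file.Name }
    if ($old -and (Compare-Object (Get-Content $old.FullName) (Get-Content $file.FullName)))
    {
        Write-Output ("DDL changed: " + $file.Name)
    }
}
```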
4.4 Scheduling Script Execution
There are two methods for scheduling
script execution. The first is to use Windows Task Scheduler, which is
useful if you don’t have SQL Server installed on the server from which
you wish to execute the PowerShell script. You can simply add a new
task to the Scheduler and execute PowerShell.exe, passing the script
you want to execute as a parameter.
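For example, a daily task can be registered from an elevated prompt with schtasks.exe; the script path here is only an illustration, so adjust it to wherever you saved the script.

```powershell
#Register a daily 01:00 task that runs the extraction script
#The path to the .ps1 file is an example
schtasks /create /tn "Extract DDL" /sc DAILY /st 01:00 `
    /tr "powershell.exe -NoProfile -ExecutionPolicy Bypass -File C:\Scripts\ExtractDDL.ps1"
```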
For servers on which you have SQL Server 2008 R2
or later installed, you also have the option to execute PowerShell as a
SQL Server Agent job. This is easily achieved by creating a new step
for a job and selecting PowerShell from the Type drop-down box. You can
then enter the PowerShell script into the Command text box.
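If you would rather create the job from PowerShell than through the Management Studio dialogs, SMO's Agent classes can do the same thing. This is only a sketch, assuming a default local instance and an example script path; it creates the job and a single PowerShell-subsystem step, and leaves adding a schedule to you.

```powershell
#Load SMO; the Agent classes live in the same assembly
[reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null
$Server = New-Object ("Microsoft.SqlServer.Management.Smo.Server")
#Create the job itself
$Job = New-Object Microsoft.SqlServer.Management.Smo.Agent.Job($Server.JobServer, "Extract DDL")
$Job.Create()
$Job.ApplyToTargetServer($Server.Name)
#Add a single step that runs under the PowerShell subsystem
$Step = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep($Job, "Run extraction")
$Step.SubSystem = [Microsoft.SqlServer.Management.Smo.Agent.AgentSubSystem]::PowerShell
#The step command is the script body; the path is an example
$Step.Command = (Get-Content "C:\Scripts\ExtractDDL.ps1") -join "`r`n"
$Step.Create()
```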
Unfortunately, PowerShell job steps in SQL Server
2008 R2 were of limited use because they invoked PowerShell
version 1.0, so many scripts and modules would not work properly. If you
want to execute scripts written for PowerShell 2.0 from SQL Server 2008
R2, you are better off using the PowerShell.exe method described
earlier. Fortunately, this is resolved in SQL Server 2012, which
loads PowerShell 2.0.
The advantage of using SQL Server
Agent is that you may already have jobs running, and this approach
enables you to manage all your jobs in one place. You can also use the
logging functionality already built into SQL Server Agent to monitor
the execution of the PowerShell script.