"Well look at this. Appears we got here just in the nick of time. What does that make us?"
"Big damn heroes, Sir"
"Ain't. We. Just" - Mal and Zoe, Firefly - 'Safe'
When I went to SQL Saturday #94 here in Salt Lake City back in September, Chris Shaw (blog|twitter) did a presentation on a Utility Database - a database you, as the DBA, create to track certain things (disk free space, high-use sprocs, a parse of the daily error log, etc.).
With the gracious permission of Mr. Shaw, I took what he started and modified it for my particular needs, and this posting is about what I've done and how I'm using it.
Chris presented the same topic at the PASS Summit 2011 in Seattle in October. After the Summit, Buck Woody (blog|twitter) tweeted a #sqlchallenge to blog about something you learned at the PASS Summit.
So I'm cheating a little: while what I'm writing about was presented at the Summit, I'd actually learned it a few weeks before.
The Utility Database is a DB you create. In it, there are a few stored procedures to collect some data daily and email it to you every day to alert you to certain things that may be happening in your system. It also keeps a historical log of these alerts. To quote from his slide deck on what this is: "A user defined, created database. Keeps information on your server about your server. Keeps information you want as long as you want it. Presents information the way you want to see it. This is your database. Acts as a repository for you the DBA".
I had been sending myself some emails along these lines every day. I hadn't thought about recording the data for historical purposes or reporting from it in order to predict disk usage trends, etc.
The original scripts associated with this presentation did the following:
- Created two tables, the Daily Check List table (for today's issues) and a historical table to keep anything not from today.
- Gathered information on the local drives to record free space on each local drive on the server
- Captured the size of each datafile in each database
- Read the Error Log and parsed out certain messages that we don't want to see, recording the rest
- Queried Report Server Log to get information on number of reports executed
- Queried the index physical stats DMO to report index fragmentation
- Gathered statistics about the execution of stored procedures, including the reads, writes and elapsed times for each
- Called a CLR procedure to gather processor utilization information
- Emailed a summary to the DBA
He also went on to set up a basic report in Report Server to email a pretty formatted report to himself each morning with all the information from the Daily Check List.
This did some of what I wanted, but it didn't record everything I wanted, and it performed extra steps that I just didn't need.
I'd never be able to get a CLR Routine DLL past change management (code unseen from an unknown third party), so that was dropped. Index Maintenance is a scheduled task each month already, so I really didn't need that. The application I'm primarily responsible for doesn't use Report Server and has an extremely limited (single digit, if that) number of stored procedure calls. That also went.
The file information displayed also wasn't showing everything I wanted and I needed the output a little different. So I set out to make my version of this Utility Database.
First, a disclaimer. I'm relatively new to SQL Server and the work I've been doing in it has certainly not been in T-SQL - I'm a DBA, not a developer. I'm not necessarily doing things the most elegant way, but for once daily or once a month reporting, what I have below is working for me.
I've got all of this in one large script for my implementation, but I'll break it down and annotate it for the purposes of this article.
First step: Create the database.
-- This script defines the objects needed to track daily reporting
-- against all databases installed in this instance of SQL Server
--
-- Ideas in this script have been culled from a presentation by
-- Chris Shaw at SQL Saturday 94 in Salt Lake City (@SQLShaw on twitter,
-- http://chrisshaw.wordpress.com) and from a series of articles
-- on SQLServerCentral.com called SQL Overview.
-- The last article in the series is linked here and it
-- refers back to all of the others.
-- http://www.sqlservercentral.com/articles/Monitoring/69650/
--
-- The SQL Overview series gave lots of good information, but didn't keep
-- any of it for trending analysis. Chris Shaw's presentation gave ideas
-- on how to keep that in line.
--
-- First, create the database
CREATE DATABASE [UtilityDB] ON PRIMARY
( NAME = N'UtilityDB', FILENAME = N'G:\MSSQL\Data\UtilityDB.mdf',
  SIZE = 102400KB, MAXSIZE = UNLIMITED, FILEGROWTH = 102400KB )
LOG ON
( NAME = N'UtilityDB_log', FILENAME = N'G:\MSSQL\Data\UtilityDB_log.ldf',
  SIZE = 51200KB, MAXSIZE = 2048GB, FILEGROWTH = 51200KB )
GO
The next step was to create the tables that I'd be using for this database. I decided to create clustered indexes on them purely for performance on queries down the road. Over time, if never archived, these tables may grow to a significant size, and having a clustered index should help query performance since most of the queries are date based.
The first couple of tables are the Daily Check List and the Historical Check List tables, plus the tables for storing disk free space and file growth.
-- Create the tables used in this utility
USE [UtilityDB];

-- This Table contains the daily information emailed to the DBA.
-- It is cleared out first thing each morning and moved to the
-- cumulative history table
CREATE TABLE [dbo].[DayCheckList](
    [InstanceName] [varchar](50) NOT NULL,
    [EventStatus] [varchar](50) NOT NULL,
    [EventDate] [datetime] NULL,
    [EVENT] [varchar](2000) NOT NULL
) ON [PRIMARY]
GO
CREATE CLUSTERED INDEX IX_DayCheckList ON DayCheckList(EventDate)
GO

-- This is the cumulative history table. It contains the complete
-- history of events captured by this utility in the range of from
-- yesterday back until when data capture started.
CREATE TABLE [dbo].[HistoryCheckList](
    [InstanceName] [varchar](50) NOT NULL,
    [EventStatus] [varchar](50) NOT NULL,
    [EventDate] [datetime] NULL,
    [EVENT] [varchar](2000) NOT NULL
) ON [PRIMARY]
GO
CREATE CLUSTERED INDEX IX_HistoryCheckList ON HistoryCheckList(EventDate)
GO

-- This table holds a history of the daily size of your datafiles.
CREATE TABLE [dbo].[FileSizeGrowth](
    [InstanceName] [varchar](50) NOT NULL,
    [DBName] [varchar](50) NOT NULL,
    [FileName] [varchar](50) NOT NULL,
    [PhysicalName] [varchar](200) NULL,
    [SizeInMB] [int] NULL,
    [SizeInGB] [int] NULL,
    [ReadingDate] [datetime]
) ON [PRIMARY]
GO
CREATE CLUSTERED INDEX IX_FileSizeGrowth ON FileSizeGrowth(ReadingDate)
GO

-- Not a huge fan of default values myself, but we can place this
-- here just in case.
ALTER TABLE [dbo].[FileSizeGrowth] ADD DEFAULT (getdate()) FOR [ReadingDate]
GO

-- This is a work table used when populating the historical FileFreeSpace
-- table. It is truncated each day after calculating what is needed
CREATE TABLE [dbo].[FreeSpaceWork](
    [InstanceName] [varchar](50),
    [DBName] [varchar](50),
    [FileName] [varchar](50),
    [PhysicalName] [varchar](200) NULL,
    [TotalSizeInMB] [int] NULL,
    [FreeSizeInMB] [int] NULL
) ON [PRIMARY]
GO

-- This table keeps the daily tracking of the percentage of free space
-- in the data files. This data will be used in the monthly reporting
-- for size trending.
CREATE TABLE [dbo].[FileFreeSpace](
    [InstanceName] [varchar](50),
    [DBName] [varchar](50),
    [FileName] [varchar](50),
    [PhysicalName] [varchar](200) NULL,
    [TotalSizeInMB] [int] NULL,
    [FreeSizeInMB] [int] NULL,
    [PctFreeSpace] [int],
    [ReadingDate] [datetime]
) ON [PRIMARY]
GO
CREATE CLUSTERED INDEX IX_FileFreeSpace ON FileFreeSpace(ReadingDate)
GO

-- Again, not a huge fan of default values myself, but we can place this
-- here just in case.
ALTER TABLE [dbo].[FileFreeSpace] ADD DEFAULT (getdate()) FOR [ReadingDate]
GO
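To illustrate why the clustered indexes are on the date columns: most of the reporting queries ask for a range of dates, so a lookup like the one below should be able to seek on the index rather than scan the whole table. This is just a minimal sketch; the 30-day window is an arbitrary example, not part of the utility itself.

-- Sketch of the kind of date-based query the clustered indexes are
-- meant to help; the 30-day window is just an example.
SELECT InstanceName, EventStatus, EventDate, [EVENT]
FROM HistoryCheckList
WHERE EventDate >= DATEADD(dd, -30, GETDATE())
ORDER BY EventDate;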
One of the things I added to my version of this was an indication in the daily email that a database hadn't been backed up in a certain number of days. For most user databases, one would expect that you'd want a backup at least every 24 hours - but maybe you're happy only backing up your system databases once a week. The database control table below lists, for each database you are going to report on, how far in the past (in days) the most recent backup can be before we get upset and alert about it.
Also, there may be databases on the server that you don't want reported. Maybe they are test, development or training databases and you don't particularly care about them. The ExcludeFromReporting table just contains the database name of any database you never want to see reported. Then I went and populated the seed data into these tables.
-- This table defines the maximum numbers of days than can elapse after -- a database is backed up before it's reported as missing a backup. -- For example, system databases may be backed up weekly, but others -- are daily CREATE TABLE [dbo].[DatabaseBackupControl]( [DBName] [char](50) NOT NULL, [MaxDaysBetweenBackups] [int] NOT NULL ) ON [PRIMARY] -- We want the data in this table to be unique, so put a unique key here. CREATE UNIQUE CLUSTERED INDEX IX_DatabaseBackupControl ON DatabaseBackupControl (DBName) GO -- This table defines any databases we don't want included in reporting, -- such as test or development databases. CREATE TABLE [dbo].[ExcludeFromReporting]( [DBName] [char](50) NULL ) ON [PRIMARY] -- We want the data in this table to be unique, so put a unique key here. CREATE UNIQUE CLUSTERED INDEX IX_ExcludeFromReporting ON ExcludeFromReporting (DBName) -- Populate initial seed information into control tables. -- Control settings for checking for missing backups INSERT INTO DatabaseBackupControl VALUES ('myprod', 1) INSERT INTO DatabaseBackupControl VALUES ('master', 7) INSERT INTO DatabaseBackupControl VALUES ('model', 7) INSERT INTO DatabaseBackupControl VALUES ('msdb', 7) INSERT INTO DatabaseBackupControl VALUES ('monitoring', 7) INSERT INTO DatabaseBackupControl VALUES ('ReportServer', 1) -- Control settings for databases to exclude from checking INSERT INTO ExcludeFromReporting VALUES ('mydev'); INSERT INTO ExcludeFromReporting VALUES ('mytst'); INSERT INTO ExcludeFromReporting VALUES ('mytrn'); INSERT INTO ExcludeFromReporting VALUES ('tempdb'); INSERT INTO ExcludeFromReporting VALUES ('LiteSpeedLocal'); INSERT INTO ExcludeFromReporting VALUES ('ReportServerTempDB'); INSERT INTO ExcludeFromReporting VALUES ('RSExecutionLog');
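One housekeeping note: when a new database appears on the server, it needs a row in one of these two tables, or the backup check shown later will flag it as undefined. Something along these lines is all it takes (the database names here are hypothetical, just to show the idea):

-- Hypothetical example only: a new production database backed up nightly,
-- and a new scratch database we never want to hear about.
INSERT INTO DatabaseBackupControl VALUES ('mynewprod', 1);
INSERT INTO ExcludeFromReporting VALUES ('myscratch');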
Moving along! The next thing I created was a view to make reporting a little easier. I also wanted my report to show me each day any SQL Server Agent jobs that had failed in the previous day.
-- In order to simplify some of the reporting, views may be
-- created that are snapshots of system catalog tables or
-- views from msdb or master.
-- This view summarizes any SQL Server Agent job that failed yesterday.
-- The view is set up so that it can just be inserted into the daily
-- notification table when the daily procedure runs.
CREATE VIEW [dbo].[YesterdayFailedJobs] as
SELECT @@ServerName as ServerName,
       'Job Failed' as Event_Status,
       getdate() as Event_Date,
       cast(run_date as char(8)) + '/' + cast(run_time as char(6)) +
           ' Job: ' + name + ' failed with ' + message as EventMsg
FROM msdb..sysjobs a
JOIN (SELECT * from msdb..sysjobhistory where sql_severity <> 0) b
    ON a.job_id = b.job_id
WHERE b.run_date = cast(convert(char(8), getdate()-1, 112) as int)
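If you want to eyeball what the view returns before it gets wired into the daily job, a quick ad-hoc check from the UtilityDB context is enough:

-- Ad-hoc spot check of yesterday's failed jobs (run from UtilityDB).
SELECT * FROM dbo.YesterdayFailedJobs;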
OK, that defines all the tables and views. Next come the individual stored procedures that gather the various pieces of data. The first couple gather free space on the local drives and the size of the data files.
-- This stored procedure gathers information about the free space on
-- the drives attached to the server.
CREATE PROCEDURE [dbo].[GetDriveSpaceFree]
AS
DECLARE @driveSpace TABLE (drive CHAR(2), MBFree int)

INSERT INTO @driveSpace
EXEC sp_executesql N'xp_fixeddrives'

INSERT INTO UtilityDB..DayCheckList
SELECT @@ServerName,
       'Drive Space',
       GETDATE(),
       'Free space on ' + drive + ' is ' + CONVERT(VARCHAR(20), MBFree/1024) + ' Gigs'
FROM @driveSpace
GO

-- This stored procedure gathers information about the growth of your
-- data files.
CREATE Procedure [dbo].[GetFileSizeGrowth]
AS
insert into FileSizeGrowth
Select @@ServerName,
       db_name(database_id),
       name,
       physical_name,
       SizeInMB = (size * 8 / 1024),
       SizeInGB = (size * 8 / 1024 / 1024),
       GETDATE()
from sys.master_files
GO

CREATE Procedure [dbo].[GetFileFreeSpace]
AS
declare @RETURN_VALUE int
declare @command1 nvarchar(2000)
declare @yesterday datetime
declare @today datetime

set @command1 = 'use ? Select @@ServerName, ''?'', name, physical_name, TotalSizeInMB = size/128, FreeSizeInMB = size/128.0 - CAST(FILEPROPERTY(name, ''SpaceUsed'') AS int)/128.0 from ?.sys.database_files'

Insert Into FreeSpaceWork (InstanceName, DBName, FileName, PhysicalName,
    TotalSizeInMB, FreeSizeInMB)
exec @RETURN_VALUE = sp_MSforeachdb @command1 = @command1;

Insert Into FileFreeSpace (InstanceName, DBName, FileName, PhysicalName,
    TotalSizeInMB, FreeSizeInMB, PctFreeSpace, ReadingDate)
select InstanceName, DBName, FileName, PhysicalName, TotalSizeInMB, FreeSizeInMB,
       FreeSizeInMB * 100 / CASE WHEN TotalSizeInMB = 0 THEN 1 ELSE TotalSizeinMB END,
       GETDATE()
from FreeSpaceWork
where DBName not in (select DBName from ExcludeFromReporting);

select @yesterday = dateadd(day, datediff(day, 0, getdate()-1), 0)
select @today = dateadd(day, datediff(day, 0, getdate()), 0)

insert into DayCheckList
select InstanceName,
       'Free Space (' + DBName + ')',
       ReadingDate,
       rtrim(ltrim(cast(PctFreeSpace as Char(3)))) + '% Freespace in ' + PhysicalName
from FileFreeSpace
where ReadingDate between @yesterday and @today;

truncate table FreeSpaceWork;
GO
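These can be run by hand once to make sure the plumbing works before anything is scheduled. A quick smoke test might look like the following sketch; the EventStatus filter simply matches the text the procedures above write into the check list.

-- One-off smoke test: gather the space numbers, then peek at what landed.
EXECUTE dbo.GetDriveSpaceFree;
EXECUTE dbo.GetFileSizeGrowth;
EXECUTE dbo.GetFileFreeSpace;

SELECT * FROM DayCheckList
WHERE EventStatus = 'Drive Space' OR EventStatus LIKE 'Free Space%';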
The next stored procedure reads the SQL Server error log and filters out the things we don't want to see before recording the rest. I'm also recording a startup event for any database started in the last 7 days.
-- This stored procedure reads the SQL Server Error Log and parses
-- out information we don't want to see and reports the rest.
Create Procedure GetFilteredErrorLog
AS
DECLARE @Errorlog TABLE (LogDate datetime, ProcessorInfo VARCHAR(100), ErrorMSG VARCHAR(2000))

INSERT INTO @Errorlog
EXEC sp_executesql N'xp_readerrorlog'

Delete FROM @Errorlog
WHERE ErrorMSG LIKE '%Log was backed up%'
   OR ErrorMSG LIKE '%Setting database option COMPATIBILITY_LEVEL%'
   OR ErrorMSG LIKE '%were backed up%'
   OR ErrorMSG LIKE '%DBCC TRACEON%'
   OR ErrorMSG LIKE '%without errors%'
   OR ErrorMSG LIKE '%\ERRORLOG%'
   OR ErrorMSG LIKE '%Attempting to cycle errorlog%'
   OR ErrorMSG LIKE '%Errorlog has been reinitialized.%'
   OR ErrorMSG LIKE '%found 0 errors and repaired 0 errors.%'
   OR ErrorMSG LIKE '%without errors%'
   OR ErrorMSG LIKE '%This is an informational message%'
   OR ErrorMSG LIKE '%WARNING:%Failed to reserve contiguous memory%'
   OR ErrorMSG LIKE '%The error log has been reinitialized%'
   OR ErrorMSG LIKE '%Setting database option ANSI_WARNINGS%';

INSERT INTO UtilityDB..daychecklist
SELECT @@ServerName, 'Error Log', Logdate, SUBSTRING(ErrorMSG, 1, 2000)
FROM @Errorlog
WHERE LogDate > DATEADD(dd, -1, GETDATE())

INSERT INTO UtilityDB..daychecklist
SELECT @@ServerName, 'DB Restart', Logdate, ErrorMSG
FROM @Errorlog
WHERE ErrorMSG LIKE '%Starting up database%'
  AND LogDate > DATEADD(dd, -7, GETDATE())
GO
Next: let's check for failed SQL Server Agent jobs and for backups outside of our tolerances.
CREATE PROCEDURE [dbo].[CheckFailedJobs]
AS
declare @Count as int
select @Count = count(*) from YesterdayFailedJobs
if @Count > 0
Begin
    INSERT INTO daychecklist
    SELECT * FROM YesterdayFailedJobs
END
GO

-- This stored procedure checks recent backups against the control tables.
-- This query is a UNION of three selects, each checking different things.
-- If a database is found that has a backup and is not defined on the
--   control table, it is reported (first select)
-- If a database is found where the last backup is outside the recovery
--   window, it is reported (second select)
-- If a database is found that is not defined in either backup control
--   table, it is reported (third select)
CREATE PROCEDURE CheckRecentBackups
AS
insert into DayCheckList
select @@ServerName, 'Missing Backup', getdate(),
       a.database_name + ' last backed up ' + convert(char(20), max(backup_finish_date), 100)
           + ' and control record is not defined'
from (select * from msdb.dbo.backupset where type <> 'L') a
JOIN sys.databases b ON a.database_name = b.name
LEFT OUTER JOIN DatabaseBackupControl c ON a.database_name = c.DBName
where database_name not in (select DBName from ExcludeFromReporting)
  and MaxDaysBetweenBackups is NULL
group by database_name, MaxDaysBetweenBackups
UNION
select @@ServerName, 'Missing Backup', getdate(),
       a.database_name + ' last backed up ' + convert(char(20), max(backup_finish_date), 100)
           + ' limit is ' + cast(MaxDaysBetweenBackups as char(2)) + ' day(s)'
from (select * from msdb.dbo.backupset where type <> 'L') a
JOIN sys.databases b ON a.database_name = b.name
LEFT OUTER JOIN DatabaseBackupControl c ON a.database_name = c.DBName
where database_name not in (select DBName from ExcludeFromReporting)
group by database_name, MaxDaysBetweenBackups
having getdate() > dateadd(day, MaxDaysBetweenBackups, max(backup_finish_date))
UNION
select @@ServerName, 'Missing Backup', getdate(),
       name + ' not defined in either Control Table or Exclude table. ' + 'Please investigate.'
from sys.databases
where name not in (select DBName from ExcludeFromReporting)
  and name not in (select DBName from DatabaseBackupControl)
GO
Almost there. This next procedure is the one that will query the daily check list table, format its contents nicely into an HTML table and email it to you.
-- This stored procedure will take the contents of the Daily log table
-- and email it formatted as an HTML table to interested parties.
CREATE PROCEDURE [dbo].[SendDailyEmail]
as
DECLARE @Subject varchar(100)
SET @Subject = 'SQL Server - Daily Event Log'

declare @Count as int
select @Count = count(*) from DayCheckList

IF @Count > 0
Begin
    DECLARE @tableHTML NVARCHAR(MAX);

    SET @tableHTML =
        N'<table border="1">' +
        N'<tr><th>InstanceName</th>' +
        N'<th width="150">EventStatus</th>' +
        N'<th width="180">EventDate</th>' +
        N'<th>Event</th>' +
        N'</tr>' +
        CAST ( ( SELECT td=[InstanceName], ''
                       ,td=[EventStatus], ''
                       ,td=[EventDate], ''
                       ,td=[Event], ''
                 from DayCheckList
                 order by InstanceName, EventStatus
                 FOR XML PATH('tr'), TYPE
               ) AS NVARCHAR(MAX) ) +
        N'</table>';

    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'SQL Server Agent Mail Profile',
        @recipients = 'me@mydomain.com',
        @subject = @Subject,
        @body = @tableHTML,
        @body_format = 'HTML';
END
GO
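One assumption buried in here is that Database Mail is already enabled and that a profile called 'SQL Server Agent Mail Profile' exists; swap in whatever profile name and recipient you actually use. A quick sanity check before relying on the daily email might look like this (again, the profile name and address below are assumptions):

-- Sanity check that the mail profile actually sends (profile name and
-- recipient below are assumptions; substitute your own).
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'SQL Server Agent Mail Profile',
    @recipients   = 'me@mydomain.com',
    @subject      = 'UtilityDB mail test',
    @body         = 'If you can read this, Database Mail is working.';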
Almost done. This stored procedure will be run daily via a SQL Server Agent job to generate the email.
-- Finally, this is the stored procedure that ties it all together.
-- This procedure should be executed daily as a job
CREATE PROCEDURE [dbo].[DayCheck]
AS
------------------------Maintenance Work---------------------------------
INSERT INTO HistoryCheckList
SELECT * FROM DayChecklist

IF @@error = 0
BEGIN
    Delete UtilityDB..DayCheckList
END

---------------------------Disk Space------------------------------------
EXECUTE GetDriveSpaceFree
EXECUTE GetFileSizeGrowth
EXECUTE GetFileFreeSpace

--------------------------Read Error Log---------------------------------
EXECUTE GetFilteredErrorLog

--------------------------Check Failed Jobs------------------------------
EXECUTE CheckFailedJobs

--------------------------Check Recent Backups---------------------------
EXECUTE CheckRecentBackups

--------------------------Send Daily Email-------------------------------
EXECUTE SendDailyEmail
GO
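Before scheduling it, I'd suggest running it once by hand to make sure everything is wired up end to end; you should get the daily email and see rows in the check list:

-- One-off manual run of the whole daily collection.
EXECUTE dbo.DayCheck;
SELECT * FROM DayCheckList ORDER BY EventStatus, EventDate;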
Finally, set up a SQL Server Agent job to run this last Stored Procedure (DayCheck) at some time that works for you.
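There's nothing exotic about the job itself. If you'd rather script it than click through the GUI, a minimal sketch looks like the following; the job name and the 6 AM run time are just assumptions, so adjust them to whatever suits your environment.

-- Minimal sketch of the daily Agent job; name and schedule are assumptions.
USE msdb;
GO
EXEC dbo.sp_add_job
    @job_name = N'UtilityDB - Daily Check';
EXEC dbo.sp_add_jobstep
    @job_name      = N'UtilityDB - Daily Check',
    @step_name     = N'Run DayCheck',
    @subsystem     = N'TSQL',
    @database_name = N'UtilityDB',
    @command       = N'EXECUTE dbo.DayCheck;';
EXEC dbo.sp_add_jobschedule
    @job_name          = N'UtilityDB - Daily Check',
    @name              = N'Daily at 6 AM',
    @freq_type         = 4,       -- daily
    @freq_interval     = 1,       -- every 1 day
    @active_start_time = 060000;  -- 06:00:00
EXEC dbo.sp_add_jobserver
    @job_name = N'UtilityDB - Daily Check';
GO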
So, collecting all this data is wonderful, but what about using some of it to generate a report with trending information on data growth and the like? I created this query, which I run on the first of each month, to show me the growth of the files and the growth of the data within them.
Note the commented lines at the bottom of the query. There are all sorts of things you can do here. You can exclude the databases included in the ExcludeFromReporting table. You can ignore log files (assuming they have been named *.LDF). You could also pick up only those records where there was growth of some sort, etc.
Use UtilityDB
go

IF EXISTS (SELECT name FROM sys.objects WHERE name = 'MonthlyReport')
    DROP TABLE MonthlyReport;

DECLARE @Today DATETIME,
        @BDATE DATETIME,
        @EDATE DATETIME,
        @MINDATE DATETIME,
        @MAXDATE DATETIME,
        @DAYS DECIMAL;

-- Today's date, modified to be midnight.
SELECT @Today = dateadd(dd, datediff(dd, 0, getdate())+0, 0)
-- One Month back from today
SELECT @Today = DATEADD(month, -1, @Today);
-- First of last month
SELECT @BDATE = DATEADD(dd, -(DAY(DATEADD(mm,1,@Today))-1), DATEADD(mm,0,@Today))
-- First of this month
SELECT @EDATE = DATEADD(dd, -(DAY(DATEADD(mm,1,@Today))-1), DATEADD(mm,1,@Today))
-- Calculate days in the month
SELECT @DAYS = DAY(DATEADD(m,1,DATEADD(d,1-DAY(GETDATE()),GETDATE()))-1)

-- Calculate the first and last dates in the date range
-- and adjust both to be midnight
SELECT @MINDATE = min(ReadingDate) from FileFreeSpace
    where ReadingDate >= @BDATE and ReadingDate < @EDATE;
SELECT @MAXDATE = max(ReadingDate) from FileFreeSpace
    where ReadingDate >= @BDATE and ReadingDate < @EDATE;
SELECT @MINDATE = dateadd(dd, datediff(dd, 0, @MINDATE)+0, 0)
SELECT @MAXDATE = dateadd(dd, datediff(dd, 0, @MAXDATE)+0, 0)

select a.InstanceName,
       a.DBName,
       a.FileName,
       a.PhysicalName,
       a.TotalSizeInMB as StartSize,
       b.TotalSizeInMB as EndSize,
       (b.TotalSizeInMB - a.TotalSizeInMB) as Growth,
       (a.TotalSizeInMB - a.FreeSizeInMB) as StartUsed,
       (b.TotalSizeInMB - b.FreeSizeInMB) as EndUsed,
       (b.TotalSizeInMB - b.FreeSizeInMB) - (a.TotalSizeInMB - a.FreeSizeInMB) as DiffUsed,
       CAST((((b.TotalSizeInMB - b.FreeSizeInMB) - (a.TotalSizeInMB - a.FreeSizeInMB)) / @Days) as DECIMAL(9,2)) as AvgGrowthDay,
       (b.FreeSizeInMB * 100) / CASE b.TotalSizeInMB WHEN 0 THEN 1 ELSE b.TotalSizeInMB END as PctFreeEOM
into MonthlyReport
from (select * from FileFreeSpace
      where dateadd(dd, datediff(dd,0,ReadingDate)+0, 0) = @MINDATE) a,
     (select * from FileFreeSpace
      where dateadd(dd, datediff(dd,0,ReadingDate)+0, 0) = @MAXDATE) b
where a.InstanceName = b.InstanceName
  and a.DBName = b.DBName
  and a.FileName = b.FileName
  and a.PhysicalName = b.PhysicalName
  and a.DBName not in (select DBName from ExcludeFromReporting)
  and upper(a.PhysicalName) not like '%.LDF'
order by a.DBName, a.FileName
GO

-- and DBName not in (select DBName from ExcludeFromReporting)
-- Having ((max(FreeSizeInMB) - min(FreeSizeInMB)) / @Days)
-- and upper(a.PhysicalName) not like '%.LDF'

DECLARE @Subject varchar(100)
SET @Subject = 'SQL Server - Monthly File Growth Statistics'

declare @Count as int
select @Count = count(*) from MonthlyReport

IF @Count > 0
Begin
    DECLARE @tableHTML NVARCHAR(MAX);

    SET @tableHTML =
        N'<table border="1">' +
        N'<tr><th>InstanceName</th>' +
        N'<th>DBName</th>' +
        N'<th>FileName</th>' +
        N'<th>PhysicalName</th>' +
        N'<th>StartSize</th>' +
        N'<th>EndSize</th>' +
        N'<th>Growth</th>' +
        N'<th>StartUsed</th>' +
        N'<th>EndUsed</th>' +
        N'<th>DiffUsed</th>' +
        N'<th>AvgGrowthDay</th>' +
        N'<th>PctFreeEOM</th>' +
        N'</tr>' +
        CAST ( ( SELECT td=[InstanceName], ''
                       ,td=[DBName], ''
                       ,td=[FileName], ''
                       ,td=[PhysicalName], ''
                       ,td=[StartSize], ''
                       ,td=[EndSize], ''
                       ,td=[Growth], ''
                       ,td=[StartUsed], ''
                       ,td=[EndUsed], ''
                       ,td=[DiffUsed], ''
                       ,td=[AvgGrowthDay], ''
                       ,td=[PctFreeEOM], ''
                 from MonthlyReport
                 order by DBName, FileName
                 FOR XML PATH('tr'), TYPE
               ) AS NVARCHAR(MAX) ) +
        N'</table>';

    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'SQL Server Agent Mail Profile',
        @recipients = 'me@mydomain.com',
        @subject = @Subject,
        @body = @tableHTML,
        @body_format = 'HTML';
END
GO

IF EXISTS (SELECT name FROM sys.objects WHERE name = 'MonthlyReport')
    DROP TABLE MonthlyReport;
So this is an example of the kind of things that I'm using this sort of data for. Hopefully someone else will find this useful too.
Any suggestions for improvement, please leave comments!