Tuesday, September 20, 2016

Sometimes I want to run a bunch of statements (like updating a set of statistics or rebuilding a number of indexes) and I want to reduce the chance that these statements will interfere with other processes in the database. I created a script that loops over each statement to be executed and checks whether the database is currently in use (i.e., has any "runnable" processes). If it does, it waits until all "runnable" processes have completed, then runs the statement and starts over with the next one.

I know that for many this won't be of value because your systems are too busy for it to make sense, and I also know that a new process could start while I'm running one of my statements, so this isn't fool-proof by any means.
-- ========================================================================
-- Author:  Eric Zierdt
-- Create date: 9/20/2016
-- Description: Checks database for running processes and Runs 
--    series of statements when free and emails when complete
-- ========================================================================


/*********************************************************************
 **  CODE TO CREATE THE TABLE TO STORE THE QUERIES YOU WANT TO RUN **
 *********************************************************************/
/*
IF EXISTS (SELECT * FROM tempdb.dbo.sysobjects WHERE ID = OBJECT_ID(N'tempdb..##TABLE'))
BEGIN
 DROP TABLE ##TABLE
END
CREATE TABLE ##TABLE (ID INT IDENTITY(1,1), Code VARCHAR(MAX),Complete BIT DEFAULT 0)
INSERT INTO ##TABLE
( Code )
VALUES
('UPDATE STATISTICS [Table] [Stat] WITH FULLSCAN')
*/
SET NOCOUNT ON

/*****************************************************************************************
 **  CREATING ##StartTime TABLE TO BE USED IN REMAINING TIME ESTIMATES IN OTHER QUERY. **
 **  SETTING TO GLOBAL TEMP TABLE SO IT IS ACCESSIBLE IN ANOTHER WINDOW                **
 *****************************************************************************************/
IF EXISTS (SELECT * FROM tempdb.dbo.sysobjects WHERE ID = OBJECT_ID(N'tempdb..##StartTime'))
BEGIN
 DROP TABLE ##StartTime
END
CREATE TABLE ##StartTime (StartTime DATETIME)
INSERT INTO ##StartTime VALUES  ( GETDATE() )
PRINT 'Start Time: ' + CONVERT(VARCHAR(25), GETDATE(), 101) + ' ' + CONVERT(VARCHAR(25), GETDATE(), 108)
GO

/*****************************************************************************************
 **  MAIN LOOP: WAIT UNTIL THE DATABASE IS FREE, THEN RUN EACH QUEUED STATEMENT        **
 *****************************************************************************************/
DECLARE @ID INT
  ,@SQL VARCHAR(MAX)
  ,@StartTime DATETIME = GETDATE()
  ,@EndTime DATETIME
WHILE 
(
 SELECT COUNT(1)
 FROM ##TABLE
 WHERE Complete = 0
) >= 1
BEGIN 
 WHILE 
 (
  /**************************************************************
   **  CHECKING TO SEE IF DATABASE HAS NO 'RUNNABLE' PROCESSES **
   **************************************************************/
  SELECT  COUNT(1)
  FROM    sys.sysprocesses S
  WHERE   status = 'runnable'
    AND DB_NAME(S.dbid) = DB_NAME()
    AND S.spid <> @@SPID
 ) > 0   
 BEGIN   
  WAITFOR DELAY '00:00:05';  ---CHANGE TO MEET YOUR NEEDS; THIS IS SAYING, RETRY EVERY 5 SECONDS TO SEE IF DATABASE IS FREE TO DO WORK
 END
 /**************************************************
  ** NO 'RUNNABLE' PROCESSES DETECTED, RUN QUERY **
  **************************************************/
 SELECT TOP 1 @ID=ID, @SQL=Code FROM ##TABLE WHERE Complete = 0
 EXEC(@SQL)
 UPDATE ##TABLE SET Complete = 1 WHERE ID = @ID
END

/*********************************************
 ** SET END TIME AND PRINT RUN TIME STATS **
 *********************************************/
SET @EndTime = GETDATE()
PRINT 'End Time: ' + CONVERT(VARCHAR(25), @EndTime, 101) + ' ' + CONVERT(VARCHAR(25), @EndTime, 108)
PRINT 'Run Time: ' + CONVERT(VARCHAR, DATEADD(ms,DATEDIFF(ms,@StartTime,@EndTime),0),108)

/*************
 ** CLEANUP **
 *************/
DROP TABLE ##TABLE
DROP TABLE ##StartTime

/*********************************************
 ** SEND EMAIL NOTIFICATION OF COMPLETION **
 *********************************************/
DECLARE @buffer_memory_used_MB FLOAT
  ,@Body VARCHAR(MAX)
  ,@Subject VARCHAR(255) = 'Auto Run Script Complete'
  ,@To VARCHAR(512) = ''
SET @Body = '
The script has completed processing queries.
Run Time: ' + CONVERT(VARCHAR, DATEADD(ms,DATEDIFF(ms,@StartTime,@EndTime),0),108)

EXEC msdb.dbo.sp_send_dbmail
  @recipients = @To
 ,@subject = @Subject
 ,@body = @Body
 ,@body_format = 'HTML'

In another window, I will often run the following to get some stats and an idea of how long the main script will run. Note that it assumes ##TABLE also has a [RowCount] column holding the number of rows each statement touches:
DECLARE @Completed FLOAT
  ,@CompletedStatements FLOAT
  ,@Total FLOAT
  ,@TotalStatements FLOAT
  ,@PercentComplete FLOAT 
  ,@StartDate DATETIME

SELECT @StartDate=StartTime FROM ##StartTime
DECLARE @RunTime BIGINT = DATEDIFF(SECOND,@StartDate,GETDATE())

SELECT @CompletedStatements = COUNT(1) FROM ##Table WHERE Complete = 1
SELECT @TotalStatements = COUNT(1) FROM ##Table
SELECT @Completed = SUM([RowCount]) FROM ##Table WHERE Complete = 1
SELECT @Total = SUM([RowCount]) FROM ##Table
SELECT @PercentComplete = 100*(@Completed/@Total)
DECLARE @SecondPerRecord FLOAT = CAST(@RunTime AS FLOAT)/@Completed
DECLARE @EstRemainingSeconds BIGINT = @SecondPerRecord*(@Total-@Completed)
SELECT @Total TotalRecords, @Completed CompletedRecords, @TotalStatements as TotalStatements, @CompletedStatements as CompletedStatements, @PercentComplete [% Complete], @RunTime RunTimeSeconds, @SecondPerRecord SecondPerRecord, @EstRemainingSeconds EstRemainingSeconds
  ,@EstRemainingSeconds/60.0 EstRemainingMins, @EstRemainingSeconds/3600.0 EstRemainingHours
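The estimate above is simple linear extrapolation: elapsed seconds divided by completed rows gives a per-row cost, which is then multiplied by the rows remaining. A minimal Python sketch of the same arithmetic (the numbers are made up for illustration, not tied to the SQL):

```python
# Linear time-remaining estimate, mirroring the T-SQL monitoring query above.
def estimate_remaining(completed_rows, total_rows, run_time_seconds):
    """Return (percent_complete, seconds_per_row, est_remaining_seconds)."""
    percent_complete = 100.0 * completed_rows / total_rows
    seconds_per_row = run_time_seconds / completed_rows
    est_remaining = seconds_per_row * (total_rows - completed_rows)
    return percent_complete, seconds_per_row, est_remaining

pct, spr, remaining = estimate_remaining(completed_rows=2500,
                                         total_rows=10000,
                                         run_time_seconds=500)
print(pct, spr, remaining)  # 25.0 0.2 1500.0
```

The estimate is only as good as the assumption that every row costs the same, which is why I re-run the query periodically rather than trusting one reading.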

Please comment below and let me know if you found value in this post.

Eric

Wednesday, March 9, 2016

Get Fast Rowcount

I came up with this little gem a while ago. I use it when I'm trying to get a row count on a table so large that a simple SELECT COUNT(1) FROM xxx takes a long time to return data.

DECLARE @TableName sysname
SET @TableName = 'TableName'

SELECT SUM(row_count) AS [RowCount]
FROM sys.dm_db_partition_stats
WHERE object_id=OBJECT_ID(@TableName)   
AND (index_id=0 or index_id=1);

Thursday, March 3, 2016

Script - Email on Job Completion

Sometimes I have long-running agent jobs (say, a backup job) and I don't want to do work on the server while that specific job is running; so I've come up with a quick and easy script that uses sp_send_dbmail to monitor for the job's completion and then email me. I've done similar things with manual tasks that take a while (index rebuilds or statistics updates) so I'm notified when the script completes.

-- ===============================================
-- Author:  Eric Zierdt
-- Create date: 3/3/2016
-- Description: Emails upon agent job completion
-- ===============================================
DECLARE @JOB_NAME SYSNAME = N'AdventureWorks Backup'; -- NAME OF AGENT JOB
 
WHILE EXISTS
(     
 SELECT  1
 FROM    msdb.dbo.sysjobs_view job
 JOIN msdb.dbo.sysjobactivity activity ON job.job_id = activity.job_id
 JOIN msdb.dbo.syssessions session ON session.session_id = activity.session_id
 JOIN ( 
    SELECT   MAX(agent_start_date) AS max_agent_start_date
    FROM     msdb.dbo.syssessions
   ) session_max ON session.agent_start_date = session_max.max_agent_start_date
 WHERE   activity.run_Requested_date IS NOT NULL
   AND activity.stop_execution_date IS NULL
   AND job.name = @JOB_NAME 
   --AND activity.start_execution_date > CAST(GETDATE() AS DATE)
) 
BEGIN      
     WAITFOR DELAY '00:00:45'; -- SET HOW OFTEN YOU WANT YOUR JOB TO CHECK
END 

DECLARE @buffer_memory_used_MB FLOAT
  ,@Body VARCHAR(MAX)
  ,@Subject VARCHAR(255) = 'Job Complete'
  ,@To VARCHAR(512) = 'user@email.com' -- INSERT YOUR EMAIL ADDRESS HERE
SET @Body = '
The monitored job has completed running.'

EXEC msdb.dbo.sp_send_dbmail
  @recipients = @To
 ,@subject = @Subject
 ,@body = @Body
 ,@body_format = 'HTML'

Tuesday, October 13, 2015

Statistic Repository

I have seen posts from many great SQL masters saying that the first thing they do when diagnosing a performance problem is to update or review the stats. I too have seen the power of updated stats, so when an internal customer of mine was having some issues with stats, it became helpful to create a Stats Repository to track and monitor changes in statistics.

The idea is to record, once a day, the stats info for each statistic in a given database. This can be done across servers (Prod, Test, Dev, etc.) if you want to compare why something runs better on one server than on another.

The first step is to create a holding table. If you have a DBA database for maintenance objects, put it there; for this post I'll use master:

------------------------------------------------------
-- SETUP STORAGE TABLE TO STORE DAILY STATS DETAILS --
------------------------------------------------------
USE [master]
GO

/****** Object:  Table [dbo].[StatsHistory]  ******/
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

SET ANSI_PADDING ON
GO

CREATE TABLE [dbo].[StatsHistory](
 [ServerName] [sysname] NULL,
 [DatabaseName] [sysname] NULL,
 [TableName] [sysname] NULL,
 [StatName] [sysname] NOT NULL,
 [Updated] [datetime] NULL,
 [TableRows] [int] NULL,
 [RowsSampled] [int] NULL,
 [PercentSampled] [float] NULL,
 [Steps] [int] NULL,
 [Density] [float] NULL,
 [AverageKeyLength] [float] NULL,
 [StringIndex] [varchar](3) NULL,
 [FilterExpression] [varchar](512) NULL,
 [UnfilteredRows] [int] NULL,
 [StatColumns] [varchar](500) NULL,
 [BatchRunID] [int] NULL,
 [CreateDate] [datetime] NULL
) ON [PRIMARY]
The next step is to create a function that concatenates the columns of a statistic into a list. Because of how sys.stats_columns works, you'll need to put this in every database you want to monitor:

-----------------------------------------------------
-- NEEDS TO BE RUN FOR EVERY DB YOU ARE MONITORING --
--     THIS FUNCTION CREATES A LIST OF COLUMNS     --
--       INCLUDED IN THE PASSED IN STATISTIC       --
-----------------------------------------------------

USE [] -- INSERT THE NAME OF THE DATABASE YOU WANT TO MONITOR
GO

/****** Object:  UserDefinedFunction [dbo].[StatColumnIDToList]    Script Date: 12/19/2014 10:11:01 ******/
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

-- =============================================
-- Author:  Eric Zierdt
-- Create date: 12/9/2014
-- Description: Converts the ColumnIDs to a List
-- USE: SELECT dbo.StatColumnIDToList(357576312,2)
-- =============================================
CREATE FUNCTION [dbo].[StatColumnIDToList] 
(
 -- Add the parameters for the function here
 @ObjectID INT
 ,@StatsID INT
)
RETURNS VARCHAR(2000)
AS
BEGIN
 -- Declare the return variable here
 DECLARE @List VARCHAR(2000)

 SELECT @List = COALESCE(@List + ',', '') + Cast(SC.column_id As varchar(5))
 FROM sys.stats_columns SC 
 WHERE SC.object_id = @ObjectID AND SC.stats_id = @StatsID

 -- Return the result of the function
 RETURN @List

END
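The COALESCE trick above is what avoids a leading comma: on the first row @List is NULL, so COALESCE yields the empty string; on every later row it yields the list-so-far plus a comma. A hedged sketch of the same accumulation pattern in Python (illustrative values only):

```python
# Same accumulate-with-separator pattern as the T-SQL COALESCE trick:
# the first item gets no leading comma, every later item does.
def id_list(column_ids):
    result = None                # plays the role of @List starting as NULL
    for cid in column_ids:
        prefix = (result + ",") if result is not None else ""
        result = prefix + str(cid)
    return result

print(id_list([3, 1, 7]))  # 3,1,7
```

Like the T-SQL version, an empty input yields no list at all (None/NULL) rather than an empty string.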
The next step is to create a job which you will schedule to run daily. This job should execute the following code to generate the stat info and insert it into the table we made in step 1:

-- =================================================
-- Author:  Eric Zierdt
-- Create date: 12/9/2014
-- Description: INSERTS STATS DATA INTO REPOSITORY
-- =================================================
DECLARE @ServerName VARCHAR(120)
  ,@DBName VARCHAR(120)
  ,@TableName VARCHAR(120)
  ,@StatName VARCHAR(120)
  ,@SQL VARCHAR(MAX)
  ,@BatchRunID INT
  ,@ColumnList VARCHAR(2000)
  
SELECT @BatchRunID = ISNULL(MAX(BatchRunID), 0) + 1 -- ISNULL HANDLES THE FIRST RUN, WHEN THE TABLE IS EMPTY
FROM master.dbo.StatsHistory
  

CREATE TABLE #StatsTable (
 ServerName SYSNAME NULL
 ,DatabaseName SYSNAME NULL
 ,TableName SYSNAME NULL
 ,StatName SYSNAME
 ,Updated DATETIME
 ,TableRows INT
 ,RowsSampled INT
 ,PercentSampled AS CAST(RowsSampled AS FLOAT)/CAST(TableRows AS FLOAT)*100 
 ,Steps INT
 ,Density FLOAT
 ,AverageKeyLength FLOAT
 ,StringIndex VARCHAR(3)
 ,FilterExpression VARCHAR(512)
 ,UnfilteredRows INT
 ,StatColumns VARCHAR(500)
 ,BatchRunID INT
 ,CreateDate DATETIME
)

CREATE NONCLUSTERED INDEX [IDX_StatTable_Name] ON #StatsTable
(
 [TableName] ASC
) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, DROP_EXISTING = OFF, ONLINE = OFF
  , ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [Primary]

DECLARE ZCursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT  @@SERVERNAME AS ServerName, DB_NAME() AS DBName, OBJECT_NAME(object_id) AS TableName 
           ,name AS StatName, dbo.StatColumnIDToList(object_id,stats_id)
    FROM    sys.stats
    WHERE   OBJECT_NAME(object_id) NOT LIKE 'sys%'
            AND OBJECT_NAME(object_id) NOT LIKE 'MS%'
            AND OBJECT_NAME(object_id) NOT LIKE 'queue%' 
            AND OBJECT_NAME(object_id) NOT LIKE 'filestream%' 
            --AND OBJECT_NAME(object_id) = 'MSdistribution_agents'
OPEN ZCursor 
FETCH NEXT FROM ZCursor INTO @ServerName, @DBName, @TableName,@StatName,@ColumnList
 WHILE @@FETCH_STATUS = 0
 BEGIN
  --PRINT @StatName
  SET @SQL = 'DBCC SHOW_STATISTICS (' + @TableName + ', "' + @StatName + '") WITH STAT_HEADER, NO_INFOMSGS;'
  --PRINT @SQL
  INSERT INTO #StatsTable
  (
   StatName
   ,Updated
   ,TableRows
   ,RowsSampled
   ,Steps
   ,Density
   ,AverageKeyLength
   ,StringIndex
   ,FilterExpression
   ,UnfilteredRows
  )
  EXEC(@SQL)

  UPDATE #StatsTable
  SET TableName = @TableName
   ,DatabaseName = @DBName
   ,ServerName = @ServerName
   ,BatchRunID = @BatchRunID
   ,StatColumns = @ColumnList
   ,CreateDate = GETDATE()
  WHERE TableName IS NULL

  FETCH NEXT FROM ZCursor INTO @ServerName, @DBName, @TableName,@StatName,@ColumnList
 END

CLOSE ZCursor
DEALLOCATE ZCursor

--SELECT * FROM #StatsTable WHERE TableRows <> RowsSampled ORDER BY TableName

INSERT INTO master.dbo.StatsHistory
(
 ServerName
 ,DatabaseName
 ,TableName
 ,StatName
 ,Updated
 ,TableRows
 ,RowsSampled
 ,PercentSampled
 ,Steps
 ,Density
 ,AverageKeyLength
 ,StringIndex
 ,FilterExpression
 ,UnfilteredRows
 ,StatColumns
 ,BatchRunID
 ,CreateDate 
)
SELECT ServerName
  ,DatabaseName
  ,TableName
  ,StatName
  ,Updated
  ,TableRows
  ,RowsSampled
  ,PercentSampled
  ,Steps
  ,Density
  ,AverageKeyLength
  ,StringIndex
  ,FilterExpression
  ,UnfilteredRows
  ,StatColumns
  ,BatchRunID
  ,CreateDate 
FROM #StatsTable

DROP TABLE #StatsTable
Now you can do some fun things with your repository table, like this:

-- =====================================================
-- Author:  Eric Zierdt
-- Create date: 12/9/2014
-- Description: SAMPLE CODE TO QUERY STATS REPOSITORY
-- =====================================================
DECLARE @BatchRunID INT
SELECT  @BatchRunID = MAX(BatchRunID) FROM master.dbo.StatsHistory

/**  SEE ALL STATS **/
SELECT  *
FROM    master.dbo.StatsHistory
WHERE BatchRunID = @BatchRunID

/**  SEE ALL STATS OLDER THAN 1 MONTH  **/
SELECT  *
FROM    master.dbo.StatsHistory
WHERE BatchRunID = @BatchRunID
  AND Updated < DATEADD(MONTH,-1,GETDATE())

/**  SEE ALL NON-FULLSCAN STATS  **/
SELECT  *
FROM    master.dbo.StatsHistory
WHERE BatchRunID = @BatchRunID
  AND PercentSampled < 100


/**  CREATE UPDATE STAT SCRIPT FOR ALL NON-FULLSCAN STATS  **/
SELECT  TableName, StatName, TableRows, PercentSampled, 'UPDATE STATISTICS dbo.' + TableName + ' ' + StatName + ' WITH FULLSCAN --' + CAST(TableRows AS VARCHAR(10)) AS SQLCode
FROM    master.dbo.StatsHistory
WHERE   BatchRunID = @BatchRunID
  AND PercentSampled < 100

Tuesday, December 10, 2013

Review of SQL Elements by Idera

Idera SQL Elements Review

I work for a large company with well over 600 instances of SQL Server installed, and we are constantly adding more. Managing our portfolio of instances is a challenge, so to help ourselves out we created a series of tables that store server/instance information. Daily, an SSIS job and PowerShell scripts interrogate the various instances to collect basic information about them, like number and names of databases, disk size, up/down status, backup information, etc. We have been looking for a commercial solution to manage this for us and give us more options for a few years, but have been unable to find such a product.

Two months ago I was contacted by an Idera salesperson who wanted to know if I was interested in one of their fine products. While we were talking, I told him what we were looking for in an inventory tool and asked if they had anything that would fit our needs. I expected to hear that they didn't, and that what I wanted was a CMDB (configuration management database) solution; however, that isn't what he told me. He told me that Idera was going to debut a new solution called SQL Elements at PASS Summit that would do exactly what we needed, and he told me to sign up for the Beta program. I was a bit skeptical at first, but I signed up and hoped for the best.

My first impression was a bit of uncertainty. Upon signing up for the Beta, I was placed on a wait list until I could be approved. I was afraid this might take a while, so I tried to get as much information about the product off the Idera website as I could. I had limited success finding anything, which I'm sure was because, as a beta product, there wasn't much information to put out yet; I tried to sign up for a webcast but had issues with that as well. After only a few days of waiting, I was approved for the Beta and given access to the Beta Feedback system.

I dusted off my SQL Server 2012 evaluation VM and installed the product, which was very simple. A few short minutes later I had a webpage up asking for my login. It turns out the system uses AD credentials to log in, and as the person who installed it, mine were pre-configured. I logged in, was given some basic instructions, and was asked to enter a list of instances to start monitoring. The process was pretty easy: you can enter a list of instances by separating them with semicolons. Next you are asked which credentials to use to connect to them; if you used a global AD group or user account on all your instances, this is easy if you set up Elements with that account, since it will use the default account. You can then enter some specific information about each instance, such as Owner, Location, and comments, and choose to assign a series of tags (I used the following tags: Production, Development, Test, DR). Once you confirm your request, Elements starts investigating the instances and reporting back health checks and other basic instance information.

Overall the interface is pretty simple. The system has three main sections: a Dashboard view, the Explorer view, and the Instances view. The Dashboard is a simple page showing how many of your monitored instances are up or down, Health Check Recommendations, Top Databases by Size, and Environment Statistics.

The Explorer View allows you to gather some information about your environment based on some filters. You can see Instance Count, DB Count, DB Size, Log Size and Activity information all correlated by either Instance, Owner, Location, Version, Database or by a custom Tag. You can filter the results by these last few options as well. So if you only want to see DB Size for Instances in Location X, you can do that.

The Instances view allows you to manage the instances you are monitoring. You can add new instances by typing in their names, or by reviewing SQL Server instances Elements has "discovered" for you. Note that at the time of this writing, Elements appears to detect SQL Server instances only in the subnet where it is installed; they say they are working on it. The Instances view reports on Status, Response Time, Version, Edition, # of DBs, Size of DBs, Owner (manually entered), Location (manually entered), and whether the instance is clustered. The Discovery feature is nice because it lets you stay up to date on any new instances added to your environment.

If you edit an instance, you get a wealth of information about it, like Database Names and Sizes, Max/Min Memory, Configuration Settings, and Server info (Drive sizes, VM, CPUs, Memory, OS, etc).

Under the hood, the database storing all this information is intuitive: the table names make sense, PKs/FKs have been created, and a database diagram was easy to create. I was able to write a few SSRS reports against it pretty easily, allowing me to fully utilize the data.

The Beta feedback and support website was fun to use. You are given a certain number of "votes" to spend when submitting an issue or request, and you can vote up other requests in the system; this tells the developers what is most important to their users. I found they were fairly responsive in acknowledging my requests.

The software is licensed by how many instances you want to manage, and they have an all-you-can-eat option that lets you license your entire environment and add new instances as they come online for a reasonable price. One important note: this software only works with SQL Server instances running 2005 or later, so if you have SQL Server 2000 instances in your environment, you will need to manage them through a separate method.

For companies with a larger SQL Server footprint, or with less control over who can install SQL Server, this tool should be very attractive. It does a great job of helping you track your inventory, and I'm sure that as the product matures it will provide more and more functionality (growth tracking?). If, on the other hand, you have a smaller environment (fewer than 15 SQL instances), it may not offer you as much value.

SQL Elements Website
Elements Beta Feedback Site

Friday, September 13, 2013

SQLSaturday #238 is coming to the Twin Cities

SQLSaturday Twin Cities is October 12th 2013, a great day of SQL Training, Networking, Prizes and Fun.
I will be presenting on Table Partitioning for Archiving this year, so stop by and say hi!
Click Here to see the full schedule

Tuesday, August 20, 2013

PowerShell Progress Meter

In many of my past PowerShell scripts, I've included my own home-spun progress report in the results window. I like that a lot, but my mind was blown when I noticed that there is a "Write-Progress" cmdlet. With some re-tweaking I made it rock, and it is much more efficient.

The situation where you would want to use this is when you are looping over a large dataset and it takes a while to bring back results. I often use PowerShell to hit all the servers I control, gather some information about them (backup status, whether they are VMs, configuration info, etc.), and return it to me. It can take 20+ minutes to get through all the servers, so it is nice to know how far along the process is; that's where this comes in handy.

Some things to keep in mind: you need some resultset to work with. It doesn't have to be a query, but it does have to be some kind of System.Array. After you generate the array, run this bit of code. I'm calling my array "results":

<################################
 ##  SETUP PROGRESS VARIABLES  ##
 ################################>
$ResultsLen = $results.Length
$currentRow = 0
$LastPercentComplete = 0
[datetime]$DateStart = Get-Date -format G

This sets up the variables we will need inside the progress meter. Next, place the following at the top of your "foreach ($result in $results)" loop body (the loop itself is shown here for context):

<##########################
 ##  RUN PROGRESS METER  ##
 ##########################>
foreach ($result in $results) 
{
 $currentRow++
 $PercentComplete = [math]::floor(($CurrentRow/$ResultsLen)*100)
 [datetime]$DateNow = Get-Date -format G
 $diff = $DateNow-$DateStart
 $DiffSeconds = $diff.TotalSeconds
 $SecondsPerRecord = $DiffSeconds/$currentRow
 $RemainingSeconds = $SecondsPerRecord*($ResultsLen-$currentRow)
 $SecondsPerRecord = [Math]::Round($SecondsPerRecord,2)
 
 Write-Progress -Activity "Investigating Servers..." `
  -PercentComplete $PercentComplete `
  -SecondsRemaining $RemainingSeconds `
  -CurrentOperation "Current Row: $currentRow / $ResultsLen.   Seconds/Record: $SecondsPerRecord   Run Time (seconds): $DiffSeconds" `
  -Status "Please wait."

 Remove-Variable DateNow
 Remove-Variable DiffSeconds
 Remove-Variable SecondsPerRecord
 Remove-Variable RemainingSeconds
}
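The per-iteration math is the same idea as the SQL time-remaining query from the earlier post: elapsed time over completed rows, extrapolated to the rows left. If you ever want the same style of meter outside PowerShell, here is a small Python sketch (illustrative only; the standard library has no Write-Progress equivalent, so it overwrites one terminal line with a carriage return):

```python
import sys
import time

def render_progress(current, total, elapsed_seconds):
    """Build the status line for one iteration, mirroring the Write-Progress fields."""
    pct = (current * 100) // total
    sec_per_rec = elapsed_seconds / current
    remaining = sec_per_rec * (total - current)
    return "%3d%%  row %d/%d  %.2fs/record  ~%.0fs remaining" % (
        pct, current, total, sec_per_rec, remaining)

def demo(total=50):
    start = time.time()
    for row in range(1, total + 1):
        time.sleep(0.01)                  # stand-in for real per-row work
        line = render_progress(row, total, time.time() - start)
        sys.stdout.write("\r" + line)     # "\r" overwrites the same terminal line
        sys.stdout.flush()
    sys.stdout.write("\n")

if __name__ == "__main__":
    demo()
```

As with the PowerShell version, the estimate assumes each record costs about the same, so it settles down after the first handful of iterations.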

You don't need to remove your variables; I just like to do it to clean up my code. I do this at the bottom of the script:

Remove-Variable ResultsLen
Remove-Variable currentRow
Remove-Variable LastPercentComplete
Remove-Variable DateStart

Want to see it in action? Take my script and add this to the top of it (it creates an array with 10,000 items, which should keep the status bar up long enough for you to see it):
$results = 1..10000

This looks awesome in PowerGUI, which is what I use for PowerShell development; it looks a bit funky in the native PS window, but good enough to give you the idea. If you aren't using PowerGUI, give it a shot; it's free: http://www.powergui.org

As always, please leave comments below if you find this useful or have other suggestions. It's nice to know people read these posts.