SQL Load Test Tool V 1.0

After a while, version 1.0 is finally available! Improvements were made to the performance checks, and an execution duration setting was added so tests can run in a loop for a certain period of time.
Documentation is on its way now…

The following is a link to a post I’ve created on SQLServerCentral. The essence of that article is getting deadlock information out of the SQL Server logs in a readable way when you have trace flag 1222 enabled…


SQL Load Test Tool

Today I published the first version of a SQL load test tool on CodePlex. This is still an alpha version, so any feedback will be appreciated!

The link is http://eyeonsqlloadtest.codeplex.com

In the previous post I covered some suggestions regarding performance of the disk subsystem. The second part of performance analysis I’ll cover is related to indexes and their utilization, as that’s also one of the big performance improvements that can easily be made.

Analyzing the Existing Indexes

One of the things you should always look at inside SQL Server is how your indexes are being used (or left unused). The best way to achieve this is using the DMVs that SQL Server 2005 and above provide. Using those DMVs, you can sneak a peek at which indexes are missing, the usage of the existing ones, and which ones are unused.
In order to get better results, it’s recommended that you let the server run for a reasonable amount of time first (restarting the SQL engine clears the content of the DMVs), as these are statistics gathered by SQL Server itself. It’s hard to put a value on “reasonable time” as that depends on the SQL usage, but as a rule of thumb you can wait 3~4 weeks before considering using the information the DMVs provide.
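As a side note, that rule of thumb is easy to encode. The sketch below is my own illustration, not part of the original scripts, and the function name is hypothetical; in practice the instance start time would come from the sqlserver_start_time column of sys.dm_os_sys_info:

```python
from datetime import datetime, timedelta

def stats_window_ok(sqlserver_start_time, now=None, min_weeks=3):
    """True when the instance has been up long enough for DMV statistics to be meaningful."""
    now = now or datetime.now()
    return (now - sqlserver_start_time) >= timedelta(weeks=min_weeks)

# An instance restarted 5 weeks ago passes the check; one restarted yesterday does not
print(stats_window_ok(datetime.now() - timedelta(weeks=5)))   # True
print(stats_window_ok(datetime.now() - timedelta(days=1)))    # False
```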

The Missing Indexes

Perhaps one of the scripts I use the most, the following code will give you an idea of the missing indexes, together with the estimated improvement each index would bring if created. Of course, you should be careful with how you use this: if you create an index, you may introduce a performance issue elsewhere, so it’s strongly suggested that you create the indexes on a test DB first and run a smoke test against it, just to confirm that you are improving things and not adding future nightmares.
So, the script I use is the following:

select db_name(d.database_id) as 'DBName'
,d.statement as 'Table'
,d.equality_columns
,d.inequality_columns
,d.included_columns
,gs.unique_compiles
,gs.user_seeks
,gs.avg_user_impact
from sys.dm_db_missing_index_details d
inner join sys.dm_db_missing_index_groups g on g.index_handle = d.index_handle
inner join sys.dm_db_missing_index_group_stats gs on gs.group_handle = g.index_group_handle
order by DBName asc, (gs.unique_compiles + gs.user_seeks) desc

The result of the above query can be, for instance, the following:

If you take a look at those results, you can see the engine thinks only three indexes are missing (based on its own statistics), and it shows the columns each index should contain together with the compiles, user seeks and impact (improvement). Next, you can create those indexes and analyze your database performance.

Index Usage

Another interesting script I use is the following. It’ll show how your indexes are used in the database, ordering the results with the least used on top.

select object_name(s.object_id) as 'Table Name',
i.name as 'Index Name',
i.index_id as 'Index ID',
s.user_seeks, s.user_scans, s.user_lookups, s.user_updates
from sys.dm_db_index_usage_stats s
inner join sys.indexes i
on i.object_id = s.object_id
and i.index_id = s.index_id
where objectproperty(s.object_id, 'IsUserTable') = 1
and s.database_id = db_id()
order by (s.user_seeks + s.user_scans + s.user_lookups + s.user_updates) asc

If you run this on your database, you can see how many times your indexes were used, and that will give you an idea of the least used ones. If you analyze that and compare it with the cost of having each index in place, you can end up removing indexes, or moving the most used ones to their own disk subsystem, for instance.

Unused Indexes

Finally, I run the following script to get the list of indexes that were not used at all since the server started:

select object_name(b.object_id) as TableName,
b.name as IndexName
from sys.objects a
inner join sys.indexes b on b.object_id = a.object_id
where a.type = 'U'
and b.index_id > 0
and not exists (
select 1
from sys.dm_db_index_usage_stats s
where s.object_id = b.object_id
and s.index_id = b.index_id
and s.database_id = db_id())

You should take extra special care when making decisions based on these results, as they can harm your DB. Here it’s important that you have a lot of statistical information and also some deep knowledge of the application and processes using that DB.
For instance, suppose you have a process that creates a quarterly report, which of course runs quarterly and uses one index of that DB, and you analyze only one month of statistical information. That index will show up as never used, and if you drop it, when your process runs to generate the report it won’t find the index, so it may take longer, cost the engine a lot more and/or even fail.


Indexes are very complex and require a lot of analysis by trained people. These DMVs will give you a lot of information, but people need to make decisions based on that information, so if it’s not well interpreted, the result won’t be the expected one. Remember to always run the query that will benefit from the index creation before and after creating it, noting the execution time on both runs, so you can compare whether the new index really improves your performance. Also use the Database Engine Tuning Advisor (DTA), as that will give you extra information on indexes, like whether they should be clustered or non-clustered, the column order, etc. These DMVs are not a replacement for the DTA but a complement.

SQL Performance Part I

During the last few years I was involved in different projects, both as a system admin and as a DBA. When you participate from the beginning, it’s easy to do things right (or tend to), but the real problem may occur when you join a project that is already deployed and running. The nightmare there resides in analyzing what was deployed initially, what was intended to be deployed, and what path to follow. You can’t imagine how many servers are misconfigured out there. Well, perhaps you can…

Sometimes inexperienced IT guys are told to install and configure a server with no proper guide to help them, or the person who did the capacity planning was relying on previous experience installing systems other than SQL Server, which leads to servers with poor SQL performance. Another thing you must deal with is developers and the way they tend to work. They usually have a good understanding of performance, but focused on their code. Not all of them think about databases the way a DBA does, so common errors are repeated again and again, around concurrency, volume of data, filegroups, partitions, etc. Those errors are easy to avoid or mitigate with just a timely intervention by the DBA. I’ll try to highlight some of them in this article and will cover others in the following ones.

So, let’s finish the introduction and let’s focus on the real interesting stuff: Performance.

Disk System

When it comes to the server’s initial configuration, you should take a deep look into the disk configuration. One common mistake is the RAID options IT admins choose. Usually, servers meant to run SQL are also configured as IIS or file servers, so the standard RAID 1 for the OS and one big RAID 5 for the database, IIS and whatever else the server hosts is a temptation. Of course your server will run if you choose this configuration, but it is far from the best option. I’ve also seen servers with the disks configured as a single RAID 5 array with two or more partitions created on top, simulating different disks. Again, this is a big mistake.

Of course you must configure your server based on the money you have to purchase it, but you should always consider the disk bottlenecks and try to reduce them as much as you can by spending the money appropriately. It’s better to lose some GBs but make your server run faster, so when it comes to configuring the server’s disks, you should try to set up at least one RAID 1 array for your OS, one RAID 1 array for your data files and one RAID 1 array for your log files. If money is not an issue, then I recommend a RAID 10 for your SQL data files and another RAID 10 for the log files. Additionally, you should have a separate subsystem for your OS pagefile, another for your binaries and a separate disk for tempdb. If you plan to host databases with heavy queries, used as transactional and/or for data warehousing, then you should also consider configuring an extra array for your indexes and perhaps one or more arrays for some of the biggest or most used tables.

One interesting aspect is also disk partition alignment on Windows 2003 or prior. I won’t go into really deep detail on this as there’s a great article by Jimmy May (http://blogs.msdn.com/jimmymay/archive/2008/10/14/disk-partition-alignment-for-sql-server-slide-deck.aspx) covering the improvements in SQL Server (about 20%~30%) gained only by correctly aligning the partitions. If you’re on Windows 2008 you don’t need to change this, but it’s still worth taking a look at that article.

Another thing to consider when you’re dimensioning your SQL server is tempdb. Getting the right size for it is tough but not impossible. First of all, you should have one data file per processor and make all data files the same size. Also, disable autogrowth on the files. This is a way to reduce contention on the tempdb database, but removing autogrowth can lead to running out of space on the filegroups, so you should definitely talk to the solution’s architect (if available) and discuss the usage of tempdb by the DBs. If you can’t get that information, then you should rely on the development server (which you should already have) and see what the usage is there. With that info, you can estimate the size of tempdb, always leaving enough empty room in it, and you should analyze how it’s impacted during the first weeks after the server’s go-live date, in order to correct the size (if needed) before space becomes an issue.
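As a quick illustration of the one-file-per-processor guideline, the sketch below splits a tempdb estimate into equally sized files. This is my own illustration; the helper and the file names are hypothetical, not something SQL Server provides:

```python
import os

def tempdb_file_layout(total_tempdb_mb, cpu_count=None):
    """Split an estimated tempdb size into one equally sized data file per processor."""
    cpu_count = cpu_count or os.cpu_count()
    size_each_mb = total_tempdb_mb // cpu_count
    return [("tempdev%d" % n, size_each_mb) for n in range(1, cpu_count + 1)]

# An 8 GB tempdb estimate on a 4-core server yields four 2048 MB files
for name, mb in tempdb_file_layout(8192, cpu_count=4):
    print(name, mb, "MB")
```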

Regarding the databases you’ll host, you should analyze the tables’ structure and make an estimate based on the number of expected rows added per month or week. Then project that out, let’s say for a year or two, and there you’ll have a DB big enough to host all the data without any growth of the data files. Again, this is hard to achieve but not impossible. You can estimate the row size of a table, and then it’s as simple as multiplying that by the expected rows per day, week or month. That will give you the basic idea for properly sizing the DB. Also, look at your development or testing servers, as they can reveal incredible information.
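The arithmetic is simple enough to sketch. This is my own illustration, and the 20% overhead factor for indexes and free space is an assumption you should tune to your schema:

```python
def projected_data_mb(avg_row_bytes, rows_per_month, months, overhead=1.2):
    """Rough data-file projection: average row size times expected rows,
    plus an assumed 20% overhead for indexes and free space."""
    raw_bytes = avg_row_bytes * rows_per_month * months
    return raw_bytes * overhead / (1024 * 1024)

# 200-byte rows at 500,000 rows a month, projected over two years
print(round(projected_data_mb(200, 500_000, 24)), "MB")
```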

In all cases, the load you expect on your servers and the desired response time will determine the type of disk system you should implement. If you can afford solid state disks, then go for it, but for the many of us who can’t get those yet, a proper capacity plan will save you a lot of headaches. Choosing the right disk configuration will depend on factors like money, but it will also depend on how well you know the applications that will connect to your SQL Server, the databases you’ll host and, of course, the expected volume of data to be handled. As a rule of thumb, you should always try to separate the different database and OS components onto different arrays in order to reduce IO bottlenecks.


After some time administering BizTalk 2006 R2, I was told to deploy the BAM Portal site. The BizTalk group consisted of 3 servers plus the SQL backend, so I wanted to have my BAM Portal balanced too. This sounds reasonable and, for a platform such as BizTalk, it should be fairly easy.

I knew for sure I was going to use another set of servers, which were already IIS balanced. After digging a little, I found an article on MSDN about configuring NLB for the BAM Portal. So, after some tests on the Development and QA environments, I prepared the following list of changes, which is a combination of the MSDN article and other things not included there.

Configuring BAM Portal

As I mentioned before, I had a multi-server environment and the BAM Portal application was going to be installed on another pair of IIS servers, not on the BizTalk ones. Anyway, I found that in order to get this up and running, I needed to install and configure BAM Portal on one of the BizTalk servers first and, once configured, install the BAM Portal on the other IIS servers and make the changes.

For this post, let’s assume the following conditions:
BizTalk Servers: BT01.domain.local and BT02.domain.local
BAM Portal Servers: IIS01.domain.local and IIS02.domain.local
Balanced URL: http://Portal.domain.local/BAM

1. Install and configure BAM portal on BT01.domain.local. This is because the portal configuration is only made on the first computer. All the others are configured manually.
2. Install the BAM Portal components on IIS01.domain.local and IIS02.domain.local. Once installed, join the two of them to the BizTalk group using the account Domain\BizBAMMng.
3. On BT01.domain.local, open a command window, navigate to “drive:\Program Files\Microsoft BizTalk Server 2006\Tracking” and execute the following: bm get-config -FileName:C:\BAM_Config_Prod.xml
4. Go to “C” drive and make a copy of BAM_Config_Prod.xml. Name it BAM_Config_Prod_New.xml
5. Edit the file C:\BAM_Config_Prod_New.xml with Notepad and change the <globalproperty name="BAMVRoot"> property. That value should reflect the NLB name; for example, it could be <globalproperty name="BAMVRoot">http://Portal.Domain.local:80/BAM</globalproperty>
6. Save the file and import it to BizTalk running the following from a command line: drive:\Program Files\Microsoft BizTalk Server 2006\Tracking\bm.exe update-config -FileName:C:\BAM_Config_Prod_New.xml
7. Open the registry and export the key HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\BizTalk Server\3.0\BAM. Save it to C:\BAM_Export.reg
8. Login into IIS01
9. Copy the file \\BT01\drive$\Program Files\Microsoft BizTalk Server 2006\BAMPortal\web.config to drive:\Program Files\Microsoft BizTalk Server 2006\BAMPortal
10. Open the web.config file using notepad located on drive:\Program Files\Microsoft BizTalk Server 2006\BAMPortal
11. Change the following values to reflect the local URL address. In this example, I’m using the default web site
a. <add key="BamQueryWSUrl" value="http://IIS01:80/BAM/BAMQueryService/BamQueryService.asmx" />
b. <add key="BamManagementWSUrl" value="http://IIS01:80/BAM/BAMManagementService/BamManagementService.asmx" />
12. Add the following key under <system.web>: <machineKey validationKey="6FFF3A2DB6F94A679FB9FB6D0BDDBCE9FEE4B3D6C1B220CEA11C23429331B52F4BC8FCC49BD122F272CF2816B0B392A2FB61CB030AC7138EF2A86061FEAF97DD" decryptionKey="B50BC9EA1B49B62D861E8645ED818596AD9984B248F3BE64" validation="SHA1"/>
The machineKey value must be added because of the NLB. I found random issues when requests started on one node and completed on the other. Adding this ensures that the machineKey is the same on all the nodes. For this case, I’ve used the automatic key generator available at http://www.orcsweb.com/articles/aspnetmachinekey.aspx
13. Save the web.config file and close it.
14. Copy the file \\BT01\drive$\Program Files\Microsoft BizTalk Server 2006\BAMPortal\BAMManagementService\web.config to drive:\Program Files\Microsoft BizTalk Server 2006\BAMPortal\BAMManagementService
15. Copy the file \\BT01\drive$\Program Files\Microsoft BizTalk Server 2006\BAMPortal\BAMQueryService\web.config to drive:\Program Files\Microsoft BizTalk Server 2006\BAMPortal\BAMQueryService
16. Open the web.config file using Notepad located on drive:\Program Files\Microsoft BizTalk Server 2006\BAMPortal\BAMQueryService
17. Under the <runtime> key, locate the <codebase> key and validate the path in the href value. For example, the key should look like the following: <codebase version="" href="file:///drive:/Program Files/Microsoft BizTalk Server 2006/BAMPortal/BAMManagementService/bin/Microsoft.BizTalk.Bam.WebServices.dll"/>
18. If the reference to the library is OK, close the web.config file. If not, change it to reflect the real path, save the changes and close it.
19. Open IIS Manager and create one Application Pool named BAMAppPool. Set the identity on the application pool to be Domain\BizBAMAppPool
20. Create a Virtual Directory called BAM and point it to drive:\Program Files\Microsoft BizTalk Server 2006\BAMPortal. Change the settings so that the Virtual Directory uses the BAMAppPool. Also ensure that security is set to Integrated Windows Authentication, that scripts are allowed and that the ASP.NET version in use is 2.0. Finally, change the default page to default.aspx
21. Right click the BAMManagementService folder, go to Properties and create the Application Name BAMManagementService. Configure it to use the BAMAppPool application pool and the .NET 2. Follow the same procedure for the folder BAMQueryService
22. Import into the registry the values of the file \\BT01\C$\BAM_Export.reg
23. Get the aspnet_setreg.exe program and copy it to a local folder. It can be downloaded from http://support.microsoft.com/kb/329290/en-us
24. Open a command window, navigate to the folder where aspnet_setreg.exe is located and execute the following: aspnet_setreg.exe -k:"SOFTWARE\Microsoft\BizTalk Server\3.0\BAM\WebServices\identity" -u:"Domain\BizBAMMng" -p:"account password"
25. Grant Read access for the account Domain\BizBAMAppPool to the registry path HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\BizTalk Server\3.0\BAM\WebServices
26. Grant Read access to the account Domain\BizBAMMng to the registry path HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\BizTalk Server\3.0
27. Add the accounts Domain\BizBAMMng and Domain\BizBAMAppPool to the local group IIS_WPG
28. Run the following from a command window: c:\windows\system32\cacls "%windir%\Microsoft.NET\Framework\v2.0.<min>\Temporary ASP.NET Files" /T /E /G Domain\BizBAMMng. Be sure to change <min> to the valid minor version value.
29. Run the following from a command window: c:\windows\system32\cacls "%windir%\Microsoft.NET\Framework\v2.0.<min>\Temporary ASP.NET Files" /T /E /G Domain\BizBAMAppPool. Be sure to change <min> to the valid minor version value.
30. Grant the accounts Domain\BizBAMMng and Domain\BizBAMAppPool Modify access to the folder %windir%\temp
31. Repeat steps 8 to 30 for server IIS02, changing the value IIS01 to IIS02 where suitable.
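A note on step 12: instead of the online generator, the machineKey pair can also be produced locally. This is my own sketch (in Python, purely for illustration); the key lengths match the 64-byte SHA1 validation key and 24-byte 3DES decryption key used in the example above:

```python
import secrets

def make_machine_key():
    """Generate a random ASP.NET machineKey pair: a 64-byte SHA1 validation key
    and a 24-byte 3DES decryption key, both as uppercase hex strings."""
    validation_key = secrets.token_hex(64).upper()   # 128 hex characters
    decryption_key = secrets.token_hex(24).upper()   # 48 hex characters
    return validation_key, decryption_key

vk, dk = make_machine_key()
print('<machineKey validationKey="%s" decryptionKey="%s" validation="SHA1"/>' % (vk, dk))
```

Whichever way you generate the keys, the important point is that the same pair is deployed to every node in the NLB cluster.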

SQL Unit Testing


Some time ago, I was asked to configure automatic builds in Team System for a solution, and that build had to include as many unit tests as possible. The solution contained a mix of web sites, a wizard installer, OLAP cubes, Reporting Services reports and a set of databases. Because it was in the middle of a refactor, the development team would include tests together with the refactor of the web sites, and the wizard installer had a set of unit tests for the installation modules. So I started looking for a way to test the deployed databases.

After searching for a while, I found a good tool on CodePlex called SQL Load Test. This tool can translate a SQL Profiler trace file into a Visual Studio 2005 C# file, generating unit tests for each entry in the trace file. The output file is intended to be added to a Test Project.
So I downloaded the tool and read the usage notes: really simple. There are just a few considerations for setting up SQL Profiler, which are:


The following columns must be present:

– EventClass
– TextData
– DatabaseName – if this is omitted then the trace is assumed to be all for the same database
– Events

The following events must be present:

– SQL:BatchStarting
– RPC:Starting
– Audit Login
– Audit Logout

Additionally, the tool will ignore the following entries:

– All entries related to the master, model and msdb databases.
– All entries with no database name at all.
– All calls to sp_reset_connection.

So, after setting up the profiler with the above requisites, I asked a QA co-worker to navigate through a set of different Reporting Services reports. After some minutes (about 5 or 6 reports, no more than 10 minutes spent), I stopped the profiler and saved the trace to a file.

Using the Unit Test

Now, the following step was using the tool to transform that trace file into a C# unit test. Well, nothing could be easier than that. If you read the documentation, you’ll see that the tool is a command-line application and that only three parameters are needed. The first one is a name for the scenario (a C#-style name). Then, the path and name of the trace file. Finally (and optionally), the path and name of the configuration file. This configuration file is provided with the tool, and the settings are really simple, like creating a test method for each operation or just one with all the operations, etc. If you don’t set a custom configuration file, the application will load the default parameters, which are fine for the initial tests.

I ran the tool and in a few seconds I had my C# file containing the tests. Then I added the file to my project, configured the SQL connection string, and voila: I had added more than 500 unit tests in a few minutes. I repeated this with my co-worker for different reports and modules of the application, and after 2 hours I had more than 4000 unit tests, separated into modules (reports, OLAP queries, transactional queries, etc).

To demonstrate the use of the tool, I have a solution called SQLUnitTesting with two projects: one database project (SQLUnitTesting) and one test project (SQLUnitTesting_TestProject). The database project has 1 table and 1 stored procedure. The sp is used to insert one record into the table. I also configured the profiler to monitor the database SQLUnitTesting, executed two queries in Management Studio (one insert command and the sp) and saved the trace file as SQLUnitTesting.trc.

After this, I ran the following command from the command line to convert the trace file into the C# unit test:

As you can see, the first parameter is the name (I used the same as the test project namespace), the name of the trace file and the name of the config file. This file is the default file that comes with the tool.

Next, two files called SQLUnitTesting_TestProject.cs and SQLUnitTesting_TestProject.stubs are created in the application folder. The first one contains the unit tests and the second contains the connection details. You can discard the .stubs file and use only the .cs file. Then, I renamed the .cs file to SQLUnitTesting_TestProject01.cs and added it to my test project:

Finally, you must set the connection string in order to connect to the SQL server. You can do this using whatever method you want. For this test, I added the following line to the TestMethod that calls the unit tests:

_connection = new SqlConnection("Initial Catalog=SQLUnitTesting;Integrated Security=SSPI;Data Source=(local)");

After all this is done, you can run the test from visual studio and see the results:

As you can see, the report indicates I ran only one test. This is because the tool created all queries in a single TestMethod:

public void SQLUnitTesting_TestProject()

If you want to see one test per statement, you can separate them into different TestMethods. Attached, you’ll find the .cs file for my project, for a better understanding of how the statements are executed.


Although there are lots of tools out there for unit testing, and even though Visual Studio provides unit testing for SQL, I found one great value in this tool: real user activity. Because these tests are based on database usage, you get real-life queries against your databases. Suppose you launch the profiler on the production server and leave it running for one or two days; then you’ll have tests that reflect what users actually do. In addition, you can configure the Visual Studio Test Load Agent and monitor the performance of your SQL server during the execution of the tests, simulating concurrent connections and so on.

I know this tool won’t replace the unit tests a developer should write for the database solution, and there are lots of tools that can achieve similar results, but this application is a great starting point, allowing the configuration of unit tests easily and without knowing the code behind your databases.

In my case, this tool has provided a great way of managing integrity checks on the solution’s databases. Usually, when a change is made to a complex database, a break can occur in the most unexpected procedure or query. At that time, the dev team was being handed over from one contractor to another, so until the new devs really knew the application, this approach helped avoid releasing builds with errors in the databases.

