Extreme ASP.NET

Tools of the Trade: SQL Server Profiler and Query Analyzer

Rob Howard

Contents

Optimizing Data Access
SQL Server Profiler
Stored Procedures
Starting Profiler
SQL Server Query Analyzer
Analyzing a Query

In my last column, I discussed Microsoft® Application Center Test and how it can be used to measure the performance of your Web application (see Extreme ASP.NET: Tools of the Trade: Application Center Test). Application Center Test is one of those tools you don't know you need until someone shows you how useful it really is. I covered its value for measuring performance, which is critical because you can use those measurements to make improvements, set goals, and, most importantly, plan for capacity and scale.

In this column, I'm going to discuss two other tools of the trade: SQL Server™ Profiler and SQL Server Query Analyzer. While these SQL-oriented tools are not something most ASP.NET developers are familiar with, I strongly encourage you to add them to your tool belt. Personally, if I could have worked on any other product team besides ASP.NET during my tenure at Microsoft, it would have been the SQL Server team. Internally, SQL Server is an amazingly complex piece of software, yet programming against it is quite easy. Understanding some of that complexity is important because, as you'll find, the database is often the bottleneck in your application, and the more you know about the internals of SQL Server, the more you can do to avoid those problems.

Optimizing Data Access

How much time do you spend analyzing how your application uses its database resources? For most developers, the honest answer is: not very much. Typically, performance problems are addressed in one of two ways: by adding new hardware or by tuning the code that runs the application. Most developers prefer to tune the code, but sometimes it actually can be more effective to just add new hardware. Note, however, that adding hardware before analyzing for bottlenecks can make matters worse: if the bottleneck is in the software, faster hardware simply delivers requests to that bottleneck more quickly, deepening the queue. The best approach, as always, is to analyze for bottlenecks (hardware or software) before doing anything.

Being a developer, I prefer to sling some code before taking the hardware approach. Once that decision is made, it is usually a matter of doing some basic profiling with Application Center Test to identify where in the application performance is poor. Once the problem areas are identified, the next step is to use SQL Server Profiler to inspect how the application uses database resources.

When it comes to writing high-performance Web applications, there is a performance truism: if your application performs any cross-process communication (database calls, Web services, remote object invocation), it's a better use of your time to optimize those communications, or remove them entirely, before attempting to optimize other code within your application. For database optimization, my general rule of thumb is that if the application goes to the database more than once or twice on a given request (or makes even a single Web service call), there must be a very good reason for it, because those roundtrips are most often the cause of poorly performing applications.
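
To make the cost of those roundtrips concrete, here's a minimal sketch (the procedure and table names are invented for illustration) of collapsing two separate queries into a single stored procedure that returns both resultsets in one trip; the application then reads the second resultset with SqlDataReader.NextResult rather than issuing a second call:

-- Hypothetical sketch: one roundtrip returning two resultsets,
-- instead of two separate calls from the application
CREATE PROCEDURE dbo.forum_GetHomePageData
    @SettingsID int
AS
BEGIN
    SET NOCOUNT ON
    -- Resultset 1: the sections to display
    SELECT SectionID, Name
    FROM dbo.Sections
    WHERE SettingsID = @SettingsID
    -- Resultset 2: the most recent posts
    SELECT TOP 25 PostID, Subject, PostDate
    FROM dbo.Posts
    ORDER BY PostDate DESC
END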

My team at Telligent has just spent the past three weeks tuning our Community Server for forums.asp.net. We made nearly 40 performance-related changes; roughly 75 percent were in the database, while the other 25 percent were code changes to either reduce or optimize the time spent in the database. The caching features of ASP.NET are one of the best tools for reducing database roundtrips.

If you are using SQL Server, you've already got a great set of tools for understanding how your application is using the database: the Profiler and Query Analyzer. I won't cover all the internal workings of these tools in this column, but to learn more about how SQL Server works under the hood, I highly recommend the Microsoft SQL Server 2000 Performance Optimization and Tuning Handbook by Ken England (Digital Press, 2001). This book is one of my favorite references; my own copy is highlighted and coffee stained. Hopefully your copy will enjoy a similarly useful existence.

SQL Server Profiler

Reviewing the logs or trace files generated by your application is invaluable for understanding how it is being used. Web developers, for example, commonly review IIS logs to understand application usage. Within Community Server there is a built-in exception handler that traces every exception generated; later, we can review these logs and quickly get a good idea of the problem areas.

SQL Server Profiler can be used to view live sessions of activity on your running SQL Server database. You can glean important information such as how many queries your database is executing, how much time those queries are taking, which database is executing which query, and so on.

Figure 1 shows a Profiler session against the database server used by the asp.net cluster. You can see several captured SQL traces, which show stored procedures for weblogs.asp.net (items prefixed with blog_), beta.asp.net (items prefixed with aspnet_), Community Server forums (items prefixed with cs_), and download counters for www.asp.net downloads (IncrementDownloads). By selecting an item you can also view its full text, as has been done for the stored procedure cs_forums_Post. There are additional columns for CPU, Reads, Writes, and Duration.

Figure 1 SQL Server Profiler Session

Stored Procedures

There is much passionate debate about using stored procedures. A good argument for using them can be found in "TheServerSide Debates: Stored Procedures v Parameterized Queries" by Peter DeBetta.

One of the benefits of stored procedures in enterprise applications is encapsulation and abstraction. If an application uses dynamic SQL (SQL text generated on the fly), it is typically more challenging to tune, since changing the SQL usually means changing and recompiling the application. Stored procedures, on the other hand, encapsulate their functionality and can be modified without affecting their public signature. Just as the methods in your APIs encapsulate functionality, stored procedures provide the same capability within the database.
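
As a minimal sketch of that encapsulation (the procedure, table, and index names here are hypothetical, not the actual Community Server schema), the application keeps calling the same signature while the body is tuned underneath it:

-- The application code only knows this signature...
CREATE PROCEDURE dbo.Sections_Get
    @SettingsID int
AS
    SELECT SectionID, Name
    FROM dbo.Sections
    WHERE SettingsID = @SettingsID
GO
-- ...so the query can later be tuned (here with a hypothetical
-- index hint) without recompiling or redeploying the application
ALTER PROCEDURE dbo.Sections_Get
    @SettingsID int
AS
    SELECT SectionID, Name
    FROM dbo.Sections WITH (INDEX(IX_Sections_SettingsID))
    WHERE SettingsID = @SettingsID
GO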

Of course, there are also some good cases for using dynamic SQL. An example would be object/relational mappers that dynamically create SQL based on the business objects' requirements. If dynamic SQL were used on any of the asp.net clusters, it would show up in SQL Profiler as standard SQL, such as SELECT column1, column2 FROM TableA. However, there is none.
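
Even when dynamic SQL is justified, it can still be parameterized. Here's a brief sketch using the hypothetical TableA from above: sp_executesql passes the parameters separately from the SQL text, which lets SQL Server reuse the cached plan rather than compiling each concatenated string:

-- Parameterized dynamic SQL: the text is constant, only @p1 varies,
-- so the compiled plan can be reused across calls
DECLARE @sql nvarchar(500)
SET @sql = N'SELECT column1, column2 FROM TableA WHERE column1 = @p1'
EXEC sp_executesql @sql, N'@p1 int', @p1 = 42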

Starting Profiler

You can find Profiler with any standard installation of SQL Server in the SQL Server folder in the All Programs listing. I would also recommend installing SQL Server tools as part of any standard developer environment.

Once you open Profiler, you first need to connect to a database server before you can begin a profiling session. To connect, select File | New | Trace. This brings up the Connect to SQL Server dialog, where you enter either your Windows or SQL Server credentials. Note that you will need to be a member of the sysadmin fixed server role to run Profiler.

Next, after the credentials have been verified, you will need to set some properties for the trace in the Trace Properties dialog. By default, Profiler captures the events defined in the Standard template, a useful subset rather than every event the server can raise. While capturing everything can be useful, most servers host more than one database, and a trace that captures all database traffic isn't necessary or useful. By clicking the Filters tab you can define filters to constrain what is captured. For example, you can restrict the trace to connections from a certain login, to queries with a specific duration, or to queries that contain some known text (you can use wildcard symbols such as % for SQL wildcard matching).

For example, let's say you want to trace all database traffic for Community Server, but only the communications taking over 100 ms to execute (items that take more than about 200 ms are usually noticeable to the end user). To do this, I set a filter on the application's login name, csforums, and on a duration greater than or equal to 100. Figure 2 shows the result of this trace.

Figure 2 Trace Results

Several stored procedures run more frequently than the rest, namely cs_forums_Search and cs_Sections_Get. Also of interest is cs_Threads_GetThreadSet, which had a duration of 703 ms! The cs_Sections_Get stored procedure also looks slightly abnormal, with over 11,000 total reads along with a high duration value. These numbers aren't terrible, but they aren't perfect either.

Using the information that Profiler provides, you can at least begin to understand what your database is doing and where it is spending its time. In this trace session, I filtered for procedures that were taking a long time to run and focused on the forums database. There may also be cases in which too many queries are running and slowing down the system; there, it's best to alleviate some of the database load by adding an additional server, or to test against a local copy of the database to eliminate false positives.

On a heavily loaded system, rather than using the Profiler UI to collect your SQL traces, you should consider using T-SQL to send the results to a server-side file (you can still use the UI to create the trace definition, script it, and execute it). Sending results to a server-side trace file guarantees that no events are dropped, while sending trace events to a rowset (which is what SQL Profiler does) does not.
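
As a rough sketch of what such a script might look like (the file path is a placeholder, csforums is the example login from earlier, and the event and column IDs are the documented values for RPC:Completed, TextData, LoginName, Duration, and Reads):

-- Create a server-side trace writing to a file (no dropped events)
DECLARE @TraceID int, @maxfilesize bigint, @on bit, @duration bigint
SET @maxfilesize = 50   -- megabytes per trace file
SET @on = 1
EXEC sp_trace_create @TraceID OUTPUT, 0, N'C:\Traces\cs_trace', @maxfilesize, NULL
-- Capture RPC:Completed (event 10): TextData (1), Duration (13), Reads (16)
EXEC sp_trace_setevent @TraceID, 10, 1, @on
EXEC sp_trace_setevent @TraceID, 10, 13, @on
EXEC sp_trace_setevent @TraceID, 10, 16, @on
-- Filter: LoginName (column 11) equals csforums, AND
-- Duration (column 13) >= 100 ms
EXEC sp_trace_setfilter @TraceID, 11, 0, 0, N'csforums'
SET @duration = 100
EXEC sp_trace_setfilter @TraceID, 13, 0, 4, @duration
-- Start the trace; use status 0 to stop and 2 to close it later
EXEC sp_trace_setstatus @TraceID, 1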

SQL Server Query Analyzer

Now that I've armed you with the basic ability to examine and profile your database, you are probably wondering what happens next. Well, the next step is to take the queries and examine them in Query Analyzer, another wonderful tool for any developer working with SQL Server. Whereas Profiler gives you the 50,000-foot view of the system, Query Analyzer is the microscope used to analyze the details. Using Query Analyzer, you can execute queries and stored procedures and get a visual display of the SQL Server execution plan.

Similar to Profiler, Query Analyzer is found in the SQL Server folder in the All Programs menu. It, too, requires that you complete the Connect to SQL Server dialog. Once connected, you can type the following to switch to your database:

use [databasename]

Next, click the play button on the toolbar (or highlight the text and press F5). Either option executes the highlighted SQL.

In Profiler I identified the stored procedure cs_Sections_Get as reading an unusually large amount of data—over 11,000 reads! The SQL trace from Profiler can be copied into Query Analyzer and you can then see exactly what that particular query is doing.
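
For example, a captured call might be replayed like this (the parameter shown is hypothetical; your trace will contain the actual values). SET STATISTICS IO ON makes Query Analyzer report the logical reads per table, so you can confirm the numbers Profiler showed:

-- Replay the captured call and report per-table I/O in the Messages tab
SET STATISTICS IO ON
EXEC dbo.cs_Sections_Get @SettingsID = 1000   -- hypothetical parameter
SET STATISTICS IO OFF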

Analyzing a Query

To run the query in Query Analyzer, first paste the contents into Query Analyzer, then from the Query menu select Show Execution Plan. Next, highlight the SQL to execute and press F5. Figure 3 shows my results.

Figure 3 Query Analyzer Results

Immediately below the SQL query that was executed is the results grid, which shows what is returned by the query. In this case there are two resultsets returned. The first resultset is a list of Sections (forums) and the second one is a set of permissions. Community Server uses a role-based permissions system to control what users can or cannot do within the system. Whenever a list of Sections is retrieved, the system also refreshes its list of permissions for those sections.

At the bottom right of Query Analyzer you see the text "5430 rows", indicating the total number of rows returned. In this procedure, there was a bug in the permissions logic that caused the system to return a list equivalent to the total roles multiplied by the total sections. In the case of forums.asp.net, there are 29 unique roles and 181 sections, which totals 5,249 rows. Add the 181 rows returned by the first resultset and you get a grand total of 5,430 rows.

Obviously this is far too many rows. In fact, by using Profiler and Query Analyzer, this particular query was optimized to return fewer than 250 total records, cutting its duration to less than 100 ms.
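
To illustrate the class of bug involved, here is a hedged sketch with invented table names (the actual Community Server schema differs): drop the predicate relating roles to sections and the query degenerates into a cross product.

-- Bug: no predicate relating roles to sections, so every role is
-- paired with every section (29 x 181 = 5,249 rows)
SELECT s.SectionID, r.RoleID
FROM dbo.Sections s, dbo.Roles r

-- Fix: join through the permissions table so each section returns
-- only the roles that actually apply to it
SELECT s.SectionID, r.RoleID
FROM dbo.Sections s
JOIN dbo.SectionPermissions p ON p.SectionID = s.SectionID
JOIN dbo.Roles r ON r.RoleID = p.RoleID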

In addition to running queries, Query Analyzer allows you to analyze them. Recall that you enabled Show Execution Plan before executing the SQL. The plan provides a visual diagram of how SQL Server executes your query, which indexes it uses, and other details that can help you optimize indexes and enhance query performance.
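
If you prefer text output, or want to inspect a plan without actually executing the query, the same information is available through a session option (SET SHOWPLAN_TEXT must be the only statement in its batch, hence the GO separators; the parameter is again hypothetical):

-- Show the estimated plan as text without executing the statement
SET SHOWPLAN_TEXT ON
GO
EXEC dbo.cs_Sections_Get @SettingsID = 1000   -- hypothetical parameter
GO
SET SHOWPLAN_TEXT OFF
GO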

Figure 4 shows each query executed, its execution plan, the indexes used, the joins, and the amount of data returned by each operation. Even the arrows between items have significance: an arrow's thickness is a visual indication of how much data that operation returns. A poorly tuned query might have many thick arrows, indicating operations that are likely returning too much data.

Figure 4 Execution Plans
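
When the plan does reveal a problem, the fix is often an index. As a hypothetical example, in keeping with the invented names used earlier: if the plan shows a table scan on a column the query filters by, a simple nonclustered index lets the optimizer seek instead.

-- Hypothetical fix: index the column used in the WHERE clause so
-- the optimizer can seek instead of scanning the whole table
CREATE NONCLUSTERED INDEX IX_Sections_SettingsID
    ON dbo.Sections (SettingsID)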

Application Center Test, SQL Server Profiler, and SQL Server Query Analyzer are some of the most powerful tools for tuning the performance of your ASP.NET or SQL Server application. Not only should they be part of every developer's tool belt, they should also be used routinely to inspect the application's overall performance. Too often, developers focus on optimizing routines within their code when eliminating or optimizing cross-process communications, such as database calls, would be a much better use of their time.

Hopefully I've piqued your interest in learning more about the inner workings of SQL Server. It is a truly fascinating technology. If you are fortunate enough to have a DBA on your staff, ask them to show you how your application interacts with the database—you'll learn a lot!

Send your questions and comments for Rob to xtrmasp@microsoft.com.

Rob Howard is a founder of Telligent Systems, specializing in high-performance Web apps, knowledge management, and collaboration systems. Previously, Rob was employed by Microsoft where he helped design the infrastructure features of ASP.NET 1.0, 1.1, and 2.0. You can contact Rob at rhoward@telligentsystems.com.