Bugslayer

Strengthening Visual Studio Unit Tests

John Robbins

Code download available at: Bugslayer0603.exe (169 KB)

Contents

Watch Out for Hardcoded Paths
Working with GenericTest and EXEs
Create Unit Test...
Using NUnit
TimeOutAttribute
Code for You
Names and Places
A Better MSTEST.EXE
Wrap Up

Visual Studio 2005 brought so many new features to the table that it can seem almost overwhelming. One of the most exciting additions is the new unit testing features found in the Test menu on the main menu bar. While we've had some excellent unit testing tools in .NET Framework-based development for a while, these new unit testing features provide clean integration and an emphasis that was missing before.

Nevertheless, the unit testing tools do have a few rough spots. Here, I will discuss the pitfalls and problems I've encountered while working with the new testing tools. I've included one of the real modules I've been working on along with its unit test so you can have a useful example at hand. Note that all of these problems are fixable, and you can confidently implement these testing tools immediately in your development shops.

While using the unit testing tools, I've noticed a couple areas where extending the environment was necessary, so I've provided the code I used to make my testing easier. With this added help, the testing features now work the way I think they should.

For this column, I'm assuming you've read the "Writing Quality Code" and "Working with Unit Tests" sections of the documentation. Those sections provide a good overview, and I'll cover the practical aspects here. If you've got the Visual Studio® Team System or Team Edition for Developers, you'll also have the tools geared for testers. The Team Edition for Testers documentation covers those tools and has the all-important Class Library documentation for extending or interacting with the testing environment.

Watch Out for Hardcoded Paths

Adding tests to a new project is fantastically easy using the Unit Test Wizard. This handy feature can save you hundreds of hours of typing time (the developers at Microsoft deserve a lot of credit for this). However, something happens behind the scenes that can cause you a lot of grief: paths are hardcoded! This is a problem, for example, when you move a test to another machine or directory. I hope that a service pack for Visual Studio 2005 will allow you to set a starting path for tests and make the hardcoded paths relative. For now, however, the easy way to solve the hardcoded path problem is to have a benevolent dictator on the team decree all drives and paths for all time. If it's a small enough project, this solution may be fine. Another, more practical workaround is to reference the %testdeploymentdir% environment variable in your paths; it is set when a test runs.

The first place where hardcoded paths appear is in the VSMDI file, which is a big wrapper around simple lists of tests. When you open a VSMDI file and it can't find the test assemblies or TESTRUNCONFIG files, you're prompted for locations for those items. What I found interesting was that the next time I opened that same VSMDI file, it found everything. Obviously, the updated paths had to have been stored somewhere, but the VSMDI file itself was never changed. I found a hidden file with the name <file name>.VSMDI.OPTIONS in the directory where the VSMDI file resides.

When I opened the VSMDI.OPTIONS file, which is, of course, just an XML file, I shook my head in frustration. As you can see in Figure 1, there is obviously support for path searching with VSMDI files, but there's no way in the Visual Studio user interface to set the search paths. (Furthermore, there's no reason for these VSMDI.OPTIONS files to be hidden.) So, when I want to use the VSMDI file on a different directory structure, I copy an example VSMDI.OPTIONS file to the appropriate directory and manually edit it, adding the paths to the assemblies and TESTRUNCONFIG files. I've included an example file in the BaseVSMDIOPTIONSFile directory of this column's source code.

Figure 1 VSMDI.OPTIONS File

<?xml version="1.0" encoding="utf-8"?>
<Tests>
  <edtdocversion branch="retail" build="50727" revision="42" />
  <SearchPaths type="Microsoft.VisualStudio.TestTools.Common.SearchPaths">
    <TestSearchPath type="System.Collections.ArrayList">
      <element type="System.String">
        C:\Dev\Column\39\SourceCode\Debug
      </element>
    </TestSearchPath>
    <RunConfigSearchPath type="System.Collections.ArrayList">
      <element type="System.String">
        C:\Tests\Bugslayer.Utility.Tests
      </element>
    </RunConfigSearchPath>
  </SearchPaths>
</Tests>

The main reason for using a VSMDI file is so you can specify lists of tests to run—this is useful during active development as it allows you to run just the specific tests related to the bits of code you're working on. To execute all the tests in a test assembly, you can use the console test runner, MSTEST.EXE. There are numerous command-line options, but you'll only need the /testcontainer: option, which specifies the assembly containing the tests. Optionally, you can use /resultsfile to specify the name of the results file. When using MSTEST.EXE, you don't have to worry about the VSMDI files. In the next section, I'll discuss an MSBuild.EXE task I put together to automatically run all appropriate tests found in a directory structure so you can avoid using VSMDI files altogether.

The second place you'll find hardcoded paths is in the TESTRUNCONFIG files; the assemblies to instrument and strong key files all have hardcoded paths. The best solution I've figured out to address this issue is to use a different TESTRUNCONFIG file when you run tests on different directory structures. As there's no way to create a new TESTRUNCONFIG, you'll need to copy one of your existing files over to the machine location and edit it with Visual Studio. If the drives and upper directories are different, the IDE and MSTEST.EXE will handle TESTRUNCONFIG files with relative paths but, again, you'll have to edit them by hand.

Since it's not the testing tools themselves doing the code coverage instrumentation, there's nothing stopping you from performing code coverage on your compiled assemblies yourself from the command line. If you're working on ASP.NET bits, you'll need to do code coverage from the Visual Studio IDE to create the appropriate assembly on the fly and get it instrumented.

Doing manual code coverage is a three-step process. The first is to instrument your assemblies so they have the code coverage hooks in them. The second is to start the monitor process and tell it where to write the coverage file. Any instrumented binaries that load while the monitor process is running will have their coverage data added to the output file. The final step is to shut down the monitor to write the COVERAGE file. I have whipped up a Coverage.Targets file you can use with MSBuild to automate this process. Figure 2 shows pieces of the file; the full file is available for download from the MSDN® Magazine Web site.

Figure 2 Coverage.Targets

<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <!-- The task to wrap VSPerfMon.EXE since it has trouble with
       console I/O redirection. -->
  <UsingTask TaskName="VSPerfMonTask"
             AssemblyFile="$(BUGSLAYERBUILDTASKSDIR)\Bugslayer.Build.Tasks.DLL" />

  <!-- **************** Properties Used By All ******************* -->
  <!-- These are here in case you need to set them from the command line. -->
  <PropertyGroup>
    <BASEDIR>$(VSINSTALLDIR)\Team Tools\Performance Tools\</BASEDIR>
    <VSINSTR>"$(BASEDIR)VSINSTR.EXE"</VSINSTR>
    <VSPERFCMD>"$(BASEDIR)VSPERFCMD.EXE"</VSPERFCMD>
  </PropertyGroup>

  <!-- ************* Instrument and Monitor Target *************** -->
  <Target Name="InstrumentAndMonitor"
          DependsOnTargets="CodeCoverageInstrumentTarget;StartCoverageMonitorTarget" />

  <!-- **************** Instrumentation Target ******************* -->
  <Choose>
    <When Condition="'$(StrongNameFile)' != ''">
      <PropertyGroup>
        <OffVsInstrWarnings>2001;2013</OffVsInstrWarnings>
      </PropertyGroup>
    </When>
    <Otherwise>
      <PropertyGroup>
        <OffVsInstrWarnings>2013</OffVsInstrWarnings>
      </PropertyGroup>
    </Otherwise>
  </Choose>

  <!-- The target for instrumenting. -->
  <Target Name="CodeCoverageInstrumentTarget"
          Condition="'$(VSINSTALLDIR)' != ''"
          Inputs="@(InputCoverageBinaries)"
          Outputs="@(OutputCoverageBinaries->'$(OutputDir)\%(filename)%(extension)')">
    <!-- Make the output directory if it doesn't exist. -->
    <MakeDir Condition="!Exists('$(OutputDir)')" Directories="$(OutputDir)" />
    <!-- Run VSINSTR.EXE on the individual binary. -->
    <Exec Command="$(VSINSTR) /NOWARN:$(OffVsInstrWarnings) /COVERAGE /OUTPUTPATH:$(OutputDir) %(InputCoverageBinaries.Identity)" />
    <!-- If a strong name key file was specified, run SN.EXE to resign it. -->
    <Exec Condition="'$(StrongNameFile)' != ''"
          Command="sn -q -R $(OutputDir)\%(InputCoverageBinaries.filename)%(InputCoverageBinaries.extension) $(StrongNameFile)" />
  </Target>

  <!-- ***************** Start Monitor Target ******************** -->
  <Target Name="StartCoverageMonitorTarget" Condition="'$(VSINSTALLDIR)' != ''">
    <VSPerfMonTask OutputFile="$(OutputCoverageFile)"
                   MonitorType="Coverage"
                   User="$(User)" />
  </Target>

  <!-- ***************** Stop Monitor Target ********************* -->
  <Target Name="StopCoverageMonitorTarget" Condition="'$(VSINSTALLDIR)' != ''">
    <Exec Command="$(VSPERFCMD) /SHUTDOWN" />
  </Target>
</Project>

As you poke through Figure 2, you'll see that I wrote a task in Bugslayer.Build.Tasks.DLL to run the monitoring process. Figure 3 shows it in Visual Studio.

Figure 3 Visual Studio Editor

When I first wrote Coverage.Targets, I used the Exec task to execute VSPERFCMD /START:Coverage /OUTPUT:$(OutputCoverageFile), but there was a severe problem in doing that: MSBuild.EXE completely hung on the call. VSPERFCMD.EXE spawns off the actual monitor process, VSPERFMON.EXE. If you run VSPERFMON.EXE from a command prompt, the process sits there displaying connection and other activity information, so you can't simply call it directly from the project file either.

The problem turns out to be in MSBuild.EXE, and stems from the fact that VSPERFMON.EXE is spawned from the VSPERFCMD.EXE process with the bInheritHandles flag to CreateProcess set to true. Any process started with inherit handles will hang under MSBuild.EXE. Consequently, I have to call Process.Start from the ITask.Execute method in my task to make everything happy under MSBuild.EXE.
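To make that workaround concrete, here is a minimal sketch of such a task, assuming the monitor is started through VSPERFCMD.EXE as described above. The class name, property names, and tool-path handling are my own illustration, not the actual Bugslayer.Build.Tasks code.

using System.Diagnostics;
using Microsoft.Build.Framework;
using Microsoft.Build.Utilities;

// Sketch of a task that starts the coverage monitor itself so MSBuild
// never blocks on handles inherited by the spawned VSPERFMON.EXE.
public class StartCoverageMonitorTask : Task
{
    private string outputFile;
    private string vsPerfCmdPath;

    [Required]
    public string OutputFile
    {
        get { return outputFile; }
        set { outputFile = value; }
    }

    [Required]
    public string VsPerfCmdPath
    {
        get { return vsPerfCmdPath; }
        set { vsPerfCmdPath = value; }
    }

    public override bool Execute()
    {
        // The same command line the Exec task would have run, but launched
        // directly so the build doesn't wait on the monitor's console.
        ProcessStartInfo info = new ProcessStartInfo(
            vsPerfCmdPath,
            "/START:COVERAGE /OUTPUT:\"" + outputFile + "\"");
        info.UseShellExecute = false;
        info.CreateNoWindow = true;

        Process proc = Process.Start(info);
        proc.WaitForExit();   // VSPERFCMD returns once VSPERFMON is running.
        int exitCode = proc.ExitCode;
        proc.Dispose();

        if (exitCode != 0)
        {
            Log.LogError("VSPERFCMD failed with exit code {0}.", exitCode);
        }
        return exitCode == 0;
    }
}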

Working with GenericTest and EXEs

If you've got an existing test system based upon EXE programs, the discussion of the GenericTest type in the documentation probably piqued your curiosity. While working on a project that relied on a batch file that ran nine EXEs as the unit test, I was able to use the GenericTest type to quickly wrap some of my existing code in automation goodness. There were a few catches though. The first small obstacle was that GenericTest only allows zero as a successful return value from the EXE. That's not too big a deal, but considering the advanced features in GenericTest, I'm dismayed to see something as simple as a field for acceptable exit codes was left out.

The bigger problem with GenericTest is that it's a bastion of hardcoding. Fortunately, it's relatively easy to figure out the relative path location. If your GenericTest resides in C:\FOO, the test is actually started from C:\FOO\TestResults\<User>_<Machine>_<TimeStamp>\Out. Thus, if the EXE file executed by your GenericTest is in C:\FOO as well, you can use ..\..\..\<name>.EXE as the program to execute. Unfortunately, nearly everything else in GenericTest is hardcoded from the drive on down. What's interesting is that the Out directory is where all your binaries are copied each time they run. That way, even if you change the code, you can easily rerun previous versions of the test in order to reproduce any problems.

A handy feature, the GenericTest type will capture anything that goes to standard output, giving you a log of the run in the result file. Unfortunately, there currently seems to be a problem with the capture where pumping a lot of information will cause the test driver process to hang. But most test applications aren't pumping out 100 plus lines of output in a few milliseconds.

Create Unit Test...

When it comes to testing, the real magic of Visual Studio is the wonderful Create Unit Test... option you get when right-clicking on a method in the text editor. The feature works well and makes it easy to quickly add unit tests. But I have a small philosophical problem with the fact that it lets you create unit tests that reach directly into the class and access private methods.

The main argument for allowing the testing tool to call private or protected methods directly is that it eases testing (there is less code to write) and helps broaden code coverage. These arguments are seductive, but I fall in the camp that thinks unit testing should only come through the public interfaces. The unit test is the first use of the code and you want to gear the testing towards how others will use it. If there are private methods that you can't sufficiently test without short circuiting and calling them directly, I have to wonder if the code might need to be redesigned. To keep from accidentally creating a unit test that directly calls a private method, go to the Create Unit Test dialog, click on the Filter dropdown in the upper-right corner, and uncheck Display Non-public Items.

By no means am I an absolutist. I'm sure there are cases where it would help immensely to call private methods. However, just because the tool allows you to do something doesn't mean you should rely on it. Unit testing is the first stage of testing and it's the first place where you start the white box testing.

Using NUnit

I have projects where we've already made a major investment in a test system built on NUnit. (A new version that works with the .NET Framework 2.0 was recently released.) In one case, we wanted the code to be portable between NUnit and the Visual Studio test systems, giving us the best of both worlds. When planning this, I stumbled upon something extremely cool: a minimal, one-time code change that allows the code to work with both NUnit and Visual Studio.

The Microsoft patterns & practices group has released the very interesting Composite UI Application Block. They include their unit tests in the installation and, when reading through the code, I noticed this brilliant bit of code at the top of all the tests:

#if !NUNIT
using Microsoft.VisualStudio.TestTools.UnitTesting;
#else
using NUnit.Framework;
using TestClass = NUnit.Framework.TestFixtureAttribute;
using TestMethod = NUnit.Framework.TestAttribute;
using TestInitialize = NUnit.Framework.SetUpAttribute;
using TestCleanup = NUnit.Framework.TearDownAttribute;
#endif

All I had to do was change the methods that used NUnit's Test attribute to use TestMethod instead, and I had test harness code that worked both ways.
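With that using block at the top of the file, a small test class like the following, hypothetical one compiles under either framework, depending on whether the NUNIT symbol is defined:

[TestClass]
public class ParserTests
{
    [TestInitialize]
    public void Setup()
    {
        // Per-test setup; maps to [SetUp] when compiled for NUnit.
    }

    [TestMethod]
    public void EmptyCommandLineIsRejected()
    {
        // Assert.IsTrue has the same signature in both frameworks.
        Assert.IsTrue(true);
    }

    [TestCleanup]
    public void Teardown()
    {
        // Per-test cleanup; maps to [TearDown] when compiled for NUnit.
    }
}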

TimeOutAttribute

The documentation for most of the attributes is quite good. But one of the more important attributes, TimeOutAttribute, isn't covered except in the API documentation. While the TESTRUNCONFIG file allows you to specify the overall timeout value of the unit test, the TimeOutAttribute lets you specify the maximum number of milliseconds an individual test can take. I find TimeOutAttribute invaluable on those test methods that hit the database so I can keep an eye on those queries. Keep in mind that the timing values include some of the test runner's time. In addition, the speed and capabilities of the machine will affect the time. Be sure to experiment with your tests to see how the timing works on your system.
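Here's a quick example of how the attribute is applied; the test body and the five-second budget are made up for illustration:

using System.Threading;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class TimeoutDemoTests
{
    // The timeout is specified in milliseconds on each test method.
    [TestMethod]
    [Timeout(5000)]
    public void QueryFinishesWithinFiveSeconds()
    {
        // Stand-in for a real database call; anything slower than the
        // budget (plus a bit of test runner overhead) fails the test.
        Thread.Sleep(100);
        Assert.IsTrue(true);
    }
}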

The TestContext class, exposed through the TestContext property the Unit Test Wizard adds to your test classes, is also only briefly touched upon. The main discussion covers using the TestContext property to retrieve data rows when you are using the DataSourceAttribute. The TestContext class has a lot more to offer, though. The documentation shows the TestContext class as being abstract, but the derived type actually handed to your unit test is UnitTestAdapterContext, from down deep in Microsoft.VisualStudio.QualityTools.Tips.UnitTest.Adapter.dll, which you'll find in <Visual Studio .NET install dir>\Common7\IDE\PrivateAssemblies. You may want to look at UnitTestAdapterContext with .NET Reflector to see how it works.

Probably the most important method supported by this class is WriteLine, which you can use to add additional output to the individual test results. All the writes show up in the Additional Information section of the report. To find out what test is running or what directory the tests started from, use the TestContext properties TestName and TestDir, respectively. Finally, if you want a timer for all or part of your tests, call TestContext.BeginTimer and TestContext.EndTimer. The timer statistics are output to the test run results in the Standard Console Output section.
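As a small sketch (the class and test names are hypothetical), here's how those TestContext members look in practice:

using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class ContextDemoTests
{
    private TestContext testContextInstance;

    // The test framework sets this property before each test runs; the
    // Unit Test Wizard generates the same boilerplate for you.
    public TestContext TestContext
    {
        get { return testContextInstance; }
        set { testContextInstance = value; }
    }

    [TestMethod]
    public void ReportWhereAndWhatIsRunning()
    {
        // Shows up in the Additional Information section of the results.
        TestContext.WriteLine("Running {0} from {1}",
                              TestContext.TestName, TestContext.TestDir);

        // Timer statistics land in the Standard Console Output section.
        TestContext.BeginTimer("WorkSection");
        // ... the code you want timed goes here ...
        TestContext.EndTimer("WorkSection");
    }
}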

Code for You

As I mentioned, I've included one of my real units along with its unit test. Bugslayer.Utility.DLL is a collection of utility code I've been dragging around between projects. The ArgParser class is a command-line argument parsing class based on the class in the old .NET Framework SDK WordCount sample. The SystemMenuForm class is a Windows Form that allows you to append items to the System Menu and respond to clicks as normal events.

I wrote GlobalMessageBox because I was tired of seeing the Code Analysis error, Specify MessageBoxOptions, every time I used a message box. The rule states that when using a message box, you need to look through the parent controls and see if the RightToLeft property is set to Yes. If it is, you must call the appropriate overload of MessageBox.Show to pass in the MessageBoxOptions flags. GlobalMessageBox takes care of this for me, suppressing the error and allowing my code to work correctly on right-to-left language systems.
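The idea behind GlobalMessageBox looks roughly like the following sketch. This is not the actual class from the download; the helper name and overload are made up for illustration.

using System.Windows.Forms;

// Sketch: pick up RightToLeft from the owning control (or its parents)
// and pass the matching MessageBoxOptions so the rule is satisfied.
public static class MessageBoxHelper
{
    public static DialogResult Show(Control owner, string text, string caption)
    {
        MessageBoxOptions options = 0;

        // Walk the parent chain looking for RightToLeft set to Yes.
        for (Control c = owner; c != null; c = c.Parent)
        {
            if (c.RightToLeft == RightToLeft.Yes)
            {
                options = MessageBoxOptions.RightAlign |
                          MessageBoxOptions.RtlReading;
                break;
            }
        }

        return MessageBox.Show(owner, text, caption,
                               MessageBoxButtons.OK,
                               MessageBoxIcon.Information,
                               MessageBoxDefaultButton.Button1,
                               options);
    }
}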

The full unit test for Bugslayer.Utility.DLL is in the Bugslayer.Utility\Tests\Bugslayer.Utility.Tests directory. There are 39 tests in the various CS files providing over 92 percent code coverage. As this is a Windows Forms-focused unit that brings up message boxes and controls, you can't run it unattended, but it still runs in under 15 seconds.

The other chunk of code is in the Bugslayer.Build.Tasks.DLL assembly, and is for use with MSTEST.EXE. While running tests in the IDE is great for debugging them, I love that I can easily execute my tests outside the IDE. MSTEST.EXE is likely to become very important in your life, as it will be the runner for all your big smoke and regression tests.

Names and Places

The oddest part of MSTEST.EXE is its output naming convention. If you use the /RUNCONFIG option to use a TESTRUNCONFIG file, the output file will use the naming convention specified in that file. If you don't use /RUNCONFIG, or you leave it set to the default, all the output is written to .\TestResults\<user>_<machine> <timestamp>. I recommend using names that are more quickly identifiable.

MSTEST.EXE offers the /RESULTSFILE option, but this causes the output file name to lose the timestamp. Additionally, MSTEST.EXE fails if the file name specified to /RESULTSFILE exists. What I want is to specify a name that refers to the particular focus I am working on, but without having to add the timestamp manually.

You might be thinking that a possible solution is to use the VSMDI test metadata files that you're used to seeing in the Test Manager window. In fact, MSTEST.EXE does have a /TESTMETADATA option to load and run the tests. The problem is that you can only specify one VSMDI file.

One possible solution is to create a separate VSMDI file that imports all the other VSMDI files in your code. That will certainly work, but it presents yet another maintenance task to remember every time you add new tests to your code.

Also worth mentioning is that you can't tell the IDE or MSTEST.EXE where to place the output when running VSMDI files. The output goes to a TestResults directory below the one where the VSMDI file resides. I recommend keeping the tests in a directory below the source code in version control so that if you share the project, all the test code goes with it.

With VSMDI files as part of each test and no way to centralize the output, the output will be scattered all over your source code. This isn't a big deal, but it does mean you have to clean up the source tree manually. After having dealt with the results of many test runs, I decided I wanted an easier way to handle this.

A Better MSTEST.EXE

Given this discussion, I saw there were four features I wanted to add to MSTEST.EXE. The first was a way to dynamically find all your unit tests and run them just like a mini smoke test. The second was an easy way to identify similar test runs other than by just reading the timestamp. The third was to ensure all the test output went to a single place. And finally, I wanted a very easy way to get rid of extraneous test runs no matter where they were in your source tree.

These requirements scream for MSBuild. The MSTestTask in Bugslayer.Build.Tasks.DLL wraps MSTEST.EXE so you've got all the control in the world. As you look at the code, you'll notice that it's derived from the ToolTask class, which comes out of the Microsoft.Build.Utilities.DLL assembly. ToolTask is what you want to use when writing build tasks that wrap a command-line tool because it does most of the heavy lifting.

For many tools, all you need to do is define your unique properties and override three methods and one property. The property is ToolName, which returns the executable name of the tool. The GenerateFullPathToTool method returns the complete drive, path, and file name to the tool itself. To validate the parameters, you override the ToolTask.ValidateParameters method and return true if everything works. To build the actual command line to the tool, override ToolTask.GenerateCommandLineCommands and use either the CommandLineBuilder class or my small extension to it, ExtendedCommandLineBuilder.

Run MSTEST.EXE /? to see all the possible command-line parameters. For MSTestTask, the ResultsFile parameter is required as it specifies the output file name. You also need to set either the TestMetaData or TestContainer parameter to indicate the metadata file or test containers, respectively.
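Here's a stripped-down sketch of what a ToolTask wrapper around MSTEST.EXE can look like. The real MSTestTask in the download does considerably more (including the TESTRUNCONFIG work described next), and the ToolDirectory property is an assumption made for this example.

using System.IO;
using Microsoft.Build.Framework;
using Microsoft.Build.Utilities;

public class MSTestTaskSketch : ToolTask
{
    private string resultsFile;
    private string testContainer;
    private string toolDirectory;

    [Required]
    public string ResultsFile
    {
        get { return resultsFile; }
        set { resultsFile = value; }
    }

    public string TestContainer
    {
        get { return testContainer; }
        set { testContainer = value; }
    }

    // Where MSTEST.EXE lives; an assumption for the sketch.
    [Required]
    public string ToolDirectory
    {
        get { return toolDirectory; }
        set { toolDirectory = value; }
    }

    // The executable name ToolTask will run.
    protected override string ToolName
    {
        get { return "MSTEST.EXE"; }
    }

    // The complete drive, path, and file name to the tool itself.
    protected override string GenerateFullPathToTool()
    {
        return Path.Combine(toolDirectory, ToolName);
    }

    // Return true only if the parameters make sense together.
    protected override bool ValidateParameters()
    {
        if (string.IsNullOrEmpty(testContainer))
        {
            Log.LogError("TestContainer must be specified.");
            return false;
        }
        return true;
    }

    // Build the command line that gets handed to MSTEST.EXE.
    protected override string GenerateCommandLineCommands()
    {
        CommandLineBuilder builder = new CommandLineBuilder();
        builder.AppendSwitchIfNotNull("/testcontainer:", testContainer);
        builder.AppendSwitchIfNotNull("/resultsfile:", resultsFile);
        return builder.ToString();
    }
}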

The really interesting work is in MSTestTask.ValidateParameters because that's where I crack open the TESTRUNCONFIG file and do a little XPath jujitsu to make sure the unique name I generate matches the formatting you may have set in TESTRUNCONFIG. You might be thinking that it would have been easier to just write this task in a TARGETS file using the Exec task. In fact, while you would not have had the ability to look in the TESTRUNCONFIG file to get the name, you could do the rest.

However, my long-term plan for MSTestTask is to extend the Tests property to allow wildcards to be passed for the names of tests to execute. That would let you easily execute only those tests with specific prefixes. The work would simply be a bunch of reflecting through the assemblies passed in the TestContainer property, looking for classes that have the TestClassAttribute on them and methods with the TestMethodAttribute whose names match the regular expression passed in.
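A rough sketch of that reflection pass follows, assuming the wildcard has already been turned into a regular expression; the class and method names here are illustrative only.

using System;
using System.Collections.Generic;
using System.Reflection;
using System.Text.RegularExpressions;
using Microsoft.VisualStudio.TestTools.UnitTesting;

public static class TestFinder
{
    // Return the full names of [TestMethod] methods in [TestClass] types
    // whose names match the supplied pattern.
    public static List<string> FindMatchingTests(string assemblyPath, Regex pattern)
    {
        List<string> matches = new List<string>();
        Assembly testAssembly = Assembly.LoadFrom(assemblyPath);

        foreach (Type type in testAssembly.GetTypes())
        {
            // Only look at classes marked with TestClassAttribute.
            if (!type.IsDefined(typeof(TestClassAttribute), false))
            {
                continue;
            }

            foreach (MethodInfo method in type.GetMethods())
            {
                if (method.IsDefined(typeof(TestMethodAttribute), false) &&
                    pattern.IsMatch(method.Name))
                {
                    matches.Add(type.FullName + "." + method.Name);
                }
            }
        }
        return matches;
    }
}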

The other action part of MSTestTask comes in the RunTests.Targets file, which you'll find in the .\Build directory with the source code. This contains the very cool ExecuteAllTests target, which starts at a directory you specify, looks for all unit tests, GenericTests, WebTests, and OrderedTests in the entire hierarchy, and automatically executes them. You can think of the ExecuteAllTests target as an automatic regression test for your unit tests. As you add new tests, it will execute them automatically. The code for RunTests.Targets, which is included in the code download, makes judicious use of file exclusions to get just the files we want. To see an example of RunTests.Targets in use, look at SmokeTest.proj, which shows the smoke test for all of this column's code.

The final TARGETS file in the .\Build directory is MSTestCleanUp.Targets. As the name implies, its job is to find all those directories that have TestResults as one of the paths and delete them. It's a good example of using transforms as well as the wonderful RemoveDuplicates task in MSBuild. With MSTestCleanUp.Targets at your side, you won't sully your source directories with extra files.

Wrap Up

If you can't tell, I'm extremely excited about the new unit testing tools in Visual Studio 2005. Things like ASP.NET 2.0 and DataGrids get all the attention in the press, but the testing tools will have a much bigger impact when you're trying to get your application out the door on time. I can guarantee that the more time you spend playing with the testing tools, the better your code will be!

Tip 73 You can control the default programming language for a unit test and exactly what items are put in a new unit test when it's created from the Unit Test Wizard. Go into the Options dialog, expand the Test Tools node, and go to the Test Project property page. There you'll see the Default test project type combobox and the default file selections for each language type. If you're like me, you'll uncheck the "About Test Projects" introduction file after you create your second unit test.

Tip 74 Try to keep all the unit tests for an assembly in a single test assembly. That one-to-one mapping is easy from a maintenance standpoint. However, as the assembly grows, the number of tests can get quite large. I like to place prefixes on the test method names related to the feature they are testing. This allows for easy grouping. For example, in the Bugslayer.Utility.Tests.DLL assembly, the tests related to the GlobalMessageBox class all start with "GMB_".
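For instance, a couple of hypothetical test names following that convention (these aren't the actual tests from the download) might look like this:

[TestMethod]
public void GMB_ShowWithDefaultOptions()
{
    // Exercises GlobalMessageBox with no special options.
}

[TestMethod]
public void GMB_ShowOnRightToLeftForm()
{
    // Exercises GlobalMessageBox when the parent form is right to left.
}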

Send your questions and comments for John to slayer@microsoft.com.

John Robbins is a cofounder of Wintellect, a software consulting, education, and development firm that specializes in .NET and Windows. His latest book is Debugging Applications for Microsoft .NET and Microsoft Windows (Microsoft Press, 2003). You can contact John at www.wintellect.com.