Micheal Learned, Sudheer Adimulam, Tim Star
In this article we’ll introduce some of the new features in Microsoft Test Manager 2012 (MTM) that are used and “dogfooded” by the Visual Studio ALM Rangers.
To recap, the ALM Rangers are a group of experts who promote collaboration among the Visual Studio product group, Microsoft Services and the Microsoft Most Valuable Professional (MVP) community by addressing missing functionality, removing adoption blockers and publishing best practices and guidance based on real-world experiences.
The following sections discuss the various features of MTM 2012.
Exploratory Testing
This is sometimes referred to as “ad hoc testing,” defined as performing software testing without a defined script. The idea is to lean on the creativity of the tester to help surface bugs, rather than having a scripted step-by-step test case for every test run and scenario. In the first version of MTM, released in 2010, exploratory testing was enabled by filing an “exploratory bug” from Microsoft Test Runner. The tool allowed users to perform a set of actions in an unscripted workflow, and once a bug was found, the tester could trim the recorded actions down to an appropriate number of steps to include in the bug being filed.
The idea was that a tester might spend a considerable amount of time exploring the application before finding a bug; the ability to trim the steps then gave the tester the freedom to create the bug with more or fewer repro steps, depending on the context of the scenario. This functionality let testers freely perform a test run in an unscripted workflow and still leverage the capability to create a bug with the exact repro steps. Testers could also create test cases from these steps so the bug fix could be validated later by rerunning the scripted test case.
The exploratory testing experience has been greatly improved in the 2012 release. Previously, filing an exploratory bug first required having a test case to run from Microsoft Test Runner. Users could create a “dummy” test case named, say, “Exploring,” or just leverage some existing test case. Both options were somewhat clumsy and made the exploratory testing features hard to discover. In MTM 2012, a test case is no longer required for exploratory testing, and there are a few different ways to start an exploratory testing session. To get started, simply right-click a test suite in the test plan and choose “Explore.” Users can also associate the exploratory testing effort with a requirement, which links the bugs and test cases that are created to the requirement work items. To do this, launch the exploratory session in MTM 2012 from a backlog item, as shown in Figure 1.
Figure 1 Steps for Exploratory Testing Using MTM
While running the exploratory testing session, testers can create additional data for the bugs in the form of screenshots, comments and file attachments. The exploratory testing window shown in Figure 2 provides a nice experience for testers. The large icons make it easy to create bugs and test cases, and you can type and format notes in a free-form field. The notes you type—and any data captured by testers, such as screenshots—get seamlessly added to the bug or test case as you create them. Users have the ability to add and remove steps to deviate from what was captured during the recorded actions.
Figure 2 Exploratory Testing Session in MTM
Creating new bugs and test cases from an exploratory testing session is one example of a common workflow, but users can also open and update existing manual test cases and bugs.
Scenarios for Exploratory Testing in MTM 2012
The user experience is fast and fluid with exploratory testing in MTM 2012. Testers can pause and resume testing, which makes the overall experience extremely flexible. One scenario many organizations use to support software apps is fielding customer phone calls, during which support agents often walk through the application with an end user to attempt to reproduce a bug. Often the support professional takes notes and screenshots and later sends the bug to the developers. The exploratory testing features of MTM 2012 enable support professionals to walk through the application in an exploratory testing session while on the phone with an end user and then efficiently feed the development teams rich, actionable bugs. Once off the call, the support professional can end the testing session and begin a new session with the next call.
Often, before releasing an application or new features to an application, product owners want to test the applications thoroughly to ensure bugs aren’t released. By leveraging exploratory testing, organizations can remove the overhead of creating scripted test cases for every scenario. A group of testers could spend time exploring the application, and file bugs and test cases as they’re found. This free-form testing can help reduce the overhead of more defined testing efforts.
For another example, users might want to test an application without the overhead of defined test cases simply because of temporary resource issues.
Improved Performance
This was a major goal for the new release, and the product team has done a lot of work in this area. Connecting to a test plan, displaying tests within a suite, launching Microsoft Test Runner, saving work items and creating lab environments have all improved. In addition, Visual Studio Team Foundation Server (TFS) proxy support has been enabled for attachments, so teams using MTM 2012 and TFS Proxy will see performance benefits similar to those in source control operations. Specifically, attachments are cached on the proxy server, saving each consumer of an attachment from having to wait for it to be downloaded from TFS. A couple of other small timesavers: a “most recently used” list lets you select a user without going through the complete list, and assigning configurations is easier thanks to a single list of configurations to choose from rather than a separate column for each.
Test Case Editor Improvements
Though it isn’t obvious, the test steps grid (shown in Figure 3) has been completely rewritten. Features previously available via hotfix and feature packs are now available in the product by default. The test steps grid supports rich text and multiline test steps. Additionally, copy and paste from Microsoft Excel or Microsoft Word, including multiline steps and rich text, is supported. Screen real estate also is managed better by eliminating the frames around the test case fields at the top of the screen and by providing a splitter between the test steps grid and the parameters region at the bottom.
Figure 3 Test Case Editor Improvements
Cloning Test Suites into Other Plans for New Iterations
A common question from MTM users has been, “How do I copy a test plan without losing traceability?” The 2010 version of MTM allowed test plans to be copied, but copying created a new test plan without creating new test cases. Instead, the existing test cases were “referenced” by the new test plan, so changing a test case in one plan also changed it in the other. This wasn’t desirable behavior for teams that required absolute traceability; those teams had to use third-party utilities or resort to low-level TFS API programming to achieve the desired results.
Cloning a test plan is now a feature of tcm.exe, the Test Case Management command-line tool. Cloning a test plan will clone the test cases, shared steps, test suites, assigned testers, configurations, action recordings, links and attachments. Test settings, test results and test runs are not cloned. Requirement-based suites are also not cloned; cloning the original requirements and associating them with new test cases, or associating the new test cases with old requirements, is a manual operation.
A clone operation is performed with tcm.exe from the Visual Studio command prompt. You must specify the collection, the source and destination suites, and a value for the new destination test plan. You can optionally use the overridefieldname and overridefieldvalue parameters to specify a new area path or iteration path, or custom test case fields that have been added to the test case work item template.
The tcm.exe suites command format is as follows:
tcm.exe suites /clone /collection:CollectionURL /teamproject:project /suiteid:id /destinationsuiteid:id /overridefieldname:fieldname /overridefieldvalue:fieldvalue
The following command line will copy a suite with an ID of 100 into a suite with an ID of 115:
tcm.exe suites /clone /collection:http://myTFS:8080/tfs/sampleTPC /teamproject:sampleTeamProject /suiteid:100 /destinationsuiteid:115 /overridefieldname:"Iteration Path" /overridefieldvalue:"areapath\sprint 2"
The team project collection is named “sampleTPC,” and the team project is named “sampleTeamProject.” The new iteration path will be “areapath\sprint 2.”
Note: You can find the test suite ID by highlighting the test suite in the plan contents and then viewing the ID next to the suite name on the right-hand side in the header above the list of test cases.
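When a release spans several sprints, each needing its own cloned suite, the same tcm.exe invocation repeats with only the destination suite ID and iteration path changing. The following sketch generates (but does not execute) one clone command per sprint; the suite IDs, sprint names and collection URL are hypothetical placeholders, reusing the values from the example above.

```shell
#!/bin/sh
# Dry-run generator: prints one tcm.exe clone command per sprint.
# All IDs, names and the collection URL below are hypothetical.
COLLECTION="http://myTFS:8080/tfs/sampleTPC"
PROJECT="sampleTeamProject"
SOURCE_SUITE=100

# destination-suite-id:iteration-name pairs (hypothetical values)
for pair in "115:sprint 2" "116:sprint 3"; do
  dest=${pair%%:*}        # text before the first ':'
  sprint=${pair#*:}       # text after the first ':'
  echo tcm.exe suites /clone \
       "/collection:$COLLECTION" \
       "/teamproject:$PROJECT" \
       "/suiteid:$SOURCE_SUITE" \
       "/destinationsuiteid:$dest" \
       '/overridefieldname:"Iteration Path"' \
       "/overridefieldvalue:\"areapath\\$sprint\""
done
```

Removing the echo would run the commands for real, which you would do from a Visual Studio command prompt where tcm.exe is on the path.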
Link to a Read-Only Version of a Test Case
Microsoft Test Runner now provides a link to a read-only version of the test case. Also, the test description field in Microsoft Test Runner supports rich text, as shown in Figure 4.
Figure 4 Microsoft Test Runner Enhancements
Video Recording Enhancements
The video recorder no longer requires a separate install, and users may now optionally enable audio recording as well. Audio recording can be enabled or disabled in the Diagnostic Data Adapter for the video recorder, as shown in Figure 5.
Figure 5 Enabling Audio Recording
Navigation in MTM
Navigation has been improved in a couple of ways. You’ll notice there’s a Copy Link button sprinkled throughout the product, as shown in Figure 6.
Figure 6 The Copy Link Button
Clicking this link will copy a URL to the clipboard so you can e-mail someone a hyperlink to the item you’re viewing.
Clicking a hyperlink containing this address will launch MTM 2012 and bring the user directly to the test result identified with an ID and run ID, which are included in the hyperlink.
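The article doesn’t show the link format itself, so the following is a purely hypothetical sketch: assuming the copied hyperlink carries the test result ID and run ID as query parameters, a script could pull them out with plain parameter expansion before handing off to other tooling. The URL scheme, host and path below are invented for illustration and may not match what MTM actually registers.

```shell
#!/bin/sh
# Hypothetical link shape -- the real MTM URL scheme may differ.
link="mtm://myTFS:8080/tfs/sampleTPC/p:sampleTeamProject/testresult/open?id=42&runid=7"

query=${link#*\?}          # everything after the '?'
id="" runid=""
old_ifs=$IFS; IFS='&'      # split the query string on '&'
for kv in $query; do
  key=${kv%%=*}            # parameter name
  val=${kv#*=}             # parameter value
  case $key in
    id)    id=$val ;;
    runid) runid=$val ;;
  esac
done
IFS=$old_ifs
echo "test result id=$id, run id=$runid"
```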
Test plan selection has also been improved. Selecting a plan from the launch screen could be painful in the previous version when a large number of plans were in the list. Rather than scrolling through the long list, simply type the first few letters of the plan name to jump to the appropriate location in the plan list.
There has always been a hyperlink in the upper-right corner of MTM that lets users jump to the plan list. That feature still exists, but now there’s a hyperlink to the team project as well, so the Team Projects selection screen is also a single click away.
Connecting to TFS 2010
A majority of MTM features—such as test planning and execution, data collection and use of lab environments—work fine between mismatched versions of MTM and TFS. To use MTM 2012 against TFS 2010, you need TFS 2010 SP1 and the latest software updates. However, new features such as exploratory testing won’t work until you upgrade TFS 2010 to TFS 2012.
Reports
MTM provides various kinds of reports to track and measure the effectiveness of testing. The reports help you figure out which test cases passed, failed or were blocked. MTM 2012 allows you to view results from the Plan tab, with an option to view results that gives a good view of the test plan result status. You can view the results by test configuration or by test suite, and also by tester. To view the results in the Plan tab, click the Results link as shown in Figure 7. This opens the results for the most recent test run.
Figure 7 Test Plan Results
Test Data Reduction to Reduce Load on TFS Storage
In MTM 2010, by default, when the results of automated test runs are published to TFS 2010, the deployment items and binaries of all test runs are uploaded. These can be used later to rerun tests and analyze failures, but this approach carries a large overhead in TFS database storage and causes performance issues on the client side when opening test results. In MTM 2012, by default, only the test result files and other data collector attachments are uploaded to the TFS database. Binaries are uploaded to TFS 2012 only when code coverage or test impact analysis is enabled, because they’re required for that analysis.
Marking Test Case Results in MTM Without Launching Microsoft Test Runner
In MTM 2010, there was no option for marking multiple test cases as passed or failed. The tester could only set the status of a single test case, and only from the Microsoft Test Runner window, which was tedious. With MTM 2012, testers can mark a test case passed, failed or blocked directly from the Run Tests screen of the Test tab, and can apply Pass test, Fail test or Block test to a single test case or multiple test cases, or reset the tests to active (see Figure 8).
Figure 8 Mark Test Case Results in MTM
Manual Testing of Windows Store Applications
MTM 2012 helps improve the efficiency of manual testing of Windows Store apps. Using MTM 2012, you can test Windows Store applications running on a remote Windows 8 device such as a tablet or a Windows 8 PC. You can execute your test steps on the remote device and at the same time mark the steps as passed or failed in MTM 2012 on your local machine. MTM 2012 will help you generate rich action logs—with a video and both text and image descriptions of your actions—that are step-by-step representations of the actions you performed on the remote device.
Manual testing of Windows Store applications involves three steps: installing the Remote Debugger, which includes the Microsoft Test Tools Adapter service; connecting to the remote device using MTM 2012; and executing the test cases from MTM 2012.
Before testing Windows Store applications, ensure the Microsoft Test Tools Adapter service is enabled. Once the service is enabled, in MTM 2012, connect to the test plan where you have your test suite. In the Testing Center, click the Modify link next to “Perform manual tests using” to specify the remote device on which to run manual tests (see Figure 9). Select the “Remote device…” option and enter the name or IP address of the device you want to test. Click Test to test the connection and then save your changes.
Figure 9 Manual Testing of Windows Store Applications
Once connectivity is established, you can run the manual test case. Microsoft Test Runner opens a “Perform manual tests using” dialog box with options to either Start Test or Install Application. Install Application performs a remote installation of the Windows Store app on the Windows 8 device, a three-step process of copying files, installing certificates and installing the app. Clicking Start Test displays the test steps in the MTM window, where you can mark them as passed or failed. While executing the steps on the remote machine, you can take screenshots and file bugs.
Enhanced Action Logs for Windows Store Apps
With MTM 2012, you can generate rich action logs with both text and image descriptions of the actions performed on Windows Store applications or in Internet Explorer 10. The action log files contain screenshots for each action step conducted during the test run; the files are saved as .html files and can be viewed in the browser. Hovering over any thumbnail in the image action log displays a full-screen image of the action performed (see Figure 10). The enhanced action log makes reproducing bugs easier: the user can see the exact steps taken by the tester, and these logs are displayed when a bug is submitted through Microsoft Test Runner or the exploratory testing window.
Figure 10 Enhanced Action Logs for Windows Store Apps (Source: Visual Studio ALM + Team Foundation Server Blog at bit.ly/NV0Eru.)
Be sure to explore these and many more features in MTM 2012, especially if you’re responsible for raising the quality bar of solutions and testing them.
Sudheer Adimulam is a test consultant with Microsoft Services – Global Delivery, and works as a Visual Studio ALM Ranger. He has a master’s degree in computer applications and is an ISTQB, CSQA, MCSD and MCTS.
Micheal Learned is a senior premier field engineer developer with Microsoft and works as a Visual Studio ALM Ranger. He focuses on helping Microsoft customers with .NET Framework development and application lifecycle management. He can be reached at his blog at tfsmentor.com or on Twitter at twitter.com/mlhoop.
Tim Star is a principal consultant with Intertech Inc., focusing on training, consulting and Visual Studio ALM. He has a bachelor’s degree in electrical engineering and is an MCPD, MCTS, MCT, Visual Studio ALM External Ranger and three-time MVP award recipient.
Thanks to the following technical experts for reviewing this article: Mathew Aniyan, Nivedita Bawa, Willy-Peter Schaub and Charles Sterling
Is there any new feature in 2012 to allow us to export/view the results in a Microsoft Word or Excel document? Our project stakeholders want to review the test results offline in a traditional format (i.e., Word/Excel), and in this way the results are baselined and no one can modify them later. The existing export tools (e.g., Test Scribe) are not flexible, and the format of the generated documents is not great.
When test case data is imported into Excel, it does not display the Action and Expected Results separately; both fields appear jumbled together in the “Steps” column. Question: How do I separate Actions and Expected Results in Excel 2007? We have separate columns for Actions and Expected Results in Excel 2007, and now we need to import the data into MTM from Excel.
Hi, the report shown in the figure for test plan results is very useful for us. However, is there a way to export it to Excel in tabular and graphical format? Thanks -Ravi
One more question: Our web-based application is tested against Oracle and SQL Server databases. We use Excel to record our test results with four columns: 1. Action; 2. Expected Result; 3. Oracle DB, marked as P or F depending on the result; 4. SQL Server DB, marked as P or F depending on the result. I have cut and pasted the Excel test cases into Test Manager, but I’m not sure how to add the results when testing against the two different databases. Thanks
Any idea how I can import test cases from Excel to MTM? I used the recommended tool http://tcmimport.codeplex.com/ but it did not work. We have TFS 2012. Thanks
Hi, thanks for the information. I have two questions: 1. Can I import test suites/cases into MTM 2012 from another team project, or is it still possible only from other test plans? 2. If I upgrade my MTM 2010 to 2012, will all my test plans/suites/cases from MTM 2010 be preserved? Thanks.
Hi, thanks for this detailed post on the new features. We have been using MTM 2010 for a couple of years now, and back then we were tripping over the traceability, reporting and search features. With 2012, is there a way to search for certain test case IDs to figure out which suite they belong to? Currently, we map our area paths to the test suite structure and then use the area as a landmark for drilling down to the appropriate test suite. This feels clumsy.