Overview of Test Scenarios

This topic describes the specifics of the virtual and physical environments that were used to run the performance tests.

The Consume Web Service BizTalk Server 2006 R2 SDK sample was used as the test application. The original sample uses the file adapter for receive locations and send ports. In order to provide a more realistic test scenario and to calculate end-to-end latency, the sample was modified to expose the orchestration as a Web service via a two-way request-response port. When an orchestration request-response port (see the screenshot of the orchestration below) is exposed as a Web service using the Orchestration Web Services Publishing Wizard, the Perfmon counter BizTalk:Messaging Latency\Request-Response Latency (sec) is provided for the BizTalkServerIsolatedHost. This counter was used as the latency metric throughout the tests; for more information, see the Measuring Performance on Hyper-V section of this guide.

The figure below illustrates the high-level architecture used. Loadgen (http://go.microsoft.com/fwlink/?LinkId=59841) with the SOAP transport was used as the test client. Loadgen was chosen because it can be configured with the total number of messages sent, the number of simultaneous threads, and the sleep interval between requests. As a result, a consistent load was applied to both the physical and virtual BizTalk Servers.


Load Test Architecture
  1. The message flow begins when Loadgen sends a purchase order (PO) message using a synchronous SOAP request which is processed by a BizTalk Web Service hosted within IIS 6 in a separate application pool. (Steps 1-2)

  2. The PO is processed by a BizTalk receive location which is configured to use the PassThruReceive pipeline. (Steps 3-4)

  3. The orchestration passes the XML PO message as a string to the Web service. (Steps 5-9)

  4. The Web service receives the XML PO message string, converts it to a corresponding XML invoice message string, and returns it as a string to the calling orchestration. (Steps 10-12)

  5. The response is then returned to the Loadgen client, which was configured as a synchronous client to ensure that the sending thread waits for IIS to return a response before sending an additional request. (Steps 13-17)
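The synchronous round trip described in steps 1-5 can be sketched in code. The following is an illustrative Python sketch of what the test client does, not Loadgen itself; the endpoint URL, operation name, and envelope shape are assumptions based on the Consume Web Service sample.

```python
import urllib.request

# Assumed SOAPAction for the published orchestration Web service.
SOAP_ACTION = '"http://tempuri.org/WebService_ConsumeWebService/Operation_1"'

def build_soap_request(url, po_xml):
    """Wrap a purchase-order payload in a SOAP 1.1 envelope and return a
    ready-to-send urllib Request with the SOAPAction header set."""
    envelope = (
        '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/" '
        'xmlns:tem="http://tempuri.org/">'
        '<soap:Body><tem:Operation_1>'
        + po_xml +
        '</tem:Operation_1></soap:Body></soap:Envelope>'
    )
    return urllib.request.Request(
        url,
        data=envelope.encode("utf-8"),
        headers={"Content-Type": "text/xml; charset=utf-8",
                 "SOAPAction": SOAP_ACTION},
        method="POST",
    )

# A synchronous client thread would then loop:
#   response = urllib.request.urlopen(request)  # blocks until IIS replies
# before building and sending the next request, which is what keeps the
# load profile steady in this scenario.
```

Because the call blocks until the Web service responds, each thread can have at most one request in flight, mirroring the synchronous Loadgen configuration described later in this section.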

A screenshot of the orchestration, which was modified to use a two-way receive port, is shown below. For more information on the Consume Web Service example, see http://go.microsoft.com/fwlink/?LinkId=122845.


Consume Web Service

Performance testing involves many tasks that are repetitive, monotonous, and error prone if performed manually. To improve speed and provide consistency between test runs, Visual Studio 2005 Team Test Edition with BizUnit 3.0 (http://go.microsoft.com/fwlink/?LinkID=85168) was used to orchestrate and automate the tasks required during the testing process. BizUnit used Loadgen to generate the message load against the system; the same message types were used on each test run to improve consistency, and any changes were fully documented in the results spreadsheet. Following this process enabled us to generate a consistent set of data for every test run.

The following steps were automated:

  • Stop BizTalk hosts

  • Clean up test directories

  • Reset IIS

  • Clean up the MessageBox

  • Clear event logs

  • Create a test results folder for each run to store all the data associated with the test run, including logs and Perfmon files

  • Start BizTalk hosts

  • Start Perfmon counters

  • Warm up the BizTalk environment with a small load

  • Send through a representative test run
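The automated steps above can be driven by a simple harness. The sketch below is a hypothetical illustration of that orchestration, not the actual BizUnit test cases; all service, database, and path names (BTSSvc$BizTalkServerApplication, bts_CleanupMsgbox, the C:\PerfTest folders, the logman collector name) are assumptions for a typical BizTalk environment.

```python
# Hypothetical sketch of the automated test-run steps; in the lab, BizUnit
# test steps drove equivalent actions.  All names below are illustrative.
TEST_RUN_STEPS = [
    ("Stop BizTalk hosts",          'net stop "BTSSvc$BizTalkServerApplication"'),
    ("Clean up test directories",   r"del /q C:\PerfTest\Data\*"),
    ("Reset IIS",                   "iisreset"),
    ("Clean up the MessageBox",     'sqlcmd -S SQLBOX -d BizTalkMsgBoxDb '
                                    '-Q "exec bts_CleanupMsgbox"'),
    ("Clear event logs",            "wevtutil cl Application"),
    ("Create test results folder",  r"mkdir C:\PerfTest\Results\Run001"),
    ("Start BizTalk hosts",         'net start "BTSSvc$BizTalkServerApplication"'),
    ("Start Perfmon counters",      "logman start BizTalkCounters"),
    # Warm-up and the representative run were then driven through Loadgen.
]

def run(steps, dry_run=True):
    """Execute each step in order; in dry-run mode just record descriptions."""
    executed = []
    for description, command in steps:
        executed.append(description)
        if not dry_run:
            import subprocess  # Windows-only commands above
            subprocess.run(command, shell=True, check=True)
    return executed
```

A dry run returns the step descriptions in order, which is useful for verifying the sequence before pointing the harness at a live environment.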

To ensure that the results of this lab could be used to compare the performance of BizTalk Server in physical and Hyper-V environments, performance metrics and logs were collected in a centralized location for each test run.

The test client was used to create a unique results directory for each test run. This directory contained all the performance logs, event logs, and associated data required for the test. This approach provided the information needed when retrospective analysis of prior test runs was required. At the end of each test, the raw data was compiled into a set of consistent results and key performance indicators (KPIs). Collecting a consistent result set for both physical and virtualized machines provided the points of comparison needed between the different test runs and environments. The data collected included:

  • Test Run Number – to uniquely identify each test run

  • Test Run Name

  • Date

  • Messages Sent In Total – this was collected along with the Loadgen settings below to run comparable tests between different physical and virtual configurations

  • Sleep Interval – Loadgen setting

  • Threads Per Section – Loadgen setting

  • LotSize – Loadgen setting

  • Time Started – as reported by the first Loadgen client initiated

  • Time Stopped – as reported by the last Loadgen client to complete

  • Request-Response Duration Average (ms) – as reported by the BizTalk:Messaging Latency\Request-Response Latency (sec) counter for the BizTalkServerIsolatedHost. Note that where multiple virtualized BizTalk hosts were running, an average of these counters, calculated from the logs, was used
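Where multiple virtualized BizTalk hosts were running, the per-host latency counters were averaged from the logs. Below is a minimal sketch of that kind of post-processing, assuming the Perfmon logs have been relogged to CSV; the column layout is an assumption, not the lab's actual tooling.

```python
import csv
import io

def average_latency_ms(csv_text,
                       counter_suffix="Request-Response Latency (sec)"):
    """Average every column whose header ends with the given counter name
    across all sample rows, converting seconds to milliseconds."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    header, samples = rows[0], rows[1:]
    # Perfmon CSV headers look like \\MACHINE\Category(Instance)\Counter.
    cols = [i for i, name in enumerate(header)
            if name.endswith(counter_suffix)]
    values = [float(row[i]) for row in samples for i in cols if row[i]]
    return 1000.0 * sum(values) / len(values)
```

Feeding the function a relogged CSV with one latency column per BizTalk host yields a single averaged figure, which is how the multi-host rows in the results spreadsheet can be produced consistently.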

Loadgen was used as a consistent load test client throughout all the tests. The base template file that was used is shown below. The following properties were adjusted between tests to vary the load profile:

  • NumThreadsPerSection – This determines the number of concurrent threads that Loadgen uses to send messages to the configured endpoint. For example, a value of 35 means that Loadgen has 35 concurrent threads sending messages to the Web service.

  • SleepInterval – This determines how long, in milliseconds, each thread sleeps after it has sent the number of messages specified in its lot size, before it resumes sending.

  • LotSizePerInterval – This is the number of messages that each thread sends before going to sleep. Because Loadgen was used in a two-way synchronous scenario during these tests, each thread waits for a response from the Web service before sending another message. If a one-way transport were used, the messages would be sent in a batch.

  • NumFiles – This specifies the total number of files that are sent during the test. Note that this figure is per Loadgen machine.

  • URL – This is the Web service endpoint tested by the Loadgen client.


    <!-- Excerpt from the Loadgen base template; elements not discussed above are omitted -->
    <StopMode Mode="Files">
        <!-- NumFiles (total number of files to send) appears here -->
    </StopMode>

    <Transport Name="SOAP">
        <!-- SOAP transport configuration omitted -->
    </Transport>

    <Section Name="SoapSection">
        <SOAPHeader>SOAPAction: "http://tempuri.org/WebService_ConsumeWebService/Operation_1"</SOAPHeader>
        <SOAPPrefixEnv>&lt;soap:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:tem="http://tempuri.org/" xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"&gt;&lt;soap:Body&gt;&lt;tem:Operation_1&gt;</SOAPPrefixEnv>
        <!-- URL and remaining section elements omitted -->
    </Section>
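Together, NumThreadsPerSection, LotSizePerInterval, and SleepInterval determine the offered load. Because each thread blocks on the synchronous response, the approximate steady-state request rate can be estimated with a back-of-the-envelope calculation such as the following sketch (not part of the lab tooling; the average latency figure is an assumed input):

```python
def offered_load_per_sec(num_threads, lot_size, sleep_interval_ms,
                         avg_latency_s):
    """Approximate steady-state request rate for a synchronous Loadgen
    client: each thread sends lot_size requests, blocking roughly
    avg_latency_s per request, then sleeps for sleep_interval_ms."""
    cycle_s = lot_size * avg_latency_s + sleep_interval_ms / 1000.0
    return num_threads * lot_size / cycle_s

# e.g. 35 threads, lot size 1, 100 ms sleep, assumed 200 ms average latency
rate = offered_load_per_sec(35, 1, 100, 0.2)
```

With those illustrative numbers the client can sustain roughly 117 requests per second, which shows why raising the thread count or shortening the sleep interval increases the load on the BizTalk tier.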

For further information and examples on how to use BizUnit, Visual Studio 2005 Team Test Edition, and Loadgen together to automate testing, please see the Automating Performance and Stability Testing section of the BizTalk Server Performance Optimizations Guide at http://go.microsoft.com/fwlink/?LinkId=122844.

The following metrics are referenced in the summary tables:

  • Request-Response Duration Average (ms) – Because a two-way receive port was used, the BizTalk:Messaging Latency(BizTalkServerIsolatedHost)\Request-Response Latency (sec) counter was used to measure this parameter.

  • Documents/Sec – measured by the BizTalk:Messaging(BizTalkServerIsolatedHost)\Documents processed/Sec counter

  • BizTalk CPU Utilization – measured by the Processor(_Total)\% Processor Time counter

  • Hyper-V % Guest Run Time (BizTalk VM Instance) – this is the Hyper-V counter used to provide a non-skewed measurement of the performance of the virtual machine logical processor. This counter is collected from the root partition, i.e., the host Hyper-V server.

  • Processor(_Total)\% Processor Time on the SQL Server computer – this was used to measure the CPU utilization of the SQL Server
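These counters can be collected with standard tools such as logman or typeperf. The helper below is a hypothetical sketch that expands the metric list into full counter paths; the machine name, host instance name, and exact counter spellings are placeholders for a typical environment.

```python
# Hypothetical counter list; instance and category names are placeholders
# and should be verified against the actual Perfmon counters on each box.
COUNTERS = [
    r"BizTalk:Messaging Latency(BizTalkServerIsolatedHost)\Request-Response Latency (sec)",
    r"BizTalk:Messaging(BizTalkServerIsolatedHost)\Documents processed/Sec",
    r"Processor(_Total)\% Processor Time",
    # Collected on the Hyper-V host (root partition), not inside the guest:
    r"Hyper-V Hypervisor Virtual Processor(*)\% Guest Run Time",
]

def counter_paths(machine, counters=COUNTERS):
    """Prefix each counter with a machine name, producing the
    \\MACHINE\Category(Instance)\Counter form that logman/typeperf expect."""
    return [rf"\\{machine}\{c}" for c in counters]
```

Generating the paths programmatically keeps the counter set identical across the physical and virtual test runs, which is what makes the result sets directly comparable.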

The test scenarios described in this section were performed with BizTalk Server 2006 R2 and SQL Server 2005. BizTalk Server 2006 is fully supported when installed on a supported operating system that is running on Microsoft Virtual Server 2005 or on Windows Server 2008 Hyper-V. SQL Server 2005 is not fully supported on a virtual machine in a Windows Server 2008 Hyper-V environment. Microsoft is considering whether to provide support for SQL Server 2005 on Hyper-V virtual machines in future updates of SQL Server 2005. However, Microsoft Customer Services and Support provides support and technical assistance for customers who run SQL Server 2005 on a Hyper-V virtual machine exactly as it provides support and technical assistance for SQL Server 2005 on non-Microsoft hardware virtualization software as documented in Knowledge Base article 897615, “Support policy for Microsoft software running in non-Microsoft hardware virtualization software”, available at http://support.microsoft.com/?id=897615. For more information about the supportability of BizTalk Server and SQL Server running in a virtual machine environment, review the following Microsoft Knowledge Base articles: