
About Run Settings

Run settings are a set of properties that influence the way a load test runs. Run settings are organized by categories in the Properties window.

You can have more than one run setting in a load test. Only one of the run settings may be active for a load test run. The other run settings provide a set of easily accessible alternative settings to use for subsequent test runs. The active run setting is accessed by the RunSettings property of the LoadTest class. In the Load Test Editor, the active run setting is identified by the "[Active]" suffix. You can change the active run setting by right-clicking a run setting node and choosing Set As Active. You can also change the active run setting by selecting the root node in the Load Test Editor and choosing a run setting name from the drop-down list in the Properties window.

The run setting categories are described in the following sections:

General

Description

A description of the Run Settings.

Maximum Error Details

The maximum number of failed requests for which request and response details are stored. This is important because detailed error results can consume a large amount of database storage. If you do not want to record any error details, use a value of 0.
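The cap on stored error details can be pictured as a bounded buffer. This is an illustrative sketch only, not the actual results-store implementation; in particular, whether the store keeps the first or the most recent failures is an assumption here, not documented behavior.

```python
from collections import deque

# Illustrative model of Maximum Error Details (not the actual load test
# results store). A bounded deque keeps at most `maximum_error_details`
# entries; maxlen=0 records nothing, matching a setting of 0.
# Assumption: this sketch keeps the most recent failures.
def make_error_store(maximum_error_details):
    return deque(maxlen=maximum_error_details)

store = make_error_store(100)
for i in range(250):
    store.append({"request": f"/page/{i}", "status": 500})
print(len(store))  # 100: capped despite 250 failed requests
```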

Name

The name of the Run Setting as it appears in the Run Settings node of the Load Test Editor.

Validation Level

This defines the highest level of validation rule that runs in a load test. Validation rules are associated with Web test requests, and each rule has an associated validation level: High, Medium, or Low. This run setting specifies which validation rules run while the Web test runs in the load test. For example, if this run setting is set to Medium, all validation rules marked Medium or Low will run.
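The filtering described above can be modeled as a simple threshold comparison. This is an illustrative sketch, not the actual load test engine; the rule names are hypothetical.

```python
# Illustrative model of Validation Level filtering (not the actual
# Visual Studio load test engine). Levels are ordered Low < Medium < High.
LEVELS = {"Low": 0, "Medium": 1, "High": 2}

def rules_to_run(rules, run_setting_level):
    """Return the validation rules whose level is at or below the
    run setting's Validation Level."""
    threshold = LEVELS[run_setting_level]
    return [name for name, level in rules if LEVELS[level] <= threshold]

# Hypothetical rules attached to a Web test's requests.
rules = [("CheckTitle", "Low"), ("CheckForm", "Medium"), ("CheckBytes", "High")]
print(rules_to_run(rules, "Medium"))  # ['CheckTitle', 'CheckForm']
```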

Results

Storage Type

The way to store the performance counter data obtained in a load test. The options are None and Database; with Database, the results are stored in the load test results database.

Timing Details Storage

This is used to determine which details will be stored in the Load Test Results Store. There are three values:

  • None - Do not collect any individual timing values. This is the default value.

  • StatisticsOnly - Collect the individual timing values, but store only the computed statistics, not the individual values, for each test, transaction, and page run during the load test.

  • AllIndividualDetails - Collect and store the individual timing values for each test, transaction, and page run during the load test in the Load Test Results Store.

SQL Tracing

Minimum Duration of Traced SQL Operations

The minimum duration, in milliseconds, of a SQL operation to be captured by the SQL trace. This lets you ignore operations that complete quickly, for example, when you are trying to find SQL operations that are slow under load.
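The effect of the minimum-duration threshold can be sketched as a simple filter over traced operations. The operation names and durations below are hypothetical; this is not the actual SQL Trace mechanism.

```python
# Illustrative filter matching the setting described above (not the
# actual SQL Trace). Each entry is (operation, duration in milliseconds).
operations = [("SELECT fast", 12), ("SELECT slow", 540), ("UPDATE batch", 1300)]

minimum_ms = 500  # Minimum Duration of Traced SQL Operations

# Only operations at or above the threshold are kept in the trace.
slow = [(name, ms) for name, ms in operations if ms >= minimum_ms]
print(slow)  # [('SELECT slow', 540), ('UPDATE batch', 1300)]
```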

SQL Tracing Connect String

The connection string used to access the database to be traced.

SQL Tracing Directory

The location where the SQL Trace file is put after the trace ends. This directory must have write permissions for SQL Server and read permissions for the controller.

SQL Tracing Enabled

Set this to True to enable tracing of SQL operations. The default value is False.

For more information, see How to: Integrate SQL Trace Data.

Timing

Run Duration

The length of the load test run, in hh:mm:ss format.

Sample Rate

The interval at which to capture performance counter values, in hh:mm:ss format.

Warm up Duration

The period between the beginning of the test and the point at which data samples start being recorded, in hh:mm:ss format. A warm-up period is frequently used to step up the virtual user load to a certain level before sample values are recorded. The sample values that are captured before the warm-up period ends are shown in the Load Test Monitor.
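The interaction of Run Duration, Warm up Duration, and Sample Rate can be sketched as follows. This is an illustrative model, not the load test engine; it assumes, for the sketch, that the warm-up period precedes the run duration and that samples taken during warm-up are displayed but not recorded.

```python
# Illustrative sketch of Run Duration, Warm up Duration, and Sample Rate
# (hypothetical helper, not part of the load test API).
def parse_hms(s):
    """Parse an hh:mm:ss string into a number of seconds."""
    h, m, sec = (int(p) for p in s.split(":"))
    return h * 3600 + m * 60 + sec

run = parse_hms("00:10:00")      # Run Duration: 600 s
warm_up = parse_hms("00:02:00")  # Warm up Duration: 120 s
rate = parse_hms("00:00:15")     # Sample Rate: one sample every 15 s

# Assumption: warm-up precedes the run duration, so sampling spans both.
samples = list(range(rate, run + warm_up + 1, rate))
# Samples taken during warm-up are shown but not recorded in the results.
recorded = [t for t in samples if t > warm_up]
print(len(recorded))  # 40 samples over the 10-minute recorded portion
```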

WebTest Connections

WebTest Connection Model

This controls the usage of connections from the load test agent to the Web server for Web tests running inside a load test. There are two Web test connection model options: ConnectionPerUser and ConnectionPool.

  • The ConnectionPerUser model simulates the behavior of a user who is using a real browser. Each virtual user who is running a Web test uses one or two dedicated connections to the Web server. The first connection is established when the first request in the Web test is issued. A second connection may be used when a page contains more than one dependent request. These requests are issued in parallel using the two connections. These connections are reused for subsequent requests within the Web test. The connections are closed when the Web test finishes. A drawback to this model is that the number of connections held open on the agent computer might be high (up to two times the user load), and the resources required to support this high connection count might limit the user load that can be driven from a single load test agent.

  • The ConnectionPool model conserves the resources on the load test agent by sharing connections to the Web server among multiple virtual Web test users. If the user load is larger than the connection pool size, then the Web tests run by different virtual users will share a connection. This could mean that one Web test might have to wait before it issues a request when another Web test is using the connection. The average time that a Web test waits before submitting a request is tracked by the load test performance counter Average Connection Wait Time. This number should be less than the average response time for a page. If it is not, then the connection pool size is probably too small.

WebTest Connection Pool Size

This specifies the maximum number of connections to make between the load test agent and the Web server. This only applies to the ConnectionPool model.
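The sizing guidance above (keep Average Connection Wait Time below the average page response time) can be expressed as a simple check. The counter values here are hypothetical, and the function is an illustrative sketch rather than part of the load test toolset.

```python
# Illustrative check of the ConnectionPool sizing guidance (hypothetical
# counter values): if the average connection wait time approaches or
# exceeds the average page response time, the pool is probably too small.
def pool_probably_too_small(avg_connection_wait_s, avg_page_response_s):
    return avg_connection_wait_s >= avg_page_response_s

print(pool_probably_too_small(0.8, 2.5))  # False: pool size is adequate
print(pool_probably_too_small(3.1, 2.5))  # True: consider a larger pool
```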

Setting and Changing a Run Setting

When you create a load test using the Load Test Wizard, you create your initial run settings. For more information, see How to: Specify Run Settings.

After you create your load test, you can change your run settings in the Load Test Editor. For more information, see How to: Change the Run Settings.

© 2014 Microsoft