Understanding and Managing Connections in Azure Cache
Updated: July 17, 2010
|For guidance on choosing the right Azure Cache offering for your application, see Which Azure Cache offering is right for me?.|
Applications that use Microsoft Azure Cache must understand how connections are opened to the target cache. This is especially important for Microsoft Azure Shared Caching, because each Shared Caching offering has a different limit on the number of connections that can be open simultaneously to the same cache. For more information, see the Azure Shared Caching FAQ.
In-Role Cache on roles does not have quotas for connections, because the cache runs on virtual machines in your own deployment. However, even in this dedicated or co-located scenario, connections are an important resource, because they are expensive to create. Sharing connections can improve performance.
The way connections are managed depends on whether connection pooling is enabled or disabled. The following sections describe the two possible scenarios and their impact on connection accounting.
Connection Pooling Enabled
Beginning with the November 2011 release of the SDK, caching has the ability to pool connections. When connection pooling is configured, the same pool of connections is shared for a single application instance. The number of shared connections is determined by the maxConnectionsToServer configuration setting (the default is 1). The following example shows how to change this setting to 2 in an application configuration file.
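A minimal sketch of such a configuration file follows. It assumes the same host and port conventions as the Shared Caching example later in this topic; ExampleCache is a placeholder cache name.

```xml
<dataCacheClients>
  <dataCacheClient name="default" maxConnectionsToServer="2">
    <hosts>
      <host name="ExampleCache.cache.windows.net" cachePort="22233" />
    </hosts>
  </dataCacheClient>
</dataCacheClients>
```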
Each application instance shares the connections specified by maxConnectionsToServer, regardless of how many DataCacheFactory objects are created. So the formula for the total number of connections when using connection pooling is the maxConnectionsToServer value multiplied by the number of running instances of that role. For example, with maxConnectionsToServer set to 2 and three running instances of the role, the cache sees a total of 2 * 3 = 6 connections.
The previous example assumes that a single cache client configuration is being used. Consider the following Shared Caching configuration file that contains two named dataCacheClient sections.
<dataCacheClients>
  <dataCacheClient name="default" maxConnectionsToServer="2">
    <hosts>
      <host name="ExampleCache.cache.windows.net" cachePort="22233" />
    </hosts>
    <!-- Other Sections -->
  </dataCacheClient>
  <dataCacheClient name="SslEndpoint" maxConnectionsToServer="3">
    <hosts>
      <host name="ExampleCache.cache.windows.net" cachePort="22243" />
    </hosts>
    <!-- Other Sections -->
  </dataCacheClient>
</dataCacheClients>
In this example, the application could create multiple DataCacheFactory instances, some using the "default" configuration and some using the "SslEndpoint" configuration. Each connection pool is tied to a single configuration. So even though both configurations point to the same cache, ExampleCache, the total number of connections possible to this cache is the sum of their maxConnectionsToServer values, which is 5 per role instance.
When you use In-Role Cache hosted on roles, connection pooling is not always enabled by default, even when you use the application configuration file. This is a known issue with role-based In-Role Cache. To enable connection pooling with role-based In-Role Cache, you must explicitly set useLegacyProtocol to false. The following configuration section shows this setting being used, where the role that hosts In-Role Cache is named WebRole1.
<dataCacheClients>
  <tracing sinkType="DiagnosticSink" traceLevel="Error" />
  <dataCacheClient name="default" useLegacyProtocol="false">
    <autoDiscover isEnabled="true" identifier="WebRole1" />
  </dataCacheClient>
</dataCacheClients>
In this example, setting useLegacyProtocol to false ensures the use of connection pooling. Otherwise, connection pooling is not enabled by default for role-based In-Role Cache.
|When using the application configuration file, connection pooling is enabled by default in the latest SDK. If you are programmatically configuring the DataCacheFactoryConfiguration, connection pooling might not be enabled by default, and some actions disable connection pooling. Please see the Programmatic Configuration of Connection Pooling section in this topic.|
Connection Pooling Disabled
When connection pooling is disabled, each DataCacheFactory object uses one connection. It is important to initialize and store your DataCacheFactory instances, both to control the number of open connections and to achieve the best performance.
When you are not using connection pooling, the number of connections required for a cache is defined by the following formula:
[DataCacheFactory instances] * [MaxConnectionsToServer setting] * [Azure role instance count]
By default, maxConnectionsToServer is 1. You can increase this setting to improve performance if you are sharing a DataCacheFactory object across threads. For example, if maxConnectionsToServer is 2, then each DataCacheFactory object uses two connections. In this scenario, multiple active DataCacheFactory objects each use the number of connections specified by maxConnectionsToServer. For example, if this value is 2 and there are two DataCacheFactory instances, a total of four connections are used per role instance. If there are three instances of this role running, then the total number of connections increases to 12.
In the past, this was the default behavior. With the latest SDK, connection pooling is enabled by default if you are using an application configuration file. To disable connection pooling in the configuration file, set the connectionPool attribute to false. The following configuration file shows this setting.
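A sketch of the configuration follows. It assumes the connectionPool attribute is set on the dataCacheClient element and reuses the placeholder host from the earlier Shared Caching example.

```xml
<dataCacheClients>
  <dataCacheClient name="default" connectionPool="false">
    <hosts>
      <host name="ExampleCache.cache.windows.net" cachePort="22233" />
    </hosts>
  </dataCacheClient>
</dataCacheClients>
```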
Connection pooling is not enabled by default if you are programmatically configuring your cache client. For more information about programmatic configuration and connection pooling, see the next section, Programmatic Configuration of Connection Pooling.
Programmatic Configuration of Connection Pooling
When you programmatically configure a cache client without using any Cache configuration file settings, connection pooling is not enabled by default. Special steps have to be taken to enable connection pooling through code.
1. Create a DataCacheFactoryConfiguration object. Configure the standard settings, such as Servers and SecurityProperties.
2. Call the static method DataCacheFactoryConfiguration.CreateNamedConfiguration, passing a new configuration name, the previously created DataCacheFactoryConfiguration object, and a Boolean flag that indicates whether connection pooling is enabled (true) or disabled (false).
3. Create a new DataCacheFactoryConfiguration object, passing the constructor the name of the configuration that has connection pooling enabled. This is the name you specified in the previous step.
4. Create a DataCacheFactory object that uses that configuration.
DataCacheFactoryConfiguration config = new DataCacheFactoryConfiguration();

// Configure the DataCacheFactoryConfiguration with appropriate settings for your cache here:
// ...

// Set MaxConnectionsToServer to control the size of the connection pool.
config.MaxConnectionsToServer = 3;

// Create a named configuration from this configuration and enable connection pooling.
DataCacheFactoryConfiguration.CreateNamedConfiguration("MyConfigWithConnectionPooling", config, true);

// Create a DataCacheFactoryConfiguration that uses the new named configuration,
// which has connection pooling enabled.
DataCacheFactoryConfiguration configWithPooling = new DataCacheFactoryConfiguration("MyConfigWithConnectionPooling");

// Use the new named configuration in the call to DataCacheFactory.
DataCacheFactory factory = new DataCacheFactory(configWithPooling);
Note that when you create a DataCacheFactoryConfiguration object, it is also possible to initialize the configuration from the settings in an application configuration file. This is a combination of approaches that uses both the configuration file and code to configure the cache client. If the constructor is empty, the "default" configuration section is read. If a string is passed to the constructor, then that named dataCacheClient section is applied. In this scenario, you can enable connection pooling from the configuration file rather than by calling CreateNamedConfiguration.
|When you initialize the DataCacheFactoryConfiguration from an application configuration file, changes you make to that configuration could result in disabling connection pooling. These settings include the server, security, compression, maxConnectionsToServer, and transport properties. In that case, you would have to use the CreateNamedConfiguration method as described previously to create a new configuration based on your modified configuration with connection pooling enabled.|
When you enable In-Role Cache on roles, there are no quota restrictions on connections other than the underlying physical resources of the clients and servers. However, it is still recommended to use connection pooling, which makes connections easier to reuse and manage and improves performance.
If connection pooling is enabled, you should then analyze whether the default maxConnectionsToServer value of 1 is appropriate for your application. If you are using DataCacheFactory objects in multiple threads, you might achieve better performance by increasing the number of connections in the pool to more than one. You should then calculate how this affects your total connection requirements, based on the number of role instances that will run that code.
As noted, each separate named configuration gets its own connection pool. This is important to understand when calculating your total connections to the cache.
It is important to recognize the difference in the default connection pooling behavior when using the application configuration file settings versus programmatic configuration for your cache client. These differences are described in the previous sections of this topic. Due to the complexity involved in the programmatic configuration, it is easier to use connection pooling by configuring your cache through the configuration files (app.config or web.config).
With past SDK versions, or when connection pooling is manually disabled, you should create, store, and reuse a minimal number of DataCacheFactory objects in your code. This improves performance by avoiding the cost of establishing new connections for each cache operation, and it helps you manage the number of connections. Again, if you share the same DataCacheFactory objects across threads, you could see a performance increase by increasing the maxConnectionsToServer value. When connection pooling is disabled, you should carefully monitor the number of active DataCacheFactory objects, the maxConnectionsToServer setting, and the number of role instances; together, these factors determine the number of active connections used.
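The create-store-reuse pattern described above is commonly implemented with a single, lazily initialized factory that is shared across the application. The following sketch assumes the Azure Caching client assembly (Microsoft.ApplicationServer.Caching) is referenced; the wrapper class name is illustrative, not part of the SDK.

```
using System;
using Microsoft.ApplicationServer.Caching;

public static class CacheConnection
{
    // A single DataCacheFactory is created on first use and then reused,
    // so the application opens one set of connections instead of one per operation.
    // Lazy<T> initialization is thread-safe by default.
    private static readonly Lazy<DataCacheFactory> _factory =
        new Lazy<DataCacheFactory>(() => new DataCacheFactory());

    public static DataCache GetCache()
    {
        // GetDefaultCache returns the cache described by the
        // "default" dataCacheClient configuration section.
        return _factory.Value.GetDefaultCache();
    }
}
```

Callers then use CacheConnection.GetCache() for every cache operation, so the number of connections stays bounded by maxConnectionsToServer regardless of how many threads perform cache calls.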