Performance Comparison: Security Design Choices


Priya Dhawan
Microsoft Developer Network

October 2002

Applies to:
   Microsoft® ASP.NET
   Microsoft® .NET Framework SP1
   Microsoft® Windows® 2000 Advanced Server SP2
   Microsoft® SQL Server™ 2000 Enterprise Edition SP2
   Microsoft Application Center Test (ACT)

Summary: Compares the relative performance of various security options available for client authentication, hashing algorithms, cryptography techniques, and digital signatures. (19 printed pages)


Test Tools and Strategy
Machine Configuration
Performance Test Results


Design choices for securing a system affect performance, scalability, and usability. There is usually a tradeoff between security on the one hand and performance and usability on the other: the more secure a system is, the more you compromise in terms of performance and usability. When designing a secure system, you should determine all the possible threats, vulnerabilities, and attacks, and choose the techniques to implement security based on threat mitigation first and performance second.

This article compares the relative performance of various security options available for client authentication, hashing algorithms, cryptography techniques, and digital signatures. For simplicity we have isolated these different categories of security and restricted the performance comparison to the options available with each category; of course in a real secure system, the overall security will be the combination of one or more of these categories.

Test Tools and Strategy

For our tests, we used Microsoft Application Center Test (ACT), which is designed to stress test Web servers and analyze performance and scalability problems with Web applications, including ASPX pages and the components they use. Refer to the ACT documentation for details on how to create and run tests. Application Center Test can simulate a large group of users by opening multiple connections to the server and rapidly sending HTTP requests. It also lets us build realistic test scenarios in which the same method is called with a randomized set of parameter values. This is an important feature, because real users do not call the same method with the same parameter values over and over again. Application Center Test also records test results that provide the most important information about the performance of the Web application.

ACT supports Basic, Kerberos, and Digest authentication. It does not handle ASP.NET Forms authentication automatically, so we explicitly edited the body of the request to mimic a client-side form submission.

We used a separate machine as the domain controller. Ten thousand user accounts were held in Active Directory, and the same number of users was created for Application Center Test, which picked users at random while a test was being run.

Machine Configuration

The following tables provide a brief summary of the test bed configuration used to perform the tests.

Table 1. Client Machine Configuration

  Number of clients: 1
  Machine/CPU: Compaq Proliant, 1130 MHz
  # of CPUs: 2
  Memory: 1 GB
  Disk: 16.9 GB
  Software:
    • Windows 2000 Advanced Server SP2
    • Application Center Test

Table 2. Web Server Configuration

  Number of servers: 1
  Machine/CPU: Compaq Proliant, 1000 MHz
  # of CPUs: 2
  Memory: 1 GB
  Disk: 16.9 GB
  Software:
    • Windows 2000 Advanced Server SP2
    • .NET Framework SP1

Table 3. Application Server Configuration

  Number of servers: 1
  Machine/CPU: Compaq Proliant, 1126 MHz
  # of CPUs: 2
  Memory: 1 GB
  Disk: 16.9 GB
  Software:
    • Windows 2000 Advanced Server SP2
    • .NET Framework SP1

Table 4. Database Server Configuration

  Number of servers: 1
  Machine/CPU: Compaq Proliant, 700 MHz
  # of CPUs: 4
  Memory: 4 GB
  Disk: 18 GB
  Software:
    • Windows 2000 Advanced Server SP2
    • SQL Server Enterprise Edition SP2

Performance Test Results

Throughput and latency are the key performance indicators. For a given amount of data being returned, throughput is the number of client requests processed within a certain unit of time, typically a second. Because peak throughput may occur at a response time that is unacceptable from a usability standpoint, we also tracked latency, measured as response time, using the report generated by Application Center Test for each of the tests run.

Note   The performance numbers generated for throughput and latency act merely as a basis to compare these technologies. They do not represent absolute performance.

Client Authentication

A server authenticates a client by accepting its credentials and validating those credentials against some designated authority. We will focus on the authentication modes Microsoft® ASP.NET supports, including Microsoft IIS 5.0 authentication modes and ASP.NET Forms authentication. Please refer to ASP.NET Web Application Security in the .NET Framework Developer's Guide for details.

Get Default Page

The test had a single ACT user send a single request to the Web server. Upon requesting the page, the user was asked to authenticate by providing a username and password. Once the user was authenticated, a page containing a simple string was returned.

Note   This test does not model a scenario wherein, after being authenticated on the first request, the user sends subsequent requests presenting to the Web server some sort of ticket to show that it has already been authenticated.


Figure 1. Authentication modes: RPS and response time


  • Anonymous: No authentication is performed.
  • Basic, Digest, and Kerberos: User accounts were stored in Active Directory, which was on a separate box from the Web server.
  • FormsAuth_AD: Uses ASP.NET Forms authentication, with user accounts in Active Directory.
  • FormsAuth_SQL: Uses ASP.NET Forms authentication, with user accounts stored in SQL Server 2000. For security reasons, passwords should not be stored in clear text in the database. Rather, you should generate and store a one-way hash of the user's password combined with a salt (a cryptographically generated random number) for added security and to mitigate the threat of dictionary attacks. This approach is preferable to storing an encrypted version of the user's password because it avoids the key management issues associated with encryption techniques.
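The salted one-way hash scheme described above can be sketched with Java's standard `MessageDigest` and `SecureRandom` APIs (the class and method names here are our own illustration, not from the tested system):

```java
import java.security.MessageDigest;
import java.security.SecureRandom;

// Sketch of the salted one-way hash scheme described above:
// store hash(salt + password) plus the salt, never the clear-text password.
public class PasswordStore {
    // Generate a random salt and the SHA-1 hash of salt + password.
    // Returns { salt, hash } as the pair a database row would hold.
    public static byte[][] create(String password) throws Exception {
        byte[] salt = new byte[16];
        new SecureRandom().nextBytes(salt);   // cryptographically random salt
        return new byte[][] { salt, hash(salt, password) };
    }

    // Re-hash the supplied password with the stored salt and compare.
    public static boolean verify(String password, byte[] salt, byte[] stored)
            throws Exception {
        return MessageDigest.isEqual(stored, hash(salt, password));
    }

    private static byte[] hash(byte[] salt, String password) throws Exception {
        MessageDigest sha1 = MessageDigest.getInstance("SHA-1");
        sha1.update(salt);                    // salt first, then the password
        return sha1.digest(password.getBytes("UTF-8"));
    }
}
```

In the FormsAuth_SQL mode, the salt and hash would live in the SQL Server table and the verify step would run on each login.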

As you would expect, the Anonymous authentication mode, in which no authentication is performed, offers the best performance. The Web server does not request the client to send user credentials before the client can view the Web page. This mode is a good choice when you want your site to be publicly available.

With all the other authentication modes, the client is required to send additional authentication messages, which takes additional round trips to the Web server. In Basic, Digest, and Kerberos authentication, the flow of HTTP headers looks like:


Figure 2. Authentication header flow

As shown in Figure 1, the Digest and Kerberos authentication modes are very similar in performance, though they have different overheads associated with them. In Digest authentication, the server sends a challenge (a nonce) to the client, asking for the username and password. A one-way hash of the password (combined with the resource being accessed and the realm) is used to encrypt the nonce, which is then sent to the server, where the client gets authenticated. The password is not sent in clear text, which certainly is an advantage over Basic authentication. The biggest shortcoming of Digest authentication, despite its being an industry standard, is that only a few browsers and Web servers support it, which limits its widespread use. Microsoft® Internet Explorer 5.0 was the first browser to adopt it, along with IIS 5.0 and later. It works with Windows 2000 domain accounts only and requires the accounts to store passwords as encrypted clear text (note that this is not the case with Windows .NET Server).

In Kerberos authentication, the browser attempts to use the user's credentials from the domain logon. If the server rejects those credentials, the client is prompted for a username and password via a dialog box. The credentials are sent to the Ticket Granting Service, which authenticates them and issues a Kerberos ticket to the client. This ticket is a temporary certificate containing information that identifies the user to the network server. Typically, the client caches the ticket and reuses it for subsequent requests to the server until it expires.

As you see in Figure 1, Basic authentication and FormsAuth_SQL perform almost identically. With Basic, the client sends the user credentials to the Web server, which makes a round trip to the domain controller to authenticate the user. Remember that Basic authentication is extremely insecure because the password is passed over the network in clear text (actually it is base64-encoded, which can very easily be decoded). To make Basic more secure, you can use SSL to establish a secure session. But it still is not as secure as a real network authentication protocol like Kerberos or Digest, which do not send the user's secret to the remote server.
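The point about Base64 is easy to demonstrate: the `Authorization: Basic` header is mere encoding, not encryption, and anyone who intercepts it can recover the credentials (a hypothetical credential pair, shown with Java's standard `Base64` class for illustration):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Basic authentication merely Base64-encodes "user:password" -- it is
// encoding, not encryption, and is trivially reversible.
public class BasicAuthDemo {
    // Build the Authorization header value a client would send.
    public static String encode(String user, String password) {
        String pair = user + ":" + password;
        return "Basic " + Base64.getEncoder()
                .encodeToString(pair.getBytes(StandardCharsets.UTF_8));
    }

    // Recover "user:password" from an intercepted header value.
    public static String decode(String header) {
        String b64 = header.substring("Basic ".length());
        return new String(Base64.getDecoder().decode(b64), StandardCharsets.UTF_8);
    }
}
```
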

The flow of HTTP headers for ASP.NET Forms authentication looks like:


Figure 3. Authentication header flow

Note that ASP.NET Forms authentication is slower than all of the Windows authentication schemes, probably because it involves a couple of redirections before a page can be viewed. In the case of FormsAuth_AD, the system programmatically looks up the user in Active Directory; if a username with the provided password exists, the user is authenticated. Similarly, in the case of FormsAuth_SQL, the system calls a SQL stored procedure to look up the user in the database; if the query succeeds, the user is authenticated. We didn't store passwords in clear text in the database; rather, we stored a one-way hash of the user's password combined with a salt, using SHA1 to generate the hash. So in the FormsAuth_SQL mode, when a user submits credentials, the system first retrieves the hash and salt associated with that user from the database. It then generates the hash of the user-supplied password and the salt it just retrieved. If the two hashes match, the user is authenticated. This process consumes extra cycles, since an SHA1 hash is computed every time a user is authenticated.

ASP.NET Forms authentication is as insecure as Basic, since the password is sent in clear text over the network. There is a difference, though: Forms authentication sends credentials once and then uses an encrypted ticket thereafter, whereas Basic authentication sends credentials with each request. To make Forms authentication more secure you can use SSL to establish a secure session, though there will be an impact on performance. Using Forms authentication without SSL also makes it susceptible to replay attacks.

Take a look at Authentication in ASP.NET: .NET Security Guidance, which discusses the advantages and disadvantages of these authentication schemes and the environment they are best suited for.

Cryptography Techniques

Cryptography techniques provide data privacy, tamper detection, and authentication by encrypting the data transmitted between the server and client, assuming there is a pre-shared secret between them that has not been exposed. We will focus on hashing algorithms, including SHA1 and MD5; symmetric algorithms, including DES, RC2, 3DES, and Rijndael; and asymmetric algorithms, including RSA and DSA. For details, see Cryptography Overview in the .NET Framework Developer's Guide.

Hashing Algorithms

Hash algorithms map a piece of data of arbitrary size to a small value of fixed length; a good hash algorithm makes it computationally infeasible to find two inputs that produce the same value. We will compare the SHA1 and MD5 algorithms. For details, see Cryptography Overview in the .NET Framework Developer's Guide.


The method computes the hash of data stored in a file. We performed the tests with a data size of 4 KB, 135 KB, and 1 MB to see how the size of data impacts performance.
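The tested method can be sketched as follows (shown in Java against a buffer standing in for the file contents; the JCA algorithm names `MD5`, `SHA-1`, and `SHA-512` correspond to the .NET classes named later in this section):

```java
import java.security.MessageDigest;

// Hash a data buffer with each algorithm under test; for a file, the
// bytes would first be read from disk into the buffer.
public class HashDemo {
    public static byte[] digest(String algorithm, byte[] data) throws Exception {
        MessageDigest md = MessageDigest.getInstance(algorithm);
        return md.digest(data);   // one-shot hash of the whole buffer
    }

    public static void main(String[] args) throws Exception {
        byte[] data = new byte[4 * 1024];   // stands in for a 4 KB file
        for (String alg : new String[] { "MD5", "SHA-1", "SHA-512" }) {
            System.out.println(alg + ": "
                    + digest(alg, data).length * 8 + "-bit hash");
        }
    }
}
```

Note that the output size is fixed per algorithm regardless of input size; only the computation time grows with the data, which is what the 4 KB/135 KB/1 MB tests measure.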


Figure 4. Hash algorithms (4 KB): RPS and response time


  • The .NET Framework supports various hash algorithms, including MD5, SHA1, SHA256, SHA384, and SHA512. The only difference between the various SHA implementations is the hash size that they produce. Of the SHA family, we opted to include only SHA1 and SHA512 in our tests.
  • We used the System.Security.Cryptography namespace, which provides various implementations of SHA1 and MD5.
  • There is just one implementation of MD5 available in System.Security.Cryptography: MD5CryptoServiceProvider, which wraps CAPI.
  • SHA256, SHA384, and SHA512 are not currently available in CryptoAPI; these algorithms are implemented directly in managed code. They were added to support the new key generation requirements of AES, not to provide stronger algorithms than SHA1. The current belief is that SHA1 is more than adequate for hashing data.
  • For SHA1 and SHA512, we used the managed implementations SHA1Managed and SHA512Managed, respectively, available in System.Security.Cryptography.

As shown in Figure 4, all the algorithms are very similar in performance, with SHA512 slightly behind. MD5 produces a 128-bit hash. The computation process in SHA is very much modeled after MD5; SHA1 produces a 160-bit hash.

Longer hash sizes are harder to attack using brute force methods. SHA512 generates a 512-bit hash, SHA1 a 160-bit hash, and MD5 a 128-bit hash; therefore SHA1 is harder to brute-force than MD5. This also assumes no weaknesses in the algorithms themselves.


Figure 5. Hash algorithms (135 KB): RPS and response time

As the size of the data increases, the performance difference between the various algorithms widens. At 5 concurrent users, MD5 is around 33% faster than SHA1. Although there is not yet a known practical attack on MD5, there are theoretical collision weaknesses that could be exploited against it.

The performance of SHA512 degrades further with more data; it is around 55% slower than SHA1.

Bear in mind that a longer hash size provides greater security at the cost of performance, as I mentioned earlier.


Figure 6. Hash algorithms (1 MB): RPS and response time

The performance difference between the algorithms widens even more as the data size increases.

MD5 is around 43% faster than SHA1 at a user load of 5 concurrent users (at other user loads it is around 20% faster). SHA1 is around 72% faster than SHA512.

Symmetric Key Algorithms

The method that was tested encrypts the data first and then decrypts the encrypted bytes. We performed the tests with a data size of 4 KB, 100 KB, and 500 KB to see how the size of data impacts performance.


Figure 7. Symmetric key algorithms (4 KB): RPS and response time


  • We used the managed wrappers for DES, 3DES, and RC2 available in System.Security.Cryptography, which wrap the unmanaged implementations available in CryptoAPI: DESCryptoServiceProvider, TripleDESCryptoServiceProvider, and RC2CryptoServiceProvider, respectively. Only a pure managed implementation of Rijndael is available in System.Security.Cryptography, and it was used in the tests.
  • The key and block sizes used by the algorithms to encrypt and decrypt the data:

    Algorithm    Key Size (bits)    Block Size (bits)
    DES          64                 64
    3DES         192                64
    RC2          128                64
    Rijndael     256                128
  • 3DES, RC2, and Rijndael also support other key lengths, but we chose to encrypt and decrypt data with the maximum key length supported by each of them. Since a longer key requires more time and effort to attack, and therefore provides better mitigation, this lets us measure performance when each algorithm offers its maximum security.
  • Longer key lengths provide greater security because they increase the number of possible key combinations, decreasing the chance of a successful key-search attack.
  • Different algorithms with the same key length (say, 128 bits) do not necessarily provide the same strength.
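The encrypt-then-decrypt round trip tested here can be sketched with the standard Java cipher APIs (`DESede` is the standard JCA name for 3DES; ECB mode is used only to keep the sketch short — a real system would use CBC with a random IV):

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;

// Encrypt a buffer and then decrypt the resulting bytes, as the tested
// method does. ECB is used here purely for brevity; production code
// should use a chained mode (e.g. CBC) with a random IV.
public class SymmetricRoundTrip {
    public static byte[] roundTrip(String algorithm, int keyBits, byte[] data)
            throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance(algorithm);
        kg.init(keyBits);                    // e.g. 168 for DESede (3DES)
        SecretKey key = kg.generateKey();

        Cipher enc = Cipher.getInstance(algorithm + "/ECB/PKCS5Padding");
        enc.init(Cipher.ENCRYPT_MODE, key);
        byte[] cipherText = enc.doFinal(data);

        Cipher dec = Cipher.getInstance(algorithm + "/ECB/PKCS5Padding");
        dec.init(Cipher.DECRYPT_MODE, key);
        return dec.doFinal(cipherText);      // must equal the input
    }
}
```

The .NET wrappers named in the bullets above sit over the same underlying algorithms; the per-request cost being measured is the key setup plus the encrypt and decrypt passes over the buffer.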

With small data, we find that Rijndael, the AES (Advanced Encryption Standard) algorithm, is the fastest of the methods tested. It has a variable block length and key length, each of which may be 128, 192, or 256 bits. It also has a variable number of rounds to produce the cipher text, depending on the key length and the block length.

DES encrypts and decrypts data in 64-bit blocks using a 64-bit key (of which 56 bits are effective). Each block of data is iterated over 16 rounds to produce the cipher text. Though DES is faster than 3DES and RC2, its short key length makes it vulnerable to a brute force attack, and it is becoming more easily breakable with today's progressively faster and more powerful computers.

Triple DES (3DES) was invented to improve the security of DES by applying DES encryption three times using three different keys (note that encrypting data three times with the same key offers no extra value). It is simply another mode of DES: more secure, and therefore slower. It takes a 192-bit key, which is broken into three 64-bit subkeys used in the encryption procedure. The procedure is exactly like DES, but repeated three times, making it much more secure: the data is encrypted with the first subkey, decrypted with the second subkey, and encrypted again with the third subkey.

RC2 turns out to be the slowest method when the data being encrypted is small. It performs an expensive computation up front to build a key-dependent table, a cost that is high compared to the cost of encrypting a small amount of data. RC2 is a variable key-length symmetric block cipher designed as an alternative to DES.


Figure 8. Symmetric key algorithms (100 KB): RPS and response time

By increasing the size of data being encrypted and decrypted, we see an entirely different picture from the previous test. DES is the fastest, followed by RC2, which is around 20% faster than 3DES. Note that the expensive key-table computation in RC2 mentioned in the previous test is now amortized over more data. Rijndael in this case is the slowest, 25% slower than 3DES. Note that we are using a 256-bit key for Rijndael encryption, which makes it stronger than the other methods (though there has been some press about possible attacks against Rijndael that might be better than brute force) and, for the same reason, the slowest of all. Similarly, we used a 192-bit key for 3DES, which is longer than the keys used for DES and RC2 encryption.

One point I would like to mention again is that using the same key length does not necessarily mean that different algorithms will have the same strength. Different algorithms have different characteristics and hence may not provide the same strength.

As I mentioned earlier in the article, there is always a tradeoff between security and performance. You need to understand the value of your data, the deployment cost, and the usability/performance tradeoffs before you can choose the right algorithm for securing your data. If the value of the data you are protecting is high, you must consider taking a performance hit to secure it; otherwise, you may be better off using a less secure algorithm.


Figure 9. Symmetric key algorithms (500 KB): RPS and response time

As the size of data being encrypted and decrypted increases further, the same trend prevails, though the performance difference between the ciphers widens, except between 3DES and Rijndael.

Asymmetric Key Algorithms

Encryption using asymmetric key algorithms is very slow, especially when the data size is large, so they are not used for bulk encryption. For bulk encryption, symmetric algorithms should be used; the asymmetric algorithms are better suited to key exchange.
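This hybrid pattern — the slow asymmetric cipher protecting only the small symmetric key, which in turn protects the bulk data — can be sketched with the standard Java wrap/unwrap APIs (a minimal illustration of the pattern, not a complete key-exchange protocol; key sizes here are illustrative):

```java
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;

// Hybrid pattern: the asymmetric cipher encrypts only the small
// symmetric key; the fast symmetric cipher then handles the bulk data.
public class KeyExchangeSketch {
    public static SecretKey unwrapAfterWrap() throws Exception {
        KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA");
        kpg.initialize(2048);
        KeyPair rsa = kpg.generateKeyPair();

        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(128);
        SecretKey bulkKey = kg.generateKey();  // symmetric key for bulk data

        // Sender: encrypt (wrap) the bulk key with the recipient's public key.
        Cipher wrap = Cipher.getInstance("RSA/ECB/PKCS1Padding");
        wrap.init(Cipher.WRAP_MODE, rsa.getPublic());
        byte[] wrapped = wrap.wrap(bulkKey);

        // Recipient: recover the bulk key with the private key.
        Cipher unwrap = Cipher.getInstance("RSA/ECB/PKCS1Padding");
        unwrap.init(Cipher.UNWRAP_MODE, rsa.getPrivate());
        return (SecretKey) unwrap.unwrap(wrapped, "AES", Cipher.SECRET_KEY);
    }
}
```

Only a few hundred bytes ever pass through the asymmetric cipher, which is why the hybrid approach avoids the bulk-encryption cost described above.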

The two common asymmetric algorithms are RSA and DSA. RSA can be used for both encryption and signature generation; DSA, on the other hand, can only be used to generate signatures. We compared the RSA and DSA algorithms on how fast they generate a digital signature and how fast they verify one.

Create Signature


Figure 10. Create signature (100 KB): RPS and response time


  • We used RSACryptoServiceProvider and DSACryptoServiceProvider available in System.Security.Cryptography, which are managed wrappers around the unmanaged RSA implementation and unmanaged DSA implementation, respectively, provided by CryptoAPI.
  • RSA used a 1024-bit key.
  • DSA used a 512-bit key.
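The tested operations map directly onto the standard signature APIs, sketched here in Java (the JCA names `SHA1withRSA` and `SHA1withDSA` denote the same signature schemes compared in this section; 1024-bit keys are used for both so the sketch runs on current JDKs, which may reject the article's 512-bit DSA keys):

```java
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.Signature;

// Create and verify a digital signature with either RSA or DSA:
// the data is digested with SHA-1 and the digest is signed.
public class SignatureDemo {
    public static boolean signAndVerify(String alg, int keyBits, byte[] data)
            throws Exception {
        KeyPairGenerator kpg = KeyPairGenerator.getInstance(alg);
        kpg.initialize(keyBits);
        KeyPair pair = kpg.generateKeyPair();

        // Create: the private key signs the message digest.
        Signature signer = Signature.getInstance("SHA1with" + alg);
        signer.initSign(pair.getPrivate());
        signer.update(data);
        byte[] sig = signer.sign();

        // Verify: the public key checks the signature against the data.
        Signature verifier = Signature.getInstance("SHA1with" + alg);
        verifier.initVerify(pair.getPublic());
        verifier.update(data);
        return verifier.verify(sig);
    }
}
```

The two halves of this method correspond to the "Create Signature" and "Verify Signature" tests below.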

As shown in Figure 10, DSA is around 29% faster than RSA when generating a digital signature. In the RSA digital signature process, the private key is used to encrypt only the message digest; the encrypted digest becomes the digital signature.

Although similar to RSA, DSA does not encrypt message digests with the private key or decrypt the message digest with the public key. Instead, DSA uses special mathematical functions to generate a digital signature composed of two 160-bit numbers that are derived from the message digest and the private key.


Figure 11. Create signature (500 KB): RPS and response time

With more data, DSA is still faster than RSA.

Verify Signature


Figure 12. Verify signature (100 KB): RPS and response time

The results are reversed when a digital signature is verified: RSA is faster than DSA by around 29%. RSA uses the public key to verify the signature. It generates a new message digest from the data that was received, decrypts the original message digest with the originator's public key, and compares the decrypted digest with the newly generated one. If the two digests match, the integrity of the message is verified. The identity of the originator is also confirmed, because the public key can decrypt only data that was encrypted with the corresponding private key.

DSA also uses the public key to verify the signature, but the verification process is more complex than RSA.


Figure 13. Verify signature (500 KB): RPS and response time

With more data, the performance difference between the two algorithms has become negligible.


Conclusion

As these tests demonstrate, authentication schemes, hashing algorithms, and cryptography techniques carry varying amounts of overhead and therefore have vastly different performance characteristics. The size of the data being passed to hashing algorithms, as well as to cryptography techniques, is also significant.

When designing a secure system, the implementation techniques should be chosen based on threat mitigation first and performance second. For instance, Basic authentication without SSL could be used for better performance, but no matter how fast it is, it would not be useful in a system vulnerable to threats it does not mitigate.

This article does not cover the overall performance impact of combining authentication and data privacy, which is how security is built into a real system. The performance of a secure system will vary depending on the combination of the various schemes being used.