UTF7Encoding.GetByteCount Method (String)

 

Calculates the number of bytes produced by encoding the characters in the specified String object.

Namespace:   System.Text
Assembly:  mscorlib (in mscorlib.dll)

[ComVisibleAttribute(false)]
public override int GetByteCount(
	string s
)

Parameters

s
Type: System.String

The String object containing the set of characters to encode.

Return Value

Type: System.Int32

The number of bytes produced by encoding the specified characters.

Exceptions
ArgumentNullException

s is null (Nothing in Visual Basic).

ArgumentOutOfRangeException

The resulting number of bytes is greater than the maximum number that can be returned as an int.

EncoderFallbackException

A fallback occurred (for a complete explanation, see Character Encoding in the .NET Framework)

-and-

EncoderFallback is set to EncoderExceptionFallback.
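The ArgumentNullException condition can be demonstrated directly. The following sketch (assuming the default UTF7Encoding constructor; the cast disambiguates the string overload from the character-array overload) passes a null reference and catches the resulting exception:

```csharp
using System;
using System.Text;

class NullArgumentExample {
    public static void Main() {
        UTF7Encoding utf7 = new UTF7Encoding();
        try {
            // Passing a null string throws ArgumentNullException.
            utf7.GetByteCount((string)null);
        }
        catch (ArgumentNullException e) {
            Console.WriteLine("Caught ArgumentNullException.");
        }
    }
}
```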

Remarks

To calculate the exact array size required by GetBytes to store the resulting bytes, the application uses GetByteCount. To calculate the maximum array size, the application should use GetMaxByteCount. The GetByteCount method generally allows allocation of less memory, while the GetMaxByteCount method generally executes faster.
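The trade-off described above can be sketched as follows. This minimal illustration (assuming the default UTF7Encoding constructor; the sample string is arbitrary) compares the exact count from GetByteCount with the worst-case count from GetMaxByteCount, then uses the exact count to size the output buffer:

```csharp
using System;
using System.Text;

class ByteCountComparison {
    public static void Main() {
        UTF7Encoding utf7 = new UTF7Encoding();
        string s = "abc\u03a0"; // three direct characters plus Pi

        // Exact count: examines each character, so the buffer
        // is no larger than needed.
        int exact = utf7.GetByteCount(s);

        // Worst-case count: computed from the length alone, so it
        // is faster to obtain but may overestimate.
        int worstCase = utf7.GetMaxByteCount(s.Length);

        // Allocate exactly the required number of bytes and encode.
        byte[] bytes = new byte[exact];
        utf7.GetBytes(s, 0, s.Length, bytes, 0);

        Console.WriteLine("Exact: {0}, Maximum: {1}", exact, worstCase);
    }
}
```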

Examples

The following code example demonstrates how to use the GetByteCount method to return the number of bytes required to encode a string.

using System;
using System.Text;

class UTF7EncodingExample {
    public static void Main() {
        // A string of Unicode characters: #, %, Pi, and Sigma.
        string s = "\u0023\u0025\u03a0\u03a3";

        UTF7Encoding utf7 = new UTF7Encoding();
        int byteCount = utf7.GetByteCount(s);
        Console.WriteLine(
            "{0} bytes needed to encode the string.", byteCount
        );
    }
}

Universal Windows Platform
Available since 10
.NET Framework
Available since 2.0