ASCIIEncoding.GetMaxByteCount(Int32) Method

Definition

Calculates the maximum number of bytes produced by encoding the specified number of characters.

C++
public:
 override int GetMaxByteCount(int charCount);

C#
public override int GetMaxByteCount (int charCount);

F#
override this.GetMaxByteCount : int -> int

VB
Public Overrides Function GetMaxByteCount (charCount As Integer) As Integer

Parameters

charCount
Int32

The number of characters to encode.

Returns

The maximum number of bytes produced by encoding the specified number of characters.

Exceptions

ArgumentOutOfRangeException

charCount is less than zero.

-or-

The resulting number of bytes is greater than the maximum number that can be returned as an integer.

Examples

The following example demonstrates how to use the GetMaxByteCount method to calculate the maximum number of bytes required to encode a specified number of characters.

C++
using namespace System;
using namespace System::Text;
int main()
{
   ASCIIEncoding^ ascii = gcnew ASCIIEncoding;
   int charCount = 2;
   int maxByteCount = ascii->GetMaxByteCount( charCount );
   Console::WriteLine( "Maximum of {0} bytes needed to encode {1} characters.", maxByteCount, charCount );
}

C#
using System;
using System.Text;

class ASCIIEncodingExample {
    public static void Main() {
        ASCIIEncoding ascii = new ASCIIEncoding();
        int charCount = 2;
        int maxByteCount = ascii.GetMaxByteCount(charCount);
        Console.WriteLine(
            "Maximum of {0} bytes needed to encode {1} characters.",
            maxByteCount,
            charCount
        );
    }
}

VB
Imports System.Text

Class ASCIIEncodingExample
    Public Shared Sub Main()
        Dim ascii As New ASCIIEncoding()
        Dim charCount As Integer = 2
        Dim maxByteCount As Integer = ascii.GetMaxByteCount(charCount)
        Console.WriteLine( _
            "Maximum of {0} bytes needed to encode {1} characters.", _
            maxByteCount, _
            charCount _
        )
    End Sub
End Class

Remarks

The GetByteCount method calculates the exact array size required by the GetBytes method to store the resulting bytes, whereas the GetMaxByteCount method calculates the maximum array size. The GetByteCount method generally allocates less memory, but the GetMaxByteCount method generally executes faster.
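This trade-off can be sketched in C# as follows; the string and variable names are illustrative, not part of the API:

```csharp
using System;
using System.Text;

class BufferSizingExample
{
    static void Main()
    {
        ASCIIEncoding ascii = new ASCIIEncoding();
        string text = "Hello";

        // Exact size: scans the input, so it costs time but wastes no space.
        int exact = ascii.GetByteCount(text);                // 5 bytes for "Hello"

        // Worst case: computed from the length alone, so it is fast but
        // may over-allocate (charCount + 1 under the default fallback).
        int worstCase = ascii.GetMaxByteCount(text.Length);  // 6

        byte[] buffer = new byte[worstCase];
        int written = ascii.GetBytes(text, 0, text.Length, buffer, 0);
        Console.WriteLine("exact={0}, worstCase={1}, written={2}",
                          exact, worstCase, written);
    }
}
```

Sizing the buffer with GetMaxByteCount avoids a second pass over the input at the cost of a slightly larger allocation.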

GetMaxByteCount is a worst-case number, including the worst case for the currently selected EncoderFallback. If you choose a replacement fallback with a potentially large string, GetMaxByteCount can return large values.

The GetMaxByteCount method considers potential leftover surrogates from a previous encoding operation. As a result, if the ASCIIEncoding object uses the default replacement fallback, or if a custom replacement fallback has been defined with a single possible fallback character, the method returns charCount + 1. If the ASCIIEncoding object uses a replacement fallback with more than one possible fallback character, the method returns n * (charCount + 1), where n is the maximum number of fallback characters.
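The fallback arithmetic above can be sketched with a custom two-character replacement fallback, obtained here through the Encoding.GetEncoding(String, EncoderFallback, DecoderFallback) overload (the "??" replacement string is an arbitrary illustration):

```csharp
using System;
using System.Text;

class FallbackMaxByteCountExample
{
    static void Main()
    {
        int charCount = 2;

        // The default replacement fallback ("?") has a single fallback
        // character, so the worst case is charCount + 1.
        Encoding ascii = new ASCIIEncoding();
        Console.WriteLine(ascii.GetMaxByteCount(charCount));     // 3

        // A replacement fallback with two characters ("??") yields
        // n * (charCount + 1) with n = 2.
        Encoding asciiWide = Encoding.GetEncoding(
            "us-ascii",
            new EncoderReplacementFallback("??"),
            new DecoderReplacementFallback("?"));
        Console.WriteLine(asciiWide.GetMaxByteCount(charCount)); // 6
    }
}
```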

GetMaxByteCount has no relation to GetChars. If your application needs a similar function to use with GetChars, it should use GetMaxCharCount.

Note

GetMaxByteCount(N) is not necessarily the same value as N * GetMaxByteCount(1).
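This follows from the allowance for a leftover surrogate being applied once per call rather than once per character, as a minimal sketch shows:

```csharp
using System;
using System.Text;

class NonLinearityExample
{
    static void Main()
    {
        ASCIIEncoding ascii = new ASCIIEncoding();

        // With the default fallback, GetMaxByteCount(n) is n + 1,
        // so the result does not scale linearly with charCount.
        Console.WriteLine(ascii.GetMaxByteCount(2));      // 3
        Console.WriteLine(2 * ascii.GetMaxByteCount(1));  // 4
    }
}
```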
