ASCIIEncoding::GetMaxCharCount Method (Int32)

 

Calculates the maximum number of characters produced by decoding the specified number of bytes.

Namespace:   System.Text
Assembly:  mscorlib (in mscorlib.dll)

Syntax

public:
virtual int GetMaxCharCount(
	int byteCount
) override

Parameters

byteCount
Type: System::Int32

The number of bytes to decode.

Return Value

Type: System::Int32

The maximum number of characters produced by decoding the specified number of bytes.

Exceptions

ArgumentOutOfRangeException

byteCount is less than zero.

-or-

The resulting number of characters is greater than the maximum number that can be returned as an integer.

Remarks

The GetCharCount method calculates the exact array size that the GetChars method requires to store the resulting characters, whereas the GetMaxCharCount method calculates the maximum array size. Using GetCharCount generally allocates less memory, while using GetMaxCharCount generally executes faster.
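The trade-off is visible when sizing a buffer for GetChars. The following sketch (the input string is illustrative) allocates one buffer from the exact count and one from the worst-case count:

using namespace System;
using namespace System::Text;
int main()
{
   ASCIIEncoding^ ascii = gcnew ASCIIEncoding;
   array<Byte>^ bytes = ascii->GetBytes( "ASCII text" );

   // Exact size: GetCharCount must examine the data itself.
   array<Char>^ exact = gcnew array<Char>( ascii->GetCharCount( bytes, 0, bytes->Length ) );

   // Worst case: GetMaxCharCount is computed from the length alone,
   // so it is faster but may reserve more space than is needed.
   array<Char>^ worstCase = gcnew array<Char>( ascii->GetMaxCharCount( bytes->Length ) );

   ascii->GetChars( bytes, 0, bytes->Length, worstCase, 0 );
   Console::WriteLine( "Exact: {0}  Worst case: {1}", exact->Length, worstCase->Length );
}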

GetMaxCharCount retrieves a worst-case number, including the worst case for the currently selected DecoderFallback. If a decoder fallback is present that has a maximum fallback length of n, the GetMaxCharCount method returns n * byteCount.
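The fallback term can be observed by attaching a replacement fallback whose replacement string is longer than one character. In the following sketch, the two-character replacement string "??" is an arbitrary choice, giving a maximum fallback length of 2:

using namespace System;
using namespace System::Text;
int main()
{
   // An ASCII encoding whose decoder fallback replaces each invalid byte
   // with the two-character string "??", so its MaxCharCount is 2.
   Encoding^ ascii = Encoding::GetEncoding( "us-ascii",
      EncoderFallback::ReplacementFallback,
      gcnew DecoderReplacementFallback( "??" ) );

   // Worst case is n * byteCount: 2 * 8 = 16.
   Console::WriteLine( ascii->GetMaxCharCount( 8 ) );
}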

GetMaxCharCount has no relation to GetBytes. If your application needs a similar function to use with GetBytes, it should use GetMaxByteCount.
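A minimal sketch of the encoding-side counterpart, using GetMaxByteCount to size the buffer that GetBytes fills (the input string is illustrative):

using namespace System;
using namespace System::Text;
int main()
{
   ASCIIEncoding^ ascii = gcnew ASCIIEncoding;
   String^ text = "ASCII text";

   // GetMaxByteCount bounds the buffer that GetBytes can require.
   array<Byte>^ buffer = gcnew array<Byte>( ascii->GetMaxByteCount( text->Length ) );
   int written = ascii->GetBytes( text, 0, text->Length, buffer, 0 );
   Console::WriteLine( "{0} of {1} bytes used.", written, buffer->Length );
}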

Note

GetMaxCharCount(N) is not necessarily the same value as N * GetMaxCharCount(1).
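For ASCIIEncoding with its default fallback the two expressions happen to coincide, but they need not for encodings whose worst case includes a constant term. As an illustration, UTF8Encoding (used here only for contrast) reserves one extra character to account for a possibly incomplete trailing sequence:

using namespace System;
using namespace System::Text;
int main()
{
   UTF8Encoding^ utf8 = gcnew UTF8Encoding;
   // With the default fallback, the worst case is byteCount + 1,
   // so it does not scale linearly from GetMaxCharCount(1).
   Console::WriteLine( utf8->GetMaxCharCount( 8 ) );      // 9
   Console::WriteLine( 8 * utf8->GetMaxCharCount( 1 ) );  // 16
}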

Examples

The following example demonstrates how to use the GetMaxCharCount method to calculate the maximum number of characters needed to decode a specified number of bytes.

using namespace System;
using namespace System::Text;
int main()
{
   ASCIIEncoding^ ascii = gcnew ASCIIEncoding;
   int byteCount = 8;

   // Calculate the worst-case character count for decoding byteCount bytes.
   int maxCharCount = ascii->GetMaxCharCount( byteCount );
   Console::WriteLine( "Maximum of {0} characters needed to decode {1} bytes.", maxCharCount, byteCount );
}
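With the default replacement fallback (whose maximum fallback length is 1), the example displays the following output:

Maximum of 8 characters needed to decode 8 bytes.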

Version Information

Universal Windows Platform
Available since 10
.NET Framework
Available since 1.1