UTF32Encoding.GetByteCount Method (Char*, Int32)
Calculates the number of bytes produced by encoding a set of characters starting at the specified character pointer.
Namespace: System.Text
Assembly: mscorlib (in mscorlib.dll)
Syntax

[CLSCompliantAttribute(false)]
public override int GetByteCount(
    char* chars,
    int count
)
Parameters

chars
- Type: System.Char*
  A pointer to the first character to encode.

count
- Type: System.Int32
  The number of characters to encode.
Return Value

Type: System.Int32
The number of bytes produced by encoding the specified characters.
Exceptions

- ArgumentNullException: chars is null.
- ArgumentOutOfRangeException: count is less than zero. -or- The resulting number of bytes is greater than the maximum number that can be returned as an integer.
- ArgumentException: Error detection is enabled, and chars contains an invalid sequence of characters. -or- A fallback occurred (see Understanding Encodings for a complete explanation).
Remarks

To calculate the exact array size required by GetBytes to store the resulting bytes, the application uses GetByteCount. To calculate the maximum array size, the application should use GetMaxByteCount. The GetByteCount method generally allows the application to allocate less memory, while the GetMaxByteCount method generally executes faster.
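For example, the following is a minimal sketch of sizing a buffer exactly with GetByteCount before calling GetBytes. It assumes compilation with /unsafe; the sample string and variable names are illustrative.

```csharp
using System;
using System.Text;

class Example
{
    static unsafe void Main()
    {
        UTF32Encoding encoding = new UTF32Encoding();
        string text = "za\u0306\u01FD\u03B2";

        fixed (char* pChars = text)
        {
            // GetByteCount returns the exact size, so the buffer is
            // no larger than necessary.
            int byteCount = encoding.GetByteCount(pChars, text.Length);
            byte[] bytes = new byte[byteCount];

            fixed (byte* pBytes = bytes)
            {
                encoding.GetBytes(pChars, text.Length, pBytes, byteCount);
            }

            Console.WriteLine("Encoded {0} characters into {1} bytes.",
                text.Length, byteCount);
        }
    }
}
```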
With error detection enabled, an invalid sequence causes this method to throw an ArgumentException. Without error detection, invalid sequences are ignored, and no exception is thrown.
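A minimal sketch of the contrast, using the UTF32Encoding(Boolean, Boolean, Boolean) constructor to toggle error detection; the unpaired high surrogate below is one example of an invalid sequence.

```csharp
using System;
using System.Text;

class ErrorDetectionExample
{
    static unsafe void Main()
    {
        string invalid = "a\uD800b"; // '\uD800' has no matching low surrogate

        UTF32Encoding lenient = new UTF32Encoding(false, true, false);
        UTF32Encoding strict = new UTF32Encoding(false, true, true);

        fixed (char* pChars = invalid)
        {
            // Without error detection, no exception is thrown; the invalid
            // surrogate is handled by the encoder fallback.
            Console.WriteLine(lenient.GetByteCount(pChars, invalid.Length));

            try
            {
                // With error detection, the same input throws.
                strict.GetByteCount(pChars, invalid.Length);
            }
            catch (ArgumentException e)
            {
                Console.WriteLine("Caught {0}", e.GetType().Name);
            }
        }
    }
}
```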
To ensure that the encoded bytes are decoded properly, the application should prefix encoded bytes with a preamble.
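For instance, a minimal sketch of writing the preamble returned by GetPreamble ahead of the encoded bytes (the output file name is hypothetical):

```csharp
using System.IO;
using System.Text;

class PreambleExample
{
    static void Main()
    {
        // Little-endian UTF-32 with a byte order mark.
        UTF32Encoding encoding = new UTF32Encoding(false, true);
        byte[] preamble = encoding.GetPreamble();
        byte[] bytes = encoding.GetBytes("Hello");

        using (FileStream stream = File.Create("output.txt"))
        {
            stream.Write(preamble, 0, preamble.Length); // preamble first
            stream.Write(bytes, 0, bytes.Length);       // then the data
        }
    }
}
```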
Platforms

Windows 7, Windows Vista, Windows XP SP2, Windows XP Media Center Edition, Windows XP Professional x64 Edition, Windows XP Starter Edition, Windows Server 2008 R2, Windows Server 2008, Windows Server 2003, Windows Server 2000 SP4, Windows Millennium Edition, Windows 98
The .NET Framework and .NET Compact Framework do not support all versions of every platform. For a list of the supported versions, see .NET Framework System Requirements.