UTF8Encoding.GetByteCount Method (Char*, Int32)
Calculates the number of bytes produced by encoding a set of characters starting at the specified character pointer.
This API is not CLS-compliant.

Namespace: System.Text
Assembly: mscorlib (in mscorlib.dll)
[CLSCompliantAttribute(false)]
[ComVisibleAttribute(false)]
public override int GetByteCount(
	char* chars,
	int count
)
Parameters

chars
- Type: System.Char*
A pointer to the first character to encode.

count
- Type: System.Int32
The number of characters to encode.
Return Value
Type: System.Int32
The number of bytes produced by encoding the specified characters.
Exceptions

ArgumentNullException
chars is null.

ArgumentOutOfRangeException
count is less than zero.
-or-
The resulting number of bytes is greater than the maximum number that can be returned as an integer.

ArgumentException
Error detection is enabled, and chars contains an invalid sequence of characters.
-or-
A fallback occurred (see Character Encoding in the .NET Framework for a complete explanation).
Remarks

To calculate the exact array size required by GetBytes to store the resulting bytes, the application should use GetByteCount. To calculate the maximum array size, it should use GetMaxByteCount. The GetByteCount method generally allows allocation of less memory, while the GetMaxByteCount method generally executes faster.
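The difference between the two sizing methods can be seen in a short program. This is a sketch, not part of the reference topic; it assumes compilation with the /unsafe compiler option so the pointer overload can be called through a fixed statement, and the GetMaxByteCount figure reflects .NET Framework 4.x behavior:

```csharp
using System;
using System.Text;

class SizeExample
{
    unsafe static void Main()
    {
        UTF8Encoding utf8 = new UTF8Encoding();
        string s = "résumé"; // 6 chars; each 'é' needs 2 bytes in UTF-8

        fixed (char* chars = s)
        {
            // Exact size: 4 one-byte chars + 2 two-byte chars = 8 bytes.
            int exact = utf8.GetByteCount(chars, s.Length);
            Console.WriteLine(exact); // 8
        }

        // Worst case assumes every character (plus one for a pending
        // surrogate held by an encoder) may need up to 3 bytes:
        // (6 + 1) * 3 = 21 on .NET Framework 4.x.
        int max = utf8.GetMaxByteCount(s.Length);
        Console.WriteLine(max);
    }
}
```

GetByteCount walks the actual characters, so the 8-byte buffer is tight; GetMaxByteCount is a constant-time calculation, which is why it executes faster but over-allocates.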
With error detection, an invalid sequence causes this method to throw an ArgumentException. Without error detection, invalid sequences are ignored, and no exception is thrown.
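The contrast can be sketched with two encoder instances; error detection is controlled by the throwOnInvalidBytes parameter of the UTF8Encoding constructor, and a lone surrogate such as U+D800 is an invalid UTF-16 sequence. Exact byte counts from the lenient encoder depend on the configured EncoderFallback, so none are asserted here:

```csharp
using System;
using System.Text;

class DetectionExample
{
    unsafe static void Main()
    {
        char[] invalid = { 'a', '\uD800', 'b' }; // lone high surrogate

        UTF8Encoding lenient = new UTF8Encoding(false, false); // no error detection
        UTF8Encoding strict  = new UTF8Encoding(false, true);  // error detection on

        fixed (char* chars = invalid)
        {
            // No exception; the invalid character is handled by the
            // encoder's fallback, so a byte count is still returned.
            Console.WriteLine(lenient.GetByteCount(chars, invalid.Length));

            try
            {
                strict.GetByteCount(chars, invalid.Length);
            }
            catch (ArgumentException)
            {
                // EncoderFallbackException derives from ArgumentException.
                Console.WriteLine("invalid sequence detected");
            }
        }
    }
}
```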
To ensure that the encoded bytes are decoded properly, the application should prefix encoded bytes with a preamble.
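As a sketch of the preamble step: UTF8Encoding returns the UTF-8 byte order mark (EF BB BF) from GetPreamble when the encoding is constructed with encoderShouldEmitUTF8Identifier set to true, and the application prefixes those bytes to the encoded output:

```csharp
using System;
using System.Text;

class PreambleExample
{
    static void Main()
    {
        // true => GetPreamble() returns the UTF-8 byte order mark.
        UTF8Encoding utf8 = new UTF8Encoding(true);

        byte[] preamble = utf8.GetPreamble();
        Console.WriteLine(BitConverter.ToString(preamble)); // EF-BB-BF

        // Prefix the preamble to the encoded bytes so a reader can
        // identify the stream as UTF-8.
        byte[] body = utf8.GetBytes("hello");
        byte[] stream = new byte[preamble.Length + body.Length];
        Buffer.BlockCopy(preamble, 0, stream, 0, preamble.Length);
        Buffer.BlockCopy(body, 0, stream, preamble.Length, body.Length);
    }
}
```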
.NET Framework Security

Requires full trust for the immediate caller. This member cannot be used by partially trusted or transparent code.
Platforms

Windows 8.1, Windows Server 2012 R2, Windows 8, Windows Server 2012, Windows 7, Windows Vista SP2, Windows Server 2008 (Server Core Role not supported), Windows Server 2008 R2 (Server Core Role supported with SP1 or later; Itanium not supported)