UTF8Encoding.GetByteCount Method (Char*, Int32)
Calculates the number of bytes produced by encoding a set of characters starting at the specified character pointer.
Namespace: System.Text
Assembly: mscorlib (in mscorlib.dll)
Syntax

[SecurityCriticalAttribute]
[CLSCompliantAttribute(false)]
[ComVisibleAttribute(false)]
public override unsafe int GetByteCount(
    char* chars,
    int count
)
Parameters

chars
Type: System.Char*
A pointer to the first character to encode.

count
Type: System.Int32
The number of characters to encode.
Return Value
Type: System.Int32
The number of bytes produced by encoding the specified characters.
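A minimal sketch of calling this pointer overload; it assumes the project is compiled with unsafe code enabled (/unsafe), and the sample characters are purely illustrative:

```csharp
using System;
using System.Text;

class GetByteCountPointerDemo
{
    static unsafe void Main()
    {
        UTF8Encoding utf8 = new UTF8Encoding();

        // Cyrillic letters encode to 2 bytes each in UTF-8.
        char[] chars = { 'П', 'р', 'и', 'в', 'е', 'т' };

        // Pin the managed array so a raw char* can be taken safely.
        fixed (char* pChars = chars)
        {
            int byteCount = utf8.GetByteCount(pChars, chars.Length);
            Console.WriteLine(byteCount); // 6 chars x 2 bytes = 12
        }
    }
}
```

The fixed statement is required because the garbage collector may otherwise move the array while the pointer is in use.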
Exceptions

ArgumentNullException
chars is null.

ArgumentOutOfRangeException
count is less than zero.
-or-
The resulting number of bytes is greater than the maximum number that can be returned as an integer.

ArgumentException
Error detection is enabled, and chars contains an invalid sequence of characters.
Remarks

To calculate the exact array size required by the GetBytes method to store the resulting bytes, call the GetByteCount method. To calculate the maximum array size, call the GetMaxByteCount method. The GetByteCount method generally allocates less memory, while the GetMaxByteCount method generally executes faster.
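The exact-versus-maximum trade-off above can be illustrated with the safe string overloads (the sample string is illustrative; for UTF8Encoding, GetMaxByteCount reserves a worst case of 3 bytes per character plus room for a leftover surrogate):

```csharp
using System;
using System.Text;

class ExactVsMaxDemo
{
    static void Main()
    {
        UTF8Encoding utf8 = new UTF8Encoding();
        string s = "Hello, world";

        // Exact count: walks the actual characters (all ASCII here, 1 byte each).
        int exact = utf8.GetByteCount(s);

        // Maximum count: a fast upper bound based only on the character count.
        int max = utf8.GetMaxByteCount(s.Length);

        Console.WriteLine($"{exact} {max}"); // 12 39
    }
}
```

Use the exact count when buffers are long-lived and memory matters; use the maximum when allocating a scratch buffer once and speed matters.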
With error detection, an invalid sequence causes this method to throw an ArgumentException. Without error detection, invalid sequences are ignored, and no exception is thrown.
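A sketch of the two behaviors, using the UTF8Encoding constructor's throwOnInvalidBytes flag to toggle error detection (the lone high surrogate below is an invalid sequence; the byte count the lenient encoding reports for it can vary by runtime version, so it is printed rather than assumed):

```csharp
using System;
using System.Text;

class ErrorDetectionDemo
{
    static void Main()
    {
        // Second constructor argument enables error detection.
        UTF8Encoding strict = new UTF8Encoding(false, true);
        UTF8Encoding lenient = new UTF8Encoding(false, false);

        // A lone high surrogate is not a valid UTF-16 sequence.
        char[] invalid = { 'a', '\uD800', 'b' };

        // No error detection: no exception is thrown.
        Console.WriteLine(lenient.GetByteCount(invalid));

        // Error detection: the invalid sequence raises an ArgumentException
        // (specifically an EncoderFallbackException, which derives from it).
        try
        {
            strict.GetByteCount(invalid);
        }
        catch (ArgumentException)
        {
            Console.WriteLine("ArgumentException");
        }
    }
}
```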
To ensure that the encoded bytes are decoded properly when they are saved as a file or as a stream, you can prefix a stream of encoded bytes with a preamble. Inserting the preamble at the beginning of a byte stream (such as at the beginning of a series of bytes to be written to a file) is the developer's responsibility, and the number of bytes in the preamble is not reflected in the value returned by the method.
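A sketch of writing the preamble manually, as the paragraph above describes; it assumes a UTF8Encoding constructed to emit the UTF-8 byte order mark, whose GetPreamble method returns the 3-byte sequence EF BB BF:

```csharp
using System;
using System.IO;
using System.Text;

class PreambleDemo
{
    static void Main()
    {
        // true: encoderShouldEmitUTF8Identifier, so GetPreamble returns the BOM.
        UTF8Encoding utf8 = new UTF8Encoding(true);
        byte[] preamble = utf8.GetPreamble();   // 3 bytes: EF BB BF
        byte[] payload = utf8.GetBytes("data"); // 4 bytes, not counting the BOM

        using (var stream = new MemoryStream())
        {
            // The caller, not the encoder, writes the preamble first.
            stream.Write(preamble, 0, preamble.Length);
            stream.Write(payload, 0, payload.Length);
            Console.WriteLine(stream.Length); // 3 + 4 = 7
        }
    }
}
```

Note that GetByteCount("data") would report 4; the 3 preamble bytes are the developer's bookkeeping.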
Security
Requires full trust for the immediate caller. This member cannot be used by partially trusted or transparent code.
Version Information

Universal Windows Platform
Available since 10

.NET Framework
Available since 2.0