UnicodeEncoding::GetCharCount Method (Byte*, Int32)
Calculates the number of characters produced by decoding a sequence of bytes starting at the specified byte pointer.
This API is not CLS-compliant.
Assembly: mscorlib (in mscorlib.dll)
Syntax

```cpp
public:
[SecurityCriticalAttribute]
[CLSCompliantAttribute(false)]
[ComVisibleAttribute(false)]
virtual int GetCharCount(
    unsigned char* bytes,
    int count
) override
```
Parameters
- bytes
  Type: System::Byte*
  A pointer to the first byte to decode.
- count
  Type: System::Int32
  The number of bytes to decode.
Return Value
Type: System::Int32
The number of characters produced by decoding the specified sequence of bytes.
Exceptions

| Exception | Condition |
|---|---|
| ArgumentNullException | bytes is null (Nothing in Visual Basic). |
| ArgumentOutOfRangeException | count is less than zero. -or- The resulting number of characters is greater than the maximum number that can be returned as an integer. |
| ArgumentException | Error detection is enabled, and bytes contains an invalid sequence of bytes. |
| DecoderFallbackException | A fallback occurred (see Character Encoding in the .NET Framework for a complete explanation) -and- DecoderFallback is set to DecoderExceptionFallback. |
Remarks

To calculate the exact array size that GetChars requires to store the resulting characters, the application uses GetCharCount. To calculate the maximum array size, the application uses GetMaxCharCount. The GetCharCount method generally allows the application to allocate less memory, while the GetMaxCharCount method generally executes faster.
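The following C++/CLI sketch illustrates this sizing pattern. The byte values, variable names, and the full-trust console project (compiled with /clr) are assumptions for illustration, not part of the API.

```cpp
using namespace System;
using namespace System::Text;

int main()
{
    // UTF-16LE bytes for "ABC" (illustrative input).
    array<Byte>^ bytes = { 0x41, 0x00, 0x42, 0x00, 0x43, 0x00 };
    UnicodeEncoding^ unicode = gcnew UnicodeEncoding();

    // Pin the managed array so its address can be passed to the pointer overload.
    pin_ptr<Byte> pBytes = &bytes[0];

    // Exact size: requires examining the input, but wastes no memory.
    int exact = unicode->GetCharCount(pBytes, bytes->Length);

    // Upper bound: computed from the byte count alone, so it is faster,
    // but it may overallocate.
    int upperBound = unicode->GetMaxCharCount(bytes->Length);

    // Decode into a buffer sized with the exact count.
    array<Char>^ chars = gcnew array<Char>(exact);
    pin_ptr<Char> pChars = &chars[0];
    unicode->GetChars(pBytes, bytes->Length, pChars, chars->Length);

    Console::WriteLine("Exact: {0}, Upper bound: {1}, Decoded: {2}",
        exact, upperBound, gcnew String(chars));
    return 0;
}
```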
With error detection, an invalid sequence causes this method to throw an ArgumentException. Without error detection, invalid sequences are ignored, and no exception is thrown.
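A minimal sketch of the difference, assuming a lone UTF-16 high surrogate as an illustrative invalid input; the exact counts produced by the lenient instance depend on its DecoderFallback and are not asserted here.

```cpp
using namespace System;
using namespace System::Text;

int main()
{
    // A lone high surrogate (U+D800, little-endian) with no low surrogate: invalid UTF-16.
    array<Byte>^ invalid = { 0x00, 0xD8 };
    pin_ptr<Byte> pBytes = &invalid[0];

    // Error detection enabled (throwOnInvalidBytes = true): GetCharCount throws.
    UnicodeEncoding^ strict = gcnew UnicodeEncoding(false, false, true);
    try
    {
        strict->GetCharCount(pBytes, invalid->Length);
    }
    catch (ArgumentException^ e)
    {
        Console::WriteLine("Strict decoding threw: {0}", e->GetType());
    }

    // Error detection disabled (the default): no exception is thrown.
    UnicodeEncoding^ lenient = gcnew UnicodeEncoding();
    Console::WriteLine("Lenient count: {0}",
        lenient->GetCharCount(pBytes, invalid->Length));
    return 0;
}
```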
Security

Requires full trust for the immediate caller. This member cannot be used by partially trusted or transparent code.
Available since .NET Framework 2.0