UTF7Encoding::GetCharCount Method (Byte*, Int32)
Calculates the number of characters produced by decoding a sequence of bytes starting at the specified byte pointer.
This API is not CLS-compliant.
Assembly: mscorlib (in mscorlib.dll)
Syntax

public:
[SecurityCriticalAttribute]
[CLSCompliantAttribute(false)]
[ComVisibleAttribute(false)]
virtual int GetCharCount(
    unsigned char* bytes,
    int count
) override
Parameters
- bytes
  Type: System::Byte*
  A pointer to the first byte to decode.
- count
  Type: System::Int32
  The number of bytes to decode.
Return Value
Type: System::Int32
The number of characters produced by decoding the specified sequence of bytes.
Exceptions

| Exception | Condition |
|---|---|
| ArgumentNullException | bytes is null (Nothing in Visual Basic). |
| ArgumentOutOfRangeException | count is less than zero. -or- The resulting number of characters is greater than the maximum number that can be returned as an integer. |
| DecoderFallbackException | A fallback occurred (see Character Encoding in the .NET Framework for a complete explanation) -and- DecoderFallback is set to DecoderExceptionFallback (see the sketch after this table). |
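As an illustration of the DecoderFallbackException condition above, the following sketch (not part of the original reference) shows how an application might install the exception fallback on a modifiable copy of a UTF7Encoding instance; once installed, any decoder fallback raised by that copy surfaces as a DecoderFallbackException. The variable names are assumptions made for the example.

```cpp
using namespace System;
using namespace System::Text;

int main()
{
    // Encoding instances may be read-only, so take a modifiable copy with
    // Clone before swapping in the exception fallback. Afterwards, any
    // decoder fallback raised while this copy decodes is reported as a
    // DecoderFallbackException instead of being handled silently.
    UTF7Encoding^ utf7 = safe_cast<UTF7Encoding^>((gcnew UTF7Encoding())->Clone());
    utf7->DecoderFallback = DecoderFallback::ExceptionFallback;

    Console::WriteLine(utf7->DecoderFallback); // prints the fallback's type name
    return 0;
}
```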
Remarks
To calculate the exact array size required by GetChars to store the resulting characters, use GetCharCount. To calculate the maximum array size, use GetMaxCharCount. The GetCharCount method generally allows the application to allocate less memory, while the GetMaxCharCount method generally executes faster.
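The following sketch is not part of the original reference; it shows one way this overload is typically reached from C++/CLI under full trust: encode an illustrative string to UTF-7, pin the managed byte array to obtain a Byte*, size the character buffer with GetCharCount, and decode with the matching pointer overload of GetChars. The literal text and variable names are assumptions made for the example.

```cpp
using namespace System;
using namespace System::Text;

int main()
{
    UTF7Encoding^ utf7 = gcnew UTF7Encoding();

    // UTF-7 bytes for an illustrative string; '!' is not in the default direct
    // character set, so it is encoded as a modified Base64 block.
    array<Byte>^ bytes = utf7->GetBytes("Hi!");

    // Pin the managed array so a raw Byte* can be passed to the pointer overload.
    pin_ptr<Byte> pBytes = &bytes[0];

    // Ask for the exact number of characters the byte sequence decodes to...
    int charCount = utf7->GetCharCount(pBytes, bytes->Length);

    // ...then allocate exactly that many characters and decode into them.
    array<Char>^ chars = gcnew array<Char>(charCount);
    pin_ptr<Char> pChars = &chars[0];
    utf7->GetChars(pBytes, bytes->Length, pChars, charCount);

    Console::WriteLine("{0} bytes decode to {1} characters: {2}",
        bytes->Length, charCount, gcnew String(chars));
    return 0;
}
```

Pinning keeps the garbage collector from relocating the arrays while the raw pointers are in use.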
Security
Requires full trust for the immediate caller. This member cannot be used by partially trusted or transparent code.
Version Information
Universal Windows Platform
Available since 10
.NET Framework
Available since 2.0