UTF7Encoding::GetCharCount Method (array<Byte>^, Int32, Int32)
Calculates the number of characters produced by decoding a sequence of bytes from the specified byte array.
Assembly: mscorlib (in mscorlib.dll)
public: virtual int GetCharCount( array<unsigned char>^ bytes, int index, int count ) override
Parameters
- bytes
  Type: array<System::Byte>^
  The byte array containing the sequence of bytes to decode.
- index
  Type: System::Int32
  The index of the first byte to decode.
- count
  Type: System::Int32
  The number of bytes to decode.
Return Value
Type: System::Int32
The number of characters produced by decoding the specified sequence of bytes.
| Exception | Condition |
|---|---|
| ArgumentNullException | bytes is null (Nothing). |
| ArgumentOutOfRangeException | index or count is less than zero. -or- index and count do not denote a valid range in bytes. -or- The resulting number of characters is greater than the maximum number that can be returned as an int. |
| DecoderFallbackException | A fallback occurred (see Character Encoding in the .NET Framework for a complete explanation) -and- DecoderFallback is set to DecoderExceptionFallback. |
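The first two exception conditions can be avoided by validating the arguments before the call. The following sketch only illustrates those checks; the helper name SafeGetCharCount and its return convention are hypothetical, not part of the class.

using namespace System;
using namespace System::Text;

// Hypothetical helper: validates the arguments and returns -1 instead of letting
// GetCharCount throw ArgumentNullException or ArgumentOutOfRangeException.
int SafeGetCharCount( UTF7Encoding^ utf7, array<Byte>^ bytes, int index, int count )
{
   if ( bytes == nullptr )
      return -1;   // bytes is null
   if ( index < 0 || count < 0 || index + count > bytes->Length )
      return -1;   // index and count do not denote a valid range in bytes
   return utf7->GetCharCount( bytes, index, count );
}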
To calculate the exact array size that GetChars requires to store the resulting characters, use GetCharCount. To calculate the maximum array size, use GetMaxCharCount. The GetCharCount method generally allows the application to allocate less memory, while the GetMaxCharCount method generally executes faster.
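For example, a buffer passed to GetChars can be sized exactly with GetCharCount or conservatively with GetMaxCharCount. The following sketch contrasts the two approaches; the byte values and variable names are illustrative only.

using namespace System;
using namespace System::Text;

int main()
{
   // Plain ASCII bytes ("Hello"), which UTF-7 decodes directly.
   array<Byte>^bytes = {72,101,108,108,111};
   UTF7Encoding^ utf7 = gcnew UTF7Encoding;

   // Exact count: scans the byte sequence, so no memory is wasted.
   int exactCount = utf7->GetCharCount( bytes, 0, bytes->Length );

   // Maximum count: computed from the byte count alone, so it is faster but may over-allocate.
   int maxCount = utf7->GetMaxCharCount( bytes->Length );

   array<Char>^chars = gcnew array<Char>(exactCount);
   utf7->GetChars( bytes, 0, bytes->Length, chars, 0 );

   Console::WriteLine( "Exact: {0}, Maximum: {1}, Decoded: {2}", exactCount, maxCount, gcnew String( chars ) );
}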
The following code example demonstrates how to use the GetCharCount method to return the number of characters produced by decoding a range of elements in a byte array.
using namespace System;
using namespace System::Text;

int main()
{
   // The bytes are the ASCII (and therefore valid UTF-7) encoding of "UTF7 Encoding Example".
   array<Byte>^bytes = {85,84,70,55,32,69,110,99,111,100,105,110,103,32,69,120,97,109,112,108,101};
   UTF7Encoding^ utf7 = gcnew UTF7Encoding;
   int charCount = utf7->GetCharCount( bytes, 2, 8 );
   Console::WriteLine( "{0} characters needed to decode bytes.", charCount );
}
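In this example, each of the eight bytes in the specified range is a directly decoded character, so the program displays: 8 characters needed to decode bytes.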
Universal Windows Platform
Available since 10
.NET Framework
Available since 1.1