ASCIIEncoding::GetByteCount Method (array<Char>^, Int32, Int32)
Calculates the number of bytes produced by encoding a set of characters from the specified character array.
Namespace: System::Text
Assembly: mscorlib (in mscorlib.dll)
Parameters

- chars
  Type: array<System::Char>^
  The character array containing the set of characters to encode.
- index
  Type: System::Int32
  The index of the first character to encode.
- count
  Type: System::Int32
  The number of characters to encode.
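Taken together, the parameter list above corresponds to a C++/CLI declaration along the following lines (a sketch reconstructed from this page, not the authoritative syntax listing):

```cpp
public:
   virtual int GetByteCount(
      array<Char>^ chars,   // characters to measure
      int index,            // index of the first character to encode
      int count             // number of characters to encode
   ) override;
```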
Exceptions

| Exception | Condition |
|---|---|
| ArgumentNullException | chars is null. |
| ArgumentOutOfRangeException | index or count is less than zero. -or- index and count do not denote a valid range in chars. -or- The resulting number of bytes is greater than the maximum number that can be returned as an integer. |
| EncoderFallbackException | A fallback occurred (see Character Encoding in the .NET Framework for a complete explanation) -and- EncoderFallback is set to EncoderExceptionFallback. |
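As a sketch of the EncoderFallbackException case: the parameterless ASCIIEncoding constructor uses a replacement fallback by default, so to observe the exception the application can request an ASCII encoding configured with EncoderExceptionFallback. The encoding name and character values below are illustrative.

```cpp
using namespace System;
using namespace System::Text;

int main()
{
   // Request an ASCII encoding that throws on unmappable characters
   // instead of substituting a replacement character.
   Encoding^ strictAscii = Encoding::GetEncoding( "us-ascii",
      gcnew EncoderExceptionFallback, gcnew DecoderExceptionFallback );

   array<Char>^ chars = { L'A', L'\u03a9', L'B' };   // Omega is not ASCII.

   try
   {
      int byteCount = strictAscii->GetByteCount( chars, 0, chars->Length );
      Console::WriteLine( "{0} bytes needed to encode characters.", byteCount );
   }
   catch ( EncoderFallbackException^ e )
   {
      Console::WriteLine( "A fallback occurred: {0}", e->Message );
   }
}
```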
Remarks

To calculate the exact array size that GetBytes requires to store the resulting bytes, the application uses GetByteCount. To calculate the maximum array size, it uses GetMaxByteCount. The GetByteCount method generally allows allocation of less memory, while the GetMaxByteCount method generally executes faster.
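For example, the exact-allocation pattern might look like the following sketch (variable names are illustrative), which sizes the output buffer with GetByteCount before calling GetBytes and contrasts the result with the upper bound reported by GetMaxByteCount:

```cpp
using namespace System;
using namespace System::Text;

int main()
{
   array<Char>^ chars = { L'A', L'S', L'C', L'I', L'I' };
   ASCIIEncoding^ ascii = gcnew ASCIIEncoding;

   // Exact size: GetByteCount examines the characters and reports precisely
   // how many bytes GetBytes will produce for this range.
   int exact = ascii->GetByteCount( chars, 0, chars->Length );
   array<Byte>^ bytes = gcnew array<Byte>( exact );
   ascii->GetBytes( chars, 0, chars->Length, bytes, 0 );

   // Upper bound: GetMaxByteCount only looks at the character count,
   // so it may report more bytes than are actually needed.
   int maximum = ascii->GetMaxByteCount( chars->Length );

   Console::WriteLine( "Exact: {0}  Maximum: {1}", exact, maximum );
}
```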
Examples

The following example demonstrates how to use the GetByteCount method to return the number of bytes required to encode a range of characters from an array of Unicode characters with ASCIIEncoding.
```cpp
using namespace System;
using namespace System::Text;

int main()
{
   // Unicode characters: Pi, Sigma, Phi, Omega.
   array<Char>^ chars = { L'\u03a0', L'\u03a3', L'\u03a6', L'\u03a9' };

   ASCIIEncoding^ ascii = gcnew ASCIIEncoding;

   // Count the bytes needed to encode the two characters starting at index 1.
   int byteCount = ascii->GetByteCount( chars, 1, 2 );
   Console::WriteLine( " {0} bytes needed to encode characters.", byteCount.ToString() );
}
```
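Because ASCIIEncoding produces one byte per character, replacing the non-ASCII characters through its default encoder fallback, the call above reports 2 bytes for the two-character range.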
Version Information

Universal Windows Platform
Available since 10

.NET Framework
Available since 1.1