UTF7Encoding::GetByteCount Method (String^)
Calculates the number of bytes produced by encoding the characters in the specified String object.
Assembly: mscorlib (in mscorlib.dll)
Parameters
- s
  Type: System::String^
  The String object containing the set of characters to encode.
Exceptions
| Exception | Condition |
|---|---|
| ArgumentNullException | s is null (Nothing in Visual Basic). |
| ArgumentOutOfRangeException | The resulting number of bytes is greater than the maximum number that can be returned as an int. |
| EncoderFallbackException | A fallback occurred (see Character Encoding in the .NET Framework for a complete explanation) -and- EncoderFallback is set to EncoderExceptionFallback. |
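The ArgumentNullException case in the first row is easy to reproduce. The following is a minimal sketch, not part of the original sample; the cast is needed because GetByteCount also has an array<Char>^ overload, so a bare nullptr would be ambiguous:

```cpp
using namespace System;
using namespace System::Text;

int main()
{
   UTF7Encoding^ utf7 = gcnew UTF7Encoding;
   try
   {
      // The cast disambiguates between the String^ and array<Char>^ overloads.
      utf7->GetByteCount( static_cast<String^>(nullptr) );
   }
   catch ( ArgumentNullException^ e )
   {
      Console::WriteLine( "Caught: {0}", e->GetType() );
   }
}
```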
To calculate the exact array size that GetBytes requires to store the resulting bytes, the application uses GetByteCount. To calculate the maximum array size, the application should use GetMaxByteCount. The GetByteCount method generally allows allocation of less memory, while the GetMaxByteCount method generally executes faster.
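To illustrate the trade-off, the sketch below (with a made-up input string) calls both methods on the same text. GetByteCount reflects the actual UTF-7 expansion of the non-ASCII character, while GetMaxByteCount is computed from the character count alone and can overestimate:

```cpp
using namespace System;
using namespace System::Text;

int main()
{
   UTF7Encoding^ utf7 = gcnew UTF7Encoding;
   String^ s = L"pi: \u03a0";   // one non-ASCII character (Pi)

   int exact     = utf7->GetByteCount( s );            // exact count; slower
   int worstCase = utf7->GetMaxByteCount( s->Length ); // upper bound; faster

   Console::WriteLine( "Exact: {0} bytes; worst case: {1} bytes.",
      exact, worstCase );
}
```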
The following code example demonstrates how to use the GetByteCount method to return the number of bytes required to encode a character array.
```cpp
using namespace System;
using namespace System::Text;

int main()
{
   // Unicode characters: Pi, Sigma, Phi, Omega.
   array<Char>^ chars = { L'\u03a0', L'\u03a3', L'\u03a6', L'\u03a9' };

   UTF7Encoding^ utf7 = gcnew UTF7Encoding;

   // Count the bytes needed to encode two characters,
   // starting at index 1 (Sigma and Phi).
   int byteCount = utf7->GetByteCount( chars, 1, 2 );
   Console::WriteLine( "{0} bytes needed to encode characters.", byteCount );
}
```
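The sample above uses the (array<Char>^, Int32, Int32) overload. A minimal counterpart for the String^ overload that this page documents might look like the following (the string contents are illustrative):

```cpp
using namespace System;
using namespace System::Text;

int main()
{
   // Pi, Sigma, Phi, and Omega as a single string.
   String^ s = L"\u03a0\u03a3\u03a6\u03a9";

   UTF7Encoding^ utf7 = gcnew UTF7Encoding;

   // The String^ overload counts the bytes for the entire string.
   int byteCount = utf7->GetByteCount( s );
   Console::WriteLine( "{0} bytes needed to encode the string.", byteCount );
}
```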
Version Information
Universal Windows Platform
Available since 10
.NET Framework
Available since 2.0