UTF7Encoding::GetBytes Method (array<Char>^, Int32, Int32, array<Byte>^, Int32)
Encodes a set of characters from the specified character array into the specified byte array.
Namespace: System::Text
Assembly: mscorlib (in mscorlib.dll)
public: virtual int GetBytes( array<wchar_t>^ chars, int charIndex, int charCount, array<unsigned char>^ bytes, int byteIndex ) override
Parameters
- chars
  Type: array<System::Char>^
  The character array containing the set of characters to encode.
- charIndex
  Type: System::Int32
  The index of the first character to encode.
- charCount
  Type: System::Int32
  The number of characters to encode.
- bytes
  Type: array<System::Byte>^
  The byte array to contain the resulting sequence of bytes.
- byteIndex
  Type: System::Int32
  The index at which to start writing the resulting sequence of bytes.
| Exception | Condition |
|---|---|
| ArgumentNullException | chars is null (Nothing). -or- bytes is null (Nothing). |
| ArgumentOutOfRangeException | charIndex or charCount or byteIndex is less than zero. -or- charIndex and charCount do not denote a valid range in chars. -or- byteIndex is not a valid index in bytes. |
| ArgumentException | bytes does not have enough capacity from byteIndex to the end of the array to accommodate the resulting bytes. |
| EncoderFallbackException | A fallback occurred (see Character Encoding in the .NET Framework for a complete explanation) -and- EncoderFallback is set to EncoderExceptionFallback. |
To calculate the exact array size required by GetBytes to store the resulting bytes, the application uses GetByteCount. To calculate the maximum array size, the application should use GetMaxByteCount. The GetByteCount method generally allows allocation of less memory, while the GetMaxByteCount method generally executes faster.
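The trade-off between the two sizing methods can be sketched as follows (a C++/CLI sketch; the sample characters are illustrative only):

```cpp
// C++/CLI sketch: sizing a buffer with GetMaxByteCount versus GetByteCount.
// GetMaxByteCount depends only on the character count, so it runs fast but
// may over-allocate; GetByteCount scans the actual characters for an exact size.
using namespace System;
using namespace System::Text;

int main()
{
    array<Char>^ chars = { L'\u03a0', L'\u03a3' };
    UTF7Encoding^ utf7 = gcnew UTF7Encoding;

    // Worst-case size: no scan of the data required.
    int maxCount = utf7->GetMaxByteCount( chars->Length );

    // Exact size: requires examining every character.
    int exactCount = utf7->GetByteCount( chars, 0, chars->Length );

    array<Byte>^ bytes = gcnew array<Byte>(exactCount);
    utf7->GetBytes( chars, 0, chars->Length, bytes, 0 );

    Console::WriteLine( "Max: {0}, exact: {1}", maxCount, exactCount );
}
```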
Data to be converted, such as data read from a stream, might be available only in sequential blocks. In this case, or if the amount of data is so large that it needs to be divided into smaller blocks, the application should use the Decoder or the Encoder provided by the GetDecoder method or the GetEncoder method, respectively.
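The block-by-block scenario might be sketched as follows, using the Encoder returned by GetEncoder to carry state across calls (a C++/CLI sketch; the two-character chunk size is illustrative only):

```cpp
// C++/CLI sketch: encoding data that arrives in sequential blocks. The Encoder
// object preserves encoding state between calls, which a direct call to
// GetBytes on the encoding cannot do.
using namespace System;
using namespace System::Text;

int main()
{
    array<Char>^ chars = { L'\u03a0', L'\u03a3', L'\u03a6', L'\u03a9' };
    Encoder^ encoder = (gcnew UTF7Encoding)->GetEncoder();
    array<Byte>^ buffer = gcnew array<Byte>(16);

    // Feed the characters two at a time; flush is true only on the last call,
    // so any pending state is emitted at the end of the data.
    for ( int i = 0; i < chars->Length; i += 2 )
    {
        bool flush = (i + 2 >= chars->Length);
        int written = encoder->GetBytes( chars, i, 2, buffer, 0, flush );
        Console::WriteLine( "Chunk at {0}: {1} bytes", i, written );
    }
}
```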
| Note |
|---|
| UTF7Encoding does not provide error detection. Invalid characters are encoded as a modified base 64 character. For security reasons, it is recommended that your applications use UTF8Encoding, UnicodeEncoding, or UTF32Encoding and enable error detection. |
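Following the note's recommendation, error detection can be enabled through the documented encoding constructors, as in this C++/CLI sketch:

```cpp
// C++/CLI sketch: encodings with error detection enabled, as the note
// recommends in place of UTF7Encoding.
using namespace System;
using namespace System::Text;

int main()
{
    // Second argument (throwOnInvalidBytes): throw on invalid byte sequences.
    UTF8Encoding^ utf8 = gcnew UTF8Encoding( false, true );

    // Third argument (throwOnInvalidCharacters): throw on invalid characters.
    UnicodeEncoding^ utf16 = gcnew UnicodeEncoding( false, true, true );

    array<Char>^ chars = { L'\u03a0', L'\u03a3' };
    array<Byte>^ bytes =
        gcnew array<Byte>(utf8->GetByteCount( chars, 0, chars->Length ));
    utf8->GetBytes( chars, 0, chars->Length, bytes, 0 );
    Console::WriteLine( "{0} UTF-8 bytes", bytes->Length );
}
```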
The following code example demonstrates how to use the GetBytes method to encode a range of characters from a character array and store the encoded bytes in a range of elements in a byte array.
```cpp
using namespace System;
using namespace System::Text;
using namespace System::Collections;

int main()
{
    array<Byte>^ bytes;

    // Unicode characters: Pi, Sigma, Phi, Omega.
    array<Char>^ chars = { L'\u03a0', L'\u03a3', L'\u03a6', L'\u03a9' };

    UTF7Encoding^ utf7 = gcnew UTF7Encoding;

    int byteCount = utf7->GetByteCount( chars, 1, 2 );
    bytes = gcnew array<Byte>(byteCount);
    int bytesEncodedCount = utf7->GetBytes( chars, 1, 2, bytes, 0 );

    Console::WriteLine( "{0} bytes used to encode characters.", bytesEncodedCount );

    Console::Write( "Encoded bytes: " );
    IEnumerator^ myEnum = bytes->GetEnumerator();
    while ( myEnum->MoveNext() )
    {
        Byte b = safe_cast<Byte>(myEnum->Current);
        Console::Write( "[{0}]", b );
    }
    Console::WriteLine();
}
```
Universal Windows Platform
Available since 10
.NET Framework
Available since 1.1
