ASCIIEncoding::GetChars Method (array<Byte>^, Int32, Int32, array<Char>^, Int32)
Decodes a sequence of bytes from the specified byte array into the specified character array.
Assembly: mscorlib (in mscorlib.dll)
public: virtual int GetChars( array<unsigned char>^ bytes, int byteIndex, int byteCount, array<wchar_t>^ chars, int charIndex ) override
Parameters

- bytes
  Type: array<System::Byte>^
  The byte array containing the sequence of bytes to decode.
- byteIndex
  Type: System::Int32
  The index of the first byte to decode.
- byteCount
  Type: System::Int32
  The number of bytes to decode.
- chars
  Type: array<System::Char>^
  The character array to contain the resulting set of characters.
- charIndex
  Type: System::Int32
  The index at which to start writing the resulting set of characters.
| Exception | Condition |
|---|---|
| ArgumentNullException | bytes is null. -or- chars is null. |
| ArgumentOutOfRangeException | byteIndex, byteCount, or charIndex is less than zero. -or- byteIndex and byteCount do not denote a valid range in bytes. -or- charIndex is not a valid index in chars. |
| ArgumentException | chars does not have enough capacity from charIndex to the end of the array to accommodate the resulting characters. |
| DecoderFallbackException | A fallback occurred (see Character Encoding in the .NET Framework for a complete explanation) -and- DecoderFallback is set to DecoderExceptionFallback. |
To calculate the exact array size required by GetChars to store the resulting characters, the application uses GetCharCount. To calculate the maximum array size, the application should use GetMaxCharCount. The GetCharCount method generally allows allocation of less memory, while the GetMaxCharCount method generally executes faster.
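The trade-off between the two sizing methods can be sketched as follows (a minimal C++/CLI example; the variable names are illustrative):

```cpp
using namespace System;
using namespace System::Text;

int main()
{
    array<Byte>^ bytes = { 72, 101, 108, 108, 111 };   // "Hello"
    ASCIIEncoding^ ascii = gcnew ASCIIEncoding;

    // Exact size: GetCharCount scans the input, so it allocates no slack.
    int exact = ascii->GetCharCount( bytes, 0, bytes->Length );
    array<Char>^ tight = gcnew array<Char>(exact);
    ascii->GetChars( bytes, 0, bytes->Length, tight, 0 );

    // Worst-case size: GetMaxCharCount is computed from byteCount alone,
    // so it is faster but may over-allocate.
    int worstCase = ascii->GetMaxCharCount( bytes->Length );
    array<Char>^ roomy = gcnew array<Char>(worstCase);
    int written = ascii->GetChars( bytes, 0, bytes->Length, roomy, 0 );

    Console::WriteLine( "exact={0} worstCase={1} written={2}",
                        exact, worstCase, written );
}
```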
Data to be converted, such as data read from a stream, can be available only in sequential blocks. In this case, or if the amount of data is so large that it needs to be divided into smaller blocks, the application should use the Decoder or the Encoder provided by the GetDecoder method or the GetEncoder method, respectively.
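A Decoder maintains state between calls, which is what makes block-at-a-time conversion safe. A minimal C++/CLI sketch (the block contents are illustrative):

```cpp
using namespace System;
using namespace System::Text;

int main()
{
    // Simulate data arriving in two sequential blocks.
    array<Byte>^ block1 = { 72, 101, 108 };   // "Hel"
    array<Byte>^ block2 = { 108, 111 };       // "lo"

    // The Decoder returned by GetDecoder keeps state between calls,
    // so input split across block boundaries is decoded correctly.
    Decoder^ decoder = (gcnew ASCIIEncoding)->GetDecoder();

    array<Char>^ chars = gcnew array<Char>(10);
    int count = decoder->GetChars( block1, 0, block1->Length, chars, 0 );
    count += decoder->GetChars( block2, 0, block2->Length, chars, count );

    Console::WriteLine( gcnew String( chars, 0, count ) );  // "Hello"
}
```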
ASCIIEncoding does not provide error detection. Any byte greater than hexadecimal 0x7F is decoded as the Unicode question mark ("?").
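The replacement behavior can be demonstrated directly (a short C++/CLI sketch; the byte values are illustrative):

```cpp
using namespace System;
using namespace System::Text;

int main()
{
    // 0xC9 is outside the ASCII range, so it decodes to '?'.
    array<Byte>^ bytes = { 65, 0xC9, 66 };   // 'A', non-ASCII byte, 'B'
    ASCIIEncoding^ ascii = gcnew ASCIIEncoding;

    array<Char>^ chars = gcnew array<Char>(ascii->GetCharCount( bytes, 0, bytes->Length ));
    ascii->GetChars( bytes, 0, bytes->Length, chars, 0 );

    Console::WriteLine( gcnew String( chars ) );  // "A?B"
}
```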
| Caution |
|---|
| For security reasons, it is recommended that your application use UTF8Encoding, UnicodeEncoding, or UTF32Encoding and enable error detection. |
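Error detection can be enabled through the UTF8Encoding constructor, as in this C++/CLI sketch (the input bytes are illustrative):

```cpp
using namespace System;
using namespace System::Text;

int main()
{
    // Passing true for throwOnInvalidBytes enables error detection:
    // invalid input raises a DecoderFallbackException instead of being
    // silently replaced with a fallback character.
    UTF8Encoding^ utf8 = gcnew UTF8Encoding( false, true );

    array<Byte>^ invalid = { 0xC3 };   // truncated UTF-8 sequence
    try
    {
        utf8->GetChars( invalid, 0, invalid->Length );
    }
    catch ( DecoderFallbackException^ e )
    {
        Console::WriteLine( "Invalid bytes detected: {0}", e->Message );
    }
}
```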
The following example demonstrates how to decode a range of elements from a byte array and store the result in a set of elements in a Unicode character array.
using namespace System;
using namespace System::Text;
using namespace System::Collections;

int main()
{
    array<Char>^ chars;
    array<Byte>^ bytes = { 65, 83, 67, 73, 73, 32, 69, 110, 99, 111, 100,
                           105, 110, 103, 32, 69, 120, 97, 109, 112, 108, 101 };
    ASCIIEncoding^ ascii = gcnew ASCIIEncoding;

    int charCount = ascii->GetCharCount( bytes, 6, 8 );
    chars = gcnew array<Char>(charCount);
    int charsDecodedCount = ascii->GetChars( bytes, 6, 8, chars, 0 );

    Console::WriteLine( "{0} characters used to decode bytes.", charsDecodedCount );
    Console::Write( "Decoded chars: " );
    IEnumerator^ myEnum = chars->GetEnumerator();
    while ( myEnum->MoveNext() )
    {
        Char c = safe_cast<Char>(myEnum->Current);
        Console::Write( "[{0}]", c.ToString() );
    }
    Console::WriteLine();
}
.NET Framework
Available since 1.1
