This documentation is archived and is not being maintained.

UnicodeEncoding.GetByteCount Method (Char[], Int32, Int32)

Calculates the number of bytes produced by encoding a set of characters from the specified character array.

Namespace: System.Text
Assembly: mscorlib (in mscorlib.dll)

C#

public override int GetByteCount (
	char[] chars,
	int index,
	int count
)

J#

public int GetByteCount (
	char[] chars, 
	int index, 
	int count
)

JScript

public override function GetByteCount (
	chars : char[], 
	index : int, 
	count : int
) : int

Parameters

chars

The character array containing the set of characters to encode.

index

The index of the first character to encode.

count

The number of characters to encode.

Return Value

The number of bytes produced by encoding the specified characters.

Exceptions

ArgumentNullException

chars is a null reference (Nothing in Visual Basic).

ArgumentOutOfRangeException

index or count is less than zero.

-or-

index and count do not denote a valid range in chars.

-or-

The resulting number of bytes is greater than the maximum number that can be returned as an integer.

ArgumentException

Error detection is enabled, and chars contains an invalid sequence of characters.

EncoderFallbackException

A fallback occurred (see Understanding Encodings for a complete explanation)

-and-

EncoderFallback is set to EncoderExceptionFallback.

To calculate the exact array size required by GetBytes to store the resulting bytes, the application uses GetByteCount. To calculate the maximum array size, the application should use GetMaxByteCount. The GetByteCount method generally allows the application to allocate less memory, while the GetMaxByteCount method generally executes faster.
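As an illustrative sketch of this trade-off (assuming the default UnicodeEncoding, which encodes each Basic Multilingual Plane character as two bytes):

```csharp
using System;
using System.Text;

class ByteCountVersusMax {
    public static void Main() {
        char[] chars = { 'a', 'b', 'c' };
        UnicodeEncoding unicode = new UnicodeEncoding();

        // Exact size: 3 BMP characters * 2 bytes each = 6 bytes.
        int exact = unicode.GetByteCount(chars, 0, chars.Length);

        // Worst-case size for any 3 characters; at least as large as the
        // exact count, and possibly larger (the exact value depends on the
        // framework version).
        int max = unicode.GetMaxByteCount(chars.Length);

        Console.WriteLine("Exact: {0}, Maximum: {1}", exact, max);

        // Allocate the exact buffer and encode into it.
        byte[] bytes = new byte[exact];
        unicode.GetBytes(chars, 0, chars.Length, bytes, 0);
    }
}
```

Allocating with GetByteCount requires one extra pass over the characters; allocating with GetMaxByteCount avoids that pass at the cost of a possibly larger buffer.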

With error detection, an invalid sequence causes this method to throw an ArgumentException. Without error detection, invalid sequences are ignored, and no exception is thrown.
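For illustration, a sketch of the difference (the three-argument UnicodeEncoding constructor enables error detection through its throwOnInvalidBytes parameter; the lone high surrogate U+D800 below is an invalid UTF-16 sequence):

```csharp
using System;
using System.Text;

class ErrorDetectionExample {
    public static void Main() {
        // A high surrogate with no matching low surrogate is invalid UTF-16.
        char[] invalid = { 'a', '\ud800', 'b' };

        // Without error detection (the default), no exception is thrown;
        // the invalid character is handled by the fallback mechanism.
        UnicodeEncoding lenient = new UnicodeEncoding();
        Console.WriteLine(lenient.GetByteCount(invalid, 0, invalid.Length));

        // With error detection enabled, the invalid sequence throws.
        UnicodeEncoding strict = new UnicodeEncoding(false, true, true);
        try {
            strict.GetByteCount(invalid, 0, invalid.Length);
        }
        catch (ArgumentException e) {
            Console.WriteLine("Invalid sequence detected: {0}", e.GetType().Name);
        }
    }
}
```

Note that EncoderFallbackException derives from ArgumentException, so the catch clause above handles the exception-fallback case as well.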

Note:

To ensure that the encoded bytes are decoded properly, the application should prefix encoded bytes with a preamble.
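As a sketch, the preamble can be obtained from GetPreamble and written before the encoded bytes (the default UnicodeEncoding is little-endian with a byte order mark, so its preamble is the two bytes FF FE):

```csharp
using System;
using System.Text;

class PreambleExample {
    public static void Main() {
        UnicodeEncoding unicode = new UnicodeEncoding();
        char[] chars = { 'h', 'i' };

        byte[] preamble = unicode.GetPreamble(); // FF FE for little-endian with BOM
        byte[] encoded = new byte[unicode.GetByteCount(chars, 0, chars.Length)];
        unicode.GetBytes(chars, 0, chars.Length, encoded, 0);

        // Prefix the preamble so a decoder can detect the byte order.
        byte[] output = new byte[preamble.Length + encoded.Length];
        Buffer.BlockCopy(preamble, 0, output, 0, preamble.Length);
        Buffer.BlockCopy(encoded, 0, output, preamble.Length, encoded.Length);

        Console.WriteLine(BitConverter.ToString(output));
    }
}
```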

The following code example demonstrates how to use the GetByteCount method to return the number of bytes required to encode an array of Unicode characters using UnicodeEncoding.

using System;
using System.Text;

class UnicodeEncodingExample {
    public static void Main() {
        // Unicode characters.
        Char[] chars = new Char[] {
            '\u0023', // #
            '\u0025', // %
            '\u03a0', // Pi
            '\u03a3'  // Sigma
        };

        UnicodeEncoding unicode = new UnicodeEncoding();
        int byteCount = unicode.GetByteCount(chars, 1, 2);
        Console.WriteLine(
            "{0} bytes needed to encode characters.", byteCount
        );
    }
}

import System.*;
import System.Text.*;

class UnicodeEncodingExample
{
    public static void main(String[] args)
    {
        // Unicode characters.
        char chars[] = new char[] {
            '\u0023', // #
            '\u0025', // %
            '\u03a0', // Pi
            '\u03a3'  // Sigma
        };
        UnicodeEncoding unicode = new UnicodeEncoding();
        int byteCount = unicode.GetByteCount(chars, 1, 2);
        Console.WriteLine("{0} bytes needed to encode characters.", 
            String.valueOf(byteCount));
    } //main
} //UnicodeEncodingExample

Windows 98, Windows Server 2000 SP4, Windows Millennium Edition, Windows Server 2003, Windows XP Media Center Edition, Windows XP Professional x64 Edition, Windows XP SP2, Windows XP Starter Edition

The Microsoft .NET Framework 3.0 is supported on Windows Vista, Microsoft Windows XP SP2, and Windows Server 2003 SP1.

.NET Framework

Supported in: 3.0, 2.0, 1.1, 1.0

.NET Compact Framework

Supported in: 2.0, 1.0

XNA Framework

Supported in: 1.0