This documentation is archived and is not being maintained.

Decimal.op_Implicit Method (Char)

Converts a Unicode character to a Decimal.

Namespace: System
Assembly: mscorlib (in mscorlib.dll)

public static implicit operator decimal (
	char value
)

Parameters

value

A Unicode character.

Return Value

A Decimal that represents the converted Unicode character.
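The conversion simply uses the character's UTF-16 code unit as the numeric value. As a minimal sketch (class and variable names are illustrative, not part of the API):

```csharp
using System;

class ImplicitCharToDecimal
{
    static void Main( )
    {
        char c = 'A';            // 'A' is U+0041, code unit 65
        decimal d = c;           // implicit conversion; no cast needed in C#
        Console.WriteLine( d );  // prints 65
    }
}
```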

The following code example converts Char values (Unicode characters) to Decimal numbers using the Char-to-Decimal conversion. The conversion is implicit in C#; Visual Basic and C++ do not perform it implicitly, so in those languages the op_Implicit operator (or another conversion method) must be called explicitly.

// Example of the implicit conversion from char to decimal.
using System;

class DecimalFromCharDemo
{
    const string formatter = 
        "{0,6}{1,15}{2,10:X8}{3,9:X8}{4,9:X8}{5,9:X8}";

    // Convert the char argument and display the decimal value.
    public static void DecimalFromChar( char argument )
    {
        decimal decValue;
        int[ ]  bits;

        // Display the decimal and its binary representation.
        decValue = argument;
        bits = decimal.GetBits( decValue );

        Console.WriteLine( formatter, argument, decValue, 
            bits[ 3 ], bits[ 2 ], bits[ 1 ], bits[ 0 ] );
    }

    public static void Main( )
    {
        Console.WriteLine( 
            "This example of the implicit conversion from char to " +
            "decimal generates the \nfollowing output. It displays " +
            "the decimal value and its binary representation.\n" );
        Console.WriteLine( formatter, "char", 
            "decimal value", "bits[3]", "bits[2]", 
            "bits[1]", "bits[0]" );
        Console.WriteLine( formatter, "----", 
            "-------------", "-------", "-------", 
            "-------", "-------" );

        // Convert char values and display the results.
        DecimalFromChar( '\0' );
        DecimalFromChar( ' ' );
        DecimalFromChar( '*' );
        DecimalFromChar( 'A' );
        DecimalFromChar( 'a' );
        DecimalFromChar( '{' );
        DecimalFromChar( 'Æ' );
    }
}

/*
This example of the implicit conversion from char to decimal generates the
following output. It displays the decimal value and its binary representation.

  char  decimal value   bits[3]  bits[2]  bits[1]  bits[0]
  ----  -------------   -------  -------  -------  -------
                    0  00000000 00000000 00000000 00000000
                   32  00000000 00000000 00000000 00000020
     *             42  00000000 00000000 00000000 0000002A
     A             65  00000000 00000000 00000000 00000041
     a             97  00000000 00000000 00000000 00000061
     {            123  00000000 00000000 00000000 0000007B
     Æ            198  00000000 00000000 00000000 000000C6
*/
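The bits[3] column printed above holds the Decimal's sign and scale, while bits[0] through bits[2] hold the 96-bit integer magnitude. As a brief sketch of decoding bits[3] (the layout is documented for Decimal.GetBits; variable names here are illustrative):

```csharp
using System;

class GetBitsLayout
{
    static void Main( )
    {
        // 65m is the value the example above obtains from 'A'.
        int[] bits = decimal.GetBits( 65m );

        // Bit 31 of bits[3] is the sign; bits 16-23 are the scale
        // (the number of digits after the decimal point).
        bool negative = ( bits[ 3 ] & unchecked( (int)0x80000000 ) ) != 0;
        int  scale    = ( bits[ 3 ] >> 16 ) & 0xFF;

        Console.WriteLine( "{0} {1}", negative, scale );  // prints "False 0"
    }
}
```

For a value converted from a Char, the sign is always positive and the scale is always zero, which is why bits[3] is 00000000 in every row of the output.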

Windows 98, Windows 2000 SP4, Windows CE, Windows Millennium Edition, Windows Mobile for Pocket PC, Windows Mobile for Smartphone, Windows Server 2003, Windows XP Media Center Edition, Windows XP Professional x64 Edition, Windows XP SP2, Windows XP Starter Edition

The Microsoft .NET Framework 3.0 is supported on Windows Vista, Microsoft Windows XP SP2, and Windows Server 2003 SP1.

.NET Framework

Supported in: 3.0, 2.0, 1.1, 1.0

.NET Compact Framework

Supported in: 2.0, 1.0

XNA Framework

Supported in: 1.0