
Decimal.Implicit Operator

Converts a Unicode character to a Decimal.

Namespace:  System
Assembly:  mscorlib (in mscorlib.dll)

public static implicit operator decimal (
	char value
)

Parameters

value
Type: System.Char

A Unicode character.

Return Value

Type: System.Decimal
A Decimal that represents the converted Unicode character.

Examples

The following code example converts Char values (Unicode characters) to Decimal numbers using the Char to Decimal conversion. This conversion is implicit in C#, but Visual Basic and C++ do not apply it automatically; in those languages the conversion must be invoked explicitly, for example through the op_Implicit operator or another conversion method.

// Example of the implicit conversion from char to decimal. 
using System;

class DecimalFromCharDemo
{
    const string formatter = 
        "{0,6}{1,15}{2,10:X8}{3,9:X8}{4,9:X8}{5,9:X8}";

    // Convert the char argument and display the decimal value. 
    public static void DecimalFromChar( char argument )
    {
        decimal decValue;
        int[ ]  bits;

        // Display the decimal and its binary representation.
        decValue = argument;
        bits = decimal.GetBits( decValue );

        Console.WriteLine( formatter, argument, decValue, 
            bits[ 3 ], bits[ 2 ], bits[ 1 ], bits[ 0 ] );
    }

    public static void Main( )
    {
        Console.WriteLine( 
            "This example of the implicit conversion from char to " +
            "decimal generates the \nfollowing output. It displays " +
            "the decimal value and its binary representation.\n" );
        Console.WriteLine( formatter, "char", 
            "decimal value", "bits[3]", "bits[2]", 
            "bits[1]", "bits[0]" );
        Console.WriteLine( formatter, "----", 
            "-------------", "-------", "-------", 
            "-------", "-------" );

        // Convert char values and display the results.
        DecimalFromChar( '\0' );
        DecimalFromChar( ' ' );
        DecimalFromChar( '*' );
        DecimalFromChar( 'A' );
        DecimalFromChar( 'a' );
        DecimalFromChar( '{' );
        DecimalFromChar( 'Æ' );
    }
}

/*
This example of the implicit conversion from char to decimal generates the
following output. It displays the decimal value and its binary representation.

  char  decimal value   bits[3]  bits[2]  bits[1]  bits[0]
  ----  -------------   -------  -------  -------  -------
                    0  00000000 00000000 00000000 00000000
                   32  00000000 00000000 00000000 00000020
     *             42  00000000 00000000 00000000 0000002A
     A             65  00000000 00000000 00000000 00000041
     a             97  00000000 00000000 00000000 00000061
     {            123  00000000 00000000 00000000 0000007B
     Æ            198  00000000 00000000 00000000 000000C6
*/
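At its simplest, the conversion is just an assignment. The following minimal sketch (not part of the original example) shows the widening Char-to-Decimal conversion and the round trip back to Char, which requires an explicit cast:

```csharp
using System;

class MinimalDemo
{
    static void Main( )
    {
        char ch = 'A';
        decimal dec = ch;        // implicit: char widens to decimal (65)
        char back = (char)dec;   // narrowing back requires an explicit cast

        Console.WriteLine( dec );
        Console.WriteLine( back );
    }
}
```

Because the conversion is widening and cannot lose information or throw, the compiler applies it silently wherever a Decimal is expected; the reverse direction can overflow, so it is exposed only as an explicit operator.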

Platforms

Windows 7, Windows Vista, Windows XP SP2, Windows XP Media Center Edition, Windows XP Professional x64 Edition, Windows XP Starter Edition, Windows Server 2008 R2, Windows Server 2008, Windows Server 2003, Windows Server 2000 SP4, Windows Millennium Edition, Windows 98, Windows CE, Windows Mobile for Smartphone, Windows Mobile for Pocket PC, Xbox 360, Zune

The .NET Framework and .NET Compact Framework do not support all versions of every platform. For a list of the supported versions, see .NET Framework System Requirements.

Version Information

.NET Framework

Supported in: 3.5, 3.0, 2.0, 1.1, 1.0

.NET Compact Framework

Supported in: 3.5, 2.0, 1.0

XNA Framework

Supported in: 3.0, 2.0, 1.0
