This documentation is archived and is not being maintained.

# Decimal Implicit Conversion (Char to Decimal)

Visual Studio 2010

Converts a Unicode character to a Decimal.

Namespace:  System
Assembly:  mscorlib (in mscorlib.dll)

## Syntax

```
public static implicit operator decimal (
	char value
)
```

#### Parameters

value
Type: System.Char
The Unicode character to convert.

#### Return Value

Type: System.Decimal
The converted Unicode character.
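Because the operator is implicit in C#, no cast is required when assigning a Char to a Decimal; the resulting Decimal holds the character's Unicode code point. A minimal sketch:

```
using System;

class ImplicitCharToDecimal
{
    static void Main()
    {
        char letter = 'A';        // Unicode code point U+0041 (65)
        decimal value = letter;   // implicit char-to-decimal conversion
        Console.WriteLine(value); // prints 65
    }
}
```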

## Examples

The following code example converts Char values (Unicode characters) to Decimal numbers using the Char to Decimal conversion. The conversion is implicit in C#, but Visual Basic and C++ do not perform it implicitly; in those languages it must be invoked explicitly through the op_Implicit operator.

```
// Example of the implicit conversion from char to decimal.
using System;

class DecimalFromCharDemo
{
    const string formatter =
        "{0,6}{1,15}{2,10:X8}{3,9:X8}{4,9:X8}{5,9:X8}";

    // Convert the char argument and display the decimal value.
    public static void DecimalFromChar( char argument )
    {
        decimal decValue;
        int[ ]  bits;

        // Display the decimal and its binary representation.
        decValue = argument;
        bits = decimal.GetBits( decValue );

        Console.WriteLine( formatter, argument, decValue,
            bits[ 3 ], bits[ 2 ], bits[ 1 ], bits[ 0 ] );
    }

    public static void Main( )
    {
        Console.WriteLine(
            "This example of the implicit conversion from char to " +
            "decimal generates the \nfollowing output. It displays " +
            "the decimal value and its binary representation.\n" );
        Console.WriteLine( formatter, "char",
            "decimal value", "bits[3]", "bits[2]",
            "bits[1]", "bits[0]" );
        Console.WriteLine( formatter, "----",
            "-------------", "-------", "-------",
            "-------", "-------" );

        // Convert char values and display the results.
        DecimalFromChar( '\0' );
        DecimalFromChar( ' ' );
        DecimalFromChar( '*' );
        DecimalFromChar( 'A' );
        DecimalFromChar( 'a' );
        DecimalFromChar( '{' );
        DecimalFromChar( 'Æ' );
    }
}

/*
This example of the implicit conversion from char to decimal generates the
following output. It displays the decimal value and its binary representation.

  char  decimal value   bits[3]  bits[2]  bits[1]  bits[0]
  ----  -------------   -------  -------  -------  -------
                    0  00000000 00000000 00000000 00000000
                   32  00000000 00000000 00000000 00000020
     *             42  00000000 00000000 00000000 0000002A
     A             65  00000000 00000000 00000000 00000041
     a             97  00000000 00000000 00000000 00000061
     {            123  00000000 00000000 00000000 0000007B
     Æ            198  00000000 00000000 00000000 000000C6
*/
```
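The example relies on Decimal.GetBits, which returns four 32-bit integers: elements 0 through 2 hold the low, middle, and high parts of the 96-bit integer magnitude, and element 3 holds the sign and scale flags. For a Decimal produced from a Char, the value fits in the low 32 bits, so only bits[0] is nonzero. A minimal sketch of reading that element:

```
using System;

class GetBitsSketch
{
    static void Main()
    {
        // '*' is code point 42 (0x2A); the implicit conversion
        // stores it in the low word of the decimal's bit layout.
        int[] bits = decimal.GetBits( (decimal)'*' );
        Console.WriteLine( "{0:X8}", bits[ 0 ] ); // prints 0000002A
    }
}
```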

## Version Information

#### .NET Framework

Supported in: 4, 3.5, 3.0, 2.0, 1.1, 1.0

#### .NET Framework Client Profile

Supported in: 4, 3.5 SP1

#### Portable Class Library

Supported in: Portable Class Library

## Platforms

Windows 7, Windows Vista SP1 or later, Windows XP SP3, Windows XP SP2 x64 Edition, Windows Server 2008 (Server Core not supported), Windows Server 2008 R2 (Server Core supported with SP1 or later), Windows Server 2003 SP2

The .NET Framework does not support all versions of every platform. For a list of the supported versions, see .NET Framework System Requirements.