This documentation is archived and is not being maintained.

Decimal Implicit Conversion (Char to Decimal)

Converts a Unicode character to a Decimal.

Namespace:  System
Assembly:  mscorlib (in mscorlib.dll)

static implicit operator Decimal (
	wchar_t value
)

Parameters

value
Type: System::Char
The Unicode character to convert.

Return Value

Type: System::Decimal
The numeric value of the converted Unicode character.
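The returned Decimal is simply the character's UTF-16 code unit value (for the Basic Multilingual Plane, its Unicode code point). A minimal sketch of this mapping in Python, with a hypothetical helper name:

```python
# Hypothetical illustration of the Char-to-Decimal conversion semantics:
# the resulting Decimal equals the character's code unit value.
from decimal import Decimal

def decimal_from_char(ch):
    # ord() yields the code point; .NET's conversion yields the same
    # number for any Char (a single UTF-16 code unit).
    return Decimal(ord(ch))

for ch in ('\0', ' ', '*', 'A', 'a', '{', 'Æ'):
    print(repr(ch), decimal_from_char(ch))
```

This mirrors the table in the example output below: 'A' converts to 65, 'Æ' to 198, and so on.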

The following code example converts Char values (Unicode characters) to Decimal numbers using the Char-to-Decimal conversion. The conversion is implicit in C#; in Visual Basic and C++, it must be called explicitly as op_Implicit, because an ordinary assignment to a Decimal in those languages is compiled through other members (in C++, a Decimal constructor), as the sample's comments note.


// Example of the op_Implicit conversion from __wchar_t to Decimal.
using namespace System;
#define formatter "{0,10}{1,15}{2,10:X8}{3,9:X8}{4,9:X8}{5,9:X8}"

// Convert the __wchar_t argument and display the Decimal value.
void DecimalFromChar( __wchar_t argument )
{
   Decimal decValue;
   array<Int32>^bits;

   // The compiler invokes a constructor in the Managed Extensions 
   // for C++ unless op_Implicit is explicitly called.
   decValue = argument;

   // Display the Decimal and its binary representation.
   bits = Decimal::GetBits( decValue );
   Console::WriteLine( formatter, argument, decValue, bits[ 3 ], bits[ 2 ], bits[ 1 ], bits[ 0 ] );
}

int main()
{
   Console::WriteLine( "This example of the op_Implicit conversion from "
   "__wchar_t to Decimal \ngenerates the following output. "
   "It displays the Decimal value and \nits binary "
   "representation.\n" );
   Console::WriteLine( formatter, "__wchar_t", "Decimal value", "bits[3]", "bits[2]", "bits[1]", "bits[0]" );
   Console::WriteLine( formatter, "---------", "-------------", "-------", "-------", "-------", "-------" );

   // Convert __wchar_t values and display the results.
   DecimalFromChar( L'\0' );
   DecimalFromChar( L' ' );
   DecimalFromChar( L'*' );
   DecimalFromChar( L'A' );
   DecimalFromChar( L'a' );
   DecimalFromChar( L'{' );
   DecimalFromChar( L'Æ' );
}

/*
This example of the op_Implicit conversion from __wchar_t to Decimal
generates the following output. It displays the Decimal value and
its binary representation.

 __wchar_t  Decimal value   bits[3]  bits[2]  bits[1]  bits[0]
 ---------  -------------   -------  -------  -------  -------
                        0  00000000 00000000 00000000 00000000
                       32  00000000 00000000 00000000 00000020
         *             42  00000000 00000000 00000000 0000002A
         A             65  00000000 00000000 00000000 00000041
         a             97  00000000 00000000 00000000 00000061
         {            123  00000000 00000000 00000000 0000007B
         Æ            198  00000000 00000000 00000000 000000C6
*/
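The four Int32 values printed for each character follow the Decimal::GetBits layout: bits[0], bits[1], and bits[2] hold the low, middle, and high 32 bits of the 96-bit integer, and bits[3] holds the scale (bits 16-23) and sign (bit 31). Because each converted character fits in 32 bits with scale 0, only bits[0] is nonzero. A sketch of that layout in Python (function name hypothetical):

```python
# Emulate the Decimal::GetBits layout: bits[0..2] are the 96-bit integer
# from low to high 32-bit word; bits[3] packs the scale and sign flags.
def get_bits(value, scale=0, negative=False):
    lo = value & 0xFFFFFFFF           # bits[0]: low 32 bits
    mid = (value >> 32) & 0xFFFFFFFF  # bits[1]: middle 32 bits
    hi = (value >> 64) & 0xFFFFFFFF   # bits[2]: high 32 bits
    flags = (scale << 16) | (0x80000000 if negative else 0)  # bits[3]
    return [lo, mid, hi, flags]

print(get_bits(ord('Æ')))  # [198, 0, 0, 0]
```

For 'Æ' this reproduces the last row of the output above: 000000C6 (hex 198) in bits[0], zeros elsewhere.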


.NET Framework

Supported in: 4, 3.5, 3.0, 2.0, 1.1, 1.0

.NET Framework Client Profile

Supported in: 4, 3.5 SP1

Portable Class Library

Supported in: Portable Class Library

Platforms

Windows 7, Windows Vista SP1 or later, Windows XP SP3, Windows XP SP2 x64 Edition, Windows Server 2008 (Server Core not supported), Windows Server 2008 R2 (Server Core supported with SP1 or later), Windows Server 2003 SP2

The .NET Framework does not support all versions of every platform. For a list of the supported versions, see .NET Framework System Requirements.