Returns an integer representing the Unicode (UTF-16) code unit value of the character at the specified location.
Required. Any String object or literal.
Required. Zero-based index of the desired character. Valid values are between 0 and the length of the string minus 1.
The first character in a string is at index 0, the second is at index 1, and so forth.
If there is no character at the specified index, NaN is returned.
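A minimal sketch of these two rules (index 0 is the first character, and an out-of-range index yields NaN):

```javascript
var str = "ABC";
var first = str.charCodeAt(0);    // Index 0 is the first character; "A" is 65.
var missing = str.charCodeAt(10); // No character at index 10, so NaN is returned.
console.log(first);               // 65
console.log(isNaN(missing));      // true
```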
The following example illustrates the use of the charCodeAt method.
function checkCharCode(n) {
    var str = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"; // Initialize the test string.
    return str.charCodeAt(n - 1); // Return the Unicode value of the nth character.
}
Supported in the following document modes: Quirks, Internet Explorer 6 standards, Internet Explorer 7 standards, Internet Explorer 8 standards, Internet Explorer 9 standards. See Version Information.