Letters:              A B C D ....  a b c d ....
Numerical characters: 0 1 2 3 4 5 6 7 8 9
Special symbols:      ! @ # $ % ^ & * ...
Text is also represented inside a computer using a code.

In other words: each letter, digit, and special symbol is assigned a unique (binary) number, and the computer stores that number in place of the symbol.
In theory, you can assign an arbitrary binary number to each letter, digit, or special character, as long as the assignment is unique.
But in practice (for decoding purposes), it is more efficient to group similar symbols together when you assign values to the symbols.
(Meaning: because the assignment is arbitrary, in one computer 00000001 can mean 'A' while in another computer 00000001 can mean 'X'.)
Obviously, this is a nightmare when you want to exchange data:
    Original text    Representation    Representation    Received as
                     in computer A     in computer B
    -----------------------------------------------------------------
       'A'             00000001  --->    00000001           'X'
The text would become scrambled....
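Here is a minimal C sketch (using two made-up, hypothetical code tables, not any real standard) of how the same code value can decode to different symbols on two different computers:

    /* Hypothetical example: two computers using different (made-up) code tables */
    #include <stdio.h>

    char decodeTableA[] = { '?', 'A', 'B', 'C' };   /* computer A: code 1 means 'A' */
    char decodeTableB[] = { '?', 'X', 'Y', 'Z' };   /* computer B: code 1 means 'X' */

    int main()
    {
        int code = 1;    /* computer A encoded the letter 'A' as the number 1 */

        printf("Computer A decodes code %d as: %c\n", code, decodeTableA[code]);
        printf("Computer B decodes code %d as: %c\n", code, decodeTableB[code]);

        return 0;
    }

The number 1 is received correctly, but it is decoded as a different symbol, which is exactly how the text gets scrambled.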
The first international code standard to represent the English alphabet was:

    ASCII (the American Standard Code for Information Interchange)
Notice that: similar symbols are grouped together in ASCII: the upper case letters 'A'-'Z' have consecutive codes, and so do the lower case letters 'a'-'z' and the digits '0'-'9'.
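A quick way to see this (a small C sketch added here for illustration) is to print the numeric codes of a few characters:

    /* Print the ASCII code (as a decimal number) of a few characters */
    #include <stdio.h>

    int main()
    {
        printf("'A' = %d   'B' = %d   'C' = %d\n", 'A', 'B', 'C');   /* 65 66 67 */
        printf("'a' = %d   'b' = %d   'c' = %d\n", 'a', 'b', 'c');   /* 97 98 99 */
        printf("'0' = %d   '1' = %d   '2' = %d\n", '0', '1', '2');   /* 48 49 50 */

        return 0;
    }

Consecutive letters (and digits) map to consecutive numbers, which is why adding 1 to the code for 'a' gives the code for 'b', as the program below will show.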
Here's a tool by Mathias that I found on the web that shows the binary code for each symbol:

    [interactive ASCII/binary conversion tool - embedded on the web page]
Specifically: the code stored for a character is just a (binary) number, and we can perform ordinary arithmetic operations (such as adding 1) on it. We can demonstrate this with a simple C program.
I use C because this programming language does not need character <--> integer conversion (casting) operations. (Java requires a casting operator when you want to perform a computation on a character.)
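For instance, here is a tiny C sketch (my own illustration, not part of the course demo) showing that a char can be used directly in arithmetic; in Java you would have to write something like char d = (char)(c + 1):

    #include <stdio.h>

    int main()
    {
        char c = 'a';
        char d = c + 1;            /* no cast needed: a char is just a small integer */

        printf("%c %d\n", d, d);   /* prints:  b 98 */

        return 0;
    }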
Here is the C program - the syntax is very similar to Java and you should be able to follow the gist of the discussion:
    // file: /home/cs255001/demo/atoi/ascii.c

    #include <stdio.h>

    /* printBits: print the 8 bits stored in a char
       (definition added here so the listing is self-contained) */
    void printBits(char x)
    {
        int i;

        for (i = 7; i >= 0; i--)
            printf("%d", (x >> i) & 1);
    }

    int main()
    {
        char c;

        while (1)
        {
            printf("\n\nEnter a character: ");
            scanf("%c", &c);   // Read in ASCII code from terminal
            getchar();         // Discard the newline that follows the character

            printf("\nYou have entered the character: %c\n\n", c);

            printf("Internally, this character is stored as the binary number: ");
            printBits(c);
            printf("\n");

            printf("The symbol represented by this code (in ASCII) is: %c\n", c);
            printf("The decimal number represented by this code (in 2's compl) is: %hhd\n", c);

            printf("\nWe can add 1 to this number:\n");
            printf("    this number + 1 = %hhd\n", c+1);
            printf("    its binary representation = ");
            printBits(c+1);
            printf("\n");
            printf("The symbol represented by code+1 (in ASCII) is = %c\n", c+1);
            printf("\n\n");
        }
    }
The program reads in a character (i.e., an ASCII code) from the terminal and stores the ASCII code in the variable c.
Afterwards, the program prints the value in the variable c in different formats:
(1) printf("The symbol represented by this code (in ASCII) is: %c\n", c); prints the value in c decoded using the ASCII code (the %c format)
(2) printBits(c); prints the value in c as a binary number
(3) printf("The decimal number represented by this code (in 2's compl) is: %hhd\n", c); prints the value in c as a decimal number (the %hhd format treats the byte as a signed, 2's complement, integer)
How to run the program:

    Copy the program to your own directory and compile it with a C compiler, e.g.:

        gcc ascii.c -o ascii

    Then run the resulting executable (ascii, or ./ascii if the current directory is not in your search path).
    The program loops forever asking for characters; stop it with Ctrl-C.
Here is an example output (I entered 'a' and 'b' as input):
    cheung@lab0z> ascii

    Enter a character: a

    You have entered the character: a

    Internally, this character is stored as the binary number: 01100001
    The symbol represented by this code (in ASCII) is: a
    The decimal number represented by this code (in 2's compl) is: 97    // Can you explain why it's 97 ?

    We can add 1 to this number:
        this number + 1 = 98
        its binary representation = 01100010
    The symbol represented by code+1 (in ASCII) is = b                   // Can you explain why it's b ?

    Enter a character: b

    You have entered the character: b

    Internally, this character is stored as the binary number: 01100010
    The symbol represented by this code (in ASCII) is: b
    The decimal number represented by this code (in 2's compl) is: 98

    We can add 1 to this number:
        this number + 1 = 99
        its binary representation = 01100011
    The symbol represented by code+1 (in ASCII) is = c
If you can explain the output of this program, then you have fully realized that an ASCII code is a number!!!
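As a final check (a short sketch of my own, not part of the course demo), you can verify in C that a character constant and its ASCII code are one and the same value:

    /* The ASCII code of 'a' is the number 97, so 'a' and 97 are the same value,
       and adding 1 to it gives 98, which is the ASCII code of 'b'.              */
    #include <stdio.h>

    int main()
    {
        printf("%d\n", 'a' == 97);        /* prints 1 (true):  'a' IS the number 97 */
        printf("%d\n", 'a' + 1 == 'b');   /* prints 1 (true):  97 + 1 = 98 = 'b'    */

        return 0;
    }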