We were doing the intro lesson to the internet simulator, and a student asked me why the computer even needed the decimal number — why couldn’t it just take the binary and translate it directly into text? I tried explaining that the decimal level of abstraction is necessary because the computer does not know whether you, the user, want the decimal number or the English character, and a higher level of abstraction takes the decimal number and translates it into the character for you.
But he said he didn’t get why the computer couldn’t just go straight from binary to English, and I spent 10 minutes stumbling over an explanation before I said I’d get back to him.
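For concreteness, here is a rough sketch (in Python) of the chain I was trying to describe — the same bit pattern, first read as a number, then mapped through ASCII to a character. This is just my attempt at illustrating the layers, not necessarily how any particular machine does it internally:

```python
bits = "01000001"       # the raw binary pattern on the wire

# Layer 1: interpret the bits as a plain (decimal) number
number = int(bits, 2)   # 65

# Layer 2: interpret that number as a character via the ASCII table
char = chr(number)      # 'A'

print(bits, "->", number, "->", char)  # 01000001 -> 65 -> A
```

The point I was fumbling toward: the bits themselves are just bits, and each layer adds an interpretation on top of the one below it.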
Can anyone provide a more detailed explanation, or resource, to help explain this? Thanks!