Martijn said:
This is so off-topic, you wouldn't believe it. Although I shouldn't do this (because it may encourage others to post OT here as well):
The question is not actually off topic. It's a bit like asking "what does the plus sign mean?", or "when I multiply two negative numbers the program outputs a positive number, why?".
In pre-computer days a "text character" was defined by the topology of its lines. So a reasonably geometric circle with a hole in it is an "O", a vertical triangle with a raised lower bar is an "A", and so on.
It is extremely difficult to get computers to use this system. So instead of using writing pads, we usually use keyboards. Each key generates a character code; for English you only need about a hundred codes to represent every character. Internally, the computer works with these codes, almost always ASCII in an English-speaking environment. That's why a char is usually an 8-bit integer.
However, when it comes to output, humans don't want codes. They want to see the glyphs. These could easily be stored as bitmaps (rasters giving the dot pattern of each character) somewhere in the computer. Alternatively you could hook the computer up to a teletype, in which case each metal type head is carved into the shape of a character. If the computer is being used by a blind person, you could have a device that converts the ASCII code into a pattern of raised bumps (Braille).
These days fonts tend to be rather sophisticated, with variable pitch, anti-aliasing, kerning, and sometimes other features. So the usual answer is that a fairly complicated program writes the characters to a raster display.
However, if you want to implement the output side of printf() yourself, an easy way of doing it is to define each character as an 8 by 8 block of pixels. This gives readable output.