Crosspost, but do not multipost! Do not shout! (Do not use
uppercase without need.)
»Space« and »blank« are just two different names for the
same character.
Oh, I didn't think of that (that the OP could have a misconception
here). Hm.
A hexadecimal numeral is used only for textual I/O, not for storage.
Storage ultimately always uses sequences of bits.
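For example, a minimal sketch: the same stored bit pattern is rendered
below both as a decimal and as a hexadecimal numeral, and the stored
bits do not change at all.

#include <iostream>

int main()
{   int const n = 255;                        // stored as a sequence of bits
    ::std::cout << n << '\n';                 // decimal numeral:     "255"
    ::std::cout << ::std::hex << n << '\n'; } // hexadecimal numeral: "ff"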
When there is only one character, »space«, to be stored, and no other
characters, then no runtime storage bits are needed in this special
case. The program below shows an instance r of »struct representation«
storing a space with 0 bits.
#include <iostream>
#include <ostream>
#include <cstdlib>

struct representation {};           // empty type: the space is implied, not stored

::std::ostream & operator<<
    ( ::std::ostream & stream, representation const & )
{ stream << ' '; return stream; }   // the space comes from the code, not from data

int main()
{ representation r;
  ::std::cout << sizeof r << '\n';  // 1, not 0 (see below)
  ::std::cout << '[' << r << ']' << '\n'; }
(»sizeof r« was intended to be 0, but it is 1 here.)
Well, every most derived object has size >0, in order to have a unique
address. Further in that direction, the standard guarantees that you get
a unique pointer when you allocate an array of dynamic size 0.
However, a base class sub-object can have size 0. This is known as the
"empty base class optimization", and as I recall it's used for e.g.
std::function and possibly std::unique_ptr (size dependent on the type
of deleter). I don't exactly recall the detailed rules and limitations.
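For example, a minimal sketch of the empty base class optimization (the
exact sizes are implementation-dependent, but this is the typical
result):

#include <iostream>

struct Empty {};                        // sizeof( Empty ) >= 1 on its own...
struct Holder: Empty { int payload; };  // ...but as a base it can occupy 0 bytes

int main()
{   ::std::cout << sizeof( Empty ) << '\n';      // typically 1
    ::std::cout << sizeof( Holder ) << '\n'; }   // typically sizeof( int )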
I /think/ that your idea here was in the direction of the
information-theoretical, of bits as a measure of the information content.
Just to recap that, with n ordinary bits you have 2^n bitpatterns, and
so can represent a choice from 2^n possibilities. Turning that around,
if there are N equally likely possibilities for a choice, then you need
log(N)/log(2) bits to represent the choice. In 1948 Claude Shannon
generalized that: instead of N possibilities, think of a probability of
occurrence 1/N (roughly just another view of the same situation). Then
one needs log(N)/log(2) bits to represent the occurrence of something
that has probability 1/N, or, equivalently, -log(P)/log(2) bits for
probability P. That can be viewed as saying that the occurrence of
something with probability P carries -log(P)/log(2) bits of information.
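To make that concrete, a small sketch that evaluates -log(P)/log(2) for
a few probabilities, yielding 1, 3 and 8 bits respectively:

#include <iostream>
#include <cmath>

int main()
{   for( double const p : { 0.5, 0.125, 1.0/256 } )
    {   ::std::cout << "p = " << p << " => "
            << -::std::log( p )/::std::log( 2.0 ) << " bits\n"; } }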
The O(n log n) lower bound for comparison sorting then follows directly
by considering sorting to be a choice from the n! possible permutations:
log(n!)/log(2) grows as n log n -- and so on.
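For example, log2(n!) can be compared numerically with n*log2(n) in a
small sketch, using std::lgamma, where lgamma(n + 1) = ln(n!):

#include <iostream>
#include <cmath>

int main()
{   for( int const n : { 10, 100, 1000 } )
    {   double const log2_n_factorial =
            ::std::lgamma( n + 1.0 )/::std::log( 2.0 );
        ::std::cout << "n = " << n
            << ": log2(n!) ~ " << log2_n_factorial
            << ", n*log2(n) = " << n*::std::log2( n ) << '\n'; } }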
The whole thing getting much less clear again when one considers how
probability depends on knowledge.
Back to the program, if it always produces the same text, then that text
conveys no information and so corresponds to 0 info-bits. Which means
that ideally it can be reduced to 0 real bits, or thereabouts. Which
might seem impossible, but since the text is then well known it can be
denoted by a single symbol, or even just by the execution of the program
with no output.
And that again getting much less clear when one considers a program
that produces text by means of a /system/, such as repeating a character
N times (sketched below). Then the question is, how much can one reduce
the total size of the code + data? Turning that around, the minimum
code + data size needed to produce some given pattern might be a better
measure of information content, but AFAIK we do not yet have any
coherent theory of this side of things (Stephen Wolfram made some
beginnings).
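A trivial instance of such a system, as a sketch: the code below has
constant size plus essentially log2(n) bits for the number n itself,
yet its output grows without bound as n grows.

#include <iostream>

int main()
{   int const n = 1000;                    // the only variable datum
    for( int i = 0; i < n; ++i ) { ::std::cout << ' '; } }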
Anyway, upshot: you can't get away with zero storage for producing a
single space character. It might seem that you could get close to zero
average storage per character by letting the program produce N of them;
but then you're really getting into the problem of information content,
and of how much of that is an artifact of the pattern system's match to
the pattern-producing device (here a C++ program).
Cheers & hth.,
- Alf