Kaz Kylheku said:
But there are M implementations and N goals, where M >> N.
There are, but since there are several fundamentally different ways of
doing the basic things, this sort of proliferation will tend to happen.
Take a "real" string library. There are at least two very different
design approaches here. In one the thing still looks like a C string
(it has the extra information held in memory "left" of the pointer). In
the other the pointer is to a structure.
The first is easier to pass to the existing library functions, but the
address can change when the size changes (so keeping pointers to such
strings around for a while is risky). And if a normal string is passed
to the new functions, horrible things will happen.
The second is just a bit fiddly to pass to the existing library
functions, but the address of the handle never changes, so copies can
hang around. If you try to pass a normal string to the new functions,
the compiler will object. Accessing the contents involves an extra
indirection compared with the first approach, costing small amounts of
time and memory.
It's far from clear to me which of these is "right". I've done the
latter, another contributor here the former.
Likewise there are (again, at least) two major ways to do generic
containers. Are they structures holding a void pointer to your data, or
are they little link structures that can be embedded in your own?
But whichever you choose, the moment you build on it (say, anywhere
that generates a string of unknown length: reading input, character-set
conversion, (de)compression and so on) you have a library that depends
on that first choice, so you need a version of it for each. With, say,
4 major binary design decisions to be made, that gives you 2^4 = 16
implementations (and, of course, 16 sets of goals).