I've personally used ones' complement and signed magnitude
(decimal!) computers.
I've used a 1's-C computer and a signed magnitude one -- almost 30 years
ago! I haven't used or even seen either since the late '70s, though.
Do Burroughs still make computers? I think theirs was the S/M one.
Admittedly, it was a while ago. But the
fact that I've seen them dwindle away doesn't suggest to me that
they're gone forever; rather, it suggests that techniques in the
computer industry are subject to change. The assumption that
things will remain forever the way they happen to be today has
not been tenable up to now; are you confident that Change has
come to its end?
Any change which means that a signed value can't be cast to its unsigned
equivalent and back would, I think, break a lot of code. Yes, things
might change (balanced ternary, say), but it would break an enormous
amount of code. The change from BCD to binary was bad enough (and there
are still BCD-based languages).
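(To be concrete about the round trip I mean -- a minimal sketch, nothing
more: the signed-to-unsigned direction is fully defined by the standard
as reduction modulo UINT_MAX + 1, whatever the representation; it's the
way back that's only implementation-defined when the value doesn't fit,
and that's exactly what a change of representation could break.)

#include <stdio.h>

int main(void)
{
    int s = -42;

    /* signed -> unsigned: defined by the standard as reduction
       modulo UINT_MAX + 1, regardless of representation */
    unsigned int u = (unsigned int)s;

    /* unsigned -> signed: implementation-defined when the value
       doesn't fit in an int, so the round trip is guaranteed by
       the implementation, not by the standard */
    int back = (int)u;

    printf("s = %d, u = %u, back = %d\n", s, u, back);
    return 0;
}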
Or as a colleague of mine likes to say, "We work in a
fashion-driven industry."
After a fashion <g>. Although some of the 'fashions' have lasted a long
time, the 8-bit byte for instance (I suspect that almost all of several
major operating systems, plus their utilities, would have to be
rewritten for a different-sized byte). Even Unicode is only gradually
becoming accepted.
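Mind you, C itself only promises CHAR_BIT >= 8, not == 8. A trivial
check, using nothing beyond <limits.h>:

#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* the standard guarantees CHAR_BIT >= 8, not CHAR_BIT == 8;
       code assuming exactly 8 bits per byte is relying on fashion */
    printf("bits per byte here: %d\n", CHAR_BIT);
#if CHAR_BIT != 8
    puts("not an octet machine -- a lot of code would break here");
#endif
    return 0;
}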
Preparing for every conceivable raising or lowering of
computers' hemlines carries a cost, and failing to be prepared
carries a risk. In the context of a given project you may
well decide that the risk is too small and the cost too large.
That's fine; that's part of what engineering is about. But an
implicit decision that all risks are zero is just as foolhardy
as a decision that all costs are justified.
I didn't say anything about such a decision. I'm looking at risk
assessment -- is it really worth writing code which will be inefficient
and hard to maintain in order to cope with a loophole which the standard
allows but which no implementor is likely to exploit? Is the probability
of someone producing a system which breaks a lot of code higher than
that of the next C standard breaking code? (Anyone who used a variable
called 'restrict' or 'inline' will have fallen foul of that in C99.)
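For example, something like this was perfectly good C89/C90 and won't
even get past a C99 compiler:

#include <stdio.h>

int restrict = 42;   /* an ordinary identifier in C89, a keyword in C99 */

int main(void)
{
    printf("%d\n", restrict);
    return 0;
}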
BTW, what are your thoughts on the C0x Committee's decision
to allow balanced ternary integers? ;-)
I'd like to see how they propose to square it with all the references to
'binary' in the C specification <g>. Yes, it is possible to emulate bit
operations using b-tits[1], but C as we know it would not be an
efficient language for programming such a machine...
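Roughly what I mean by 'emulate' -- a sketch only, with a trit-array
representation I've invented purely for illustration. Note that every
'bitwise' operation pays for two radix conversions:

#include <stdio.h>

#define NTRITS 16   /* 3^16 comfortably covers the values below */

/* invented representation: t[i] in {-1, 0, 1}, least significant
   trit first, value = sum of t[i] * 3^i */
typedef struct { int t[NTRITS]; } bt_word;

static long bt_to_long(const bt_word *w)
{
    long v = 0, p = 1;
    for (int i = 0; i < NTRITS; i++, p *= 3)
        v += w->t[i] * p;
    return v;
}

static bt_word long_to_bt(long v)
{
    bt_word w = {{0}};
    for (int i = 0; i < NTRITS && v != 0; i++) {
        int r = (int)(v % 3);  /* C99 truncating division assumed */
        v /= 3;
        if (r == 2)  { r = -1; v++; }  /* 2 = 3 - 1: carry upward */
        if (r == -2) { r = 1;  v--; }  /* -2 = -3 + 1: borrow */
        w.t[i] = r;
    }
    return w;
}

/* 'bitwise AND' on ternary words: convert to binary, AND, convert
   back again -- two radix conversions per operation */
static bt_word bt_and(const bt_word *a, const bt_word *b)
{
    return long_to_bt(bt_to_long(a) & bt_to_long(b));
}

int main(void)
{
    bt_word a = long_to_bt(0xF0), b = long_to_bt(0x3C);
    bt_word c = bt_and(&a, &b);
    printf("0xF0 & 0x3C = 0x%lX\n", (unsigned long)bt_to_long(&c));
    return 0;
}

Not something you'd want anywhere near an inner loop.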
[1] Ternary digits ought to be called tits. If they aren't, someone was
slipping when they were named[2]...
[2] Robert A. Heinlein used ternary in some of his futuristic computers.
Knowing his proclivities, how did he miss calling them tits?
Chris C