dandelion said:
That, and the fact that such a shift (in either direction) inevitably
results in 0, I presume. The operation seems pointless, which is
(I suspect) the motivation for many hardware vendors not
supporting it.
Shifting by zero bits is even more pointless, but that
doesn't seem to have prompted instruction-set designers to
omit the operation (unless they also omit all multi-bit
shifts; I've used machines whose only shift instructions
were single-bit shifts).
Bit positions in CPU instructions are usually a scarce
resource, because the machine can usually be made faster if
its instructions require less memory (it takes fewer cycles
to fetch and decode an instruction that occupies one word
than an instruction requiring three). Given the scarcity of
instruction bits, a designer faced with encoding a shift
distance that "should almost always" lie between 1 and 31 will
be unlikely to allocate a six-bit field; the "spare" bit can
probably be put to more effective use. And that, I think, is
the motivation for hardware ceilings on shift counts.
(Since the zero-bit shift also seems useless, I imagine a
designer might decide to use the opcode that resembles "shift
by zero" to denote some entirely different operation. I don't
know whether any have done so, but I imagine it might complicate
the instruction decode process and require a bunch of extra
silicon -- it's probably easier to allow the pointless zero-bit
shift than to detect it and recycle its code space for other
purposes.)
I would not call it an "observation" though.
All right, how about "conjecture?" Or would you prefer
"damfoolishness?"