I agree with you, Arne.
"Scalability" is usually understood as the ability to get (reasonably)
close to an n-fold performance increase in a component by increasing its
resources n-fold. More generally, it's the ability to adjust the
performance of a software system by adjusting resources, without
changing code. There is no concept of "high-end" or "low-end"
scalability that I've ever encountered. There's only "scalability".
If a system is scalable, it can be adjusted readily for high or low
workloads and achieve its performance goals.
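To make the "n-fold resources, close to n-fold performance" notion concrete, here is a small sketch (the function name and figures are mine, not any standard metric) that expresses scaling as a ratio of observed speedup to ideal linear speedup:

```python
def scaling_efficiency(base_throughput: float,
                       scaled_throughput: float,
                       resource_factor: float) -> float:
    """Observed speedup divided by the ideal linear speedup.

    1.0 means perfectly linear scaling; values well below 1.0 mean
    resources are being added faster than performance is growing.
    """
    speedup = scaled_throughput / base_throughput
    return speedup / resource_factor

# Doubling resources takes a hypothetical service from 100 req/s
# to 190 req/s: a speedup of 1.9 against an ideal of 2.0.
print(scaling_efficiency(100.0, 190.0, 2.0))  # -> 0.95
```

A system one would call scalable keeps this ratio near 1.0 over the range of resource factors it is expected to operate in.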
The number of tiers in the system is no determinant of scalability, nor
of manageability, at any level.
Thanks for shedding some light, Arne and Lew.
So scalability is commonly discussed in terms of increasing N, not
decreasing it. But wouldn't the study of the latter be relevant, too?
Let's take the following simple and perhaps somewhat contrived example.
Assume a component in a piece of software that serves some function
whose memory usage is linear in load and that, based on, say, the
minimum load it expects to encounter, allocates itself a corresponding
amount of memory up front. Now put that component in a production
environment in which it only ever sees at most half the load it had
expected. Clearly, half of the memory it is hogging is wasted. As
such, this component does not scale for lower N: once the load falls
below the minimum it expects, its memory usage stops tracking the load,
and it ceases to scale.
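A minimal sketch of such a component (the class, slot size, and client counts are all invented to make the waste concrete): allocation is driven by the minimum load the component *expects*, so below that load the surplus is simply held, never used.

```python
class EagerCache:
    """Pre-allocates one buffer per expected client, up front."""

    SLOT_BYTES = 100_000  # assumed per-client memory footprint

    def __init__(self, min_expected_clients: int):
        # Sized by the *expected* minimum load, not the load
        # actually observed in production.
        self.slots = [bytearray(self.SLOT_BYTES)
                      for _ in range(min_expected_clients)]
        self.in_use = 0

    def serve(self, n_clients: int) -> int:
        """Record how many slots the current load actually occupies."""
        self.in_use = min(n_clients, len(self.slots))
        return self.in_use

    def wasted_bytes(self) -> int:
        """Memory held but not serving anyone."""
        return (len(self.slots) - self.in_use) * self.SLOT_BYTES

cache = EagerCache(min_expected_clients=100)
cache.serve(50)               # production only ever sees half the load
print(cache.wasted_bytes())   # -> 5000000: fifty idle slots, held anyway
```

Below the expected minimum, `wasted_bytes()` grows as the load shrinks; the component's footprint is constant where it should be linear.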
There's a similar and more realistic example with complexity. Greater
complexity of a piece of software is generally expressed through more
code, more data structures, more classes. Put a complex system in a
situation where only a fraction of its capabilities are used, and if
those capabilities are not lazily loaded, you've got waste. This would
suggest that the more features something has, the worse its low-end
scalability (I'll still use that term, for want of a better one).
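One hedged sketch of the lazy-loading alternative (all names here are invented for illustration): each capability is registered as a cheap factory and its data structures are only built on first use, so unused features cost almost nothing.

```python
class Feature:
    """Stand-in for a capability with a heavy set-up cost."""
    loaded_count = 0  # class-wide counter, just to observe loading

    def __init__(self, name: str):
        Feature.loaded_count += 1  # imagine large tables built here
        self.name = name

class LazyApp:
    """Registers feature factories; builds each feature on first use."""

    def __init__(self):
        self._factories = {}
        self._instances = {}

    def register(self, name: str):
        # Registration stores only a closure, not the feature itself.
        self._factories[name] = lambda: Feature(name)

    def use(self, name: str) -> Feature:
        if name not in self._instances:
            self._instances[name] = self._factories[name]()
        return self._instances[name]

app = LazyApp()
for n in ("search", "reports", "billing", "export"):
    app.register(n)

app.use("search")            # only this feature's structures are built
print(Feature.loaded_count)  # -> 1: the other three cost nothing yet
```

With eager construction, all four features would be paid for at start-up regardless of use; here the deployment that exercises one feature pays for one.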
Lastly, so as not to be speaking only about software, imagine a piece
of hardware that, being designed as a workhorse, consumes a certain minimal
amount of electricity, but gets put in an environment where it only sees
sporadic use.
In all these cases, you could say: well, you've got the wrong tool for
the job. And that is certainly right. However, standardisation is
effective. In a capitalist system, of two producers of the same IT
good, the one whose product matches the needs of the most consumers
will win out over the other -- all other things being equal. One way
for a product to match many needs is to be able both to increase and
to decrease its scale smoothly. Consequently, we, the people who
design, build or use these things, are likely to be confronted with
this issue.
df.