osmium said:
> The [] is much clearer than at().
A reasonable subjective argument can be made for that (or for the
contrary). I would imagine it depends a great deal on the individual
reading the code.
> A well-designed language would check for
> bounds errors when using the [] notation.
This too is very subjective. What do you mean by "well-designed"?
Given that a programming language is created with certain design
principles involved, certain goals to accomplish, is it fair to call
something poorly designed if you disagree with the selection of
principles and goals? In C++, you don't pay (in terms of time and
space cost at runtime) for what you don't use. There's no free lunch
-- you can't have checked access and have it be as fast as unchecked
access. Inside the body of a tight loop, or another
performance-critical context, this overhead may be both unnecessary and
unacceptably expensive.
You might argue that, while unchecked access is a good facility to have
available, checked access should be the common case, and therefore
provide what you see as the clearest syntax. However, remember that
another important design goal of C++ has always been maximal
compatibility with C; it is because C++ is compatible with C that it
has come into such wide use. The vast majority of programming
languages, "well-designed" or not, languish in obscurity. The []
element access syntax comes from arrays, and for arrays it is an
unchecked operation. Hence, the most consistent scheme is to have the
unchecked operations for other containers use the same syntax.
So, with this in mind, do you really feel it's a bad design, or just
disagree with the priorities involved in choosing design criteria?
> It would also have the option of
> turning off checking on production builds.
That negates much of the benefit. Non-production code is only run on
whatever test cases you invent (and take the time to implement).
Production use is typically far more exhaustive, and as such may
uncover cases not reached in testing. If the access is unchecked in
those cases, you're no better off for having had checked access in your
tests. When you're in a nice, safe testing sandbox, undefined behavior
may take longer to diagnose than a handled error, but for the most part
the result is the same. In production, undefined behavior can be
catastrophic -- for a lot of C++ code, lives depend on not invoking
undefined behavior.
Better to use checked access in general, and make careful use of
unchecked access when a specific optimization need is discovered during
performance analysis. Fortunately, for those who use the STL
effectively and as intended, it's rare to have to use operator[] or
at() in any case -- that's why we have iterators and standard
algorithms. You can bet that your local for_each implementation isn't
using checked access, but it's small and tight and closely-scrutinized,
and it works great 100% of the time.
> All problems solved, nothing to discuss.
That's naive (and rather arrogant). You made your points, based on
your subjective opinion; that doesn't entitle you to declare the
discussion closed and take a victory lap.
Luke