Arithmetic overflow checking


Martin Gregorie

"MikeP"<[email protected]> said:
Patricia Shanahan wrote: [...]
No, I was not really joking, though I did not attempt to find all the
languages that would meet the stated requirement.

Don't look now, but if you weren't joking, then you recommended Ada to
a Java programmer! Oh my.

I often suggest Ada to Java programmers; knowledgeable Java programmers
often return the favor; I've learned a lot that way.

All Java developers should master at least one language from the
Pascal/Modula-2/Ada family.
Add Algol 68 to that list. Compilers and runtimes are available and free.

I knew Algol 60 and a smattering of Pascal when I learnt it.

A68 was a huge eye-opener in terms of expressiveness and its ability to
handle variable record structures that I thought could only be
successfully handled in assembler: this was in pre-C days. For George 3
hands, these were G3 job accounting records. Others should think of ASN.1.
 

Andreas Leitgeb

supercalifragilisticexpialadiamaticonormalizeringelimatisticantations said:
In four words: Lack of operator overloading.
Math on non-primitive types is *painful* in Java.

agreeCount = agreeCount.plus(AgreeCount.ONE)
on painfulness of non-primitive math.
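
For anyone who hasn't felt that pain, here's a minimal sketch of what even
trivial arithmetic looks like once you leave the primitives behind
(java.math.BigInteger here; the variable names are made up for the example):

    import java.math.BigInteger;

    public class NonPrimitiveMath {
        public static void main(String[] args) {
            // With primitives this would read:  total = price * quantity + shipping;
            BigInteger price = BigInteger.valueOf(1995);
            BigInteger quantity = BigInteger.valueOf(3);
            BigInteger shipping = BigInteger.valueOf(499);

            // Without operator overloading, every operator becomes a method call:
            BigInteger total = price.multiply(quantity).add(shipping);

            System.out.println(total);  // prints 6484
        }
    }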
 

Martin Gregorie

Sure (assuming you're talking about the cost of adding that feature to
javac and JVM etc), I'm speaking hypothetically.

    @IntegerOverflowExceptions(true)
    int c = a + b;
    @IntegerOverflowExceptions(false)

It looks horrid to me, and presumably there are lots of issues to
overcome and better syntax to achieve the same ends, but essentially, if
Sunacle felt it worthwhile, something could be done to introduce new
behaviours whilst making them optional for existing code that those
behaviours would break?
Agreed, but there's another way to handle this in Java that would
probably cater for most overflow and out of range requirements without
affecting existing programs or the language at all:

Just add range checking constructors to Integer and friends, e.g.:

Integer boundedValue = new Integer(0,1,10);

to declare an Integer, initialised to zero and with the range 1:10. The
overheads would be minimal if the checks were only enabled if limits were
added. It could either throw an unchecked exception or provide a range
checking method:

if (!boundedValue.isInRange()) {...}

to check the current value or the possibly more useful:

if (boundedValue.setChecked( ( a + b ) / c )) {...}
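
A minimal sketch of how that could be approximated today as an ordinary
class, without touching Integer itself (the BoundedInt name and its API are
invented for the example):

    // Hypothetical BoundedInt, sketching the range-checked integer idea above.
    public final class BoundedInt {
        private final int min;
        private final int max;
        private int value;

        public BoundedInt(int initial, int min, int max) {
            this.min = min;
            this.max = max;
            this.value = initial;
        }

        // Reports whether the current value lies within the declared range.
        public boolean isInRange() {
            return value >= min && value <= max;
        }

        // Stores the candidate only if it is in range; returns true on success.
        public boolean setChecked(int candidate) {
            if (candidate < min || candidate > max) {
                return false;
            }
            value = candidate;
            return true;
        }

        public int get() {
            return value;
        }
    }

so that the examples above read roughly as

    BoundedInt boundedValue = new BoundedInt(0, 1, 10);
    if (boundedValue.setChecked((a + b) / c)) {...}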
 

David Lamb

s/those three shitty languages/Lisp/ and I'll agree with you. :)

That raises a non-smiley point; if you want to expand programmers' minds
about alternate language possibilities, it seems to me that learning
Prolog (logic programming) and Haskell (functional programming) would do
a lot more than another procedural language. I seem to recall at least
one comment about some XML-related Java package that it was bogus
because it required people to understand functional programming; that
comment made me sad.
 

Martin Gregorie

Talking about Java specifically, SE or EE, there is a great deal of
advance warning. You're typically going to have close to a decade of
warning that you need to make certain modifications, if a version at
some point introduces backwards incompatibilities. This is more than
enough time.
Arguably, the delay in removing deprecated features (which amounts to the
same thing) is a waste of time: Y2K showed us that, with absolutely
nothing being done ahead of time in most shops, the result was an
expensive last-minute panic.

In fact, you can make the opposite argument: delay too long and there
will be nobody left who remembers which programs might be affected or
even that there is a problem.
 

Tom McGlynn

On Jul 21, 8:43 am, Martin Gregorie <[email protected]>
wrote:
....
Agreed, but there's another way to handle this in Java that would
probably cater for most overflow and out of range requirements without
affecting existing programs or the language at all:

Just add range checking constructors to Integer and friends, e.g.:

        Integer boundedValue = new Integer(0,1,10);

to declare an Integer, initialised to zero and with the range 1:10. The
overheads would be minimal if the checks were only enabled if limits were
added. It could either throw an unchecked exception or provide a range
checking method:

How would this help, given that Integers can't change anyway? It can
only check that the initialization value is in range. You'd have to
make these Integers 'sticky' in some way so that any expression using
one of them is also subject to bounds checking.

E.g.,

    Integer firstValue = new Integer(Integer.MAX_VALUE, 0, Integer.MAX_VALUE);
    Integer nextValue = new Integer(firstValue + Integer.MAX_VALUE + 3, 0, 10);

clearly overflows, but it would work fine unless something significant
were added to the language.
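
To see the wrap-around concretely, a runnable sketch with plain ints (no
hypothetical constructor needed):

    public class SilentOverflow {
        public static void main(String[] args) {
            int firstValue = Integer.MAX_VALUE;
            // The arithmetic wraps silently before any constructor or
            // range check could ever see the result.
            int next = firstValue + Integer.MAX_VALUE + 3;
            System.out.println(next);  // prints 1, not 4294967297
        }
    }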

Regards,
Tom McGlynn
 

lewbloch

RedGrittyBrick said:
lewbloch said:
RedGrittyBrick wrote:
No.  Not if you want some of the new features.  [...]
You also have to make sure that all your third-party libraries for
that build conform to the older spec.
Targeting an old version is an all-or-nothing proposition.

All good points but I feel your "No." is more like a "Yes, but".

It's a fair cop.
If everyone at Oracle went mad and Java 1.9 changed the language spec so
that integer arithmetic could throw an IntegerOverflowException,
presumably they could arrange things in the compiler (and JVM) so that
the `-source 1.5` option would compile the code such that integer
operations would never throw an IntegerOverflowException thus satisfying
Arved's objection - couldn't they?

Sure. I present the obstacles not as stoppers but as things for which
the designer(s) of such a change must be responsible.

Yes, and they would have to. The issue is with the statement that
"Java should have feature /X/" without any concept of the challenges
that must be addressed.

If the challenges are too expensive to overcome, or lack sufficient
widespread utility, or are too foreign to the ethos of the language,
then it might be best to reject the feature even if there does exist a
good use case for it somewhere.

Patricia and Arved, among others, have made the point that focusing on
changes to Java when there are suitable alternative languages (that
even run on the JVM!) is not always optimal. Why change Java if
Javascript or Ruby or Scala can do the job for you right now?

And what about the suggestion to write a CheckedInteger type that does
what you need? You could have that today, without waiting for some
potential future revolution in Java, and without leaving the
comforting security of your favorite language.
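
For the record, a minimal sketch of such a CheckedInteger (the name and API
are assumptions, not an existing JDK or library type):

    // Hypothetical CheckedInteger: throws on overflow instead of wrapping.
    public final class CheckedInteger {
        private final int value;

        public CheckedInteger(int value) {
            this.value = value;
        }

        public CheckedInteger plus(CheckedInteger other) {
            // Widen to long so the true sum is visible before range-checking it.
            long sum = (long) this.value + other.value;
            if (sum < Integer.MIN_VALUE || sum > Integer.MAX_VALUE) {
                throw new ArithmeticException(
                    "integer overflow: " + this.value + " + " + other.value);
            }
            return new CheckedInteger((int) sum);
        }

        public int intValue() {
            return value;
        }
    }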
Sure (assuming you're talking about the cost of adding that feature to
javac and JVM etc), I'm speaking hypothetically.

    @IntegerOverflowExceptions(true)
    int c = a + b;
    @IntegerOverflowExceptions(false)

It looks horrid to me, and presumably there are lots of issues to
overcome and better syntax to achieve the same ends, but essentially, if
Sunacle felt it worthwhile, something could be done to introduce new
behaviours whilst making them optional for existing code that those
behaviours would break?

Whilst I personally (think I) have no need for arithmetic overflow
checking of this sort, I imagine some clever language designers could
provide for it in a way that didn't break existing crypto libraries that
rely on such overflows being ignored.

Perhaps I'm wrong, but people seem to be saying "you just can't do this
because it would break existing code" - I haven't grasped why those
people are sure this is the case.

That's not my claim. My claim is that you just can't do this
irresponsibly lest you break existing code. I also believe that
others' suggestions in this thread can solve the problem immediately
without requiring Draconian and slow-to-arrive changes in the
language.
Despite the above, I'm in favour of keeping the language relatively simple.

That's especially valid given that the language supports the desired
functionality as long as you're willing to be a programmer and write
the necessary code. I especially endorse Arved's comment:
 

lewbloch

That raises a non-smiley point; if you want to expand programmers' minds
about alternate language possibilities, it seems to me that learning
Prolog (logic programming) and Haskell (functional programming) would do
a lot more than another procedural language. I seem to recall at least
one comment about some XML-related Java package that it was bogus
because it required people to understand functional programming; that
comment made me sad.

+1
 

Andreas Leitgeb

lewbloch said:
And what about the suggestion to write a CheckedInteger type that does
what you need?

That has been answered already, but you may have missed it, or maybe
blocked the one who answered this seriously and (imho) agreeably.

Due to Java's lack of operator overloading, doing Math with
non-primitive types is just painful.

PS: my approach would be a new keyword in the spirit of strictfp (which
may modify floating-point semantics compared to not applying it), but
one for modifying integral semantics: arithmetic overflow checks and
runtime checks on casts to narrower types. Just as use of "strictfp"
can be expected to slow down programs, the same would be acceptable for
an integral counterpart.
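
As it turned out, Java 8 later added roughly this behaviour at the library
rather than the keyword level: Math.addExact, Math.multiplyExact and
Math.toIntExact throw ArithmeticException instead of wrapping or truncating
silently. A minimal sketch:

    public class ExactArithmeticDemo {
        public static void main(String[] args) {
            int a = Integer.MAX_VALUE;

            System.out.println(a + 1);               // wraps silently: -2147483648

            try {
                System.out.println(Math.addExact(a, 1));  // throws instead of wrapping
            } catch (ArithmeticException e) {
                System.out.println("overflow detected: " + e.getMessage());
            }

            try {
                long big = 10_000_000_000L;
                System.out.println(Math.toIntExact(big)); // checked narrowing cast
            } catch (ArithmeticException e) {
                System.out.println("narrowing overflow: " + e.getMessage());
            }
        }
    }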
 

lewbloch

That has been answered already, but you may have missed it, or maybe
blocked the one who answered this seriously and (imho) agreeably.

Due to Java's lack of operator overloading, doing Math with
non-primitive types is just painful.

I saw that suggestion, but painful != impossible. And how freaking
"painful" is it to read method calls anyway? "Painful" is digging
ditches, tarring roofs, smelting steel, even working the floor at your
neighborhood mall anchor store. All a programmer has to do is read
method calls and do some typing. People need to get over themselves.

So for those who "can't" do range checking because "method calls are
too 'painful'" - Shut The Front Door, whiner!

Jesus H. Tap-dancing Christ!

Or get out of programming and get a real job.

Do keep agitating for a better way. Just don't cavil that there's no
way to do it now, because there is.

Amazing. Yeesh.
 

Joshua Cranmer

I saw that suggestion, but painful != impossible. And how freaking
"painful" is it to read method calls anyway? "Painful" is digging
ditches, tarring roofs, smelting steel, even working the floor at your
neighborhood mall anchor store. All a programmer has to do is read
method calls and do some typing. People need to get over themselves.

No, painful is realizing that your attempts to make something work all
fail because you need to put together two libraries that need different
versions of the same library. Next step, writing a script to extract the
necessary symbols from library v2 but not in v1, dump those out as
assembly, and copy them over.

I'd love to have the pain that comes with "I need to write out `+' as
`plus'" right now.
 

Henderson

Arguably, the delay in removing deprecated features (which amounts to the
same thing) is a waste of time: Y2K showed us that, with absolutely
nothing being done ahead of time in most shops, the result was an
expensive last-minute panic.

In fact, you can make the opposite argument: delay too long and there
will be nobody left who remembers which programs might be affected or
even that there is a problem.

One problem with removing deprecated features, at least from Snoracle's
standpoint, is that organizations will be even more reluctant to update
Java if doing so might require extensive modifications to all their
stuff's source code.
 

Arne Vajhøj

I agree 100 percent.

I'll add this observation: this state of affairs is largely a result of
the mediocrity of most programmers. The pressure to conform to a very
few mainstream languages - and there is real pressure to this effect,
unless you are dabbling or in some odd niche - may come from managers,
from business, from customers, or from developers themselves, but it
ultimately stems from this pervasive mediocrity. And this state of
affairs will not change so long as software development remains
unprofessional.

How many languages should be in a programmer's toolkit? Well, at least
half a dozen. Preferably a dozen. These would be languages that cover
the entire spectrum, and that the programmer is at least competent in.

To add insult to injury, you don't often see even mainstream
programmers taking advantage of the realistic, constrained possibilities
offered by real-world working environments. For example, consider
developers who usually find themselves working with .NET on the CLR, or
Java SE/EE on the JVM. The C# programmers don't often consider that
maybe judicious interop with some F# code would be a better solution, or
that they could contemplate IronRuby on the DLR, and you don't often see
enterprise Java programmers (or their bosses) willing to think of using
some Scala or Clojure or Groovy. And the practice of taking advantage of
one's larger platform, and writing shell or PowerShell scripts (or
Python or whatever) to handle other tasks connected with a larger Java
project, is both frowned upon and rare.

The blanket excuse used to justify all this is standardization of
skillsets, although candid hirers and managers will tell you that it is
really mandated by widespread mediocrity. They acknowledge that a very
good programmer does do better when they can choose their tools, but
they are worried about the ability of 90 percent of the developers out
there to maintain and extend the code that the good programmers write.

This is the real world, and it'll take a long time to change it.

You can hardly blame the managers for making decisions on tools
based on what their developers actually know instead of what they
should know.

mediocre developers => mediocre code

But if we say that there are 10 million developers, of which 2 million
are good, 6 million are mediocre and 2 million are hopeless, then just
getting a majority to be good would require 3 million more good
developers. They will not show up tomorrow by magic.

Arne
 

Arne Vajhøj

On Sun, 10 Jul 2011 10:53:09 -0400, David Lamb wrote:

On 08/07/2011 12:30 AM, Eric Sosman wrote:
On 7/7/2011 8:51 PM, Peter Duniho wrote:
[...]
I would not worry about the "simple" or "efficient" criteria. IMHO,
if one is deciding to apply overflow checking to every computation,
one has already abandoned the hope of efficiency.

I've used machines that raised overflow traps "for free,"
...
(The machines I speak of were from forty-odd years ago

When microprocessors started to arrive on the scene, a lot of
old-timey hardware folks said they'd forgotten 30+ years of hardware
design. When operating systems for computers based on said processors
came out, a lot of old-timey software folks said they'd forgotten 30+
years of operating system design. We seem to still be suffering the
consequences.

That happened not once, but twice.

The first great leap backward was the minicomputer era, when the likes
of the PDP-8 arrived with a single-user, single-tasking OS reminiscent
of early computers, except they generally had teletypes instead of
banks of switches and flashing lights. By then the better mainframes
were multi-user, multitasking beasts.

Then the first microcomputers arrived in the mid/late '70s. By this
time the better minis had multi-tasking operating systems, but micros
had re-implemented the earliest mini OSes - CP/M was near as dammit a
copy of the old PDP-8 OS (RSTS?) from the late 60s - and the earliest
micros even had switches and flashing lights (KIM-1, IMSAI 8080). By
1980 the minis were running UNIX but the latest and greatest micros had
- drumroll - MS-DOS!
Only twice? Aren't you forgetting "smart" phones? One of the great
advances in Android is (Drum roll!) multitasking!!!
They don't count since, unlike minis and micros, their builders didn't
retreat to the techno-stone age, ignore progress made to date, and build
primitive OSes by rubbing (metaphorical) sticks together.

AFAIK all smartphones started at a more advanced level because they
inherited better operating systems. IIRC these all originated on
electronic memo pads such as those made by Psion, HP and Palm, and were
all a lot more advanced than the likes of RSTS, CP/M, Flex09, etc.
Leastwise, I don't think you can consider Symbian and whatever MS was
calling the iPAQ OS at that stage any more primitive than the
contemporary versions of MacOS, OS/2 or even Windows, though admittedly
they were rather behind UNIX and its distant relations such as OS-9/68K.

If they don't support multi-tasking, I would say that they are, in at
least one aspect, behind the desktop OSes.

(how important multitasking is on a smartphone is a different
discussion)

Arne
 

Arne Vajhøj

Not necessarily. If a rocket ends up being destroyed as a
result, having the computing go a bit slower to save having to build
another rocket would have been more efficient.

I am pretty sure that Peter was talking about the efficiency of
the computer program.
Unfortunately, this is
not a made-up example. See:
http://en.wikipedia.org/wiki/Ariane_5_Flight_501
In the subsequent investigation, the cause of the problem was
recreated.

It was an integer overflow. But the real problem was a bad software
process. If there had not been an integer overflow there could have
been many other types of problems.
Turn on those run-time checks unless speed *REALLY* is of
paramount importance. It usually is not.

I would agree with that.

Arne
 

Arne Vajhøj

I don't like the term "fanboy" because it suggests it is an immature
male tendency. The research does not support that. I'm definitely a
fanwoman when it comes to the desirability of programming as a career,
but have never fallen in love with a programming language.

The liking and disliking of programming languages involves some
objective considerations but also some subjective feelings.

It is part of being a professional to act based on the objective
part - how well does the language fit the task at hand.

But they can still have personal preferences about which languages
they think are elegant and which are not.

To make a car analogy (!): if you are buying cars for a fleet, then
there are some objective facts like price, space, safety, fuel
economy etc. and the decision should be made on those. But you
may still have a personal preference that the instrumentation
in a Ford is easier to work with and that blue looks better than
other colors.

Arne
 

Arne Vajhøj

That raises a non-smiley point; if you want to expand programmers' minds
about alternate language possibilities, it seems to me that learning
Prolog (logic programming) and Haskell (functional programming) would do
a lot more than another procedural language. I seem to recall at least
one comment about some XML-related Java package that it was bogus
because it required people to understand functional programming; that
comment made me sad.

Calling some languages shitty just means that the post will be ignored.

It is true that there are languages that are more different from Java
than those, but they still have a different approach to type safety
that in my opinion is very relevant.

Arne
 

Arne Vajhøj

Patricia Shanahan wrote:
[...]
No, I was not really joking, though I did not attempt to find all the
languages that would meet the stated requirement.

Don't look now, but if you weren't joking, then you recommended Ada to
a Java programmer! Oh my.

I often suggest Ada to Java programmers; knowledgeable Java programmers
often return the favor; I've learned a lot that way.

All Java developers should master at least one language from the
Pascal/Modula-2/Ada family.
Add Algol 68 to that list. Compilers and runtimes are available and free.

I knew Algol 60 and a smattering of Pascal when I learnt it.

A68 was a huge eye-opener in terms of expressiveness and its ability to
handle variable record structures that I thought could only be
successfully handled in assembler: this was in pre-C days. For George 3
hands, these were G3 job accounting records. Others should think of ASN.1.

I don't know Algol, so I could not add it to the list.

But I do know that the begin/end family of languages are children
of Algol.

Arne
 

Arne Vajhøj

I agree, with caveats. The larger issue is the proper treatment of
numerical quantities in business applications. Leaving aside currency,
which there is _some_ awareness of in terms of appropriate things to do,
the expression of other numerical quantities in Java is typified by the
use of unconstrained primitives, with haphazard and inconsistent bounds
checking scattered over the code. Maybe it's just me, but wouldn't a
better approach to a numerical data type be to write its own class,
which is responsible for its own invariants (*) (**)? Hang the minor to
moderate performance implications: what's more important, correct code
or fast code?

My questions to a Java programmer who wants (or thinks they want)
overflow detection would include: are you using the correct primitive
type (or wrapper for a primitive)? Do you even know the design ranges
for your quantity? Why are you wanting to rely on overflow detection to
save your program when 99 percent of the possible legal values for your
chosen data type are also wrong for the design problem, and you're
obviously not concerned about that at all? Why don't you write a proper
class for your data type?

PEAA (Patterns of Enterprise Application Architecture) recommends a money class.
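
For illustration, a minimal sketch of such a class (the minor-units
representation and the method names are assumptions, not PEAA's exact code;
Math.addExact needs Java 8+):

    // Minimal money-style value class: immutable, responsible for its own invariants.
    public final class Money {
        private final long cents;       // minor units, avoiding binary floating point
        private final String currency;  // e.g. "USD"

        public Money(long cents, String currency) {
            if (currency == null || currency.isEmpty()) {
                throw new IllegalArgumentException("currency required");
            }
            this.cents = cents;
            this.currency = currency;
        }

        public Money plus(Money other) {
            if (!currency.equals(other.currency)) {
                throw new IllegalArgumentException(
                    "currency mismatch: " + currency + " vs " + other.currency);
            }
            // Math.addExact throws ArithmeticException rather than wrapping.
            return new Money(Math.addExact(cents, other.cents), currency);
        }

        public long inCents() {
            return cents;
        }
    }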

Arne
 

Arne Vajhøj

agreeCount = agreeCount.plus(AgreeCount.ONE)
on painfulness of non-primitive math.

It does not look good.

But I find it hard to believe that it is a significant problem in
typical business apps.

Arne
 
