Java 7 features

T

Twisted

The only thing that I definitely support is the last one, but probably
in an unusual way: indeed, get rid of checked exceptions by making
*all* exceptions checked.
For the skeptic: it can be done, see how Eiffel handles it.

Are you out of your cotton-picking MIND?! I don't want to have to
declare every method as "throws RuntimeException" just because just
about everything might throw NPE and any of several other assorted
bounds, arithmetic, etc. exceptions. And if you want Error declared
too, just forgeddaboudit.

You'd have to massively redesign the language to make this have any
beneficial effect, instead of people just slapping "throws
RuntimeException" on everything. For starters, you'd have to have
fields, parameters, and locals that could be null or could not be, say
with

String foo = whatever;
String? bar;

meaning foo is a String, and bar may be a String or null, to make this
at all workable. A lot of code that would generate NPE will simply not
compile without explicitly handling the possibility of a null then,
but it also means they'd have to rewrite the whole standard library(!)
and everyone would have to rewrite their legacy code(!!) ... if they'd
done this from the beginning it would have been a good idea, but
changing it now would trigger chaos that'd make the last-minute
scrambling to fix Y2K bugs look like a Sunday picnic. Note that non-
nullable fields would be required to be explicitly initialized or else
assigned in every constructor, whether directly or by constructor
chaining. The example above explicitly initializes the String foo. A
local variable would have to be assigned immediately on being declared
if it could not be null, e.g. String foo = whatever; but never String
foo. That alone would break nearly all code. A less disruptive
alternative is:

String! foo = whatever;
String bar;

Again bar is the one that can be null, but all legacy code works. But
you wouldn't see a decrease in implicitly NPE-capable code until the
new !-declarations became widespread in currently-used codebases, and
only then might having to declare this exception be anything but a
massive PITA. On the flip side, the ? version above would be a massive
PITA whether or not NPE became a checked exception at the same time...

And of course there's still the potential for the NPE to crop up at
runtime. The String!/String or String/String? distinction would
probably only exist at compile time, like generic type parameters do
now. Even if it did exist at run time, it would have to do something
when an attempt was made to assign null to the wrong kind of reference,
and the obvious thing is to eagerly throw the NPE that would otherwise
have happened anyway, just further away from the buggy code.
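To make that concrete, here is a rough sketch of the run-time half of
the idea as a plain library class (the NonNull wrapper and its names
are invented for illustration; nothing like it is in the actual
proposals):

    // Hypothetical illustration only: a wrapper that refuses null at
    // assignment time, so the NPE points at the buggy write instead of
    // some later read far from the cause.
    final class NonNull<T> {
        private final T value;

        NonNull(T value) {
            if (value == null) {
                throw new NullPointerException("null assigned to non-nullable reference");
            }
            this.value = value;
        }

        T get() {
            return value;
        }
    }

    class Demo {
        void example(String whatever) {
            NonNull<String> foo = new NonNull<String>(whatever); // throws here if whatever == null
            String bar = null;                                   // plain references stay nullable
            System.out.println(foo.get() + ", " + bar);
        }
    }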

Oh, and speaking of generics and type erasure, the #2 exception on the
official 10 Most Annoying If They Became Checked Exceptions list
is ... you guessed it ... ClassCastException...
 
T

Twisted

Don't care. Sounds like an unholy mess.

Rather than "Against. Sounds like an unholy mess."? :)

I just looked at the proposal. An example given:

Example:

elt.appendChild(
    <muppet>
        <name>Kermit</name>
    </muppet>);


Against, with "it's pointless" as my reason. We only need to maybe add
an appendChild(String) method that parses its argument, and:

elt.appendChild(
    "<muppet>" +
    " <name>Kermit</name>" +
    "</muppet>");

is legal Java with the required semantics. All you lose is compile-
time linting of the XML literals and pre-construction of XML objects.

It sounds like that's basically what was proposed anyway -- just XML
literals. Nothing fancy like the language-extensibility as I suggested
earlier. All it would do is a) give compile-time checking of the XML
code for well-formedness and b) give you preconstructed objects at run-
time, saving a little time when an appendChild(String) method would
have to parse and construct elements instead of an
appendChild(XMLElement) method getting a ready-made one on a silver
platter. I think there are bigger bottlenecks out there.
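For the curious, roughly what such a hypothetical appendChild(String)
would have to do at run time with the standard DOM APIs (the class and
method placement here are invented for illustration):

    import java.io.StringReader;
    import javax.xml.parsers.DocumentBuilder;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.Node;
    import org.xml.sax.InputSource;

    // Parse the fragment, then import and append the resulting element.
    class XmlAppend {
        static void appendChild(Element parent, String fragment) throws Exception {
            DocumentBuilder builder = DocumentBuilderFactory.newInstance().newDocumentBuilder();
            Document parsed = builder.parse(new InputSource(new StringReader(fragment)));
            Node imported = parent.getOwnerDocument().importNode(parsed.getDocumentElement(), true);
            parent.appendChild(imported);
        }
    }
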
Also don't know what this is/Don't care. If my bugs are due to type
casting of array elements then I have bigger problems than the version
of my programming language.

Problems of the "Oops, I used an array instead of ArrayList again"
sort? ;)

Don't know. Would have liked for the CLASSPATH problem to never have
been created. It is the same as the lib/DLL hell and include-path
problems of yesteryear. If Superpackages fixes that, then yes. I haven't
studied it, but I guess it is like Maven with net-based JAR linking.

I looked at this too now and it looks ... complicated. I thought JAR
files had largely superseded CLASSPATH problems, and JWS had largely
superseded JAR problems?

There will always be problems with shared code. If I create a library
Foo 1.0 and later, with the same package names and class names, a non-
backward-compatible Foo 2.0, and Joe creates Bar which uses Foo 1.0
under the hood, and Fred creates Baz which uses Foo 2.0 under the
hood, and Joe won't update Bar to work with Foo 2.0, and someone
writes application Quux which uses Bar and Baz ...

Well, if Bar can come with a "private" copy of Foo 1.0 it uses
internally, and Baz with a "private" copy of Foo 2.0, the problem
looks solved.

Until the author of Quux gets to the bit where he needs to write

BarObject futzmeier = new BarObject(someArguments);
futzmeier.doSomethingWith(aFooObject)...

and Foo 1.0 isn't visible.

OK, so he can include another copy of Foo 1.0 for Quux to use, but now
aFooObject might not look right to doSomethingWith because its run-
time class is a copy of FooObject loaded with one classloader and
doSomethingWith is expecting one whose run-time class is a different
copy of FooObject, loaded with another classloader.

Confused yet? Now picture
someBazObject.doSomethingElseWith(anotherFooObject)...

which needs a Foo 2.0 object...

*bangs head against wall*
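A rough sketch of that classloader clash, assuming a foo.jar on disk
containing a com.foo.FooObject with a public no-arg constructor (all
names made up for illustration):

    import java.io.File;
    import java.net.URL;
    import java.net.URLClassLoader;

    public class TwoLoaders {
        public static void main(String[] args) throws Exception {
            URL[] fooJar = { new File("foo.jar").toURI().toURL() };

            // parent = null, so each loader resolves com.foo.FooObject on its own
            ClassLoader barSide = new URLClassLoader(fooJar, null);
            ClassLoader quuxSide = new URLClassLoader(fooJar, null);

            Class<?> fooForBar = barSide.loadClass("com.foo.FooObject");
            Class<?> fooForQuux = quuxSide.loadClass("com.foo.FooObject");

            System.out.println(fooForBar == fooForQuux);                        // false: two distinct classes
            System.out.println(fooForBar.isInstance(fooForQuux.newInstance())); // false: not interchangeable
            // so handing Quux's copy of FooObject to Bar's doSomethingWith(FooObject)
            // blows up with a ClassCastException at the boundary
        }
    }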

The solution is really very simple. Do not make backward-incompatible
changes to libraries without also changing the package name. Put the
major version number somewhere in the package name! If everyone did
this, either Quux shipped with Bar, Baz, and Foo 2.0 would work, with
Bar accepting Foo 2.0 as a substitute for Foo 1.0, or else Quux would
come with Bar, Baz, Foo 1.0, and Foo 2.0, with the latter two not
clashing as one is com.foo.version1 and the other is com.foo.version2 ...

Unfortunately (or maybe fortunately), enforcing this worldwide is
nontrivial. :)

Then there's no need for "modules" with "hidden" JARs and all that
cruft. Just JARs, and a central place people can put the highest
version of a JAR they have to get rid of disk space wastage from
duplication...

(Sun itself is guilty of breaking backward compatibility. How many
times have we had someone complain here when something with "enum" in
it futzes up? The final straw being that apparently there's a whole
damn package out there with "enum" in its name, which is now useless
with anything later than Java 4...)

One thing I would like to see is very tight integration with Linux.
With Linux being open source, Sun could feasibly write a kernel
module and maybe speed up the JVM on Linux, I would guess.

What, and have all Java applications run with root privileges? Are you
fucking nuts? No amount of speedup is worth that.
 
T

Twisted

In my experience (from dealing in the past with C++ exceptions) this is
one of the things Java currently has right; you can use static analysis
tools to find misuse of checked exceptions (e.g. catch Exception) and
they greatly reduce the number of unexpected thread or process deaths
post deployment.

Static analysis tools are no substitute for human judgment, however.
If it finds "catch (Exception e)" in the code, don't instantly jump to
the alarm switch and break the "in case of emergency, break" glass;
first look at the code in question. It may turn out to be a perfectly
harmless and legitimate

catch (Exception e) {
    myLogger.log(e.toString());
    throw e;
}
 
T

Twisted

Would have been nice if you had numbered the items. =P

Yeah, it really was rather inconsiderate of the OP to assume that all
of the experienced IT professionals with years of experience and
schooling here would be able to count as high as seven. ;)
 
T

Tom Hawtin

Twisted said:
Secondly, operators are then just syntactic sugar as follows:
x * y -> x.multiply(y)

I want to multiply my Matrix by five using: 5 * m
It would be desirable to be able to set a thread-local MathContext for
BigDecimal calculations and scope it somehow.

Ick. Best not call any other methods in that scope (for instance by
causing a class to load)...

I'd suggest a more
general "environment" or "context" mechanism built on a (hopefully
speeded-up!) threadlocal mechanism, whereby you can use code like

ThreadLocal is already pretty fast. I believe it spends most of its time
in Thread.currentThread().

do switch (threadlocal: expression) { ... stuff ... }
It would be syntactic sugar for:

Object t3dsfgr838 = someThreadLocal.get();
someThreadLocal.set(expression);
try {
    stuff
} finally {
    someThreadLocal.set(t3dsfgr838);
}

Why not just use a closure? In current syntax (with restricted closure):

someThreadLocal.with(expression, new Runnable() {
    public void run() { ... stuff ... }
});

Or in my suggested syntax for anonymous inner classes, the same thing
only less restricted:

someThreadLocal.with(expression, ### { ... stuff ... });

In the Gafter et al. proposal, I believe (don't quote me) the equivalent is:

someThreadLocal.with(expression) { ... stuff ... }
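For what it's worth, no such with() method exists on ThreadLocal today;
a minimal sketch of what the helper could look like (the
ThreadLocalScope name and placement are invented here):

    // Hypothetical helper, not part of the JDK: run a task with a ThreadLocal
    // temporarily bound to a value, restoring the previous binding afterwards.
    final class ThreadLocalScope {
        static <T> void with(ThreadLocal<T> local, T value, Runnable task) {
            T saved = local.get();      // remember the current binding
            local.set(value);
            try {
                task.run();             // "stuff" runs with the new binding
            } finally {
                local.set(saved);       // restore even if the task throws
            }
        }
    }

so the restricted-closure version above would read
ThreadLocalScope.with(someThreadLocal, expression, new Runnable() {
public void run() { /* stuff */ } });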
And of course ThreadLocal should be made a) faster to access (at least
to just read) and b) generic (so with a ThreadLocal<String>, you would
not have to cast theThreadLocal.get() to String, etc.; why was this
done promptly for WeakReference et al. but not ThreadLocal? Arrrgh!)

(a) It is quite fast in comparison to BigDecimal operations.

(b) ThreadLocal is generic. IIRC, the example wasn't generified in 1.5.
In 1.6 the example has a bug (not from the original author).

http://java.sun.com/javase/6/docs/api/java/lang/ThreadLocal.html

Tom Hawtin
 
S

Stefan Ram

Tom Hawtin said:
I want to multiply my Matrix by five using: 5 * m

I never liked the asymmetry of »x.multiply( y )«.
I'd prefer:

x * y -> new BigDecimalPair( x, y ).multiply()

(The object creation overhead can be optimized
away for such cases.)

»5 m« might be written as follows:

new IntegerMatrixPair( 5, m ).multiply()

Or, why not use static methods?

x * y -> BigDecimalPair.multiply( x, y );

IntegerMatrixPair.multiply( 5, m )

Here, »BigDecimalPair« should be a friend of »BigDecimal« in
the C++ sense of »friend«, and »IntegerMatrixPair« should be a
friend of »Matrix«. - Java, of course, does not have a »friend«
keyword.
 
T

Twisted

(Yes, I realize that I am about to potentially unleash strong opinions.)

Recently, I was looking around online and came across this (partial) list
of new Java 7 features. What I want to know is what support/disapproval
people have of these options:

@ Closures
@ Strings in switch statements
@ Operator overloading for BigDecimal
@ Language-level XML support
@ Reified generics
@ Superpackages
@ Removing checked exceptions

And here are the good, bad, and ugly ones that weren't listed above
and for which I have particular comments:

Short instance creation
Description: Allow a more compact syntax for creation of local
variables to avoid repeating the assignment type and the
constructor.

This better just add the ability to write StringBuffer foo = new();
without making it impossible to do e.g. List structureList = new
ArrayList();

Sometimes you want the reference to be of interface or superclass
type. For example, you can change that to ...new LinkedList(); later
without breaking any other code in the method.

Comparisons for Enums
Description: Allow enum values to work with range operators
(<, >, etc)

boolean isRoyalty(Rank rank) {
    return rank >= Rank.JACK && rank != Rank.ACE;
}

Now just add allowing "case foo-bar:" in switch statements and we're
golden. Someone elsewhere in this thread lamented the lack of this
facility in current Java.

Of course, this should be done by providing operator overloading in
the manner I suggested elsewhere in this thread, with <= and >= and
their ilk usable on anything where the LHS implements Comparable<RHS>
or vice versa. (Where this means it implements Comparable<Foo> for
some Foo assignable from the compile-time type of the RHS.) If both
implement Comparable, the LHS's comparison method gets used. And of
course enums get Comparable (if they don't already), and "case foo-bar:"
boils down to "if (x >= foo && x <= bar) {" with further translation
for the >= and <= operators.
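In other words, something like the following sketch (the Rank enum and
the desugaring are my own illustration, not text from the proposal),
which already works today via compareTo:

    enum Rank { TWO, THREE, FOUR, FIVE, SIX, SEVEN, EIGHT, NINE, TEN, JACK, QUEEN, KING, ACE }

    class RankChecks {
        // what the proposed "rank >= Rank.JACK" could boil down to, since every
        // enum already implements Comparable against its own type
        static boolean isRoyalty(Rank rank) {
            return rank.compareTo(Rank.JACK) >= 0 && rank != Rank.ACE;
        }

        // and "case JACK-KING:" could likewise become a pair of compareTo checks
        static boolean inRange(Rank x, Rank lo, Rank hi) {
            return x.compareTo(lo) >= 0 && x.compareTo(hi) <= 0;
        }
    }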

invokedynamic
Description: Introduces a new bytecode invokedynamic
for support of dynamic languages.

I don't see what this buys us. No current or proposed legal Java code
would compile to bytecodes that would use this instruction. If this
affects anything, it's only non-Java languages that target the JVM.
Which makes it not a change to Java at all and irrelevant here.
 
D

David Gourley

Twisted said:
Static analysis tools are no substitute for human judgment, however.
If it finds "catch (Exception e)" in the code, don't instantly jump to
the alarm switch and break the "in case of emergency, break" glass;
first look at the code in question. It may turn out to be a perfectly
harmless and legitimate

catch (Exception e) {
    myLogger.log(e.toString());
    throw e;
}

The tool we use would not report this as an error (because the exception
is re-thrown).

On the other hand, code that catches Exception and does not rethrow it
is just plain evil.
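For contrast, a made-up example of the evil case -- the failure simply
vanishes and the caller never learns the value was never set:

    public class SwallowedException {
        static int port = 8080;

        static void configure(String portText) {
            try {
                port = Integer.parseInt(portText);
            } catch (Exception e) {
                // swallowed: no log, no rethrow, no recovery -- exactly what a
                // static analysis tool (or a reviewer) should flag
            }
        }

        public static void main(String[] args) {
            configure("not-a-number");
            System.out.println("port = " + port); // still 8080, and nobody knows why
        }
    }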

Dave
 
G

Guillermo Schwarz

But we already have (minimal) closures. I think it would be much more
sensible to improve what we already have, rather than to duplicate. It's
probably unlikely to get in before Java 1.8 anyway.

We already have Runnable. Who needs something else? Although it would
be nice if it were a la Smalltalk:

new Thread( new Runnable() { public void run() { doSomething(); } } );

Would become:

new Thread( <quote> doSomething(); );

Where <quote> could be:
1. [] a la Smalltalk, ie: [ doSomething(); ] It would look ugly in
Java, though.
2. ' (quote) a la Lisp, ie: 'doSomething(); It looks horrible in Java,
so forget about it.
3. # a la Smalltalk, ie: #doSomething(); Not bad, eh?

Not too fussed. It'll make switches on strings clearer, but perhaps will
encourage poor code.

Simply use hashCode(), ie:

switch( str.hashCode() )
{
    case "hello".hashCode(): ...; break;
    case "bye".hashCode(): ...; break;
}

BigDecimal and BigInteger are hideous to use at the moment. The overflow
semantics of primitive integers are not something that I think should be
in a high level language. I'd prefer general operator overloading of
sensible operators.

It would be nice if all numerical classes in Java were retrofitted to
descend from Number (or implement Number), so that you could declare a
number and it could contain any number: float, int, BigDecimal, etc.

Evil and unnecessary. A lot like much of XML.

Use XStream instead.

This could have made sense in 1.5. But I don't think this is really on
given the situation we have got ourselves into.

What do you mean?

Good. I guess packages provide what is necessary from superpackages, but
we tend to write libraries and applications in more than one package.

What is a superpackage?
Is it this? http://blogs.sun.com/gbracha/entry/developing_modules_for_development

I hope not.

First, EJBs already have modularization: you deliver two JARs, one for
use on the client and another for the whole implementation. It is
trivial (some could argue), but it already works.

The same approach could be used for superpackages: have a "client"
package with subpackages stuffed inside, and an "implementation"
package with everything else, so that all interfaces and Transfer
Objects go into the "client" packages and the rest holds the
implementations of those interfaces. Great!

EJB was a dead horse because the Enterprise Java Beans did not need to
implement the remote interfaces, and besides the home interfaces were
unnecessary. If they had only one interface and one class (it could
have many implementations, but follow the idea for a minute), it would
be nice.
I like static type checking.

Sometimes it gets in the way, but you can always convert those to
RuntimeException; what is the big deal?

Why do people need to force others into their preferences?
 
T

Twisted

I never liked the asymmetry of »x.multiply( y )«.
I'd prefer:

x * y -> new BigDecimalPair( x, y ).multiply()

[snip stuff with static methods, friends...]

Yuck. Complicated. Ugly.

x op y with only one reference type is handled by dispatching on the
reference-type argument, and looking for its compile-time type to have
a suitable valueOf() static method. E.g.:

5 * m -> m.multiply(Matrix.valueOf(5));

So a static, Matrix-returning valueOf method that accepts one integer
argument needs to be in the class Matrix in this case, and if
MatrixReloaded is the actual runtime type of m,
MatrixReloaded.multiply (but just Matrix.valueOf) actually gets
called.

And of course 5 - m would be m.subtract(Matrix.valueOf(5)).negate(),
versus m - 5 being m.subtract(Matrix.valueOf(5)) -- this means we need
a reciprocal() method in Matrix for 5/m to be legal of course. It
should be allowed to throw anything, to accommodate your
MatrixNotInvertibleException, which if a checked exception needs to be
handled or declared by the method with the "5/m" expression in it.

(To get the expected behavior from e.g. 5*m requires Matrix.valueOf(5)
return a matrix with fives on the diagonal entries and zeros
everywhere else. This doesn't seem especially efficient, but
Matrix.valueOf(5) could return a ScaleMatrix subclass that has the
same overhead for creation and use as an Integer.valueOf(5). Matrix
can double-dispatch multiply(), resulting in
scaleMatrixFive.multiply2(m) basically occurring. And that, in turn,
could just boil down to "return m.scalarMultiply(5)". Or Matrix can
specifically check its parameter for being a ScaleMatrix with
instanceof or an isScaleMatrix() method call or whatever. And if
Matrix doesn't do some of these efficient things, someone can write a
subclass MatrixRevolutions that does!)
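A rough sketch of that ScaleMatrix idea, with invented class names and
only the scalar path shown (the general product is stubbed out):

    // 5 * m would desugar (in the scheme above) to m.multiply(Matrix.valueOf(5))
    abstract class Matrix {
        static Matrix valueOf(int scalar) { return new ScaleMatrix(scalar); }
        abstract Matrix multiply(Matrix other);
        abstract Matrix scalarMultiply(int scalar);
    }

    // Cheap wrapper: creating it costs about as much as Integer.valueOf(5)
    final class ScaleMatrix extends Matrix {
        final int scalar;
        ScaleMatrix(int scalar) { this.scalar = scalar; }
        Matrix multiply(Matrix other) { return other.scalarMultiply(scalar); }
        Matrix scalarMultiply(int s) { return new ScaleMatrix(scalar * s); }
    }

    // A concrete matrix only needs the "is it a scale matrix?" check (or a
    // double-dispatch hook) to avoid building a full diagonal matrix of fives.
    final class DenseMatrix extends Matrix {
        final double[][] cells;
        DenseMatrix(double[][] cells) { this.cells = cells; }
        Matrix multiply(Matrix other) {
            if (other instanceof ScaleMatrix) {
                return scalarMultiply(((ScaleMatrix) other).scalar);
            }
            throw new UnsupportedOperationException("general product omitted from sketch");
        }
        Matrix scalarMultiply(int s) {
            double[][] out = new double[cells.length][];
            for (int i = 0; i < cells.length; i++) {
                out[i] = new double[cells[i].length];
                for (int j = 0; j < cells[i].length; j++) out[i][j] = cells[i][j] * s;
            }
            return new DenseMatrix(out);
        }
    }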
 
G

Guillermo Schwarz

If we had closures (and unnamed functions a la Smalltalk), we could
use HashMaps instead of switch statements.
 
G

Guillermo Schwarz

Not keen. I think this will needlessly make code hard to understand by
the general programmer.

Not really if they are implemented a la Smalltalk.

For example:

if ( x > 0 ) { doSomething( x + 1 ); }
else { doOther( x - 1 ); }

could be written:

cond1 = #[ x > 0 ]; // #[] means delayed evaluation; cond1.eval() will execute it
cond2 = #[ fun( fun a, fun b ) { if ( cond1.eval() ) a.eval(); else b.eval(); } ];
// since cond1 is delayed, cond1.eval() will execute it.
// fun means function; fun a means a is a function.
// A function without a name is written simply fun(...parameters...)

cond2.eval( #doSomething( x + 1 ), #doOther( x - 1 ) );
// cond2 was also delayed; in order to execute,
// 2 functions need to be passed as parameters.
// Please note that any 2 functions could be passed.
// #doSomething( x + 1 ) means pass the function instead of the result.

Not as big a deal as it once was, now that there are enums. The feature
will encourage inefficient code. I would like instead RANGES of ints
and enums that generate better code and code easier to maintain than
you would do manually. This is a relatively trivial extension,
requiring no JVM change. See http://mindprod.com/projects/caserange.html
for what I would like.

Wouldn't it be better to use RangeHashMap and closures?

fine. This can probably be done without a JVM change.

Actually the source for:

a + b

should be replaced with:

a.operator_plus( b );

So it can be done trivially by the compiler.

I don't know what that means. Reify means to regard or treat (an
abstraction) as if it had concrete or material existence. Does this
mean tossing out type erasure? Generics are an embarrassment to the
language. I personally would like to start from scratch and see if
they can be made 10 times simpler and wart-free.

What are the problems you are seeing in Generics?

Generics were supposed to fix what C++ templates got wrong, but having
worked in C++, I felt templates were a good idea at the beginning and
then became totally messy. The approach simply doesn't scale.
 
G

Guillermo Schwarz

I would rather prefer:

Map map = new HashMap();
map.put( y, #doThis() );
map.put( z, #doThat() );
map.put( null, #doSomethingElse() );
map.put( Default, #giveUp() );
map.switchOn( x );
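Without the # syntax, the closest today's Java gets is a map of
Runnables; everything below is invented purely to illustrate the shape:

    import java.util.HashMap;
    import java.util.Map;

    public class MapDispatch {
        public static void main(String[] args) {
            Map<String, Runnable> cases = new HashMap<String, Runnable>();
            cases.put("y", new Runnable() { public void run() { System.out.println("doThis"); } });
            cases.put("z", new Runnable() { public void run() { System.out.println("doThat"); } });
            Runnable defaultCase = new Runnable() { public void run() { System.out.println("giveUp"); } };

            String x = "z";
            Runnable body = cases.get(x);               // the "switchOn" lookup
            (body != null ? body : defaultCase).run();
        }
    }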
 
T

Twisted

We already have Runnable. Who needs something else? Although it would
be nice if it were a la Smalltalk:

new Thread( new Runnable() { public void run() { doSomething(); } } );

Would become:

new Thread( <quote> doSomething(); );

Where <quote> could be:
1. [] a la Smalltalk, ie: [ doSomething(); ] It would look ugly in
Java, though.
2. ' (quote) a la Lisp, ie: 'doSomething(); It looks horrible in Java,
so forget about it.
3. # a la Smalltalk, ie: #doSomething(); Not bad, eh?

Nope, but we still don't have ones with generic argument and return
types ... of course, for use with threads, you want some sort of
ResultHolder and in the Runnable:

synchronized (resultHolder) {
    resultHolder.set(someObject);
    resultHolder.notify();
}

instead.

For this, I suppose a Foo[1] is usable for a result of type Foo.
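A minimal sketch of that one-element-array ResultHolder idea (names
made up); the consumer waits until the worker publishes a value:

    public class ResultHolderDemo {
        public static void main(String[] args) throws InterruptedException {
            final String[] resultHolder = new String[1];

            new Thread(new Runnable() {
                public void run() {
                    String someObject = "computed value";
                    synchronized (resultHolder) {
                        resultHolder[0] = someObject;   // publish the result
                        resultHolder.notify();
                    }
                }
            }).start();

            synchronized (resultHolder) {
                while (resultHolder[0] == null) {
                    resultHolder.wait();                // lock released while waiting
                }
            }
            System.out.println(resultHolder[0]);
        }
    }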
Simply use hashCode(), ie:

switch( str.hashCode() )
{
    case "hello".hashCode(): ...; break;
    case "bye".hashCode(): ...; break;
}

Hashcodes aren't guaranteed to be unique. They could be used under the
hood to speed up switches on reference types, whose cases are
constants; a premade HashMap can be used at runtime to look up a
closure that was built by the compiler to actually contain the case
body bytecode, and the found closure invoked or else a default closure
is invoked. Cases with fallthrough produce closures that chain to
other closures under the hood.
It would be nice if all numerical classes in Java were retrofitted to
descend from Number (or implement Number), so that you could declare a
number and it could contain any number: float, int, bigdecimal, etc.

Ultimately, doing any serious math nicely means you need either double
dispatch to be easy to express, or a "ladder" (really a directed acyclic
graph) of potentially implicit type promotions, based on providing
Foo.valueOf(bar) static methods in more advanced types to upconvert
lowlier ones, including primitives. So we'd end up with

short
 |   \
int   float
 |  \    |
long   double
 |  \    |
BigI---BigD

And of course there are third-party types. What about vectors, tensors,
matrices, and suchlike? What about custom bignum types, which may have
different requirements and characteristics? (JScience has a
LargeInteger with faster multiplies at large sizes, a Real
representing an error-bounded quantity, and suchlike; it unfortunately
does not provide a LargeDecimal with the LargeInteger multiply...)

Sometimes it gets in the way, but you can always convert those to
RuntimeException; what is the big deal?

Why do people need to force others into their preferences?

Because unemployment of professional soldiers would skyrocket if there
were no fascist pigs in the world for them to either a) fight against
or b) work for. :p
 
G

Guillermo Schwarz

I'm sort of dreading this change, because it means once Java 7's out,
I will no longer consider myself a competent Java programmer. On the other
hand, might be a good opportunity to finally learn what closures are all
about.

They are not a big deal: they just reduce the clutter of Runnables and
let you write the code directly as a parameter, or assign the code to a
variable (and tell it precisely when to execute). Purists will say that
I'm describing anonymous functions; well, maybe. Java now has anonymous
classes, and Java purists said anonymous functions and anonymous
classes were equivalent. Probably they were computationally
equivalent, but I certainly don't want to program in Turing. ;-)

I thought "No Operator Overloading" was an argument for favoring Java
over C++. The purist in me is already annoyed with the + operator being
overloaded for Strings. This seems like a step in the wrong direction.

Generics were IMHO a big step in the wrong direction.

Operator overloading is complex in C++ because you can overload =, +=,
-=, etc. and not make them equivalent.

In Java, operator overloading should only be used for: +, -, * and /,
and their meaning should therefore let programmers write a += b and
automatically replace that by a = a + b (which would then get
translated to a = a.operator_plus( b );) No harm done.
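As a toy illustration of that convention (operator_plus is just the
naming convention described above, not anything the compiler recognizes
today):

    // A toy class following the operator_plus convention.
    final class Complex {
        final double re, im;
        Complex(double re, double im) { this.re = re; this.im = im; }

        Complex operator_plus(Complex other) {
            return new Complex(re + other.re, im + other.im);
        }
    }

    class Desugar {
        static void demo() {
            Complex a = new Complex(1, 2);
            Complex b = new Complex(3, 4);
            // the compiler would turn "a += b" into "a = a + b" and then into:
            a = a.operator_plus(b);
        }
    }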

I like checked exceptions. I'm part of the crowd that is willing to do
more typing in exchange for less buggy programs.

If you don't want to use checked exceptions you could always use
RuntimeException. One size does not fit all.
 
S

Stefan Ram

Guillermo Schwarz said:
Not really if they are implemented a la Smalltalk.

In Smalltalk, not even »boolean« is primitive; booleans and
conditionals are implemented relying solely on
object-oriented programming (polymorphism).

Could one translate, for example, the following
pseudo-Smalltalk to Java - without making any use of the
inbuilt Java-type »boolean« nor »java.lang.Boolean« nor
inbuilt conditionals of Java?

[ int i = 0; while( [ i < 3 ], [ int j = 0; print( i++, j++ )])]

On the following page, I indeed display such a Java program.

http://www.purl.org/stefan_ram/pub/smalltalk-blocks-in-java
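(Not the program from that page -- just a quick sketch of the dispatch
half of the idea: once a comparison hands you a Bool object, if/else is
plain polymorphism. The boundary test that produces the Bool below
still uses a built-in comparison, which the full exercise would have to
avoid.)

    interface Bool {
        void ifThenElse(Runnable thenBranch, Runnable elseBranch);
    }

    final class True implements Bool {
        public void ifThenElse(Runnable thenBranch, Runnable elseBranch) { thenBranch.run(); }
    }

    final class False implements Bool {
        public void ifThenElse(Runnable thenBranch, Runnable elseBranch) { elseBranch.run(); }
    }

    public class SmalltalkStyleBooleans {
        static Bool lessThan(int a, int b) { return a < b ? new True() : new False(); }

        public static void main(String[] args) {
            lessThan(1, 3).ifThenElse(
                new Runnable() { public void run() { System.out.println("then branch"); } },
                new Runnable() { public void run() { System.out.println("else branch"); } });
        }
    }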
 
R

Roedy Green

Benchmark running - Please minimize activity of other processes.
The columns
Identification number of the outermost iteration
Runtime of "if" divided by runtime of "switch"
Average of the preceding column

0; 0,73; 0,73
1; 0,93; 0,83
2; 0,93; 0,87
3; 0,93; 0,88
4; 0,93; 0,89
5; 0,93; 0,90
6; 0,93; 0,90



It depends on whether you use -client or -server

Here is Java 1.6 on my AMD 64 dual core, 2.047 GHz:

java -client SwitchTester
Benchmark running - Please minimize activity of other processes.
The columns
Identification number of the outermost iteration
Runtime of "if" divided by runtime of "switch"
Average of the preceding column
0; 1.18; 1.18
1; 1.19; 1.19
2; 1.14; 1.17
3; 1.09; 1.15
4; 1.27; 1.17
5; 1.18; 1.18
6; 1.18; 1.18

java -server SwitchTester
Benchmark running - Please minimize activity of other processes.
The columns
Identification number of the outermost iteration
Runtime of "if" divided by runtime of "switch"
Average of the preceding column
0; 0.75; 0.75
1; 0.94; 0.85
2; 0.82; 0.84
3; 1.11; 0.91
4; 0.86; 0.90
5; 0.84; 0.89
6; 0.91; 0.89
7; 0.85; 0.89
8; 0.91; 0.89
9; 0.86; 0.89
10; 0.78; 0.88
11; 0.92; 0.88
12; 0.83; 0.88

With Jet 5.0
Benchmark running - Please minimize activity of other processes.
The columns
Identification number of the outermost iteration
Runtime of "if" divided by runtime of "switch"
Average of the preceding column
0; 0.57; 0.57
1; 0.87; 0.72
2; 0.93; 0.79
3; 0.92; 0.83
4; 1.03; 0.87
5; 0.65; 0.83
6; 0.96; 0.85
7; 0.97; 0.86

So it looks like optimisers spend more time on IFs than on SWITCH.
This is puzzling. In assembler a jump table should be much faster
than any nested set of ifs, even if organised as a binary search.

The -genasm+ switch in Jet seems to be discontinued, so I can't easily
figure out what sort of code it is generating for both.
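(The SwitchTester source isn't shown in this thread, so the following
is only a naive sketch of the comparison being measured, not the
benchmark used above; take its crude timing with a grain of salt.)

    public class IfVsSwitchSketch {
        static int viaSwitch(int x) {
            switch (x) {
                case 0: return 10;
                case 1: return 11;
                case 2: return 12;
                case 3: return 13;
                default: return -1;
            }
        }

        static int viaIf(int x) {
            if (x == 0) return 10;
            else if (x == 1) return 11;
            else if (x == 2) return 12;
            else if (x == 3) return 13;
            else return -1;
        }

        public static void main(String[] args) {
            long sum = 0;
            long t0 = System.nanoTime();
            for (int i = 0; i < 100000000; i++) sum += viaIf(i & 7);
            long t1 = System.nanoTime();
            for (int i = 0; i < 100000000; i++) sum += viaSwitch(i & 7);
            long t2 = System.nanoTime();
            // same spirit as the "if divided by switch" column in the output above
            System.out.println("if/switch ratio ~ " + (double) (t1 - t0) / (t2 - t1) + " (sum=" + sum + ")");
        }
    }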
 
R

Roedy Green

So it looks like optimisers spend more time on IFs than on SWITCH.
This is puzzling. In assembler a jump table should be much faster
than any nested set of ifs, even if organised as a binary search.

I am wondering if parallel logic in the chip is the secret of the
speed of the nested if. Perhaps it can do several lookaheads.
 
