Possible Loss of Precision - not caused by type conversion

Thomas

Hope I will not get banned for too many posts :)
The code below does not want to compile. I get "Possible Loss of Precision"
in the lines:
'j=last;'

'sito[j]=i;'

'increase(i,last);'

What's wrong with it?
////////////////////////////////////////////////////////////////////////////

package narzedzia;

public class LiczbyPierwsze {
    protected final static long length = 21;
    protected final static long sito[] = new long[(long) 1 << length];

    // sieve initialization:
    {
        sito[0] = 1;
        sito[1] = 1;

        long last = 0L;
        long j;
        long i = 2;
        while (i < length) {          // i holds the current prime
            j = last;
            while (j < length) {      // mark the numbers whose smallest divisor is i
                if (sito[j] == j)
                    sito[j] = i;
                j += i;
            }
            increase(i, last);        // find the next prime for the sieve
        }
    }

    public final static long czyPierwsza(long x) {
        // not implemented yet
        return 0;
    }

    public final static long naCzynnikiPierwsze(long x) {
        // not implemented yet
        return 0;
    }

    private final static void increase(long x, long last) {
        while (sito[last++] != (long) 0);
        x = (long) sito[last];
    }
}
 
Lew

Thomas said:
Hope I will not get banned for too many posts :)
The code below does not want to compile. I get "Possible Loss of Precision"
in the lines:
long last = 0L;
long j;
long i =2; ....
'j=last;'

'sito[j]=i;'

Maybe because
Arrays must be indexed by int values; short, byte, or char values may also be used as index values because they are subjected to unary numeric promotion (§) and become int values. An attempt to access an array component with a long index value results in a compile-time error.
'increase(i,last);'
The value of i will not change from this call - you knew that, yes?
private final static void increase(long x, long last){
while(sito[last++]!=(long)0);
x =(long)sito[last];
}

The value of x from the assignment is thrown away.
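
For what it's worth, here is a minimal sketch of one way out of both problems
(nextUnmarked is just an illustrative name, and it keeps only the scanning
idea, not your exact sieve logic): narrow the index to int, and return the
result, since a Java method cannot modify the caller's copy of a primitive
argument.

private static long nextUnmarked(long[] sieve, long from) {
    int idx = (int) from;        // array indexes must be int
    while (sieve[idx] != 0L) {   // a real version would also bound-check idx
        idx++;                   // skip entries already marked non-zero
    }
    return idx;                  // caller assigns the result: i = nextUnmarked(sito, last);
}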
 
Thomas

Lew said:

The value of i will not change from this call - you knew that, yes?

Well I know, but I forgot :).
Now after changing from long to int it works. Java sux if it doesn't allow
more flexibility. THX
 
Patricia Shanahan

Thomas said:
Hope I will not get banned for too many posts :)
The code below does not want to compile. I get "Possible Loss of Precision"
in the lines:
'j=last;'
....

I don't understand that particular error, and Eclipse did not report it.
j and last are both of type long, so the assignment should be valid.

There is an error on each attempt to use a long as an array index. That
is due to the unfortunate decision to limit Java arrays by using int for
array size and index.

Patricia
 
Lew

Now after changing from long to int it works. Java sux if it doesn't allow
more flexibility. THX

If you have an array of long with 2 billion entries it will occupy over 16GB
of heap - the issue of your index will be moot.

Since arrays are limited to Integer.MAX_VALUE entries, using a long
index risks an out-of-bounds problem, hence the restriction to int indexes.
You used a long, which in conversion to int risks losing precision, and this
is the error you got. In today's world an array with more entries than that
is not manageable anyway.

You can always downcast the long index to int for use in the indexing expression.
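
For example (a one-liner against your sito declaration, assuming j is known
to fit in an int):

long j = 10L;
sito[(int) j] = 2L;    // explicit narrowing cast satisfies the compiler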

I cannot begin to guess how you can characterize this as an issue of
"flexibility".
 
getsanjay.sharma

"Lew" <[email protected]> napisa? w wiadomo?ci
I do not begin to guess how you can characterize this as an issue of
"flexibility".

I agree. Plus no one 'forces' anyone to use arrays. You can always use
'ArrayList' if you want _flexibility_.
 
Lew

I agree. Plus no one 'forces' anyone to use arrays. You can always use
'ArrayList' if you want _flexibility_.

Although it lacks the particular feature the OP wanted, the ability to index
outside the int range.
 
getsanjay.sharma

Although it lacks the particular feature the OP wanted, the ability to index
outside the int range.

I know it sounds expensive memory-wise, but the OP can always use a
HashMap; with autoboxing, he can index outside the int range.
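
Something like this sketch, say (SparseLongArray is just an illustrative
name, not a standard class):

import java.util.HashMap;
import java.util.Map;

// A sparse, long-indexed "array" backed by a HashMap. Autoboxing turns
// the primitive long keys and values into Long objects automatically.
public class SparseLongArray {
    private final Map<Long, Long> entries = new HashMap<Long, Long>();

    public void set(long index, long value) {
        entries.put(index, value);    // index may exceed Integer.MAX_VALUE
    }

    public long get(long index) {
        Long v = entries.get(index);
        return v == null ? 0L : v;    // unset entries read as 0, like a long[]
    }
}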
 
Twisted

If you have an array of long with 2 billion entries it will occupy over 16GB
of heap - the issue of your index will be moot.

Yeah, since 640K really ought to be enough for anybody, even far into
the future ... ;)
 
Patricia Shanahan

Lew said:
If you have an array of long with 2 billion entries it will occupy over
16GB of heap - the issue of your index will be moot.

Not if you have a 64 bit JVM and a server with a large memory.

Patricia
 
Lew

Patricia said:
Not if you have a 64 bit JVM and a server with a large memory.
(and -Xmx configured accordingly)

Good catch - even though I have a 64-bit machine for my own development I keep
forgetting what a huge difference it makes. I found
<http://msdn2.microsoft.com/en-us/vstudio/aa700838.aspx>
which examines the effect objectively for both .NET and Java (IBM WebSphere
with their JVMs).

I agree that an argument can be made that it was shortsighted of Sun to limit
array indexes to int, but the fact is that they did and it is documented in
the JLS. Does Sun have a plan to change this, or to introduce a large-array
type to Java? A sparse-array type?

I foresee a day when people will superciliously disparage those who once
thought 64 bits provided enough address range, along the lines of those now
parroting the "and you thought 640K is enough" canard.
 
Karl Uppiano

Lew said:
I agree that an argument can be made that it was shortsighted of Sun to
limit array indexes to int, but the fact is that they did and it is
documented in the JLS. Does Sun have a plan to change this, or to
introduce a large-array type to Java? A sparse-array type?

Technology marches on, but the idea of using 'int' for array indices was
probably a compromise between performance and size for the computers of the
mid-1990s. The question I have to ask is: how often does someone need to look
up one of 2 billion entries that quickly, and does it make sense to have it
all in memory at once? If not, then an array might not be the right data
structure anyway.
 
Arne Vajhøj

Karl said:
Technology marches on, but the idea of using 'int' for array indices was
probably a compromise between performance and size for the computers of the
mid-1990s. The question I have to ask is: how often does someone need to look
up one of 2 billion entries that quickly, and does it make sense to have it
all in memory at once? If not, then an array might not be the right data
structure anyway.

True.

But should the language enforce the restriction ?

Arne
 
Patricia Shanahan

Karl said:
Technology marches on, but the idea of using 'int' for array indices was
probably a compromise between performance and size for the computers of the
mid-1990s. The question I have to ask is: how often does someone need to look
up one of 2 billion entries that quickly, and does it make sense to have it
all in memory at once? If not, then an array might not be the right data
structure anyway.

Of course one would want other data structures, such as a rectangular
matrix, but an array is a good starting point. Most data structure
operations can be expressed in terms of array access, and several of
Java's Collection classes are effectively built on arrays.

There are a lot of tasks that can be done out-of-core, with explicit
program transfers between slices of a file (representing the logical
array) and chunks of memory. However, such algorithms are significantly
more complicated to code than their in-core equivalents. For example, I
remember a program for solving 50,000 linear equations in double complex
that was primarily a data movement program, copying chunks of a single
logical array between files and memory.
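
Something along these lines, for instance (FileBackedLongArray is an
illustrative name, and a real out-of-core program would transfer whole
slices rather than single elements):

import java.io.IOException;
import java.io.RandomAccessFile;

// A logical long[] stored in a file, accessed one element at a time.
public class FileBackedLongArray {
    private final RandomAccessFile file;

    public FileBackedLongArray(String path, long size) throws IOException {
        file = new RandomAccessFile(path, "rw");
        file.setLength(size * 8);    // pre-size the file: 8 bytes per long
    }

    public long get(long index) throws IOException {
        file.seek(index * 8);        // byte offset of element 'index'
        return file.readLong();
    }

    public void set(long index, long value) throws IOException {
        file.seek(index * 8);
        file.writeLong(value);
    }

    public void close() throws IOException {
        file.close();
    }
}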

At each memory increase so far, there have turned out to be jobs that
were best expressed using a single array occupying most of the new
memory size. One of the benefits of increased memory size is making
those jobs simpler, by allowing the natural large array representation.
Why should the Integer.MAX_VALUE boundary be different?

Patricia
 
Twisted

I agree that an argument can be made that it was shortsighted of Sun to limit
array indexes to int, but the fact is that they did and it is documented in
the JLS. Does Sun have a plan to change this, or to introduce a large-array
type to Java? A sparse-array type?

Java already HAS a sparse-array type -- it's called HashMap<Long,
Foo>. Actually, it has HashMap<BigInteger, Foo> as well if you really
need it.
 
Karl Uppiano

Patricia Shanahan said:
At each memory increase so far, there have turned out to be jobs that
were best expressed using a single array occupying most of the new
memory size. One of the benefits of increased memory size is making
those jobs simpler, by allowing the natural large array representation.
Why should the Integer.MAX_VALUE boundary be different?

I don't know the answer to that. But the engineers that designed Java made
many good decisions; they weren't stupid. They might have made a mistake,
but my gut tells me that it is more likely a typical case of an engineering
compromise, a trade-off.
 
Lew

Java already HAS a sparse-array type -- it's called HashMap<Long,
Foo>. Actually, it has HashMap<BigInteger, Foo> as well if you really
need it.

Excellent point.
 
Roedy Green

I agree that an argument can be made that it was shortsighted of Sun to limit
array indexes to int, but the fact is that they did and it is documented in
the JLS. Does Sun have a plan to change this, or to introduce a large-array
type to Java? A sparse-array type?

The problem with giant arrays is garbage collection. You have to copy
these great monsters around and find holes big enough for them. From
a GC point of view, it might make sense to implement a big array as a
number of smaller arrays.

So what we might see is a BigArrayList which works just like ArrayList
but with long indexes. It would likely allocate RAM 64K or so at a
pop, as needed, so it would work for some types of sparse array.

The alternative would be to shove giant objects around by fiddling with
the mapping hardware, leaving the object fixed.

You could implement big arrays as a new LongArrayList that uses some
native classes to manage 64-bit indexing. This would be
considerably easier than adding 64-bit indexing to the language.
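
A rough sketch of the chunked idea (BigLongArray is an illustrative name,
not an actual class):

// A long-indexed array of long, backed by 64K-entry chunks that are
// allocated only on first write - so it also serves as a sparse array,
// and the GC never has to find one giant contiguous hole.
public class BigLongArray {
    private static final int CHUNK_BITS = 16;               // 64K entries per chunk
    private static final int CHUNK_SIZE = 1 << CHUNK_BITS;
    private static final int CHUNK_MASK = CHUNK_SIZE - 1;

    private final long[][] chunks;

    public BigLongArray(long size) {
        int nChunks = (int) ((size + CHUNK_SIZE - 1) >>> CHUNK_BITS);
        chunks = new long[nChunks][];                       // chunks start out null
    }

    public long get(long index) {
        long[] chunk = chunks[(int) (index >>> CHUNK_BITS)];
        return chunk == null ? 0L : chunk[(int) (index & CHUNK_MASK)];
    }

    public void set(long index, long value) {
        int c = (int) (index >>> CHUNK_BITS);
        if (chunks[c] == null) {
            chunks[c] = new long[CHUNK_SIZE];               // allocate on first write
        }
        chunks[c][(int) (index & CHUNK_MASK)] = value;
    }
}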
 
Roedy Green

But the engineers that designed Java made
many good decisions; they weren't stupid.

Considering they were designing for a TV set-top box, Java was
remarkably open-ended.

I got to meet and talk with Bill Joy. I came away greatly relieved.
He was very intelligent. He was taking into consideration all the
things about Java design I was worried about and many others besides,
and looking to the best compromise. Often the limit is just time and
money. Change is expensive.
 
