32 bit applet on 64 bit Java?

John Smith

Can a 32-bit applet run on 64-bit Java?

If not, does a 64-bit browser require a 64-bit Java?

Thanks in advance for your answers.
 

Joshua Cranmer

> Can a 32-bit applet run on 64-bit Java?

Java bytecode can run on any JVM, although if you use native libraries,
you naturally have to provide one for every platform you wish to support.

If you were asking if a 32-bit applet plugin could use a 64-bit JVM, I
believe the answer is no.
> If not, does a 64-bit browser require a 64-bit Java?

I am pretty sure that most browser plugin APIs require that a 64-bit
browser have only 64-bit plugins. Certainly, for best results, you
should use the same bit width.
 

John Smith

My situation is the following. I have a 32 bit java applet which
accesses JNI library which accesses some other library (APPLET -> JNI
LIBRARY -> LIBRARY).

I have tried to open it in the 64-bit browser: the applet loads, and it
successfully loads the JNI library. But when it starts to actually do
some work, everything crashes.

The browser is Safari on Mac OS X Snow Leopard. The JNI library and the
end library are both 32- and 64-bit compatible.

In summary:
64bit Safari -> 64bit JVM -> 32bit Applet -> 32_64 JNI Library -> 32_64
end library
 

John Smith

The crash happens when the JNI library executes a memcpy call.
I think the code should be OK because it has already worked for a long time.

It may be because the applet is 32-bit and the JNI is now compiled for
64-bit. The universal or 32-bit compiled JNI does not load, due to a
"wrong architecture" error.


1. Should I try to compile the JNI for 32-bit as well and try to make it
work? I get a compile-time warning ("file was built for unsupported file
format which is not the architecture being linked i386").

2. Or should I compile the Java applet for 64-bit?

3. Are Java applications even different when compiled with 32-bit and
64-bit Java?


Any suggestions?
 

Nigel Wade

> The crash happens when the JNI library executes a memcpy call.
> I think the code should be OK because it has already worked for a long time.

Hmmm, just because it's worked up to now in no way guarantees that the
code is ok. Especially so if the code is ported from a 32bit environment
and compiled for 64bit.

C (assuming it is C) is quite unforgiving of changes in architecture
when certain false assumptions are made (such as using non-pointer types
to store addresses/pointers and coercing the access). It's "valid" code,
in that the compiler won't necessarily warn you that what you've done is
wrong, and it will compile and run. It just won't work the way you want;
it may work perfectly well for quite some time, overwriting unimportant
code and data - until the day it doesn't work. Maybe that day is today.
> It may be because the applet is 32-bit and the JNI is now compiled for
> 64-bit. The universal or 32-bit compiled JNI does not load, due to a
> "wrong architecture" error.

Java byte-code is not 32bit or 64bit. It's architecture neutral.

However, the JVM plugin which is run by the browser, and any JNI which
is loaded by the JVM, will almost certainly need to match the
architecture of the browser. So if the browser is 64bit then the JVM
plugin will be 64bit and any JNI it loads will also need to be 64bit. As
will any other library that the JNI loads.
> 1. Should I try to compile the JNI for 32-bit as well and try to make it
> work? I get a compile-time warning ("file was built for unsupported file
> format which is not the architecture being linked i386").

I doubt that that will work. If the browser/JVM is 64bit it will only
execute 64bit code.
> 2. Or should I compile the Java applet for 64-bit?

There is no such animal. You can compile the Java using a 32bit or a
64bit JDK and run the generated byte-code on any compliant JVM, 32bit or
64bit.
> 3. Are Java applications even different when compiled with 32-bit and 64-bit Java?

Maybe very, very slightly, but not so you'd notice. They may be identical.

You really need to compile all your JNI, and associated other libraries,
as 64bit and verify that they actually execute as intended in a 64bit
environment.
 

Roedy Green

> Can a 32-bit applet run on 64-bit Java?
>
> If not, does a 64-bit browser require a 64-bit Java?

Applets don't have 32 or 64-bitness any more than ordinary
applications do, UNLESS the JNI were provided only for 32 or 64 bit.
 

Joshua Cranmer

> The crash happens when the JNI library executes a memcpy call.
> I think the code should be OK because it has already worked for a long time.

For starters, top-posting is considered bad form on Usenet.

Anyways, "it's always worked before" is not exactly a sign that the code
in question actually works [1]. Although, given my experience with
debugging C code, a crash here is generally a sign of a more pernicious
latent problem that has only now happened to manifest itself.
> It may be because the applet is 32-bit and the JNI is now compiled for
> 64-bit. The universal or 32-bit compiled JNI does not load, due to a
> "wrong architecture" error.

What do you mean by "the applet is 32-bit"? Java bytecode is completely
independent of architecture (indeed, that is its point), so the only
things that could be 32-bit or 64-bit are the JVM that you are running or
the native library code being called by said applet.

[1] Random digression. It's slightly annoying when you are trying to
point out why not to implement something in a certain way, and the
example you come up with on the fly would seem to give the same result.
Even more annoying is when the explanation as to why it happens in this
particular case is a ways beyond the scope of the class.
 

John Smith

> Hmmm, just because it's worked up to now in no way guarantees that the
> code is ok. Especially so if the code is ported from a 32bit environment
> and compiled for 64bit.
>
> C (assuming it is C) is quite unforgiving of changes in architecture
> when certain false assumptions are made (such as using non-pointer types
> to store addresses/pointers and coercing the access). It's "valid" code,
> in that the compiler won't necessarily warn you that what you've done is
> wrong, and it will compile and run. It just won't work the way you want;
> it may work perfectly well for quite some time, overwriting unimportant
> code and data - until the day it doesn't work. Maybe that day is today.

You are right.
It seems that some pointers are cast to int, which is 2 bytes IIRC.
Since a Java int is 4 bytes, it seems I should change both the applet and
the JNI library.

Do you think the way to go is to change the Java type to long and
the C jint to jlong (__int64)?


This is the sample method declaration:

---------------------------------------------------------------------
JAVA:
public static native void jni_SetMem(int i, byte abyte0[], int j, int k);

C:
SetMem(JNIEnv *pEnv, jclass jObj, jint i_pDest, jbyteArray jArr, jint
iDestOffset, jint iSize)
----------------------------------------------------------------------

Should I change it to:

---------------------------------------------------------------------
JAVA:
public static native void jni_SetMem(long i, byte abyte0[], long j, long k);

C:
SetMem(JNIEnv *pEnv, jclass jObj, jlong i_pDest, jbyteArray jArr, jlong
iDestOffset, jlong iSize) // maybe I could leave the size as int, but it
doesn't matter too much
----------------------------------------------------------------------


Thanks again for everyone's help.


> Java byte-code is not 32bit or 64bit. It's architecture neutral.
>
> However, the JVM plugin which is run by the browser, and any JNI which
> is loaded by the JVM, will almost certainly need to match the
> architecture of the browser. So if the browser is 64bit then the JVM
> plugin will be 64bit and any JNI it loads will also need to be 64bit. As
> will any other library that the JNI loads.

Thanks for the clarification. I was not sure how this works.
I'll keep the JNI compiled universally.
 

Joshua Cranmer

> It seems that some pointers are cast to int, which is 2 bytes IIRC.

On most systems, sizeof(int) == 4 (most 64-bit compilers keep
sizeof(int) equal to 4). sizeof(long) is problematic in C: some 64-bit
compilers keep it at 4, while others make it 8. sizeof(void*) is 8 on
64-bit systems, though.
> Should I change it to:
>
> ---------------------------------------------------------------------
> JAVA:
> public static native void jni_SetMem(long i, byte abyte0[], long j, long k);
>
> C:
> SetMem(JNIEnv *pEnv, jclass jObj, jlong i_pDest, jbyteArray jArr, jlong
> iDestOffset, jlong iSize) // maybe I could leave the size as int, but it
> doesn't matter too much

Strictly speaking, you want to move to size_t for iSize, which on common
platforms has the same width as void*, so yes, moving to long might be
preferable. You'll also probably want a check to make sure that the Java
long values are valid memory addresses for the processor you are on.
 

Nigel Wade

>> Hmmm, just because it's worked up to now in no way guarantees that the
>> code is ok. Especially so if the code is ported from a 32bit environment
>> and compiled for 64bit.
>>
>> C (assuming it is C) is quite unforgiving of changes in architecture
>> when certain false assumptions are made (such as using non-pointer types
>> to store addresses/pointers and coercing the access). It's "valid" code,
>> in that the compiler won't necessarily warn you that what you've done is
>> wrong, and it will compile and run. It just won't work the way you want;
>> it may work perfectly well for quite some time, overwriting unimportant
>> code and data - until the day it doesn't work. Maybe that day is today.
>
> You are right.
> It seems that some pointers are cast to int, which is 2 bytes IIRC.
> Since a Java int is 4 bytes, it seems I should change both the applet and
> the JNI library.
>
> Do you think the way to go is to change the Java type to long and
> the C jint to jlong (__int64)?
>
>
> This is the sample method declaration:
>
> ---------------------------------------------------------------------
> JAVA:
> public static native void jni_SetMem(int i, byte abyte0[], int j, int k);
>
> C:
> SetMem(JNIEnv *pEnv, jclass jObj, jint i_pDest, jbyteArray jArr, jint
> iDestOffset, jint iSize)
> ----------------------------------------------------------------------
>
> Should I change it to:
>
> ---------------------------------------------------------------------
> JAVA:
> public static native void jni_SetMem(long i, byte abyte0[], long j, long k);
>
> C:
> SetMem(JNIEnv *pEnv, jclass jObj, jlong i_pDest, jbyteArray jArr, jlong
> iDestOffset, jlong iSize) // maybe I could leave the size as int, but it
> doesn't matter too much


Sorry, I don't use JNI so I can't be of any real help on this matter.

Hopefully someone else who does use it, especially on different
architectures, can provide the help you require.
 

BGB

>> The crash happens when the JNI library executes a memcpy call.
>> I think the code should be OK because it has already worked for a long time.
>
> For starters, top-posting is considered bad form on Usenet.
>
> Anyways, "it's always worked before" is not exactly a sign that the code
> in question actually works [1]. Although, given my experience with
> debugging C code, a crash here is generally a sign of a more pernicious
> latent problem that has only now happened to manifest itself.

yep...

there are many evils which can lurk in a piece of code only to manifest
later. something can also work flawlessly in one place and fail
miserably in another.

hence, one needs to test their code on any relevant target, but OTOH,
one can set limits to how and where their code will work (say: this only
works on 32-bit x86, or this will work on 32 or 64 bit x86, or similar...).

> What do you mean by "the applet is 32-bit"? Java bytecode is completely
> independent of architecture (indeed, that is its point), so the only
> things that could be 32-bit or 64-bit are the JVM that you are running or
> the native library code being called by said applet.

well, in some sense, the bytecode is always 32-bit, given certain
properties:
long and double require 2 slots in the constant pool, locals frame, on
the stack, and in argument lists;
dup_x (dup_x1, dup_x2, dup2_x1, dup2_x2) will exhibit a lot of
funkiness, essentially treating the long/double entries as multiple entries;
....

(and in another sense it would seem better suited for a MIPS- or SPARC-
based interpreter than an x86-based one).


granted, it works just the same on 64 bits, either by needing 128 bits to
store them in this case (naive possibility A), or by glossing over the
issue in the JIT (say, the extra slot becomes 'void' and is not assigned
any physical storage).

OTOH, I had previously written a translator which tried to coerce these
types into only a single conceptual stack/args entry, but this made the
dup_x instructions very awkward...


granted, trying to make long/double be single entries on a 32-bit
interpreter would likely be more awkward, because it would either mean
naively using larger slots for all the other entries, using indirect
storage (larger types are internally passed by reference), or
type-boxing (expensive).

again, in this case, a JIT probably wouldn't care much.

> [1] Random digression. It's slightly annoying when you are trying to
> point out why not to implement something in a certain way, and the
> example you come up with on the fly would seem to give the same result.
> Even more annoying is when the explanation as to why it happens in this
> particular case is a ways beyond the scope of the class.
 
