Help me!! Why is Java so popular?

  • Thread starter amalikarunanayake

Amali

Have you ever coded in C++? It's not at all similar to developing in
Java, unless you mean they're both C language derivations. C++
requires compilation to find syntax errors, and linking to resolve
missing or mistyped externals. This is completely dissimilar to Java
development in an IDE like Visual Age, WSAD, or Eclipse, where missing
imports, mistyped classes, and syntax errors are flagged in the
editor.

Do you use notepad to write Java and then run it against java.exe? If
you do that I suppose it would be similar, but most folks I know don't
develop Java in that way.

I'm not familiar with Python or Ruby, but I use Perl quite a bit and
believe it to be similar. I don't have a Perl IDE, so I use notepad
and run it with Perl.exe. Perl development for me falls somewhere
between C++ and Java, but closer to Java. There's no linking step,
and it's pretty fast to write something and test it.

The Perl language is syntactically similar, but it provides some great
benefits over Java or C++ for scripting. I love it for scripting; I
can't imagine using Java or C++ to replace Perl in the situations I
use Perl.

Everything has its place.






First who's "we"? Second, you don't have to be snide -- was I rude to
you? Third, I never said I was impressed with my skills; I said I
knew a decent number of languages fluently. There's a difference. I
mention my programming experience because I thought it relevant to
this thread. Why is java so popular -- I have my own perspective
based on experiences with more than one or two languages.

I never said my experience was impressive; never meant to imply it
either. It is what it is: experience.

Wanna know what's impressive about me? I'm a world class powerlifter.
That's right, I weigh 165 lbs and should break the US record for squat
next month with a 590-to-615 lb full squat, somewhere around there.
I've trained next to 290-lb pro football players and outlifted them.
Gather up a thousand 165-ish pound guys and see how many can do that. It
doesn't make me great, and it may not impress you, but down at the gym
people line up to watch me lift.

Hope that clarifies what I find impressive...

Anyways, Java is seen as a panacea in some places I know. C was a
panacea at one time. I remember Ada being proclaimed the mother of all
languages. I know people to this day who swear COBOL is all you ever
need. Sheesh. New languages will spawn from Java and C++, just as they
spawned from C, Smalltalk, Pascal, ALGOL 60, etc.

If all I'd ever coded was Java, or just C and Java, I might have a
different view than if all I'd ever coded was Smalltalk or something
like that. Besides, it doesn't matter if I'm impressed with myself or
with you or neither -- this thread is about why Java has become
popular.

Thank you very much for your contributions to the thread. Really, I can't
understand all of your arguments because I'm a Java beginner, but you are
exchanging your knowledge nicely.
Thanks again
 

senior

The biggest advantage of Java is in networking and distributed
computation.

Please Google the advantages of Java to see why it is popular,
and then you're welcome to ask any question.
 

nukleus

Lew said:
Great idea.

Well. I didn't expect youre gonna buy it.
But...

Oki, doki.
Just warn everybody here
that some things may...

Well, just relax if my monkey gets too wild.
You might see zome posts of yerr royal highness.

Let's narrow it down.
Which subject would you pick to reflect the
"Best of Lew"?

Your turn now.
 

nukleus

That's an excellent point.

To the spirit of the thread, and from my experience working in a
reasonably diverse range of languages, I still don't believe Java's
popularity has in any way, shape or form been a byproduct of its
legendary performance. I think in many cases the overhead incurred to
run a JVM is acceptable because of the functionality the code base
provides.

So winding back a bit, the reason that functionality can exist with
Java has much to do with the language/environment which frees
programmers from a lot of resource management and allows them to focus
more on plugging together components. Time to market and all that
jazz.

Like a lot of interpreted languages, Java's development cycle is
pretty excellent, I think. A programmer can get a fairly complicated
application off the ground quickly thanks to the interpreted-language
development paradigm, which allows for immediate detection of errors
and compile-and-test-as-you-code sessions.

I think those are the kinds of things which have made Java popular.
The performance issues are the trade off -- I find most Java-only
coders will explain away any performance issues by pointing out all
the great things you get in return, or how in controlled or specific
circumstances modern JVMs can use environment-specific knowledge to
coerce code into a better behaving form.

And don't forget the very fact that with the JVM,
you have to load the entire JVM to run a 2+2 program,
and that magnifies the size of a 1 meg program by
orders of magnitude. Before your main() is hit,
it has already swallowed at least 10 megs, and for what?

Secondly, there is no mechanism allowing users to run
the application in native mode, so, to deliver a 1 meg
app, you need to include at least 5 more megs of JVM
and convince users that they MUST install that JVM,
otherwise they won't even be able to run your app.

I wish there were native mode support, where a Java
app is compiled without the JVM, which is also bloatware.
You should be able to deliver your app in .exe format,
where the user simply double-clicks on it and it runs,
and there is no need to carry another 10 megs of its fat.

The vast majority of systems out there are Windows based.

Third, Java has too many layers of abstraction, which,
in turn, translates into pointers pointing to pointers,
pointing to yet more pointers, which cannot possibly
translate into equivalent performance.

All these layers are also a load on the developer,
as you cannot possibly do the simplest things
in the most direct way. Especially for people who
are new to Java, it becomes just a royal pain in the
neck. If you hit that help key expecting to get a
simple answer to your simple problem, you have to
go through several layers of abstraction and, therefore,
learn a lot of things you don't need at the moment.
And this is so pervasive that there basically exists
no aspect of Java where things are kept simple.
Yes, it does provide for more universality, but some
things look like overkill by orders of magnitude.

Another thing is the very lexicon of the language.
Those things that people have known for many years and
clearly understand the meaning of are all renamed.
GUI elements are called components and containers,
and just about everywhere you look, you have to
deal with pretty non-intuitive concepts.

First of all, components and containers have no association
in the human mind with GUI elements or structures.
Both of these rather associate with database types of ideas,
storage mechanisms, etc. When I first heard of "components"
and "containers", I had goose bumps, as I thought they
were going to bring up the same thing as Windows does,
a 5-layered set of complexities and an overkill by
orders of magnitude.

And this goes for just about any common-sense programming
issue. It is literally everywhere.

Another issue is this obsession with obsoleting things.
When you get the next version, you are pretty much
guaranteed that some things in your old code will
have to be rewritten to use the newer version of the
same thing. I have never heard of such issues with C/C++
code.

I like to compile code at the most detailed level
of warnings, and when a program compiles with warnings,
I don't like it. When I hit that "build" key, just
about ALL I want to see is a one-line message:
Success.

As to the graphical aspects of GUI design, to see people
telling you that the best way to do it is to
write GUI code by hand is simply insane!
WHAT?
By HAND is the BEST way of writing GUI code?
In Java?

I understand if it is an assembler, that is a different
story. In my entire experience with Windows and C/C++,
I never even heard of such a concept as writing a GUI
with my hex calculator.

And we are not even talking about all the various versions of
layout managers. For what? And the most powerful
of them, the GridBag, is screwed up in the most profound
sense of the word. Why?

Finally, just about every single "new" concept in Java
is something that already existed in object-oriented programming,
be it exceptions, events, file streams, stacks and
just about everything you can put your finger on.
Just about the only difference is that it is all wrapped
in a different package and named in just about the most
unintuitive way conceivable.

So... For you to write just about any realistically
complex app, you'd have to learn the entire thing,
and that can take years because of all those layers
upon layers upon layers of abstraction that promise
you the way to a wonder world at the end. Meanwhile,
you'd have to sweat and waste days on the simplest
concepts conceivable.

So, the learning curve is on par with the bloatware.

Another lil thing: can anyone imagine standard
C/C++ code becoming OBSOLETE with a new version
of the compiler?

Never heard of such a thing.
A file is a file and a GUI is just a GUI, a knob is a knob
and a text field is a text field.

Why should I abandon AWT?
It is just like abandoning some assembly-level
instructions in your CPU. The reason Intel and
Microsoft gained a monster share of the market
is that even today, you can pull out a program
written a generation ago, and it will run perfectly
on the latest and greatest version of your
bloatware, be it in a DOS box.

The whole JVM issue needs to be looked at.
The fact that you need to load the whole thing
every time you run ANY app is a monster-sized
overkill. It is like loading an operating system
just to output a "hello" string on your terminal.
I'm fluent in a decent array of languages, compiled and interpreted,
as well as a good number of assembler languages. If it does nothing
else, it helps me to avoid falling madly in love with any one language
and making everything about it okay. The JVM and its mitigation
strategies are great, but it's not about that -- it's about Java's
ease of coding that's made it popular. Coding Java is like traveling
in a Winnebago, whereas coding C is like traveling on a dirt bike.
There are tradeoffs to both....


One other point related to Java on the downside IMO is a side effect
of exactly what's good about Java: ease of code reuse.

I have seen a lot of inefficiently written components used in a lot of
applications. Because Java encourages the programmer to let the JVM
handle a lot of details, it seems to me -- and I welcome other opinions
on this -- that many Java coders get "lazy" about what's really going
on inside the code. If it's slow, they write it off as "there's a lot
going on." They seldom dig much below the surface, because that's
what Java seems to be trying to free you from having to do. It's
valuable in Java to make sure the appropriate techniques and objects
are used and to understand the implications of a bad choice -- refer
back to my 1st Java program, the "grep" implementation, to see what
wrong use/type of objects can do to runtime performance.

Java is popular partly (largely?) due to the ease with which you can
get an app going, but unless one is disciplined that very speed can
lead to less than optimal solutions. I'm certainly not suggesting
easier programming languages are bad, just thinking out loud with
respect to human nature.

Thoughts/experiences?

I wish the megawars between Sun, Microsoft and the other major
players would eventually lead to a commonly accepted set
of standards, instead of us sitting here, concerned that
with the next version, you may not even have legal rights
to include the next version of their super-sexy
version of the JVM, or whatever they call it, in the next
version of their bloatware.

The very fact that Microsoft does not support anything
beyond AWT is about the worst news for the entire Java
world.

I predict that in 2-3 years, Microsoft will obliterate
the whole Java philosophy and its mechanisms, and people
will not even see the word Java in documentation.
And they have ALL sorts of technologies developed,
time-tested, and working perfectly well.

Never mind, it is called bloatware, or even monsterware,
as far as I can see. It is ALL bloatware, no matter
how you look at it. You either incur it this way,
or that. That is the whole difference.

When I read some articles regarding the performance
issues, and Java freaks even going as far as to claim
that Java could be even MORE efficient than native
mode programs, I could not believe my eyes.

What happened to their brains?
Have they all gone mad?

How could this POSSIBLY be, even in theory?
It is EXACTLY the same argument as claiming that an
interpreter is more efficient than compiled code,
just because it has some optimized primitives.

I, personally, do not even want to hear about
things like the JVM. To me, it should all be hidden
and flexible enough not to carry orders of magnitude
more fat than I need. How can anyone in his right mind
claim that C++ code is less efficient?

Then what is JVM?

How come, for a program just to come up, you need to
wait for so long? How come my code carries the whole
universe of Java with the simplest program,
where 90% of it is never even executed and simply
sits there eating your memory like Pac-Man?

As I said before, and will repeat again: unless Java
is wired into your hardware, which is probably
a generation in the future, I just don't see it becoming
that brick out of which any building is built.
Way too much overhead, WAY too much unnecessary
complexity, way too many layers of abstraction.

The things they wired into the very language could
easily be provided by off-the-shelf libraries
and efficiently compiled, so that you only carry
those things your program refers to, and, at run
time, only those things you actually need
are loaded, and loaded on demand.

To me, any good program should come up instantly.
It is a royal pain in the neck to even attempt
to start just about any major program out there,
as you have to wait for minutes till you see
the main screen. By the time you see it,
look at your process monitor. You have lost tens
and even hundreds of megs of memory, and you
didn't even BEGIN to do anything.

Bloatware?

Bloatware is a compliment.
It is rather lunacyware.
 

nukleus

Have you ever coded in Java?

Java development is very similar to C++ development
and very distinct from development in the interpreted,
usually dynamically typed languages like Python and Ruby.


We have understood

Who is "we"?
that you are very impressed over your own skills.

What is this?
 

nukleus

????

Similar features are available in C++ IDEs.


Java with standard editor and command line build is similar to C++
with standard editor and command line build.

Java in IDE is similar to C++ in IDE.

C++ with standard editor and command line build is obviously
not similar to Java in IDE.

But that is a ridiculous comparison.


The readers of this thread.

I did not authorise you to speak for me.
No, but I strongly dislike people who think that various suspect claims
will appear more credible if they present themselves as great experts.
:--}

Interesting combination.
:--}

I, personally, like the range of experiences.
About the most difficult thing in the world
is to deal with those people who are one-sided,
with those blinkers around their eyes,
or flappers on their elephant-sized ears.
 

nukleus

"Chris Uppal" said:
I wouldn't call it "legendary" (it is very impressive technology -- but not
unique). I wouldn't call it "mythical" either ;-)

I would say that Java implementations have offered a balance between
    flexibility and other ease-of-programming aspects
    performance
    conformance to prejudice
!!!!!

which has been found appealing by many. It's the /balance/ which is important.

(I don't, in fact, agree with that judgement myself, but that's a question for
another day).



Hmm... Or maybe it's for today after all. Java was designed as if it was never
going to be interpreted. In most ways the language (or the compiler's
interpretation of the language) is totally static. (The /JVM/ on the other hand
is much more dynamic). There is almost nothing that the designers of Java
/could/ eliminate from the benefits of dynamic languages that they /haven't/
eliminated. Possibly in the misguided belief that they were impossible to
implement efficiently.

Fortunately, the runtime semantics of the JVM are more flexible.

And (perhaps even more fortunately) the creators of IDEs for Java have
attempted to replace it (at least at development/debug time) by a more dynamic
language with the same syntax but a less rigid semantics.

Yep. There IS something of this sort.
They haven't succeeded, of course -- in fact their efforts are laughable[*] --
but they /have/ tried, and they haven't entirely failed either.
([*] The scorn -- and it /is/ scorn -- is because I'm a Smalltalker in my other
life.)

Oh, jeeez. Smalltalk is quite an animal.
Like Prolog.
That is ENTIRELY different world.
My own impression is that, although you are right, a different language would
not materially change those programmers' attitudes. They /still/ would not
want to understand what they were doing in more detail, or at a greater depth,
than they could get away with. One of the parameters one can "twiddle" when
designing a language (or an API, come to that) is how the language copes with
such abuse. The Java approach is generally to try to protect them from /error/
while allowing performance (in all senses) problems. The C approach is
different. The C++ approach is (interestingly) a combination of the two -- at
the /language/ level it has very much the same philosophy as C, but it provides
better tools for creating safe APIs.
(Re-reading that last paragraph, I realise it sounds as if I'm claiming that
there's a black-and-white distinction between "good" programmers and
"lazy/ignorant" programmers -- I don't really mean that at all, but the sense
may come across better if I leave it overstated. Anyway, I can't be bothered
to re-write it now...)

Oh, and I must add: although I strongly support the idea that programmers
should understand the next level or two down of the operations they are
invoking, I don't think that the difficulty of that is always appreciated in
these days where the hardware itself is hugely complicated, with subtle and
non-obvious performance characteristics of its own.

To me, the information needs to be separated into
at least 3 categories.

1. Newcomers.
2. Intermediate level of expertise
3. Expert level.

When I start something new, it is all about the
information flow. I do not want to know about the
whole Java world when I am doing the simplest things,
and so, trying to get some info on some operation,
about the LAST thing I want to see is several
layers of abstraction, where I could not possibly
do what I want unless I dig all those references up.

All that stuff is outdated in my view.
It represents such an immense amount of overload
on the human mind that it is next to insane.

There is no need for that any longer.

It all can be presented at the level of a USER,
and not the level of a chief architect sitting
in an ivory tower.

The whole issue of information presentation
has to be rethought, and I have done some work
at that level.

What I have seen is that so-called documentation
is nothing more than an automated extraction
of the developer's comments from the source code.

That is a SHAME on Sun.
They can keep those specs separate.
Those are for EXPERTS that know the whole thing
in and out, more or less.

Another problem with the documentation
is the examples themselves.

Generally, Sun does not provide code examples
in their documentation. Just about ALL you have
are these dry one-liners, telling users virtually
nothing of value, at least from the standpoint
of where they are at the moment.

What are abstract classes?
Why are they needed?
What do they improve upon?
Which exact things need to be done to use them?
What is a code example that shows how to do it correctly?
What is a more expert view on it, reflecting some
internal issues and more specific details?
What is the HIGHEST level of it all
and the very philosophical underpinnings?

You see, it is all absent.
Even if you look at some of those "tutorials",
the information presented is on the most rudimentary
and utterly unrealistic level.
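
Just to show the kind of thing that is missing, here is a minimal
sketch of the sort of example I mean -- my own made-up names, nothing
from Sun's docs:

// A minimal sketch of an abstract class, with made-up names.
// The abstract class fixes the common behaviour once and leaves
// one step (area) for each concrete subclass to fill in.
abstract class Shape {
    // Shared, concrete behaviour.
    public String describe() {
        return "shape with area " + area();
    }

    // Every concrete subclass MUST provide this.
    public abstract double area();
}

class Circle extends Shape {
    private final double radius;

    Circle(double radius) {
        this.radius = radius;
    }

    public double area() {
        return Math.PI * radius * radius;
    }
}

public class AbstractDemo {
    public static void main(String[] args) {
        Shape s = new Circle(2.0);        // "new Shape()" would not compile
        System.out.println(s.describe()); // prints the computed area
    }
}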

Furthermore, the way the information is presented
is nothing but a torturing procedure of some sort.
You are given about 50 pages of stuff,
grinding over the same things. But you cannot get
to-the-point information on realistic, real-life
applications of it.

As a result, the burden on perception is simply
monstrous, and there is no need for that.
It simply defeats the very purpose of that very
documentation.

Trying to get info on just about anything imaginable,
I was not so pleasantly surprised.

With technologies of our time, it should take you
minutes, if not seconds, to get just about ANY
view on the information imaginable, and just about
ANY kind of example, helping you to understand
how something works.

It is time to rethink the whole informational aspect
of it all.
 

nukleus

Hmmmm, I dunno about that... have you done much with the WMFSDK?

Nope, and to tell you the truth,
every time I see these new buzzwords,
I have goose bumps.

And with every new version, they invent tons of new buzzwords.

You see, it is the very strategy.

The idea is this:
As soon as they release their next version,
with all sorts of new complications and complexities
of such magnitude that it'll take you literally years
to fully comprehend it, you'll be tied up in chains,
and you MUST learn it.

So, after a couple of years, you, eventually, learn it.
And...

The next revision gives you an entirely different worldview.

So...

You are FOREVER tied to it, never quite able to
finally say: I know this thing in and out, as with
the next revision, you are a newbie again.

That is how they keep ahead of everyone:
by simply creating tremendous amounts of information
and "new", "advanced" technologies. Just about the
simplest things there are get labeled as "new
technology". It is like being some kind of
electrician, forever taught how to use the latest
version of a lightbulb, super-wrapped into fully
distributed, globally accessible database systems
and virtual domains.
Intuitive is not a description that seems appropriate in the same
sentence with the proliferation of Interfaces supporting obscure and
eclectic functions embedded in the bowels of the documentation for
that SDK.

Yep, agreed 100%. The DARK BOWELS of documentation.
Interestingly enough, the new version of the documentation
does not even document things. When you hit on a subject,
just about all you have is a one-line sentence, and it is
not even clear what subsystem that thing relates to.
The documentation has become even worse than it used to be,
even though the previous versions were also vastly
incomplete and immensely convoluted.

But again, I think it is a deliberate effort.
They are not such dummies as not to understand
how to present the information in the easiest and
most insightful way, and having to deal with
billions of customers for generations.

It is not a mistake or omission, but the very
method of assuring that everyone remains at a comfortable
distance behind, and so, even if you just crawl like
a turtle, by the sheer fact that those "behind"
are carrying the mountains on their shoulders
that you yourself have created,
you have a comfortable margin to work with.

I, personally, think that Linux and the whole
GNU project are about the only hope left,
where all these bloodthirsty corporate mega-monsters
can no longer control the oxygen supply.

The very fact that it is open source
has produced more advancement in technology
than all the talk on this thread combined,
even though we are talking to the experts.

But it is still "us" versus "them" talk.
Java is "good".
C/C++ is "bad".

And on and on and on.
It is rather a religious exercise
in proving that MY prejudice
is more valid than yours.

Things like that.
Wanna programmatically cut a 5-second clip out of the middle
of an ASF movie file?

Nope. Sorry.
I try to stay away from all that madness.
What is there to cut in the first place?
Some horrendous destruction?
Blood all over my royal screen?
Intergalactic manifestations of the NWO?
The EVIL ruling the planet Earth?
Sorry, not interested.
Because, first of all, there is no information in any of it.
It is just a continuous, ongoing onslaught on your intelligence.
Long subject.

:--}
You'll need an IWMSyncReader, IWMWriter,
IWMProfileManager, etc. etc etc...

I know, I know.
It all makes me shiver.

Anything that starts with "I" in the MS worldview
is about the fastest one-way highway to hell.

That'll keep you busy for a couple of years
and you'll shell out kilobux on all sorts
of garbage literature, trying to get the idea
wa da funk they are talking about in the first place.
and about 3,000 lines of code later
you're done.

Well, you are lucky if that stuff even installs.
You'd have to download a few terabytes of their latest
and greatest I-SuperSync-Virtual-Multilayer-FullyZappazoid,
SuperTransContinental interface, and when you use your app,
you never know what it is sending in its fully encrypted
packets or what kind of jazz you are receiving from the
net, without even knowing it is happening.
Wanna do the same thing with an AVI file? Scrap your IWM stuff and
read up on DirectShow -- it's crazy, man.

But thats the whole idea, you see.

:--}
The other thing that's really badly managed by MS is incompatibilities
in released SDKs and a TON of obscure fixes via pragmas, deprecated
versions of SDKs, or instructions on how to modify the base code.

Like MFC -- you have to define the appropriate level with a #define
_WIN32_WINNT 0x0401 or some such rot, otherwise you'll get link and/or
compile errors.

Yep. Heard that before.
I was lucky that one of my apps even built under the last
version of their IDE.

But maaaan. Just trying to build it is like an interrogation
by the FBI.

All these "assemblies". Just try to hit help on
what an Assembly is.
Your hairs will rise a couple of feet tall.

After you traverse a few levels deep,
do you think it will become any clearer
what da funk that Assembly is,
beyond trying to slaughter Sun?
Not at all intuitive there.... a real impediment to programmer
productivity, IMO. I've personally spent hours perusing the WWW
looking for solutions to compile or link or coding problems which are
not easily found in msdn.microsoft.com only to find the answer in some
game developer forum or in comments in someone's sample program,
etc.

Yep. This is a VERY typical problem.
And Java docs are not much different in that respect.

As I said before, the whole issue of information presentation
needs to be completely scrapped.

Interestingly enough, I told some people here,
that this very group could be a grand collection
of useful information. But the very presentation,
depth and coverage need to be vastly improved.

Basically, the "standard" procedure is to give you
some web link, where you can go thru yet another
pile of garbage, just to get a simple, straightforward
answer.

But that very answer could be provided RIGHT HERE.
Just take a few minutes of your time
and describe it in sufficient detail and precision,
and, from then on, this issue is covered, done with,
and it becomes a part of the biggest global information
system known to mankind and available thru 100k servers.

As I said before, I have an archive of at least 25k articles
from this group, going back at least half a year,
and that information is available via some of the
biggest servers. For FREE. No need to even sign up.

And I have a gadget that allows extraction of that
information in a pretty precise manner, even if it
is poorly named, and to generate very fancy-looking
web pages, fully indexed, with a navigation
bar, and you can adjust the way it looks using
your own style sheets if you want.

In terms of learning, this is a goldmine.
Better than just about ANY doc out there.
I'd rather spend hours reading the Usenet articles
than trying to find it in some "documentation".
And I get a MUCH more in-depth view of things,
even if people "fight for their right"
to see things the way that is more natural to them,
instead of having an utterly biased, single view of the world,
which, in most cases, is nothing more than raging
marketing propaganda and a pile of confusion.
But that's stuff for a whole different thread -- it is possibly
relevant here in the sense that obstacles like that definitely will
push people toward a more "standard," non-OS specific development
language like Java.

Well, it is not a problem for me, even on this thread.
This is the very nature of Usenet.
Pretend you are in the main square of some ancient
city in Greece. You just get up on the podium
and say just about ANYTHING that comes to your mind,
and no one, mind you, no one, not even the Emperor himself,
is allowed to deprive you of that opportunity.

It is called Democracy.
People have forgotten all about it, it seems.
Just about all that is "acceptable" is
an utterly template-built worldview
of "black" and "white",
"virtuous" or "evil",
"good" and "bad",
and on and on and on.

But WHAT is "good"?
Can anyone define it?
In that sense Microsoft may be doing more to push people to other
solutions than they realize.....

True. And more and more people are switching to Linux.
I have not seen what is happening in the Linux world
for a couple of years, but I suspect it is a MUCH more
pleasant environment to deal with.

First of all, it is orders of magnitude leaner.
Then you have the full source code of every single
gadget or package. So you can see what's in the bowels
of it, learn how other people do it, and modify it
to either fix bugs or adjust it to the way YOU
see the world.

And THAT is the way to go.

This whole MS world is but an outdated worldview.
It cannot possibly have a future,
especially considering that the Linux/GNU world
has become something undeniably present.

I was pleasantly surprised to see that during
the debugging, I can step through the source code
of the JVM and see how my app interacts with it.
I like that idea.

Good luck.
 

nukleus

Not to worry. There are some priests around here
that will be doing just about all they can
to reduce you to dust.
In the beginning, it is subtle humiliation
and hidden ridicule, because most of those people
are perverts, who know no way of presenting
information directly, in an undistorted, unpoisoned way.

No need to even apologize.
Because the very trick is to try to make you feel
guilty, to constrict your energy, as once you are
on defensive, it is about the easiest thing in the
world to "finish you off".
True.

But utterly irrelevant to the part of your post I commented on:

Oh, cockfight again?

Oki, doki.
That'd be sufficient fer now.
 

Mark Thornton

nukleus said:
And don't forget the very fact that with the JVM,
you have to load the entire JVM to run a 2+2 program,
and that magnifies the size of a 1 meg program by
orders of magnitude. Before your main() is hit,
it has already swallowed at least 10 megs, and for what?

This simply isn't true. If you don't use the windowing bits, the
graphics-related DLLs don't get loaded. Ditto for various other components.
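
A quick way to check for yourself, as a sketch: compile a trivial
program and run it with the JVM's standard -verbose:class flag, which
prints every class as it is loaded; nothing AWT- or Swing-related
should show up for code like this.

// Trivial program for observing class loading.  Compile it and run:
//   java -verbose:class Hello
// The flag prints each class as the JVM loads it, so you can see
// whether any windowing classes come in for a program like this.
public class Hello {
    public static void main(String[] args) {
        System.out.println(2 + 2);
    }
}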

Mark Thornton
 

blmblm

[ snip ]
This kind of reminds me of my first stab at writing a Java program
that I'll share -- feel free to scroll by if this is not of
interest...

Some belated "just curious" questions here .... :

My first experience with Java was about 1996 when for grits and shins
I wanted to compare Java against C, since we were doing a lot of C and
Java was this "new kid on the block.". So I decided to do a program
that had a reasonable balance of I/O vs computational load as a means
to compare -- it was a searching function ala "grep" to scan log files
on a telephony switch. The logs were over a megabyte with ASCII text
-- pretty standard "logs" if you will.

Both programs required a case-sensitive string enclosed in quotes as
input. I really didn't know any of the particulars of Java, but since
it was block structured I decided for a "fair" comparison I'd use the
same algorithm in both. The algorithm looked more or less like this:

int line = 0;
open file
while( ! eof )
{
    buffer;
    ++line;
    if( readline(buffer) == endoffile )
        eof = true;
    else if( buffer contains string )
        print "found string on line #" line
}
close file;

The print was a printf in C and System.out.println in java. In C
buffer was declared as "char buffer[nnn]" and in Java it was a "String
buffer" --

String or StringBuffer?
for both it was declared inside the loop. The C search
used strstr(buffer, searchStr) and java used
buffer.indexOf(searchStr).

I know a lot of Java experts are shaking their heads,

Well, I'm no Java expert, just someone with intermediate-level
experience with the language, but -- I don't spot anything about
your pseudocode that's obviously bad. ?
but remember I
didn't know the nuances of Java; I just knew it was block structured,
and perusing a Java reference book was enough to get it to compile.

The benchmarking test used an RS/6000 with AIX, and the results were
staggering. The C program could search the entire log file and
produce the desired output in about 18 seconds. The Java version ran
for over 11 minutes before it finally blew off because it was out of
memory. I tried and retried and I could not get it to run through to
completion.

Dramatic results, all right. It might be interesting to repeat this
test sometime .... Hm, it might be interesting to first repeat it
using your original code (since Java compilers / runtime systems
have improved) and again with code written using what you now know
about Java.
Of course today I know using a String was bad and that a StringBuffer
would have been better,

It would? You're not changing the contents of the lines you read
from the file, so how would a StringBuffer be better?
plus declaring an Object inside the loop was
causing GC to run nonstop I imagine.

But, of course, you're not doing that; you're declaring an object
reference.
I hadn't used any kind of
BufferedReader -- I don't remember what I used to be honest, but it
wasn't the best choice.

Now *that* I can imagine making a difference.
So admittedly, I made a lot of java "rookie" mistakes.

Anyways, the exercise was valuable, as I realized Java simplified some
things, but to be good it required some meta knowledge that went
beyond syntax and semantics -- it was very important what objects you
used, where you put them, and that you understood the side effects
caused by the objects (sync vs non-sync, immutable, etc) you used.

For those who only code in Java you're probably saying to yourself
"okay, everyone knows that, so your point is?" -- but if you use most
other languages, a buffer is pretty much a buffer, and whether it's
declared as an object or a character array doesn't have such a
dramatic effect on performance, so outside of the Java community it's
compelling. I went into my exercise unaware of this difference, and
needless to say I wasn't highly impressed with Java at that time.

I'm not convinced that this makes Java different from other languages,
though. It seems to me that there are two issues here: whether
familiarity with the language's quirks can have a dramatic impact
on performance, and quality of implementation (compiler, library,
runtime system if any).

With regard to whether familiarity with the language's quirks can
affect performance, the obvious answer is "of course it can" -- and
I don't quite get how Java is different from C++, or any other
language, in that regard.

With regard to quality of implementation, well, as I said above,
I wonder whether if you repeated your tests now you would get the
same results. As I understand it, and as others have described,
changes to Java implementations over the years have resulted in the
potential for much better performance.


I did something along similar lines a few years back -- rewrote a C
program to compute something Mandelbrot-set-related (in which most
of the computation involves calculations using complex numbers) in
C++, specifically so I could replace the ugly use of two doubles to
represent each complex number with instances of a class for complex
numbers, complete with appropriate operator overloading. The resulting
program was a lot prettier but was slower by a factor of about 1.5.
To me this was a convincing demonstration that there *are* situations
in which it's a good idea to focus more on performance than on making
the code pretty, and I cited it often. Not too long ago, though,
I repeated this experiment and found .... that the C++ program ran
just as fast as the original C. So much for my nice example!
I guess my point is that claims about performance maybe have to be
re-evaluated from time to time. <shrug>, maybe.

[ snip ]
 

Chris Uppal

Christian wrote:

I once implemented a cryptographic hash function .. and I couldn't come
closer to the speed of a C++ implementation than a factor of about 0.5.

Since it's purely deterministic, I would assume a hash function can be very
well optimized at compile time in the case of C++, and JIT vs AOT should not
matter much to the program?

That's what I would have expected too.

Assuming that your code was transforming binary data (byte[] arrays) into more
binary data, and that your code wasn't dependent on the speed, or lack of it,
of utility code (such as, perhaps, java.math.BigInteger), then I don't really
see why there should be much difference.

<wild guess>
Did your code make much use of 2-dimensional arrays (or higher) ? That's the
only thing I can think of where similar-looking code in C and Java would
actually be doing something different.
</wild guess>

Bounds checking might make some difference, but I can't imagine it making a 2x
difference. (The impact of checking depends on the code, and on how good a job
the JIT and/or the processor's pipeline can do).
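
(For concreteness, a sketch of what I mean by "doing something
different", with made-up sizes: a Java double[][] is an array of
separate row objects, so each access goes through a row reference
first, whereas a C array is one contiguous block; flattening to a
single double[] is the usual way to get the C-like layout.)

// Sketch only -- illustrating the layout difference, not benchmarking it.
public class ArrayLayout {
    public static void main(String[] args) {
        int rows = 4, cols = 5;

        // An array of row references: each row is its own object, so
        // a[i][j] first loads the row reference, then indexes into it.
        double[][] a = new double[rows][cols];
        a[2][3] = 1.0;

        // One contiguous array with manual indexing, closer to what a
        // C "double a[4][5]" looks like in memory.
        double[] flat = new double[rows * cols];
        flat[2 * cols + 3] = 1.0;

        System.out.println(a[2][3] + " " + flat[2 * cols + 3]);
    }
}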

-- chris
 

John W. Kennedy

That's actually very interesting... why do you think that's the case?
I'm guessing the C++ Bitset template is grossly inefficient.

That's what they thought, so they wrote their own. It was better, but
Java still outperformed it.
 

Arne Vajhøj

Chris said:
That's what I would have expected too.
<wild guess>
Did your code make much use of 2-dimensional arrays (or higher) ? That's the
only thing I can think of where similar-looking code in C and Java would
actually be doing something different.
</wild guess>

I have seen huge differences for Java just between running with
-client and -server.

Arne
 

raddog58c

How difficult this is depends on how many variations you need to test.
For example SSE 1,2,3, AMD; number of processors (1 or >1). So 4 or 5
different levels of accelerated math, plus the number of processors, you
could be looking at 10 different DLLs for the same function. This gets
very tedious. The usual response is simply to insist on say SSE2.


Not having to duplicate all the methods which take String parameters.
What encoding would you assume for 'narrow' character strings?

Mark Thornton


I'm thinking I wouldn't have to worry because the JVM would handle it,
right? This would be an ideal run-time derived feature for the JVM to
provide -- NOP unnecessary translations based on the mode of
operation.

If I needed to know, a System.Getyadda call should obtain it.

If I was building a rather large application for in-house use only, I
wouldn't really care because I'd always be using the same format.
Today I don't spend an inordinate amount of time converting outside of
persisting data. I'm just saying it should improve performance if you
can eliminate unnecessary transformations.
 

Mark Thornton

raddog58c said:
I'm thinking I wouldn't have to worry because the JVM would handle it,
right? This would be an ideal run-time derived feature for the JVM to
provide -- NOP unnecessary translations based on the mode of
operation.

Using Unicode, whether encoded as UTF-16 or UTF-8, eliminates a whole
world of pain the moment you see a character outside your current
character set. If you never see anything outside traditional ASCII then
you may not appreciate this. I'd be very happy to see all the
traditional character sets disappear (CP-437, etc), leaving only the
Unicode encodings. I've had data sent to me without any declaration of
the character set in use and had to guess on the basis of the words
contained. One case I never did figure out --- it had probably been
mangled by other software that didn't understand the character set.

A simple example here with traditional software is what happens to our
currency symbol (£, pounds sterling) if you mix up code pages.
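
(A tiny sketch of that mix-up in Java, using only the standard
charset-aware String constructors: encode the pound sign with one
charset, decode it with another, and it no longer comes back as £.)

// Sketch: how the pound sign gets mangled when code pages are mixed up.
public class PoundSign {
    public static void main(String[] args) throws Exception {
        String pound = "\u00A3";  // the pounds sterling symbol

        // In ISO-8859-1 the symbol is the single byte 0xA3...
        byte[] latin1 = pound.getBytes("ISO-8859-1");

        // ...but that byte is not valid UTF-8, so decoding it with the
        // wrong charset yields a replacement character instead of £.
        String wrong = new String(latin1, "UTF-8");

        // Decoding with the charset actually used round-trips correctly.
        String right = new String(latin1, "ISO-8859-1");

        System.out.println(wrong + " vs " + right);
    }
}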

Earlier you mentioned the doubling of space caused by Unicode (assuming
UTF-16). This is only valid if most of your memory is taken up by text.
The only applications where this is likely to be true (word processors
and the like) ought to be capable of handling a wider range of
characters than ASCII. Even writing in English, I want a generous range
of mathematical symbols available (I am a mathematician).

Mark Thornton
 

raddog58c

How difficult this is depends on how many variations you need to test.
For example SSE 1,2,3, AMD; number of processors (1 or >1). So 4 or 5
different levels of accelerated math, plus the number of processors, you
could be looking at 10 different DLLs for the same function. This gets
very tedious. The usual response is simply to insist on say SSE2.

Lots of the very best things are tedious: graphics, device drivers,
parsers.

The key is using your base language's features to handle as much of
that as possible. That's what copybooks, macros, libraries, DLLs,
generic classes, etc, can do for you when applied the right way.

You write the hardware checking once, adapt it for new hardware, and
avoid recoding and distributing the soon-to-be out-of-date code
throughout your deployed app base.
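
(In Java terms, I picture it roughly like this -- a sketch with
made-up names: probe the capability once, hand back the matching
implementation behind a single interface, and the rest of the code
never knows which variant it got.)

// Sketch of "write the hardware check once": callers only see FastMath;
// the capability probe runs a single time and picks the implementation.
interface FastMath {
    double multiplyAdd(double a, double b, double c);
}

class PlainMath implements FastMath {
    public double multiplyAdd(double a, double b, double c) {
        return a * b + c;
    }
}

class AcceleratedMath implements FastMath {
    public double multiplyAdd(double a, double b, double c) {
        // Stand-in for a call into an accelerated native library.
        return a * b + c;
    }
}

public class MathFactory {
    private static final FastMath INSTANCE = pick();

    private static FastMath pick() {
        // Hypothetical capability check -- a real one might read CPU info
        // through native code or a system property set by an installer.
        boolean accelerated = Boolean.getBoolean("math.accelerated");
        return accelerated ? new AcceleratedMath() : new PlainMath();
    }

    public static FastMath get() {
        return INSTANCE;
    }

    public static void main(String[] args) {
        System.out.println(MathFactory.get().multiplyAdd(2.0, 3.0, 4.0));
    }
}
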
Not having to duplicate all the methods which take String parameters.
What encoding would you assume for 'narrow' character strings?

Mark Thornton

SIDE NOTE: You made me remember one other observation that I thought
would improve Java, which I wanted to mention for discussion's sake:

the String class.

My impression of String is that it was a design snafu in a way. It's
used like a scalar, but it's a very PHAT class. My initial (and
continuing) impression with String is that it's too heavyweight for
its place in the Java coding food chain.

I think it would have been a better design for String to have been the
minimum number of features for a string of base characters. In
essence the strcats, strlens, touppers etc that most pre-Java
programmers became intimate with early on.
From this base String class could have come a SuperString or
something, which included other features embedded in what is the
current String class. Upon that could have come a MultiLingualString
or UniString or whatever.

It's also inconvenient that String is declared final. You can
certainly wrap a String in a class of your own that contains it,
but you cannot derive a subclass from String. There's probably a very
good reason behind that of which I'm simply ignorant, but I was
surprised the first time I declared a convenience class derived from
String and it caused errors.
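
(The wrapping I mean looks roughly like this -- my own sketch with
made-up names, since "extends String" is rejected by the compiler.)

// class MyString extends String { }   // will not compile: String is final

// The workaround is composition: hold a String and delegate to it.
public class ConvenientString implements CharSequence {
    private final String value;

    public ConvenientString(String value) {
        this.value = value;
    }

    // The sort of convenience method that prompted the wrapper.
    public boolean containsIgnoreCase(String needle) {
        return value.toLowerCase().indexOf(needle.toLowerCase()) >= 0;
    }

    // CharSequence delegation so it can still be passed around as text.
    public int length() { return value.length(); }
    public char charAt(int index) { return value.charAt(index); }
    public CharSequence subSequence(int start, int end) {
        return value.subSequence(start, end);
    }
    public String toString() { return value; }
}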

Any thoughts on this line of thinking? Am I being "old school" in
wanting a primitive String, or do many think the same way? IMO
creating scalar-like base classes in the spirit of String that
contain everything but the kitchen sink AND are final seems like the
wrong way to factor down and implement that functionality.

Thoughts?
 
R

raddog58c

Not to worry. There are some priests around here
that will be doing just about all they can
to reduce you to dust.
In the beginning, it is subtle humiliation
and hidden ridicule, because most of those people
are perverts, who know no way of presenting
information directly, in an undistorted, unpoisoned way.

No need to even apologize.
Because the very trick is to try to make you feel
guilty, to constrict your energy, as once you are
on defensive, it is about the easiest thing in the
world to "finish you off".

LOL! Well, it's cool. I think Arne may have taken my language as
negative towards Java. That's not true at all -- I love pretty much
every language I've learned, from RPG II to Java and all points in
between.

The issue for me is that EVERY language has strengths and weaknesses.
This portion of the thread kinda got ignited when I said I didn't
write Java for my home PC. I don't because I don't want to incur
loading the JVM so that I can turn around and do a recursive tree
search through my directories to find the largest, or oldest, or a
particular file with a certain string. I can quickly write that in
Perl, C or C++, call the OS APIs in the case of C and C++, and I'm
done. A Java app doing the same thing will not run faster, the
overall algorithm will not be simplified, and worse the JVM will
negatively impact my overall system's performance because it requires
a lot of space to load.

If I owned a Winnebago, I wouldn't drive it to and from Quicktrip or
the grocery store -- but when I went on longer, more ambitious trips,
it's the ONLY thing I'd drive.

If I decide to write a big, feature-rich application for my home PC,
I'd definitely consider Java, especially if I want to make comms
across the WWW or XML configuration/parsing a part of it. Heck yeah I
would, because I'm going to save a TON of time using Java, and I'm
going to avoid having to write a TON of lower-level code to get it to
work.
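
(For instance, the XML configuration parsing I have in mind is only a
few lines with the parser that ships in the JDK -- a sketch, with the
file name and element names made up.)

import java.io.File;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

// Sketch: read a small XML config file with the JDK's bundled parser.
// "config.xml" and the "setting" element name are made up for illustration.
public class ConfigReader {
    public static void main(String[] args) throws Exception {
        DocumentBuilder builder =
                DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document doc = builder.parse(new File("config.xml"));

        NodeList settings = doc.getElementsByTagName("setting");
        for (int i = 0; i < settings.getLength(); i++) {
            Element setting = (Element) settings.item(i);
            System.out.println(setting.getAttribute("name") + " = "
                    + setting.getTextContent());
        }
    }
}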

I wrote a TSR back in the mid-80s that got a decent bit of acclaim.
Essentially it latched a couple of interrupts (timer, keyboard (int
16H), video (int 10H), int 3 (breakpoint), int 1 (single-step)) and by
depressing Ctrl+Alt+RightShift, I launched a step-by-step debugger a la
debug.com, only I didn't make any DOS calls, so it was completely safe
to trace into INT 21 DOS calls. This was invaluable to me at the time
because 90% of my code involved interrupt-driven device drivers that
couldn't be debugged with the conventional debuggers of the time.

I wrote the entire application in the 8086/8088 Intel instruction set
using MASM. Colleagues pestered me with "why didn't you use C? C can
do everything MASM can, plus it's portable and more easily
understood."

C would have been a pain because I didn't want library functions
making DOS calls and clobbering the OS.

MASM was absolutely perfect.

That doesn't mean MASM is perfect for EVERYTHING..... it was perfect
for that.

Languages are like tools in my tool chest. Maybe that's because, due to
my jobs, I've learned enough different ones that learning another new
one isn't a HUGE synaptic leap for me anymore. So I look at the
problem space, consider my "tools" and which one works best, and
that's how I do it.

Anyone who runs around saying one language can do it all probably only
knows that one language. My wife believes you can "fix" every broken
household device with WD-40, superglue, and a rubber mallet.

She reminds me of a lot of programmers I know.....

peace.
 

raddog58c

Some belated "just curious" questions here .... :




String or StringBuffer?

String. Before I get flamed, keep in mind I knew NOTHING about Java.
I picked up a Java Unleashed or some rot that was lying around,
perused it, figured String was the same as "char string[BUFSIZE];" and
so I used it. The language accepted it, no warnings, and thus I
figured it was all good.

I know now about immutability and such, I know now about garbage
collection, etc. It's not enough to understand syntax and semantics,
there's meta knowledge about how objects behave that's bigger than any
data structure in C, and more impactful than most (but not all) of
those in C++.
Well, I'm no Java expert, just someone with intermediate-level
experience with the language, but -- I don't spot anything about
your pseudocode that's obviously bad. ?

What's bad is the String class is immutable -- rereading into the same
buffer forced instantiation of a new String object every time.

A fair change in the C program would have been, instead of doing the
"char buffer[sizebuf];" type of thingy, to do:

char* buffer = malloc(sizebuf);
read(buffer);
search(buffer);
free(buffer);

That was inherent in the way String was used, but not blatantly
obvious to this programmer when coding the application at that time.

StringBuffer in place of String would have been closer to "char
buffer[sizebuf];"
Dramatic results, all right. It might be interesting to repeat this
test sometime .... Hm, it might be interesting to first repeat it
using your original code (since Java compilers / runtime systems
have improved) and again with code written using what you now know
about Java.

I'm going to do some benchmarking this week (time permitting) because
this thread has really sparked an interest in me.
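
For what it's worth, the version I'd write today looks roughly like
this -- a sketch, with the log file name and default search string
made up; the buffered line reading is the part I expect to matter most.

import java.io.BufferedReader;
import java.io.FileReader;

// Sketch of the "grep" test as I'd write it now: buffered line reads,
// indexOf for the search.  File name and search string are placeholders.
public class LogGrep {
    public static void main(String[] args) throws Exception {
        String searchStr = args.length > 0 ? args[0] : "ERROR";
        BufferedReader in = new BufferedReader(new FileReader("switch.log"));
        try {
            String line;
            int lineNo = 0;
            while ((line = in.readLine()) != null) {
                lineNo++;
                if (line.indexOf(searchStr) >= 0) {
                    System.out.println("found string on line #" + lineNo);
                }
            }
        } finally {
            in.close();
        }
    }
}
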
It would? You're not changing the contents of the lines you read
from the file, so how would a StringBuffer be better?

The StringBuffer would not be reinstantiated each time, causing the
garbage collector to run like a squirrel on crack. The original app
innocently violated Java programming best practices.
But, of course, you're not doing that; you're declaring an object
reference.

But forcing a "new" each time. A StringBuffer outside of the loop
instantiated once would circumvent some of the problems I experienced.
I'm not convinced that this makes Java different from other languages,
though. It seems to me that there are two issues here: whether
familiarity with the language's quirks can have a dramatic impact
on performance, and quality of implementation (compiler, library,
runtime system if any).

The difference between using a Vector and an ArrayList, for instance,
is that Vector has an inherent lock. You won't tend to find that in
char, char* to buffer, struct&, struct* in C, or object&, object*, etc
in C++. A COBOL PIC X(1000) or COBOL PIC X OCCURS 1000 TIMES wouldn't
get you either. Some objects in Java are orders of magnitude more
efficient than others, and you need to be aware of that when you use them.
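
(A sketch of what I mean: the two are drop-in replacements in code
like this, but every Vector method is synchronized, so single-threaded
code pays for locking it never needed.)

import java.util.ArrayList;
import java.util.List;
import java.util.Vector;

// Sketch: same algorithm, two collection choices.  Vector synchronizes
// every call; ArrayList does the same work without taking a lock.
public class ListChoice {
    public static void main(String[] args) {
        List<Integer> syncList = new Vector<Integer>();     // locks on each add/get
        List<Integer> plainList = new ArrayList<Integer>(); // no locking

        for (int i = 0; i < 1000; i++) {
            syncList.add(i);
            plainList.add(i);
        }
        System.out.println(syncList.size() + " " + plainList.size());
    }
}
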
With regard to whether familiarity with the language's quirks can
affect performance, the obvious answer is "of course it can" -- and
I don't quite get how Java is different from C++, or any other
language, in that regard.

With regard to quality of implementation, well, as I said above,
I wonder whether if you repeated your tests now you would get the
same results. As I understand it, and as others have described,
changes to Java implementations over the years have resulted in the
potential for much better performance.

I did something along similar lines a few years back -- rewrote a C
program to compute something Mandelbrot-set-related (in which most
of the computation involves calculations using complex numbers) in
C++, specifically so I could replace the ugly use of two doubles to
represent each complex number with instances of a class for complex
numbers, complete with appropriate operator overloading. The resulting
program was a lot prettier but was slower by a factor of about 1.5.
To me this was a convincing demonstration that there *are* situations
in which it's a good idea to focus more on performance than on making
the code pretty, and I cited it often. Not too long ago, though,
I repeated this experiment and found .... that the C++ program ran
just as fast as the original C. So much for my nice example!
I guess my point is that claims about performance maybe have to be
re-evaluated from time to time. <shrug>, maybe.

[ snip ]

I think it's about understanding effects.

For instance, in some native assemblers you can peruse the instruction
set and find that the choice of instruction can affect performance. A DIV
by a factor of 2 is far less efficient than a SHR. IBM's 360/370
instruction sets contain some peculiar instructions that were built to
enhance COBOL. The EDMK (I might have that mnemonic wrong -- been a
LONNNNNGGG time) can take packed data and make it into a character
string with dollar signs, commas and decimal points in one statement.
But your data has to be packed decimal to use it. Packed decimal is
not as good for math operations as binary. So you'd need to know that
so you could either convert to packed just before your EDMKs or leave
it packed all the time.

The average person would only care if it "worked" or not -- real-time
system programmers go nuts when they see inefficiencies at this level.

There are lots of collections that will work in the same algorithm, but
which one you choose and how you apply it can really make a big
difference. From the languages I know, this kind of data structure
choice is more prevalent by far with the Java runtime than with
anything else I've used.
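
(A typical example of the kind of choice I mean -- a sketch: both
collections answer contains(), but the list scans every element while
the set hashes straight to it, so the same algorithm scales very
differently depending on which one you picked.)

import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Sketch: the same membership test against two collection choices.
public class MembershipTest {
    public static void main(String[] args) {
        List<String> list = new ArrayList<String>();
        Set<String> set = new HashSet<String>();
        for (int i = 0; i < 100000; i++) {
            String key = "item" + i;
            list.add(key);
            set.add(key);
        }

        long t0 = System.currentTimeMillis();
        for (int i = 0; i < 1000; i++) {
            list.contains("item99999");   // linear scan each time
        }
        long t1 = System.currentTimeMillis();
        for (int i = 0; i < 1000; i++) {
            set.contains("item99999");    // hash lookup each time
        }
        long t2 = System.currentTimeMillis();

        System.out.println("list: " + (t1 - t0) + " ms, set: " + (t2 - t1) + " ms");
    }
}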
 

raddog58c

That's actually very interesting... why do you think that's the case?
That's what they thought, so they wrote their own. It was better, but
Java still outperformed it.

--

So do you suppose this was a case where late binding made the
difference, or do you think Java's implementation of whatever is used
by the aforementioned algorithm is significantly better?

I ask because I'm curious what other techniques might be used by the
JVM beyond instruction set coercion (sp?). Does the JVM recognize the
algorithm or some facet of it and take shortcuts the C++
implementation doesn't, or does this algorithm use vast amounts of
heap storage?

One thing that's different between C++ and Java is memory. I believe
the JVM grabs everything it will ever need at startup, but I'm not
100% sure this would be the case in a C++ .EXE. There might be some
memory dynamics in the form of quasi "lazy init" in C++ (i.e.,
non-preallocated heap) that causes C++ to invoke the native OS's getmem
API where the JVM does that up front.

Again, I don't know the algorithm, but if there are no instruction set
advantages and no algorithmic pruning taking place, then my guess
would be something along the lines of memory management. I do think
the JVM manages memory better than most operating systems -- it would
have to, because the cost of keeping it clean via garbage collection
would be overly expensive otherwise.
 
