Java Future


amanda

Tom said:
To a certain extent I agree, but it does allow you to read the details
of and experiment with the processor's own instructions; this is
invaluable information.

Yes, I want to know those things.
In addition to this, knowing how modern
processors and computers work helps you understand what actually
happens when you write a statement or algorithm in Java.

That's great. Thanks.
 

Simon Brooke

amanda said:
How does one determine whether one can pick up a new language quickly
or not? So far, they all seem pretty much the same to me, at least at
a basic level.

OOoooooh...

What you mean is that you've only really been exposed to languages which
trace their syntax and at least some of their core concepts back to Algol.
Try

APL http://en.wikipedia.org/wiki/APL_(programming_language)
Forth http://en.wikipedia.org/wiki/Forth
LISP http://en.wikipedia.org/wiki/Lisp_(programming_language)
(or Scheme, which is similar)
Prolog http://en.wikipedia.org/wiki/Prolog
Python http://en.wikipedia.org/wiki/Python_(programming_language)
SmallTalk http://en.wikipedia.org/wiki/Smalltalk

At least one of these will be a total revelation to you, and will enable
you to do things you wouldn't otherwise think possible; at least one of
them you will find completely impossible to learn. Which is which depends
on you. Apart from Python, none of those listed are currently commercially
important, but all are powerful tools and worth learning.

Note that there are implementations of all the languages listed (except
APL) which will compile for the Java Virtual Machine; see
http://www.robert-tolksdorf.de/vmlanguages.html
http://jvm-languages.com/
 

Simon Brooke

Chris Uppal said:
So, how long does it take to pick up a language ?  It depends on your
standards.  If all you are trying to do is get to the point where you
know the syntax well enough to write programs without having to check the
manual every time you use, say, a while loop; then I see no reason why
it should take longer than a weekend for most well designed languages
(simply because any well designed language has been designed to be
learned).  There are exceptions -- I doubt whether it is possible to
learn C++ (even to that minimal extent) in less than a week or so.  And I
doubt whether I personally would ever be able to learn APL (or its
descendant J) no matter how long I tried.

However, knowing a language to that extent is a useless party trick.  You
will neither be writing correct code yourself, nor able to follow other
people's code -- unless it is written in baby talk.  Learning a language
to the extent where you are naturally comfortable with full and idiomatic
use of its features takes a long time.  I would say 1 year of nearly
full-time use at a minimum. You would certainly be /productive/ long
before then, but not expert enough to be trusted to work safely
unsupervised.

While I'd agree that superficial knowledge of a language can be a dangerous
thing, I think you are pessimistic on the time frame - at least as far as well
designed languages are concerned. I also think that it's often easier to
learn a language which shares little or nothing in common with the
languages with which one is most familiar than to learn a language which
is a slight but significant variant.

But it's true that different languages have distinct idiolects and allow
one to approach similar problems in quite different ways; and that until
you become sufficiently familiar with a language to think in its idioms
you are not a master of it.

Example: one of the things one uses all the time in LISP is MAPCAR and
friends; they're extremely powerful. I've now been writing in Java pretty
much full time for ten years, and I realise I don't actually know how to
do the equivalent of MAPCAR in Java. Not that I've ever felt the need; it
simply is not a Java idiom.
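
For comparison, here is a rough sketch of the nearest Java equivalent of
MAPCAR, using the streams API of Java 8+ (which did not exist when the
above was written); the Lisp original would be something like
(mapcar (lambda (x) (* x x)) '(1 2 3 4)):

import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

// MAPCAR applies a function to each element of a list and returns the
// list of results; map() over a stream does the same job here.
List<Integer> squares = Arrays.asList(1, 2, 3, 4).stream()
        .map(x -> x * x)                 // square each element
        .collect(Collectors.toList());   // [1, 4, 9, 16]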
 

Daniel Dyer

That might be true, but it is a good source to refer to so we don't have
to answer the same question for the nth time. The comp.lang.c FAQ, for
example, is extremely valuable and well written.

Does comp.lang.java.{programmer,help} have a FAQ of its own?
(I know there is a comp.lang.java FAQ, but it has not been updated
since 1997, so it's a bit old.)

I'm pretty sure that there is no currently maintained FAQ for this group.
I know of the mini FAQ that's posted in this group every week or so,

The mini FAQ is more about what the groups are for than answering
particular technical questions.
Do any of the FAQs listed there cover many of the questions that come
up in this group? Some of the questions I can think of are:
- which is the best language
- what is best for developing app X
- what do you think is the future in java
- compilation/classpath problems
- does java have pointers / what's a reference compared to a pointer
and other basic language questions.

How do I find out the size of an object?
Can I create a native executable with Java?
What do these warnings about unchecked conversions mean?

And many, many more.
I know that the FAQs listed in the mini FAQ are very good, but I feel
that there is some stuff not covered by them. Am I mistaken?

tom

Not at all, but I think that an FAQ for this group would be a significant
undertaking... but that doesn't mean it wouldn't be worthwhile...

The JINX wiki looked quite promising as a kind of collaborative FAQ but it
seems to have been suffering problems for a while now. I don't know if
the problems are terminal (Chris?).

Dan.
 

Martin Gregorie

Simon Brooke wrote:
..../...
What you mean is that you've only really been exposed to languages which
trace their syntax and at least some of their core concepts back to Algol.
Try

APL http://en.wikipedia.org/wiki/APL_(programming_language)
Forth http://en.wikipedia.org/wiki/Forth
LISP http://en.wikipedia.org/wiki/Lisp_(programming_language)
(or Scheme, which is similar)
Prolog http://en.wikipedia.org/wiki/Prolog
Python http://en.wikipedia.org/wiki/Python_(programming_language)
SmallTalk http://en.wikipedia.org/wiki/Smalltalk

At least one of these will be a total revelation to you, and will enable
you to do things you wouldn't otherwise think possible; at least one of
them you will find completely impossible to learn. Which is which depends
on you. Apart from Python, none of those listed are currently commercially
important, but all are powerful tools and worth learning.

Don't forget that there is probably more existing code in:

COBOL - http://en.wikipedia.org/wiki/COBOL
FORTRAN - http://en.wikipedia.org/wiki/FORTRAN
C - http://en.wikipedia.org/wiki/C_(programming_language)

than everything else combined, so take a glance at them too. COBOL is a
big, verbose language, but it and Fortran are possibly the easiest of
the lot to become productive in: both were alive and in use well before
Computer Science was even a gleam in some professor's eye.

In contrast, C must be one of the smallest useful languages and is worth
knowing because so many modern languages have been derived from it. It's
probably the most common language for writing operating systems, system
utilities and compilers.

Last but not least, it's worth knowing that monsters like PL/1 exist:

http://en.wikipedia.org/wiki/PL/1

This was an attempt at IBM to create a programming language to end all
programming languages, apparently by combining COBOL, Fortran and Algol
60 - some would say by combining the worst features of all three.

Algol 60 (http://en.wikipedia.org/wiki/Algol_60) isn't much used these
days but deserves respect. It was the first block structured language
and also the first to allow arrays to be dynamically sized at run time
and one of the few to let array subscripts have a base other than
zero or 1.
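
Java arrays, by contrast, are always zero-based; an Algol-style lower
bound has to be emulated by hand. A minimal sketch (the class and method
names below are invented for illustration):

// Hypothetical wrapper emulating an Algol 60-style array whose
// subscripts run from an arbitrary lower bound up to an upper bound.
final class BasedArray {
    private final double[] data;
    private final int lowerBound;

    BasedArray(int lowerBound, int upperBound) {
        this.lowerBound = lowerBound;
        // The size can be chosen at run time, as in Algol 60, but is
        // fixed once the array has been created.
        this.data = new double[upperBound - lowerBound + 1];
    }

    double get(int index)           { return data[index - lowerBound]; }
    void   set(int index, double v) { data[index - lowerBound] = v; }
}

// Usage: an array indexed from -5 to 5
// BasedArray a = new BasedArray(-5, 5);
// a.set(-5, 1.0);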
 

John W. Kennedy

Martin said:
Last but not least, it's worth knowing that monsters like PL/1 exist:

http://en.wikipedia.org/wiki/PL/1

This was an attempt at IBM to create a programming language to end all
programming languages, apparently by combining COBOL, Fortran and Algol
60 - some would say by combining the worst features of all three.

Rubbish. There is much that is wrong with PL/I, given 40 years'
hindsight, but, apart from the absence of things that had not yet been
invented, such as OO, it is almost entirely due to two other causes:

1) A general philosophy that anything that /could/ be done, /should/
be done, such as allowing arithmetic on character strings provided that
they convert to numeric values.

2) Unwise implementations of features new to HLLs, such as all
pointers being anonymous, exception handlers that have to be activated
at run time (due to a false analogy with set-interrupt-handler APIs in
assembler), and the use of exception handlers for non-exceptional
conditions, such as end-of-file.
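
For Java readers, that last point has a faint echo in java.io:
DataInputStream, for instance, reports end of stream by throwing
EOFException rather than returning a sentinel value. A small sketch
(modern try-with-resources syntax; the file name is made up):

import java.io.DataInputStream;
import java.io.EOFException;
import java.io.FileInputStream;
import java.io.IOException;

public class ReadInts {
    public static void main(String[] args) throws IOException {
        try (DataInputStream in =
                new DataInputStream(new FileInputStream("values.bin"))) {
            while (true) {
                // readInt() throws EOFException once the data runs out
                System.out.println(in.readInt());
            }
        } catch (EOFException endOfFile) {
            // Not an error here: end of file simply ends the loop.
        }
    }
}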

But with all its faults, PL/I was the best vehicle for adult programming
on mainframes for decades.
 

John W. Kennedy

Simon said:
OOoooooh...

What you mean is that you've only really been exposed to languages which
trace their syntax and at least some of their core concepts back to Algol.
Try

APL http://en.wikipedia.org/wiki/APL_(programming_language)
Forth http://en.wikipedia.org/wiki/Forth
LISP http://en.wikipedia.org/wiki/Lisp_(programming_language)
(or Scheme, which is similar)
Prolog http://en.wikipedia.org/wiki/Prolog
Python http://en.wikipedia.org/wiki/Python_(programming_language)
SmallTalk http://en.wikipedia.org/wiki/Smalltalk

At least one of these will be a total revelation to you, and will enable
you to do things you wouldn't otherwise think possible; at least one of
them you will find completely impossible to learn. Which is which depends
on you. Apart from Python, none of those listed are currently commercially
important, but all are powerful tools and worth learning.

Note that there are implementations of all the languages listed (except
APL) which will compile for the Java Virtual Machine; see
http://www.robert-tolksdorf.de/vmlanguages.html
http://jvm-languages.com/

Also, even though it is a member of the great ALGOL family, there are
many things in Ada that can open one's eyes. I recommend the Barnes book.
 

Stefan Ram

John W. Kennedy said:
Rubbish. There is much that is wrong with PL/I, given 40 years'
hindsight, but, apart from the absence of things that had not yet been
invented, such as OO

1962 Simula (1964: first implementation)
1966 PL/I (first manual)
 

John W. Kennedy

Stefan said:
1962 Simula (1964: first implementation)
1966 PL/I (first manual)

1964 PL/I (announcement, as "NPL")
1965 I personally read a published NPL manual in July
1967 Simula 67 (first version with classes)
 

Stefan Ram

John W. Kennedy said:
1964 PL/I (announcement, as "NPL")

There must have been an "xPL" naming scheme at that time,
because there also was "CPL", which was influenced by Algol 60
and Christopher Strachey.

http://en.wikipedia.org/wiki/Combined_Programming_Language

Its design goal was similar to PL/I as a "large" language for
every purpose.

CPL was a predecessor of BCPL, then B, then C. Since C++ is
based on C, and the Java syntax was created to be familiar to
C and C++ programmers, Java is remotely and partially based on
CPL.

Strachey is sometimes quoted as if he had coined the term
"polymorphism" in a CPL-related summer school lecture in 1967.
But I have now read these lecture notes and found that he only
coined the distinction between parametric and ad-hoc
polymorphism, but takes the basic notion of "polymorphism" for
granted.

Sometimes, the sentence »Polymorphism is the ability of a
function to handle objects of many types« is attributed to
Strachey or this lecture, but this is not contained in the
summer school lecture notes.

So, I am still looking for a CPL manual predating this
lecture, because it might contain an early definition of
"polymorphism". I am still searching for the roots of this
notion.

"The main features of the CPL" from Volume 6, Issue 2, of "The
Computer Journal" might help to clarify that, because
"polymorphism" might have been used therein to describe some
aspect of CPL. But, they seem to have published several issues
online

http://comjnl.oxfordjournals.org/archive/1963.dtl

but not that very issue in question. Actually, it has been
published

http://scholar.google.com/scholar?q...al/hdb/Volume_06/Issue_02/060134.sgm.abs.html

, but now has disappeared from the web.

Well, many things happened in 1967: Strachey gave this
lecture and Kay coined "object-oriented". And then?

»Computer science has been in a dark age since the 1970s.«
(Lupo LeBoucher)

I recently heard someone boast that they have a
natural-language system that can understand references by
words like "it". But this was already achieved by SHRDLU in
1968-1970.

http://hci.stanford.edu/~winograd/shrdlu/

So, where is the progress?

Oh yes: Java now does have a »printf«, so that one can
fill output fields more easily.

The March of Progress

1980: C
printf("%10.2f", x);

1988: C++
cout << setw(10) << setprecision(2) << showpoint << x;

1996: Java
java.text.NumberFormat formatter =
    java.text.NumberFormat.getNumberInstance();
formatter.setMinimumFractionDigits(2);
formatter.setMaximumFractionDigits(2);
String s = formatter.format(x);
for (int i = s.length(); i < 10; i++)
    System.out.print(' ');
System.out.print(s);

2004: Java
System.out.printf("%10.2f", x);

http://horstmann.com/

And this does not even use

import static java.lang.System.out;

which could make the last line even shorter.
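
That is, with the public static field java.lang.System.out imported, the
2004 line would presumably shrink to

out.printf("%10.2f", x);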
 

Simon Brooke

Stefan Ram said:
There must have been an "xPL" naming scheme at that time,
because there also was "CPL", which was influenced by Algol 60
and Christopher Strachey.

http://en.wikipedia.org/wiki/Combined_Programming_Language

Its design goal was similar to PL/I as a "large" language for
every purpose.

I doubt there was any serious influence either way between NPL and CPL.
Although the projects were similar in intent, the distance between Britain
and the United States, in communication terms, was much greater at the
time. Further, there was a considerable distance between the academic and
commercial worlds, and CPL was a British academic project whereas PL/I was
a US commercial one.
CPL was a predecessor of BCPL,

Ish. As I understand it, CPL never ran. BCPL was a cut-down version of
things done at the Cambridge end of the project in exploring compiler
design. BCPL is a /much/ more primitive language than CPL was envisaged as
being.
then B, then C. Since C++ is
based on C, and the Java syntax was created to be familiar to
C and C++ programmers, Java is remotely and partially based on
CPL.

There's much more in common between BCPL and Java than that; indeed, BCPL
was a very interesting and in many ways very forward looking language. The
BCPL compiler compiled to 'CINT code' (Compact INTerpreted code) -
essentially VM instructions. The 'CINT code interpreter' was the only part
of the system which had to be rewritten to port BCPL to a new hardware
architecture, and was essentially a VM. A BCPL program could assume the
presence of standard libraries analogous to Java's standard classes.

BCPL was the first real attempt at 'compile once, run anywhere'. It
partially worked: the exact same compiled binary would run on a BBC micro
or an Amiga or an ICL mainframe.

B was essentially BCPL with a native-code compiler for a particular
concrete machine architecture. In my opinion that was a retrograde step. C
started life as B with a preprocessor. So BCPL has been hugely
influential.
 

Thomas Hawtin

Stefan said:
The March of Progress

1980: C
printf("%10.2f", x);

1988: C++
cout << setw(10) << setprecision(2) << showpoint << x;

1996: Java
java.text.NumberFormat formatter =
    java.text.NumberFormat.getNumberInstance();
formatter.setMinimumFractionDigits(2);
formatter.setMaximumFractionDigits(2);
String s = formatter.format(x);
for (int i = s.length(); i < 10; i++)
    System.out.print(' ');
System.out.print(s);

2004: Java
System.out.printf("%10.2f", x);

http://horstmann.com/

To be fair, if you didn't want the spaces (and not many people do
today), then the following could have been written for Java 1996:

System.out.println(MessageFormat.format(
    "{0,number,0.00}", new Object[] {
        new Float(x)
    }
));

And Java 2004 could go down to:

out.println(format("{0,number,0.00}", x));

(I'm not showing imports, as the #include and the namespace thingy
aren't shown for C++ 1997 either.)
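
For the record, the static imports that the Java 2004 one-liner above
assumes would presumably be

import static java.lang.System.out;
import static java.text.MessageFormat.format;

(Passing x directly also relies on varargs and autoboxing, both of which
arrived with Java 5.)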

I think it unfortunate that such a cryptic format was chosen for Java
printf.

Tom Hawtin
 

Ask a Question

Want to reply to this thread or ask your own question?

You'll need to choose a username for the site, which only take a couple of moments. After that, you can post your question and our members will help you out.

Ask a Question

Members online

Forum statistics

Threads
473,773
Messages
2,569,594
Members
45,123
Latest member
Layne6498
Top