Alternative languages on the JVM: which one, or is there no alternative?

Silvio

On Tuesday, December 3, 2013 03:07:31 UTC+1, markspace wrote:
The main issue IIRC was that he wanted the lambdas to be immutable. If
you go back and modify objects or fields by closing over them, then you
no longer have an immutable operation. Whether the risk was thread
safety and optimization, or programmer misuse, or something else, I
don't recall offhand, but he was pretty adamant it was a dumb idea.

You can easily break this using the single-element array trick:

// assumes ints is a List<Integer>
int[] sumArray = { 0 };
ints.forEach(i -> sumArray[0] += i); // mutates through the captured array
System.out.println(sumArray[0]);

I wonder what the value is if it's that easily broken ...
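
For comparison, the accumulation the array trick works around can be
written without any shared mutable state at all. A minimal Java 8
sketch (class name is just for illustration), assuming ints is a
List<Integer> as in the snippet above:

import java.util.Arrays;
import java.util.List;

public class SumDemo {
    public static void main(String[] args) {
        List<Integer> ints = Arrays.asList(1, 2, 3, 4);
        // No captured mutable state: the stream produces the sum directly.
        int sum = ints.stream().mapToInt(Integer::intValue).sum();
        System.out.println(sum); // 10
    }
}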

In addition, Java closures can still mutate heap objects, so the point
made against mutation from closures was quite weak. I suspect BG made
it as part of the discussion on lambdas at the time and that it should
not be taken out of that context.
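
To illustrate the heap-object point: the effectively-final rule only
restricts local variables; a captured reference to a heap object can
still be used to mutate that object freely. A minimal sketch, again
with an illustrative class name:

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class HeapMutationDemo {
    public static void main(String[] args) {
        // The reference is effectively final; the list it points to is not.
        List<Integer> results = new ArrayList<>();
        Arrays.asList(1, 2, 3).forEach(i -> results.add(i * i));
        System.out.println(results); // [1, 4, 9]
    }
}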
 
Saxo

On Tuesday, December 3, 2013 09:42:09 UTC+1, Silvio wrote:
On Tuesday, December 3, 2013 03:07:31 UTC+1, markspace wrote:
The main issue IIRC was that he wanted the lambdas to be immutable. If
you go back and modify objects or fields by closing over them, then you
no longer have an immutable operation. Whether the risk was thread
safety and optimization, or programmer misuse, or something else, I
don't recall offhand, but he was pretty adamant it was a dumb idea.
You can easily break this using the single-element array trick:
// assumes ints is a List<Integer>
int[] sumArray = { 0 };
ints.forEach(i -> sumArray[0] += i); // mutates through the captured array
System.out.println(sumArray[0]);

I wonder what the value is if it's that easily broken ...

In addition, Java closures can still mutate heap objects, so the point
made against mutation from closures was quite weak. I suspect BG made
it as part of the discussion on lambdas at the time and that it should
not be taken out of that context.

Yeah, but I'm looking forward to using Java 8 anyway. I think we will start developing with it at my company in 2020 at the earliest ... no kidding. I guess other places might be similar. You have to convince the architects that lambdas are beneficial, while those guys only care about the CTO and would never talk to some developer. Seriously, architects being decoupled from development was the worst mistake of the last 10-15 years in Java development ...

Sorry for the rant. Couldn't control myself for a moment ... ;-).

-- Saxo
 
Saxo

On Tuesday, December 3, 2013 21:55:14 UTC+1, Martin Gregorie wrote:
Decoupling architects from the development process has a much longer
history than Java: I first ran into this sort of mess around 1977 in a
COBOL shop. Those guys were so decoupled that they trashed their copies
of design project documents as soon as they'd handed the project over
to the developers. That way of working didn't work then and doesn't now.

Now add UML to the mix, managers who think that a high-level UML
description of each process is all a developer needs, and eliminate all
feedback to the design team from the devs: the result is most unlikely
to be a working system. UML is OK if used properly, but it's all too
easy for non-technical managers and designers, who are outside the
development loop, to convince themselves that once the high-level UML
has been produced, their job is done. In reality it's barely started.

Ditto.

Thanks for your contribution. I feel much better now, having talked to people who think alike ;-).
 
Sebastian

On 03.12.2013 03:07, markspace wrote:
Have you read any of the discussion on the lambda-dev list? Brian Goetz
was dead-set against "real" closures. He insisted on several occasions
that "real" closures were broken and would not have been designed the
way they were if computer science had known what they know today.

The main issue IIRC was that he wanted the lambdas to be immutable. If
you go back and modify objects or fields by closing over them, then you
no longer have an immutable operation. Whether the risk was thread
safety and optimization, or programmer misuse, or something else, I
don't recall offhand, but he was pretty adamant it was a dumb idea.

Brian Goetz also takes exception to talk of "real closures". Here's
an excerpt from an interview at

http://www.infoq.com/articles/Brian_Goetz_Project_Lambda_from_the_Inside_Interview

<quote>
InfoQ: Lambdas, Closures, there is some debate about what the difference
is. What's your perspective?

Brian: I think getting worked up over whether lambdas as expressed in
Java SE 8 are "real" closures or not is a pretty unconstructive
activity. To me, the use of the term "real" here should be a syntax
error; there are many different languages with closure-like constructs,
with varying degrees of similarity and difference. Declaring that
language X gets to define what a real closure is and any language that
doesn't do it exactly that way isn't real is not helpful.

That said, there were hard decisions to make about whether to support
features like mutable local capture, nonlocal control flow, exception
transparency, and other features, and having left some of these out
means certain use cases are harder to express naturally; these are
tradeoffs between expressiveness and complexity, and we tried to spend
our complexity budget where it had the most impact for developers. It's
fair to talk about the pros and cons of each of these decisions, but
framing this in terms of "real vs fake closures" is not a constructive
way to have this dialog.
</quote>
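
To make the "mutable local capture" tradeoff concrete, here is a
minimal sketch of what Java SE 8 rejects (class name is just for
illustration); uncommenting the marked line makes javac refuse to
compile:

import java.util.Arrays;
import java.util.List;

public class CaptureDemo {
    public static void main(String[] args) {
        List<Integer> ints = Arrays.asList(1, 2, 3);
        int sum = 0;
        // Rejected if uncommented: "local variables referenced from a
        // lambda expression must be final or effectively final"
        // ints.forEach(i -> sum += i);
        System.out.println(sum);
    }
}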

-- Sebastian
 
Arved Sandstrom

Decoupling architects from the development process has a much longer
history than Java: I first ran into this sort of mess around 1977 in a
COBOL shop. Those guys were so decoupled that they trashed their copies
of design project documents as soon as they'd handed the project over
to the developers. That way of working didn't work then and doesn't now.

Now add UML to the mix, managers who think that a high-level UML
description of each process is all a developer needs, and eliminate all
feedback to the design team from the devs: the result is most unlikely
to be a working system. UML is OK if used properly, but it's all too
easy for non-technical managers and designers, who are outside the
development loop, to convince themselves that once the high-level UML
has been produced, their job is done. In reality it's barely started.

Ditto.
Truth be told, I never heard of software "architects" back in the '70s
or '80s; the term only started showing up in the '90s. Even in the '90s
I don't recall people describing themselves as architects so much as
there were folks who started talking about software architecture.

All of a sudden with the original dot com boom we started seeing
"architects". That's less than 15 years ago. It's been astonishing to me
how you can actually become an architect with just 5 or 10 years under
your belt, but apparently that's readily achievable. Given the fact that
we've got dozens of conferences and dozens of websites where hundreds of
youngsters with 2-5 years software experience can posture as architects,
either I'm dated or my standards are too high.

Martin, I gave up on UML several years ago. That and all other graphical
modeling languages. All of that graphical stuff seriously wastes time -
competent developers understand text better, and God forbid if managers
see UML. Plain English, pseudocode, and sketches on paper work a lot better.

There actually is a role for architects. It's just that 95 percent of
the people who advertise themselves as such couldn't code their way out
of a piss-soaked paper bag, let alone comprehend a system.

AHS
 
Arved Sandstrom

On 03.12.2013 03:07, markspace wrote:

Brian Goetz also takes exception to talk of "real closures". Here's
an excerpt from an interview at

http://www.infoq.com/articles/Brian_Goetz_Project_Lambda_from_the_Inside_Interview


<quote>
InfoQ: Lambdas, Closures, there is some debate about what the difference
is. What's your perspective?

Brian: I think getting worked up over whether lambdas as expressed in
Java SE 8 are "real" closures or not is a pretty unconstructive
activity. To me, the use of the term "real" here should be a syntax
error; there are many different languages with closure-like constructs,
with varying degrees of similarity and difference. Declaring that
language X gets to define what a real closure is and any language that
doesn't do it exactly that way isn't real is not helpful.

That said, there were hard decisions to make about whether to support
features like mutable local capture, nonlocal control flow, exception
transparency, and other features, and having left some of these out
means certain use cases are harder to express naturally; these are
tradeoffs between expressiveness and complexity, and we tried to spend
our complexity budget where it had the most impact for developers. It's
fair to talk about the pros and cons of each of these decisions, but
framing this in terms of "real vs fake closures" is not a constructive
way to have this dialog.
</quote>

-- Sebastian
I get Goetz's argument, a bit. But there is a fairly well-established
notion of closure - I don't think it's up to Goetz to suggest that
there is ambiguity about the definition. I don't like the terminology
"closure-like".

I respect Brian Goetz. I've read and relied upon much of his literature.
Having said that, it's unhelpful and not healthy for any single person
to control so much of the conversation about a language or platform.
There are only so many people who can directly influence Java
development ... a dozen or fewer ... but millions use it, and
presumably thousands are just as proficient with Java as Goetz or Bloch
- except you never hear from them.

I think Goetz could have just summarized by explaining what a real
closure is, and explaining whether or not Java 8 has one. Maybe two or
three sentences.

AHS
 
Robert Klemme

Truth be told, I never heard of software "architects" back in the '70s
or '80s; the term only started showing up in the '90s. Even in the '90s
I don't recall people describing themselves as architects so much as
there were folks who started talking about software architecture.
ACK.

All of a sudden with the original dot com boom we started seeing
"architects". That's less than 15 years ago. It's been astonishing to me
how you can actually become an architect with just 5 or 10 years under
your belt, but apparently that's readily achievable. Given the fact that
we've got dozens of conferences and dozens of websites where hundreds of
youngsters with 2-5 years software experience can posture as architects,
either I'm dated or my standards are too high.

Yes, indeed. Another thing I find quite irritating is the verve with
which people advertise the next new language / framework / project
management technique etc. at these conferences. There are a lot of good
ideas around, but nobody will master any of them if they jump on the
next bandwagon at the frequency they come round the corner.
Martin, I gave up on UML several years ago. That and all other graphical
modeling languages. All of that graphical stuff seriously wastes time -
competent developers understand text better, and God forbid if managers
see UML. Plain English, pseudocode, and sketches on paper work a lot
better.

UML works great on paper. I do agree with you when it comes to
graphical modeling. That is a waste of time IMHO. But even for paper
and whiteboard it's good to have a common graphical syntax - even if
you just use a small portion of UML for that.
There actually is a role for architects. It's just that 95 percent of
the people who advertise themselves as such couldn't code their way out
of a piss-soaked paper bag, let alone comprehend a system.

I can't comment on the 95% but I have certainly seen this phenomenon.

Kind regards

robert
 
Arved Sandstrom

Yes, indeed. Another thing I find quite irritating is the verve with
which people advertise the next new language / framework / project
management technique etc. at these conferences. There are a lot of good
ideas around, but nobody will master any of them if they jump on the
next bandwagon at the frequency they come round the corner.

I have fairly typical online subscriptions and professional reading
habits: DDJ, CodeProject, InfoQ, ACM TechNews, ODN, The ServerSide,
LinkedIn groups, to name a few that are reasonably general. I've noted
the same thing - how the hell does anyone keep track of all this stuff?
Agile was supposed to supplant waterfall and spiral and all that, but
now apparently agile often fails at scale, so lean is the way to go.
Build system A for language X is obviously obsolete because it's already
6 months old, so someone had to invent build system B. There may be ten
thousand *.js libraries out there now, and God only knows what they all do.

I've evidently really missed the boat by not being up to speed on
asynchronous event-driven functional reactive programming. I've also
noticed that, after the few years in which I generally ignored NoSQL,
there's now a backlash talking up relational again - good to know;
RDBMSs worked just fine for me all along.

There's plenty of innovation and disruption alright. So much so that
we're no further ahead in solving core problems than, say, 20 or 30
years ago.
UML works great on paper. I do agree with you when it comes to
graphical modeling. That is a waste of time IMHO. But even for paper
and whiteboard it's good to have a common graphical syntax - even if
you just use a small portion of UML for that.

Ranting aside, I don't completely discount the use of UML or another
graphical language. A lot of the modeling graphics I've seen (UML and
otherwise) have made me question the wisdom of the adage "a picture is
worth a thousand words", but there are certainly important uses for these
drawings.

The core problem with UML or other modeling languages is not the
constructs they make available, it's the fact that many people feel the
need to express every detail of their design using UML etc. I'm not
interested in seeing a class diagram that shows every private field for
every class, nor sequence diagrams for mundane interactions. The
graphics should express key information, not all information.
I can't comment on the 95% but I have certainly seen this phenomenon.

Kind regards

robert
Ahh, I pulled 95% out of my hat. Based on empirical evidence (personal
knowledge of hundreds of IT types over the decades), I might not be so
harsh, but I'd estimate that well over three-quarters of self-titled
software architects simply do not have the seniority or the experience
to possibly be one [1], and of those people who do ostensibly have the
seniority and the experience (on paper), a large majority prove not to
be adequate.

AHS

1. Not always a person's fault. One's employer frequently indulges in
title inflation. I'm familiar with quite a few local software
consultancies where practically everyone is a "senior consultant" - no
matter that you're just a junior coder for hire who's got 2 years
experience...
 
Robert Klemme

I have fairly typical online subscriptions and professional reading
habits: DDJ, CodeProject, InfoQ, ACM TechNews, ODN, The ServerSide,
LinkedIn groups, to name a few that are reasonably general. I've noted
the same thing - how the hell does anyone keep track of all this stuff?

Hire someone to do the reading for you... ;-)
Agile was supposed to supplant waterfall and spiral and all that, but
now apparently agile often fails at scale, so lean is the way to go.
Build system A for language X is obviously obsolete because it's already
6 months old, so someone had to invent build system B. There may be ten
thousand *.js libraries out there now, and God only knows what they all do.
:)

I've evidently really missed the boat by not being up to speed on
asynchronous event-driven functional reactive programming. I've also
noticed that, after the few years in which I generally ignored NoSQL,
there's now a backlash talking up relational again - good to know;
RDBMSs worked just fine for me all along.

I, too, am convinced that many underestimate the value of mature
RDBMSs. I'd love to know how many people jump through hoops to retrofit
transactional behavior onto a NoSQL DB because it looked cool when the
project started out. "Concurrency we'll do later." Well...
There's plenty of innovation and disruption alright. So much so that
we're no further ahead in solving core problems than, say, 20 or 30
years ago.

It's disturbing to see how often the same problem is solved over and
over again. Just look at the number of text editors...
The core problem with UML or other modeling languages is not the
constructs they make available, it's the fact that many people feel the
need to express every detail of their design using UML etc. I'm not
interested in seeing a class diagram that shows every private field for
every class, nor sequence diagrams for mundane interactions. The
graphics should express key information, not all information.

Exactly! Well put.
1. Not always a person's fault. One's employer frequently indulges in
title inflation. I'm familiar with quite a few local software
consultancies where practically everyone is a "senior consultant" - no
matter that you're just a junior coder for hire who's got 2 years
experience...

You get to charge more for a senior... In part it's also a cultural
thing. A few weeks back I saw a request for a senior which required two
years of experience - and it was not in the western hemisphere.

Cheers

robert
 
Andreas Leitgeb

A late response...

Joshua Cranmer said:
RTTI doesn't enter into the picture. The problems it solves are
downcasts and reflection. The problem is doing an upcast between two
classes, neither of which are the most derived type of a class. Even in
C++, the most derived type is easy to find (it's the vtable pointer).

I knew that RTTI isn't itself necessary, but I thought that as Java has
RTTI already, the need for analysis of non-trivial downcasts wouldn't be
a show stopper.
This approach means that the following statement holds:
D d = new D();
(A)(B)d != (A)(C)d;

That sure looks surprising :-}

Maybe some restrictions on MI could avoid these cases, while still
allowing other parts of MI. Obviously disallowing diamonds altogether
won't cut it, as "Object" would always be there as an "A". Eventually,
non-primary "is-a" relations would turn into "behaves like a", which
could boil down to treating the non-primary superclasses like interfaces,
except that there would be a way to adopt methods to some degree...
I think I once read about some approach of that kind that was already
more thought out than my utterings here.
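
For contrast, a minimal Java sketch of that diamond, using the
hypothetical names A, B, C and D from the quoted statement: Java
interfaces carry no state, a cast never adjusts the reference, and so
both paths through the diamond compare equal:

public class DiamondDemo {
    interface A {}
    interface B extends A {}
    interface C extends A {}
    static class D implements B, C {}

    public static void main(String[] args) {
        D d = new D();
        A viaB = (A) (B) d;   // upcast via B
        A viaC = (A) (C) d;   // upcast via C
        // A Java cast never changes the reference itself, so this
        // prints true; with C++ multiple inheritance the two paths
        // could yield different subobject addresses.
        System.out.println(viaB == viaC);
    }
}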
 
