no more primitive data types in Java (JDK 10+). What do you think?

A

Arne Vajhøj

The term probably refers to unifying the type hierarchy such that the
primitive types are logically subtypes of Object. In other words, remove
the distinction between primitive and reference types.


5 is an object instance, not a type that can be extended. Just like I
can't say class Allegro extends System.out {}...


My guess is the main goal is to allow things like a true List<int>
(where the T data would be `int data') instead of List<Integer>.

Which combined with the generics change also mentioned in the
roadmap would improve performance of collections of simple
types.
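
A minimal sketch of what that could buy (illustrative only: the List<int> form is hypothetical and does not compile in current Java, where every element of a List<Integer> is a separately boxed heap object):

    import java.util.ArrayList;
    import java.util.List;

    public class BoxingDemo {
        public static void main(String[] args) {
            // Today: each element is autoboxed to an Integer heap object.
            List<Integer> boxed = new ArrayList<>();
            for (int i = 0; i < 1_000_000; i++) {
                boxed.add(i);               // int -> Integer
            }

            // Hypothetical specialized generics (not legal Java today):
            // List<int> flat = new ArrayList<>();  // backing store would be int[]

            // The closest unboxed equivalent today is a plain array.
            int[] raw = new int[1_000_000];
            for (int i = 0; i < raw.length; i++) {
                raw[i] = i;
            }
        }
    }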

Arne
 
L

Lew

glen herrmannsfeldt wrote but callously failed to attribute his citations:
Lew wrote:
(snip)
(snip)
And that's relevant because ... ?
Do you think they'll suddenly allow leading digits in class
identifiers for Java code? I think not.

As I remember, all unicode [sic] letters are allowed. There are plenty

As I looked up in the JLS, that's not true. Leading digits are not permitted.

"An identifier is an unlimited-length sequence of Java letters and Java digits, the first of which must be a Java letter."
<http://docs.oracle.com/javase/specs/jls/se7/html/jls-3.html#jls-3.8>

The JLS trumps your memory.
that could be confusing to readers. Maybe there aren't any that
look like roman digits, though. There are many that look like,
but aren't the same character as, roman alphabet letters.

But those characters are not used to represent integers, so are not germane to this conversation.

The question at hand was the potential legitimization of glyphs that represent integers to be used as class names that inherit from other classes. Those glyphs are not currently allowed to be leading characters of identifiers, so unless that changes, rossum's construct will never be legal.
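
A small illustration of the cited rule (the commented-out lines are the illegal cases; the Greek letter is used only as an example of a non-ASCII Java letter):

    public class IdentifierRules {
        int value1 = 1;     // legal: digits may appear after the first character
        int π = 3;          // legal: Greek letters are Java letters

        // int 1value = 1;           // illegal: an identifier may not start with a digit
        // class 0 extends Peano {}  // likewise illegal, per JLS 3.8
    }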
 
S

Stefan Ram

Joshua Cranmer said:
My guess is the main goal is to allow things like a true List<int>

Yeah, yeah:

2000 C# starts out as another Java clone
"Our customers /don't want/ closures"

2010 Java struggles to catch up with C#
 
S

Stefan Ram

"Our customers /don't want/ closures"

Sorry for the lack of context!

I was referring to a quotation about the design of Java:

»Guy Steele wrote:

Actually, the prototype implementation *did* allow non-final
variables to be referenced from within inner classes. There was
an outcry from *users*, complaining that they did not want this!«

http://madbean.com/2003/mb2003-49/
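
For context, the restriction that grew out of that decision is still visible today; a minimal sketch in Java 8+ terms, where the captured local must be final or effectively final:

    public class CaptureDemo {
        Runnable make() {
            int count = 0;
            // count++;            // uncommenting this makes count no longer
                                   // effectively final, and the lambda below
                                   // then fails to compile
            return () -> System.out.println(count);
        }
    }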
 
A

Arne Vajhøj

Sorry for the lack of context!

I was referring to a quotation about the design of Java:

»Guy Steele wrote:

Actually, the prototype implementation *did* allow non-final
variables to be referenced from within inner classes. There was
an outcry from *users*, complaining that they did not want this!«

http://madbean.com/2003/mb2003-49/

The world evolves.

It happens frequently that C# users get confused about this.

But many seem to think that the benefits outweigh the problems.

So ...

Arne
 
D

David Lamb

But it did, by popular acclaim. There is no "real" millennium other than the
day after whenever it was hardest to get New Year's Eve hotel reservations at
Times Square. [...]

You would be correct, except you're not. If I thought the people making
the mistake I'm talking about actually understood the point you're making,
and were just arbitrarily reassigning the term "millennium", you'd have a
point.

But they don't. They are specifically looking at the count of years and
falsely imagine that on Jan 1, 2000, two sets of 1000-year intervals have
passed.

No, they're deciding that when the leading digit changes is more
important than whether a year zero ever existed. Like Lew has been
saying, *language* depends on the way the common people use words, not
on what pedants think words ought to mean. I have often lamented when
people "misuse" my favourite words, but I've become resigned in this
area to masses winning out over the cognoscenti.
 
G

glen herrmannsfeldt

David Lamb said:
But it did, by popular acclaim. There is no "real" millennium
other than the day after whenever it was hardest to get
New Year's Eve hotel reservations at Times Square. [...]
No, they're deciding that when the leading digit changes is more
important than whether a year zero ever existed. Like Lew has
been saying, *language* depends on the way the common people use
words, not on what pedants think words ought to mean. I have
often lamented when people "misuse" my favourite words, but
I've become resigned in this area to masses winning out over
the cognoscenti.

If you think about it enough, you wonder why celebrate anything
related to special numbers. Not only that, these depend specifically
on the decimal representation. Shouldn't changes to the leading
digit in other bases also be important?

We like to celebrate birthdays, but again give special consideration
to those when the leading digit changes, though in that case
start counting at zero.

(As I understand it, in the Chinese system you are born at age one, and
add one each Chinese (lunar) new year.)

Now, when Kennedy promised to land on the moon before the decade
was out, did he mean 1969 or 1970? It seems that NASA believed
he meant 1969. (The July launch gave them some margin for later
tries if that one didn't make it.)

-- glen
 
G

glen herrmannsfeldt

(snip, I wrote)
As I remember, all unicode [sic] letters are allowed.
There are plenty
As I looked up in the JLS, that's not true. Leading digits are
not permitted.

What isn't true? I wrote letters, you wrote digits. Unicode has many
of each, but the letters aren't digits and the digits aren't letters.
"An identifier is an unlimited-length sequence of Java letters
and Java digits, the first of which must be a Java letter."
<http://docs.oracle.com/javase/specs/jls/se7/html/jls-3.html#jls-3.8>
The JLS trumps your memory.

and there are a lot more than 52 Java letters.
But those characters are not used to represent integers,
so are not germane to this conversation.

True, but it could be confusing. Well, we already have the
confusion between 0 and O, but most are used to that by now.
Now, name a variable \u039f and see how confusing it can be.
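
A concrete example of that confusion (both identifiers compile, and both render as "O" in most fonts):

    public class Confusing {
        public static void main(String[] args) {
            int O = 1;             // Latin capital letter O
            int \u039f = 2;        // Greek capital Omicron, a distinct identifier
            System.out.println(O + \u039f);   // prints 3
        }
    }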
The question at hand was the potential legitimization of
glyphs that represent integers to be used as class names that
inherit from other classes. Those glyphs are not currently
allowed to be leading characters of identifiers, so unless
that changes, rossum's construct will never be legal.

I don't know of a visual representation for all the legal
Java letters, but yes they should be disjoint from the digits
that can be used in numeric constants.

-- glen
 
B

BGB

If many thought the world were flat, would that make them right?

No.

"Ten". "Eleven". "Twelve". "Thir_*teen*_". "Four_*teen*_".
"Fif_*teen*_". ...

I was aware of this, hence why 13 was mentioned in contrast.
"many" need not be the majority though, nor necessarily correct, only "a
sizeable minority".

much like saying "many people use Linux", despite most people using
Windows...


there are similar things prone to vary, such as:
how many is "a couple"? "a few"? or "a handful"? (*1)
is "a dozen" necessarily 12? (rather then 10, 13, or 14)
....


*1: yes, there are people who use these terms and expect a certain
number. it leads to frustration, personally, as I think of most of these
as "something greater than 1", whereas I know of someone who thinks:
"couple"=3, "few"=4, and "handful"=5.
 
B

BGB

As an additional data point: Ruby MRI works like that. Basically
integers (instances of class Fixnum) look like ordinary objects but
under the hood the value is encoded in the reference and there is no
object on the heap. You get a nice consistent model for the language
user but avoid the overhead of GC. Ruby is still not a racing car
compared with other PL - usual trade offs apply. The concept is
described here:
http://en.wikipedia.org/wiki/Tagged_pointer

It could require a signicifant (nice typo, sounds like an animal) change
in the JVM definition though.

AFAIK, in development versions of the JVM this is apparently already
being worked on (I remember seeing it being talked about a few months
back on one of the JVM development mailing lists). I am not really sure
about the implementation specifics though.
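
For readers unfamiliar with the tagged-pointer trick mentioned above, a minimal sketch of the encoding (illustrative only, not how HotSpot or MRI actually lays out its words): the low bit of a machine word distinguishes an immediate integer from a heap reference.

    public class TaggedWord {
        static final long TAG_FIXNUM = 1L;

        static long makeFixnum(long value)  { return (value << 1) | TAG_FIXNUM; }
        static boolean isFixnum(long word)  { return (word & TAG_FIXNUM) != 0; }
        static long fixnumValue(long word)  { return word >> 1; }  // arithmetic shift keeps the sign

        public static void main(String[] args) {
            long w = makeFixnum(-42);
            System.out.println(isFixnum(w) + " " + fixnumValue(w));  // true -42
        }
    }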


my own VM also has fixnums, although they don't use tagged pointers.


the VM uses inaccessible address-ranges as types.

on 32-bit x86 targets, this generally means the space between 0xC0000000
and 0xFFFFFFFF. on x86-64, a much bigger chunk of address space is used
(currently a roughly 56-bit wide region, but on current HW this could be
pushed actually to about 60 or 62 bits, given the actual accessible part
of the space is relatively tiny).

as-is, this currently means 28 bit fixnums on x86, and 48 bit fixnums on
x86-64, rather than the 30 and 62 bits possible via tagged pointers.

an advantage, however, is that this does not interfere with my ability
to make use of unaligned pointers: I wanted a system where I could point
a character pointer anywhere in a string, and still have the
type-checking able to figure out that it was a string, and more so, also
be able to tell me the address of the start of the string and the
relative offset therein.

it all works fairly well, although type-checking is potentially a little
more costly, given that most such operations need to identify the base
of the heap-object in question. luckily, these lookups are
"approximately" constant time (it is not really constant, but roughly
ranges from O(1) to O(log2 n) depending on various factors).
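
A rough sketch of the range-check idea as described (purely illustrative, not BGB's actual VM; the reserved range and the 28-bit width are taken from the description above):

    public class RangeTypedFixnum {
        // Hypothetical reserved, never-mapped range on a 32-bit target.
        static final long FIXNUM_BASE = 0xC000_0000L;
        static final long FIXNUM_SIZE = 1L << 28;       // room for 28-bit fixnums

        static long makeFixnum(int value) {             // value must fit in 28 bits
            return FIXNUM_BASE + (value & (FIXNUM_SIZE - 1));
        }

        static boolean isFixnum(long word) {
            return word >= FIXNUM_BASE && word < FIXNUM_BASE + FIXNUM_SIZE;
        }

        static int fixnumValue(long word) {
            // sign-extend the low 28 bits
            return (int) ((word - FIXNUM_BASE) << 4) >> 4;
        }

        public static void main(String[] args) {
            long w = makeFixnum(-5);
            System.out.println(isFixnum(w) + " " + fixnumValue(w));  // true -5
        }
    }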


or such...
 
B

Bernd Nawothnig

If you complicate things, the compiler then has to work to
decomplicate (optimise). Why not just keep it simple?

My proposal was quite the contrary: simplification of things, i.e.
removal of unnecessary data types by unifying them.

Keep in mind: the compiler is not the programmer!
One application of keeping it simple would be to use primitives
where possible -- since they are simpler than objects -- and only use
objects where they are needed.

See above: don't mix up the compiler, the machine, and implementation
details with the programmer. Things should be simple for the
*programmer*, not necessarily for the compiler or the machine, even if
that may be preferable. But preferable is not the same as necessary ...




Bernd
 
L

Lew

I was aware of this, hence why 13 was mentioned in contrast.
"many" need not be the majority though, nor necessarily correct, only "a
sizeable minority".

How sizeable is this sadly mistaken minority?

If they all thought the world were flat, would that make them correct?
much like saying "many people use Linux", despite most people using Windows...

Nothing like that at all. We're talking about what "teens" actually means, not
how many people use it wrongly. This is not a subjective matter.
there are similar things prone to vary, such as:
how many is "a couple"? "a few"? or "a handful"? (*1)
is "a dozen" necessarily 12? (rather then 10, 13, or 14)

Nothing remotely similar. The "teens" definition is precise, the "few"
definition is not.

A dozen is necessarily 12.

Duh.

You're all over the map on this one.
...


*1: yes, there are people who use these terms and expect a certain number. it
leads to frustration, personally, as I think of most of these as "something
greater than 1", whereas I know of someone who thinks: "couple"=3, "few"=4,
and "handful"=5.

Imprecise words like "few" or "couple" in that sense (most senses of "couple"
are "two", but there is one sense of "near two") are deliberately vague and
subject to interpretation. Precise words, like "teens"

"teens   [teenz] Show IPA
plural noun
the numbers 13 through 19, especially in a progression, as the 13th through
the 19th years of a lifetime or of a given or implied century."
<http://dictionary.reference.com/browse/teens>

or "dozen"
<http://dictionary.reference.com/browse/dozen>

are not so subject to interpretation.
 
L

Lew

glen said:
Now, when Kennedy promised to land on the moon before the decade
was out, did he mean 1969 or 1970? It seems that NASA believed
he meant 1969. (The July launch gave them some margin for later
tries if that one didn't make it.)

Well, of course. The decades question was already cited upthread as
non-controversial. No one for a millisecond back then thought he meant after
January 1, 1970, I assure you.

It certainly would be silly to regard nineteen-*seven*ty as part of the
*six*ties, wouldn't it?

What possible hyperpedantic and stupid "reasoning" could be used to claim
otherwise?
 
L

Lew

glen said:
Lew wrote:

(snip, I wrote)
As I remember, all unicode [sic] letters are allowed.
There are plenty
As I looked up in the JLS, that's not true. Leading digits are
not permitted.

What isn't true? I wrote letters, you wrote digits. Unicode has many
of each, but the letters aren't digits and the digits aren't letters.

It isn't true that a construct such as

class 0 extends Peano

could be in conflict with numbers as objects, as you were trying to claim.
and there are a lot more than 52 Java letters.



True, but it could be confusing. Well, we already have the

Maybe, slightly, but that was irrelevant to the point in the conversation,
which was a discussion of whether numbers as objects implied legitimacy for
having a numerically-named type derive from another. Your statement about
non-digit Unicode characters is a red herring in that context.
confusion between 0 and O, but most are used to that by now.
Now, name a variable \u039f and see how confusing it can be.

What does that have to do with whether digit glyphs could be syntactically
valid as identifiers for descendant types?
I don't know of a visual representation for all the legal
Java letters, but yes they should be disjoint from the digits
that can be used in numeric constants.

And are, as cited. This whole thing is an attempt to resolve your tangential
comment as not pertinent to the point that was being made at the time.

Confusing non-digit glyphs have nothing to do with whether digit glyphs can
lead off identifiers. Pretending that you broke the rules is not the same as
actually breaking the rules.
 
B

BGB

Language isn't defined by an objective physical reality, though, but
by usage -- which is why "awful" and "awesome" today have opposite
meanings.

yes, and why people might actually disagree regarding the usage of
various terms.

for objective reality, people can go and measure and test stuff.
for language use, it more amounts to consensus, and languages change
over time:
this can be drastic, as in the differences between English and German;
this can be more subtle, as in the differences between the US California
dialect and Londoner dialects of English;
or it can be things like the specific uses of certain terms.

so, it is all a bit more informal and fluid, and not nearly so much
about people being somehow chained to a dictionary or similar.


even regarding objective reality, there is still some room for disagreement:
competition between conventions and theories;
various ways of interpreting the results of various measurements or
experiments;
....

rarely are things so simply black and white.
 
B

BGB

In C# value types can not be extended.

In Scala value types are a fixed set.

So it seems very likely that int would be final if this
change were implemented.

yes, probably final, and likely more so with the "object" aspects of all
this essentially being faked in the VM as well, because actually
allocating a memory object for every integer would be a bit expensive.

it could also work out that:
Integer iobj=new Integer(5);

doesn't actually create a new heap object in the first place, it only
appears to do so, and behaves as if it had done so.
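
That illusion is partly real in current Java already, at least for small values; a quick demonstration (the cache range is guaranteed only for -128..127, and new Integer(n) is specified to create a distinct object, which is why it has been deprecated in recent JDKs):

    public class IntegerCacheDemo {
        public static void main(String[] args) {
            Integer a = 5, b = 5;
            System.out.println(a == b);    // true: autoboxing reuses a cached object

            Integer c = 500, d = 500;
            System.out.println(c == d);    // false (by default): outside the cache

            Integer e = new Integer(5);    // deprecated; always a fresh object
            System.out.println(e == a);    // false: distinct identity
        }
    }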

this is the great fun of compilers and VMs:
what is going on in the language, and what is going on nearer the actual
HW, need not really be all that similar.
 
B

BGB

The same for Python.

ironically, in my case I went differently:
the root of the tree is not "objects", it is "variants", of which both
the primitive/value types and objects are subtypes.

variants have the funky behavior that anything can be stored in a
variant (no casting needed), but going the other way may throw an
exception (the language does not require explicit up-casts in most
cases, but is much more inclined to give warnings about them).

an "object" type also exists, but is generally considered an alias for
"variant".

"Object" is not the root of the type-system, it is only the root of the
class hierarchy (the class hierarchy and type-system are not regarded as
equivalent in this language).


the ability to use ".toString()" on pretty much everything still exists
though, except it doesn't currently do much useful for some types (many
types which lack a sensible string representation will show up in a form
like "#<typename:address>", which is the default syntax for anything
lacking a known "toString()" method).

These implementation details should be hidden and invisible in
most cases. Let the compiler automatically detect and generate
possible optimisations.

A programming language should be as simple and orthogonal as possible.

actually, I personally think it should be as "useful" and "usable" as
possible, with simplicity and orthogonality coming second.

otherwise, people could be off using very awkward toy languages in the
name of simplicity and orthogonality, and this would hardly be a better
outcome.

so, it is more of a set of trade-offs.
 
L

Lew

BGB said:
yes, and why people might actually disagree regarding the usage of various terms.

for objective reality, people can go and measure and test stuff.
for language use, it more amounts to consensus, and languages change over time:
this can be drastic, as in the differences between English and German;
this can be more subtle, as in the differences between the US California
dialect and Londoner dialects of English;
or it can be things like the specific uses of certain terms.

so, it is all a bit more informal and fluid, and not nearly so much about
people being somehow chained to a dictionary or similar.


even regarding objective reality, there is still some room for disagreement:
competition between conventions and theories;
various ways of interpreting the results of various measurements or experiments;
...

rarely are things so simply black and white.

Fortunately, in today's world, the meanings of "dozen" and "teens" are so
simply black and white. While your statement is correct as a superficial,
poster-worthy generality, in the particulars of these terms it's sadly mistaken.
 
B

BGB

Fortunately, in today's world, the meanings of "dozen" and "teens" are
so simply black and white. While your statement is correct as a
superficial, poster-worthy generality, in the particulars of these terms
it's sadly mistaken.

http://en.wikipedia.org/wiki/Dozen

as noted there, senses of "dozen" meaning 13, 14, or 10 are listed, in
addition to the usual 12.
 
