No, you wouldn't; the behavior you described is completely different
from, and incompatible with, what zaur wrote.
He's saying that instead of the integer value 3 itself being the
object, he expected Python's object model to behave as though the
entity m is the object, and that that object exists to contain an
integer value.
What is "the entity m"?
Is it the name m?
Or is it the literal "3" (without quotes)?
Or an object holding the value three?
Or something else?
In that case, m is always m,
but it has whatever integer value
it is told to hold at any point in time. The self-referential addition
would access the value of m, add the operand, and store the result back
in the same object as the object's value.
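For what it's worth, that hoped-for "m is a mutable cell" model can be emulated in Python by using a one-element list as the cell. This is only a sketch of the behaviour being described, not how Python's ints actually work:

```python
# Emulating the "m is a mutable container" model with a one-element list.
# This is a sketch of the expected behaviour, not Python's int semantics.
cell = [3]           # the container object stands in for "the entity m"
other = cell         # a second reference to the very same container
cell[0] += 1         # mutate the value held inside the container
print(cell[0])       # -> 4
print(other[0])      # -> 4: the change is visible through every reference
```

In this model the container's identity never changes; only the value it holds does, which is exactly the in-place update being described above.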
Ah wait, I think I get it... is m a memory location? So when you say:
m = 3
the memory location that m represents is set to the value 3, and when you
say:
m += 1
the memory location that m represents is set to the value 4?
That would be how Pascal and C (and presumably other languages) work, but
not Python or Ruby or even VB (so I'm told) and similar languages. Java
has a hybrid model, where a few data types (such as ints) are handled
like C, and everything else is handled like Python. Consistency was never
Java's strong suit.
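The difference is easy to observe in Python with id(): augmented assignment on an int rebinds the name to a different object rather than updating one memory location. (The actual id() numbers vary from run to run; only the comparison matters.)

```python
m = 3
before = id(m)          # identity of the object m currently names
m += 1                  # rebinds m to a different int object, 4
after = id(m)
print(m)                # -> 4
print(before == after)  # -> False: m names a new object; 3 was not mutated
```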
This is not the way Python
works, but he's saying this is the intuitive behavior.
It isn't intuitive if you've never been exposed to Pascal- or C-like
languages. If your only programming language was Haskell, the very idea
of mutating values would be alien. So I guess when you say "the intuitive
behaviour", what you actually mean is "familiar".
I happen to
agree, and argued at length with you and others about that very thing
months ago, when some other third party posted with that exact same
confusion.
By contrast, your description maintains the concept that Python uses,
of a numerical value as an object, and completely misses the point. I did find
the description you gave to be highly enlightening though... It
highlighted perfectly, I think, exactly why it is that Python's behavior
regarding numerical values as objects is *not* intuitive. Of course,
intuition is highly subjective.
What exactly is it about Python's behaviour regarding numbers that is not
intuitive? That you can't do this?
anum = 2
alist = [anum]
anum += 1
print alist # this doesn't work
[3]
That won't work in any language that I know of -- as far as I am aware,
the above is impossible in just about every common programming language.
(My C and VB are nearly non-existent, so I may be wrong about them.) Here
is Ruby's behaviour:
irb(main):001:0> anum = 2
=> 2
irb(main):002:0> alist = [anum]
=> [2]
irb(main):003:0> anum += 1
=> 3
irb(main):004:0> puts alist
2
=> nil
Just like Python.
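To confirm the claim, here is the equivalent Python session (shown in Python 3 syntax):

```python
anum = 2
alist = [anum]   # the list stores a reference to the int object 2
anum += 1        # rebinds anum to 3; the list still references 2
print(alist)     # -> [2], not [3]
```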
I believe it boils down to this: People expect that objects they create
are mutable.
Why would they expect that? Is there any evidence apart from the
anecdotal complaints of a few people that they expect this? People
complain equally when they use a mutable default value and it mutates, or
when they find they can't use mutable objects as dict keys, so this suggests that
people expect objects should be immutable and are surprised when they
change.
If you're going to argue by analogy with the real world (as you do
further on), I think it's fair to argue that some objects are mutable
(pieces of rubber that expand into a balloon when you blow into them),
and some are immutable unless you expend extraordinary effort (rocks). I
would be gobsmacked if my desk turned pink or changed into an armchair, I
expect it to be essentially unchanging and immutable. But I fully expect
a banana to turn black, then squishy, and finally white and fuzzy if I
leave it long enough.
At least, unless they specify otherwise. It is so in some
other programming languages which people may be likely to be familiar
with (if they are not taking their first foray into the world of
computing by learning Python), and even "real world" objects are
essentially always mutable.
[snip example of a 2002 Buick LeSabre]
Be careful bringing real-world examples into this. People have been
arguing about identity in the real-world for millennia. See, for example,
the paradox of my great-grandfather's axe. My great-grandfather's axe is
still in my family after 80 years, as good as new, although the handle
has been replaced four times and the head twice. But it's still the same
axe. An even older example is the paradox of the Ship of Theseus.
Numbers are fundamentally different from objects. The number 3 is a
symbol of the idea of the existence of three countable objects. It can
not be changed
Doesn't this contradict your claim that people expect to be able to
mutate numbers? That you should be able to do this?
123456
You can't have it both ways -- if people think of objects as mutable, and
think of numbers as not-objects and unchanging, then why oh why would
they find Python's numeric behaviour to be unintuitive?
(though it can be renamed, if you so choose -- just don't
expect most people to know what you're talking about). It is
unintuitive that 3 is an object;
Says you. People have considered numbers to be eternal, unchanging,
immutable entities going back to at least Plato. If people are
comfortable thinking that there are Platonic ideal numbers, why wouldn't
they think of them represented in computers as immutable objects?
What I think is that some people, such as you and Zaur, have *learned*
from C-like languages that numbers are mutable not-objects, and you've
learned it so well that you've forgotten that you ever needed to learn
it. Consequently what you actually mean when you say Python is
unintuitive is that Python is not like some other language (and is like
yet other languages).
it is rather what we use to describe
objects -- the value of the object. It is an abstract concept, and as
such it is not an object at all. You cannot hear 3, taste 3, nor smell
3. You can neither see nor touch 3, though you can certainly see 3
*objects* if they are present, and you can certainly see the symbol '3'
that we use to represent that idea... but you can not see three itself,
because there is no such object.
Human beings are excellent at reifying abstract things into (imaginary)
objects. We talk about letting in the cold (we actually let out the
heat); we talk about souls and life-force as if they are things; we treat
justice and love and mercy and hate as palpable things instead of
emotions and abstract entities. Plato imagined that everything has a
pure, abstract form, an ideal, and people find that intuitive: we expect
that there is an archetypal "chair" that embodies everything that is
chair-like and nothing which is not, if only we could find it.
Numbers are no different. Reifying them into objects comes easy to people
who haven't learned differently.
The only way to see three is to
envision 3 of some object. The number 3 does not have a value; it IS a
value (it is the symbolic representation of the value of three). To say
that 3 is an object that has a value is a bit like saying the length of
a car is an object that itself has a length.
No, the length of a car is an object which *is* a length; it doesn't
*have* a length.
None of this explains why you would expect to be able to mutate the value
three and turn it into four.
I don't think your argument makes sense -- I think your explanation is
tangled up in knots. Everything you say about numbers being unchangeable,
immutable entities supports Python's behaviour to make numbers immutable,
and yet you argue that since people expect numbers to be unchanging, they
therefore expect Python ints to be mutable! This makes no sense.
THAT is why Python's behavior with regard to numerical objects is not
intuitive, and frankly bizarre to me, and I dare say to others who find
it so.
Yes, that's right. BIZARRE.
I think you have confused yourself. If the number three cannot change,
and you have a label "m" associated with three, then if you add one to
it, you *can't* have the three mutate into a four, because numbers cannot
change. Obviously the label has to detach itself from the three and
reattach to the four -- precisely the behaviour Python uses. Plato would surely have
found this commonsensical, and I would say that it's only those who have
become used to Pascal-like languages that don't.
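A short sketch of that "label detaches" behaviour:

```python
m = 3
n = m          # both labels are attached to the same int object
m += 1         # m detaches from 3 and attaches to 4; 3 itself is unchanged
print(m, n)    # -> 4 3
```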