RE: anything like C++ references?

Discussion in 'Python' started by Adam Ruth, Jul 16, 2003.

  1. Adam Ruth

    Adam Ruth Guest

    In <> Michael Chermside
    wrote:
    > Adam Ruth writes:
    >> I assume that you're referring to people 'faking' pointers in Python
    >> (such as wrapping variables in lists, etc.). I'd amend your statement
    >> to say: And the worst thing you can do is to obscure the issue
    >> even more by disguising them as something else, WHEN YOU DON'T
    >> REALLY HAVE TO. In Python, there is no situation where "you really
    >> can't avoid pointers". It's only when you program C or C++ in
    >> Python that you think you can't avoid pointers. There are much
    >> better idioms to achieve the desired results.

    >
    > Look, first of all, let me say that I disagree with Stephen Horne in
    > this discussion. Or, to be more precise, I think that the approach he
    > is using is not one which is useful in describing Python. HOWEVER,
    > that doesn't mean that there's NOTHING to what he is saying, and your
    > claim that there is no situation requiring "pointers" in Python seems
    > wrong to me.
    >
    > Actually (to be quite pedantic) it is technically true. For instance,
    > I could write a program in Python which simulated a Turing machine,
    > and then write the entire program to be executed on THAT. But this
    > is a meaningless definition for "requiring" a feature -- by this
    > definition NO feature is ever needed if the language is Turing
    > equivalent without it... for instance, Python doesn't ever need
    > classes, modules, dictionaries, or functions (I'm pretty sure a
    > Turing machine could be written which didn't use any of these).
    >
    > What's more useful is to say that a feature is not "needed" if there's
    > a straightforward alternate way to handle the situations where it
    > would normally be used. For instance this (using C++ syntax for
    > "references"):
    >
    > def returnsMultipleValues(x, &minusFive, &timesFive):
    >     """Calculates x+5, x-5, and x*5."""
    >     *minusFive = x - 5
    >     *timesFive = x * 5
    >     return x + 5
    >
    > Is unnecessary, since this:
    >
    > def returnsMultipleValues(x):
    >     return x + 5, x - 5, x * 5
    >
    > would work fine. However, there are some places where it IS useful to
    > be able to modify a value within a routine. I have written code like
    > this:
    >
    > i = [0]
    > stuff.append( makeNewComplexObject(i, 'name', 'fred') )
    > stuff.append( makeNewComplexObject(i, 'age', 17) )
    > stuff.append( makeNewComplexObject(i, 'item') )
    > stuff.append( makeNewComplexObject(i, 'item') )
    >
    > where the source for makeNewComplexObject() went like this:
    >
    > def makeNewComplexObject(iBox, objName, *objArgs):
    >     # unbox the current index, then increment the boxed value
    >     i = iBox[0]
    >     iBox[0] += 1
    >
    >     if objName == 'name':
    >         # some simple cases
    >         return NameObject(*objArgs)
    >     elif objName == 'age':
    >         # some complex cases
    >         yrsOld = objArgs[0]
    >         if yrsOld >= 18:
    >             return AdultAge(*objArgs)
    >         else:
    >             return MinorAge(*objArgs)
    >     elif objName == 'item':
    >         # some cases that use i
    >         return IndexedItem(i)
    >
    > Now, clearly, this code was inspired by C code which used an
    > idiom like this:
    >
    > i = 0
    > stuff = newComplexObject(i++, "name", "fred")
    > stuff = newComplexObject(i++, "age", 17)
    > stuff = newComplexObject(i++, "item")
    > stuff = newComplexObject(i++, "item")
    >
    > And certainly it *can* be rewritten in Python without "boxing" the
    > variable i. However, it would also be nice to be ABLE to "box" the
    > variable i. So it's not as if references would be USELESS in Python.
    > I just think it'd be better handled differently (one-item list, or
    > maybe a container class) rather than redefining assignment in Python
    > as Stephen seems to prefer.[1]
    >
    > -- Michael Chermside
    >
    > [1] - Stephen doesn't want to change Python now for historical
    > reasons. But my position is that if I were inventing a NEW
    > language I'd do it Python's way by choice, because I think
    > it's better.
    >


    You are correct in that there are times when pointer semantics would be
    somewhat useful, but they're never necessary. His statement was that
    there are times when "you really can't avoid pointers". That's not
    true. I definitely went a little overboard, and it sounds like I'm
    saying, "not only are pointers not necessary, they're never desirable".
    My tone was a bit more knee-jerk than was prudent.

    Your code example isn't really about pointers, though, as the C++
    version doesn't use pointers. There's also another difference: the
    caller is responsible for incrementing the counter, which is quite
    different from your Python version. There are a number of ways to
    write that without "pointers" that express the intent just as clearly,
    if not more so. The real question is: what does i represent? If it
    represents the index of the object in the list, then this is clearer:


    > stuff.append( makeNewComplexObject(len(stuff), 'name', 'fred') )
    > stuff.append( makeNewComplexObject(len(stuff), 'age', 17) )
    > stuff.append( makeNewComplexObject(len(stuff), 'item') )
    > stuff.append( makeNewComplexObject(len(stuff), 'item') )


    In the worst case (needing the most new code to implement) it would need
    to be written with a very small "counter" class:

    > class Counter(object):
    >     def __init__(self, initial=0):
    >         self.value = initial
    >     def inc(self):
    >         self.value += 1
    >         return self.value


    > i = Counter()
    > stuff.append( makeNewComplexObject(i.inc(), 'name', 'fred') )
    > stuff.append( makeNewComplexObject(i.inc(), 'age', 17) )
    > stuff.append( makeNewComplexObject(i.inc(), 'item') )
    > stuff.append( makeNewComplexObject(i.inc(), 'item') )


    This is just as clear as the C++ version, and clearer than the Python
    version that wraps i in a list.

    This example, though, doesn't really show the difference; it's too
    trivial. All of the versions are clear enough, with the difference
    being academic.
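
    As a runnable sanity check of the counter idea (a sketch only: the
    complex-object factory is stubbed out, and the names here are purely
    illustrative):

```python
class Counter(object):
    """A one-slot mutable counter; inc() bumps the value and returns it."""
    def __init__(self, initial=0):
        self.value = initial
    def inc(self):
        self.value += 1
        return self.value

# Stand-in for makeNewComplexObject: just record the index it was given.
def makeStubObject(i, name, *args):
    return (i, name) + args

i = Counter(-1)  # start at -1 so the first inc() yields 0, like C's i++
stuff = []
stuff.append(makeStubObject(i.inc(), 'name', 'fred'))
stuff.append(makeStubObject(i.inc(), 'age', 17))
print([obj[0] for obj in stuff])  # -> [0, 1]
```

    Note that inc() increments before returning, so starting the counter
    at -1 reproduces the post-increment numbering of the C idiom.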

    I would be interested in seeing a more complex example where something
    would be substantially cleaner with pointers. I have to acknowledge the
    possibility that such examples exist; I don't know everything... yet :)

    Adam Ruth
    Adam Ruth, Jul 16, 2003
    #1

  2. On Wed, 16 Jul 2003 19:06:40 +0000 (UTC), Adam Ruth
    <> wrote:

    >In <> Michael Chermside
    >wrote:


    >> Look, first of all, let me say that I disagree with Stephen Horne in
    >> this discussion. Or, to be more precise, I think that the approach he
    >> is using is not one which is useful in describing Python. HOWEVER,
    >> that doesn't mean that there's NOTHING to what he is saying, and your
    >> claim that there is no situation requiring "pointers" in Python seems
    >> wrong to me.


    Thank you - nice to know I'm not *completely* insane ;-)

    >You are correct in that there are times when pointer semantics would be
    >somewhat useful, but they're never necessary. His statement was that
    >there are times when "you really can't avoid pointers".


    Hmmm - those words were in a context. I never claimed that "you really
    can't avoid pointers" in *current* Python - that's obviously not true.
    But if the copy-on-write stuff were implemented, the need would arise.
    For example...

    >>> class c:
    ...     x = 0
    ...
    >>> k = c()
    >>> def fn(a):
    ...     a.x = 1
    ...
    >>> fn(k)
    >>> k.x


    At the moment, the result is 1. With copy-on-write, object parameters
    would behave exactly the same as integer or other immutable
    parameters. The result would be 0. You'd need pointers or references
    or call-by-reference to do a number of things.
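
    The difference can be sketched in today's Python by copying explicitly
    at the call site; copy.deepcopy stands in here for the proposed
    copy-on-write (this illustrates the proposed semantics, it is not a
    real Python feature):

```python
import copy

class C(object):
    x = 0

def fn(a):
    a.x = 1

k = C()
fn(k)                    # current semantics: fn sees the caller's object
print(k.x)               # -> 1

k2 = C()
fn(copy.deepcopy(k2))    # copying at the call emulates value semantics
print(k2.x)              # -> 0: fn only touched a throwaway copy
```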

    >I definitely went a little overboard, and it sounds like I'm saying,
    >"not only are pointers not necessary, they're never desirable". My tone
    >was a bit more knee jerk than was prudent.


    Then I withdraw certain comments I've made. I really can't complain
    about people going "a little overboard", can I ;-)

    >This is just as clear as the C++ version, and more clear than the Python
    >version that wraps with the list.
    >
    >This example, though, doesn't really show the difference, it's too
    >trivial. All of the versions are clear enough, with the difference
    >being academic.


    I agree with both of these comments.

    >I would be interested in seeing a more complex example where something
    >would be substantially cleaner with pointers. I have to acknowledge the
    >possibility that they exist, I don't know everything... yet :)


    I was told today that both Perl and ML have something equivalent to
    pointers. I don't know either language, though. Given the current
    audience, mentioning Perl may be a mistake - but we could look up the
    rationale for including them in ML.

    That is something to approach with caution, though. I imagine that ML
    is a very different language from Python. I have used Haskell and
    Miranda, which are at least broadly the same paradigm but may or may
    not be very similar languages, and even those I never knew especially
    well.

    The rationales might not be very portable.
    Stephen Horne, Jul 16, 2003
    #2

  3. Adam Ruth

    Adam Ruth Guest

    In <> Stephen Horne wrote:
    > On Wed, 16 Jul 2003 19:06:40 +0000 (UTC), Adam Ruth
    > <> wrote:
    >
    >>In <> Michael Chermside
    >>wrote:

    >
    >>> Look, first of all, let me say that I disagree with Stephen Horne in
    >>> this discussion. Or, to be more precise, I think that the approach
    >>> he is using is not one which is useful in describing Python. HOWEVER,
    >>> that doesn't mean that there's NOTHING to what he is saying, and
    >>> your claim that there is no situation requiring "pointers" in Python
    >>> seems wrong to me.

    >
    > Thank you - nice to know I'm not *completely* insane ;-)
    >
    >>You are correct in that there are times when pointer semantics would
    >>be somewhat useful, but they're never necessary. His statement was
    >>that there are times when "you really can't avoid pointers".

    >
    > Hmmm - those words were in a context. I never claimed that "you really
    > can't avoid pointers" in *current* Python - that's obviously not true.
    > But if the copy-on-write stuff were implemented, the need would arise.
    > For example...


    It sure seemed that way to me:

    <prior discussion>
    >>They're an unhappy necessity, akin to stop-lights. You need them
    >>because roads intersect, but if roads don't intersect, don't use them!

    >
    >Absolutely true. And the worst thing you can do when you really can't
    >avoid pointers is to obscure the issue even more by disguising them as
    >something else.

    </prior discussion>

    We were talking about people who wrap things in lists to simulate
    pointers. It would seem, from your statement, you think there are
    situations where you really can't avoid pointers. If I misread that, I
    apologize.

    >
    >>>> class c:
    > ...     x = 0
    > ...
    >>>> k = c()
    >>>> def fn(a):
    > ...     a.x = 1
    > ...
    >>>> fn(k)
    >>>> k.x

    >
    > At the moment, the result is 1. With copy-on-write, object parameters
    > would behave exactly the same as integer or other immutable
    > parameters. The result would be 0. You'd need pointers or references
    > or call-by-reference to do a number of things.


    Something occurred to me earlier today: Python does use copy-on-write
    semantics for parameter passing; it's just that Python only has one
    data type, a reference. The reference is passed by value to the
    function, just as you describe it should be. Python, however, has a
    further layer of abstraction on top of its one data type, and that's
    objects. I would venture that it's the extra layer of abstraction that
    makes Python work in a more intuitive, proper way.

    This is, however, just my opinion. But it does seem that new
    programmers who learn this abstraction find it natural and simple. It's
    people coming from C, et al., that seem thrown off by it.
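
    That abstraction is easy to demonstrate: rebinding a parameter name
    never affects the caller, while mutating the object the reference
    points at does (a minimal sketch):

```python
def rebind(a):
    a = [99]         # rebinds the local name only; the caller is unaffected

def mutate(a):
    a.append(99)     # mutates the shared object; the caller sees the change

lst = [1, 2]
rebind(lst)
print(lst)   # -> [1, 2]
mutate(lst)
print(lst)   # -> [1, 2, 99]
```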

    >
    >>I definitely went a little overboard, and it sounds like I'm saying,
    >>"not only are pointers not necessary, they're never desirable". My
    >>tone was a bit more knee jerk than was prudent.

    >
    > Then I withdraw certain comments I've made. I really can't complain
    > about people going "a little overboard", can I ;-)


    It's much less overboard than I've been known to go. I must be making
    progress :)

    >
    >>This is just as clear as the C++ version, and more clear than the
    >>Python version that wraps with the list. This example, though,
    >>doesn't really show the difference, it's too trivial. All of the
    >>versions are clear enough, with the difference being academic.

    >
    > I agree with both of these comments.
    >
    >>I would be interested in seeing a more complex example where something
    >>would be substantially cleaner with pointers. I have to acknowledge
    >>the possibility that they exist, I don't know everything... yet :)

    >
    > I was told today that both Perl and ML have something equivalent to
    > pointers. I don't know either language, though. Given the current
    > audience, mentioning Perl may be a mistake - but we could look up the
    > rationale for including them in ML.
    >
    > That is something to do with caution, though. I imagine that ML is a
    > very different language to Python. I have used Haskell and Miranda,
    > which are at least broadly the same paradigm but may or may not be
    > quite a similar languages, but even those I never exactly knew well.
    >
    > The rationales might not be very portable.


    From this discussion, there seem to be two points regarding pointers.

    1) Whether or not pointers are necessary, or even helpful, in Python
    as it now stands. This is debatable.
    2) Whether or not pointers would be necessary if Python used a form of
    parameter passing more like the one you prefer. I now see better that
    this is what you are talking about regarding pointers: that they
    *would* be necessary if copy-on-write were in use. That I can agree
    with; they would be necessary (or at least some of their semantics).
    I still can't see that copy-on-write is better than what we have now,
    but hey, I prefer Pepsi to Coke, so that tells you something!

    As a side note, based on your description of your education and studies
    in an earlier post, I realize now how different our backgrounds seem to
    be. I'm coming from a world where the focus is engineering (Ada is the
    only language purpose-built for engineering, AFAIK). You seem to come
    from a background more focused on science and theory. Perhaps this is
    the cause of our different world views (on this issue). Just a thought,
    though it is interesting we both chose Python.

    Adam Ruth
    Adam Ruth, Jul 16, 2003
    #3
  4. Donn Cave

    Donn Cave Guest

    In article <>,
    Stephen Horne <> wrote:
    ....
    > I was told today that both Perl and ML have something equivalent to
    > pointers. I don't know either language, though. Given the current
    > audience, mentioning Perl may be a mistake - but we could look up the
    > rationale for including them in ML.


    In Objective CAML (if that's the ML they meant, or if this is general
    to other ML implementations, I don't know), there is a "ref" type
    that is essentially a mutable record with one member.

    Variables, if I may use the term loosely, are about the same as in
    Haskell - symbols, really, that are equated with some value and
    thereafter stand for that value. The value's internal state can
    change, if it supports that, but it can't be replaced with a different
    value. The only thing that ref adds to this is a notational convenience.
    You can say

    let a = ref 0 in begin
      ...
      a := !a + 1;
      ...
    end

    instead of the semantically equivalent

    type 'a myref = { mutable contents : 'a }
    let a = { contents = 0 } in begin
      ...
      a.contents <- a.contents + 1;
      ...
    end

    Actually it's rather hard to see a great benefit there. But at
    any rate, this feature isn't fully equivalent to a hypothetical
    pointer in Python - it doesn't point to a location, it is just
    a mutable container.
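
    The closest Python analogue of that ref cell would be a tiny class
    with one mutable field (the name Ref here is purely illustrative, not
    a standard Python type):

```python
class Ref(object):
    """A mutable record with one member, akin to OCaml's 'a ref."""
    def __init__(self, contents):
        self.contents = contents

a = Ref(0)
a.contents = a.contents + 1   # roughly: a := !a + 1
print(a.contents)  # -> 1
```

    As with the OCaml version, this doesn't point to a location; it is
    just a mutable container.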

    Donn Cave,
    Donn Cave, Jul 16, 2003
    #4
  5. On Wed, 16 Jul 2003 22:00:22 +0000 (UTC), Adam Ruth
    <> wrote:

    >In <> Stephen Horne wrote:
    >> On Wed, 16 Jul 2003 19:06:40 +0000 (UTC), Adam Ruth
    >> <> wrote:
    >>
    >>>You are correct in that there are times when pointer semantics would
    >>>be somewhat useful, but they're never necessary. His statement was
    >>>that there are times when "you really can't avoid pointers".

    >>
    >> Hmmm - those words were in a context. I never claimed that "you really
    >> can't avoid pointers" in *current* Python - that's obviously not true.
    >> But if the copy-on-write stuff were implemented, the need would arise.
    >> For example...

    >
    >It sure seemed that way to me:
    >
    ><prior discussion>
    >>>They're an unhappy necessity, akin to stop-lights. You need them
    >>>because roads intersect, but if roads don't intersect, don't use them!

    >>
    >>Absolutely true. And the worst thing you can do when you really can't
    >>avoid pointers is to obscure the issue even more by disguising them as
    >>something else.

    ></prior discussion>


    Right - "the worst thing you can do when you really can't avoid
    pointers" is not the same as "really can't avoid pointers".

    You have taken the condition out of a conditional sentence and quoted
    it as if it were a statement. That's a pretty clear case of quoting
    out of context to misrepresent what someone said, isn't it?

    Now let's restore the rest of what Adam said...

    : I came to Python from Ada, and all I have to say is: Pointer-like
    : semantics are evil. They're worse than goto's, they're deadly,
    : damaging, and should be avoided at all costs.
    :
    : They're an unhappy necessity, akin to stop-lights. You need them
    : because roads intersect, but if roads don't intersect, don't use them!

    Guess what - when Adam said pointers are an 'unhappy necessity' he was
    talking about ADA - NOT PYTHON. In Ada, pointers (or rather access
    types) are definitely not overused (where I worked you needed
    management permission and a damned good reason to use them due to the
    military contract terms), but sometimes they are unavoidable.


    Please be more careful what you snip.

    >The reference is passed by value
    >to the function, just as you describe it should.


    Yes - there has to be pass-by-value at some level since Python is
    implemented in C. But to me this just says that we are seeing the
    side-effect of an implementation detail.

    No need to argue this - you don't need to explain it to me, because I
    understand. I just see things from a different perspective.

    >Python, however, has a
    >further layer of abstraction on top of its one data type, and that's
    >objects. I would venture that it's the extra layer of abstraction that
    >makes Python work in a more intuitive, proper way.
    >
    >This is, however, just my opinion. But it does seem that new
    >programmers who learn this abstraction find it natural and simple. It's
    >people coming from C, et al., that seem thrown off by it.


    No, my background isn't just in C - that is a long way from being the
    first language I used, and I've used Python regularly longer than C++.

    >As a side note, based on your description of your education and studies
    >in an earlier post, I realize now how different our backgrounds seem to
    >be. I'm coming from a world where the focus is engineering (Ada is the
    >only language purpose-built for engineering, AFAIK). You seem to come
    >from a background more focused on science and theory.


    Nope - I just happen to be interested in learning the theory too, and
    particularly in experiencing a broad range of languages to understand
    the ideas behind them.

    How come you didn't notice Ada in my list?

    About the first half of my programming career was working in the
    defence industry as a software engineer. A lot of that was working at
    very low level stuff (the 80c196kc microcontrollers I mentioned) and
    with a lot of messing around with target hardware and in-circuit
    emulators. The rest of my time in defence was spent using Ada.

    I first messed around with writing my own language quite young, with
    an Atari ST. I liked the idea of text adventures and decided to write
    one, but basically got caught up in the engine and the scripting
    stuff. Pretty cool, though, through the rose-tinted glasses of memory.

    Anyway, I stayed interested in compilers and stuff, but the reason I
    got interested in LR and parsing theory has little to do with parsing
    as most see it. A grammar, seen as describing a sequence of tokens,
    can deal with many things - the tokens don't have to be words. They
    might be events, for instance. And the state of the 'parser' may be
    used to select a response strategy in an AI. I wanted it for a game I
    was going to write a few years back which would construct strategies
    from various elements by building them into a kind of grammar, and use
    the resulting state model as a real time AI.

    >Perhaps this is
    >the cause of our different world views (on this issue). Just a thought,
    >though it is interesting we both chose Python.


    Strictly speaking, the reason I chose Python at the time was because
    at work we used Macs with terminal emulators. If you wanted a utility,
    you either wrote it under VMS or you were stuffed, as Mac interpreters
    and compilers were hard to come by.

    I see it as one of the happiest flukes - I just wish I'd found Python
    earlier.
    Stephen Horne, Jul 17, 2003
    #5
  6. Adam Ruth

    Adam Ruth Guest

    In <> Stephen Horne wrote:
    > On Wed, 16 Jul 2003 22:00:22 +0000 (UTC), Adam Ruth
    > <> wrote:
    >
    >>In <> Stephen Horne wrote:
    >>> On Wed, 16 Jul 2003 19:06:40 +0000 (UTC), Adam Ruth
    >>> <> wrote:
    >>>
    >>>>You are correct in that there are times when pointer semantics would
    >>>>be somewhat useful, but they're never necessary. His statement was
    >>>>that there are times when "you really can't avoid pointers".
    >>>
    >>> Hmmm - those words were in a context. I never claimed that "you
    >>> really can't avoid pointers" in *current* Python - that's obviously
    >>> not true. But if the copy-on-write stuff were implemented, the need
    >>> would arise. For example...

    >>
    >>It sure seemed that way to me:
    >>
    >><prior discussion>
    >>>>They're an unhappy necessity, akin to stop-lights. You need them
    >>>>because roads intersect, but if roads don't intersect, don't use
    >>>>them!
    >>>
    >>>Absolutely true. And the worst thing you can do when you really can't
    >>>avoid pointers is to obscure the issue even more by disguising them
    >>>as something else.

    >></prior discussion>

    >
    > Right - "the worst thing you can do when you really can't avoid
    > pointers" is not the same as "really can't avoid pointers".
    >
    > You have taken the condition out of a conditional sentence and quoted
    > it as if it were a statement. That's a pretty clear case of quoting
    > out of context to misrepresent what someone said, isn't it?
    >
    > Now let's restore the rest of what Adam said...
    >
    >: I came to Python from Ada, and all I have to say is: Pointer-like
    >: semantics are evil. They're worse than goto's, they're deadly,
    >: damaging, and should be avoided at all costs.
    >:
    >: They're an unhappy necessity, akin to stop-lights. You need them
    >: because roads intersect, but if roads don't intersect, don't use them!
    >
    > Guess what - when Adam said pointers are an 'unhappy necessity' he was
    > talking about ADA - NOT PYTHON. In Ada, pointers (or rather access
    > types) are definitely not overused (where I worked you needed
    > management permission and a damned good reason to use them due to the
    > military contract terms), but sometimes they are unavoidable.
    >
    >
    > Please be more careful what you snip.


    Actually, no, I wasn't talking about Ada. I was talking in general and
    about Python. In languages with pointers, they are there to overcome a
    weakness (the roads intersecting). But where the roads don't intersect
    (Python), there's no need to use them (or their semantics, as the case
    may be).

    So your response about disguising pointers as something else seemed to
    me to be referring to Python, since the prior message was about
    disguising a pointer as a mutable list in Python.

    Anyway, glad we cleared that up. Sorry about misrepresenting you, I
    really thought you were talking about Python. In retrospect, my
    metaphor wasn't very clear.

    >
    >>The reference is passed by value
    >>to the function, just as you describe it should.

    >
    > Yes - there has to be pass-by-value at some level since Python is
    > implemented in C. But to me this just says that we are seeing the
    > side-effect of an implementation detail.


    I don't see it as a side effect; I see it as the core design.
    References are a very concrete type; they happen to be implemented as
    pointers in C. In Java they're not implemented as pointers - I'm not
    sure what they're implemented with, but it isn't pointers. In that
    case the implementation must pass by reference (a la Java), but the
    Python semantics of passing references by value are maintained.

    >
    > No need to argue this - you don't need to explain it to me because I
    > understand. It just see things from a different perspective.
    >
    >>Python, however, has a
    >>further layer of abstraction on top of its one data type, and that's
    >>objects. I would venture that it's the extra layer of abstraction
    >>that makes Python work in a more intuitive, proper way. This is,
    >>however, just my opinion. But it does seem that new programmers who
    >>learn this abstraction find it natural and simple. It's people
    >>coming from C, et al., that seem thrown off by it.

    >
    > No, my background isn't just in C - that is a long way from being the
    > first language I used, and I've used Python regularly longer than C++.


    I wasn't referring to you, but to the people that come to the newsgroup
    and ask questions about it. I still don't really understand what it is
    that bugs you about it, but that's more on my end.

    >
    >>As a side note, based on your description of your education and studies
    >>in an earlier post, I realize now how different our backgrounds seem
    >>to be. I'm coming from a world where the focus is engineering (Ada
    >>is the only language purpose-built for engineering, AFAIK). You seem
    >>to come from a background more focused on science and theory.

    >
    > Nope - I just happen to be interested in learning the theory too, and
    > particularly in experiencing a broad range of languages to understand
    > the ideas behind them.
    >
    > How come you didn't notice ada in my list?


    It's such a small word :). I'm sure I did, but wasn't thinking in this
    context then, so I didn't INCREF the information and it got garbage
    collected.

    >
    > About the first half of my programming career was working in the
    > defence industry as a software engineer. A lot of that was working at
    > very low level stuff (the 80c196kc microcontrollers I mentioned) and
    > with a lot of messing around with target hardware and in-circuit
    > emulators. The rest of my time in defence was spent using Ada.
    >
    > I first messed around with writing my own language quite young, with
    > an Atari ST. I liked the idea of text adventures and decided to write
    > one, but basically got caught up in the engine and the scripting
    > stuff. Pretty cool, though, through the rose-tinted glasses of memory.
    >
    > Anyway, I stayed interested in compilers and stuff, but the reason I
    > got interested in LR and parsing theory has little to do with parsing
    > as most see it. A grammar, seen as describing a sequence of tokens,
    > can deal with many things - the tokens don't have to be words. They
    > might be events, for instance. And the state of the 'parser' may be
    > used to select a response strategy in an AI. I wanted it for a game I
    > was going to write a few years back which would construct strategies
    > from various elements by building them into a kind of grammar, and use
    > the resulting state model as a real time AI.
    >
    >>Perhaps this is
    >>the cause of our different world views (on this issue). Just a
    >>thought, though it is interesting we both chose Python.

    >
    > Strictly speaking, the reason I chose Python at the time was because
    > at work we used macs with terminal emulators. If you wanted a utility,
    > you either wrote it under VMS or you were stuffed as mac interpreters
    > and compilers were hard to come by.
    >
    > I see it as one of the happiest flukes - I just wish I'd found Python
    > earlier.


    Interesting. I chose it mainly because I was experimenting with
    Smalltalk to learn dynamic programming, and it seemed a more practical
    language (I never got used to the whole image thing in Smalltalk). One
    of the things that I really liked about both languages is how they
    handled references. Perhaps that's why I'm so vehement about this whole
    topic. Well, that and too much caffeine.

    Adam Ruth
    Adam Ruth, Jul 17, 2003
    #6
  7. On Thu, 17 Jul 2003 00:03:38 +0000 (UTC), Adam Ruth
    <> wrote:

    >Anyway, glad we cleared that up. Sorry about misrepresenting you, I
    >really thought you were talking about Python. In retrospect, my
    >metaphor wasn't very clear.


    And in retrospect, I apologise too - of course the disguising-pointers
    thing was about Python and the fact that people *do* fake pointers.
    But the "when you really can't avoid pointers" had no place being
    carried over, as in Python you can avoid pointers (including faked
    pointers).

    I should have thought about that before replying - what I didn't
    state, I certainly did imply, and wrongly. Sorry again.

    >I don't see it as a side effect; I see it as the core design.
    >References are a very concrete type; they happen to be implemented as
    >pointers in C. In Java they're not implemented as pointers - I'm not
    >sure what they're implemented with, but it isn't pointers. In that
    >case the implementation must pass by reference (a la Java), but the
    >Python semantics of passing references by value are maintained.


    The terms 'pointer' and 'reference' are actually different names for
    essentially the same thing - an indirect way of accessing something
    else. The distinction between the terms is somewhat artificial, and
    the meanings vary according to your background. For example, the ML
    'reference' would probably be called a pointer by a C++ programmer as
    it requires explicit dereferencing using a dereference operator. But
    note the word 'reference' embedded within 'dereference', even when
    applied to a pointer.

    Different name, different syntactic quirks, but basically the same
    thing. It's just that references (as in things that don't need
    explicit dereferencing) tend to limit errors simply by limiting the
    ways you can use them.

    I just got tired of typing 'pointer/reference'.

    At some point, however, there must be some kind of manipulation of
    hardware addresses - maybe in the implementation of the language,
    maybe in the next layer down, or wherever. Even if you use a C
    fixed-size array, the implementation in assembler is using essentially
    pointer math. That pointer math is so exposed in C, in fact, that in
    most cases you can treat a pointer as an array and vice versa (sizeof
    being the only exception I can think of off the top of my head).

    Anyway, this kind of thing is not what I want from a high level
    language. What I would want (in the alternate universe I mentioned
    elsewhere) is actually a simpler version of what we already have - a
    mutable container that always holds exactly one item. That would do
    the important jobs of a pointer, but being explicitly meant for the
    job, it would cause less confusion and have less potential for errors.

    Even if it were dereferenced using '[0]', it would still be doing the
    job of a pointer. I'd simply like to see it dereferenced in a way that
    doesn't suggest a '[1]' or '[2]' might be used. And in some contexts,
    implicit dereferencing might well be the way to go - variable
    parameters being an important example.
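
    Such a one-item box, used as a "variable parameter", might look like
    this (a sketch; the names Box and swap are made up for illustration):

```python
class Box(object):
    """A mutable container that always holds exactly one item."""
    def __init__(self, value=None):
        self.value = value

def swap(p, q):
    # p and q are Boxes, so the exchange is visible to the caller
    p.value, q.value = q.value, p.value

x, y = Box(1), Box(2)
swap(x, y)
print(x.value, y.value)  # -> 2 1
```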

    >> No, my background isn't just in C - that is a long way from being the
    >> first language I used, and I've used Python regularly longer than C++.

    >
    >I wasn't referring to you, but to the people that come to the newsgroup
    >and ask questions about it.


    OK - sorry for taking that personally, then.
    Stephen Horne, Jul 17, 2003
    #7
