Re: Number of languages known [was Re: Python is readable] - somewhat OT

Discussion in 'Python' started by Nathan Rice, Mar 29, 2012.

  1. Nathan Rice

    Nathan Rice Guest

    On Thu, Mar 29, 2012 at 10:03 AM, Chris Angelico <> wrote:
    > On Fri, Mar 30, 2012 at 12:44 AM, Nathan Rice
    > <> wrote:
    >> We would be better off if all the time that was spent on learning
    >> syntax, memorizing library organization and becoming proficient with
    >> new tools was spent learning the mathematics, logic and engineering
    >> sciences.  Those solve problems, languages are just representations.

    >
    > Different languages are good at different things. REXX is an efficient
    > text parser and command executor. Pike allows live updates of running
    > code. Python promotes rapid development and simplicity. PHP makes it
    > easy to add small amounts of scripting to otherwise-static HTML pages.
    > C gives you all the power of assembly language with all the
    > readability of... assembly language. SQL describes a database request.


    Here's a thought experiment. Imagine that you have a project tree on
    your file system which includes files written in many different
    programming languages. Imagine that the files can be assumed to be
    contiguous for our purposes, so you could view all the files in the
    project as one long chunk of data. The directory and file names could
    be interpreted as statements in this data, analogous to "in the
    context of somedirectory" or "in the context of somefile with
    sometype". Any project configuration files could be viewed as
    declarative statements about contexts, such as "in xyz context, ignore
    those" or "in abc context, any that is actually a this". Imagine the
    compiler or interpreter is actually part of your program (which is
    reasonable since it doesn't do anything by itself). Imagine the build
    management tool is also part of your program in pretty much the same
    manner. Imagine that your program actually generates another program
    that will generate the program the machine runs. I hope you can
    follow me here, and further I hope you can see that this is a
    completely valid description of what is actually going on (from a
    different perspective).
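
    To make the "program that generates a program that generates the
    program" picture concrete, here is a minimal sketch in Python of a
    program that emits the source of another program and then runs the
    compiler as a subroutine.  The compiler name "cc" is an assumption;
    any C compiler on the PATH would do:

        # This Python program emits C source (one program), then invokes
        # a C compiler (another program) to produce the binary the
        # machine actually runs.
        import os
        import subprocess
        import tempfile

        c_source = '#include <stdio.h>\nint main(void) { printf("hi from generated code\\n"); return 0; }\n'

        with tempfile.TemporaryDirectory() as tmp:
            src = os.path.join(tmp, "gen.c")
            exe = os.path.join(tmp, "gen")
            with open(src, "w") as f:
                f.write(c_source)
            subprocess.run(["cc", src, "-o", exe], check=True)  # the compiler, as part of the program
            subprocess.run([exe], check=True)                   # run the generated program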

    In the context of the above thought experiment, it should be clear
    that we currently have something that is a structural analog of a
    single programming metalanguage (or rather, one per computer
    architecture), with many domain specific languages constructed above
    that to simplify tasks in various contexts. The model I previously
    proposed is not fantasy, it exists, just not in a form usable by human
    beings. Are machine instructions the richest possible metalanguage?
    I really doubt it.

    Let's try another thought experiment... Imagine that instead of having
    machine instructions as the common metalanguage, we pushed the point
    of abstraction closer to something programmers can reasonably work
    with: abstract syntax trees. Imagine all programming languages share
    a common abstract syntax tree format, with nodes generated using a
    small set of human intelligible semantic primes. Then, a domain
    specific language is basically a context with a set of logical
    implications. By associating a branch of the tree to one (or the
    union of several) context, you provide a transformation path to
    machine instructions via logical implication. If implications of a
    union context for the nodes in the branch are not compatible, this
    manifests elegantly in the form of a logical contradiction.
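
    As a toy model of that idea (sketched in Python, with every name --
    the primes, the contexts, their implications -- invented purely for
    illustration): nodes are built from a small shared vocabulary, each
    context carries implications, and an incompatible union of contexts
    surfaces as an explicit contradiction.

        from dataclasses import dataclass, field

        @dataclass
        class Node:
            prime: str                    # one of a small shared vocabulary
            children: list = field(default_factory=list)
            contexts: set = field(default_factory=set)

        # Each context implies properties of the nodes beneath it.
        IMPLICATIONS = {
            "pure-functional": {"mutates-state": False},
            "imperative":      {"mutates-state": True},
        }

        def check(node):
            """Union the implications of a node's contexts; a clash
            between them shows up as an explicit contradiction."""
            merged = {}
            for ctx in node.contexts:
                for key, value in IMPLICATIONS[ctx].items():
                    if key in merged and merged[key] != value:
                        raise ValueError("contradiction on %r at %r" % (key, node.prime))
                    merged[key] = value
            for child in node.children:
                check(child)

        try:
            check(Node("assign", contexts={"pure-functional", "imperative"}))
        except ValueError as e:
            print(e)  # contradiction on 'mutates-state' at 'assign'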

    What does pushing the abstraction point that far up provide? For one,
    you can now reason across language boundaries. A compiler can tell me
    if my prolog code and my python code will behave properly together.
    Another benefit is that you make explicit the fact that your parser,
    interpreter, build tools, etc are actually part of your program, from
    the perspective that your program is actually another program that
    generates programs in machine instructions. By unifying your build
    chain, it makes deductive inference spanning steps and tools possible,
    and eliminates some needless repetition. This also greatly simplifies
    code reuse, since you only need to generate a syntax tree of the
    proper format and associate the correct context to it. It also
    simplifies learning languages, since people only need to understand
    the semantic primes in order to read anything.

    Of course, this describes Lisp to some degree, so I still need to
    provide some answers. What is wrong with Lisp? I would say that the
    base syntax being horrible is probably the biggest issue. Beyond
    that, transformations on lists of data are natural in Lisp, but graph
    transformations are not, making some things awkward. Additionally,
    because Lisp tries to nudge you towards programming in a functional
    style, it can be un-intuitive to learn. Programming is knowledge
    representation, and state is a natural concept that many people desire
    to model, so making it a second class citizen is a mistake. If I were
    to re-imagine Lisp for this purpose, I would embrace state and an
    explicit notion of temporal order. Rather than pretending it didn't
    exist, I would focus on logical and mathematical machinery necessary
    to allow powerful deductive reasoning about state. It is no
    coincidence that when a language needs to support formal verification
    (such as microcontrollers and DSPs for mission-critical devices) a
    synchronous language is the go-to.  On the other side of the spectrum,
    Haskell is the darling of functional programmers, but it is one of the
    worst languages in existence as far as being able to reason about the
    behavior of your program goes. Ignoring state for a few mathematical
    conveniences is the damning mark on the brow of the functional
    paradigm. Functional programming may be better on the whole than
    imperative programming, but anyone who doesn't acknowledge that it is
    an evolutionary dead-end is delusional.

    > You can't merge all of them without making a language that's
    > suboptimal at most of those tasks - probably, one that's woeful at all
    > of them. I mention SQL because, even if you were to unify all
    > programming languages, you'd still need other non-application
    > languages to get the job done.
    >
    > Keep the diversity and let each language focus on what it's best at.


    I don't know of any semi-modern programming language that doesn't
    generate an abstract syntax tree.  Since any Turing-complete language
    can emulate any other Turing-complete language, there is no reason why
    a concise metalanguage for describing nodes of abstract syntax trees
    couldn't form the semantic vocabulary for every language in existence
    at the AST level. The syntax could be wildly different, but even then
    there is a VERY simple feature of CFGs that helps: they are closed
    under union. The only issue you could run into is if a node with a
    given name is represented by two different compositions of semantic
    primes at the AST level. Even this is not a show stopper though,
    because you could proceed using a union node. When converting the
    parse tree to an AST, it is likely only one of the two possible nodes
    in the union will fulfill all the requirements given its neighboring
    nodes and location in the tree. If there is more than one
    incompatible match, then of course you just alert the programmer to
    the contradiction and they can modify the tree context.
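
    The closure-under-union construction is as simple as advertised:
    given two grammars with disjoint nonterminals, add a fresh start
    symbol with one production to each original start symbol.  A sketch,
    treating a grammar as a dict from nonterminal to a list of
    productions; the two "if" fragments are hypothetical:

        def union(g1, s1, g2, s2, new_start="S"):
            """Union of two CFGs with disjoint nonterminals:
            S -> S1 | S2 generates L(G1) union L(G2)."""
            merged = dict(g1)
            merged.update(g2)
            merged[new_start] = [[s1], [s2]]
            return merged

        g1 = {"IF1": [["if", "EXPR1", ":", "BLOCK1"]]}       # Python-ish fragment
        g2 = {"IF2": [["if", "(", "EXPR2", ")", "BLOCK2"]]}  # C-ish fragment
        both = union(g1, "IF1", g2, "IF2")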

    I'm all for diversity of language at the level of minor notation and
    vocabulary, but to draw an analogy to the real world, English and
    Mandarin are redundant, and the fact that both exist creates a
    communication barrier for BILLIONS of people.  That doesn't mean that
    biologists shouldn't be able to define words to describe biological
    things; if you want to talk about biology you just need to learn the
    vocabulary.  That also doesn't mean that mathematicians shouldn't
    be able to use notation to structure complex statements; if you want
    to do math you need to man up and learn the notation (of course, I
    have issues with some mathematical notation, but there is no reason
    you should cry about things like set-builder notation).
     
    Nathan Rice, Mar 29, 2012
    #1

  2. Re: Number of languages known [was Re: Python is readable] - somewhat OT

    On Thu, 29 Mar 2012 13:48:40 -0400, Nathan Rice wrote:

    > Here's a thought experiment. Imagine that you have a project tree on
    > your file system which includes files written in many different
    > programming languages. Imagine that the files can be assumed to be
    > contiguous for our purposes, so you could view all the files in the
    > project as one long chunk of data. The directory and file names could
    > be interpreted as statements in this data, analogous to "in the context
    > of somedirectory" or "in the context of somefile with sometype". Any
    > project configuration files could be viewed as declarative statements
    > about contexts, such as "in xyz context, ignore those" or "in abc
    > context, any that is actually a this". Imagine the compiler or
    > interpreter is actually part of your program (which is reasonable since
    > it doesn't do anything by itself). Imagine the build management tool is
    > also part of your program in pretty much the same manner. Imagine that
    > your program actually generates another program that will generate the
    > program the machine runs. I hope you can follow me here, and further I
    > hope you can see that this is a completely valid description of what is
    > actually going on (from a different perspective).

    [...]
    > What does pushing the abstraction point that far up provide?


    I see why you are so hostile towards Joel Spolsky's criticism of
    Architecture Astronauts: you are one of them. Sorry Nathan, I don't know
    how you breathe that high up.

    For what it's worth, your image of "everything from the compiler on up is
    part of your program" describes both Forth and Hypercard to some degree,
    both of which I have used and like very much. I still think you're
    sucking vacuum :(



    --
    Steven
     
    Steven D'Aprano, Mar 30, 2012
    #2

  3. Nathan Rice

    Nathan Rice Guest

    >> Here's a thought experiment.  Imagine that you have a project tree on
    >> your file system which includes files written in many different
    >> programming languages.  Imagine that the files can be assumed to be
    >> contiguous for our purposes, so you could view all the files in the
    >> project as one long chunk of data.  The directory and file names could
    >> be interpreted as statements in this data, analogous to "in the context
    >> of somedirectory" or "in the context of somefile with sometype".  Any
    >> project configuration files could be viewed as declarative statements
    >> about contexts, such as "in xyz context, ignore those" or "in abc
    >> context, any that is actually a this".  Imagine the compiler or
    >> interpreter is actually part of your program (which is reasonable since
    >> it doesn't do anything by itself).  Imagine the build management tool is
    >> also part of your program in pretty much the same manner.  Imagine that
    >> your program actually generates another program that will generate the
    >> program the machine runs.  I hope you can follow me here, and further I
    >> hope you can see that this is a completely valid description of what is
    >> actually going on (from a different perspective).

    > [...]
    >> What does pushing the abstraction point that far up provide?

    >
    > I see why you are so hostile towards Joel Spolsky's criticism of
    > Architecture Astronauts: you are one of them. Sorry Nathan, I don't know
    > how you breathe that high up.
    >
    > For what it's worth, your image of "everything from the compiler on up is
    > part of your program" describes both Forth and Hypercard to some degree,
    > both of which I have used and like very much. I still think you're
    > sucking vacuum :(


    We live in a world where the tools that are used are based on
    tradition (read that as backwards compatibility if it makes you feel
    better) and as a mechanism for deriving personal identity. The world
    is backwards and retarded in many, many ways, this problem is
    interesting to me because it actually cuts across a much larger tract
    than is immediately obvious.

    People throughout history have had the mistaken impression that the
    world as it existed for them was the pinnacle of human development.
    Clearly all of those people were tragically deluded, and I suspect
    that is the case here as well.
     
    Nathan Rice, Mar 30, 2012
    #3
  4. alex23

    alex23 Guest

    Re: Number of languages known [was Re: Python is readable] - somewhat OT

    On Mar 30, 3:37 pm, Nathan Rice <>
    wrote:
    > We live in a world where the tools that are used are based on
    > tradition (read that as backwards compatibility if it makes you feel
    > better) and as a mechanism for deriving personal identity.  The world
    > is backwards and retarded in many, many ways, this problem is
    > interesting to me because it actually cuts across a much larger tract
    > than is immediately obvious.


    Do you produce commercial code in a team? Because going by your
    absolutist bullshit here, it certainly doesn't sound like it.

    When I join an organisation that requires language A as all of its
    systems are written in it, is that 'tradition' or 'personal identity'?
    How is 'compatibility' - either with existing systems or existing
    *developers* - a "backwards and retarded" approach to complex
    problems?

    If I've chosen language A because some aspect of its syntax maps
    better onto my mind (or for _whatever_ reason that makes individuals
    prefer one language to another), and you've chosen language B: who
    gets to decide which is the 'superior' language, which is the 'better'
    mapping etc?

    You're arguing for a top-down centralised approach to language
    development that just will _never_ exist, simply because it cannot. If
    you don't accept that, I believe there's a fascinating fork called
    "Python 4000" where your ideas would be readily adopted.
     
    alex23, Apr 2, 2012
    #4
  5. Nathan Rice

    Nathan Rice Guest

    On Sun, Apr 1, 2012 at 11:18 PM, alex23 <> wrote:
    > On Mar 30, 3:37 pm, Nathan Rice <>
    > wrote:
    >> We live in a world where the tools that are used are based on
    >> tradition (read that as backwards compatibility if it makes you feel
    >> better) and as a mechanism for deriving personal identity.  The world
    >> is backwards and retarded in many, many ways, this problem is
    >> interesting to me because it actually cuts across a much larger tract
    >> than is immediately obvious.

    >
    > Do you produce commercial code in a team? Because going by your
    > absolutist bullshit here, it certainly doesn't sound like it.


    Think of me like the Wolf, the cleaner in Pulp Fiction whom Marsellus
    Wallace calls in to take care of the mess when Jules accidentally blows
    a kid's brains out in the back of a car. I get called in when my
    skills are needed, and when the mess has been handled and things are
    back to normal I take my leave.

    > When I join an organisation that requires language A as all of its
    > systems are written in it, is that 'tradition' or 'personal identity'?
    > How is 'compatibility' - either with existing systems or existing
    > *developers* - a "backwards and retarded" approach to complex
    > problems?


    I don't care what people do related to legacy systems. There will
    always be a COBOL. I do care about programmers that are too lazy to
    learn, and would be happy to ignore the fact that programming is hard
    for most people to learn, so they can continue not learning. Those
    programmers are scumbags.

    Just don't let me hear you complaining because some syntax is not "C
    like" enough for you. Whenever I hear that I want to strangle the
    self-serving 'tard that wrote it. When I see people defending "C
    like" syntax as optimal or somehow much more expressive, that makes me
    doubly irritated. These are the people who are selfishly defending
    the status quo because they're invested. If you're going to be
    selfish and inconsiderate at least be honest about it, rather than
    pretending that one of the earliest languages somehow got almost
    everything right and should be the basis for new languages till the
    end of time. This goes for most of the ALGOL derived languages. I
    don't have a problem if you know your language well and are happy
    using it, that's great. Don't try to delude people that our modern
    ALGOL derivatives are the best possible way to model knowledge
    (including process knowledge) to a computer, because that is a lie.

    > If I've chosen language A because some aspect of its syntax maps
    > better onto my mind (or for _whatever_ reason that makes individuals
    > prefer one language to another), and you've chosen language B: who
    > gets to decide which is the 'superior' language, which is the 'better'
    > mapping etc?


    You should be able to live in your reality if you want, as long as that
    doesn't impinge on others. Of course, if you disagree on basic
    grammar, then I would have to ask you, do you disagree about English
    grammar, or have you accepted it so that you can communicate with
    people? This is why I advocate following English grammar closely for
    syntax - people have accepted it and don't make a big deal, and it is
    the way we represent information already.

    > You're arguing for a top-down centralised approach to language
    > development that just will _never_ exist, simply because it cannot. If
    > you don't accept that, I believe there's a fascinating fork called
    > "Python 4000" where your ideas would be readily adopted.


    You completely missed my point. In fact, my argument is for a bottom
    up approach, with a meeting point which is much closer than the
    machine code which is currently used. However you want to represent
    it, the knowledge is the same, and that is what matters. We need to
    get past the idea of different, incompatible languages, and settle on
    a common knowledge representation format that underlies all languages,
    and is compatible. If you want to make an alex23 DSL where up is down
    and inside is upside down, go for it, just as long as it is
    represented in a sensible set of semantic primes that I can transform
    to whatever reality I want.
     
    Nathan Rice, Apr 3, 2012
    #5
  6. alex23

    alex23 Guest

    Re: Number of languages known [was Re: Python is readable] - somewhat OT

    On Apr 3, 2:55 pm, Nathan Rice <>
    wrote:
    > I don't care what people do related to legacy systems.


    And that's what earns you the label 'architecture astronaut'. Legacy
    systems are _part_ of the problem; it's very easy to hold to a purist
    approach when you ignore the bulk of the domain that causes the
    issues. There's _never_ going to be an InfoTech3k where we just stop
    supporting older code.

    > I do care about programmers that are too lazy to
    > learn, and would be happy to ignore the fact that programming is hard
    > for most people to learn, so they can continue not learning.  Those
    > programmers are scumbags.


    Wait, what?

    Programmers are both "too lazy to learn" and yet somehow happy that
    the skills they've acquired are "too hard for most people to learn"?
    So how did they learn them?

    And they're also somehow "lazy" because they have to learn multiple
    languages to be effective, rather than one mythical ur-language?

    In my 20 years as a software developer, I have _never_ encountered
    anyone trying to deliberately expand the knowledge gap. This isn't a
    priesthood.

    > Just don't let me hear you complaining because some syntax is not "C
    > like" enough for you.  Whenever I hear that I want to strangle the
    > self-serving 'tard that wrote it.  When I see people defending "C
    > like" syntax as optimal or somehow much more expressive, that makes me
    > doubly irritated.  These are the people who are selfishly defending
    > the status quo because they're invested.


    Syntax is never the issue, it's the deeper semantics. Is the scoping
    of one C-like language the same as C? How does it differ? Why does it
    differ? Is the difference a fundamental implementation issue that you
    really need to know before you actually grok the language? Are
    functions first-class objects? Are they actual objects or some kind of
    magical stub? Can you extend those objects with properties? etc etc
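
    (For Python, at least, the answers to those last questions are easy
    to check: functions are real first-class objects, and you can attach
    arbitrary attributes to them.)

        def greet(name):
            return "hello, " + name

        greet.author = "alex23"    # functions accept new attributes
        print(type(greet))         # <class 'function'>
        print(greet.author, greet("world"))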

    Every language tackles _so many_ things differently. It's not lazy to
    say that you prefer something to resemble/be based on a language you
    have experience with, that's human nature. If you're insistent that
    your non-typical syntax is so much better, the onus is on you to prove
    it, not to insist that the lack of uptake is 'laziness'.

    And once again: code is _communication_. Not having to understand new
    optimal patterns for every single language is a Good Thing.

    > Don't try to delude people that our modern
    > ALGOL derivatives are the best possible way to model knowledge
    > (including process knowledge) to a computer, because that is a lie.


    Um, okay, I'll stop doing that...not that I've ever seen anyone make
    that claim...

    A large part of what makes languages popular _is their popularity_. In
    many ways, ALGOL is English to your hypothetical language's Lojban.
    You can argue until the end of time for the superiority of Lojban due
    to its lack of ambiguity; it's not going to affect its acquisition
    at all.

    > You should be able to live in your reality if you want, as long as that
    > doesn't impinge on others.  Of course, if you disagree on basic
    > grammar, then I would have to ask you, do you disagree about English
    > grammar, or have you accepted it so that you can communicate with
    > people?  This is why I advocate following English grammar closely for
    > syntax - people have accepted it and don't make a big deal, and it is
    > the way we represent information already.


    And programmers have accepted ALGOL and don't etc

    The idea of coding in English just fills me with horror and dread.
    COBOL died for a reason.

    > > You're arguing for a top-down centralised approach to language
    > > development that just will _never_ exist, simply because it cannot. If
    > > you don't accept that, I believe there's a fascinating fork called
    > > "Python 4000" where your ideas would be readily adopted.

    >
    > You completely missed my point.  In fact, my argument is for a bottom
    > up approach, with a meeting point which is much closer than the
    > machine code which is currently used.


    You missed my point; I was referring more to the _adoption_ of your ur-
    language. The only way to push this is to force it on everyone.

    > However you want to represent
    > it, the knowledge is the same, and that is what matters.  We need to
    > get past the idea of different, incompatible languages, and settle on
    > a common knowledge representation format that underlies all languages,
    > and is compatible.  If you want to make an alex23 DSL where up is down
    > and inside is upside down, go for it, just as long as it is
    > represented in a sensible set of semantic primes that I can transform
    > to whatever reality I want.


    So effectively for any given project I'd need to know: the underlying
    representation (because we have to be able to discuss _something_ as a
    team), my DSL, how my DSL transforms to the underlying representation,
    and to be really effective, every team member's DSL and how it
    transforms. Because _no one_ on my team works alone, debugs alone 100%
    of the time.

    How do I share cool patterns? Show them the underlying representation?
    How do they copy them? Back trace the representation to their own DSL
    and reimplement? What if the elegance in my DSL is a nightmare to
    construct in a peer's? How does my code look to them? Does it even
    include my identifiers & comments or is the representation too low
    level for that? How do they debug it?

    How do we learn? How do we share?
     
    alex23, Apr 3, 2012
    #6
  7. Nathan Rice

    Nathan Rice Guest

    On Tue, Apr 3, 2012 at 1:40 AM, alex23 <> wrote:
    > On Apr 3, 2:55 pm, Nathan Rice <>
    > wrote:
    >> I don't care what people do related to legacy systems.

    >
    > And that's what earns you the label 'architecture astronaut'. Legacy
    > systems are _part_ of the problem; it's very easy to  hold to a purist
    > approach when you ignore the bulk of the domain that causes the
    > issues. There's _never_ going to be an InfoTech3k where we just stop
    > supporting older code.


    There are people who are paid pretty well to support crappy old COBOL
    apps, but I am not among them (nor are you, with very high
    likelihood), so your "we" is misplaced. For all intents and purposes
    that software exists in an alternate reality.

    Remember the tutorial on global vs local optimization I made
    previously? Let me distill it... If you are unwilling to endure pain
    to move towards a better world you will always be trapped in a
    sub-optimal situation.

    >> I do care about programmers that are too lazy to
    >> learn, and would be happy to ignore the fact that programming is hard
    >> for most people to learn, so they can continue not learning.  Those
    >> programmers are scumbags.

    >
    > Wait, what?
    >
    > Programmers are both "too lazy to learn" and yet somehow happy that
    > the skills they've acquired are "too hard for most people to learn"?
    > So how did they learn them?
    >
    > And they're also somehow "lazy" because they have to learn multiple
    > languages to be effective,  rather than one mythical ur-language?
    >
    > In my 20 years as a software developer, I have _never_ encountered
    > anyone trying to deliberately expand the knowledge gap. This isn't a
    > priesthood.


    Did you miss the part where I said that most people who learn to
    program are fascinated by computers and highly motivated to do so?
    I've never met a BROgrammer, those people go into sales. It isn't
    because there aren't smart BROmosapiens (sadly, there are), they just
    couldn't give two shits about computers so programming seems like a
    colossal waste of time to them.

    It isn't about people scheming to "dis-empower the plebs", rather it
    is about people who don't want to move outside their comfort zone.
    You can talk about people learning multiple languages all you want,
    but for the most part they will be 10 descendants of ALGOL, with minor
    variations. Very few people are willing to tackle something like
    Haskell or ML if they weren't taught functional programming in
    university, though there are a few that view it as an endurance trial
    or mountain to climb. Those people get a pass on most of what I've
    said thus far.

    >> Just don't let me hear you complaining because some syntax is not "C
    >> like" enough for you.  Whenever I hear that I want to strangle the
    >> self-serving 'tard that wrote it.  When I see people defending "C
    >> like" syntax as optimal or somehow much more expressive, that makes me
    >> doubly irritated.  These are the people who are selfishly defending
    >> the status quo because they're invested.

    >
    > Syntax is never the issue, it's the deeper semantics. Is the scoping
    > of one C-like language the same as C? How does it differ? Why does it
    > differ? Is the difference a fundamental implementation issue that you
    > really need to know before you actually grok the language? Are
    > functions first-class objects? Are they actual objects or some kind of
    > magical stub? Can you extend those objects with properties? etc etc


    Syntax and semantics are both a big mess right now. That is why I
    always address them both.

    > Every language tackles _so many_ things differently. It's not lazy to
    > say that you prefer something to resemble/be based on a language you
    > have experience with, that's human nature. If you're insistent that
    > your non-typical syntax is so much better, the onus is on you to prove
    > it, not to insist that the lack of uptake is 'laziness'.


    The winds of change generally blow for programming when generations of
    older programmers leave the workforce. Alan Kay was a smart man,
    viewing programming as an educational tool and designing for youth is
    absolutely the right way to do things. If you try to retrain older
    programmers, you are basically telling them they have to change
    decades of learning for a moderate (but not huge) productivity
    increase, so that programming is accessible to a much wider group of
    people. Much like with the terminal to GUI transition, you will have
    people attacking declarative natural language programming as a stupid
    practice for noobs, and the end of computing (even though it will
    allow people with much less experience to be more productive than
    them).

    > And once again: code is _communication_. Not having to understand new
    > optimal patterns for every single language is a Good Thing.


    Code is a horrible medium for communication. If it weren't, I
    wouldn't be trolling this thread.

    >> Don't try to delude people that our modern
    >> ALGOL derivatives are the best possible way to model knowledge
    >> (including process knowledge) to a computer, because that is a lie.

    >
    > Um, okay, I'll stop doing that...not that I've ever seen anyone make
    > that claim...


    Computers require you to state the exact words you're searching for as
    well. Try looking again, and this time allow for sub-categories and
    synonyms, along with some variation in word order.

    > A large part of what makes languages popular _is their popularity_. In
    > many ways, ALGOL is English to your hypothetical language's Lojban.
    > You can argue until the end of time for the superiority of Lojban due
    > to its lack of ambiguity; it's not going to affect its acquisition
    > at all.


    I would say that ALGOL is more like the grunts and gestures of a
    proto-language. Some day, one or two hundred years from now,
    computers will be embarrassed that they were ever so obtuse. Kind of
    like when grown up children finally apologize to their parents for all
    the trouble they caused when they were younger.

    >> You should be able to live in your reality if you want, as long as that
    >> doesn't impinge on others.  Of course, if you disagree on basic
    >> grammar, then I would have to ask you, do you disagree about English
    >> grammar, or have you accepted it so that you can communicate with
    >> people?  This is why I advocate following English grammar closely for
    >> syntax - people have accepted it and don't make a big deal, and it is
    >> the way we represent information already.

    >
    > And programmers have accepted ALGOL and don't etc
    >
    > The idea of coding in English just fills me with horror and dread.
    > COBOL died for a reason.


    COBOL gets brought up every time there is a conversation about natural
    language programming. Take a break from the thread, program some
    COBOL, and tell me there is ANYTHING natural about it. On top of
    that, I imagine you would find many other reasons besides the use of
    English words that the language deserved to die.

    >> > You're arguing for a top-down centralised approach to language
    >> > development that just will _never_ exist, simply because it cannot. If
    >> > you don't accept that, I believe there's a fascinating fork called
    >> > "Python 4000" where your ideas would be readily adopted.

    >>
    >> You completely missed my point.  In fact, my argument is for a bottom
    >> up approach, with a meeting point which is much closer than the
    >> machine code which is currently used.

    >
    > You missed my point; I was referring more to the _adoption_ of your ur-
    > language. The only way to push this is to force it on everyone.


    No, most people are too selfish to do something because it is good for
    others. People learn programming languages because of a "killer app"
    or lucrative platform... Rails, Django, PHP, javascript (the browser),
    objective c (the iphone).

    Again, I defer to Alan Kay who I am quite sure already thought about
    this issue. Targeting young people and education avoids a lot of the
    "killer app" chasing and knowledge intertia.

    Providing a scripting layer for video games is also a viable option.
    That is modeling a game world, so a declarative language that is
    designed to model knowledge and systems would be a fairly easy sell.

    Finally, build a NoSQL database around it. Support both in-memory and
    distributed processes. It doesn't have to be the fastest, but it does
    have to be stable and easy to use.

    >> However you want to represent
    >> it, the knowledge is the same, and that is what matters.  We need to
    >> get past the idea of different, incompatible languages, and settle on
    >> a common knowledge representation format that underlies all languages,
    >> and is compatible.  If you want to make an alex23 DSL where up is down
    >> and inside is upside down, go for it, just as long as it is
    >> represented in a sensible set of semantic primes that I can transform
    >> to whatever reality I want.

    >
    > So effectively for any given project I'd need to know: the underlying
    > representation (because we have to be able to discuss _something_ as a
    > team), my DSL, how my DSL transforms to the underlying representation,
    > and to be really effective, every team member's DSL and how it
    > transforms. Because _no one_ on my team works alone, debugs alone 100%
    > of the time.


    People don't walk around using words of their own creation, they get
    together and agree on terminology for a topic, then stick to it.
    Additionally, I was suggesting that while you *write* your code in
    alex23ese, the computer would be able to produce a canonical
    representation, as a courtesy to you, since you suggested that not
    being able to write "your way" was somehow horrible or crippling.

    > How do I share cool patterns? Show them the underlying representation?
    > How do they copy them? Back trace the representation to their own DSL
    > and reimplement? What if the elegance in my DSL is a nightmare to
    > construct in a peer's? How does my code look to them? Does it even
    > include my identifiers & comments or is the representation too low
    > level for that? How do they debug it?
    >
    > How do we learn? How do we share?


    Don't think "underlying", instead think "canonical".

    Ultimately, the answers to your questions exist in the world for you
    to see. How does a surgeon describe a surgical procedure? How does a
    chef describe a recipe? How does a carpenter describe the process of
    building cabinets? Aside from specific words, they all use natural
    language, and it works just fine.
     
    Nathan Rice, Apr 3, 2012
    #7
  8. rusi

    rusi Guest

    Re: Number of languages known [was Re: Python is readable] - somewhat OT

    On Apr 3, 5:39 pm, Nathan Rice <>
    wrote:
    >
    > Don't think "underlying", instead think "canonical".
    >
    > Ultimately, the answers to your questions exist in the world for you
    > to see.  How does a surgeon describe a surgical procedure?  How does a
    > chef describe a recipe?  How does a carpenter describe the process of
    > building cabinets?  Aside from specific words, they all use natural
    > language, and it works just fine.


    A carpenter describes his carpentry-process in English
    A CSist describes his programming-process in English (at least all my
    CS books are in English)

    A carpenter uses his tools -- screwdriver, saw, planer --to do
    carpentry
    A programmer uses his tools to do programming -- one of which is
    called 'programming language'

    Doing programming without programming languages is like using toenails
    to tighten screws
     
    rusi, Apr 3, 2012
    #8
  9. On 03/04/2012 14:51, rusi wrote:
    > On Apr 3, 5:39 pm, Nathan Rice<>
    > wrote:
    >>
    >> Don't think "underlying", instead think "canonical".
    >>
    >> Ultimately, the answers to your questions exist in the world for you
    >> to see. How does a surgeon describe a surgical procedure? How does a
    >> chef describe a recipe? How does a carpenter describe the process of
    >> building cabinets? Aside from specific words, they all use natural
    >> language, and it works just fine.

    >
    > A carpenter describes his carpentry-process in English
    > A CSist describes his programming-process in English (at least all my
    > CS books are in English)
    >
    > A carpenter uses his tools -- screwdriver, saw, planer --to do
    > carpentry
    > A programmer uses his tools to do programming -- one of which is
    > called 'programming language'
    >
    > Doing programming without programming languages is like using toenails
    > to tighten screws


    The latter is extremely difficult if you bite your toenails :)

    --
    Cheers.

    Mark Lawrence.
     
    Mark Lawrence, Apr 3, 2012
    #9
  10. On Wed, Apr 4, 2012 at 12:26 AM, Mark Lawrence <> wrote:
    > On 03/04/2012 14:51, rusi wrote:
    >> Doing programming without programming languages is like using toenails
    >> to tighten screws

    >
    >
    > The latter is extremely difficult if you bite your toenails :)


    I agree, thumbnails are far better suited. Mine are often pushed into
    that service. But to extend the analogy: Using a thumbnail to tighten
    a screw is like directly patching a binary to fix a bug. It works, but
    it's not exactly a practical way to build a system.

    ChrisA
     
    Chris Angelico, Apr 3, 2012
    #10
  11. Re: Number of languages known [was Re: Python is readable] - somewhat OT

    On 2012-04-03, Chris Angelico <> wrote:
    > On Wed, Apr 4, 2012 at 12:26 AM, Mark Lawrence <> wrote:
    >> On 03/04/2012 14:51, rusi wrote:
    >>> Doing programming without programming languages is like using toenails
    >>> to tighten screws

    >>
    >>
    >> The latter is extremely difficult if you bite your toenails :)

    >
    > I agree, thumbnails are far better suited. Mine are often pushed into
    > that service. But to extend the analogy: Using a thumbnail to tighten
    > a screw is like directly patching a binary to fix a bug. It works, but
    > it's not exactly a practical way to build a system.


    Anybody remember DEC's VAX/VMS "patch" utility? Apparently, DEC
    thought it was a practical way to fix things. It had a built-in
    assembler and let you "insert" new code into a function by
    auto-allocating a location for the new code and hooking it into the
    indicated spot with jump instructions.

    The mind wobbled.
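
    For anyone who never saw it, the mechanism is easy to mimic.  A toy
    illustration of patch-by-jump in Python, using a bytearray as a
    stand-in for the binary; the one-byte "opcodes" are invented, not
    from any real architecture:

        JMP = 0xE9  # pretend jump: one opcode byte, one target byte

        def hook(image, site, new_code):
            """Overwrite the instruction at `site` with a jump to freshly
            allocated space holding the new code plus the displaced
            instruction, then jump back."""
            free = len(image)                        # "auto-allocate" at the end
            displaced = bytes(image[site:site + 2])  # instruction being overwritten
            image[site:site + 2] = bytes([JMP, free])    # hook: jump out
            image.extend(new_code + displaced)           # new code, then the original
            image.extend(bytes([JMP, site + 2]))         # jump back past the hook

        image = bytearray([0x01, 0x02, 0x03, 0x04])  # original "binary"
        hook(image, site=0, new_code=bytes([0xAA, 0xBB]))
        print(image.hex())  # e9040304aabb0102e902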

    --
    Grant Edwards               grant.b.edwards at gmail.com
                                Yow! I'm a fuschia bowling ball somewhere in Brittany
     
    Grant Edwards, Apr 3, 2012
    #11
  12. Ian Kelly

    Ian Kelly Guest

    On Tue, Apr 3, 2012 at 6:39 AM, Nathan Rice
    <> wrote:
    > Did you miss the part where I said that most people who learn to
    > program are fascinated by computers and highly motivated to do so?
    > I've never met a BROgrammer, those people go into sales.  It isn't
    > because there aren't smart BROmosapiens (sadly, there are), they just
    > couldn't give two shits about computers so programming seems like a
    > colossal waste of time to them.


    I have never met the brogrammer stereotype. I have also never met the
    non-brogrammer stereotype of nerdy solitude (well, maybe once).
    That's all these things are -- stereotypes. Real programmers are much
    more complex.

    > Computers require you to state the exact words you're searching for as
    > well.  Try looking again, and this time allow for sub-categories and
    > synonyms, along with some variation in word order.


    Lazy troll. You made the claim. The onus is on you to provide the evidence.
     
    Ian Kelly, Apr 3, 2012
    #12
  13. On Wed, Apr 4, 2012 at 1:01 AM, Ian Kelly <> wrote:
    > Real programmers are much more complex.


    Are you saying that some part of all of us is imaginary??

    ChrisA
     
    Chris Angelico, Apr 3, 2012
    #13
  14. On 03/04/2012 15:56, Chris Angelico wrote:
    > On Wed, Apr 4, 2012 at 12:46 AM, Grant Edwards<> wrote:
    >> Anybody remember DEC's VAX/VMS "patch" utility? Apparently, DEC
    >> thought it was a practical way to fix things. It had a built-in
    >> assembler and let you "insert" new code into a function by
    >> auto-allocating a location for the new code and hooking it into the
    >> indicated spot with jump instructions.
    >>
    >> The mind wobbled.

    >
    > Not specifically, but I _have_ heard of various systems whose source
    > code and binary were multiple years divergent. It's actually not a
    > difficult trap to fall into, especially once you start patching
    > running systems. I've had quite a few computers that have been unable
    > to reboot without assistance, because they go for months or years
    > without ever having to go through that initial program load. (I've had
    > _programs_ that were unable to load, for the same reason.) But
    > auto-allocating a new spot for your expanded function? That's just...
    > awesome. My mind is, indeed, wobbling.
    >
    > ChrisA


    Around 1990 I worked on Telematics kit. The patches on all their
    software were implemented via assembler once the original binary had
    been loaded into memory. They even came up with a system that let you
    select which patches you wanted and which you didn't, as e.g. some
    patches were customer specific.

    --
    Cheers.

    Mark Lawrence.
     
    Mark Lawrence, Apr 3, 2012
    #14
  15. Nathan Rice

    Nathan Rice Guest

    On Tue, Apr 3, 2012 at 9:51 AM, rusi <> wrote:
    > On Apr 3, 5:39 pm, Nathan Rice <>
    > wrote:
    >>
    >> Don't think "underlying", instead think "canonical".
    >>
    >> Ultimately, the answers to your questions exist in the world for you
    >> to see.  How does a surgeon describe a surgical procedure?  How does a
    >> chef describe a recipe?  How does a carpenter describe the process of
    >> building cabinets?  Aside from specific words, they all use natural
    >> language, and it works just fine.

    >
    > A carpenter describes his carpentry-process in English
    > A CSist describes his programming-process in English (at least all my
    > CS books are in English)
    >
    > A carpenter uses his tools -- screwdriver, saw, planer --to do
    > carpentry
    > A programmer uses his tools to do programming -- one of which is
    > called 'programming language'
    >
    > Doing programming without programming languages is like using toenails
    > to tighten screws


    I would argue that the computer is the tool, not the language.
     
    Nathan Rice, Apr 3, 2012
    #15
  16. Dave Angel

    Dave Angel Guest

    On 04/03/2012 11:16 AM, Mark Lawrence wrote:
    > On 03/04/2012 15:56, Chris Angelico wrote:
    >> On Wed, Apr 4, 2012 at 12:46 AM, Grant
    >> Edwards<> wrote:
    >>> Anybody remember DEC's VAX/VMS "patch" utility? Apparently, DEC
    >>> thought it was a practical way to fix things. It had a built-in
    >>> assembler and let you "insert" new code into a function by
    >>> auto-allocating a location for the new code and hooking it into the
    >>> indicated spot with jump instructions.
    >>>
    >>> The mind wobbled.

    >>
    >> Not specifically, but I _have_ heard of various systems whose source
    >> code and binary were multiple years divergent. It's actually not a
    >> difficult trap to fall into, especially once you start patching
    >> running systems. I've had quite a few computers that have been unable
    >> to reboot without assistance, because they go for months or years
    >> without ever having to go through that initial program load. (I've had
    >> _programs_ that were unable to load, for the same reason.) But
    >> auto-allocating a new spot for your expanded function? That's just...
    >> awesome. My mind is, indeed, wobbling.
    >>
    >> ChrisA

    >
    > Around 1990 I worked on Telematics kit. The patches on all their
    > software were implemented via assembler once the original binary had
    > been loaded into memory. They even came up with a system that let you
    > select which patches you wanted and which you didn't, as e.g. some
    > patches were customer specific.
    >


    And I worked on a system where the microcode was in ROM, and there was a
    "patch board" consisting of lots of diodes and some EPROMs. The diodes
    were soldered into place to specify the instruction(s) to be patched, and
    the actual patches were in the EPROMs, which were reusable. The diodes
    were the only thing fast enough to "patch" the ROM, by responding more
    quickly than the ROM. This was back when issuing a new ROM was a very
    expensive proposition; there were masking charges, so you couldn't
    reasonably do low quantities.



    --

    DaveA
     
    Dave Angel, Apr 3, 2012
    #16
  17. On Wed, 4 Apr 2012 01:05:49 +1000, Chris Angelico <>
    declaimed the following in gmane.comp.python.general:

    > On Wed, Apr 4, 2012 at 1:01 AM, Ian Kelly <> wrote:
    > > Real programmers are much more complex.

    >
    > Are you saying that some part of all of us is imaginary??
    >

    At least half of me is genetically engineered wolf (though he seems
    to have stopped aging five years ago <G>)
    http://home.earthlink.net/~baron.wulfraed/wr_biog.htm

    Will that qualify?

    --
    Wulfraed Dennis Lee Bieber AF6VN
    HTTP://wlfraed.home.netcom.com/
     
    Dennis Lee Bieber, Apr 3, 2012
    #17
  18. On Tue, 3 Apr 2012 14:46:17 +0000 (UTC), Grant Edwards
    <> declaimed the following in
    gmane.comp.python.general:

    > Anybody remember DEC's VAX/VMS "patch" utility? Apparently, DEC
    > thought it was a practical way to fix things. It had a built-in
    > assembler and let you "insert" new code into a function by
    > auto-allocating a location for the new code and hooking it into the
    > indicated spot with jump instructions.
    >
    > The mind wobbled.


    Sounds like the TRSDOS patch program... No assembler, but it would
    allow one to specify a direct replacement (if the same size or less), or
    a jump to an extension block if larger...

    --
    Wulfraed Dennis Lee Bieber AF6VN
    HTTP://wlfraed.home.netcom.com/
     
    Dennis Lee Bieber, Apr 3, 2012
    #18
  19. rusi

    rusi Guest

    Re: Number of languages known [was Re: Python is readable] - somewhat OT

    On Apr 3, 9:15 pm, Nathan Rice <>
    wrote:
    > On Tue, Apr 3, 2012 at 9:51 AM, rusi <> wrote:
    > > On Apr 3, 5:39 pm, Nathan Rice <>
    > > wrote:

    >
    > >> Don't think "underlying", instead think "canonical".

    >
    > >> Ultimately, the answers to your questions exist in the world for you
    > >> to see.  How does a surgeon describe a surgical procedure?  How does a
    > >> chef describe a recipe?  How does a carpenter describe the process of
    > >> building cabinets?  Aside from specific words, they all use natural
    > >> language, and it works just fine.

    >
    > > A carpenter describes his carpentry-process in English
    > > A CSist describes his programming-process in English (at least all my
    > > CS books are in English)

    >
    > > A carpenter uses his tools -- screwdriver, saw, planer --to do
    > > carpentry
    > > A programmer uses his tools to do programming -- one of which is
    > > called 'programming language'

    >
    > > Doing programming without programming languages is like using toenails
    > > to tighten screws

    >
    > I would argue that the computer is the tool, not the language.


    "Computer science is as much about computers as astronomy is about
    telescopes" -- E W Dijkstra

    Here are some other attempted corrections of the misnomer "computer
    science":
    http://en.wikipedia.org/wiki/Computer_science#Name_of_the_field
     
    rusi, Apr 3, 2012
    #19
  20. Nathan Rice

    Nathan Rice Guest

    On Tue, Apr 3, 2012 at 11:01 AM, Ian Kelly <> wrote:
    > On Tue, Apr 3, 2012 at 6:39 AM, Nathan Rice
    > <> wrote:
    >> Did you miss the part where I said that most people who learn to
    >> program are fascinated by computers and highly motivated to do so?
    >> I've never met a BROgrammer, those people go into sales.  It isn't
    >> because there aren't smart BROmosapiens (sadly, there are), they just
    >> couldn't give two shits about computers so programming seems like a
    >> colossal waste of time to them.

    >
    > I have never met the brogrammer stereotype.  I have also never met the
    > non-brogrammer stereotype of nerdy solitude (well, maybe once).
    > That's all these things are -- stereotypes.  Real programmers are much
    > more complex.


    I have never met a programmer that was not completely into computers.
    That leaves a lot unspecified though.

    >> Computers require you to state the exact words you're searching for as
    >> well.  Try looking again, and this time allow for sub-categories and
    >> synonyms, along with some variation in word order.

    >
    > Lazy troll.  You made the claim.  The onus is on you to provide the evidence.


    I reserve the right to be lazy :)

    As part of my troll-outreach effort, I will indulge here. I was
    specifically thinking about some earlier claims that programming
    languages as they currently exist are somehow inherently superior to a
    formalized natural language in expressive power.

    I think part of this comes from the misconception that terse is better
    (e.g. Paul Graham's thoughts on car/cdr), which doesn't take into
    account that your brain compresses frequently occurring English words
    VERY efficiently, so they actually take up less cognitive bandwidth
    than a much shorter non-word. This behavior extends to the phrase
    level as well; longer phrases that are meaningful in their own right
    take up less bandwidth than short nonsensical word combinations.
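
    (Paul Graham's car/cdr example in miniature: the operations are
    identical, but "first" and "rest" are English words a reader has
    already compressed, while "car" and "cdr" are not.)

        def car(cell):   return cell[0]   # terse, historical name
        def cdr(cell):   return cell[1]
        def first(cell): return cell[0]   # longer, but already a known word
        def rest(cell):  return cell[1]

        lst = (1, (2, (3, None)))         # a cons-style linked list
        assert car(cdr(lst)) == first(rest(lst)) == 2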

    On the semantic side, most people already understand branched
    processes and procedures with conditional actions pretty well. People
    "program" other people to perform tasks constantly, and have been
    doing so for the entirety of our existence. The problem occurs when
    programming language specific semantic artifacts must be considered.
    These artifacts are for the most part somewhat arbitrary, or you would
    see them frequently in other areas, and they wouldn't confuse people
    so much. I think the majority of these relate to how the computer
    operates internally - this is the stuff that really turns most people
    off to programming.

    The crux of my view is that programming languages exist in part
    because computers in general are not smart enough to converse with
    humans on their own level, so we have to talk to them like autistic 5
    year-olds. That was fine when we didn't have any other options, but
    all the pieces exist now to let computers talk to us very close to our
    own level, and represent information the same way we do.  Projects
    like IBM's Watson, Siri, Wolfram Alpha and Cyc demonstrate pretty
    clearly to me that we are capable of taking the next step, and the
    resurgence of the technology sector along with the shortage of
    qualified developers indicates to me that we need to move now.
     
    Nathan Rice, Apr 3, 2012
    #20