C and the future of computing

Discussion in 'C Programming' started by Eric Sosman, Apr 1, 2011.

    A new C Standard is on the horizon, and since the gestation period of
    a Standard is roughly that of six point three elephants it is perhaps
    not too soon to start thinking about the next one. What will computing
    be like in the late 2020's, and what role will C play? Prognostication
    is an uncertain business for those who lack groundhog genes, and it is
    virtually certain that my predictions will be incorrect in detail, but
    I hope and believe that many of them will be true in broad outline.

    First, I believe that the future of computing is rooted in its past.
    Athene sprang full-armed from the brow of Zeus, but capitogenesis is
    rare in computing: Somebody has an idea, various people kick it around
    for years or even decades, and finally somebody else makes money off
    it. Computing in the 2020's will be an outgrowth of something that's
    already shivering on the chilly fringes today. Will it be Gallic
    Asinine? Probably not: I find myself agreeing with those who say "It's
    the future of computing: Always has been, always will be." Super-
    parallelism? I doubt it, since Lobachevsky and Riemann have long
    since shown the parallel postulate to be an arbitrary construct.
    Artificial Intelligence will never match Natural Stupidity, RAD is
    just a FAD, and The Singularity is only a trap representation.

    What nascent development will nurture our future, or nuture our
    furture? What mighty hint will we all ignore for the next several
    years, until we suddenly all start claiming "We were There"? It's
    impossible to be certain, of course, but I think the most likely
    candidate is Quondam Computing.

    Quondam Computing is thought of as a very modern development, but it
    is in fact quite old. Some of the fundamental concepts can be found
    in the works of Zweistein, although he himself professed disbelief
    in the ideas and dismissed them as "Stupid action at a distance." A
    few historians trace the origins of QC all the way back to Heisenbug
    himself. But even they are too timid: The earliest references to QC
    are in Genesis 6:15, where the Lord commands Noah to build an Ark
    "three hundred qubits long, fifty qubits wide and thirty qubits high."
    That's 450,000 qubic qubits (times a scale factor for the shape, but
    programmers traditionally ignore O(1) multipliers), a quantity of
    qubits that would be the envy of modern quomputer scientists.

    I believe, though, that the day is not far off when today's physicists
    will deploy exotic materials like selenium trixenide, gadolinoleum
    emancipate, and perhaps thiotimoline to duplicate what Noah did with
    cypress. (Space does not permit a recitation of the properties of
    these compounds; for details consult Wonkypedia or Giggle.) When, not
    if, this happens, we will surely see the rise of a new Zack Mickerburg
    or Allen Learison to claim ownership of the idea and lock it up with
    patents and the like. Even so, I believe QC will be too important to
    lie dormant: The future Farceback or Borgacle will stand to make so
    much money from QC that they'll be forced to market it, and then it
    will be only a matter of time until someone open-sores it.

    What, then, are the salient features of Quondam Computing, and how
    well do they mesh with C? I've already mentioned "stupid action at
    a distance," the phenomenon by which entangled quondam particles (i.e.,
    those whose divorces aren't yet final) can affect each other without
    communicating (e.g., by refusing to answer the phone). It turns out
    that C has supported stupid action at a distance since its earliest
    versions, with a construct called the "wild*" or "wild pointer"! Up
    to now, programmers have tried to avoid SAAAD, and to stamp it out with
    tools like Eclectic Farce and Poorify; in the new world of QC they
    should learn instead to welcome it -- since it's pretty much inevitable,
    they might as well.
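
    For any reader who has somehow eluded SAAAD until now, here is a
    minimal sketch (hypothetical, and deliberately broken) of entangling
    two innocent objects through a wild pointer:

        #include <stdio.h>

        int main(void)
        {
            int *wild;           /* never initialized: points "somewhere" */
            int innocent = 42;   /* minding its own business nearby */

            *wild = 7;           /* undefined behavior: may scribble on
                                    innocent, the stack, or the cat */

            printf("%d\n", innocent);   /* 42? 7? Nasal demons? */
            return 0;
        }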

    Another feature of Quondam Computing is what's called the superstition
    of states, where you believe your program is behaving as intended even
    though there is no rational reason to think so. Again, C is admirably
    suited to this attitudinal shift: Since a C program almost never does
    what it's supposed to, superstition of states is practically a given.
    Indeed, some C programmers can attain superstition not only of states,
    but of nations or even of continents!

    The most important QC characteristic of all, though, is indeterminacy.
    A quondam system is useful because its state is undefined; the moment
    you observe the state its waif function prolapses to a single outcome
    and all the other possible outcomes are forever lost. Just as a cat
    has maximum entropy when you can't see it (for all you know, it might
    be chasing mice in the cellar or chasing squirrels outside or barfing
    on your bed), so the quondam computer has maximum information content
    when you haven't yet looked at the output. The moment you see the
    output is the moment when the QC must commit to giving you just one
    answer and losing the other billion. How can any programming language
    avoid dissipating QC's indeterminacy and thus losing its value?

    Here, it seems, is where C really shines. Languages like Ada have
    rules like "Do X, and Y will happen." Java makes almost a fetish out
    of such strictures: "Do X on any system anywhere, and Y will happen
    on every system everywhere, maybe." Unlike these rigidly certain
    languages, though, C is the language that deliberately erects shrines
    to Undefined Behavior, Unspecified Behavior, Implementation-Defined
    Behavior, and Impolite Behavior. C is thus perfectly suited for the
    implementation of quondam computing algorithms: to get maximum benefit
    from QC you must not ask what will happen, and C is the language that
    wouldn't tell you even if you did.
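
    Lest anyone doubt C's devotion, here is a minimal sketch (illustrative
    only; the Standard has not yet consecrated Impolite Behavior) that
    worships at three of the shrines in one translation unit:

        #include <stdio.h>
        #include <limits.h>

        static int f(void) { puts("f"); return 1; }
        static int g(void) { puts("g"); return 2; }

        int main(void)
        {
            int i = INT_MAX;

            i = i + 1;                 /* undefined: signed overflow */

            printf("%d\n", f() + g()); /* unspecified: f and g may be
                                          called in either order */

            printf("%d\n", -1 >> 1);   /* implementation-defined: right
                                          shift of a negative value */
            return 0;
        }

    Run it twice on a sufficiently quondam implementation and you may
    well get two different answers, which is exactly the point.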

    America's best-known computer scientist, Dilbert, once explained
    quondam computing as being like a rotating doughnut shot from a
    cannon at the speed of light, but without the doughnut. That, it
    seems to me, is the Spirit of C that will carry the language forward.

    (Permission is granted for unlimited electronic transmission and
    storage of this Intellectual Property. Permission is also granted
    to any single person to print exactly one hard copy, no more. All
    photocopies require individual licenses from the originator's own lawyers.)

    --
    Eric Sosman