Re: Why (or not) use single letter variable names?

Discussion in 'C Programming' started by Arthur J. O'Dwyer, Dec 18, 2004.

  1. [xposted and fups set to clc]

    On Fri, 17 Dec 2004, Shmuel (Seymour J.) Metz wrote:
    >
    > (Herman Rubin) said:
    >> Someone has already pointed out several other meanings
    >> for * in C. Why wasn't "@" used for indirection in C?

    >
    > You'd have to aski K&R to be sure, but my guess is that it is one of:

    ^^^^
    (Pun intended? ;)

    My first guess was that the '@' sign wasn't yet in popular use on
    computer systems in those days. But BCPL did use ASCII, and ASCII has
    always had the '@', right? So that's a good question --- why didn't
    C use '@' for /anything/? It uses practically every other glyph, except
    the backquote ` and the dollar sign $, and the latter is an
    understandable omission.

    > 1. The common usage of * in many assembler for indirect addressing
    >
    > 2. Perhaps BCPL used * for indirect addressing


    AFAICT, indirection in BCPL was accomplished with the 'V*' notation
    for vector (array) indexing and the 'rv L' notation for "getting an rvalue
    from an lvalue"; 'rv V' and 'V*[0]' were equivalent.
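
    A minimal C sketch of that equivalence, with names invented here
    purely for illustration (unary '*' plays the role described above
    for 'rv', and p[0] is defined as *(p + 0)); it also shows a few of
    the other meanings of '*' that the quoted question alludes to:

        #include <stdio.h>

        int main(void)
        {
            int v[3] = {10, 20, 30};
            int *p = v;              /* '*' in a declaration: pointer type  */

            printf("%d\n", *p);      /* '*' as unary dereference: prints 10 */
            printf("%d\n", p[0]);    /* subscripting: the same object as *p */
            printf("%d\n", 2 * *p);  /* '*' as multiplication, then deref   */
            return 0;
        }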

    I thought I read somewhere that one of C's ancestors didn't have a
    special symbol for indirection, but used some alphabetic token for it,
    such as 'deref' or something similar. Is that a garbled memory of BCPL,
    or does anyone have a better idea why I might have thought that?

    -Arthur
     
    Arthur J. O'Dwyer, Dec 18, 2004
    #1

  2. [OT] Why (or not) use single letter variable names?

    Arthur J. O'Dwyer <> scribbled the following
    on comp.lang.c:
    > [xposted and fups set to clc]
    > On Fri, 17 Dec 2004, Shmuel (Seymour J.) Metz wrote:
    >> (Herman Rubin) said:
    >>> Someone has already pointed out several other meanings
    >>> for * in C. Why wasn't "@" used for indirection in C?

    >>
    >> You'd have to aski K&R to be sure, but my guess is that it is one of:

    > ^^^^
    > (Pun intended? ;)


    > My first guess was that the '@' sign wasn't yet in popular use on
    > computer systems in those days. But BCPL did use ASCII, and ASCII has
    > always had the '@', right? So that's a good question --- why didn't
    > C use '@' for /anything/? It uses practically every other glyph, except
    > the backquote ` and the dollar sign $, and the latter is an
    > understandable omission.


    Computers have had @ signs on their keyboards at least since the late
    1970's. My first computer was a Commodore 64, with the @ key quite
    clearly present.
    Of course back when the Commodore 64 was still popular, no one in
    Finland had ever heard of e-mail. So when I went to a "computer camp"
    (which turned out to be a pretty disappointing experiment) I heard
    someone call the @ sign a "spiral".
    Actually I wish something other than @ had been chosen as the separator
    in e-mail addresses. It has now more-or-less become the universal
    symbol for the Internet. It keeps popping up in logos and advertisements
    everywhere the Internet is mentioned. If a more common character had been
    chosen, there would be no single character representing the Internet.

    --
    /-- Joona Palaste () ------------- Finland --------\
    \-------------------------------------------------------- rules! --------/
    "Immanuel Kant but Genghis Khan."
    - The Official Graffitist's Handbook
     
    Joona I Palaste, Dec 18, 2004
    #2

  3. On Sat, 18 Dec 2004 14:14:59 -0500 (EST), in comp.lang.c, "Arthur J.
    O'Dwyer" <> wrote:

    >So that's a good question --- why didn't
    >C use '@' for /anything/?


    Possibly it has some meaning in the shell used on the PDP-11 or 11/780?

    And actually I think it might have been missing from the original ASCII
    set.

    --
    Mark McIntyre
    CLC FAQ <http://www.eskimo.com/~scs/C-faq/top.html>
    CLC readme: <http://www.ungerhu.com/jxh/clc.welcome.txt>

     
    Mark McIntyre, Dec 18, 2004
    #3
  4. Arthur J. O'Dwyer wrote:
    >
    > [xposted and fups set to clc]
    >
    > On Fri, 17 Dec 2004, Shmuel (Seymour J.) Metz wrote:
    >
    >>
    >> (Herman Rubin) said:
    >>
    >>> Someone has already pointed out several other meanings
    >>> for * in C. Why wasn't "@" used for indirection in C?

    >>
    >>
    >> You'd have to aski K&R to be sure, but my guess is that it is one of:

    >
    > ^^^^
    > (Pun intended? ;)
    >
    > My first guess was that the '@' sign wasn't yet in popular use on
    > computer systems in those days. But BCPL did use ASCII, and ASCII has
    > always had the '@', right? So that's a good question --- why didn't
    > C use '@' for /anything/? It uses practically every other glyph, except
    > the backquote ` and the dollar sign $, and the latter is an
    > understandable omission.


    As I recall, and as another poster pointed out already, in the
    early days of Unix, the '@' character was used for line-delete.
    This was before the days of sophisticated terminal settings
    and you didn't have a choice about it. The tty driver would,
    I think, echo the '@' sign followed by a CR/LF.

    I would guess that there were just too many instances of:
    "Oh, spit!, 60 characters into the line and I hit the
    [expletive deleted] '@' without escaping it again!"

    During that same time frame '#' was used as the character-delete
    (erase) character. If I recall correctly (and I may not), '#' appeared
    in C only in preprocessor directives, and then only at the beginning
    of the line. If you typed it by mistake anywhere else, you
    only lost one character. If you didn't escape it at the
    beginning of a preprocessor line, you would soon
    find out your error.
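
    (For the curious: the erase and kill characters described above
    still exist in POSIX terminal settings. A small sketch, assuming the
    modern termios interface rather than the historical v6 tty driver:)

        #include <stdio.h>
        #include <termios.h>
        #include <unistd.h>

        int main(void)
        {
            struct termios t;

            if (tcgetattr(STDIN_FILENO, &t) != 0) {
                perror("tcgetattr");
                return 1;
            }
            /* On early Unix these defaulted to '#' and '@';
               today they are usually backspace and ^U. */
            printf("erase (character-delete): 0x%02x\n",
                   (unsigned)t.c_cc[VERASE]);
            printf("kill  (line-delete):      0x%02x\n",
                   (unsigned)t.c_cc[VKILL]);
            return 0;
        }

    ("stty erase '#' kill '@'" would restore the historical behaviour.)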

    [ SNIP ]

    >
    > -Arthur


    NPL

    --
    "It is impossible to make anything foolproof
    because fools are so ingenious"
    - A. Bloch
     
    Nick Landsberg, Dec 18, 2004
    #4
  5. In article <>, "Arthur J. O'Dwyer" <> wrote:
    >
    >[xposted and fups set to clc]
    >
    >On Fri, 17 Dec 2004, Shmuel (Seymour J.) Metz wrote:
    >>
    >> (Herman Rubin) said:
    >>> Someone has already pointed out several other meanings
    >>> for * in C. Why wasn't "@" used for indirection in C?

    >>
    >> You'd have to aski K&R to be sure, but my guess is that it is one of:

    > ^^^^
    >(Pun intended? ;)
    >
    > My first guess was that the '@' sign wasn't yet in popular use on
    >computer systems in those days.


    Nope. This is an incorrect guess :).

    > ... But BCPL did use ASCII, and ASCII has
    >always had the '@', right? So that's a good question --- why didn't
    >C use '@' for /anything/? It uses practically every other glyph, except
    >the backquote ` and the dollar sign $, and the latter is an
    >understandable omission.


    An @ is 100 in octal and 40 in SIXBIT octal. IOW the high order
    bit of an octal number.
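
    (A quick check of those numbers, assuming the usual description of
    DEC SIXBIT as "ASCII minus 040":)

        #include <stdio.h>

        int main(void)
        {
            unsigned ascii  = '@';          /* 0x40, i.e. 0100 octal      */
            unsigned sixbit = ascii - 040;  /* SIXBIT code under the
                                               ASCII-minus-040 mapping    */
            printf("ASCII:  %#o\n", ascii);  /* prints 0100 */
            printf("SIXBIT: %#o\n", sixbit); /* prints 040  */
            return 0;
        }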

    /BAH

    >
    >> 1. The common usage of * in many assembler for indirect addressing
    >>
    >> 2. Perhaps BCPL used * for indirect addressing

    >
    > AFAICT, indirection in BCPL was accomplished with the 'V*' notation
    >for vector (array) indexing and the 'rv L' notation for "getting an rvalue
    >from an lvalue"; 'rv V' and 'V*[0]' were equivalent.
    >
    > I thought I read somewhere that one of C's ancestors didn't have a
    >special symbol for indirection, but used some alphabetic token for it,
    >such as 'deref' or something similar. Is that a garbled memory of BCPL,
    >or does anyone have a better idea why I might have thought that?
    >
    >-Arthur


    Subtract a hundred and four for e-mail.
     
    , Dec 19, 2004
    #5
  6. In article <>, "Arthur J. O'Dwyer" <> writes:
    >
    > [xposted and fups set to clc]


    alt.folklore.computers might have been a better bet, since it deals
    with computing history in general, but I'll leave this on c.l.c.

    > On Fri, 17 Dec 2004, Shmuel (Seymour J.) Metz wrote:
    > >
    > > (Herman Rubin) said:
    > >> Someone has already pointed out several other meanings
    > >> for * in C. Why wasn't "@" used for indirection in C?

    > >
    > > You'd have to aski K&R to be sure, but my guess is that it is one of:

    >
    > My first guess was that the '@' sign wasn't yet in popular use on
    > computer systems in those days. But BCPL did use ASCII, and ASCII has
    > always had the '@', right?


    The "commercial at sign" (@) was in ASCII in 1964, anyway.[1]

    The 1963 CCITT report on adopting the ISO 7-bit code lists commercial-
    at as one of 13 symbols required due to "Additional Demand for Data
    Processing", which suggests to me that it *was* "in popular use on
    computer systems". The full list in that category in that report is
    !, #, ", &, *, @, <, >, [, ], \, and the up and left arrows.[2] The
    last two were removed by ISO to make room for some of the diacritical
    marks required by CCITT, including the grave accent, which is the
    "backquote" character used by various Unix shells (though, like
    commercial-at, not employed by C).[3]

    Interestingly, the commercial-at sign was listed by the CCITT report
    along with up- and left-arrow as a "soft" character in the ISO set
    eligible for replacement (see 5.ii in [2]). Nonetheless it survived
    the CCITT-inspired changes to the ISO set, though in the December
    1963 draft it's marked as one of the positions that might be replaced
    with an accented letter.[4]

    I don't know just how much the ISO 7-bit code influenced the choice
    of ASCII characters, but they're clearly closely related.

    > So that's a good question --- why didn't C use '@' for /anything/?


    Didn't Herman already answer this? According to him, the "@"
    character was the deletion command for the editor Ritchie used at
    the time. Would that have been QED?[5]

    I don't know anything about QED, aside from the brief description I
    cited, because I'm too lazy to read all of [6], but I note it says:

    Certain characters are impossible to generate on certain devices,
    and the commercial-at sign "@" cannot be input to TSS at all.

    (Note this document was written by Ritchie and Thompson.) C wasn't
    developed on TSS, but if it was developed using QED, that seems to
    cast some doubt on Herman's claim. But maybe Ritchie wanted to allow
    for a future C port to TSS, or was simply avoiding the at-sign
    because he knew that it was unavailable on at least one prominent OS.

    All in all, though, I have to say that this seems to me one of the
    sillier complaints I've read about C. Why not complain that it
    doesn't use "/\" for logical-AND and "\/" for logical-OR, while
    we're at it? That's why we have the backslash in ASCII, after all.


    1. http://www.cl.cam.ac.uk/~mgk25/ucs/ascii-history/usa.html
    2. http://www.cl.cam.ac.uk/~mgk25/ucs/ascii-history/ccit.html
    3. http://www.cl.cam.ac.uk/~mgk25/ucs/ascii-history/README
    4. http://www.cl.cam.ac.uk/~mgk25/ucs/ascii-history/draft.html
    5. http://www.multicians.org/mgq.html#qed
    6. http://cm.bell-labs.com/cm/cs/who/dmr/qedman.html

    --
    Michael Wojcik

    Not the author (with K.Ravichandran and T.Rick Fletcher) of "Mode specific
    chemistry of HS + N{_2}O(n,1,0) using stimulated Raman excitation".
     
    Michael Wojcik, Dec 19, 2004
    #6
  7. Re: [OT] Why (or not) use single letter variable names?

    Joona I Palaste <> wrote:

    > Computers have had @ signs on their keyboards at least since the late
    > 1970's. My first computer was a Commodore 64, with the @ key quite
    > clearly present.


    > Actually I wish something other than @ had been chosen as the separator
    > in e-mail addresses.


    It's actually quite suited to that purpose, because it's pronounced
    "at". If you read my email address as "rlb at hoekstra-uitgeverij.nl"
    you have exactly what it means: the mailbox of rlb (that's me, Richard
    L. Bos), at that mail server.

    > It has now more-or-less become the universal
    > symbol for the Internet. It keeps popping up in logos and advertisements
    > everywhere the Internet is mentioned. If a more common character had been
    > chosen, there would be no single character representing the Internet.


    There isn't, but that doesn't stop e-marketeers from e-calling
    e-everything "e-myproduct".

    Richard
     
    Richard Bos, Dec 20, 2004
    #7
  8. Re: [OT] Why (or not) use single letter variable names?

    On Mon, 20 Dec 2004, Richard Bos wrote:
    >
    > Joona I Palaste <> wrote:
    >> Computers have had @ signs on their keyboards at least since the late
    >> 1970's. My first computer was a Commodore 64, with the @ key quite
    >> clearly present.

    >
    >> Actually I wish something other than @ had been chosen as the separator
    >> in e-mail addresses.

    >
    > It's actually quite suited to that purpose, because it's pronounced
    > "at".


    Only in English and in languages which have borrowed the word from
    English, apparently. While looking for answers to whether '@' was in
    ASCII in 1967 (it was), I came across the sci.lang.translation thread
    http://groups.google.co.uk/groups?threadm=
    which claims that the '@' sign originated in the unit of weight "arroba,"
    which is one pronunciation still extant in French, Spanish, and
    Portuguese, while a lot of other countries' native pronunciations are
    based on the glyph's shape: "little snail" in Italy, etc.
    (I don't want to start another gigantic OT thread sharing how you or
    your countrymen pronounce '@'; see the sci.lang.translation thread if
    you want that. :)

    > If you read my email address as "rlb at hoekstra-uitgeverij.nl"
    > you have exactly what it means: the mailbox of rlb (that's me, Richard
    > L. Bos), at that mail server.
    >
    >> It has now more-or-less become the universal
    >> symbol for the Internet. It keeps popping up in logos and advertisements
    >> everywhere the Internet is mentioned. If a more common character had been
    >> chosen, there would be no single character representing the Internet.

    >
    > There isn't, but that doesn't stop e-marketeers from e-calling
    > e-everything "e-myproduct".


    iDontKnowWhatYouMean, really iDont!

    -iArthur
     
    Arthur J. O'Dwyer, Dec 20, 2004
    #8
