How to convert Infix notation to postfix notation

Discussion in 'C Programming' started by Tameem, Oct 26, 2009.

  1. Tameem

    spinoza1111 Guest

    Great book, praised for its style: A Customer at Amazon writes: "This
    book contains a huge amount of code that has obviously been under
    development and evolving for a long time. The code has a level of
    documentation, error checking and self-consistency testing that is
    rare to see even in commercial code, much less sample code for a
    book."

    The book has been selling consistently for five years ranking between
    750,000 and 250,000. Often when I've randomly checked its numbers, it
    is mentioned in the top 20 and sometimes top ten Compilers books: once
    it was adjacent to Aho Sethi Ullman (I'm not worthy! I'm not worthy!).
    I've been banking royalty checks for years on this book despite the
    fact that the Amazon site was deliberately spammed by people here with
    negative reviews.

    [Gee how is Richard Heathfield's famous C Unleashed doing? Hmm 984000
    and change rankwise. Also, it appears to have gone out of print, since
    I cannot order it through Amazon. Of course, it was published in 2000,
    and I may myself be out of print in 2013: all flesh is grass.]

    But in recent years, owing to the precedent set by Seebach wrt
    Schildt, Twerps Without Standing have gotten around to analyzing my
    code for sins of omission and commission. Some genuine problems that
    the Twerps have noticed include the use of weakly typed Collections.

    The original .Net and pre-.Net collections contained Objects, and you
    could add any sort of garbage to them. Their precedent in Visual Basic
    was the Variant, which could contain "anything" and caused as much
    damage as, if not more than, the void pointer of C.

    But what bothered the Twerps was the extra code for inspect(), which
    rummaged through Collections making sure that each Object in the
    traditional Collections was in fact of the proper type; furthermore,
    if this type was itself other than what I called scalar in the book
    (a basic type, suitable for being placed as-is on the Stack), the
    object was subject to its own inspect().
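
    For what it's worth, here is a rough C analogue of that runtime
    checking, since this is a C group. It is not the book's code (which
    is not written in C), and all the names here (node, TAG_INT,
    inspect) are made up for illustration:

        /* Rough C analogue of the runtime type checking described above.
           Not the book's code; all names are made up for illustration. */
        #include <stdio.h>

        enum tag { TAG_INT, TAG_DOUBLE, TAG_LIST };   /* scalar vs. composite */

        struct node {
            enum tag tag;
            union {
                int          i;
                double       d;
                struct list *list;        /* non-scalar: inspect recursively */
            } u;
        };

        struct list {
            size_t       count;
            struct node *items;
        };

        /* Return 1 if every element carries a known tag; recurse into
           sub-lists so that composite members get their own inspection. */
        static int inspect(const struct list *lst)
        {
            for (size_t k = 0; k < lst->count; k++) {
                switch (lst->items[k].tag) {
                case TAG_INT:
                case TAG_DOUBLE:
                    break;                /* scalar: nothing more to check */
                case TAG_LIST:
                    if (!inspect(lst->items[k].u.list))
                        return 0;
                    break;
                default:
                    fprintf(stderr, "item %zu: unknown tag\n", k);
                    return 0;
                }
            }
            return 1;
        }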

    The Twerps howl on the site that I should have used Generics. The
    problem is that the code was written in 2002, when there were no
    Generics: these date in their full glory from 2005, and they are
    great: they allow you to declare a collection of objects of type foo
    as Collection<foo>.

    The book is in its humble way helping people learn. This bothers only
    Twerps and gives me a good feeling.
     
    spinoza1111, Nov 12, 2009

  2. Tameem

    spinoza1111 Guest

     
    spinoza1111, Nov 12, 2009

  3. Tameem

    spinoza1111 Guest

    Fair and Balanced, Peter. Thanks. Some of the reviewers hate it but
    some like it. It's a cult classic, like Plan Nine from Outer Space.
     
    spinoza1111, Nov 12, 2009
  4. Tameem

    spinoza1111 Guest

    Wow, pretty impressive assembler. I can confirm hastily (writing from
    Starbuck's) that my code is left-associative. But don't use its
    results as a benchmark or an authority. The code is a work in
    progress.
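
    For anyone following the thread title, here is a rough C sketch of
    left-associative infix-to-postfix conversion (the classic
    shunting-yard idea). It is not my actual code: it handles only
    single-character operands, the operators + - * / and parentheses,
    and does no error checking.

        /* Rough sketch of left-associative infix-to-postfix conversion.
           Handles single-character operands, + - * / and parentheses only. */
        #include <ctype.h>
        #include <stdio.h>

        static int prec(char op)              /* higher number binds tighter */
        {
            switch (op) {
            case '+': case '-': return 1;
            case '*': case '/': return 2;
            default:            return 0;     /* '(' and anything unknown */
            }
        }

        static void to_postfix(const char *infix, char *out)
        {
            char stack[128];
            int  top = 0;

            for (; *infix; infix++) {
                char c = *infix;
                if (isspace((unsigned char)c)) {
                    continue;
                } else if (isalnum((unsigned char)c)) {
                    *out++ = c;               /* operands pass straight through */
                } else if (c == '(') {
                    stack[top++] = c;
                } else if (c == ')') {
                    while (top > 0 && stack[top - 1] != '(')
                        *out++ = stack[--top];
                    if (top > 0) top--;       /* discard the '(' */
                } else {
                    /* Left associativity: pop operators of EQUAL or higher
                       precedence before pushing, so "a-b-c" yields "ab-c-". */
                    while (top > 0 && prec(stack[top - 1]) >= prec(c))
                        *out++ = stack[--top];
                    stack[top++] = c;
                }
            }
            while (top > 0)
                *out++ = stack[--top];
            *out = '\0';
        }

        int main(void)
        {
            char out[256];
            to_postfix("a-b-c*(d+e)", out);
            printf("%s\n", out);              /* prints "ab-cde+*-" */
            return 0;
        }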
     
    spinoza1111, Nov 12, 2009
  5. Tameem

    Seebs Guest

    Me too.
    http://www.embeddedarm.com/software/arm-netbsd-toaster.php

    An actual, literal, toaster, with a working operating system, accepting
    keyboard input.

    Anyway, here's the thing. The reason I consider those embedded things
    "computers" is this: When I was a little kid, I used to go up to the
    college my dad worked at, sit on a phonebook, and play rogue on a large
    system shared by many users.

    I feel comfortable asserting that any device which has a faster CPU and
    more memory than those devices ought to be counted as a "computer". It's
    a thing which processes digital signals in a fairly general way, using
    memory to store both code and data, it takes inputs, performs computations,
    and produces outputs... It's a computer. That the inputs aren't mouse
    clicks or keystrokes doesn't change that.

    Note that ordinary toasters aren't computers; they're purely mechanical
    gizmos with no real control logic. On the other hand, the toaster oven
    I got a while back definitely contains a computer. Not a very advanced
    one, sure, but it's a computer. It is entirely conceivable that the
    code which decides what to do based on button presses was compiled by a
    C compiler.

    -s
     
    Seebs, Nov 12, 2009
  6. Tameem

    spinoza1111 Guest

    Let me clarify what happened lest this THUG again obscure the truth.

    I wrote one positive review under a pseudonym when persuaded to do so
    by a coworker up in China. My main review under my own name gave
    myself four stars because I didn't have enough time to cover object
    code generation in full.

    I later asked Amazon to withdraw the pseudonymous review, having
    decided that it was only venially dishonest, but dishonest
    nonetheless, to post this review, even though (according to an
    article in the New Yorker) several well-known authors who are
    otherwise ethical have done this: post favorable reviews of their own
    work from pseudonymous accounts.

    These authors do so because the Internet gives a false authority to
    people without standing (such as Peter Seebach and the attackers of
    Java author Kathy Sierra) and they need to defend their livelihoods.

    My Chinese coworker felt I was justified because he saw the deliberate
    spams at the booksite. However, I feel that it is best to verbally
    self-defend and not be dishonest.

    Richard Heathfield is consistently dishonest, especially as regards
    other people's qualifications and their "errors", which he delights in
    exaggerating and in fabricating. For example, Heathfield lied when he
    claimed that it took me any time at all to see the so-called "bug" of
    relying on Microsoft file case insensitivity, and Heathfield lied when
    he claimed that "not using const" was a bug.

    Heathfield appropriated the first "error" from another's research, and
    he found the second by mindlessly running a compiler with rich
    warnings in effect. His modus operandi is to break into discussions
    where he is not wanted, and then to steal others' bug reports, making
    them into global claims about the competence of people Heathfield
    targets. This behavior, I am confident, is going to land him in a UK
    courtroom which takes a dim view of civil and criminal libel.

    I hope that this clears this matter up.
     
    spinoza1111, Nov 12, 2009
  7. Tameem

    spinoza1111 Guest

    Then the New York Times deals in "non-sequiturs"? My source:

    http://www.nytimes.com/2002/08/14/obituaries/14NYGA.html?scp=7&sq=stroustrup&st=cse

    [Article reproduced in full below]

    For little clerks who aren't really programmers, it's in effect a "non-
    sequitur" to so much as name something so ... scary ... as a labor
    union, but the New York Times confirms that the origin of C++ lies in
    Stroustrup's participation in Simula, and Simula was developed so that
    little "programmers" couldn't deceive union members and workers, and
    install some C-like crap in factories that endangered their lives or
    livelihoods.
    **** you, asshole: **** you very, very much. And, you're lying.

    Kristen Nygaard, Who Built Framework for Computer Languages, Dies at
    75
    By JOHN MARKOFF
    Published: Wednesday, August 14, 2002
    Kristen Nygaard, a
    Norwegian mathematician who laid the groundwork for modern computer
    programming languages and who helped Scandinavian workers influence
    the design of labor-saving computer technologies, died on Saturday in
    Oslo, Norway. He was 75.

    The cause of death was a heart attack, said Ole Lehrmann Madsen, a
    friend and colleague at Aarhus University in Denmark.

    From 1962 to 1967, with his co-worker Ole-Johan Dahl, Mr. Nygaard
    designed Simula, a programming language intended to simulate complex
    real-world systems. The ideas underlying Simula emerged from Mr.
    Nygaard's work in the area of operations research while he was
    employed at the Norwegian Defense Research Establishment from 1948 to
    1960.

    Although the original use for Simula was a physics simulation for a
    military laboratory, workers at the Norwegian Iron and Metal Union
    approached Mr. Nygaard in the late 1960's with concerns about
    computers displacing and altering their jobs. Mr. Nygaard began
    working with them, pioneering an approach that became known as
    participatory design, in which workers help design new technologies in
    the workplace.

    "It was originally thought of as a socialistic movement," said Dr.
    Madsen, who worked with Mr. Nygaard over several decades. "However,
    eventually large corporations began to realize this was a reasonable
    practice and it is widely used around the globe today."

    Simula was significant for pioneering the concept of "object oriented"
    programming. Before Simula, computer programs were thought of in terms
    of software instructions and data. Simula introduced the idea of
    objects, or modules, and classes of objects. Such object-based
    programs made it easy for programmers to reuse software, thus
    dramatically increasing productivity and efficiency.

    "He understood that simulation was the ultimate application of
    computers," said Larry Tesler, a computer scientist who has worked at
    the Xerox Corporation and Apple Computer . "It was a brilliant
    stroke."

    Simula would ultimately influence the designers of a wide range of
    programming languages, including Smalltalk, C++, and Java, and it
    would leave a deep impression on the personal computer world as well,
    influencing the designers of both the Macintosh and Windows operating
    systems.

    As a graduate student at the University of Utah, the computer
    scientist Alan Kay became familiar with Simula, which was to become
    one of the principal influences on Smalltalk, an object-oriented
    programming language he developed with a small group of programmers at
    Xerox's Palo Alto Research Center in the early 1970's. Because Simula
    permitted the creation of classes of objects and permitted
    "inheritance," in which all the objects of a class could automatically
    take on certain attributes, it led Dr. Kay to begin thinking in
    biological terms. He conceived of software in a framework where
    complex processes could emerge from simple building blocks.

    The Palo Alto research in turn influenced a generation of computer
    designers at both Apple Computer and the Microsoft Corporation in the
    early 1980's, when the modern personal computer was taking form.

    Several years after Dr. Kay discovered Simula, Bjarne Stroustrup, a
    Danish programmer who studied at Cambridge and who would later become
    a software designer at Bell Laboratories, also encountered the
    language. Like Dr. Kay, he would be influenced by the idea of software
    objects, and he would build that concept into his widely influential
    C++ programming language.

    Kristen Nygaard was born on Aug. 27, 1926, in Oslo. He received his
    master's degree in mathematics at the University of Oslo in 1956.

    He taught in both Denmark and at the University of Oslo, where he was
    a professor until he retired in 1996.

    In the 1970's, Mr. Nygaard's research interests increasingly turned to
    the impact of technology on the labor movement, and he became involved
    in other political, social and environmental issues. He was the first
    chairman of the environment protection committee of the Norwegian
    Association for the Protection of Nature. He was also the Norwegian
    representative for the Organization for Economic Cooperation and
    Development's activities on information technology.

    He also helped run an experimental program to create humane living
    conditions for alcoholics.

    In the mid 1960's he became a member of the National Executive
    Committee of the Norwegian party Venstre, a left-wing non-socialist
    party, and chairman of that party's strategy committee. In 1988 he
    became chairman of a group that successfully opposed Norway's
    membership in the European Union.

    This year, with Ole-Johan Dahl, Mr. Nygaard shared both the
    Association for Computing Machinery's Turing Award and the Institute
    of Electrical and Electronics Engineers' von Neumann Medal. In 1990
    the Computer Professionals for Social Responsibility awarded him the
    Norbert Wiener Prize.

    He is survived by his wife, Johanna Nygaard, three children and seven
    grandchildren.
     
    spinoza1111, Nov 12, 2009
  8. It is nevertheless a widely used term

    http://welcome.hp.com/country/us/en/prodserv/servers.html

    (I could equally have posted a Dell, Intel, or half a dozen other
    hardware manufacturers' pages.) Wikipedia agrees with you though :)

    we need a term for "heavy duty back office hardware" and mainframe is
    out of fashion
     
    Nick Keighley, Nov 12, 2009
  9. Tameem

    spinoza1111 Guest

    This makes NO sense, since the question isn't whether Microsoft
    invented C (it didn't). It is whether knowledge of Microsoft platforms
    is required for true C expertise, and the large amount of C code
    running on Microsoft platforms means it is.

    For example, a true C expert knows that a C program, given C's access
    to low-level, machine and OS dependent facilities, cannot be
    mindlessly ported. Suppose he's given the task of porting a Microsoft
    C program to a non-Microsoft platform.

    One of the first issues on his to-do list will be to check all file
    identifiers identified in the code for case, and to make sure this
    matches the case in use in the non-Microsoft environment! This is
    because he knows that sensitivity to case differs across "the two
    cultures".

    He will also check newline usage for the difference between the
    (traditional and being phased out) Microsoft use of carriage return,
    line feed to mark line boundaries in many files, and the use of
    linefeed in other platforms.
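
    A rough sketch of both checks in C, for concreteness. The helper
    names here are mine and hypothetical, and "example.txt" is a made-up
    input file; this is illustrative, not a porting tool.

        /* Two small porting aids, sketched for illustration only. */
        #include <ctype.h>
        #include <stdio.h>
        #include <string.h>

        /* Case-insensitive comparison of two file names (ASCII only):
           on a case-insensitive filesystem these would name the same file. */
        static int names_collide(const char *a, const char *b)
        {
            while (*a && *b) {
                if (tolower((unsigned char)*a) != tolower((unsigned char)*b))
                    return 0;
                a++, b++;
            }
            return *a == *b;                  /* both names ended together */
        }

        /* Trim a trailing CR/LF or LF in place, so the same parsing code
           copes with lines written on either platform. */
        static void chomp(char *line)
        {
            size_t n = strlen(line);
            while (n > 0 && (line[n - 1] == '\n' || line[n - 1] == '\r'))
                line[--n] = '\0';
        }

        int main(void)
        {
            char buf[256];

            printf("%d\n", names_collide("STDIO.H", "stdio.h"));  /* 1 */

            FILE *fp = fopen("example.txt", "r");   /* hypothetical file */
            if (fp) {
                while (fgets(buf, sizeof buf, fp)) {
                    chomp(buf);
                    puts(buf);
                }
                fclose(fp);
            }
            return 0;
        }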

    He won't have time to whine about Microsoft in some fashionable
    register. He will also buy the 4th edition of Schildt's "C: The
    Complete Reference" to find Microsoft-centric issues for follow up.

    This has an interesting result: it means that the C expert of clc in
    terms of crossing the culture divide is Edward G. Nilges. From 1981
    through 2005 I worked in multivendor environments across this divide,
    and I've also crossed the line into the IBM mainframe world.

    Or, more precisely, I am well on the way IF I persist in coding C on
    the ferry, and ramp back up to a former level of knowledge. This may
    not happen because when I code C I have an urge to kick small animals
    and break up Salvation Army meetings; so much of C is an insult to
    intelligence.
     
    spinoza1111, Nov 12, 2009
  10. Tameem

    spinoza1111 Guest

    Are you dumb, stupid, retarded or brain damaged?

    Prior to the first Microsoft compiler there were C experts such as
    dmr. But to be a C expert today, you need to know the Microsoft
    platform differences.

    I'm using "expert" in the normative, not descriptive sense. That is,
    I'm not talking about C programmers who "think" they are experts. Nor
    am I talking about C programmers who are regarded as experts by some
    workgroup, or within this dysfunctional community clc. I mean someone
    who is genuinely knowledgeable about a wide spectrum of applications.

    And I'm saying that people with fashionable hatreds and shibboleths
    aren't even emotionally mature grownups. Rather, with jobs that
    constitute welfare for white males, they believe that they can
    gerrymander and redefine reality so as to make themselves experts in a
    sufficiently well-defined game.

    I'm not even sure that dmr is truly an expert...any more. And, Brian
    Kernighan's remarks in Beautiful Code about the putative superiority
    of C to Java based on "efficiency" commits, as far as I'm concerned,
    one of the oldest sins in the computing book: the prizing of
    efficiency over correctness and elegance.

    If one insists that a field be artificially bounded so that one is an
    authority, one lacks expertise.
     
    spinoza1111, Nov 12, 2009
  11. Tameem

    Seebs Guest

    Hey, I did that too!

    .... but not that.
    Whereas I thought it would be hilarious, so I submitted a review under
    the title "Best book I ever wrote." I feel that any concerns about
    full disclosure are adequately addressed by the opening paragraph:

    Having heard that authors frequently review their own books,
    I thought I'd give it a try. This is, without a doubt, the
    best book on portable shell scripting I have ever written.
    Sadly, it is also the worst book on portable shell scripting
    I have ever written.

    -s
     
    Seebs, Nov 12, 2009
  12. Tameem

    spinoza1111 Guest

    You don't see where this logic is leading. If it's possible for the
    "expert" to define his area of competence, here to ignore C-on-
    Microsoft and case insensitive file ids, then he has no standing in
    criticising a book which as a practical proposition was about C on the
    most common platform.

    This was Seebach, who in his Vitriolic Tirade did not include a
    disclaimer about his expertise, but instead assumed, through apparent
    ignorance, that all platforms will penalize a programmer who uses the
    wrong case pattern.
     
    spinoza1111, Nov 12, 2009
  13. "to hold"

    Not to access. C can't access bytes, except for the null character. C
    can access characters as part of a string. I.e., a byte can be larger than
    a char in size, but a string in the C context isn't comprised of bytes but
    of characters - characters being the portion of a byte the C context can
    recognize.


    Rod Pemberton
     
    Rod Pemberton, Nov 12, 2009
  14. No, a byte cannot be larger than a char in size. sizeof(char) == 1
    (byte) *by definition*. Type unsigned char cannot have padding bits;
    an object of type unsigned char can have one of exactly 1<<CHAR_BIT
    distinct values. I think that plain char or signed char also cannot
    have padding bits.

    Certainly C can access bytes; just declare an unsigned char, or an
    array of unsigned char.
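
    A short illustration; the byte values printed depend on how the
    implementation represents double, so the output will vary:

        /* Examine the bytes of an object through an unsigned char pointer.
           sizeof(char) is 1 by definition; the byte values printed are
           implementation-specific. */
        #include <limits.h>
        #include <stdio.h>

        int main(void)
        {
            double d = 1.0;
            const unsigned char *p = (const unsigned char *)&d;

            printf("CHAR_BIT = %d, sizeof(char) = %zu\n", CHAR_BIT, sizeof(char));

            for (size_t i = 0; i < sizeof d; i++)
                printf("byte %zu: %u\n", i, (unsigned)p[i]);

            return 0;
        }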
     
    Keith Thompson, Nov 12, 2009
    that's just crap. If you can't read it back, in what sense does it
    "hold" something? This reminds me of the $ETHNIC_GROUP joke about WOM
    chips, Write Only Memory.
    yes it can. Everything you say is wrong. C can access bytes whether
    you use the C definition of a byte (equal to a char) or the common
    definition of a byte (8 bits), in which case a char may be *larger*
    than a byte.
    never ever ever. Unless you are using pre-1980 bytes? 6-bit bytes?
    oh yes it is
    actually chars. Characters might be Unicode or something.
    rubbish
     
    Nick Keighley, Nov 13, 2009
  16. Tameem

    spinoza1111 Guest

    These questions are not rhetorical. I want a straight answer from you.
    Consider yourself on a witness stand, as you may yet find yourself on
    the hot seat.

    (1) At the time CTCR was first written, did the "standard" mandate
    that (for example) "stdio.h" and other common libraries be named all
    lower case?

    (2) Does the standard mandate this now?

    I think that owing to lack of representation of Microsoft C
    programmers on the standards bodies, a lack of representation caused
    in part by anti-Microsoft and anti-IBM snobbery, this issue was not
    faced.

    I don't think the original intent of K & R was to make file
    identifiers into Sacred Names. I do think that it's an instance of
    techno-barbarism and a Negative Dialectic that they have become
    sacred. That is, technical progress up until circa 1980 created free-
    thinking human beings able to communicate civilly (for example in the
    old structured walkthrough) *sans peur et sans reproche*, giving mere
    technicians of the time aristocratic privileges of mutual respect (an
    early book describing the rise of Bell Northern Research along the
    lines of the more famous "Soul of a New Machine" was called "Knights
    of the New Technology").

    But under the pressure of the Reagan-Thatcher re-assertion of the
    rights of money and property, these same technicians *und seine
    Kinder* (and their children) found they had, willy-nilly, to erect a
    defensive laager to protect their status, much as cities in Germany
    in the middle ages needed charters of liberty to protect them against
    aristocratic thugs.

    At the gates, the guards ask all who would enter how many bits are in
    an ASCII code and they slay them if they reply with any number other
    than seven.

    This is bullshit, because it sacrifices communication and clarity-as-
    understanding for membership in a fear-ridden guild whose members,
    unlike the guild members of the middle ages, cannot even protect their
    jobs.

    Thus, stdio.h and other ridiculous (too short for starters) names
    become standards in a way that violates our need for elegance. I think
    that a younger Kernighan meant there to be many Cs all with different
    libraries for different tasks on the model of a later "kernel OS" but
    this openness and flexibility was destroyed ... in part by the fact
    that entry (or, in the case of unemployment, re-entry) to the laager
    is controlled not only by shibboleth but also by head hunters who
    don't know C but are nonetheless delegated to ask whether the
    applicant or re-applicant knows C.
     
    spinoza1111, Nov 16, 2009
  17. Tameem

    Walter Banks Guest

    I guess that is why Microsoft hosted two of the WG14
    standards meetings within the last few years.
     
    Walter Banks, Nov 16, 2009
  18. In any case, the idea that Microsoft would be absent from a standards
    body because of "anti-Microsoft snobbery" is ridiculous. Microsoft's
    participation in standardisation is decided entirely on the basis
    of whether they consider it to their commercial advantage.

    -- Richard
     
    Richard Tobin, Nov 16, 2009
  19. Tameem

    spinoza1111 Guest

    Ok, if the Standard does not require that the names be in lower case,
    what possible basis could there have been for Seebach to count this as
    an error?

    From CTCN:

    "Page 284
    All of the header files are listed in capitals; the standard specifies
    them in lower case. It is not required that a C compiler reject all-
    caps, but nor is it required that it accept them."

    Your claim that the standard "grants explicit permission to ignore
    case" is consistent with Seebach's "not required to accept all-caps",
    but what's really bothering Seebach, and which he falsely implies is
    an error, is Herb's use of upper case.

    Herb's use of upper case was NOT A BUG and not in violation of the
    standard, yet it was listed AS AN ERROR in a document which listed
    20 "known" errors, this nonerror being included.

    His failure is only a failure to pronounce the name of an ear of corn
    correctly in a barbaric world: to use a shibboleth.

    Had Seebach had his document reviewed properly by McGraw Hill, this
    error, which is HIS error and NOT Schildt's, would have been caught.
    Had he admitted it, Herb and McGraw Hill might very well have changed
    the names to the Unix-preferred lower case in the 2nd edition.

    But instead, Seebach's unrefereed and dishonest document went viral
    and grievously harmed Schildt.

    Again: an apology and a withdrawal of the document is in order.
    This is irrelevant, since Herb's platform was not case-sensitive, and
    he made no error, since on his platform, the mapping exists. To
    include it as an error was a falsehood, and libelous in the twofold
    sense of causing harm and meaning to.
    They appear to have been about as represented as was Herb, who was on
    the C99 committee. It appears to me that both committees were
    dominated by greedy non-Microsoft vendors whose main concern was that
    the committee members "standardize" an existing mess by pronouncing
    dozens of practices "undefined", here the effect of using upper case
    file identifiers. Again, I believe that this was done since non-
    Microsoft vendors didn't want to change their compilers, having laid
    off their competent people.
     
    spinoza1111, Nov 16, 2009
  20. http://www.open-std.org/jtc1/sc22/wg14/www/docs/n805.htm

    I don't know how you define being on a committee, but a membership
    list is evidence of some form of being "on the committee".
     
    Ben Bacarisse, Nov 16, 2009
