Discussion in 'C++' started by Jonathan Lee, Jul 7, 2009.

  1. Jonathan Lee

    Jonathan Lee Guest

    Hello all,
    To be a good little coder I want to ensure all of my functions pass
    a checklist of "robustness". To keep things simple, I want to document
    each function with a string that will indicate which of the checklist
    items the function has been audited for. Something like

        // audit: JMRT

    which would show that items J, M, R, and T have been checked. Off the
    top of my head I came up with the list below. I wonder if anyone has
    items they think should be added to the list. Any advice welcome,


    Audit list (an implicit "where applicable" should be assumed)
    A - Arguments checked against domain
    B - Arrays have bounded access
    C - No C-style casts; other casts as appropriate, but avoided where possible
    D - No #define's - use static const, enum, or function
    E - Exception safe
    F - Floating point comparisons are safe (e.g., don't check against 0.0)
    I - Use initialization lists in constructors
    L - Loops always terminate
    M - Const qualify member functions that need it
    N - "new" memory is not leaked, esp., in light of exceptions
    O - Integer overflow is checked or avoided
    P - Wrap non-portable code in "#if"s and warn user with #else
    Q - Const qualify object arguments
    R - Reentrant
    T - Thread safe
    V - Virtual destructor
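
    As an illustration of the scheme (the function and the audit comment are a
    made-up sketch, not from any real project), a function might carry its
    audit string like this:

    ```cpp
    #include <cstddef>
    #include <stdexcept>

    // audit: A B L Q  (arguments checked, bounded access, loop terminates,
    //                  object argument const-qualified)
    double mean(const double* data, std::size_t n) {
        if (data == 0 || n == 0)                // A: arguments checked against domain
            throw std::invalid_argument("mean: empty input");
        double sum = 0.0;
        for (std::size_t i = 0; i < n; ++i)     // L: bounded by n, always terminates
            sum += data[i];                     // B: i < n guarantees in-bounds access
        return sum / static_cast<double>(n);
    }
    ```

    The letters are cheap to grep for, and a missing letter tells a reviewer
    exactly which check has not been done yet.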
    Jonathan Lee, Jul 7, 2009

  2. Jonathan Lee

    Jonathan Lee Guest

    Well, they don't really. I'm just breaking up my auditing function by
    function. I don't always have the time to audit the whole module. But
    if all of the functions pass 'D', for example, then the module pretty
    much will, too.

    Admittedly, 'V' doesn't really apply to any function but the
    destructor "function". Maybe I can make a module checklist, too.
    Understood. I don't mean the list to *insist* on anything -- just that
    I should check it. If it turns out I don't need a virtual destructor,
    I won't use one. But I should look.
    As with 'V', I think it's good practice to check if these apply. Then
    act appropriately.
    Heh. Vague, I know. But I mean: if the class is going to be accessed
    by multiple threads, will that "be a problem"? Do I need to guard RW
    access to data with mutexes, for example? I think it will depend a
    lot on context.

    Thanks for your input, Victor.

    Jonathan Lee, Jul 7, 2009

  3. There's nothing intrinsically "unsafe" about comparing floating-point
    values with 0.0, if that's what your algorithm requires. What's unsafe
    is programming floating-point arithmetic if you don't understand the
    floating-point data model or the algorithm.
    In a _function_?
    Richard Herring, Jul 7, 2009
  4. Jonathan Lee

    Jonathan Lee Guest

    Perhaps I should extend it to say: ensure floating point is rigorous.
    Consider normalizing a vector. The naive way does a bad job of this:

    void normalize(double v[3]) { // An example
        double x = sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
        v[0] /= x;
        v[1] /= x;
        v[2] /= x;
    }

    I don't want to overlook code like this, which has flaws that are less
    obvious than the "if (x == 0.0){}" construction.
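
    For contrast, a sketch of a more careful version (not from the thread):
    hypot() scales internally, so the squared terms can neither overflow nor
    underflow the way v[0]*v[0] can, and a zero-length vector is left
    untouched rather than dividing by zero.

    ```cpp
    #include <cmath>

    // A more robust normalize. std::hypot is C++11; plain hypot() from C99
    // works the same way on older toolchains.
    void normalize_careful(double v[3]) {
        double x = std::hypot(std::hypot(v[0], v[1]), v[2]);
        if (x == 0.0)      // comparing against 0.0 is fine here: only an
            return;        // exactly zero norm makes the division meaningless
        v[0] /= x;
        v[1] /= x;
        v[2] /= x;
    }
    ```

    With inputs like {3e200, 4e200, 0}, the naive version overflows to
    infinity in the squares; this one still yields {0.6, 0.8, 0}.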

    Jonathan Lee, Jul 7, 2009
  5. It's impractical to double-check every function for all of these. It
    will slow down development too much. Someone else will be able to
    produce equally robust code in fewer man-hours. If you must make rigid
    rules for yourself, then you need to apply them as you write the
    code. The best cooks don't weigh ingredients to the microgram.
    Andrew Tomazos, Jul 7, 2009
  6. Jonathan Lee

    Jonathan Lee Guest

    In order to write better code in the future, I would need to know what
    my mistakes in the past have been, no?

    Jonathan Lee, Jul 7, 2009
  7. You have to distinguish between two different kinds of mistakes.
    There are logical bugs which result in faulty behavior of the
    program. The "rule list" will only protect you from a tiny fraction
    of them. The time spent in testing/debugging is far less than the
    cost of doggedly applying these rules. Every time you find a logic bug
    in debugging, spend time thinking about the root cause. Most of the
    time you'll find that one of the rules wouldn't have saved you.

    The other kind of mistake is "bad design". Bad design is subjective.
    It is like talking about good and bad art. It is as much to do with
    the reader of the code as it is to do with the writer. You have to
    negotiate and form an agreement with the people that will read your
    code what the definition of "good design" is. Programmers will not
    agree on this, anymore than people agree on what constitutes good art.

    Furthermore, the rule list you have presented is in fact only a set of rough
    guidelines. There are exceptions to the majority of your list.

    For example, the one about using inline functions rather than
    #defines. Rather than just memorizing that rule, understand the
    mechanics of the preprocessor, the mechanics of type/parameter binding
    and multipass text processing. Write a compiler for some toy language
    with a toy preprocessor. This is good programming practice, as well
    as understanding language design. Once you do that you won't need to
    think about the "rule", as you will have something better. You will
    never accidentally use a #define when you meant to use an inline
    function. As a note, there is no way to write...

    #define f1(x, y) f2(x, #x, x##y)

    ...with an inline function. In at least some cases a #define is
    preferable. There are similar exceptions to other rules in your list.
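
    To make the point concrete, here is a sketch (my own example, not from
    the post) of stringizing and token pasting, neither of which an inline
    function can express:

    ```cpp
    #include <string>

    // ## pastes tokens: MAKE_GETTER(counter) declares int get_counter().
    #define MAKE_GETTER(name) int get_##name() { return name; }

    // # stringizes its argument: SPELLING(counter) becomes "counter".
    #define SPELLING(x) #x

    int counter = 7;
    MAKE_GETTER(counter)   // expands to: int get_counter() { return counter; }
    ```

    An inline function receives only a value; it can never see the spelling
    of its argument or mint a new identifier from it.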
    Andrew Tomazos, Jul 7, 2009
  8. Jonathan Lee

    Jonathan Lee Guest

    Of course. The fact that *you think* that *I think* it is anything
    more than that is causing you to be very condescending toward me.
    Seriously, read that paragraph about bad design as if someone else
    had sent it to you.

    It's meant to be a simple mnemonic. You introduced the words "rule",
    "doggedly" and "rigid", but I never used them. See? Just a list.
    The letters even sorta match up with the item. Like "A" for
    argument, "B" for "bounds", "C" for "C-style cast", "D" for
    "define", "E" for "exceptions".

    Cute, no?

    Jonathan Lee, Jul 7, 2009
  9. It wasn't my intention to be condescending.
    The source of my belief that you were holding this rule list sacred
    was that you wanted to go through it for every function that you write.
    Andrew Tomazos, Jul 8, 2009
  10. James Kanze

    James Kanze Guest

    And there's nothing intrinsically "safe" about any of the alternatives.
    Which is generally (not just with regards to floating point) an
    important issue: did the author understand the techniques he
    used? Not too sure how to make that a check point, however :-).
    (One important point with regards to floating point: if any code
    uses floating point, one of the code reviewers should be an
    expert in numeric processing.)

    The only #if's that code should contain are include guards
    (which should normally automatically be inserted by the editor).
    Non-portable code belongs in a separate file, in a target
    dependent directory.

    For the rest, there is some value in having a list of
    checkpoints---in many cases, one could even arrange to check
    them automatically. But they won't cover everything, and more
    or less vague statements like "thread safe" don't belong in
    them. On the other hand, he seems to have missed one or two
    important ones:

    -- All variables are initialized before use (preferably in the declaration).

    -- Functions which return a value have a return statement.

    -- Don't return a reference or a pointer to a local variable or
    a temporary.

    (Good compilers will warn about these, sometimes incorrectly.)

    I'd also set a fairly low limit to the cyclomatic complexity
    (with an exception for functions which consist of a single
    switch with a lot of entries).
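
    The third checkpoint can be illustrated with a sketch (names are mine):

    ```cpp
    #include <string>

    // BAD: returns a reference to a local that is destroyed at the closing
    // brace; any use of the result is undefined behaviour. g++ warns here.
    // const std::string& label_bad() {
    //     std::string s = "local";
    //     return s;              // dangling reference
    // }

    // OK: return by value; the caller gets its own copy (or a move).
    std::string label_ok() {
        std::string s = "local";
        return s;
    }
    ```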
    James Kanze, Jul 8, 2009
  11. James Kanze

    James Kanze Guest

    It is impractical NOT to check every function for many of these.
    Not doing so will slow down development too much.
    Equally robust?
    The problem is that code is written by human beings, who are
    imperfect, and make mistakes. One of the most effective ways of
    reducing total development time is code review. It also
    improves the quality of the final product.
    James Kanze, Jul 8, 2009
  12. James Kanze

    James Kanze Guest

    It's usual practice for such formal methods to have such a check
    list, and apply it. Not necessarily one exactly like his, and
    in many cases, they actually use mechanical means of applying
    it, but the list exists and is used.
    Having a check list for common, easily identified errors can
    effectively help reduce the occurrence of those errors, which
    improves maintainability. Obviously, such a list isn't
    sufficient. You also need thoughtful code review, which does
    more than just go through a check list. But the check list can help.
    James Kanze, Jul 8, 2009
  13. Jonathan Lee

    Jonathan Lee Guest

    Yeah, I know. Sorry for being a jerk about it yesterday. I just
    see a lot of posts in this newsgroup that assume someone
    else doesn't possess common sense. Did I write "I want to ensure
    all my functions"? Yes. Do I mean literally every function
    I've ever written since I was 17? No, of course not. I'm a
    reasonable human being. Moreover, it is also true that "I want
    to go to the moon". Does it mean I'm going to? No.

    Or "Google is your friend". I see that a lot but honestly, how
    many people get that as a reply and then think "ZOMG i totaly
    didnt search teh Google!".

    So sorry for snapping at you. It was just one too many posts
    where I found myself shaking my head.

    Jonathan Lee, Jul 8, 2009
  14. REH

    REH Guest

    There is if your platform allows for denormalized values. For example:

    if (x != 0.0)
        a = b / x;

    Assume x, a, and b are all doubles. If x contains a denormalized
    number, the comparison will be true (x is not 0.0). When the
    expression is evaluated, x will be normalized (x becomes 0.0) and thus
    cause a divide-by-zero. I usually write:

    if (x + 1.0 != 1.0)
        a = b / x;

    This will force x to be normalized in the condition expression.

    REH, Jul 8, 2009
  15. One could argue that that's "safer" than accepting the wildly
    inaccurate value you'll get by continuing to calculate with a
    denormalized value.
    And will treat *any* number less than numeric_limits<double>::epsilon()
    as if it were zero, which may not be at all what you wanted.
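
    A sketch of that point (my own example): the `x + 1.0 != 1.0` guard
    discards every value whose magnitude is below roughly machine epsilon,
    including perfectly ordinary normal numbers, not just denormals.

    ```cpp
    #include <limits>

    // True when x "survives" addition to 1.0, i.e. |x| is at least roughly
    // numeric_limits<double>::epsilon(); anything smaller behaves as zero.
    bool survives_guard(double x) { return x + 1.0 != 1.0; }
    ```

    For instance, epsilon/4 is a normal, nonzero double, yet the guard
    rejects it, while values like 1e-3 pass.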
    Richard Herring, Jul 8, 2009
  16. Jonathan Lee

    Jonathan Lee Guest

    Hi Stuart

    Thanks for your comments. I'll consider those points. But in turn can
    I ask you a question? Suppose your boss asked you to check over
    someone else's code and bring it up to snuff. Would you not run
    through a similar sort of list in your head as you examined the code?

    To be honest I'm more interested in what such a list might look like
    than how to apply it. I asked in the newsgroup because I think people
    with more experience than me would have valuable input about common
    coding pitfalls. I'm perfectly willing to abandon a documentation
    scheme that's a nightmare, but I'm sure everybody has some version of
    the above list in their heads.

    Below you mention mechanically checking code. FWIW, I typically
    compile my code (with g++) using

    -ansi -pedantic -Wall -W -Wshadow -Wpointer-arith -Wcast-qual
    -Wcast-align -Wwrite-strings -Wconversion -Wold-style-cast

    and I never get warnings (well, OK, there's ONE warning I can't get
    rid of in one project). I also run my source through cppcheck, and
    the binaries through the valgrind suite. In other words, in practice
    I do much more than rely on that list.

    (Actually, if anyone thinks I'm not doing due diligence with the
    above, feel free to add.)

    Jonathan Lee, Jul 8, 2009
  17. Jonathan Lee

    Jonathan Lee Guest

    Hi James,
    thanks for your contribution

    What do you recommend for this scenario: I have a project
    which uses the Qt libraries. From version 3 to version 4
    of Qt there were many changes to the API, so my code has
    a lot of this:

    #ifndef QT_V3
        statusbar->addWidget(ageLabel, true);
    #else
        /* the Qt 3 equivalent call */
    #endif

    I would like to support both versions, but having two
    separate cpp files for these little one liners seems
    difficult to maintain. Having one file, I figure,
    allows me to keep the two versions in step.
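
    One middle ground is to confine each #if to a tiny inline wrapper in a
    single compatibility header, so every call site appears exactly once and
    is identical for both versions. A sketch (the StatusBar type and its
    member signatures are invented stand-ins, not the real Qt API; only the
    QT_V3 macro mirrors the post):

    ```cpp
    #include <string>

    // Hypothetical stand-in for the widget whose API changed between versions.
    struct StatusBar {
        std::string added;
    #ifndef QT_V3
        void addWidget(const std::string& w, bool /*permanent*/) { added = w; }
    #else
        void addWidget(const std::string& w, int /*stretch*/, bool /*permanent*/) { added = w; }
    #endif
    };

    // The only place the version check lives; callers never see the #if.
    inline void addPermanentLabel(StatusBar& bar, const std::string& label) {
    #ifndef QT_V3
        bar.addWidget(label, true);
    #else
        bar.addWidget(label, 0, true);
    #endif
    }
    ```

    This keeps the two variants "in step" in one file while leaving the rest
    of the codebase free of conditional compilation.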

    Jonathan Lee, Jul 8, 2009
  18. That is not necessarily in contradiction with what I said. Clearly
    there is a BALANCE between (A) just quickly hacking away whatever
    works; and (B) going through checklists and formal process at every
    step. Checking every function for a list of 20 points is too far
    toward B, and will slow down development.
    What is your question? You don't understand what "equally robust" means?
    There is a difference between a 1 hour code review of a manweek's
    worth of code, and double checking every function for 20 things.
    Andrew Tomazos, Jul 8, 2009
  19. REH

    REH Guest

    I believe IEEE rules state that anytime a denormalized value is used
    with a normalized value (e.g., arithmetic), the denormalized value is
    normalized (becomes 0.0).

    REH, Jul 8, 2009
  20. REH

    REH Guest

    Well, I could be wrong (I usually am!). My understanding is that it
    is not just division. Any time a denormalized number is used
    arithmetically with a normalized number, the former is "flushed" to
    zero.
    REH, Jul 8, 2009