Urgent!!! UPGRADE METHODOLOGY

Discussion in 'C++' started by dondora, Sep 26, 2007.

  1. dondora

    dondora Guest

    Hey. Hi~!

    I created a beauty salon management program for a small project.
    It has only a few functions, but certainly necessary ones.
    The 2nd semester has already begun. I've been given a new project class
    and I decided just to enhance my former program.

    While I was making the program,
    I conformed to the design procedure which is widely used in practice.
    I drew up requirement specifications and drew an actor-class diagram, use
    case and related diagrams.
    And then I extracted candidate classes and drew up a data dictionary and,
    blah blah, sequence and class diagrams as well.

    From the beginning, I was thinking it would be enough for me to conform
    to the earlier procedure.
    But now I'm confused after looking over the former
    requirement specification.
    I think that to upgrade a program there must be a scheme or methodology
    applied generally across the software development field.

    To sum up, is there a scheme or methodology for upgrading programs?
    If there is, could you let me know where I can learn it on the internet
    or from a book?
     
    dondora, Sep 26, 2007
    #1

  2. On 2007-09-26 13:26, dondora wrote:
    > Hey. Hi~!
    >
    > I created a beauty salon management program for a small project.
    > It has only a few functions, but certainly necessary ones.
    > The 2nd semester has already begun. I've been given a new project class
    > and I decided just to enhance my former program.
    >
    > While I was making the program,
    > I conformed to the design procedure which is widely used in practice.
    > I drew up requirement specifications and drew an actor-class diagram, use
    > case and related diagrams.
    > And then I extracted candidate classes and drew up a data dictionary and,
    > blah blah, sequence and class diagrams as well.
    >
    > From the beginning, I was thinking it would be enough for me to conform
    > to the earlier procedure.
    > But now I'm confused after looking over the former
    > requirement specification.
    > I think that to upgrade a program there must be a scheme or methodology
    > applied generally across the software development field.
    >
    > To sum up, is there a scheme or methodology for upgrading programs?
    > If there is, could you let me know where I can learn it on the internet
    > or from a book?


    This question is better answered in a general programming group like
    comp.programming or even better a software engineering group like
    comp.software-eng or comp.softwareeng, or perhaps a group discussing
    object oriented programming such as comp.object.

    --
    Erik Wikström
     
    Erik Wikström, Sep 26, 2007
    #2

  3. Phlip

    Phlip Guest

    dondora wrote:

    > While I was making the program,
    > I conformed to the design procedure which is widely used in practice.
    > I drew up requirement specifications and drew an actor-class diagram, use
    > case and related diagrams.
    > And then I extracted candidate classes and drew up a data dictionary and,
    > blah blah, sequence and class diagrams as well.


    I don't know what the professors have told you, but that's not a primary
    development "methodology". Modeling is just a way to visualize a proposed or
    existing design; it's not a complete system to verify that design.

    One primary development methodology that is deceptively simple but extremely
    effective is Test Driven Development, with Refactoring.

    That means, for each line of code you intend to write, you first write a
    simple test case that fails because the line is not there yet. This is not a
    "unit test" or a "QA test" - it's just a test that can fail for the correct
    reason - the line is not there yet. You run the test and successfully
    predict it will fail, before you upgrade the tested code.

    To pass the test, you write whatever sloppy bad design you need. It
    will only be a few edits-worth of sloppy code, so it's safe. When all the
    tests pass, only then do you upgrade the design. You try to see how many
    lines of code you can delete, and how you can simplify the design. You
    should only make small edits and pass all the tests after each one.
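
    As a minimal sketch of one such cycle (using plain assert() rather than
    any particular test framework; the function here is made up):

        // Red: write the test first. Before add() below existed, this file
        // did not even link - that was the predicted failure.
        #include <cassert>

        int add(int a, int b)
        {
            return a + b;       // Green: the simplest code that passes.
        }

        void test_add()
        {
            assert(add(2, 3) == 5);
        }

        int main()
        {
            test_add();         // All tests pass; now refactor in small
            return 0;           // steps, re-running them after each edit.
        }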

    If at any time the tests fail unexpectedly, you should revert and try again.
    People using that system always report these benefits:

    - almost no debugging
    - simple clear designs
    - no bugs released to production
    - your project velocity does not decay over time
    - you can deploy to production daily

    "Project velocity" is the average time required to implement one feature.

    This system has a lot of mindshare among our industry's leading
    consultants - the people whose job is rescuing huge projects from years of
    junior programmers attempting to over-design everything the way their
    professors told them to.

    --
    Phlip
     
    Phlip, Sep 26, 2007
    #3
  4. James Kanze

    James Kanze Guest

    On Sep 26, 1:26 pm, dondora <> wrote:
    > I created a beauty salon management program for a small project.
    > It has only a few functions, but certainly necessary ones.
    > The 2nd semester has already begun. I've been given a new project class
    > and I decided just to enhance my former program.


    > While I was making the program,
    > I conformed to the design procedure which is widely used in practice.
    > I drew up requirement specifications and drew an actor-class diagram, use
    > case and related diagrams.
    > And then I extracted candidate classes and drew up a data dictionary and,
    > blah blah, sequence and class diagrams as well.


    > From the beginning, I was thinking it would be enough for me to conform
    > to the earlier procedure.
    > But now I'm confused after looking over the former
    > requirement specification.
    > I think that to upgrade a program there must be a scheme or methodology
    > applied generally across the software development field.


    > To sum up, is there a scheme or methodology for upgrading programs?
    > If there is, could you let me know where I can learn it on the internet
    > or from a book?


    There is no one simple answer; it depends on the type of
    upgrade. The important thing, always, is simply not to cut
    corners; if the upgrade requires modifications in the design,
    you modify the design; you don't just hack the code. There's
    even something to be said for rethinking the design each time
    (at least whenever there's a major upgrade), refactoring common
    parts again (since the upgrade may end up creating additional
    common parts, or require that previously common parts behave
    differently). Only once you're sure that the design for the new
    requirements is correct should you start to look at the code;
    typically, you will find a lot of the existing code which you
    can reuse, but that should be because it fulfills the new design
    requirements, and not because you've forced the design in such a
    way as to reuse it.

    Failure to do this will lead very quickly to a "hacked" design
    and unmaintainable code.

    --
    James Kanze (GABI Software) email:
    Conseils en informatique orientée objet/
    Beratung in objektorientierter Datenverarbeitung
    9 place Sémard, 78210 St.-Cyr-l'École, France, +33 (0)1 30 23 00 34
     
    James Kanze, Sep 27, 2007
    #4
  5. James Kanze

    James Kanze Guest

    On Sep 26, 3:07 pm, "Phlip" <> wrote:
    > dondora wrote:
    > > While I was making the program,
    > > I conformed to the design procedure which is widely used in practice.
    > > I drew up requirement specifications and drew an actor-class diagram, use
    > > case and related diagrams.
    > > And then I extracted candidate classes and drew up a data dictionary and,
    > > blah blah, sequence and class diagrams as well.


    > I don't know what the professors have told you, but that's not
    > a primary development "methodology". Modeling is just a way to
    > visualize a proposed or existing design; it's not a complete
    > system to verify that design.


    Obviously, design is a creative activity, which takes place
    first in the designer's head. However, in a very real sense,
    there is no design until it's on paper (or "electronic" paper,
    written down in the computer somewhere). UML is probably the
    most widespread way of doing this, at least for larger projects.

    (And just as obviously, until the design has been written down
    somehow, it's impossible to verify it.)

    > One primary development methodology that is deceptively simple
    > but extremely effective is Test Driven Development, with
    > Refactoring.


    No professional would make such a silly statement. There's no
    silver bullet. Testing doesn't drive design; in some cases, you
    can't even know what to test until part of the design has been
    specified. (Don't get me wrong: testing is important, to catch
    out the cases where you've done something else wrong. But
    anytime a test fails, the first thing you do is revisit your
    process, to see what you did wrong upstream.)

    Don't put the cart before the horse.

    > That means, for each line of code you intend to write, you
    > first write a simple test case that fails because the line is
    > not there yet.


    The order is irrelevant. The important thing is that before you
    write any line of code (test or not), you have some sort of
    design.

    > This is not a "unit test" or a "QA test" - it's just a test
    > that can fail for the correct reason - the line is not there
    > yet. You run the test and successfully predict it will fail,
    > before you upgrade the tested code.


    Running a test that you know will fail, because you've not
    written the code yet, is just a waste of time.

    > To pass the test, you write whatever sloppy bad design
    > you need. It will only be a few edits-worth of sloppy code, so
    > it's safe. When all the tests pass, only then do you upgrade
    > the design. You try to see how many lines of code you can
    > delete, and how you can simplify the design. You should only
    > make small edits and pass all the tests after each one.


    > If at any time the tests fail unexpectedly, you should revert
    > and try again. People using that system always report these
    > benefits:


    > - almost no debugging
    > - simple clear designs
    > - no bugs released to production
    > - your project velocity does not decay over time
    > - you can deploy to production daily


    People who use that system don't produce high quality code,
    which can be used reliably in large systems.

    > "Project velocity" is the average time required to implement one feature.


    An application is more than just a collection of features.

    > This system has a lot of mindshare among our industry's leading
    > consultants


    You mean you and a couple of other amateurs who aren't involved
    in serious software? I don't know of any serious specialist in
    software engineering who recommends anything so silly.

    > - the people whose job is rescuing huge projects from years of
    > junior programmers attempting to over-design everything the way their
    > professors told them to.


    Does it occur to you that in well-run companies, the design
    isn't done by junior programmers, but by professionals, applying
    professional methodology (which includes modeling, and a lot of
    other things)? You might want to take a look at
    http://www.sei.cmu.edu/, for example (which is the ultimate
    reference for software engineering issues);
    http://www.idinews.com also has some good articles about
    software development.

    --
    James Kanze (GABI Software) email:
    Conseils en informatique orientée objet/
    Beratung in objektorientierter Datenverarbeitung
    9 place Sémard, 78210 St.-Cyr-l'École, France, +33 (0)1 30 23 00 34
     
    James Kanze, Sep 27, 2007
    #5
  6. Ian Collins

    Ian Collins Guest

    James Kanze wrote:
    > On Sep 26, 3:07 pm, "Phlip" <> wrote:
    >
    >> One primary development methodology that is deceptively simple
    >> but extremely effective is Test Driven Development, with
    >> Refactoring.

    >
    > No professional would make such a silly statement.


    I would, and so would any member of my team.

    > There's no silver bullet.


    No one disputes that.

    > Testing doesn't drive design; in some cases, you
    > can't even know what to test until part of the design has been
    > specified.


    We probably work in different worlds; my clients often either don't
    really know what they want or they are chasing a rapidly evolving
    market, so at the beginning of a project there is little, if anything,
    to design. I'm sure there are domains where the requirements are
    invariant and a well thought out design is a good approach. One of these
    days I might get to work on one!

    >
    > Don't put the cart before the horse.
    >

    Writing tests before the code is both more enjoyable and leads to
    better, more thorough tests. Developers hate going back to write tests
    for existing code and tend to do a piss-poor job when they do.

    --
    Ian Collins.
     
    Ian Collins, Sep 27, 2007
    #6
  7. Phlip

    Phlip Guest

    James Kanze wrote:

    > Obviously, design is a creative activity, which takes place first in the
    > designer's head. However, in a very real sense, there is no design until
    > it's on paper


    You have a typo there. You were clearly trying to write "there is no design
    until it's in code".

    --
    Phlip
     
    Phlip, Sep 27, 2007
    #7
  8. Phlip

    Phlip Guest

    Ian Collins wrote:

    >>> One primary development methodology that is deceptively simple
    >>> but extremely effective is Test Driven Development, with
    >>> Refactoring.

    >>
    >> No professional would make such a silly statement.

    >
    > I would, and so would any member of my team.


    One of the hardest sells in TDD is to people who understand and practice
    high-end automated QA testing. For example, one of them read "write a simple
    test case, first", and then got mired in writing a complex QA test case,
    first.

    Don't do that! It's not what we are talking about...
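
    To make "simple" concrete, here is a hypothetical sketch (the function is
    invented for the example):

        #include <cassert>
        #include <cstdio>
        #include <string>

        // The unit under test - small enough to grow one assertion at a time.
        std::string format_price(int cents)
        {
            char buf[32];
            std::sprintf(buf, "$%d.%02d", cents / 100, cents % 100);
            return std::string(buf);
        }

        int main()
        {
            // The TDD-style test: one small assertion, written just before
            // the function body above existed. Not a QA scenario that boots
            // the whole application, loads fixtures, and drives the UI.
            assert(format_price(150) == "$1.50");
            return 0;
        }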

    --
    Phlip
     
    Phlip, Sep 27, 2007
    #8
  9. On 2007-09-27 10:40, Phlip wrote:
    > James Kanze wrote:
    >
    >> Obviously, design is a creative activity, which takes place first in the
    >> designer's head. However, in a very real sense, there is no design until
    >> it's on paper

    >
    > You have a typo there. You were clearly trying to write "there is no design
    > until it's in code".


    No, you have to distinguish between design and implementation; a
    design is at a higher abstraction level. One design can result in
    several different (though quite similar) implementations in different
    languages; an implementation, on the other hand, maps to only one design.
    Of course, one should not assume that the design will not have to be
    adjusted while implementing, since implementation can bring to light
    issues that were not considered during the initial design.
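
    As a small illustration (hypothetical C++; the names are invented), one
    design-level abstraction can stand behind two implementations:

        #include <iostream>
        #include <fstream>
        #include <string>

        class Logger {                         // the design: an abstract role
        public:
            virtual void log(const std::string& msg) = 0;
            virtual ~Logger() {}
        };

        class ConsoleLogger : public Logger {  // one implementation of that design
        public:
            void log(const std::string& msg) { std::cout << msg << std::endl; }
        };

        class FileLogger : public Logger {     // another implementation, same design
        public:
            FileLogger() : out("app.log") {}
            void log(const std::string& msg) { out << msg << std::endl; }
        private:
            std::ofstream out;
        };

        int main()
        {
            ConsoleLogger c;
            c.log("same design, different implementation");
            return 0;
        }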

    --
    Erik Wikström
     
    Erik Wikström, Sep 27, 2007
    #9
  10. James Kanze

    James Kanze Guest

    On Sep 27, 10:32 am, Ian Collins <> wrote:
    > James Kanze wrote:
    > > On Sep 26, 3:07 pm, "Phlip" <> wrote:


    > >> One primary development methodology that is deceptively simple
    > >> but extremely effective is Test Driven Development, with
    > >> Refactoring.


    > > No professional would make such a silly statement.


    > I would, and so would any member of my team.


    > > There's no silver bullet.


    > No one disputes that.


    That's apparently what Phlip was claiming. Use TDD, and you
    don't need anything else. Testing is an essential part of
    software development, but it isn't everything.

    > > Testing doesn't drive design; in some cases, you
    > > can't even know what to test until part of the design has been
    > > specified.


    > We probably work in different worlds; my clients often either
    > don't really know what they want or they are chasing a rapidly
    > evolving market, so at the beginning of a project there is
    > little, if anything, to design.


    If you don't know what the project is supposed to do, you can't
    very well design it. Of course, you can't code it either, and
    above all, you can't write tests to verify that it does it.

    Requirements do evolve. The user's idea of his requirements
    also may become more precise as time goes on. But that doesn't
    mean you don't design---just the opposite: you intentionally
    design in flexibility where you know things are going to change.
    And you rework the design each time his requirements evolve.

    (One frequent problem, of course, is that the user wants
    flexible requirements, but a fixed price. But that just doesn't
    work; you can't fix a price without knowing the actual
    requirements. And of course, if the user then changes them, you
    re-estimate, and fix a new price. After which, he either
    forgoes the changes, or accepts the new price.)

    > I'm sure there are domains where the requirements are
    > invariant and a well thought out design is a good approach.
    > One of these days I might get to work on one!


    I don't think it's that black and white. Every project I've
    ever seen or heard of has some fixed requirements (it shouldn't
    crash, regardless of the input), and some that evolve. A well
    thought out design isn't cast in stone; it will evolve, just as
    anything else will. (A well thought out design may help in
    estimating the cost of a given evolution, of course.)

    > > Don't put the cart before the horse.


    > Writing tests before the code is both more enjoyable and leads
    > to better, more thorough tests.


    If you find that to be true, do so. I prefer the inverse, but
    with regard to implementation code and tests (which are, of
    course, also code), the order is really irrelevant, and each
    developer can do whatever he feels like. The question is rather
    one of design vs. code/tests: without the design, how do you
    know what classes will even be needed?

    > Developers hate going back to write tests for existing code and
    > tend to do a piss-poor job when they do.


    And then their code doesn't pass review (which, of course,
    includes the unit tests, and validates their completeness).

    In this regard, you might care to read
    http://www.idinews.com/agileDoc.html. It doesn't cover the
    question of TDD so much as basic professionalism; a professional
    doesn't "do a piss poor job" just because he doesn't consider
    something very interesting. And if he does, then that's a
    problem, regardless of the methodology being used.

    Having said that, of course, I would repeat what I said before.
    Both the code and the tests must be written, regardless. And
    the order isn't really that important; the code isn't finished
    until it passes the tests. If you find it more agreeable to
    write the tests first, and then the code, I don't see where that
    would cause any problem. I generally do the reverse, but that's
    just because that's the way I feel most comfortable. And in
    practice, it's probably mixed for both of us: I'll write the
    constructors, and the accessor functions used in the tests, and
    test them, then generally add one or two functions at a time,
    with their tests, until the class is complete. But whichever
    way you do it, you have to know what to write, which means that
    you have to know what the class is supposed to do, which means
    that you must know the requirements. (In just about every place
    I've worked, too, you have to make an estimation of cost and
    time before you begin writing the code. On the basis of the
    stated requirements, of course.)
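
    In code terms, that mixed order might look like this (a hypothetical
    sketch; the class is invented):

        #include <cassert>
        #include <string>

        class Account {
        public:
            // First pass: the constructor and the accessors the tests use.
            explicit Account(const std::string& owner)
                : owner_(owner), balance_(0) {}
            const std::string& owner() const { return owner_; }
            int balance() const { return balance_; }

            // Added later, one or two functions at a time, with their tests.
            void deposit(int amount) { balance_ += amount; }

        private:
            std::string owner_;
            int balance_;
        };

        int main()
        {
            Account a("dondora");
            assert(a.owner() == "dondora");   // tests for the first pass
            assert(a.balance() == 0);
            a.deposit(100);                   // test added with deposit()
            assert(a.balance() == 100);
            return 0;
        }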

    --
    James Kanze (GABI Software) email:
    Conseils en informatique orientée objet/
    Beratung in objektorientierter Datenverarbeitung
    9 place Sémard, 78210 St.-Cyr-l'École, France, +33 (0)1 30 23 00 34
     
    James Kanze, Sep 27, 2007
    #10
  11. James Kanze

    James Kanze Guest

    On Sep 27, 10:40 am, "Phlip" <> wrote:
    > James Kanze wrote:
    > > Obviously, design is a creative activity, which takes place
    > > first in the designer's head. However, in a very real
    > > sense, there is no design until it's on paper


    > You have a typo there. You were clearly trying to write "there
    > is no design until it's in code".


    There's no implementation of the design until the product has
    been deployed. And you can't really be certain that your design
    was correct until then. But you can't write a line of code
    until you know what classes are going to be there, and
    determining that is part of design. Until you have documented
    the class interactions, for example, you don't know what you're
    going to have to test.

    --
    James Kanze (GABI Software) email:
    Conseils en informatique orientée objet/
    Beratung in objektorientierter Datenverarbeitung
    9 place Sémard, 78210 St.-Cyr-l'École, France, +33 (0)1 30 23 00 34
     
    James Kanze, Sep 27, 2007
    #11
  12. Phlip

    Phlip Guest

    James Kanze wrote:

    > That's apparently what Phlip was claiming. Use TDD, and you don't need
    > anything else. Testing is an essential part of software development, but
    > it isn't everything.


    TDD is not testing, it's just writing the code twice, once backwards. It
    works within a methodology that's extraordinarily effective. Nobody said you
    only need testing.

    If you want a "silver bullet", I think the definition was "a new technique
    providing an order of magnitude improvement in less than a decade of normal
    growth". So if a high-end shop that was already using whatever methodology
    you advocate switches to TDD (all the practices, not just the test-like
    things), and if they report a 10x drop in bugs reported from their
    customers, that might just qualify...

    > Requirements do evolve. The user's idea of his requirements also may
    > become more precise as time goes on. But that doesn't mean you don't
    > design---just the opposite: you intentionally design in flexibility where
    > you know things are going to change.


    Given feature X, you implement it with simple code. Even though X2 and X3
    are very near term, you _don't_ design ahead for them. When you have X2, you
    add it (under matching test-like things), and you refactor the code until
    it's simple again. When X3 comes along - the third requirement along the same
    kind of feature - you probably won't need to refactor very much.
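
    A hypothetical sketch of that rhythm (the feature is invented): once X2
    arrives under its own test, the duplication left over from X is refactored
    into one simple, data-driven function:

        #include <cassert>

        enum Speed { STANDARD, EXPRESS };

        // After X (standard shipping) and X2 (express), refactored so that
        // when X3 comes along the shape of this code barely has to change.
        double shipping_cost(Speed s)
        {
            return s == EXPRESS ? 9.95 : 4.95;
        }

        int main()
        {
            assert(shipping_cost(STANDARD) == 4.95);  // the test for X
            assert(shipping_cost(EXPRESS) == 9.95);   // the test for X2
            return 0;
        }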

    > And you rework the design each time his requirements evolve.


    If you order him to provide new, narrow requirements, once per week, then
    you improve his ability to steer a project in real time. This allows you to
    right-size a program, without adding too many speculative features.

    > I generally do the reverse, but that's
    > just because that's the way I feel most comfortable.

    People using test-first routinely report surprise at how soon a design locks
    into the "Open Closed Principle", and stops changing. This implies they are
    rapidly under-engineering a project, instead of over-engineering it.

    --
    Phlip
     
    Phlip, Sep 27, 2007
    #12
  13. Phlip

    Phlip Guest

    James Kanze wrote:

    > There's no implementation of the design until the product has been
    > deployed. And you can't really be certain that your design was correct
    > until then. But you can't write a line of code until you know what
    > classes are going to be there


    I don't have that problem. I go test -> behavior -> little method. Over
    time, that method might sprout into an object, or even a module. If it
    doesn't, then I right-sized the design.

    --
    Phlip
     
    Phlip, Sep 27, 2007
    #13
  14. Phlip

    Phlip Guest

    Erik Wikström wrote:

    > No, you have to distinguish between design and implementation; a
    > design is at a higher abstraction level. One design can result in
    > several different (though quite similar) implementations in different
    > languages; an implementation, on the other hand, maps to only one design.
    > Of course, one should not assume that the design will not have to be
    > adjusted while implementing, since implementation can bring to light
    > issues that were not considered during the initial design.


    Paraphrasing that great methodologist, Bill Clinton, that depends on the
    definition of "is".

    A team should invest its designing energy into code first, diagrams and
    plans second. If the boss asks "how's the design coming along?", the answer
    should be deployed features, not speculation and documentation.

    I'm aware some people don't have experience producing clean designs without
    up-front planning. This thread started when a student, for a very small
    project, planned its design, then accepted new requirements, and discovered
    the design did not magically cover them. That's how it works; no matter how
    awesome our design skills, we will always discover requirements that force
    rework.

    So as part of my design goals, I intend to produce code, with tests, that
    are all highly resilient to change. And I get there by refactoring the
    design, as it grows, and forcing it to actually change as it grows. This
    tests that it can.

    --
    Phlip
     
    Phlip, Sep 28, 2007
    #14
  15. dondora

    dondora Guest

    Well, my question has caused a dispute.
    I've decided to conform to the design methodology I
    talked about (requirement specifications, use cases, etc.).
    There's no problem with the things I've done in my project, as you ask.
    I just wanted to know whether there's a systematic methodology like the
    one I followed.
    Anyway, TDD looks bad. When it comes time to hand over your own
    program in industry, do you just give code?
    Consider the situation where you are given a tremendous
    amount of source code without anything explained.
     
    dondora, Sep 28, 2007
    #15
  16. Phlip

    Phlip Guest

    dondora wrote:

    > Well, my question has caused a dispute.


    Sorry it looks like that. In many circles the matter is quite settled. And
    you could also try "Design by Contract", to much the same effect.

    The best way to do something is often the simplest, but there are always
    newbies who need to be brought up to speed. The only "debate" here has been
    whether we should write test cases just before or just after writing the
    tested code. Nobody here has advocated you _not_ write automated tests.

    > I've decided to conform to the design methodology I
    > talked about (requirement specifications, use cases, etc.).


    Again: Those are not a methodology. And if you're describing doing all of
    them first, before any coding, then that is "Waterfall", which is among the
    worst known methodologies.

    > There's no problem with the things I've done in my project, as you ask.
    > I just wanted to know whether there's a systematic methodology like the
    > one I followed. Anyway, TDD looks bad.


    What have you read about it? Try Steve McConnell's /Code Complete, 2nd Ed/.
    And nobody has said that people using TDD never document what they are
    doing. Read more.

    > When it comes time to hand over your own
    > program in industry, do you just give code?


    And tests.

    I want you to imagine picking one of two new jobs. This example is
    contrived - the real life example is always somewhere in between - but it
    illustrates the situation. At either job, your first task will be adding a
    feature to 1 million lines of well-written C++ code.

    At Job A, the code comes with lots of nice, accurate, reliable, indexed
    requirements documents, design model diagrams, and use cases.

    At Job B, the code comes with almost no documents, 1.5 million lines of
    clearly written and simple test cases, and a Wiki documenting and running
    test cases covering all the inputs and outputs the users expect.

    Now let's see what you do on your first day at Job A. You make a change.
    Then, for hours, you slowly read all that documentation, and you manually
    operate the program, making sure your change did not break any of the
    existing features. When you make that change, you have the odious choice to
    add new code or to change existing code. If you get this choice wrong
    (likely), the design quality will go down. Further, if you make any mistake,
    you will probably spend a long time debugging to figure out what went wrong.

    At Job B, during and after your first change, you quickly run all the tests.
    They work like little elves reading all that documentation, and applying all
    those checks for you. If you break something - or even if the elves
    _suspect_ you might break something - you have the option to revert your
    change and try again.

    You have the option to _not_ debug.

    Understand the elves are not omniscient - they only know what they are told.
    So does the documentation at Job A. But the elves prefer to err on the side
    of caution. The test cases will reject many of your edits that should have
    worked!

    You will work faster and safer at Job B. If a test case fails, its assertion
    diagnostic should describe what went wrong. These test cases form a living
    documentation, showing you what systems, structures, and behaviors the code
    should reveal.
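
    A hand-rolled sketch of such a diagnostic (real frameworks such as CppUnit
    generate this for you; the helper here is invented):

        #include <cstdio>
        #include <cstdlib>

        // A check that names the expectation when it fails, instead of
        // leaving you to rediscover it in the debugger.
        void check_equal(int expected, int actual, const char* what)
        {
            if (expected != actual) {
                std::fprintf(stderr, "FAILED: %s (expected %d, got %d)\n",
                             what, expected, actual);
                std::abort();
            }
        }

        int main()
        {
            check_equal(5, 2 + 3, "add(2, 3) should be 5");
            return 0;
        }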

    Next, each "use case" was expressed as a high-level test in that Wiki. This
    forced the code to be testable, which overwhelmingly improved its design,
    and decoupled its objects. This improves communication with your users'
    representatives. No more hand-waving or white-boarding when discussing
    features. You can see them in action.

    Real life, of course, is not so distinct. Many projects have no tests
    whatsoever (and many also have no documentation!). Well-managed projects
    usually find some balance between automated tests and _reliable_
    documentation. (Tests can't lie like some documentation can!) So the
    question resolves to one point: At crunch time, when the programmers are
    doing something important, would you rather they devote their energy to
    documentation, or to automated tests? Which one is more important for your
    project's success?

    --
    Phlip
     
    Phlip, Sep 28, 2007
    #16
  17. James Kanze

    James Kanze Guest

    On Sep 28, 4:39 am, "Phlip" <> wrote:
    > dondora wrote:
    > > Well, my question has caused a dispute.


    > Sorry it looks like that. In many circles the matter is quite
    > settled.


    Quite. Take a look at the SEI site, for example. Software
    engineering is actually a fairly mature discipline, even if a lot
    of developers (including some who are experts in other things,
    such as software design) choose to ignore it.

    > And you could also try "Design by Contract", to much the same
    > effect.


    > The best way to do something is often the simplest, but there
    > are always newbies who need to be brought up to speed. The
    > only "debate" here has been whether we should write test cases
    > just before or just after writing the tested code.


    I'm not even sure that that's being debated; I certainly don't
    think it matters (and have expressed that opinion). My
    impression was that the debate was over whether there were
    phases that should precede writing tests or code: a separate
    design phase.

    > Nobody here has advocated you _not_ write automated tests.


    Very true. NO development methodology would ever allow that.
    In industry, typically, the check-in procedures for the software
    will run the unit tests, and won't accept the check-in if they
    fail.

    > > I've decided to conform to the design methodology I
    > > talked about (requirement specifications, use cases, etc.).


    > Again: Those are not a methodology. And if you're describing
    > doing all of them first, before any coding, then that is
    > "Waterfall", which is among the worst known methodologies.


    There, you're being intellectually dishonest. There is no such
    thing as a "waterfall" methodology, and never has been; it's a
    strawman that was invented for the sole purpose of criticising
    it, and justifying some new approach. If you don't know what
    the code you want to write is supposed to do, then you can't
    write either the tests or the code. And if you haven't put it
    down in writing, then you don't know it. It's that simple. The
    "requirements specification" must be complete for the code you
    write. (That doesn't mean, and has never meant, that it is
    complete for every aspect of the final system. The requirements
    specification may evolve, just like everything else in the
    system.)

    You might want to read http://www.idinews.com/waterfall.html for
    more details.

    > > There's no problem with the things I've done in my project,
    > > as you ask. I just wanted to know whether there's a systematic
    > > methodology like the one I followed. Anyway, TDD looks bad.


    > What have you read about it? Try Steve McConnell's /Code
    > Complete, 2nd Ed/. And nobody has said that people using TDD
    > never document what they are doing. Read more.


    > > When it comes time to hand over your own
    > > program in industry, do you just give code?


    > And tests.


    What you hand over depends on the contract :-). Code, tests,
    documentation... Whatever the customer wants (and is willing to
    pay for). I'm sure, for example, that you provide user manuals,
    if that's part of your responsibility in the project---you don't
    really expect users to figure it out from the tests.

    Typically, of course, you will provide a requirements
    specification (at least partial) much, much earlier. When you
    specify the price. Because most customers don't particularly
    like writing blank checks: they want to know what they will get,
    for what price.

    > I want you to imagine picking one of two new jobs. This
    > example is contrived - the real life example is always
    > somewhere in between - but it illustrates the situation. At
    > either job, your first task will be adding a feature to 1
    > million lines of well-written C++ code.


    > At Job A, the code comes with lots of nice, accurate,
    > reliable, indexed requirements documents, design model
    > diagrams, and use cases.


    > At Job B, the code comes with almost no documents, 1.5 million
    > lines of clearly written and simple test cases, and a Wiki
    > documenting and running test cases covering all the inputs and
    > outputs the users expect.


    Again: intellectual dishonesty. Have you ever heard of a
    company that had a good enough process to produce the
    documentation of Job A, but didn't have automated tests as
    part of the process?

    > Now let's see what you do on your first day at Job A. You make
    > a change. Then, for hours, you slowly read all that
    > documentation, and you manually operate the program, making
    > sure your change did not break any of the existing features.
    > When you make that change, you have the odious choice to add
    > new code or to change existing code. If you get this choice
    > wrong (likely), the design quality will go down. Further, if
    > you make any mistake, you will probably spend a long time
    > debugging to figure out what went wrong.


    > At Job B, during and after your first change, you quickly run
    > all the tests. They work like little elves reading all that
    > documentation, and applying all those checks for you. If you
    > break something - or even if the elves _suspect_ you might
    > break something - you have the option to revert your change
    > and try again.


    You forget the essential: if the role and the responsibilities
    of the class in the project are well defined and documented (Job
    A), you understand what you are doing, and your code will be
    correct the first time. If they're not (Job B), you guess, run the
    tests, they fail, guess something else, run the tests, that
    fails as well, etc., until you guess right.

    > You have the option to _not_ debug.


    > Understand the elves are not omniscient - they only know what
    > they are told. So does the documentation at Job A. But the
    > elves prefer to err on the side of caution. The test cases
    > will reject many of your edits that should have worked!


    > You will work faster and safer at Job B.


    Have you any real measured studies to support such a ridiculous
    claim?

    > If a test case fails, its assertion diagnostic should describe
    > what went wrong. These test cases form a living documentation,
    > showing you what systems, structures, and behaviors the code
    > should reveal.


    > Next, each "use case" was expressed as a high-level test in
    > that Wiki. This forced the code to be testable, which
    > overwhelmingly improved its design, and decoupled its objects.
    > This improves communication with your users' representatives.
    > No more hand-waving or white-boarding when discussing
    > features. You can see them in action.


    > Real life, of course, is not so distinct. Many projects have
    > no tests whatsoever (and many also have no documentation!).


    In practice, such companies went out of business a long time
    ago. At least in the fields I work in (where software usually has
    to run 24 hours a day, 7 days a week, with contractual penalties
    for down time).

    > Well-managed projects usually find some balance between
    > automated tests and _reliable_ documentation. (Tests can't
    > lie like some documentation can!) So the question resolves to
    > one point: At crunch time, when the programmers are doing
    > something important, would you rather they devote their energy
    > to documentation, or to automated tests? Which one is more
    > important for your project's success?


    Unless you have both, you've failed.

    --
    James Kanze (GABI Software) email:
    Conseils en informatique orientée objet/
    Beratung in objektorientierter Datenverarbeitung
    9 place Sémard, 78210 St.-Cyr-l'École, France, +33 (0)1 30 23 00 34
     
    James Kanze, Sep 28, 2007
    #17
  18. Ian Collins

    Ian Collins Guest

    James Kanze wrote:
    >
    > Typically, of course, you will provide a requirements
    > specification (at least partial) much, much earlier. When you
    > specify the price. Because most customers don't particularly
    > like writing blank checks: they want to know what they will get,
    > for what price.
    >

    The process tends to be different with agile projects, where the
    customer pays by iteration, with the option to pull the plug after the
    iteration is complete. The level of required documentation is
    considerably less because the risk to the client is much lower; the most
    they will lose is a week or two of the supplier's time.

    I've run a couple of successful projects this way; one in particular was
    very successful, delivering the client exactly what they wanted, which
    turned out to be considerably more than they would have specified up
    front. They got extra business value; I got several more months' work.
    At no time were the requirements anything more than 4x6 cards.

    --
    Ian Collins.
     
    Ian Collins, Sep 28, 2007
    #18
  19. Phlip

    Phlip Guest

    James Kanze wrote:

    > Quite. Take a look at the SEI site, for example. Software engineering is
    > actually a fairly mature discipline, even if a lot of developers (including
    > some who are experts in other things, such as software design) choose to
    > ignore it.


    Fallacy of excluded middle and argumentum ad hominem in one 'graph.

    --
    Phlip
     
    Phlip, Sep 28, 2007
    #19
  20. Phlip

    Phlip Guest

    James Kanze wrote:

    > Quite. Take a look at the SEI site, for example.


    Okay.

    http://www.sei.cmu.edu/productlines/frame_report/testing.htm

    ----8<-----------------------------------------

    TDD is one practice of the agile development community. Its goal is "clean
    code that works" [Beck 2002a]. In this practice, developers define
    requirements for the piece they are assigned to construct by maintaining
    close communication with the developers who are constructing related units
    and writing test cases that serve as the specification for the unit. The
    developer then writes and revises code until the unit passes all the tests.
    The rhythm of TDD is very short cycles of these steps:

    Define a new test.
    Execute all tests.
    Write code to fix tests that fail.
    Execute all tests.

    TDD has the advantage that the test code is always in synch with the product
    code, because the test code defines the product code. The disadvantage of
    TDD is that there is not a good method for determining whether the set of
    test cases is complete, since the completeness of a test set is usually
    determined by comparing it to the specification.

    TDD is applicable to product line organizations provided it is applied to
    units that are first defined in the context of the product line
    architecture. TDD does not provide tools and techniques for balancing the
    diverse quality attributes usually present in a product line. TDD can be
    successful if applied to units that a small group of developers, often a
    two-person or pair programming team, can produce in a timely manner. The
    range of variability for the unit should also be sufficiently narrow to
    allow for timely completion. The success of TDD depends on the availability
    of tools, such as JUnit, to assist with development and automate testing.

    ----8<-----------------------------------------

    ftp://ftp.sei.cmu.edu/pub/documents/articles/pdf/xp-from-a-cmm-perspective.pdf

    XP satisfaction of key process areas, given the appropriate environment

    Level Satisfaction Key process area

    2 ++ Requirements management
    2 ++ Software project planning
    2 ++ Software project tracking and oversight
    2 - Software subcontract management
    2 + Software quality assurance
    2 + Software configuration management
    3 + Organization process focus
    3 + Organization process definition
    3 - Training program
    3 - Integrated software management
    3 ++ Software product engineering
    3 ++ Intergroup coordination
    3 ++ Peer reviews
    4 - Quantitative process management
    4 - Software quality management
    5 + Defect prevention
    5 - Technology change management
    5 - Process change management

    + Partially addressed in XP
    ++ Largely addressed in XP (perhaps by inference)
    - Not addressed in XP

    ----8<-----------------------------------------

    Note that this survey only compares XP's documentation and verbiage to
    CMMi's verbiage. It is not a study of real projects in action. So under
    "Training program", the "-" means that the author, Dr. Mark Paulk, declines
    to speculate that pair programming could be used as an ideal training
    program.

    Next, all Agile projects, in practice, automate their entire build chain.
    Maybe the CMMi has higher goals for its "Integrated software management"
    KPA.

    And note that "Defect prevention" gets only one +. The actual response from
    folks who switched to XP (and did all its practices, not just the convenient
    ones) is their code grows very robust and difficult to break over time.
    Agile development provides aspects of design and teamwork which the SEI is
    not yet capable of interpreting.

    So, in conclusion, I don't think it's the Agile community who is being
    immature here.

    --
    Phlip
     
    Phlip, Sep 28, 2007
    #20
