How much should I charge for a fixed-price software contract?

  • Thread starter Robert Maas, see http://tinyurl.com/uh3t

Robert Maas, see http://tinyurl.com/uh3t

From: said:
You say that you are running System 7.5.5 on your Mac Performa.
There should be no problem using PPP on your system.

With only 8 megabytes of RAM, and not just a simple test/echo loop but
actual useful software: InterNet Explorer / Mail And News
When I tried that in 1998, it took 20 minutes just to download the home
page of AT&T WorldNet, and 5 minutes each time I clicked on the scroll
bar to see a different portion of that home page. Do you know how to
get better response time with only 8 megabytes of RAM?
Configuring PPP is something that even a child can do.

Only if that child has the proper set of instructions.
Currently when I try to start IE from CD-ROM, it freezes the whole
machine, requiring cold restart. Do you know where to find instructions
for diagnosing what is causing that problem and fixing it?
 

Guest

bar to see a different portion of that home page. Do you know how to
get better response time with only 8 megabytes of RAM?

With System 7.5.5, 8 megs should be fine for the trivial things you are trying.
I have run graphical web browsers on systems with 4 megs.

Only if that child has the proper set of instructions.

But you are the "experienced programmer with 20 years of experience",
and it turns out that your inability to access the internet is because you
don't know how to install software on your system!
LOL!!!
Currently when I try to start IE from CD-ROM, it freezes the whole
machine, requiring cold restart.

So don't use IE.
There are many other browsers.
Only idiots who are completely oblivious to security issues use IE.

I think that you should abandon any idea of working in a computer related field.
Your computer skills are abysmal.

--

Seek simplicity and mistrust it.
Alfred Whitehead

A witty saying proves nothing.
Voltaire
 

Patricia Shanahan

With System 7.5.5, 8 megs should be fine for the trivial things you are trying.
I have run graphical web browsers on systems with 4 megs.

But you are the "experienced programmer with 20 years of experience",
and it turns out that your inability to access the internet is because you
don't know how to install software on your system!
LOL!!!

So don't use IE.
There are many other browsers.
Only idiots who are completely oblivious to security issues use IE.

I think that you should abandon any idea of working in a computer related field.
Your computer skills are abysmal.

That is a bit harsh, but I think there is a real message here. You see
identifying useful applications, downloading them, and getting them
running as absolutely essential computer skills. It doesn't matter
whether someone can program well, if they can't make effective use of
the enormous body of existing software. That is, I believe, the reality
today. It wasn't, when I started programming. Few programs existed and
there was relatively little chance of finding one that did just what you
wanted, and ran on the computer you had.

If Robert is going to get a computing job, he needs to develop the
skills that are needed now. My suggestion, given his limited resources,
is to become a CS grad student.

Student loans have lower interest rates than credit cards, he would have
access to college computing labs and college placement services, and
would benefit from immersion in a mass of youngsters with 10+ years of
experience selecting and downloading software. Between being able to TA
for the more mathematical courses, and writing scientific programs for
professors as a research assistant, he could probably cover his fees
plus some living money. Robert might also be able to make some money on
the side doing mathematics tutoring.

Of course, I may be biased, because I'm enjoying being a student again,
as well as learning a lot.

Patricia
 

Tim X

I suppose it depends on what you consider "debugging". I write a line
of code and immediately apply it to the data I have to see if it gives
the data I want. If not, I change that line of code. If it works, I
move on to writing the next line of code. When I get to the end of the
function I'm working on, I put all those lines together and unit-test
the entire function on the data I was just using and any other cases
that are useful to make sure it really works correctly in all cases.
You consider that not to be debugging, correct? What do you call it then?
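The write-a-line, check-it, move-on workflow described in the quote above might look something like this, sketched in Python; the function, data, and steps are invented purely for illustration:

```python
# Each line of the function was first applied to sample data on its own
# and adjusted until it produced the expected result; then the assembled
# function is unit-tested as a whole, on the same data plus edge cases.

def normalize(words):
    """Lowercase, strip punctuation, and deduplicate a word list."""
    lowered = [w.lower() for w in words]           # checked alone first
    stripped = [w.strip(".,!?") for w in lowered]  # checked alone next
    return sorted(set(stripped))                   # checked alone last

# The unit test of the assembled function:
assert normalize(["Foo,", "foo", "BAR!"]) == ["bar", "foo"]
assert normalize([]) == []
```

Whether one calls the per-line checks "debugging" or just "development" is exactly the terminological question raised above.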

I'm guessing he was referring to the irritating development of
programmers who are not able to debug code without a high-level IDE
which includes a debugger that allows them to step/trace through the
code one line at a time and watch what happens to variables in a watch
window.

IMO this is a really bad trend because it means programmers stop
thinking about their code and how it actually works. This sort of
'debugging' tends to be extremely slow - especially for trivial
bugs which would normally be spotted very quickly by someone who
understands what their code is doing and how it works.

It could just be that I'm from an 'old school' that pretty much predated
integrated IDEs with all the snazzy features that are common these
days. Debuggers were around, but often getting them and your code
configured to use them, loading them up and setting breakpoints etc.,
was more effort than it was worth except for really difficult problems
which could not be solved by looking at the code and thinking about
what was going on at the logical level.

Tim
 

Ulrich Hobelmann

Roedy said:
Debugging is fun. Programs behave strangely. You get to perform
experiments, make hypotheses, and gradually track down the problem. It
is like a Ruth Rendell mystery novel.

You think it's fun, when all you *really* want is a working program, to
poke its innards until you find the tumor? For hours, maybe?
Debugging is far more entertaining than cranking out code that works
first time. I think people are sometimes unconsciously careless just
to give themselves that challenge later.

More entertaining than constructing a program, a living entity that
actually *does* something useful (hopefully)? That's like *walking* to
work when it takes you only 20 minutes by car...
This may also be the root of why programmers are so reluctant to use
any sort of tools that accelerate, automate or check programming.

Two reasons: learning curve and black magic. Show me a tool that boosts
productivity while still being intuitive, easy to learn, and transparent
enough to see what the hell it does in the background, and I'll take it.
 

Ulrich Hobelmann

Robert said:
With only 8 megabytes of RAM, and not just a simple test/echo loop but
actual useful software: InterNet Explorer / Mail And News
When I tried that in 1998, it took 20 minutes just to download the home
page of AT&T WorldNet, and 5 minutes each time I clicked on the scroll
bar to see a different portion of that home page. Do you know how to
get better response time with only 8 megabytes of RAM?

In 1998 some people were using P2s running at 300 MHz with 32MB RAM or
more. Those machines are now considered garbage by most. *Nobody* buys
them, because you can get better (used) machines that are even more
quiet, several times as fast, and that still cost less than $200.

Go find a decent, old machine that works and sell your Mac to the museum.
 

Roedy Green

It could just be that I'm from an 'old school' that pretty much predated
integrated IDEs with all the snazzy features that are common these
days.

Up until the mid-70s, you could only count on one compile/run a day
per project. You REALLY had to make it count. The alternative was to
book time at 3 AM to have exclusive use of the mainframe.

I think, though, that code which has been watched running catches more
bugs than just examining output does. You need to account for any
unexpected run behaviour, not just unexpected final results.

The problem is, today's programs are so big, there is no way you could
exhaustively watch them over every code path.

Because finding and fixing bugs is so easy, we tend to be more
careless about writing code in the first place. Dr. Kennedy, the head
of our computer science department, bragged that he had never had a
compile error. He was a fanatical desk checker.
 

Gerry Quinn

look- said:
Debugging is fun. Programs behave strangely. You get to perform
experiments, make hypotheses, and gradually track down the problem. It
is like a Ruth Rendell mystery novel.

But then there's the pain when you find the bug after searching for
hours and it's something really really stupid.

- Gerry Quinn
 

Jason Curl

Tim said:
I'm guessing he was referring to the irritating development of
programmers who are not able to debug code without a high-level IDE
which includes a debugger that allows them to step/trace through the
code one line at a time and watch what happens to variables in a watch
window.

IMO this is a really bad trend because it means programmers stop
thinking about their code and how it actually works. This sort of
'debugging' tends to be extremely slow - especially for trivial
bugs which would normally be spotted very quickly by someone who
understands what their code is doing and how it works.

Totally agree. The amount of time I save by "thinking" about my code
(before I write it, and after I write it) is enormous. More often than
not, my own testing doesn't find the problems that could occur, because
it's difficult to set up those test conditions. One hour of not doing any
typing, just performing thought experiments, and I've discovered very
subtle problems with home-brew IPC protocols (via sockets).

Even better, when I am in a rush to release software to somebody, it
works relatively well.
It could just be that I'm from an 'old school' that pretty much predated
integrated IDEs with all the snazzy features that are common these
days. Debuggers were around, but often getting them and your code
configured to use them, loading them up and setting breakpoints etc.,
was more effort than it was worth except for really difficult problems
which could not be solved by looking at the code and thinking about
what was going on at the logical level.

I find that not many people have thought about "unit testing" their new
code. They'd prefer to write a new function of reasonable size (i.e. 3-5
printed pages) and just compile it in with the already-100k-line program,
then test whether it works, to find typos and stupid decisions that would
have been caught if they'd unit tested it with a separate framework. They
keep up the compile/fix cycle until it finally compiles. No wonder it
never works (and takes a day).
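The isolated unit testing Jason is advocating might be sketched like this in Python; the function and its behaviour are invented for the example, and stand in for a new routine destined for a much larger program:

```python
# A new function developed on its own, outside the big program:
def parse_header(line):
    """Split a 'Key: value' header line; raise ValueError if malformed."""
    key, sep, value = line.partition(":")
    if not sep or not key.strip():
        raise ValueError("malformed header: %r" % line)
    return key.strip(), value.strip()

# A standalone harness exercises it before integration, including the
# awkward conditions that are hard to reach inside the full program:
assert parse_header("Host: example.com") == ("Host", "example.com")
assert parse_header(" Key :  padded  ") == ("Key", "padded")
try:
    parse_header("no separator here")
except ValueError:
    pass
else:
    raise AssertionError("malformed header was accepted")
```

Only once this harness passes does the function get compiled into the main program, so integration failures point at the integration, not at typos in the new code.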

For me, it's emacs and the command line. I know exactly what's going on.
And I read my code before I compile it to see that it makes some kind of
sense.

On the other hand, debugging tools are extremely useful when interfaces
that are external don't work (I've had this problem repeatedly with
cygwin assuming a behaviour the same as Linux).

Jason.
 

Wade Humeniuk

Ulrich said:
In 1998 some people were using P2s running at 300 MHz with 32MB RAM or
more. Those machines are now considered garbage by most. *Nobody* buys
them, because you can get better (used) machines that are even more
quiet, several times as fast, and that still cost less than $200.

Go find a decent, old machine that works and sell your Mac to the museum.

Or go even crazier and get a

http://vfxweb.com/index.php?productid=8655

for $29US or if that is too steep, get a

http://vfxweb.com/index.php?productid=7689

for $9US.

At my children's school, a business gave them 150 PIIIs.
(Some of them end up going to the dump
since there are too many to use.) They have a small
mountain of hard drives, cards and stacks of monitors
in one room.

Wade
 

Wade Humeniuk

Ulrich said:
Like all webstores, it's kind of ugly and probably won't display nicely
in lynx ;)

Also, they want 50 bucks shipping in the US. Ouch!

It was just an example. I am sure most everyone living in an urban
area has a store similar to this. Panhandle on the way to the store,
purchase your "new" PC, and hike back.

Wade
 

Phlip

Tim said:
I'm guessing he was referring to the irritating development of
programmers who are not able to debug code without a high level IDE
which includes a debugger which allows them to step/trace through the
code one line at a time and watch what happens to variables in a watch
window.

No. I'm talking about developers who don't write unit tests as they write
code. These provide the option of using Undo, instead of debugging, when the
tests fail unexpectedly.

This leads to a development cycle with highly bug-resistant code, and
without proactive debugging to implement new functions.

Yes, you still need the debugger - typically for legacy situations - and you
still need elaborate debugging skills. New code stays ahead of them.

The idea that we can implement without debugging is incomprehensible to most
programmers. But that really is what I meant.
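A minimal sketch of the test-first rhythm described above, in Python with invented names: each test is written before the code that makes it pass, and the accumulated tests are what make "Undo instead of debugging" possible.

```python
# Tests written first; each would fail until the code below exists:
def test_empty_cart_total_is_zero():
    assert cart_total([]) == 0

def test_total_sums_prices():
    assert cart_total([3, 4]) == 7

# Just enough production code to make the tests pass:
def cart_total(prices):
    return sum(prices)

# Rerun the tests after every small change; if one fails unexpectedly,
# undo the last change rather than reaching for the debugger.
test_empty_cart_total_is_zero()
test_total_sums_prices()
```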
 

Russell Shaw

Phlip said:
No. I'm talking about developers who don't write unit tests as they write
code. These provide the option of using Undo, instead of debugging, when the
tests fail unexpectedly.

This leads to a development cycle with highly bug-resistant code, and
without proactive debugging to implement new functions.

Yes, you still need the debugger - typically for legacy situations - and you
still need elaborate debugging skills. New code stays ahead of them.

The idea that we can implement without debugging is incomprehensible to most
programmers. But that really is what I meant.

You can only write tests before the code if you are very familiar with what
you want and have thought about it for hours/days/weeks. In the process of
making the solution of a very tedious problem clear to me, I can do a *lot* of
throwing away of code and refactoring for weeks at a time. The amount of extra
churn from rewriting unit tests would be hopeless. Refactoring of code *is* my
thought process. Write code that works, then write unit tests to make sure it
keeps working. Gdb/ddd is indispensable.
 

Greg Menke

Phlip said:
No. I'm talking about developers who don't write unit tests as they write
code. These provide the option of using Undo, instead of debugging, when the
tests fail unexpectedly.

This leads to a development cycle with highly bug-resistant code, and
without proactive debugging to implement new functions.

Yes, you still need the debugger - typically for legacy situations - and you
still need elaborate debugging skills. New code stays ahead of them.

The idea that we can implement without debugging is incomprehensible to most
programmers. But that really is what I meant.


It's quite hard and maybe impossible to write unit tests for some (new)
things: OS bootloaders, device drivers, interrupt handlers, etc. In
some circumstances a debugger can be used. Otherwise you have to do it
the old-fashioned way: diagnostic counters, printks, bus analyzers,
sometimes even logic analyzers.

Unit tests are great, but they also have their limits.

Gregm
 

Hartmann Schaffer

Tim said:
...
I'm guessing he was referring to the irritating development of
programmers who are not able to debug code without a high-level IDE
which includes a debugger that allows them to step/trace through the
code one line at a time and watch what happens to variables in a watch
window.

IMO this is a really bad trend because it means programmers stop
thinking about their code and how it actually works. This sort of
'debugging' tends to be extremely slow - especially for trivial
bugs which would normally be spotted very quickly by someone who
understands what their code is doing and how it works.

working with an IDE debugger beats analysing a binary (i.e. hex or
octal) core dump

hs
 

CBFalconer

Hartmann said:
working with an IDE debugger beats analysing a binary (i.e. hex
or octal) core dump

I can't really remember when I last used a debugger. Judicious
printf statements, or the equivalent, have handled everything for
me for years.
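The "judicious printf" style might look like this in Python, using the logging module so the trace lines can be silenced rather than deleted; the function being traced is invented for the example:

```python
import logging

logging.basicConfig(level=logging.DEBUG, format="%(message)s")

def running_mean(xs):
    """Mean of xs, tracing each intermediate step for debugging."""
    total = 0.0
    for i, x in enumerate(xs, start=1):
        total += x
        # The debug trace shows exactly where a wrong value first appears:
        logging.debug("after item %d: total=%s mean=%s", i, total, total / i)
    return total / len(xs)

print(running_mean([2, 4, 6]))  # prints 4.0, after three trace lines
```

Raising the log level to WARNING turns the traces off without touching the code, which is the usual advantage over raw print statements.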
 

Patricia Shanahan

CBFalconer said:
I can't really remember when I last used a debugger. Judicious
printf statements, or the equivalent, have handled everything for
me for years.

The key issue for me is the round trip time to make a change in the
debug output, recompile, and run to the point of failure. If that takes
a few minutes, I have no problem using printouts.

On the other hand, I have been faced with problems in unfamiliar
programs that took several hours from starting the run to first symptoms
of the problem. Once I was at a failure point, I wanted to squeeze every
scrap of data I could.

An interactive debugger allows you to ask questions you didn't know you
wanted to ask until you saw the answer to another question. For example,
you can see which variable is incorrect at the failure point, look at
the source code to find the variables that affect it, and immediately
probe their values.
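That ask-questions-at-the-failure-point workflow is what a post-mortem debugger gives you; a hedged Python sketch, with the failing function invented for the example:

```python
import pdb
import sys

def lookup(table, key):
    return table[key]          # raises KeyError when the key is missing

try:
    lookup({"a": 1}, "b")
except KeyError:
    # Drop into the debugger at the point of failure (only when a
    # terminal is attached); from there you can `p table`, `p key`,
    # walk the stack with `up`/`down`, and ask the follow-up questions
    # that the first answer suggests, without re-running the program.
    if sys.stdin.isatty():
        pdb.post_mortem(sys.exc_info()[2])
```

For a run that takes hours to reach the failure, avoiding the re-run is the whole point.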

Patricia
 

Phlip

You can only write tests before the code if you are very familiar with what
you want and have thought about it for hours/days/weeks.

We ain't talking about white-box acceptance tests.

If you can think of the next _line_ of code to write, you must perforce be
able to think of a complementing test case that would fail if the line were
not there. Just make writing the test case part of writing that line.
In the process of making the solution of a very tedious problem clear to me,
I can do a *lot* of throwing away of code and refactoring for weeks at a time.

That's not "refactoring", it's "rework". But it's okay - more below.
The amount of extra churn from rewriting unit tests would be hopeless.

There are two forks to the answer to this.

One fork: Those tests support you because they should _decouple_ from the
tested code. After you have that line, you can move it around (_that's_
refactoring), screw with it, rewrite it, etc. The tests _support_ this
noodling around, when you run them over and over again. If a test fails
unexpectedly, you use Undo until it passes, then try another change.
Refactoring of code *is* my thought process. Write code that works, then write
unit tests to make sure it keeps working. Gdb/ddd is indispensable.

Here's the other fork of the answer:

Software Engineering is 99% inspiration and 1% perspiration. Harnessing any
kind of creativity is priceless. Until you learn TDD enough to use it
creatively, you might like to work out designs and algorithms using
code-and-fix (or UML diagrams, or Tarot cards, or LSD, or NBC, or whatever).

When you like a solution, convert it into robust code by using TDD to
recreate it. If you already know the code's behaviors and design, you must
perforce be able to think of a series of test cases that will force the code
to reappear in your production codebase.

While I find TDD for creativity very useful, TDD to reconstitute an existing
module is as easy as falling off a log.
 

Phlip

Greg said:
It's quite hard and maybe impossible to write unit tests for some (new)
things: OS bootloaders, device drivers, interrupt handlers, etc. In
some circumstances a debugger can be used. Otherwise you have to do it
the old-fashioned way: diagnostic counters, printks, bus analyzers,
sometimes even logic analyzers.

Unit tests are great, but they also have their limits.

Don't use the debugger to delay the inevitable: Writing an emulator for your
CPU.
 
