Seebs
The scientist is interested in *doing the calculation*. The fact that
it takes him less time to write and debug the code -- because he
*doesn't* have to fuss with manual memory management, because he
*doesn't* have to think about the nuts and bolts of string parsing,
because he *doesn't* have to think about whether to roll his own
hash/dictionary/map library or to decide which publicly available one to
use -- is far more important than the fact that he could get results 8%
faster if he wrote the code in C instead of Perl, Python, or Ruby.
Because the first time his genetic sequence analysis code crashes 36
hours into a 48-hour run because of a dangling pointer, that 8% time
savings means very little.
A couple of points:
1. 8% is not a reasonable estimate. In my experience it's usually a factor
of two or three, though that varies widely among scripted languages.
2. Back in the day, one of the posters here, whose name I've forgotten
(Tanmoy, last name started with a B?), pointed something out, in a debate
between C and Ada users. It was to the effect of (paraphrased):
I pay for computer time, and I am paid for programming time. If
I spend twelve months writing something that will complete execution
in five months, that is better than if I spend ten months writing
something that will complete execution in six months.
Consider the case of tasks being run on supercomputers. There was a press
release a while back about some people who were building a supercomputer
cluster based on the Cell microprocessor. The *power savings* of running
the more-efficient CPU instead of a conventional CPU were in the millions
of dollars. At that point, a 10% reduction in processing time to complete
a task could be enough to pay for a couple of years' development effort...
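The arithmetic behind the Tanmoy-style argument is worth spelling out. The dollar figures below are pure assumptions I've invented for illustration; the structure is what matters: when compute time is billed and programmer time is what you're paid for, the slower-to-write, faster-to-run program can win.

```python
# Hypothetical figures, purely to make the trade-off concrete.
DEV_COST_PER_MONTH = 10_000      # assumed cost of one programmer-month
COMPUTE_COST_PER_MONTH = 50_000  # assumed supercomputer-class billing rate

def total_cost(dev_months, run_months):
    # Total spend: months of programming plus months of machine time.
    return dev_months * DEV_COST_PER_MONTH + run_months * COMPUTE_COST_PER_MONTH

# The two scenarios from the paraphrased quote above:
fast_but_slow_to_write = total_cost(12, 5)   # 12 dev months, 5 run months
slow_but_quick_to_write = total_cost(10, 6)  # 10 dev months, 6 run months
```

With these (made-up) rates, the twelve-month effort costs $370k against $400k for the ten-month one, even though its total elapsed time is longer -- the expensive resource dominates.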
Also, I should point out: It is not at all impossible for scripted languages
to crash due to crazy bugs. I found a beautiful and very hard to reproduce
bug in the Ruby<->PostgreSQL bindings once, the net result of which was that
you could VERY occasionally get data corruption under circumstances where
you bound a large number of variables into a query and at least a few of
them were not strings to begin with, but objects of other sorts which had a
possible string representation. You may rest assured, this was harder to
debug in Ruby than it would have been in C...
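For flavor, here is a Python sketch of the same *class* of bug -- not the actual Ruby<->PostgreSQL defect, just an analogous case where an object that merely *has* a string representation gets quietly stringified on its way into a query:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE reads (seq TEXT)")

value = b"ACGT"  # not a string, but it certainly has a string representation

# Naively coercing the bound value to str() silently corrupts the data:
conn.execute("INSERT INTO reads VALUES (?)", (str(value),))
stored = conn.execute("SELECT seq FROM reads").fetchone()[0]
# stored is now "b'ACGT'", not "ACGT" -- quietly wrong, and nothing crashed.
```

No segfault, no exception, just wrong bytes in the database -- which is precisely why this sort of thing is so miserable to track down in a dynamic language.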
-s