Rui Maciel said:
I see what you mean and, from your example, you have a point.
Nonetheless, in this case the application domain is restricted to
scientific computing. Basically, that means that all an application
must do is input data, crunch numbers and output the result. I doubt
that adopting an interpreted language such as Ruby or Python will help
anyone handle data input and output, and I really doubt that those
languages' "expressiveness" will be of any use when writing code to
crunch numbers.
Given that, why would anyone believe that writing this sort of program
in Ruby or Python is easier than writing it in some other
well-established language such as C? It may appear easier when dealing
with GUIs or dynamic web content, but number-crunching? I don't see
how.
Most of the engineers I work with use Fortran. Some even use a recent
version of Fortran; many do not (even when they think they do).
This is largely because scientists and engineers like the reliability
that comes with long-established models. One very important piece of
research code we use models atomic behaviours. It's been in continual
development since 1971, and literally hundreds of research papers have
been produced using this software. Resistance to changing languages
is well founded and largely relates to:
- Reliability: this specific implementation has been under constant
scrutiny for decades. No software is bug-free, but this one is
now very well understood.
- Comparability: there is real concern that even slight changes in
the effective computation mean that the results can't be reliably
compared to historic results - which means a break in the research
results.
These things are important.
Onto your point, though. This software takes input data, crunches
numbers and outputs the results. Because of the reasons listed above,
no-one is going to move to rewrite this (right now) in a language like
Ruby, but I think you oversimplify the needs. In particular, the
complexity of the data representation and of the computation itself.
Languages like Ruby might not be any better at the I/O, but the stuff
in between may well be more elegantly expressed in such a language. I
don't think you can be as black-and-white as to say it has no place in
scientific computing.
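To illustrate what I mean by the "stuff in-between" (the workload and function here are invented for the example, not taken from any real atomic-modelling code): a nearest-neighbour-distance pass over a set of atomic coordinates is a few readable lines in Python, where the comprehensions sit close to the maths.

```python
import math

def nearest_neighbour_distances(atoms):
    """For each atom, the distance to its closest neighbour.

    `atoms` is a list of (x, y, z) tuples. This is a plain O(n^2)
    sweep -- the point is expressiveness, not performance.
    """
    def dist(a, b):
        return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

    return [min(dist(a, b) for b in atoms if b is not a)
            for a in atoms]

# Three atoms on a line, one unit apart: each atom's nearest
# neighbour is exactly one unit away.
atoms = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
print(nearest_neighbour_distances(atoms))  # [1.0, 1.0, 1.0]
```

Whether that beats the equivalent Fortran loop nest is a matter of taste, but it's not obviously worse.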
Another key issue continually biting scientific computing is good
language support for parallelism. Bolt-on solutions like MPI are good
(and extensively used) but, as these issues really start to bite, it
wouldn't surprise me
if it forced a full redesign of these models to fully exploit modern
hardware's potential.
At that point, a lot of the arguments in favour of sticking with
existing code are washed away. They may well jump to some sort of
extended Fortran, C or whatever...but they might be more adventurous
and choose the language which provides the "best" way to express their
models (in terms of performance, elegance, maintainability,
provability) rather than simply using the same language they've always
used.
Would it be Ruby or Python? I don't know. I do know a fair few of our
Computer Scientists and Electronic Engineers have used NumPy in their
research...
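For what it's worth, the appeal of the NumPy style is the same "stuff in-between" point as above: whole-array operations with no explicit loops, reading close to the maths (the polynomial here is an invented example, assuming NumPy is installed).

```python
import numpy as np

# Vectorised number-crunching in the NumPy style: evaluate a
# polynomial at many sample points, then reduce over the array,
# with no explicit Python-level loop.
x = np.linspace(0.0, 1.0, 5)        # 5 evenly spaced points in [0, 1]
y = 3.0 * x**2 - 2.0 * x + 1.0      # element-wise polynomial evaluation

print(y)           # per-point results
print(y.mean())    # reduction over the whole array
```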