C in Science and Engineering...


Rui Maciel

bart.c said:
The point is not having to bother with these details (together with all
their clutter, and potential for bugs) especially with code that is being
revised over and over again.

Quite bluntly, if someone is unable to figure out how a linked list works and believes that it's a
source of "clutter", along with being a potential source of bugs, then if that person intends to
write a science/engineering... Well, let's say that handling a linked list is the least of his
problems.
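
For reference, a singly-linked list with an append operation takes only a
handful of lines of C. A minimal sketch (with nothing beyond a malloc check
for error handling):

#include <stdlib.h>

struct node {
    double value;
    struct node *next;
};

/* Append a value at the tail; returns the (possibly new) head.
   On allocation failure the list is left unchanged. */
static struct node *list_append(struct node *head, double value)
{
    struct node *n = malloc(sizeof *n);
    if (n == NULL)
        return head;
    n->value = value;
    n->next = NULL;
    if (head == NULL)
        return n;
    struct node *p = head;
    while (p->next != NULL)
        p = p->next;
    p->next = n;
    return head;
}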

By having to deal with an element at a time?

And why exactly do you believe you are forced to "deal with an element at a time"?

Typically such a language will use the most accurate type for floating
point. But you don't have to tell it what to use, and it's possible also
to have lists of mixed numeric types.

The word "typically" also reads as "the user doesn't really know what's going on". There's a lot
of science invested in evaluating margin of errors involving FP operations, all of which is
rendered useless once that info (or, better yet, the control over that parameter) is taken away
from you. And what good is your calculation for if you don't have a way to know how far off it
may be?
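
To make this concrete, here is a small C program (just a sketch) showing how
the choice of floating-point type changes the accumulated error; that is
precisely the information that disappears when the language picks the type
for you:

#include <stdio.h>
#include <float.h>

int main(void)
{
    /* Sum 0.1 ten million times in single and in double precision.
       The exact result would be 1000000. */
    float  fsum = 0.0f;
    double dsum = 0.0;
    for (long i = 0; i < 10000000L; i++) {
        fsum += 0.1f;
        dsum += 0.1;
    }
    printf("float  sum: %f (machine epsilon %g)\n", fsum, FLT_EPSILON);
    printf("double sum: %f (machine epsilon %g)\n", dsum, DBL_EPSILON);
    /* The float sum is visibly off; knowing which type is in use is
       what makes the error analysable in the first place. */
    return 0;
}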

For someone who's a clumsy typist, there is just more opportunity for
mistyping. Again, for code that is being revised every few seconds, a lot
of time is wasted dotting i's and crossing t's that will never make it to
the final version.

Quite bluntly, this is a terrible excuse. If you can't even manage to type what you mean then
what makes you believe that your program is working right, let alone as you expected it to work?

You've chosen an extreme example. There are languages that are not quite
300 times slower than C, and yet offer rapid development.

How exactly is this an "extreme example"? It's an objective benchmark which provides a
quantitative comparison of a set of parameters which can't simply be ignored.

Moreover, we aren't discussing general purpose applications, where the most demanding part of the
program is how fast it can repaint a button. We are discussing an application domain where
performance is king, where people spend a lot of money on high-end systems to be able to crunch
numbers a tidbit faster than before. In some cases people even pay good money to run their
programs on specific computers, a service that comes with a price tag proportional to the
amount of time it takes for the program to run.

Anyway it's also possible to use such a language to develop an algorithm
(trying perhaps dozens of approaches), then to write the final version in
C.

It's also possible to develop algorithms with pen and paper. That doesn't mean that we should
suddenly drop C in favor of a Bic pen and a notepad.


Rui Maciel
 

Rui Maciel

Charlton said:
RM> Linked lists are terribly basic data structures that don't pose
RM> any challenge to anyone remotely invested in writing software.

But most of the people involved in scientific computation are *not*
interested in writing software. They're interested in getting the
answer to a complex question that involves significant computation.

That doesn't make any sense. Are you actually claiming that people whose living consists of
writing software are not interested in writing software?

RM> And what would your reaction be if you were able to go from those
RM> 48-hour runs to some other run time between 24h and less than 2
RM> minutes just by picking up the right language for the job?

If you can actually cut the run time *in half*, and you don't double the
development time by doing so, then go for it.

That does not match anything I've ever observed in practice, but hey,
it's your fantasy.

I don't deal with fantasies. I deal with measurable, objective results. What's great about this
is that you simply can't contradict a fact with vague, subjective claims about some anecdotal
observation that no one can be sure you even made, let alone whether it is remotely close to
being true.


Rui Maciel
 

jacob navia

Rui Maciel wrote:
Looks like they did a terrible job with their choice, as the performance hit goes a bit beyond 8%

http://www.bioinformatics.org/benchmark/results.html


Rui Maciel

Please do not disturb Mr Wilbur with FACTS. As everybody knows, FACTS
are just unimportant. Much more important is to avoid C.

Those benchmarks tell the whole story. In some of them perl is 400 times
slower than C. This means a calculation that takes a minute in C will
take 400 minutes (more than 6 hours) in perl!

Obviously waiting more than 6 hours is OK...
 

jacob navia

Rui Maciel wrote:
I don't deal with fantasies. I deal with measurable, objective results. What's great about this
is that you simply can't contradict a fact with vague, subjective claims about some anecdotal
observation that no one can be sure you even made, let alone whether it is remotely close to
being true.

Facts are stubborn.

Thanks for reporting them here.
 

Jens Thoms Toerring

chutsu said:
I think I've started a flame war.

Looks like you managed to do that;-)
Getting back to the point, as an
undergrad physicist I understand (or try to) that there are many tools
for the job, but when I look at stuff available on the internet, being
the "New Generation" we tend to like newer things like scripting
languages such as Python and Ruby. My reason for posting the question
was to see whether C will cease to be used in future, and to
determine the "General Language" scientists and engineers alike use.

Perhaps you should clarify what you mean by "scientific
programming". There are at least four domains I can think of:

a) not too complicated data manipulation of experimental data
b) not too complicated modelling
c) heavy number crunching applications
d) controlling the hardware for an experiment

Note that the "not too complicated" bit in a) and b) obviously
depends on time - things that were considered "heavy number
crunching" 15 years ago now are easily done on a slow desktop
machine. I once had to resort to writing an integer-based FFT
in assembler back then because it was too slow even in C, but
with a not too ancient machine you can do the same nowadays
much faster in an interpreted scripting language...

For a) and b) I would guess that "scripting languages" can be
used quite successfully, especially if there are plug-ins (or
whatever you want to call them) for a lot of the functionality needed
(I guess the success of MATLAB etc. is due to so many "toolboxes"
being available). If you want to simulate global weather patterns
I guess nobody would use Python or Ruby etc., you will typically
find that stuff is written in FORTRAN, C or C++, especially due
to the availability of very well-tested libraries. And for
controlling the experiment's hardware a lot of the programs will be
written in C or C++ or in the equivalent of a "scripting language",
LabView. (And, of course, that again depends on the size of the
experiment, a "simple" spectrometer is quite a different beast
from e.g. an electron synchrotron).
Obviously one cannot predict the future, and I accept that. From what
I have learnt by scanning through the replies, Fortran is
used. However, I personally don't like Fortran all that much, because:
1.) It seems only old men aged 40~50+ use it

Do you have a problem with people in that age bracket?;-) I can
assure you that being 50 instead of 25 doesn't make that much of
a difference - the change from 15 to 25 is a lot bigger. All
you gain (hopefully) is a bit of experience in not falling for
the latest hype anymore since you have seen that too many times;-)
2.) It's not very widely used in anything other than science

Since you're asking about programming in science explicitly I do
not see your point.
3.) GNU seems to only have a compiler for the Fortran 95 dialect (am I
right?)
4.) Fortran keeps changing, trying to be something it's not by adding
Object Oriented features...

The last time I had to compile a rather complex FORTRAN 77 program
with gcc I didn't experience too many problems (but that was,
admittedly, a few years ago)...
Note, I think the last point is quite important, because when thinking
in the long term, when one creates a piece of code, I would like to
keep it for a loooonnnnggg time. I want to be able to write reusable
code, so 10-20 years down the line I'll still be able to use functions
or libraries I've created. Scripting languages, like python and ruby,
change so much that they aren't backwards compatible. Like in Python
3000, or ruby 1.8 to 1.9. One has to change their old code to work
with the new, but really? How stupid is that?

Ok, what kind of stuff do you want to write? If it's not too
complicated data manipulation nobody (not even you) will care
in 20 years - there will be far better methods then. If I
need to do something that I wrote a program for 15 years ago
I nearly never try to use what I did back then without
modifications - I normally will have a look at how I did it but not
just copy and paste it, that hardly ever looks feasible (and I
don't really trust stuff I wrote 15 years ago - one of the (few)
good things about becoming older is learning to avoid some of the
mistakes one made before;-)

If it's real heavy number crunching then you probably won't write
it all on your own as an undergraduate. You will most likely have
to use what others with a bit more of experience picked for the
project.

And if you're into hardware control then a scripting language
may not give you enough freedom to do what needs to be done.
Or you may pick a "mixed approach" - write the bits that need
low-level access to hardware in C and use that in a wrapper
written in a scripting language.
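
The C half of such a mixed approach can stay very small. Here is a sketch
(the routine name is made up) of a time-critical loop compiled into a
shared library that a scripting language can then call through its foreign
function interface, e.g. Python's ctypes:

/* sumsq.c - build with something like:
   cc -O2 -shared -fPIC -o libsumsq.so sumsq.c */

/* Hot inner loop kept in C; setup, I/O and presentation stay in the
   scripting language. */
double sum_of_squares(const double *x, long n)
{
    double s = 0.0;
    for (long i = 0; i < n; i++)
        s += x[i] * x[i];
    return s;
}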
Sometimes I wish there was a decent scripting language, mature
and stable, that doesn't change every 5 or 10 years...

MATLAB (and Octave) and Perl look most like what you're asking
for here. And they have a lot of useful libraries. Python and Ruby
are the "new kids on the block" and when they mature for this
problem domain they may become very interesting contenders. If
you like Python or Ruby you personally could make a difference
by writing good libraries/modules for them and thus help adoption
in the science community;-) Just don't expect any language
to become the "silver bullet", that won't happen.

Regards, Jens
 

bart.c

Rui Maciel said:
bart.c wrote:

Quite bluntly, if someone is unable to figure out how a linked list works
and believes that it's a
source of "clutter", along with being a potential source of bugs, then if
that person intends to
write a science/engineering... Well, let's say that handling a linked
list is the least of his
problems.

I was talking of the problems of managing flexible, random-access arrays. I
like to be able to write:

a &:= x

or some such syntax, to append an element to array 'a', and worry *only*
about what I want to do with this data, rather than the inherent memory
management problems.
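
To illustrate, the C equivalent of that one-line append is the usual
growable-array bookkeeping. A minimal sketch of the clutter I mean:

#include <stdlib.h>

/* A growable array of doubles: the state C makes you carry around. */
struct dynarray {
    double *data;
    size_t  len;
    size_t  cap;
};

/* Append x to a; returns 0 on success, -1 on allocation failure. */
static int dynarray_append(struct dynarray *a, double x)
{
    if (a->len == a->cap) {
        size_t newcap = a->cap ? a->cap * 2 : 16;
        double *p = realloc(a->data, newcap * sizeof *p);
        if (p == NULL)
            return -1;
        a->data = p;
        a->cap = newcap;
    }
    a->data[a->len++] = x;
    return 0;
}

Every call site then has to check the return value as well, which is the
kind of ceremony that 'a &:= x' hides.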
And why exactly do you believe you are forced to "deal with an
element at a time"?

You must know something about C that I don't then.
The word "typically" also reads as "the user doesn't really know what's
going on". There's a lot
of science invested in evaluating the margins of error involved in FP
operations, all of which is
rendered useless once that information (or, better yet, control over that
parameter) is taken away
from you. And what good is your calculation if you don't have a way
to know how far off it
may be?

The chances are the floating-point format will be 64 bits; there are not so
many alternatives.
Quite bluntly, this is a terrible excuse. If you can't even manage to
type what you mean then
what makes you believe that your program is working right, let alone as
you expected it to work?

Nevertheless, C is fiddlier to type compared with 'easier' languages not
based on the same syntax:

for (int i=1; i<=n; +=i) /* common typo left in */

for i:=1 to n do

14 tokens vs. 7 (and half the time the latter can be written as 'to n do').
(Notice also the 'i' in the C-version appears 3 times; how many times have I
written that third one especially as j or n instead...)
How exactly is this an "extreme example"? It's an objective benchmark
which provides a
quantitative comparison of a set of parameters which can't simply be
ignored.

You've chosen Ruby, which is quite slow anyway. Python is faster and some
special versions of it, I understand, are very fast. There's also LuaJIT, which
appeared to rival C for speed. Functional languages have been mentioned too
(although they are not for me).

I've also had a go myself; my last interpreted language was 4-5x slower than
C in floating point benchmarks, and I'm now working on a compiled version
that retains its rapid development features.
Moreover, we aren't discussing general purpose applications, where the
most demanding part of the
program is how fast it can repaint a button. We are discussing an
application domain where
performance is king, where people spend a lot of money on high-end systems
to be able to crunch
numbers a tidbit faster than before.

One approach is to use the fastest possible language to implement the
essential part of a calculation, then use something like a scripting
language to glue everything together and perhaps provide a user-interface.
Using C for everything is silly.
It's also possible to develop algorithms with pen and paper.

It's not so easy, however, to try them out and find out that they don't work.
 

Seebs

That doesn't make any sense. Are you actually claiming that people whose living consists of
writing software are not interested in writing software?

Seems likely to me. Software is a tool for getting a job done. Sometimes,
if you can't get cost-effective rates on having someone else write the
software you need, it may be reasonable for you to write it yourself, even
though writing software is not what you are actually trying to accomplish.

-s
 

Ben Bacarisse

jacob navia said:
Rui Maciel wrote:

Please do not disturb Mr Wilbur with FACTS. As everybody knows, FACTS
are just unimportant. Much more important is to avoid C.

Unfortunately the link to the source code did not work for me so it's
hard to check the details. If anyone gets it to work, please post.
Those benchmarks tell the whole story. In some of them perl is 400 times
slower than C. This means a calculation that takes a minute in C will
take 400 minutes (more than 6 hours) in perl!

Obviously waiting more than 6 hours is OK...

The author's software page lists six items. Two use Java and four use
Perl. It would be interesting to find out why he did not use C for any
of them.
 

Charlton Wilbur

RM> That doesn't make any sense. Are you actually claiming that
RM> people whose living consists of writing software are not
RM> interested in writing software?

No. I'm claiming that people whose living consists of finding the
answers to complex and subtle questions are interested in doing that;
they only write software because that is how they find the answers to
those complex and subtle questions.

Charlton
 

Charlton Wilbur

S> Interesting. I guess it depends a lot on various factors. For
S> instance, multi-layered loops tend to be very expensive, while
S> string operations are usually not noticeably more expensive in
S> perl than in C. For mathematical operations, though, I would
S> expect to see a pretty significant cost to the bytecode
S> interpreter.

Also, the performance advantage of C goes away rather quickly if you're
bound by anything other than processor. If you're disk bound or I/O
bound -- or if you're waiting for human interaction -- it doesn't matter
whether the processor is pegged or is 95% idle.

S> I don't know.

That's the $64,000 question.

Charlton
 

Charlton Wilbur

RH> You would have much more credibility in this newsgroup if you
RH> spent more time arguing the issues and less time trying to
RH> ridicule your opposition.

Don't worry - I've seen Jacob frothing at the mouth before, and I am
prepared to accord to him exactly as much credibility as he's earned.

Charlton
 

Keith Thompson

jacob navia said:
Exactly, that is why perl is GREAT for throw-away software.

You want a program that will do something, then be thrown away?

Use perl. In one incomprehensible line that will give good
results in perl 5.1.2 under RedHat 7.5 it will run perfectly.

You got perl 4.xx or perl 6.xx?

Bad luck

You want it to run under windows?

Bad luck.

Perl is write-only software. Write it, use it once or twice,
then forget it...

Nobody will be able to debug it, or to understand it later.

This is wrong. If it were topical here, or if you were interested
in my opinion, I'd explain why.
 

blmblm

[ snip ]
I think I've started a flame war.

You think? What, you thought you could raise the issue of this
language versus that one and *not* get a certain amount of heat? :)
Getting back to the point, as an
undergrad physicist I understand (or try to) that there are many tools
for the job, but when I look at stuff available on the internet, being
the "New Generation" we tend to like newer things like scripting
languages such as Python and Ruby. My reason for posting the question
was to see whether C will cease to be used in future, and to
determine the "General Language" scientists and engineers alike use.

Obviously one cannot predict the future, and I accept that. From what
I have learnt by scanning through the replies, Fortran is
used. However, I personally don't like Fortran all that much, because:
1.) It seems only old men aged 40~50+ use it
2.) It's not very widely used in anything other than science

No comment about whether either of these points is true, but what if
they are?
3.) GNU seems to only have a compiler for the Fortran 95 dialect (am I
right?)

No. The Fortran compiler that's part of gcc compiles Fortran 95 (and
Fortran 90), but it also compiles FORTRAN 77. Indeed, aside from
differences in source-code format (FORTRAN 77 is "fixed form", while
Fortran 90 and later can be "fixed form" or "free form"), as far as
I know a valid FORTRAN 77 program is also a valid Fortran 95 program.
4.) Fortran keeps changing, trying to be something it's not by adding
Object Oriented features...

It's true that the language continues to evolve, but one of the
reasons people like it, again as far as I know, is that it allows
them to (continue to) use library code developed over many decades.
Backward compatibility is a powerful force... As you seem to agree:
 

Tim Streater

"bart.c said:
The chances are the floating-point format will be 64-bits; there are not so
many alternatives.

A colleague at CERN tells me that the physics analysis programs tend to
be written in C++ these days; FORTRAN is out. But either way you need to
have confidence in what the floating point is doing.
Nevertheless, C is fiddlier to type compared with 'easier' languages not
based on the same syntax:

for (int i=1; i<=n; +=i) /* common typo left in */

Is this going to compile, much less execute properly?
for i:=1 to n do

14 tokens vs. 7 (and half the time the latter can be written as 'to n do').
(Notice also the 'i' in the C-version appears 3 times; how many times have I
written that third one especially as j or n instead...)

Might be nice if a simple loop is all you want. What if you want
something more complex?
 

bart.c

Tim Streater said:
Is this going to compile, much less execute properly?

No; the += was an actual typo; I decided to leave it in.

The 'int i' I put in as a reminder that a declaration for i is also needed
(declaring it outside the loop is a bit more hassle).
Might be nice if a simple loop is all you want. What if you want
something more complex?

90% of the time, a simple loop, with or without an index, is all that's
needed. For complex loops, the C version can work well (and I've included a
variation of it in one of my projects).
 

chutsu

[ snip ]
You think?  what, you thought you could raise the issue of this
language versus that one and *not* get a certain amount of heat?  :)

well I never really did raise the issue of C itself but the
applications of it. I was sort of expecting a YES or NO, to see if C is
being used in science. I asked whether, with the rise of scripting
languages (Python, Ruby), one still uses compiled languages such as C to
do scientific analysis. But I guess it kinda went haywire...
 

Nick Keighley

Charlton Wilbur wrote:
I mean, here's a task.  [...]  How many lines of C code is that?
I can accomplish it with one line of Perl.  An inexperienced Perl
programmer can probably do it in less than two dozen.

Exactly, that is why perl is GREAT for throw-away software.

you can write readable perl. You can write maintainable perl.

You want a program that will do something, then be thrown away?

Use perl. In one incomprehensible line that will give good
results in perl 5.1.2 under RedHat 7.5 it will run perfectly.

You got perl 4.xx or perl 6.xx?

Bad luck

you might have a point here. I remember C++ going through a phase a
bit like that.
You want it to run under windows?

Bad luck.

perl runs under Windows. In fact that's one of its attractions to me.
It's a portable scripting language.
Perl is write-only software. Write it, use it once or twice,
then forget it...

Nobody will be able to debug it, or to understand it later.

this isn't actually true. Or needn't be true
 

Charlton Wilbur

NK> you can write readable perl. You can write maintainable perl.

I think the project that currently pays my salary, which is about
500,000 lines of readable, maintainable Perl, and which generates an
amount of revenue measured in the hundreds of millions of dollars, is
proof of that.

Of course, the company *also* uses C++ for a Windows-specific client,
database code in our database engine's quirky stored procedure language,
and Python for a configuration and system administration language.
They've also tried a Java pilot project, but that hasn't gone so well.

NK> you might have a point here. I remember C++ going through a
NK> phase a bit like that.

Perl is much like C in that regard. You can write Perl that runs only
on RedHat (and RedHat was in the habit for a long time of subtly
customizing their Perl builds), and that relies on bugs in particular
versions of the interpreter; but this is really no different than
writing C code that takes advantage of a particular compiler's
resolution of undefined behavior.

NK> perl runs under Windows. In fact that's one of its attractions
NK> to me. It's a portable scripting language.

In that it's *exactly* as portable as C is. If you write a Perl script
that uses Perl abstractions for I/O and interfacing with the system,
it's completely portable. If you insist on using platform-specific
features, well, surprise, they aren't portable to other platforms!

NK> this isn't actually true. Or needn't be true

Why should being *true* or *false* get in the way of anything Mr. Navia
chooses to believe?

Charlton
 
