That entirely depends on what you need to do. Text manipulation in
Perl is generally very fast, and it is often not trivial to write a set
of tools that will outperform most of Perl's text manipulation.
Yes, it can: if you use Perl for things it wasn't designed for, or if
you need to start up your program many, many times for very short run
times (although there are ways around that).
C will usually beat Perl hands down. In this simple example, C is over
3 times faster.
That depends entirely on your processing and how much of Perl's
internal trickery you can use.
Apart from that, Perl programs generally have some startup cost
(parsing and compiling) that you need to factor out. If you have a
program that spends a large amount of time (relative to the startup
time) processing text, Perl might actually beat a C program.
There are many things that Perl is really, really fast at, for which
you would have to write large amounts of C code to achieve the same
speed. The equivalent Perl program is likely to have many fewer lines
of code, especially if builtin regular expressions, grep, map and
other Perl niceties can be used.
#include <stdio.h>

int main(void) {
    int i = 0;
    char line[BUFSIZ + 1];

    while (fgets(line, BUFSIZ, stdin) != NULL) {
        i++;
    }
    printf("-> %d\n", i);
    return 0;
}
The C program above, of course, is trivial stuff: you have fixed the
input line length, and don't correctly deal with possibly longer
lines. You're not manipulating the contents of the line buffer in any
way (even determining the length of the string, or replacing parts of
the text with something else, would possibly already be slower than
the Perl equivalents). Perl strings are much smarter than C strings
and, depending on what you do, certain operations on them can
outperform the C equivalents.
length($buffer) is much faster than strlen(buffer) in the general
case, and O(1) instead of O(n). For long strings, this matters a lot.
The s/// operation in Perl is fast: probably no slower than calling a
regular expression library from C (even Perl's own engine) and
replacing the matched text manually; the amount of code to write and
maintain, however, is much, much smaller.
You don't factor out the startup cost for the Perl program. Only
if your program has to run very often for small files would that
startup cost be important. If your program has to run once, for a
large file, the startup cost is likely to be insignificant.
In other words: It all depends, but your example is not at all
representative, or in any way indicative of what the OP could
realistically expect.
I generally use Perl first. If I then decide that there are
performance problems that can't trivially be fixed by using decent
hardware, I see whether I can extract some of the code, and rewrite
the slow bits in XS or C (with Inline::C), or maybe move part of the
processing to a server component that's faster. Writing everything in
C (or another machine-compiled language) is something I generally only
consider for something that I know, beforehand, is time-critical, and
for which I can predict that Perl will be too slow, e.g. numerical
computations (even though there are modules for many computation
intensive things, like PDL).
Without a lot more information, it is impossible to predict whether
Perl or C will run faster for the OP. It is likely, however, that the
amount of development time the OP has to spend will be much smaller
with Perl, and time-critical parts of the code could still be written
in C.
Martien