Kelsey Bjarnason
[snips]
CPU usage only accounts for a small part of the run time of a program,
so increasing the CPU usage by 63.9% (or conversely, reducing it by 39%)
doesn't change the overall run time much. (As an example, look at all
those programs written in scripting languages: they typically have a lot
more overhead than 60%, yet the performance is adequate.)
For some things, sure. Not for others.
I've worked on several small apps written in things such as PHP where
language features made the scripting approach more appealing than using C
or C++, but when doing testing the actual performance was so poor that I
tossed 'em entirely and went back to C.
If code exists in an already latency-laden environment, such as a web
page, a little extra overhead doesn't matter, particularly when the code
doesn't actually do all that much; even with a lot of overhead compared to
C, the net result isn't much in absolute terms. Now try the same thing
with, oh, a chess-playing program - the overhead is quite sufficient to
render the app essentially useless.