Double increment question.

av

Robbie said:
[...]
memcpy did occur to me, yes. But I'm writing a program which I want to
be as small and fast as possible, so I'm doing things "manually". [...]

Digging a hole in the ground is a wearisome and tedious
task, and I'd like it to take as little time as possible.
That's why I told that guy with the backhoe to go somewhere
else, threw away my silly old shovel, and am now "doing things
manually" by scrabbling in the dirt with my fingernails. ;-)

More seriously, it seems more than a little likely that
you are committing the sin of premature optimization. Until
and unless you have MEASURED a performance problem -- not
hypothecated, not supposed, not "it stands to reason-ed" --
until you have made MEASUREMENTS it is irresponsible folly to
micro-optimize.

"Premature optimization is the root of all evil."
-- D.E. Knuth

don't know
"We follow two rules in the matter of optimization:
Rule 1: Don't do it.
Rule 2 (for experts only): Don't do it yet."
-- M.A. Jackson

these rules are wrong
"More computing sins are committed in the name of efficiency
(without necessarily achieving it) than for any other single
reason, including blind stupidity."
-- W.A. Wulf

this can be for the c language too (that would find the speed with 0
terminated string but find only difficulties and buffer overflows)
it seems i can think of a string class that do everything more simple
and with no possible errors in each its 'atomic' operations
In other words, I'm not the only person crying that ab initio
micro-optimization is folly; smart people do so, too. Be smart.

i not agree if assembly can reduce the executable dimension for a
factor 10 and the code is in the cache of cpu
 
av

If it works that way with the one, I suppose it does with the
other. However, my usual way of copying strings in C++ is:

std::string str1 ("Fred"); // make string str1 containing "Fred"
std::string str2 (str1); // make string str2 and copy str1 to str2

A bit simpler than in C. :)

in c++ all can be more simple and secure than c
 
Flash Gordon

av said:
Robbie said:
[...]
memcpy did occur to me, yes. But I'm writing a program which I want to
be as small and fast as possible, so I'm doing things "manually". [...]
Digging a hole in the ground is a wearisome and tedious
task, and I'd like it to take as little time as possible.
That's why I told that guy with the backhoe to go somewhere
else, threw away my silly old shovel, and am now "doing things
manually" by scrabbling in the dirt with my fingernails. ;-)

More seriously, it seems more than a little likely that
you are committing the sin of premature optimization. Until
and unless you have MEASURED a performance problem -- not
hypothecated, not supposed, not "it stands to reason-ed" --
until you have made MEASUREMENTS it is irresponsible folly to
micro-optimize.

"Premature optimization is the root of all evil."
-- D.E. Knuth

don't know
"We follow two rules in the matter of optimization:
Rule 1: Don't do it.
Rule 2 (for experts only): Don't do it yet."
-- M.A. Jackson

these rules are wrong

What makes you think that you know better than well respected experts?
this can be for the c language too (that would find the speed with 0
terminated string but find only difficulties and buffer overflows)
it seems i can think of a string class that do everything more simple
and with no possible errors in each its 'atomic' operations

Classes are off topic here. In any case, what you are saying has nothing
significant that I can see that is relevant to optimisation. Buffer
overflows are a rather different problem, and micro-optimisation
increases the likelihood of them and other errors.
i not agree if assembly can reduce the executable dimension for a
factor 10 and the code is in the cache of cpu

Most people will *not* be able to get anything like that level of
improvement. They are even less likely to get it whilst staying with C,
as evidence a recent thread in which someone posted what they thought
was a more efficient solution, but it was demonstrated to be generally
*less* efficient.

One is in general far more likely to achieve significant improvements by
selecting the correct algorithm and design. Only when those are optimal,
and if there is still a demonstrable problem, should one use
measurements to find the problem areas and optimise them.
 
Chris Dollin

av said:
these rules are wrong

Why do you think that?

Address what needs to be addressed when it needs to be addressed.

("Not optimising" doesn't mean "write stupidly slow code".)
 
av

av said:
Robbie Hatley wrote:
[...]
"Premature optimization is the root of all evil."
-- D.E. Knuth

don't know
"We follow two rules in the matter of optimization:
Rule 1: Don't do it.
Rule 2 (for experts only): Don't do it yet."
-- M.A. Jackson

these rules are wrong

What makes you think that you know better than well respected experts?

i find a lot of fun searching for optimisation code (and algorithm)
and until now not have wrong experiences
i would translate my c files in assembly and i'm sure in the
translated files, instructions will be between 1/10 and 1/100 than C
files, but the cpu has enough cache for the code and i'm lazy
Classes are off topic here. In any case, what you are saying has nothing
significant that I can see that is relevant to optimisation. Buffer
overflows are a rather different problem, and micro-optimisation
increases the likelihood of them and other errors.

don't all you want to optimise when use 0 ended strings?
is it that a "More computing sins ..."?
Most people will *not* be able to get anything like that level of
improvement.
They are even less likely to get it whilst staying with C,
as evidence a recent thread in which someone posted what they thought
was a more efficient solution, but it was demonstrated to be generally
*less* efficient.

think for find efficiency is never wrong because one can test the
solution. the only wrong i see is the time someone spend to think
about it. But is "to think" wrong?
One is in general far more likely to achieve significant improvements be
selecting the correct algorithm and design. Only when those are optimal,
and if there is still a demonstrable problem, should one use
measurements to find the problem areas and optimise them.

one good idea is better that all the optimal-ways walk
 
av

Why do you think that?

Address what needs to be addressed when it needs to be addressed.

("Not optimising" doesn't mean "write stupidly slow code".)

i have a routine R1 that with input A has output B that follow the
algo C.
optimisation is
search a routine R2 that with input A has output B that follow an algo
H that has minimum instructions for doing calculation

are we agreeing on that definition?

why i would not try to write the "H" routine?
in my little experience errors could be in R1 too ...
and think on that routine can make to see errors in R1 too
 
Andrew Poelstra

in c++ all can be more simple and secure than c

Hardly. In C you can see exactly what's happening. No black boxes, hidden
*this pointers, overridden functions, *shudder* passing by reference with
no discernible change in how you call it, and no *also shudder* overloaded
operators.

That, and a monkey could read some manpages and figure out what any line
of C does (although he obviously would be unable to recreate that or
figure out /why/ each line does what it does. Monkeys aren't the brilliant
typists that people would like us to believe.) C++ has a plethora of new
concepts, keywords, styles, etc. Not everything is simpler in C++.

Finally, C itself is not insecure. Just because you're a bad programmer
(and a lazy typist) does not mean that you should switch to a language
where your inadequacies are hidden. It means that you should learn your
language well. Buffer overflows are easy to avoid ("Don't use gets()!"
will prevent 90% of them), as are pretty well all of the common pitfalls.
 
Andrew Poelstra

av said:
[...]
"Premature optimization is the root of all evil."
-- D.E. Knuth

don't know

"We follow two rules in the matter of optimization:
Rule 1: Don't do it.
Rule 2 (for experts only): Don't do it yet."
-- M.A. Jackson

these rules are wrong

What makes you think that you know better than well respected experts?

i find a lot of fun searching for optimisation code (and algorithm)
and until now not have wrong experiences
i would translate my c files in assembly and i'm sure in the
translated files, instructions will be between 1/10 and 1/100 than C
files, but the cpu has enough cache for the code and i'm lazy

Set the compiler to its maximum optimization level, and see whether or
not you're smarter than it.
one good idea is better that all the optimal-ways walk

This is a typical sentence of yours: please use proper grammar and
punctuation, and try to make your sentences more coherent (if you
don't speak English natively, this will be harder than the others).
As it stands, I have no idea what you just said.
 
Robbie Hatley

Eric Sosman wrote of my string copy experiments:
... it seems more than a little likely that you are
committing the sin of premature optimization.

If this was a large app for long-term use by many, being
built and maintained by a team, then perhaps that might be
true.

But it's actually a tiny hobby app which I wrote expressly
for the purpose of optimization experimentation.
Until and unless you have MEASURED a performance problem
not hypothecated, not supposed, not "it stands to reason-ed"
until you have made MEASUREMENTS it is irresponsible folly to
micro-optimize.

It's already in a batch file like so:

clock
MyProgram
clock

So I can instantly see whether making a particular change
causes execution time to go up, go down, or stay about the
same.

Hypotheses must come first. Then experimentation to determine
whether your hypotheses are brilliance or bunk. That's the
scientific method.
"Premature optimization is the root of all evil."
-- D.E. Knuth

No, I think religiosity (addiction to untested ideas) is the
cause of most evil (including many bad computer programs, and
also including most wars).
"We follow two rules in the matter of optimization:
Rule 1: Don't do it.
Rule 2 (for experts only): Don't do it yet."
-- M.A. Jackson

That's stupid. Fear gains nothing. Say instead:
1. If you see an opportunity to optimize, do it, as long as it
doesn't significantly damage readability or modularity.
2. Test it.
3. If it didn't significantly improve performance, revert.
"More computing sins are committed in the name of efficiency
(without necessarily achieving it) than for any other single
reason, including blind stupidity."
-- W.A. Wulf

I think far more computing sins have been committed in the name
of impatience and the desire to "get this shit done and get out
of here so I can go home and have a beer and watch the game",
as my (fired) former boss used to say.
In other words, I'm not the only person crying that ab initio
micro-optimization is folly; smart people do so, too. Be smart.

False reasoning. Imitating superficial aspects of the behavior
of smart people will not make one smart.

And while it is perhaps true that most programs should not be
"optimized ab initio", some should be.

--
Cheers,
Robbie Hatley
East Tustin, CA, USA
lone wolf intj at pac bell dot net
(put "[usenet]" in subject to bypass spam filter)
home dot pac bell dot net slant earnur slant
 
Flash Gordon

Andrew said:
av wrote:
[...]
"Premature optimization is the root of all evil."
-- D.E. Knuth
don't know

"We follow two rules in the matter of optimization:
Rule 1: Don't do it.
Rule 2 (for experts only): Don't do it yet."
-- M.A. Jackson
these rules are wrong
What makes you think that you know better than well respected experts?
i find a lot of fun searching for optimisation code (and algorithm)
and until now not have wrong experiences
i would translate my c files in assembly and i'm sure in the
translated files, instructions will be between 1/10 and 1/100 than C
files, but the cpu has enough cache for the code and i'm lazy

Set the compiler to its maximum optimization level, and see whether or
not you're smarter than it.

And use a modern compiler. Compiler optimisation has moved on a lot in
the last 20 years.
This is a typical sentence of yours: please use proper grammar and
punctuation, and try to make your sentences more coherent (if you
don't speak English natively, this will be harder than the others).
As it stands, I have no idea what you just said.

Agreed. We can cope with bad English to a degree, but av is extremely
hard to understand.
 
Eric Sosman

av wrote On 07/26/06 12:33,:
i have a routine R1 that with input A has output B that follow the
algo C.
optimisation is
search a routine R2 that with input A has output B that follow an algo
H that has minimum instructions for doing calculation

are we agreeing on that definition?

why i would not try to write the "H" routine?

To begin with, things are seldom that simple. For
example, suppose you can implement R1 in portable C but
must use assembly language for R2. This means you need
to re-implement R2 every time you move to a new machine;
if you really think optimization is The Most Important
Thing In The World, you probably need to re-implement R2
even if the new machine has the same instruction set as
the old. So it's not a question of implementing R1 vs.
R2, but a question of implementing R1 vs. all of R2a (for
Pentium IV), R2b (Xeon), R2c (Opteron), R2d (Power4),
R2e (Power5), R2f (UltraSPARC-IIIi), R2g (UltraSPARC-IV),
R2h (UltraSPARC-IV+), R2i (UltraSPARC-T1 "Niagara"), ...

Second, you must assess the likely benefits. "R2 is
faster" is not a strong enough statement; you need to know
how much faster it will be and balance that against the
extra effort required to develop and maintain it. The
question of how many times Rx will be used comes into this
assessment: I put it to you that it is of *no* use at all
to save ten nanoseconds per abort() call! (If you spend
just one minute inventing, implementing, documenting, and
testing a ten-nanosecond improvement to abort(), you will
not recoup your effort until *six billion* programs have
died horrible deaths.)

Third, there's the phenomenon that "clever" code is
more attractive to bugs than is simple code. It is easier
to see and remove a problem from the neatly-trimmed lawn
of a stretch of simple code than when it's camouflaged amid
the high grass, thorn thickets, and meandering waterways
of the wild Everglades. [Insert comment about "up to one's
rear in reptiles" here.] Even if tricksy code starts life
bug-free, it is more susceptible to acquiring bugs later on
as programmers who understand it imperfectly (and don't
realize the lack) "improve" the code to meet new requirements.
Those programmers are not some inferior subspecies of half-
wits, either: They are likely to be YOU. I believe it was
Kernighan who observed that debugging code is harder than
writing it; it follows that if you write it at the limits
of your own cleverness, it will be beyond your power to fix.

Fourth and finally, let's be clear about what kinds of
"optimization" we're considering. Replacing bubblesort with
Shellsort is not an "optimization," but a redesign using a
superior algorithm. Finding a way to re-use some of the work
of iteration N so that iteration N+1 needn't recalculate it
is a similar transformation. Changes in algorithm, changes
in data structure -- these are "finding better solution
methods," not "optimizations." The specific substitution
that sparked this sub-thread had to do with copying data
via a loop in open code instead of using memcpy(), based on
an untested assumption that the open code loop would be
faster -- that sort of thing is an "optimization" or (for
clarity) a "micro-optimization," and that sort of thing is
folly if done without measurement and a clear understanding
of the (supposed) benefits.

The days of Mel are behind us. The economics that made
his heroics worthwhile are gone, in fact, inverted: CPU
time became cheaper than programmer time long ago. Practices
that were once essential to success have become irrelevant,
replaced by newer practices that would not have made sense in
the world as it once was. If you act as if you were still in
that world you are dreaming; wake up and smell the gigahertz!
 
Keith Thompson

av said:
i have a routine R1 that with input A has output B that follow the
algo C.
optimisation is
search a routine R2 that with input A has output B that follow an algo
H that has minimum instructions for doing calculation

are we agreeing on that definition?

why i would not try to write the "H" routine?
in my little experience errors could be in R1 too ...
and think on that routine can make to see errors in R1 too

The rules of optimization, quoted above, refer mostly to source-level
micro-optimization, things like adding "register" keywords and
manually unrolling loops. These are things that should be avoided
unless (a) you really know what you're doing, and (b) it significantly
and *measurably* helps performance. Without those conditions, you're
just as likely to make the code more difficult to read and maintain
for the sake of an optimization that the compiler could have done for
you.

On the other hand, choosing the best high-level algorithm for a task
is perfectly reasonable. For example, using a quicksort or mergesort
rather than a bubblesort is perfectly reasonable. Tweaking the inner
loop of a bubblesort so it runs 20% faster almost certainly is not.

Write clear code, with an emphasis on making it obvious what you're
doing rather than how you're doing it. If it works correctly and is
fast enough, you're done. If not, *then* you might consider
source-level micro-optimization if all other approaches have failed.
 
av

Hardly. In C you can see exactly what's happening. No black boxes, hidden
*this pointers, overridden functions, *shudder* passing by reference with
no discernable change in how you call it, and no *also shudder* overloaded
operators.

in borland c++ i can see what happen and in 100-150 routines until now
i have no problem in this direction
you sound like someone that has never tried ...
That, and a monkey could read some manpages and figure out what any line
of C does (although he obviously would be unable to recreate that or
figure out /why/ each line does what it does. Monkeys aren't the brilliant
typists that people would like us to believe.) C++ has a plethora of new
concepts, keywords, styles, etc. Not everything is simpler in C++.

i don't use all; i choose what "work for me" and classes
destructors/constructors are good
Finally, C itself is not insecure. Just because you're a bad programmer
(and a lazy typist) does not mean that you should switch to a language
where your inadequacies are hidden. It means that you should learn your
language well. Buffer overflows are easy to avoid ("Don't use gets()!"
will prevent 90% of them), as are pretty well all of the common pitfalls.

i not agree
 
av

Agreed. We can cope with bad English to a degree, but av is extremely
hard to understand.

if someone can correct where i wrong, i can say to him-her-it(the
automatic corrector) a big "thank you"
 
av

av wrote On 07/26/06 12:33,:

To begin with, things are seldom that simple. For
example, suppose you can implement R1 in portable C but
must use assembly language for R2. This means you need
to re-implement R2 every time you move to a new machine;
if you really think optimization is The Most Important
Thing In The World, you probably need to re-implement R2
even if the new machine has the same instruction set as
the old. So it's not a question of implementing R1 vs.
R2, but a question of implementing R1 vs. all of R2a (for
Pentium IV), R2b (Xeon), R2c (Opteron), R2d (Power4),
R2e (Power5), R2f (UltraSPARC-IIIi), R2g (UltraSPARC-IV),
R2h (UltraSPARC-IV+), R2i (UltraSPARC-T1 "Niagara"), ...

ok, if you have only one cpu

all software is not the same
the are the difficult part (routine for calculus and difficult algo)
that will be difficult in any language of programming
and have the need of much effort
for that software i think is better that who maintain it == who wrote
it
Second, you must assess the likely benefits. "R2 is
faster" is not a strong enough statement; you need to know
how much faster it will be and balance that against the
extra effort required to develop and maintain it.

yes (for how you see the thing)
The
question of how many times Rx will be used comes into this
assessment: I put it to you that it is of *no* use at all
to save ten nanoseconds per abort() call! (If you spend
just one minute inventing, implementing, documenting, and
testing a ten-nanosecond improvement to abort(), you will
not recoup your effort until *six billion* programs have
died horrible deaths.)
yes
....

I believe it was
Kernighan who observed that debugging code is harder than
writing it; it follows that if you write it at the limits
of your own cleverness, it will be beyond your power to fix.

i have not meet such function until now
the problems i fear are not errors that show themselves but hidden
errors that show themselves one time over 1000000
there is the need of some way to trace them
Fourth and finally, let's be clear about what kinds of
"optimization" we're considering. Replacing bubblesort with
Shellsort is not an "optimization," but a redesign using a
superior algorithm. Finding a way to re-use some of the work
of iteration N so that iteration N+1 needn't recalculate it
is a similar transformation. Changes in algorithm, changes
in data structure -- these are "finding better solution
methods," not "optimizations."

so there is "optimizations" in rewrite a c routine in assembly because
1) the number of assembly instruction decrease in the assembly version
2) in the translation, 90% of times, i can see the problem in a
different way and so i change the data structure or the routines way
of doing something so i change the algorithm

all has its pro and contra

contra - i know sometime all this can improve on "nanoseconds" and the
code will be unmaintainable because few know assembly

contra - for write all particulars of that routines there is the need
of much time

pro - there will be more time to think on it, there will be more ideas
in the mind, and in what i know bugs will be less in the assembly
version than in the C version

pro - if someone write in assembly he/she grow the ability in debug
area and all this allow to understand how all pc works and why
something [can] goes wrong
 
Ian Collins

av said:
if someone can correct where i wrong, i can say to him-her-it(the
automatic corrector) a big "thank you"

Correct capitalisation would be a good start...
 
Flash Gordon

Ian said:
Correct capitalisation would be a good start...

Fullstops between sentences are also essential. I just gave up on one of
av's messages because working out where one sentence ends and another
begins was too much like hard work.

I believe av's native language is one which uses fullstops and
capitalisation, so English not being his/her first language is no excuse.
 
Keith Thompson

Flash Gordon said:
Fullstops between sentences are also essential. I just gave up on one
of av's messages because working out where one sentence ends and
another begins was too much like hard work.

I believe av's native language is one which uses fullstops and
capitalisation, so English not being his/her first language is no
excuse.

Some non-British readers, particularly non-native English speakers,
might not be aware that "fullstop" refers to the '.' character (more
commonly called "period" in American English).
 
Richard Heathfield

Keith Thompson said:
Some non-British readers, particularly non-native English speakers,
might not be aware that "fullstop" refers to the '.' character (more
commonly called "period" in American English).

To be a little more precise, the English name for '.' is "full stop", not
"fullstop".
 
Philip Potter

Robbie Hatley said:
Eric Sosman wrote of my string copy experiments:


If this was a large app for long-term use by many, being
built and maintained by a team, then perhaps that might be
true.

But it's actually a tiny hobby app which I wrote expressly
for the purpose of optimization experimentation.

By all means then, experiment. It is an interesting exercise to find out how
fast you can implement something.

But I hope you understand how useless such practices are in real code,
outside of system libraries (such as those that implement memcpy() ). Also
see the below caveat about how to measure improvements.
It's already in a batch file like so:

clock
MyProgram
clock

So I can instantly see whether making a particular change
causes execution time to go up, go down, or stay about the
same.

Hypotheses must come first. Then experimentation to determine
whether your hypotheses are brilliance or bunk. That's the
scientific method.

That is indeed the scientific method. However, your experiment doesn't
measure what you think it measures, and so will lead you to false
conclusions.

Much of the time between the two calls to "clock" will be set-up before
running MyProgram (allocating memory, loading code into memory, initialising
data, changing memory protection setup) and clean-up afterwards
(deallocating memory &c). These can take variable amounts of time depending
on how busy the OS is, how much of your program is in cache already, and the
phase of the moon, and so the variations in setup and cleanup can be greater
than the time-change to the core of the program. This means that even if
your program is running faster, it can appear to run slower, or worse - if
it is running slower, it can appear to run faster.

It's much better to use a tool designed for the job, such as a profiler.
No, I think religiosity (addiction to untested ideas) is the
cause of most evil (including many bad computer programs, and
also including most wars).

It's not an untested idea. It's based on years of collective experience.
That's stupid. Fear gains nothing. Say instead:
1. If you see an opportunity to optimze, do it, as long as it
doesn't significantly damage readibility or modularity.

You've already broken your own rule. The "optimization" you wrote was
sufficiently confusing that you had to ask c.l.c about it. memcpy() would
have been much more readable.
2. Test it.
3. If it didn't significantly improve performance, revert.

Amdahl's law: optimize the common case.

If you optimize code which executes rarely, you spend lots of time on steps
1,2 and 3 above for an optimization which doesn't gain you much *overall*
speedup. Programmer time is expensive.

Remember that you don't know what "the common case" is until you have
measured it using a profiler. If you hold off optimization until the program
is complete or near-complete, you have a much better idea of a) whether any
optimization is necessary, and b) if it is necessary, where it should be
targeted. Hence "Don't do it yet."

In most cases, optimization won't be necessary at all - the code is good
enough. All that effort on your 3-step plan will have been wasted.
False reasoning. Imitating superficial aspects of the behavior
of smart people will not make one smart.

Ignoring smart people does not make one smart either.
And while it is perhaps true that most programs should not be
"optimized ab inito", some should be. [sic]

Yes. One example of code which should be "optimized ab initio" is that used
in system libraries, like implementations of memcpy(). If you are truly
interested in optimization, I suggest you learn about how memcpy() is
implemented on various machines. It's very interesting, and a good example
of why such optimizations should be restricted to library code.

Philip
 
