I find this answer interesting, mainly because I think it suggests a
very different view of programming. I would not try to avoid
recursion "if at all possible". In general I consider that it is up to
the system to provide the resources needed by a program and this
includes stack for a reasonable amount of recursion. These resources
can run out of course, and the stack is special in that few languages
provide any way to handle its exhaustion elegantly, but that does not
seem enough reason to design out all recursion.
If you compare recursive algorithms to equivalent non-recursive ones, I
think you will find recursion is usually slower and demands (much)
more resources. So as a rule of thumb: don't recurse unless you have
to.
And yes, I do have a background in embedded systems, where these
considerations matter more than on modern PCs.
There are special cases: some environments are severely resource-limited.
But there's no indication that the OP is using such a system, and,
anyway, I don't think you can make general rules from specific situations.
well... It's bread and butter for the science guys and the practice is
called "inference". Nevertheless, it's easy to show that recursive
algorithms are outperformed by equivalent non-recursive ones.
The saving grace of recursion is that recursive implementations are
usually easier to understand. If it weren't for that, I'd ban the
practice outright.
Isn't this what you are doing? Isn't the ban on recursion an
artificial limitation?
Nope. It follows from the design of processors and a wish to avoid
unnecessary overhead (passing arguments, stack-frame maintenance,
pushing and popping return addresses), and that's without even
considering any effect a call may have on things like branch
prediction, instruction pipelines and such. A 'call' instruction is
(relative to a branch) quite expensive and prevents some optimisations
like loop unrolling.
So no, that's not an artificial limitation, but a design
consideration.