Putting aside the aesthetics of code for a second, I was wondering if there are cases where having a recursive approach has more benefits than an iterative one.

Am I correct to assume that the recursive approach is always slower and more memory consuming?
On the contrary! The recursive approach is better for memory usage :) That's at least what I have been taught :P
You can't discuss this out of the scope of a compiler.
Functions that you can transform into tail recursion will incur neither extra memory nor a performance hit on a sufficiently recent compiler. So that alone makes your statement wrong: it's not always "slower or more memory consuming".
Perhaps (I am not sure) the only instances where recursion is more optimized than iteration are when code size is an important factor (since you get recursion for free with the call stack).

Still, I think all of this is not nearly as important as the asymptotic analysis of an iterative program versus a recursive one. Fibonacci's natural recursive definition is far less efficient than the iterative definition, and the performance difference is orders of magnitude larger than the intricacies of the small operations the CPU does for either iteration or recursion.
Also, if you're dealing with recursive data structures, recursive code is better (and more natural) than an iterative approach, not just aesthetically. It's just simpler to think about them recursively, and we all know programmer time is more valuable than machine time :)
I agree with arithma on this one
saeidw wrote: programmer time is more valuable than machine time
nope, it's the other way round when you want to evaluate code performance
mesa177 wrote: nope, it's the other way round when you want to evaluate code performance
Performance is important to a certain degree; it's not as important when the gain is minimal.
It depends really. For one thing, recursion is harder to debug.
I would say keep recursion for really simple cases, otherwise use iteration. But again, it really depends on the problem at hand; I can't really discuss it here without knowing what environment we're talking about and hundreds of other variables.
@arithma: Talking about optimization or performance in general usually implies talking about asymptotic behavior, especially with such powerful computers widely available on the market.

This article shows an interesting benchmark of iterative vs recursive approaches in the cases of factorial and Fibonacci.

From my point of view, using recursion improves code readability, at the (usually minimal) expense of performance.

@mesa177: If you spend 3 months working to save 20ms of execution time, 9 times out of 10 you're wasting your time :)
@arithma: Talking about optimization or performance in general usually implies talking about asymptotic behavior, especially with such powerful computers widely available on the market.
This has been a trend here lately. Developers shouldn't discuss development from their own narrow perspective unless they declare it as such. For some people an hour is a blink and 10 ms is a lifetime. In both cases, there are times when asymptotic analysis doesn't give you the insight into performance that you need.
When you (or anyone else) open a topic about performance, it usually means performance matters, as in performance-critical applications (like a signal processing system that has to run in real time with strict, deterministic timing behavior).

Another example: you're writing the floating point division algorithm in VHDL for your microprocessor.

20ms may be a lifetime for some applications (like GPUs). In real-time applications (medical imaging, gaming, simulations, ...) the total time to execute all your code is at best 33ms (to get 30 FPS, which is low) or 16ms for 60 FPS, which is now standard.

What I was trying to say earlier is that recursion does not necessarily imply a performance loss, and usually your iterative approach will be emulating recursion: you either get it wrong, generate more machine code, or create something extra to maintain. In those not-too-unusual cases, there's no trade-off; recursion is the better option on all accounts.

However, shoehorning a for loop into a recursive algorithm may mean that obvious transformations for parallelizing the code become harder to spot (either by eye or with a profiler). This becomes an issue when you're doing iterative programming first for correctness, then profiling and solving bottlenecks.

All of this means nothing when it comes to web development, for example. No matter how fast your code is, you'll have to wait for the database to reply and populate the data. PHP takes advantage of that, for example, and can get away with being many orders of magnitude slower than it could be if optimized.

[edit] Just checked the link, and I have to say it's an ill comparison. I'll have more to say after I get back home.
If you are talking about recursion for simple cases such as factorials, it is more efficient to calculate the factorial at compile time instead of using a recursive factorial function (using template metaprogramming in C++). You would have something like:
#include <iostream>

template <int N>
struct Factorial
{
    enum { value = N * Factorial<N - 1>::value };
};

// base case, stops the recursive template instantiation
template <>
struct Factorial<0>
{
    enum { value = 1 };
};

// example use
int main()
{
    // 12! is the largest factorial that fits in a 32-bit int
    const int fact12 = Factorial<12>::value;
    std::cout << fact12 << std::endl;
    return 0;
}
Notice the value 12 (anything larger overflows an int): its factorial is computed at compile time, instead of with a normal recursive function:
#include <iostream>

int factorial(int n)
{
    return (n == 0) ? 1 : n * factorial(n - 1);
}

int main()
{
    std::cout << factorial(12) << std::endl;
    return 0;
}
What do you gain? Well, 0 CPU ticks at run time :)

Now, as arithma said, when working on time-constrained real-time projects you need to optimize as much as possible and minimize frame drops.
In many algorithms, you can avoid the next few iterations entirely by doing simple checks, whereas if implemented recursively (and blindly) you will end up making useless calls.
I remember one time I had a job to redo a huge recursive function that was building a BSP tree, and rewrite it iteratively with some optimizations between the loops. It gained around 50% in speed, but I have to admit that the recursive version was very dumb, and the resulting iterative one was around 1000 lines.
rahmu wrote: If you spend 3 months working to save 20ms of execution time, 9 times out of 10 you're wasting your time :)
It really depends on the application at hand. In my field, medical imaging, that's actually quite an accomplishment. In reality, some medical companies spend billions of dollars and years of research to reduce execution time by even 0.5 ms: less execution time => less radiation exposure => less chance of exposing patient and doctor to cancer (skin cancer mostly).

What I meant is that sometimes you work on your program for several hours in order to cut down on execution time.

Example: I needed to transmit and receive medical images over optical fibers with an encoding/decoding algorithm that I developed. In order to compare its performance with other coding techniques, I needed to conduct a test where I send an image 50 times and find the BER, PER, RMSE, etc. I actually worked 3 days non-stop to reduce execution time by 8 hours. The results were more than satisfying, and I showed that my coding technique was much more efficient than the others, not just more accurate.
In microcontrollers, memory is limited; you need to optimize your code in a memory-efficient way.
J4D wrote: On the contrary! The recursive approach is better for memory usage :) That's at least what I have been taught :P
I've been taught exactly the opposite.

Whom to believe?



I always go for iterative coding, unless I can't find a solution. Then I try out the recursive method.


The Josephus problem is a good example for using recursion, because honestly I can't seem to find an iterative solution. But then again, I suck.