Recursion

Lesson

Dynamic Programming is when you save the results of your recursion in memory. This lets you avoid making the same function calls multiple times. We haven't had to use Dynamic Programming so far, but you need it in certain problems.

We'll start by doing an easy example: the Fibonacci numbers.

**Problem:** Given an integer `n`, return the $n^{\text{th}}$ Fibonacci number.
The first few Fibonacci numbers are written below.

You could probably solve this problem with intuition and a for-loop, but we'll use recursion instead so we can show you how Dynamic Programming works. This is because Dynamic Programming only applies when you have a recursion. We'll end up with the same exact optimal code either way.

First Few Test Cases:

- `n = 0` → `1`
- `n = 1` → `1`
- `n = 2` → `2`
- `n = 3` → `3`
- `n = 4` → `5`

We want to compute the $n^{\text{th}}$ Fibonacci number, $f(n)$.

1. Recursion

We get each number by adding the 2 previous numbers together.

$\small f(n) = f(n-1) + f(n-2)$

2. Base case

We start with $\small f(0)=1$ and $\small f(1)=1$.

3. Code

Normally we'd just code up the Fibonacci numbers like this:
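A direct translation of the recursion and base case into Python might look like this (the function name `fib` is our choice for illustration):

```python
def fib(n):
    # Base cases: f(0) = f(1) = 1
    if n <= 1:
        return 1
    # Recursion: each number is the sum of the previous two
    return fib(n - 1) + fib(n - 2)
```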

But it turns out this is very inefficient. If we call $f(100)$, it calls $f(99)$ and $f(98)$, and those calls spawn more $f(...)$s in turn.

The problem is that many of these calls, like $f(98)$, are recomputed many times. The computer recurses down the same exact call tree for each one, which is extremely expensive and unnecessary. This is why we get a "Time Limit Exceeded" error.

Time Complexity $O(2^n)$

Space Complexity $O(n)$ for the Call Stack.

Avoiding these redundant function calls is called Dynamic Programming. It's a fancy name for a very simple idea. Here are the 3 methods you can use to do Dynamic Programming.

To avoid doing redundant computation, you can compute the function calls *yourself* in the order they're needed. In this problem, $f(n)$ depends on $f(\text{smaller things})$, so you can just compute $f(0)$ first, $f(1)$ next, ..., all the way up to $f(n)$. This is called Tabulation, and it totally fixes the problem.
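A minimal Python sketch of Tabulation, storing the table in a list named `fibs` to match the space analysis below:

```python
def fib(n):
    # fibs[i] holds f(i); fill the table from the bottom up,
    # so every value is computed exactly once.
    fibs = [1, 1]  # f(0) and f(1)
    for i in range(2, n + 1):
        fibs.append(fibs[i - 1] + fibs[i - 2])
    return fibs[n]
```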

Time Complexity $O(n)$

Space Complexity $O(n)$ for $\tt fibs$.

Sometimes with Tabulation, you can throw out old values to optimize. Here, each $f(n)$ only uses the previous two function calls $f(n-1)$ and $f(n-2)$, and none of the smaller ones. So we can save memory by just storing the most recent two $f(n)$s we've seen. This gives you the intuitive for-loop solution.
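A sketch of this space-optimized version (the variable names `prev` and `curr` are ours):

```python
def fib(n):
    # Keep only the two most recent values instead of the whole table.
    prev, curr = 1, 1  # f(0) and f(1)
    for _ in range(n - 1):
        prev, curr = curr, prev + curr  # slide the window forward
    return prev if n == 0 else curr
```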

Time Complexity $O(n)$

Space Complexity $O(1)$

We don't recommend Memoization because it doesn't always give you the optimal solution, while Tabulation does. Only use it if you're in a rush or don't care about optimizing. To use Memoization, you just write your code like you normally would, as a recursive function. Before you make a function call, you check whether you've already computed it, and if so, you reuse the saved answer instead of recursing again.
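A sketch of Memoization in Python, using a dictionary (named `memo` here, matching the space analysis below) to cache results:

```python
def fib(n, memo=None):
    # memo maps k -> f(k) for every value computed so far.
    if memo is None:
        memo = {}
    if n <= 1:
        return 1  # base cases: f(0) = f(1) = 1
    if n not in memo:
        # Only recurse if we haven't computed f(n) before.
        memo[n] = fib(n - 1, memo) + fib(n - 2, memo)
    return memo[n]
```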

Memoization avoids doing redundant computations, but it still uses recursion. This means we have to keep track of the function calls in a Call Stack, which takes up memory.

Time Complexity $O(n)$

Space Complexity $O(n)$ for the Call Stack and for $\tt memo$.
