Recursion

Dynamic Programming

Lesson

Dynamic Programming is when you save the results of your recursive calls in memory. This lets you avoid making the same function calls multiple times. We haven't needed Dynamic Programming so far, but certain problems require it.

We'll start by doing an easy example: the Fibonacci numbers.

Problem: Given an integer n, return the nth Fibonacci number. The first few Fibonacci numbers are written below.

You could probably solve this problem with intuition and a for-loop, but we'll use recursion instead so we can show you how Dynamic Programming works. This is because Dynamic Programming only works if you have a recursion. We'll end up with the same exact optimal code anyway.

First Few Test Cases:

f(0) = 1, f(1) = 1, f(2) = 2, f(3) = 3, f(4) = 5, f(5) = 8

We want to compute the nth Fibonacci number, f(n).

1. Recursion

We get each number by adding the 2 previous numbers together.

f(n) = f(n-1) + f(n-2)

2. Base case

We start with f(0) = 1 and f(1) = 1.

Normal Recursion

3. Code

Normally we'd just code up the Fibonacci numbers like this:
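The original code block isn't shown here, so this is a minimal Python sketch of the straightforward recursive version, using the base cases and recurrence above:

```python
def f(n):
    # Base cases: f(0) = 1 and f(1) = 1
    if n <= 1:
        return 1
    # Recursion: each number is the sum of the previous two
    return f(n - 1) + f(n - 2)
```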

But it turns out this is very inefficient. If we call f(100), it will call f(99) and f(98), and those will call more f(...)s.

The problem is that many of these calls, like f(98), are recomputed many times. The computer recurses down the same exact call tree for each repeated call, which is extremely expensive and unnecessary. This is why we get a "Time Limit Exceeded" error.

Time Complexity: O(2^n)

Space Complexity: O(n) for the Call Stack.

Avoiding these redundant function calls is called Dynamic Programming. It's a fancy name for a very simple idea. Here are the 3 methods you can use to do Dynamic Programming.

Option 1: Tabulation

To avoid doing redundant computation, you can compute the function calls yourself in the order they're needed. In this problem f(n) depends on f(smaller things), so you can just compute f(0) first, f(1) next, ..., all the way up to f(n). This is called Tabulation, and it totally fixes the problem.
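A minimal sketch of the Tabulation approach in Python, filling a `fibs` table from the bottom up (the variable name `fibs` follows the complexity note below):

```python
def f(n):
    # Compute every value from f(0) up to f(n), in order
    fibs = [1, 1]  # fibs[0] = f(0), fibs[1] = f(1)
    for i in range(2, n + 1):
        fibs.append(fibs[i - 1] + fibs[i - 2])
    return fibs[n]
```

Each entry is computed exactly once, so there are no redundant calls and no recursion at all.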

Time Complexity: O(n)

Space Complexity: O(n) for fibs.

Option 2: Tabulation+

Sometimes with Tabulation, you can throw out old values to optimize. Here, each f(n) only uses the previous two function calls, f(n-1) and f(n-2), and none of the smaller ones. So we can save memory by just storing the most recent two values we've seen. This gives you the intuitive for-loop solution.

Time Complexity: O(n)

Space Complexity: O(1)

Option 3: Memoization

We don't recommend Memoization because it doesn't always give you the optimal solution, while Tabulation does. Only use it if you're in a rush or don't care about optimizing. To use Memoization, you just write your code like you normally would, as a recursive function. Before you make a function call, you check to see if you've already computed it.
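A sketch of Memoization in Python: the recursive function is unchanged except that results are cached in a `memo` dictionary (the name `memo` follows the complexity note below) and looked up before recomputing.

```python
memo = {}

def f(n):
    # Base cases: f(0) = 1 and f(1) = 1
    if n <= 1:
        return 1
    # Before recomputing, check if we've already computed f(n)
    if n not in memo:
        memo[n] = f(n - 1) + f(n - 2)
    return memo[n]
```

Now f(100) runs instantly, because each f(k) is computed only once and then reused from `memo`.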

Memoization avoids doing redundant computations, but it still uses recursion. This means we have to keep track of the function calls in a Call Stack, which takes up memory.

Time Complexity: O(n)

Space Complexity: O(n) for the Call Stack and for memo.
