MIT - Dynamic Programming
MIT: 19. Dynamic Programming I: Fibonacci, Shortest Paths
https://www.youtube.com/watch?v=OQ5jsbhAv_M
- DP: Careful brute-force
- Badly named, the name was picked sort of arbitrarily
- You can apply memoization to any recursive algorithm
- To memoize means to “write on your memo pad” (really!?)
- Running time (for recursive + memoized) is typically: $\text{number of subproblems} \times \text{time per subproblem}$
- “You can pick the one you find most intuitive” (talking about bottom-up + tabulation vs. top-down + memoization, implying that they’re interchangeable)
- For Fibonacci, the difference is essentially: do you use a loop or a recursive function
- Bottom-up is effectively a topological sort of the subproblem dependency DAG
- Bottom-up can often help save space
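A minimal sketch of both styles for Fibonacci (function names are mine). The running-time rule above gives $n$ subproblems $\times$ $O(1)$ per subproblem $= O(n)$ either way; the bottom-up version also shows the space saving, keeping only the last two values:

```python
from functools import lru_cache

# Top-down: the plain recursive algorithm plus a memo pad.
@lru_cache(maxsize=None)
def fib_memo(n: int) -> int:
    if n <= 2:
        return 1
    return fib_memo(n - 1) + fib_memo(n - 2)

# Bottom-up: same recurrence, but computed as a loop in topological
# order of the subproblem DAG (1, 2, ..., n). Only the last two
# values are kept, so it uses O(1) space instead of O(n).
def fib_bottom_up(n: int) -> int:
    a, b = 1, 1  # fib(1), fib(2)
    for _ in range(n - 2):
        a, b = b, a + b
    return a if n < 2 else b
```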
Shortest Path
- Guessing! Try all the guesses and take the best one
- Naive (recursive) shortest path:
- Say you want to compute $\delta(s, v)$, the shortest path from vertex $s$ to vertex $v$.
- You start at $v$ and compute $\delta(s, u)$ for every node $u$ with an edge into $v$
- Then $\delta(s, v) = \min_{(u, v) \in E} \big( \delta(s, u) + w(u, v) \big)$
- If you memoize, this runs in $O(V + E)$ time on DAGs (no cycles)
- This is equivalent to a depth-first search + topological sort + one round of Bellman-Ford 🤔
- Recursion + memoization requires that subproblems are acyclic
- Generalizing to cyclic graphs: add a parameter for the number of edges used, i.e. define $\delta_k(s, v)$ = shortest path from $s$ to $v$ using at most $k$ edges. The subproblem dependencies become acyclic again, and the result is essentially Bellman-Ford
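The memoized recursion above, sketched on a tiny made-up weighted DAG (the graph, node names, and weights are illustrative only). Each node stores its incoming edges, since the recurrence looks backwards from $v$:

```python
from functools import lru_cache
from math import inf

# Hypothetical weighted DAG, stored as incoming edges:
# pred[v] = list of (u, weight of edge u -> v).
pred = {
    's': [],
    'a': [('s', 1)],
    'b': [('s', 4), ('a', 2)],
    'v': [('a', 6), ('b', 1)],
}

@lru_cache(maxsize=None)
def delta(v: str) -> float:
    """Shortest-path distance from 's' to v:
    delta(v) = min over incoming edges (u, w) of delta(u) + w."""
    if v == 's':
        return 0
    return min((delta(u) + w for u, w in pred[v]), default=inf)
```

With memoization, each node is solved once and each edge is examined once, giving the $O(V + E)$ bound. Without the memo pad, the same recursion can take exponential time.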
MIT: 20. Dynamic Programming II: Text Justification, Blackjack
https://www.youtube.com/watch?v=ENyox7kNKeY
- You can generalize DP as always being a shortest path problem for some DAG. You have to apply clever tricks to set up the DAG, but after that the problem becomes simple.
- The algorithmic analysis is so tedious; definitely not my thing at all!
- Word used to use a greedy algorithm for text justification, but now uses a dynamic-programming algorithm.
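The text-justification DP can be sketched as follows. This assumes the lecture's setup: a "badness" cost per line (here the cube of the leftover space, infinite if the words don't fit), and the DP guesses where each line ends. Function names and the exact cost are my own notation:

```python
from functools import lru_cache
from math import inf

def justify(words: list[str], width: int) -> float:
    """Minimum total badness for laying out `words` at the given line width."""

    def badness(i: int, j: int) -> float:
        # Cost of putting words[i:j] on one line: word lengths plus
        # one space between each pair; cube the slack, or inf if too long.
        total = sum(len(w) for w in words[i:j]) + (j - i - 1)
        return inf if total > width else (width - total) ** 3

    @lru_cache(maxsize=None)
    def dp(i: int) -> float:
        # Min badness of laying out words[i:]. Guess j, the index
        # right after the last word on the line starting at word i.
        if i == len(words):
            return 0
        return min(badness(i, j) + dp(j) for j in range(i + 1, len(words) + 1))

    return dp(0)
```

There are $O(n)$ subproblems (one per suffix) and $O(n)$ guesses each, so the DP runs in $O(n^2)$ if `badness` is made $O(1)$ with prefix sums, matching the "number of subproblems × time per subproblem" rule.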