Runtime Analysis. Understanding the runtime of code involves deep thought. It amounts to asking: “How long does it take to do stuff?”, where stuff can be any conceivable computational process whatsoever. It simply cannot be done mechanically, at least for non-trivial problems. As an example, a pair of nested for loops does NOT mean $\Theta(N^2)$ runtime as we saw in lecture.
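As one illustration of why nested loops don't automatically mean $\Theta(N^2)$, here is a small sketch (names and constants are my own, not from lecture): the inner loop is capped at a constant number of iterations, so the total work is linear.

```java
public class LoopExample {
    /** Nested loops, but the inner loop does at most 11 iterations per pass
     *  of the outer loop, so total work is Theta(N), not Theta(N^2). */
    public static int count(int N) {
        int ops = 0;
        for (int i = 0; i < N; i++) {
            for (int j = 0; j < N; j++) {
                ops += 1;
                if (j >= 10) {
                    break;  // inner loop capped at 11 iterations
                }
            }
        }
        return ops;  // 11 * N for N > 10: linear in N
    }

    public static void main(String[] args) {
        System.out.println(count(1000));  // prints 11000
    }
}
```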
Cost Model. As an anchor for your thinking, recall the idea of a “cost model” from last lecture. Pick an operation and count how many times it executes. You want the operation whose count has the highest order of growth as a function of the input size.
Important Sums. This is not a math class so we’ll be a bit sloppy, but the two key sums you should know are:
- $1 + 2 + 3 + … + N \in \Theta(N^2)$
- $1 + 2 + 4 + 8 + … + N \in \Theta(N)$
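These two sums show up constantly as loop-counting arguments. As a sketch (my own illustrative code, not from lecture), the first function below does $1 + 2 + \cdots + N$ units of work, hence $\Theta(N^2)$; the second doubles the work each outer pass, doing $1 + 2 + 4 + \cdots + N$ units total, hence $\Theta(N)$.

```java
public class SumExamples {
    /** Inner loop body runs 1 + 2 + ... + N times in total: Theta(N^2). */
    public static long triangular(int N) {
        long ops = 0;
        for (int i = 1; i <= N; i++) {
            for (int j = 0; j < i; j++) {
                ops += 1;
            }
        }
        return ops;  // exactly N * (N + 1) / 2
    }

    /** Work doubles each outer pass: 1 + 2 + 4 + ... + N total: Theta(N). */
    public static long doubling(int N) {
        long ops = 0;
        for (int i = 1; i <= N; i *= 2) {
            for (int j = 0; j < i; j++) {
                ops += 1;
            }
        }
        return ops;  // 2N - 1 when N is a power of 2
    }

    public static void main(String[] args) {
        System.out.println(triangular(4));  // prints 10
        System.out.println(doubling(8));    // prints 15
    }
}
```

Note that despite both functions containing a pair of nested loops, their runtimes differ by a full factor of $N$: the sum being accumulated, not the loop nesting depth, determines the order of growth.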
Practice. The only way to learn this is through plenty of practice. Naturally, project 2 is going on right now, so you probably don’t have the spare capacity to be thinking too deeply, but make sure to work through the problems in lecture and below once you have room to breathe again.
To try the Coursera problem, you’ll need to make an account for the linked course (that I worked on while I was at Princeton).
- Suppose the optimal (and possibly unknown) solution to problem P has order of growth $F(N)$. Suppose that the best known solution has runtime that is $\Theta(B(N))$. Finally, suppose that there is a clever proof that no solution can possibly have order of growth that is less than $L(N)$. Which of the following can you surmise?
  - $F(N) = O(B(N))$
  - $B(N) = O(F(N))$
  - The limit of $F(N)/B(N)$ as $N$ goes to infinity cannot be infinity.
  - $F(N) = \Omega(L(N))$
  - $L(N) = \Omega(F(N))$
  - $B(N) > F(N)$ for sufficiently large $N$.