Review
- Big O analysis
- Motivating example: Search techniques
- Linear Search O(1) to O(n)
- Binary Search O(1) to O(log n) (search sketch after this list)
- Quick Sort O(n log n) to O(n^2)
- Merge Sort O(n log n)
- Selection Sort O(n^2)
- Insertion Sort O(n) to O(n^2)
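A quick refresher on the two search techniques above: a minimal Python sketch (function names and test data are ours, purely illustrative) contrasting the O(n) scan with the O(log n) halving strategy.

    def linear_search(items, target):
        # Worst case O(n): may touch every element.
        for i, x in enumerate(items):
            if x == target:
                return i
        return -1

    def binary_search(sorted_items, target):
        # Worst case O(log n): halves the search range each step.
        lo, hi = 0, len(sorted_items) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if sorted_items[mid] == target:
                return mid
            elif sorted_items[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1

    data = list(range(0, 1_000_000, 2))
    print(linear_search(data, 777_776))   # O(n) scan
    print(binary_search(data, 777_776))   # O(log n) probes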
New Topics:
- Linear Algebra in Programming (NumPy sketch after this list)
- Having C interact with Python (ctypes sketch after this list)
- Parallelizing an algorithm
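For the linear algebra topic, one common route in Python is NumPy. A minimal sketch, assuming NumPy is installed; the 2x2 system is made up for illustration.

    import numpy as np

    # A small made-up system A x = b.
    A = np.array([[3.0, 1.0],
                  [1.0, 2.0]])
    b = np.array([9.0, 8.0])

    # numpy.linalg.solve factors the matrix under the hood, which is
    # faster and more numerically stable than forming an explicit
    # inverse with np.linalg.inv(A) @ b.
    x = np.linalg.solve(A, b)
    print(x)                      # [2. 3.]
    assert np.allclose(A @ x, b)  # sanity check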
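One standard-library way to have C interact with Python is ctypes. The sketch below calls sqrt from the system C math library; it assumes a Unix-like system where find_library can locate libm, and the same pattern extends to your own compiled shared libraries.

    import ctypes
    import ctypes.util

    # Locate and load the C math library (libm on Linux/macOS).
    libm_path = ctypes.util.find_library("m")
    libm = ctypes.CDLL(libm_path)

    # Declare the C signature: double sqrt(double).
    libm.sqrt.argtypes = [ctypes.c_double]
    libm.sqrt.restype = ctypes.c_double

    print(libm.sqrt(2.0))  # calls the compiled C routine directly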
Techniques I
- Theoretically identifying “hotspots”
- Empirically identifying “hotspots” (profiler sketch after this list)
- Optimizing the whole process vs. code “chunks”
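For empirically identifying hotspots, Python's built-in cProfile shows where the time actually goes. A minimal sketch; the two workload functions are invented stand-ins for a real program's slow and fast parts.

    import cProfile
    import pstats

    def slow_part(n):
        # Quadratic pair work -- the theoretical hotspot.
        total = 0
        for i in range(n):
            for j in range(n):
                total += i * j
        return total

    def fast_part(n):
        return sum(range(n))

    def whole_process(n):
        return slow_part(n) + fast_part(n)

    profiler = cProfile.Profile()
    profiler.runcall(whole_process, 500)
    pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)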
Techniques II
- Estimation as optimization (what are your project’s specific needs?)
- Probabilistic techniques (math examples, e.g. IsPrime(); primality sketch after this list)
- Some deterministic optimization techniques (e.g. numerical matrix methods, MATLAB magic)
- Language-dependent optimization techniques: low- versus high-level languages
- Hardware-dependent techniques: using all your resources
- Parallel vs. serial
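As a concrete probabilistic technique, here is a minimal Miller-Rabin style IsPrime sketch (ours, not taken from any particular library). Each extra random round shrinks the chance of a composite slipping through, but the answer "prime" is never a mathematical certainty.

    import random

    def is_prime(n, k=20):
        # Probabilistic Miller-Rabin test: True means "almost certainly prime".
        if n < 2:
            return False
        for p in (2, 3, 5, 7):
            if n % p == 0:
                return n == p
        # Write n - 1 as d * 2^r with d odd.
        d, r = n - 1, 0
        while d % 2 == 0:
            d //= 2
            r += 1
        for _ in range(k):
            a = random.randrange(2, n - 1)
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(r - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False  # witness found: n is definitely composite
        return True

    print(is_prime(2**61 - 1))  # a known Mersenne prime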
Has optimization worked?
- Empirical testing
- If probabilistic methods are used, is the program correct for a large enough class of inputs? (verification sketch below)
Review Exercises
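One empirical way to answer that question: run the probabilistic routine and a slow but trusted deterministic routine over every input in a range and count disagreements. The sketch below uses a deliberately weak base-2 Fermat check as the stand-in probabilistic method, so mismatches (base-2 pseudoprimes such as 341 and 561) do show up.

    def is_prime_slow(n):
        # Deterministic trial division -- slow but trusted reference.
        if n < 2:
            return False
        i = 2
        while i * i <= n:
            if n % i == 0:
                return False
            i += 1
        return True

    def is_prime_fermat(n):
        # Cheap probabilistic-style check using the single base 2.
        if n < 4:
            return n in (2, 3)
        return pow(2, n - 1, n) == 1

    # Does the fast check agree with the trusted one on a large class of inputs?
    mismatches = [n for n in range(2, 20_000)
                  if is_prime_fermat(n) != is_prime_slow(n)]
    print(len(mismatches), mismatches[:5])  # base-2 pseudoprimes: 341, 561, ...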
- Exchange and optimize
- Independent small research projects / presentations on commonly used algorithms
When to optimize (later), when not to optimize (initially).
How to theoretically identify hotspots - big-O analysis.
How to empirically determine hotspots - performance testing.
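Besides a profiler, a crude empirical check is to time the suspected hotspot at a few input sizes and watch how the cost scales. A minimal sketch with a deliberately quadratic function (names and sizes are illustrative):

    import time

    def has_duplicates(items):
        # O(n^2) by big-O analysis: compares every pair.
        for i in range(len(items)):
            for j in range(i + 1, len(items)):
                if items[i] == items[j]:
                    return True
        return False

    for n in (1_000, 2_000, 4_000):
        data = list(range(n))            # worst case: no duplicates
        start = time.perf_counter()
        has_duplicates(data)
        elapsed = time.perf_counter() - start
        print(f"n={n}: {elapsed:.3f}s")  # time roughly quadruples as n doubles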
Learning about the details of a language and how that’s relevant to optimization.
When do you need to optimize?
When does optimization actually work? E.g., parallel vs. serial approaches in different settings / at different problem scales.
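Whether parallelism pays off depends on scale: process start-up and data transfer overhead can swamp the gains on small inputs. A minimal sketch using multiprocessing, with an invented CPU-bound task; actual crossover points vary by machine.

    import time
    from multiprocessing import Pool

    def cpu_bound(n):
        # An artificial CPU-bound task.
        return sum(i * i for i in range(n))

    def compare(task_size, n_tasks=8):
        tasks = [task_size] * n_tasks
        start = time.perf_counter()
        list(map(cpu_bound, tasks))            # serial
        serial = time.perf_counter() - start
        start = time.perf_counter()
        with Pool() as pool:
            pool.map(cpu_bound, tasks)         # parallel across processes
        parallel = time.perf_counter() - start
        print(f"task_size={task_size}: serial {serial:.3f}s, parallel {parallel:.3f}s")

    if __name__ == "__main__":
        compare(1_000)        # overhead dominates: parallel is often slower
        compare(2_000_000)    # large enough that parallelism usually wins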
Optimization of whole inputs-into-results process vs. optimization of a code chunk.