*******************************************************************************

Searching: find a number in an array

1) How do you find the ace in these 10 cards?
   * called linear search: best case 1, worst case 10, avg case 5
     - this is O(n)
2) What if I told you they were sorted?
   * called binary search: best case 1, worst case 4, avg case ~3
     - this is O(log n)

Sorting: get the numbers in an array to be in order

1) Take these cards. How did you sort them? Give me the steps.
   * write down algorithms
   A. Selection sort:
          while (not sorted)
              select the smallest unsorted card
              put it in its place
   B. Insertion sort:
          for (each card)
              find its place in the sorted list so far
              insert it there
2) How do we figure out the efficiency of these?
   * an operation is one comparison
   * selection: (n-1)+(n-2)+...+1
   * insertion: 1+2+...+(n-1) in the worst case
   * they have the same worst-case formula: n(n-1)/2 -> O(n^2)
     - but selection always takes this long, no matter what
     - insertion is sometimes faster, depending on starting order
3) Other ways to sort (that a human wouldn't use)
   A. How would you sort two groups that were already sorted?
      * this is a MERGE
      * demo of mergesort: 8 cards, need 6 people
        - split in half and pass halves on
        - when you have just 2, sort them and pass back
        - when you get some back, MERGE them together
      * this is recursion!
        - a method calls itself to solve a smaller problem
        - each person does the same instructions with a smaller set
        - each person is another call of the method
        - eventually there's the "base case" (the smallest problem)
        - then the calls return back up until they're all done
      * how efficient is it?
        - draw the tree: n * log n -> O(n log n) (better)
   B. You can split around a "pivot", smaller vs. larger
      * demo of quicksort:
        - choose the pivot with your eyes closed and split
        - keep the pivot and pass halves on
        - when you have just 2, sort them and pass back
        - when you get some back, put them on either side of the pivot
      * recursion again; same sort of tree, so also O(n log n)
        - but what if the pivot was always really bad? O(n^2)
        - almost always it isn't, though
        - and when it isn't, it's faster than mergesort (less copying)
4) Watch the computer demo!

Recursion: a method calling itself on smaller problems until a base case
   - it has to stop somewhere, or you get infinite recursion
   - since each call gets its own stack frame, too many calls mean a
     stack overflow

1) Fractals!
   - these are the coolest way to look at recursion
   - zoom in on part and it looks the same: a smaller version
   - fractals are mathematically infinite, but pixels aren't,
     so fractal images have base cases too
   - show pictures
2) Mystery method: what does it calculate? (n!)

       public int mystery(int n) {
           if (n == 0)
               return 1;
           else
               return n * mystery(n - 1);
       }

   * draw the tree for n=5 to see how it works
3) Recursion is not always the best way -- sometimes it's really inefficient
   * fibonacci sequence: 1 1 2 3 5 8 13 21 ...
   * what's the pattern? fib(n) = fib(n-1) + fib(n-2)
   * what's the base case? there are two: fib(0) = fib(1) = 1
   * here's the method:

       public int fib(int n) {
           if (n == 0 || n == 1)
               return 1;
           else
               return fib(n - 1) + fib(n - 2);
       }

   * draw the tree for n=5; what's silly about this?
   * recursion is a bad way to do this, b/c it repeats work a lot
   * it would be better to do this with a loop
   * use recursion when it makes things more efficient
     or when iteration would be really hard to program
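The linear vs. binary search comparison from the searching section can be sketched in Java; the class and method names here are illustrative, not from the notes:

```java
public class SearchDemo {
    // Linear search: check each element in turn -- O(n) comparisons.
    public static int linearSearch(int[] a, int target) {
        for (int i = 0; i < a.length; i++) {
            if (a[i] == target) return i;
        }
        return -1; // not found
    }

    // Binary search: requires a SORTED array -- O(log n) comparisons,
    // because each comparison throws away half the remaining elements.
    public static int binarySearch(int[] a, int target) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = (lo + hi) / 2;
            if (a[mid] == target) return mid;
            else if (a[mid] < target) lo = mid + 1; // look in upper half
            else hi = mid - 1;                      // look in lower half
        }
        return -1; // not found
    }

    public static void main(String[] args) {
        int[] sorted = {2, 3, 5, 7, 11, 13, 17, 19, 23, 29};
        System.out.println(linearSearch(sorted, 13)); // prints 5
        System.out.println(binarySearch(sorted, 13)); // prints 5
    }
}
```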
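The selection-sort and insertion-sort pseudocode from the sorting section can be sketched as Java methods (names are illustrative):

```java
import java.util.Arrays;

public class SimpleSorts {
    // Selection sort: repeatedly select the smallest unsorted element
    // and swap it into place -- always about n^2/2 comparisons,
    // no matter how the input starts out.
    public static void selectionSort(int[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            int min = i;
            for (int j = i + 1; j < a.length; j++) {
                if (a[j] < a[min]) min = j;
            }
            int tmp = a[i]; a[i] = a[min]; a[min] = tmp;
        }
    }

    // Insertion sort: take each element and slide it left into its
    // place in the sorted prefix -- same worst case, but fast when
    // the input is already nearly sorted.
    public static void insertionSort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int key = a[i];
            int j = i - 1;
            while (j >= 0 && a[j] > key) {
                a[j + 1] = a[j]; // shift larger elements right
                j--;
            }
            a[j + 1] = key;
        }
    }

    public static void main(String[] args) {
        int[] cards = {7, 2, 9, 4, 1};
        selectionSort(cards);
        System.out.println(Arrays.toString(cards)); // [1, 2, 4, 7, 9]
    }
}
```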
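The mergesort card demo -- split, sort the halves, then MERGE -- follows the same shape in code. This is one common array-copying formulation (class name is illustrative):

```java
import java.util.Arrays;

public class MergeSortDemo {
    // Mergesort: split in half, sort each half recursively,
    // then merge the two sorted halves -- O(n log n).
    public static int[] mergeSort(int[] a) {
        if (a.length <= 1) return a; // base case: smallest problem
        int mid = a.length / 2;
        int[] left  = mergeSort(Arrays.copyOfRange(a, 0, mid));
        int[] right = mergeSort(Arrays.copyOfRange(a, mid, a.length));
        return merge(left, right);
    }

    // Merge two already-sorted arrays by repeatedly taking
    // whichever front element is smaller.
    private static int[] merge(int[] left, int[] right) {
        int[] out = new int[left.length + right.length];
        int i = 0, j = 0, k = 0;
        while (i < left.length && j < right.length) {
            out[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
        }
        while (i < left.length)  out[k++] = left[i++];  // leftovers
        while (j < right.length) out[k++] = right[j++]; // leftovers
        return out;
    }

    public static void main(String[] args) {
        int[] cards = {8, 3, 6, 1, 7, 2, 5, 4}; // the 8-card demo
        System.out.println(Arrays.toString(mergeSort(cards)));
        // [1, 2, 3, 4, 5, 6, 7, 8]
    }
}
```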
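The quicksort demo -- split around a pivot, smaller vs. larger -- can be sketched like this. The notes don't pin down a partitioning scheme; this sketch uses one common in-place scheme that takes the last element as the "eyes closed" pivot:

```java
import java.util.Arrays;

public class QuickSortDemo {
    // Quicksort: pick a pivot, partition into smaller vs. larger,
    // then sort each side recursively -- O(n log n) on average,
    // O(n^2) if the pivot is always really bad.
    public static void quickSort(int[] a, int lo, int hi) {
        if (lo >= hi) return;  // base case: 0 or 1 elements
        int pivot = a[hi];     // arbitrary pivot: the last element
        int i = lo;
        for (int j = lo; j < hi; j++) {
            if (a[j] < pivot) { // smaller elements go to the left side
                swap(a, i, j);
                i++;
            }
        }
        swap(a, i, hi);          // pivot lands in its final position
        quickSort(a, lo, i - 1); // smaller side
        quickSort(a, i + 1, hi); // larger side
    }

    private static void swap(int[] a, int i, int j) {
        int tmp = a[i]; a[i] = a[j]; a[j] = tmp;
    }

    public static void main(String[] args) {
        int[] cards = {8, 3, 6, 1, 7, 2, 5, 4};
        quickSort(cards, 0, cards.length - 1);
        System.out.println(Arrays.toString(cards)); // [1, 2, 3, 4, 5, 6, 7, 8]
    }
}
```

Note that, unlike the mergesort sketch, this version rearranges the array in place -- the "less copying" that makes quicksort faster in practice.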
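The "better to do this with a loop" point about fibonacci can be made concrete: the recursive method from the notes recomputes the same values over and over, while a loop computes each value once. The loop version here is a sketch (method names are illustrative):

```java
public class FibDemo {
    // The recursive version from the notes: correct, but it repeats
    // work, so its running time grows exponentially in n.
    public static int fibRecursive(int n) {
        if (n == 0 || n == 1) return 1;
        return fibRecursive(n - 1) + fibRecursive(n - 2);
    }

    // Iterative version: keep the last two values and walk forward,
    // computing each fibonacci number exactly once -- O(n) time.
    public static int fibLoop(int n) {
        int prev = 1, curr = 1; // fib(0) and fib(1), the two base cases
        for (int i = 2; i <= n; i++) {
            int next = prev + curr;
            prev = curr;
            curr = next;
        }
        return curr;
    }

    public static void main(String[] args) {
        for (int i = 0; i < 8; i++) {
            System.out.print(fibLoop(i) + " "); // 1 1 2 3 5 8 13 21
        }
        System.out.println();
    }
}
```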