You have probably heard about the recent breakthrough in computer science: the discovery of an efficient deterministic algorithm for testing primality. This course is about the paradigms underpinning that and several other discoveries, namely randomness and derandomization.
Over the past few decades, randomness has become one of the most pervasive tools in computer science. Its widespread use includes algorithm design, complexity theory, cryptography, and network protocols. Randomized solutions often perform better than the best known deterministic solutions, and they are frequently more elegant and simpler to implement. As was the case for primality testing until August 4, the quest for efficient deterministic solutions typically remains open. A promising approach is derandomization: reducing the need for randomness and ultimately eliminating it completely, at a marginal cost in performance.
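To give a flavor of the randomized solutions in question, below is a minimal sketch (in Python) of the Miller-Rabin primality test, the classic randomized counterpart of the deterministic breakthrough; the function name and the default of 20 random trials are illustrative choices, not taken from the course material.

    import random

    def is_probably_prime(n, trials=20):
        # Miller-Rabin test: answers "composite" with certainty, or
        # "prime" with error probability at most 4**(-trials).
        if n < 2:
            return False
        for p in (2, 3, 5, 7):
            if n % p == 0:
                return n == p
        # Write n - 1 as d * 2**s with d odd.
        d, s = n - 1, 0
        while d % 2 == 0:
            d //= 2
            s += 1
        for _ in range(trials):
            a = random.randrange(2, n - 1)   # random candidate witness
            x = pow(a, d, n)                 # a**d mod n
            if x in (1, n - 1):
                continue
            for _ in range(s - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False                 # a proves n composite
        return True                          # no witness found

Each random trial catches a composite number with probability at least 3/4, so a handful of trials drives the error below any practical threshold; the deterministic algorithm eliminates even this tiny error, at the price of a considerably more involved procedure.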
The first part of this course will be devoted to the various uses of randomness in computer science. Although our treatment cannot be exhaustive, we will cover paradigms from all application areas.
The second part will be on derandomization and the related question of how to obtain the randomness we need. A randomized procedure usually assumes access to a supply of unbiased, uncorrelated random bits, which may be hard to obtain physically. We will cover two constructs that come to the rescue: pseudorandom generators and extractors. Pseudorandom generators are deterministic algorithms that stretch a short seed of truly random bits into a long sequence of ``pseudorandom'' bits, which the randomized procedure cannot distinguish from truly random ones. They allow us to reduce the number of truly random bits needed. Extractors are algorithms that extract almost uniformly distributed bits from sources of biased and correlated bits. They allow us to run randomized procedures when we only have access to imperfect physical sources of randomness.
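To make the two constructs concrete, here is a minimal sketch of each (all names are illustrative): a toy generator that stretches a seed by hashing it together with a counter, offered purely as a heuristic illustration since it carries none of the provable guarantees the course studies, and von Neumann's classical extractor, which handles the special case of independent bits with an unknown fixed bias.

    import hashlib

    def toy_prg(seed: bytes, num_bytes: int) -> bytes:
        # Stretch a short seed into num_bytes of output by hashing
        # seed || counter.  Heuristic only: there is no proof that a
        # bounded distinguisher cannot tell this output from random.
        out = b""
        counter = 0
        while len(out) < num_bytes:
            out += hashlib.sha256(seed + counter.to_bytes(8, "big")).digest()
            counter += 1
        return out[:num_bytes]

    def von_neumann_extract(bits):
        # Given independent bits that are each 1 with the same unknown
        # probability p, the pairs 01 and 10 occur with equal
        # probability p*(1-p).  Mapping 01 -> 0 and 10 -> 1, and
        # discarding 00 and 11, thus yields perfectly unbiased bits.
        out = []
        for i in range(0, len(bits) - 1, 2):
            a, b = bits[i], bits[i + 1]
            if a != b:
                out.append(a)
        return out

For instance, feeding von_neumann_extract 10,000 independent bits that are each 1 with probability 0.9 yields about 900 output bits, every one of them exactly unbiased; the extractors covered in the course tolerate far weaker sources, including correlated ones.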
The precise balance between the two parts of the course will depend on the interests of the students.
There is no required text. The recommended text is Randomized Algorithms by Motwani and Raghavan. Lecture notes will be made available from the course web page.
The prerequisites are elementary probability theory (at the level of Math 240), algorithms (CS 577), and basic complexity theory (the parts of CS 520 dealing with the Turing machine model and the classes P and NP). Sufficient mathematical maturity can substitute for the latter two prerequisites.
|
The following books are on reserve in Wendt Library:
|
dieter@cs.wisc.edu