Hardness vs. randomness: a survey

Probabilistic algorithms are now considered as practical as deterministic ones for solving computational problems. Nevertheless, both practical and theoretical considerations raise the question of when, and at what cost, the randomness in these algorithms can be eliminated. A natural direction, following the lead of real computers, is to use a deterministic function to stretch a few truly random bits into many pseudorandom bits that are "random enough" for the algorithm. One of the remarkable consequences of this line of research is that obtaining such upper bounds (on simulating probabilistic algorithms by deterministic ones) is intimately related to obtaining lower bounds (on the hardness of the functions used to generate the pseudorandom bits). The authors survey the development of the key ideas leading to an understanding of the connection between hardness and randomness, and its complexity-theoretic implications.
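
To make the derandomization paradigm concrete, the following is a minimal Python sketch of the brute-force simulation the abstract alludes to: run the probabilistic algorithm on the generator's output for every possible seed and take a majority vote. The names `derandomize`, `toy_prg`, and `toy_alg` are hypothetical stand-ins, not from the survey; a real generator's stretch and pseudorandomness would come from a hard function, which is precisely the survey's subject.

```python
import itertools

def derandomize(randomized_alg, x, prg, seed_len):
    """Deterministically simulate a randomized decision algorithm:
    run it on the pseudorandom string produced from every possible
    seed and return the majority answer over all 2**seed_len runs."""
    accepts = 0
    total = 2 ** seed_len
    for seed in itertools.product((0, 1), repeat=seed_len):
        r = prg(seed)                    # stretch few bits into many
        accepts += randomized_alg(x, r)  # algorithm returns 0 or 1
    return 2 * accepts > total           # majority vote

# Hypothetical toy stand-ins, for illustration only:
def toy_prg(seed):
    return seed * 4  # repeats the seed -- NOT actually pseudorandom

def toy_alg(x, r):
    return (x ^ r[0]) & 1  # a decision that consumes one "random" bit

if __name__ == "__main__":
    print(derandomize(toy_alg, 1, toy_prg, 3))
```

The point of the enumeration is the cost accounting: the simulation takes 2^(seed length) runs, so if a generator with seed length O(log n) fools the algorithm, the deterministic simulation runs in polynomial time.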