Do Probabilistic Algorithms Outperform Deterministic Ones?

The introduction of randomization into efficient computation has been one of the most fertile and useful ideas in computer science. In cryptography and asynchronous computing, randomization makes possible tasks that are impossible to perform deterministically. For function computation, many examples are known in which randomization allows considerable savings in resources like space and time over deterministic algorithms, or even "only" simplifies them.

But to what extent is this seeming power of randomness over determinism real? The most famous concrete version of this question regards the power of BPP, the class of problems solvable by probabilistic polynomial-time algorithms making small constant error. We know nothing beyond the trivial relation P ⊆ BPP ⊆ EXP, so both P = BPP (read "randomness is useless") and BPP = EXP (read "randomness is all-powerful") are currently equally possible. A major problem is shrinking this gap in our knowledge, or at the very least eliminating the (preposterous) second possibility.

A fundamental discovery (which emerged in the early 80's in the sequence of seminal papers [18, 4, 19]) regarding this problem is the "hardness versus randomness" paradigm. It relates this major problem to another, equally important one: are there natural hard functions? Roughly speaking, "computationally hard" functions can be used to construct "efficient pseudo-random generators". These in turn lower the randomness requirements of any efficient probabilistic algorithm, allowing for a nontrivial deterministic simulation. Thus, under various complexity assumptions, randomness is weak or even "useless", and the challenge becomes to use the weakest possible assumption, in the hope of finally removing it altogether.

Only two methods are known for converting hard functions into pseudo-random sequences: the BMY-generator (introduced by Blum, Micali and Yao) and the NW-generator (introduced by Nisan and Wigderson). The BMY-generator [4, 19, 8, 9], in which the hardness versus randomness paradigm first appeared, uses one-way functions. Its construction facilitates using either nonuniform or uniform hardness assumptions. The results are (informally) summarized below, for nonuniform assumptions. We use SIZE(s(n)) to denote all functions computable with a family of Boolean circuits of size s(n), and P/poly = SIZE(n^O(1)). Also, SUBEXP = ∩_{ε>0} DTIME(2^{n^ε}), and QP = DTIME(exp((log n)^O(1))), namely quasi-polynomial time.
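To make the idea of converting hardness into pseudo-randomness concrete, here is a minimal toy sketch (not part of the original abstract) of a BMY-style generator in the spirit of Blum and Micali: iterate a candidate one-way permutation and output one hard-core bit per iteration. The use of modular exponentiation as the one-way permutation, the "upper half" predicate as the hard-core bit, and the specific parameters p and g are all illustrative assumptions; real instantiations require much larger, carefully chosen parameters.

# Toy BMY-style (Blum-Micali) generator sketch: the state is pushed through a
# candidate one-way permutation f(x) = g^x mod p, and one hard-core bit is
# emitted per step.  Parameters are illustrative only and far too small to be
# cryptographically meaningful.

def bmy_generator(seed: int, num_bits: int, p: int = 2_147_483_647, g: int = 7) -> list[int]:
    """Stretch a short seed into num_bits pseudo-random bits."""
    x = seed % (p - 1) or 1                         # keep the state in a sensible range
    bits = []
    for _ in range(num_bits):
        bits.append(1 if x >= (p - 1) // 2 else 0)  # hard-core ("upper half") bit of the state
        x = pow(g, x, p)                            # apply the candidate one-way permutation
    return bits

if __name__ == "__main__":
    # A 31-bit seed is stretched to 64 output bits; under the (unverified, toy-sized)
    # assumption that discrete log mod p is hard, the output should look random
    # to any efficient distinguisher.
    print(bmy_generator(seed=123456789, num_bits=64))

The point of the construction is that a short random seed is stretched into many bits that no efficient observer can distinguish from uniform, provided the underlying function really is hard to invert; the NW-generator achieves a similar stretch from a different style of hardness assumption.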

References

[1] Adi Shamir et al. On the Generation of Cryptographically Strong Pseudo-Random Sequences, 1981, ICALP.

[2] Noam Nisan et al. Hardness vs Randomness, 1994, J. Comput. Syst. Sci.

[3] Leonid A. Levin et al. Average Case Complete Problems, 1986, SIAM J. Comput.

[4] Rolf Herken et al. The Universal Turing Machine: A Half-Century Survey, 1992.

[5] Avi Wigderson et al. P = BPP if E requires exponential circuits: derandomizing the XOR lemma, 1997, STOC '97.

[6] Leonid A. Levin et al. A hard-core predicate for all one-way functions, 1989, STOC '89.

[7] Noam Nisan et al. Pseudorandom bits for constant depth circuits, 1991, Comb.

[8] Richard J. Lipton et al. New Directions in Testing, 1989, Distributed Computing and Cryptography.

[9] Michael Luby et al. Pseudorandomness and cryptographic applications, 1996, Princeton Computer Science Notes.

[10] José D. P. Rolim et al. Hitting Sets Derandomize BPP, 1996, ICALP.

[11] Manuel Blum et al. How to generate cryptographically strong sequences of pseudo random bits, 1982, 23rd Annual Symposium on Foundations of Computer Science (SFCS 1982).

[12] Andrew Chi-Chih Yao et al. Theory and application of trapdoor functions, 1982, 23rd Annual Symposium on Foundations of Computer Science (SFCS 1982).

[13] Russell Impagliazzo et al. Hard-core distributions for somewhat hard problems, 1995, Proceedings of IEEE 36th Annual Foundations of Computer Science.