Learning Parities with Structured Noise

In the learning parities with noise problem (well-studied in learning theory and cryptography) we have access to an oracle that, each time we press a button, returns a random vector a ∈ GF(2)^n together with a bit b ∈ GF(2) that was computed as a·u + η, where u ∈ GF(2)^n is a secret vector, and η ∈ GF(2) is a noise bit that is 1 with some probability p. Say p = 1/3. The goal is to recover u. This task is conjectured to be intractable. Here we introduce a slight (?) variation of the model: upon pressing a button, we receive (say) 10 random vectors a1, a2, ..., a10 ∈ GF(2)^n, and corresponding bits b1, b2, ..., b10, of which at most 3 are noisy. The oracle may arbitrarily decide which of the 10 bits to make noisy. We exhibit a polynomial-time algorithm to recover the secret vector u given such an oracle. We discuss generalizations of our result, including learning with more general noise patterns. We can also learn low-depth decision trees in the above structured noise model. We also consider the learning with errors problem over GF(q) and give (a) a 2^{Õ(√n)} algorithm in our structured noise setting, and (b) a slightly subexponential algorithm when the Gaussian noise is small.
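To make the structured noise model concrete, here is a minimal sketch (not taken from the paper) of an oracle with block size 10 and at most 3 noisy bits per block. The function name structured_noise_oracle, the NumPy-based implementation, and the random (rather than adversarial) placement of the noisy positions are assumptions made purely for illustration.

```python
import numpy as np

def structured_noise_oracle(u, m=10, t=3, rng=None):
    """One 'button press': return m random vectors a_1, ..., a_m in GF(2)^n and bits
    b_i = a_i · u + η_i over GF(2), with at most t of the η_i equal to 1.
    (The paper allows the oracle to choose the noisy positions arbitrarily;
    this sketch picks them at random, which is just one admissible behaviour.)"""
    rng = np.random.default_rng() if rng is None else rng
    n = len(u)
    A = rng.integers(0, 2, size=(m, n))       # m random vectors in GF(2)^n
    b = (A @ u) % 2                           # noiseless labels a_i · u
    k = rng.integers(0, t + 1)                # number of noisy bits, at most t
    noisy = rng.choice(m, size=k, replace=False)
    b[noisy] ^= 1                             # flip the chosen labels
    return A, b

# Example: one oracle call for a random secret u in GF(2)^20.
rng = np.random.default_rng(0)
u = rng.integers(0, 2, size=20)
A, b = structured_noise_oracle(u, rng=rng)
assert int(((A @ u) % 2 != b).sum()) <= 3     # at most 3 of the 10 equations are violated
```

A learner in this model sees only the pairs (A, b) returned by such calls and must recover u; in the paper the at most 3 flips within each block may be placed adversarially, so the random placement above simulates only one possible behaviour of the oracle.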