Quantum Computation via Sparse Distributed Representation

Quantum superposition holds that a physical system simultaneously exists in all of its possible states, the number of which is exponential in the number of entities composing the system. The strength of each possible state's presence in the superposition, i.e., its probability of being observed, is given by its probability amplitude coefficient. The assumption that these coefficients must be represented physically disjointly from one another, i.e., localistically, is nearly universal in the quantum theory/computing literature. Alternatively, the coefficients can be represented using sparse distributed representations (SDR), in which each coefficient is represented by a small subset of an overall population of units, and the subsets may overlap. Specifically, I consider an SDR model in which the population consists of Q winner-take-all (WTA) clusters, each containing K binary units, and each coefficient is represented by a set of Q units, one per cluster. Thus K^Q coefficients can be represented with only KQ units. The particular world state, X, whose coefficient's representation, R(X), is the set of Q units active at time t has the maximum probability, and the probability of every other state, Y_i, at time t is measured by the size of the intersection of R(Y_i) with R(X). R(X) therefore simultaneously represents both the particular state X and the probability distribution over all states, so set intersection can be used to classically implement quantum superposition. If an algorithm exists for which the time to store (learn) new representations and to find the closest-matching stored representation (probabilistic inference) remains constant as additional representations are stored, it meets a key criterion of quantum computing. Such an algorithm has already been described: it achieves this "quantum speed-up" without esoteric hardware, and in fact on a single-processor, classical (von Neumann) computer.
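
The following is a minimal sketch, in Python, of the coding scheme described above: codes drawn from Q WTA clusters of K binary units, with a state's probability read out as its code's intersection with the currently active code. It is an illustration only, not the stored-learning algorithm the abstract refers to; the names (random_code, overlap, superposition) and the sizes Q=8, K=10 are assumptions chosen for the example.

```python
import random

# A code is one winner per cluster: a Q-element tuple whose i-th entry
# names the active unit (0..K-1) in cluster i. K**Q distinct codes are
# therefore expressible with only K*Q physical units.
Q, K = 8, 10  # illustrative sizes, not taken from the paper

def random_code(rng):
    """Draw one winner per WTA cluster."""
    return tuple(rng.randrange(K) for _ in range(Q))

def overlap(code_a, code_b):
    """Set-intersection size: number of clusters whose winners agree."""
    return sum(a == b for a, b in zip(code_a, code_b))

def superposition(active_code, stored):
    """Read the single active code as a distribution over all stored states:
    each stored state's unnormalized weight is its overlap with the active
    code, so the active state itself gets the maximum weight, Q."""
    weights = {name: overlap(active_code, code) for name, code in stored.items()}
    total = sum(weights.values()) or 1
    return {name: w / total for name, w in weights.items()}

if __name__ == "__main__":
    rng = random.Random(0)
    stored = {f"state_{i}": random_code(rng) for i in range(5)}
    x = stored["state_0"]            # the state currently "observed"
    dist = superposition(x, stored)  # graded weight for every stored state
    for name, p in sorted(dist.items(), key=lambda kv: -kv[1]):
        print(name, round(p, 3))
```

Running the script prints state_0 with the largest weight and every other stored state with a weight proportional to how many of its Q winners it shares with R(state_0), which is the sense in which one active code simultaneously represents a whole distribution.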
