Data Smashing

Investigating the underlying physics or biology from empirical data requires a quantifiable notion of similarity: when do two observed data sets indicate nearly identical generating processes, and when do they not? The discriminating characteristics to look for in data are often determined by heuristics designed by experts, e.g., distinct shapes of "folded" lightcurves may be used as "features" to classify variable stars, while determination of pathological brain states might require a Fourier analysis of brainwave activity. Finding good features is non-trivial. Here, we propose a universal solution to this problem: we delineate a principle for quantifying similarity between sources of arbitrary data streams, without a priori knowledge, features, or training. We uncover an algebraic structure on a space of symbolic models for quantized data, and show that such stochastic generators may be added and uniquely inverted, and that a model and its inverse always sum to the generator of flat white noise. Therefore, every data stream has an anti-stream: data generated by the inverse model. Similarity between two streams, then, is the degree to which one, when summed with the other's anti-stream, mutually annihilates all statistical structure to noise. We call this data smashing. We present diverse applications, including disambiguation of brainwaves pertaining to epileptic seizures, detection of anomalous cardiac rhythms, and classification of astronomical objects from raw photometry. In our examples, the data smashing principle, without access to any domain knowledge, meets or exceeds the performance of specialized algorithms tuned by domain experts.
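The idea can be illustrated in a toy setting. The sketch below is not the paper's full probabilistic-automaton algebra: it assumes a binary alphabet and i.i.d. generators, where stream inversion reduces to the symbol-wise complement of an independent realization, stream summation keeps only the positions where the two streams agree, and "annihilation to noise" is checked by how far empirical substring frequencies deviate from uniform. All function names here are illustrative, not from the paper.

```python
import random
from itertools import product

def stream(p1, n, rng):
    """Binary symbol stream from an i.i.d. Bernoulli(p1) source
    (a stand-in for a hidden stochastic generator)."""
    return [1 if rng.random() < p1 else 0 for _ in range(n)]

def anti_stream(s):
    """For a binary alphabet, the anti-stream of a realization is its
    symbol-wise complement (the simplest case of stream inversion)."""
    return [1 - x for x in s]

def smash(s, t):
    """Stream summation: read both streams in lockstep and keep a symbol
    only where the two streams agree; disagreements are discarded."""
    return [a for a, b in zip(s, t) if a == b]

def flatness_deviation(s, max_len=2):
    """Deviation from flat white noise: the largest gap between an
    empirical substring frequency and the uniform value 2**-L, over
    all binary substrings of length L <= max_len."""
    dev = 0.0
    for L in range(1, max_len + 1):
        total = len(s) - L + 1
        for w in product([0, 1], repeat=L):
            count = sum(1 for i in range(total) if tuple(s[i:i + L]) == w)
            dev = max(dev, abs(count / total - 0.5 ** L))
    return dev

rng = random.Random(0)
n = 20000
a  = stream(0.7, n, rng)   # stream from generator A
a2 = stream(0.7, n, rng)   # independent stream from the same generator
b  = stream(0.3, n, rng)   # stream from a different generator

same = flatness_deviation(smash(a, anti_stream(a2)))
diff = flatness_deviation(smash(a, anti_stream(b)))
print(same, diff)
```

Smashing a stream against the anti-stream of an independent copy from the same generator leaves the surviving symbols balanced (deviation near sampling noise), while smashing against a different generator's anti-stream leaves a strongly biased residue; the deviation gap is the similarity measure.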
