Stability Analysis of the t-SNE Algorithm for Human Activity Pattern Data

Health technology systems that learn from and react to how humans behave in sensor-equipped environments are now being commercialized. These systems rely on the assumption that training and testing data share the same feature space and originate from the same underlying distribution, which is often unrealistic in real-world applications. Transfer learning can be considered instead. In order to transfer knowledge between a source and a target domain, both should be mapped to a common latent feature space. In this work, the dimensionality reduction algorithm t-SNE is used to map data to such a shared feature space and is further investigated through a proposed novel analysis of output stability. The proposed analysis, Normalized Linear Procrustes Analysis (NLPA), extends the existing Procrustes and Local Procrustes algorithms for aligning manifolds. The methods are tested on data reflecting human behaviour patterns collected in a smart home environment. The results show high partial output stability for the t-SNE algorithm on the tested input data, for which NLPA is able to detect clusters that can be individually aligned and compared. The results highlight the importance of understanding output stability before incorporating dimensionality reduction algorithms into further computation, e.g. for transfer learning.
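The core idea can be illustrated with a minimal sketch: run t-SNE on the same data under different random seeds and quantify how far apart the resulting embeddings are after aligning them with a Procrustes transform. The sketch below uses classical Procrustes analysis from SciPy and a stand-in dataset (scikit-learn's digits) purely for illustration; it does not reproduce the paper's NLPA method or its smart-home activity data.

```python
# Minimal stability sketch: embed the same data twice with t-SNE,
# align the two embeddings with classical Procrustes analysis, and
# report the residual disparity. (Illustrative only; the paper's NLPA
# extends Procrustes/Local Procrustes and is not implemented here.)
import numpy as np
from sklearn.datasets import load_digits
from sklearn.manifold import TSNE
from scipy.spatial import procrustes

# Stand-in data; the paper instead uses human activity features
# collected in a smart home environment.
X, _ = load_digits(return_X_y=True)

def embed(data, seed):
    """Run t-SNE with fixed hyperparameters but a different random seed."""
    return TSNE(n_components=2, perplexity=30, init="pca",
                random_state=seed).fit_transform(data)

Y1 = embed(X, seed=0)
Y2 = embed(X, seed=1)

# Classical Procrustes: translate, scale and rotate Y2 onto Y1, then
# measure the remaining disagreement (0 = identical up to a similarity
# transform; larger values indicate less stable output).
_, _, disparity = procrustes(Y1, Y2)
print(f"Procrustes disparity between the two embeddings: {disparity:.4f}")
```

A low disparity suggests the embeddings agree up to translation, scaling and rotation; the paper's point is that such an alignment-based check should precede any downstream use of the embeddings, e.g. for transfer learning.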
