Insights on representational similarity in neural networks with canonical correlation