Concatenated Tensor Networks for Deep Multi-Task Learning
