Study on text representation method based on deep learning and topic information
[1] Sandeep Yadav, et al. Restricted Boltzmann machine and softmax regression for fault detection and classification, 2017, Complex & Intelligent Systems.
[2] G. Frege. Über Sinn und Bedeutung, 1892.
[3] Jun Wang, et al. Learning text representation using recurrent convolutional neural network with highway layers, 2016, SIGIR.
[4] Quoc V. Le, et al. Distributed Representations of Sentences and Documents, 2014, ICML.
[5] Laurens van der Maaten, et al. Accelerating t-SNE using tree-based algorithms, 2014, J. Mach. Learn. Res.
[6] Zellig S. Harris, et al. Distributional Structure, 1954.
[7] Jason Weston, et al. A unified architecture for natural language processing: deep neural networks with multitask learning, 2008, ICML '08.
[8] Christopher E. Moody, et al. Mixing Dirichlet Topic Models and Word Embeddings to Make lda2vec, 2016, arXiv.
[9] Karl Moritz Hermann, et al. Distributed representations for compositional semantics, 2014, arXiv.
[10] Hongwei Liu, et al. SAR Target Discrimination Based on BOW Model With Sample-Reweighted Category-Specific and Shared Dictionary Learning, 2017, IEEE Geoscience and Remote Sensing Letters.
[11] Lukás Burget, et al. Recurrent neural network based language model, 2010, INTERSPEECH.
[12] Yin Zhang, et al. TempoRec: Temporal-Topic Based Recommender for Social Network Services, 2017, Mob. Networks Appl.
[13] Jeffrey Dean, et al. Distributed Representations of Words and Phrases and their Compositionality, 2013, NIPS.
[14] Andrew Y. Ng, et al. Improving Word Representations via Global Context and Multiple Word Prototypes, 2012, ACL.
[15] Mohsen Guizani, et al. CrossRec: Cross-Domain Recommendations Based on Social Big Data and Cognitive Computing, 2018, Mobile Networks and Applications.
[16] Zhenyu Qi, et al. A Bidirectional Hierarchical Skip-Gram model for text topic embedding, 2016, International Joint Conference on Neural Networks (IJCNN).
[17] Jeffrey Pennington, et al. GloVe: Global Vectors for Word Representation, 2014, EMNLP.
[18] Yoshua Bengio, et al. A Neural Probabilistic Language Model, 2003, J. Mach. Learn. Res.
[19] Luke S. Zettlemoyer, et al. Deep Contextualized Word Representations, 2018, NAACL.
[20] Jeffrey Dean, et al. Efficient Estimation of Word Representations in Vector Space, 2013, ICLR.
[21] Stephen Clark, et al. Learning Adjective Meanings with a Tensor-Based Skip-Gram Model, 2015, CoNLL.
[22] Xiaohui Yan, et al. A biterm topic model for short texts, 2013, WWW.