暂无分享,去创建一个
[1] Quan Pan,et al. A Generative Model for category text generation , 2018, Inf. Sci..
[2] Toru Nishino,et al. Keeping Consistency of Sentence Generation and Document Classification with Multi-Task Learning , 2019, EMNLP.
[3] Ying Tan,et al. TextDream: Conditional Text Generation by Searching in the Semantic Space , 2018, 2018 IEEE Congress on Evolutionary Computation (CEC).
[4] Jacob Abernethy,et al. On Convergence and Stability of GANs , 2018 .
[5] Erhardt Barth,et al. A Hybrid Convolutional Variational Autoencoder for Text Generation , 2017, EMNLP.
[6] Jinyin Chen,et al. Customizable text generation via conditional text generative adversarial network , 2020, Neurocomputing.
[7] Yoon Kim,et al. Convolutional Neural Networks for Sentence Classification , 2014, EMNLP.
[8] Ming-Yu Liu,et al. Coupled Generative Adversarial Networks , 2016, NIPS.
[9] Christopher Potts,et al. Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank , 2013, EMNLP.
[10] Bin Guo,et al. Conditional Text Generation for Harmonious Human-Machine Interaction , 2021, ACM Trans. Intell. Syst. Technol..
[11] Min Yang,et al. A Multi-Task Learning Framework for Abstractive Text Summarization , 2019, AAAI.
[12] Jianfeng Gao,et al. Deep Learning Based Text Classification: A Comprehensive Review , 2020, ArXiv.
[13] Structured Attention for Unsupervised Dialogue Structure Induction , 2020, EMNLP.
[14] Anton van den Hengel,et al. Image-Based Recommendations on Styles and Substitutes , 2015, SIGIR.
[15] Terrence J. Sejnowski,et al. Modeling Large Dynamical Systems with Dynamical Consistent Neural Networks , 2007 .
[16] Yoshua Bengio,et al. A Recurrent Latent Variable Model for Sequential Data , 2015, NIPS.
[17] Pieter Abbeel,et al. Variational Lossy Autoencoder , 2016, ICLR.
[18] Mohammad Taher Pilehvar,et al. On the Importance of the Kullback-Leibler Divergence Term in Variational Autoencoders for Text Generation , 2019, NGT@EMNLP-IJCNLP.
[19] Greg Mori,et al. Graph Generation with Variational Recurrent Neural Network , 2019, ArXiv.
[20] Yoshua Bengio,et al. Z-Forcing: Training Stochastic Recurrent Networks , 2017, NIPS.
[21] Chenguang Zhu,et al. Multi-task Learning for Natural Language Generation in Task-Oriented Dialogue , 2019, EMNLP.
[22] Samy Bengio,et al. Generating Sentences from a Continuous Space , 2015, CoNLL.
[23] Jiahai Wang,et al. CatGAN: Category-aware Generative Adversarial Networks with Hierarchical Evolutionary Learning for Category Text Generation , 2019, AAAI.
[24] Max Welling,et al. Auto-Encoding Variational Bayes , 2013, ICLR.
[25] Steven Lake Waslander,et al. State initialization for recurrent neural network modeling of time-series data , 2017, 2017 International Joint Conference on Neural Networks (IJCNN).
[26] Eric P. Xing,et al. Toward Controlled Generation of Text , 2017, ICML.
[27] Ke Wang,et al. SentiGAN: Generating Sentimental Texts via Mixture Adversarial Networks , 2018, IJCAI.
[28] Sebastian Ruder,et al. An Overview of Multi-Task Learning in Deep Neural Networks , 2017, ArXiv.
[29] Philip Bachman,et al. An Architecture for Deep, Hierarchical Generative Models , 2016, NIPS.
[30] Jonathan Baxter,et al. A Model of Inductive Bias Learning , 2000, J. Artif. Intell. Res..
[31] Xuanjing Huang,et al. Toward Diverse Text Generation with Inverse Reinforcement Learning , 2018, IJCAI.
[32] Lantao Yu,et al. SeqGAN: Sequence Generative Adversarial Nets with Policy Gradient , 2016, AAAI.
[33] Graham Neubig,et al. Lagging Inference Networks and Posterior Collapse in Variational Autoencoders , 2019, ICLR.
[34] Chi Zhang,et al. Multi-task learning for abstractive text summarization with key information guide network , 2020, EURASIP Journal on Advances in Signal Processing.
[35] Samy Bengio,et al. Scheduled Sampling for Sequence Prediction with Recurrent Neural Networks , 2015, NIPS.
[36] Yong Yu,et al. Long Text Generation via Adversarial Training with Leaked Information , 2017, AAAI.
[37] Jure Leskovec,et al. Learning Attitudes and Attributes from Multi-aspect Reviews , 2012, 2012 IEEE 12th International Conference on Data Mining.