Collaborative Storytelling with Large-scale Neural Language Models
Eric Nichols | Randy Gomez | Leo Gao