Quan Hung Tran | Nhan Dam | Tuan Lai | Franck Dernoncourt | Trung Le | Nham Le | Dinh Q. Phung
[1] Ankur Taly, et al. Axiomatic Attribution for Deep Networks, 2017, ICML.
[2] Luke S. Zettlemoyer, et al. Deep Contextualized Word Representations, 2018, NAACL.
[3] Zhiguo Wang, et al. FAQ-based Question Answering via Word Alignment, 2015, arXiv.
[4] Fei-Fei Li, et al. Visualizing and Understanding Recurrent Networks, 2015, arXiv.
[5] Johannes Gehrke, et al. Intelligible Models for HealthCare: Predicting Pneumonia Risk and Hospital 30-day Readmission, 2015, KDD.
[6] Si Li, et al. A Compare-Aggregate Model with Dynamic-Clip Attention for Answer Selection, 2017, CIKM.
[7] Quanshi Zhang, et al. Examining CNN Representations with Respect to Dataset Bias, 2017, AAAI.
[8] Noah A. Smith, et al. What is the Jeopardy Model? A Quasi-Synchronous Grammar for QA, 2007, EMNLP.
[9] Andrew Zisserman, et al. Deep Inside Convolutional Networks: Visualising Image Classification Models and Saliency Maps, 2013, ICLR.
[10] Alexander Binder, et al. Unmasking Clever Hans predictors and assessing what machines really learn, 2019, Nature Communications.
[11] Jimmy J. Lin, et al. Bridging the Gap between Relevance Matching and Semantic Matching for Short Text Similarity Modeling, 2019, EMNLP.
[12] Zhi-Hong Deng, et al. Inter-Weighted Alignment Network for Sentence Pair Modeling, 2017, EMNLP.
[13] Ingrid Zukerman, et al. The Context-Dependent Additive Recurrent Neural Net, 2018, NAACL.
[14] Alessandro Moschitti, et al. TANDA: Transfer and Adapt Pre-Trained Transformer Models for Answer Sentence Selection, 2019, AAAI.
[15] Siu Cheung Hui, et al. Hyperbolic Representation Learning for Fast and Efficient Neural Question Answering, 2017, WSDM.
[16] Richard S. Zemel, et al. Prototypical Networks for Few-shot Learning, 2017, NIPS.
[17] Bowen Zhou, et al. LSTM-based Deep Learning Models for Non-factoid Answer Selection, 2015, arXiv.
[18] Yi Yang, et al. WikiQA: A Challenge Dataset for Open-Domain Question Answering, 2015, EMNLP.
[19] Omer Levy, et al. RoBERTa: A Robustly Optimized BERT Pretraining Approach, 2019, arXiv.
[20] Siu Cheung Hui, et al. Multi-Cast Attention Networks, 2018, KDD.
[21] Nedim Lipka, et al. ISA: An Intelligent Shopping Assistant, 2020, AACL.
[22] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[23] Quan Hung Tran, et al. A Gated Self-attention Memory Network for Answer Selection, 2019, EMNLP.
[24] Mark Lee, et al. Integrating Question Classification and Deep Learning for Improved Answer Selection, 2018, COLING.
[25] Alex Graves, et al. Neural Turing Machines, 2014, arXiv.
[26] Hugo Larochelle, et al. Optimization as a Model for Few-Shot Learning, 2016, ICLR.
[27] Quanshi Zhang, et al. Interpretable Convolutional Neural Networks, 2018, CVPR.
[28] Stefan Roth, et al. Neural Nearest Neighbors Networks, 2018, NeurIPS.
[29] Andreas Stolcke, et al. Dialogue act modeling for automatic tagging and recognition of conversational speech, 2000, Computational Linguistics.
[30] Oriol Vinyals, et al. Matching Networks for One Shot Learning, 2016, NIPS.
[31] Xiangji Huang, et al. Contextualized Embeddings based Transformer Encoder for Sentence Similarity Modeling in Answer Selection Task, 2020, LREC.
[32] Sheng Li, et al. A Review on Deep Learning Techniques Applied to Answer Selection, 2018, COLING.