Hybrid Ranking Network for Text-to-SQL
Souvik Kundu | Kaushik Chakrabarti | Jianwen Zhang | Zheng Chen | Qin Lyu | Shobhit Hathi
[1] Dawn Xiaodong Song, et al. SQLNet: Generating Structured Queries From Natural Language Without Reinforcement Learning, 2017, ArXiv.
[2] Bowen Zhou, et al. Zero-shot Text-to-SQL Learning with Auxiliary Task, 2019, AAAI.
[3] Xiaodong Liu, et al. Multi-Task Deep Neural Networks for Natural Language Understanding, 2019, ACL.
[4] Yiming Yang, et al. XLNet: Generalized Autoregressive Pretraining for Language Understanding, 2019, NeurIPS.
[5] Zhiyuan Liu, et al. Understanding the Behaviors of BERT in Ranking, 2019, ArXiv.
[6] Seunghyun Park, et al. A Comprehensive Exploration on WikiSQL with Table-Aware Word Contextualization, 2019, ArXiv.
[7] Weizhu Chen, et al. IncSQL: Training Incremental Text-to-SQL Parsers with Non-Deterministic Oracles, 2018, ArXiv.
[8] Mirella Lapata, et al. Coarse-to-Fine Decoding for Neural Semantic Parsing, 2018, ACL.
[9] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[10] Po-Sen Huang, et al. Execution-Guided Neural Program Decoding, 2018, ArXiv.
[11] Richard Socher, et al. Seq2SQL: Generating Structured Queries from Natural Language using Reinforcement Learning, 2018, ArXiv.
[12] Yan Gao, et al. Towards Complex Text-to-SQL in Cross-Domain Database with Intermediate Representation, 2019, ACL.
[13] Oren Etzioni, et al. Towards a theory of natural language interfaces to databases, 2003, IUI '03.
[14] Abraham Bernstein, et al. A comparative survey of recent natural language interfaces for databases, 2019, The VLDB Journal.
[15] Tao Yu, et al. TypeSQL: Knowledge-Based Type-Aware Neural Text-to-SQL Generation, 2018, NAACL.
[16] Peter Thanisch, et al. Natural language interfaces to databases – an introduction, 1995, Natural Language Engineering.
[17] Omer Levy, et al. RoBERTa: A Robustly Optimized BERT Pretraining Approach, 2019, ArXiv.