Masked graph modeling for molecule generation
Omar Mahmood | Elman Mansimov | Richard Bonneau | Kyunghyun Cho