Bidirectional generation of structure and properties through a single molecular foundation model
[1] Chaoning Zhang,et al. Text-to-image Diffusion Models in Generative AI: A Survey , 2023, ArXiv.
[2] Arkadii I. Lin,et al. Inverse QSAR: Reversing Descriptor-Driven Prediction Pipeline Using Attention-Based Conditional Variational Autoencoder , 2022, J. Chem. Inf. Model..
[3] Simone Sciabola,et al. A Transformer-based Generative Model for De Novo Molecular Design , 2022, ArXiv.
[4] R. Ramprasad,et al. polyBERT: a chemical language model to enable fully machine-driven ultrafast polymer informatics , 2022, Nature communications.
[5] Seongok Ryu,et al. Accurate, reliable and interpretable solubility prediction of druglike molecules with attention pooling and Bayesian learning , 2022, ArXiv.
[6] Bharath Ramsundar,et al. ChemBERTa-2: Towards Chemical Foundation Models , 2022, ArXiv.
[7] A. Farimani,et al. TransPolymer: a Transformer-based language model for polymer property predictions , 2022, npj Computational Materials.
[8] Shuan Chen,et al. A generalized-template-based graph neural network for accurate organic reactivity prediction , 2022, Nature Machine Intelligence.
[9] Yatao Bian,et al. Can Pre-trained Models Really Learn Better Molecular Representations for AI-aided Drug Discovery? , 2022, ArXiv.
[10] Hui Yu,et al. Visuals to Text: A Comprehensive Review on Automatic Image Captioning , 2022, IEEE/CAA Journal of Automatica Sinica.
[11] Zirui Wang,et al. CoCa: Contrastive Captioners are Image-Text Foundation Models , 2022, Trans. Mach. Learn. Res..
[12] C. Tyrchan,et al. Implications of Additivity and Nonadditivity for Machine Learning and Deep Learning Models in Drug Design , 2022, ACS omega.
[13] Duzhen Zhang,et al. VLP: A Survey on Vision-language Pre-training , 2022, Machine Intelligence Research.
[14] Jaechang Lim,et al. Drug-likeness scoring based on unsupervised learning , 2021, Chemical science.
[15] Connor W. Coley,et al. Permutation invariant graph-to-sequence model for template-free retrosynthesis and reaction prediction , 2021, J. Chem. Inf. Model..
[16] Michael S. Bernstein,et al. On the Opportunities and Risks of Foundation Models , 2021, ArXiv.
[17] Junnan Li,et al. Align before Fuse: Vision and Language Representation Learning with Momentum Distillation , 2021, NeurIPS.
[18] Esben Bjerrum,et al. Chemformer: a pre-trained transformer for computational chemistry , 2021, Mach. Learn. Sci. Technol..
[19] Brian M. Belgodere,et al. Large-scale chemical language representations capture molecular structure and properties , 2021, Nature Machine Intelligence.
[20] Tao Qin,et al. Dual-view Molecule Pre-training , 2021, ArXiv.
[21] Hua Wu,et al. Geometry-enhanced molecular representation learning for property prediction , 2021, Nature Machine Intelligence.
[22] Nathan Brown,et al. De novo molecular design and generative models. , 2021, Drug discovery today.
[23] Julien Mairal,et al. Emerging Properties in Self-Supervised Vision Transformers , 2021, 2021 IEEE/CVF International Conference on Computer Vision (ICCV).
[24] Ilya Sutskever,et al. Learning Transferable Visual Models From Natural Language Supervision , 2021, ICML.
[25] Parminder Kaur,et al. Comparative analysis on cross-modal information retrieval: A review , 2021, Comput. Sci. Rev..
[26] Youngchun Kwon,et al. Valid, Plausible, and Diverse Retrosynthesis Using Tied Two-Way Transformers with Latent Variables , 2021, J. Chem. Inf. Model..
[27] C. Tyrchan,et al. Nonadditivity in public and inhouse data: implications for drug design , 2020, Journal of Cheminformatics.
[28] Benjamin A. Shoemaker,et al. PubChem in 2021: new data content and improved web interfaces , 2020, Nucleic Acids Res..
[29] S. Gelly,et al. An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale , 2020, ICLR.
[30] Bharath Ramsundar,et al. ChemBERTa: Large-Scale Self-Supervised Pretraining for Molecular Property Prediction , 2020, ArXiv.
[31] Michael Crawshaw,et al. Multi-Task Learning with Deep Neural Networks: A Survey , 2020, ArXiv.
[32] Jaechang Lim,et al. PIGNet: a physics-informed deep learning model toward generalized drug–target interaction predictions , 2020, Chemical science.
[33] Stanislaw Jastrzebski,et al. Molecule Edit Graph Attention Network: Modeling Chemical Reactions as Sequences of Graph Edits , 2020, J. Chem. Inf. Model..
[34] Yatao Bian,et al. Self-Supervised Graph Transformer on Large-Scale Molecular Data , 2020, NeurIPS.
[35] Pierre H. Richemond,et al. Bootstrap Your Own Latent: A New Approach to Self-Supervised Learning , 2020, NeurIPS.
[36] Mark Chen,et al. Language Models are Few-Shot Learners , 2020, NeurIPS.
[37] Jianfeng Gao,et al. Oscar: Object-Semantics Aligned Pre-training for Vision-Language Tasks , 2020, ECCV.
[38] I. Tetko,et al. State-of-the-art augmented NLP transformer models for direct and single-step retrosynthesis , 2020, Nature Communications.
[39] Jacob Devlin,et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding , 2019, NAACL.
[40] Ross B. Girshick,et al. Momentum Contrast for Unsupervised Visual Representation Learning , 2019, 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[41] Colin Raffel,et al. Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer , 2019, J. Mach. Learn. Res..
[42] Yu Cheng,et al. UNITER: UNiversal Image-TExt Representation Learning , 2019, ECCV.
[43] Omer Levy,et al. RoBERTa: A Robustly Optimized BERT Pretraining Approach , 2019, ArXiv.
[44] Yuedong Yang,et al. Predicting Retrosynthetic Reaction using Self-Corrected Transformer Neural Networks , 2019, ArXiv.
[45] Jaechang Lim,et al. Scaffold-based molecular design with a graph generative model , 2019, Chemical science.
[46] Christopher A. Hunter,et al. Molecular Transformer: A Model for Uncertainty-Calibrated Chemical Reaction Prediction , 2018, ACS central science.
[47] J. Leskovec,et al. Strategies for Pre-training Graph Neural Networks , 2019, ICLR.
[48] Regina Barzilay,et al. Analyzing Learned Molecular Representations for Property Prediction , 2019, J. Chem. Inf. Model..
[49] Li Li,et al. Optimization of Molecules via Deep Reinforcement Learning , 2018, Scientific Reports.
[50] Qi Zhao,et al. Predicting Drug-Induced Liver Injury Using Ensemble Learning Methods and Molecular Fingerprints , 2018, Toxicological sciences : an official journal of the Society of Toxicology.
[51] Djork-Arné Clevert,et al. Learning continuous and data-driven molecular descriptors by translating equivalent chemical representations , 2018, Chemical science.
[52] Yingyu Liang,et al. N-Gram Graph: Simple Unsupervised Representation for Graphs, with Applications to Molecules , 2018, NeurIPS.
[53] Jin Woo Kim,et al. Molecular generative model based on conditional variational autoencoder for de novo molecular design , 2018, Journal of Cheminformatics.
[54] Thierry Kogej,et al. Generating Focused Molecule Libraries for Drug Discovery with Recurrent Neural Networks , 2017, ACS central science.
[55] Aishwarya Agrawal,et al. VQA: Visual Question Answering , 2017, International Journal of Computer Vision.
[56] Regina Barzilay,et al. Predicting Organic Reaction Outcomes with Weisfeiler-Lehman Network , 2017, NIPS.
[57] Lukasz Kaiser,et al. Attention is All you Need , 2017, NIPS.
[58] Thomas Blaschke,et al. Molecular de-novo design through deep reinforcement learning , 2017, Journal of Cheminformatics.
[59] Esben Jannik Bjerrum,et al. SMILES Enumeration as Data Augmentation for Neural Network Modeling of Molecules , 2017, ArXiv.
[60] Vijay S. Pande,et al. MoleculeNet: a benchmark for molecular machine learning , 2017, Chemical science.
[61] Alán Aspuru-Guzik,et al. Automatic Chemical Design Using a Data-Driven Continuous Representation of Molecules , 2016, ACS central science.
[62] John J. Irwin,et al. ZINC 15 – Ligand Discovery for Everyone , 2015, J. Chem. Inf. Model..
[63] Philip Gage,et al. A new algorithm for data compression , 1994 .
[64] H. Baxter Williams,et al. A Survey , 1992 .
[65] M. Glenski,et al. Foundation Models of Scientific Knowledge for Chemistry: Opportunities, Challenges and Lessons Learned , 2022, BIGSCIENCE.
[66] Amit Dhurandhar,et al. Reprogramming Large Pretrained Language Models for Antibody Sequence Infilling , 2022, ArXiv.
[67] K. Shin,et al. Self-supervised Co-learning of Uncurated Images and Reports Enables Oversight AI in Radiology , 2022 .
[68] Prashansa Agrawal,et al. Artificial Intelligence in Drug Discovery and Development , 2018 .
[69] Robert C. Wolpert,et al. A Review of the , 1985 .