
Language Models Can Learn Complex Molecular Distributions

Overview
Journal Nat Commun
Specialty Biology
Date 2022 Jun 7
PMID 35672310
Abstract

Deep generative models of molecules have grown immensely in popularity. Trained on relevant datasets, these models are used to search through chemical space. The downstream utility of generative models for the inverse design of novel functional compounds depends on their ability to learn a training distribution of molecules. The simplest example is a language model that takes the form of a recurrent neural network and generates molecules using a string representation. Since their initial use, subsequent work has shown that language models are very capable; in particular, recent research has demonstrated their utility in the low-data regime. In this work, we investigate the capacity of simple language models to learn more complex distributions of molecules. For this purpose, we introduce several challenging generative modeling tasks by compiling larger, more complex distributions of molecules, and we evaluate the ability of language models on each task. The results demonstrate that language models are powerful generative models, capable of adeptly learning complex molecular distributions. Language models can accurately generate distributions of the highest-scoring penalized LogP molecules in ZINC15, multi-modal molecular distributions, and the largest molecules in PubChem. The results also highlight the limitations of some of the most popular and recent graph generative models, many of which cannot scale to these molecular distributions.
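To illustrate the core idea the abstract describes (a language model that learns a distribution over molecular strings and samples new ones), here is a minimal, hypothetical sketch. The paper itself uses recurrent neural network language models over SMILES; a character-level bigram (Markov) model stands in below purely to show the train-then-sample workflow, and the tiny "training set" of SMILES strings is illustrative only.

```python
import random

def train_bigram(smiles_list):
    """Count character transitions over SMILES strings,
    using ^ and $ as start/end tokens."""
    counts = {}
    for s in smiles_list:
        chars = ["^"] + list(s) + ["$"]
        for a, b in zip(chars, chars[1:]):
            counts.setdefault(a, {})
            counts[a][b] = counts[a].get(b, 0) + 1
    return counts

def sample(counts, rng, max_len=50):
    """Sample one string from the learned transition table,
    stopping at the end token or max_len characters."""
    out, cur = [], "^"
    while len(out) < max_len:
        nxt, weights = zip(*counts[cur].items())
        cur = rng.choices(nxt, weights=weights)[0]
        if cur == "$":
            break
        out.append(cur)
    return "".join(out)

# Toy training set: ethanol, acetic acid, benzene (SMILES).
data = ["CCO", "CC(=O)O", "c1ccccc1"]
model = train_bigram(data)
print(sample(model, random.Random(0)))
```

A real model of this kind would replace the bigram table with a neural network trained by next-token prediction, and sampled strings would then be checked for chemical validity (e.g. by parsing the SMILES).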

Citing Articles

Accelerating discovery of bioactive ligands with pharmacophore-informed generative models.

Xie W, Zhang J, Xie Q, Gong C, Ren Y, Xie J Nat Commun. 2025; 16(1):2391.

PMID: 40064886 PMC: 11894060. DOI: 10.1038/s41467-025-56349-0.


Artificial intelligence in drug development.

Zhang K, Yang X, Wang Y, Yu Y, Huang N, Li G Nat Med. 2025; 31(1):45-59.

PMID: 39833407 DOI: 10.1038/s41591-024-03434-4.


Text-guided small molecule generation via diffusion model.

Luo Y, Fang J, Li S, Liu Z, Wu J, Zhang A iScience. 2025; 27(11):110992.

PMID: 39759073 PMC: 11700631. DOI: 10.1016/j.isci.2024.110992.


A hitchhiker's guide to deep chemical language processing for bioactivity prediction.

Ozcelik R, Grisoni F Digit Discov. 2024; 4(2):316-325.

PMID: 39726698 PMC: 11667676. DOI: 10.1039/d4dd00311j.


Group graph: a molecular graph representation with enhanced performance, efficiency and interpretability.

Cao P, He Y, Cui M, Zhang X, Zhang Q, Zhang H J Cheminform. 2024; 16(1):133.

PMID: 39609909 PMC: 11606038. DOI: 10.1186/s13321-024-00933-x.

