Retroformer: Pushing the Limits of End-to-end Retrosynthesis Transformer

Yue Wan, Chang-Yu Hsieh, Ben Liao, Shengyu Zhang
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:22475-22490, 2022.

Abstract

Retrosynthesis prediction is one of the fundamental challenges in organic synthesis. The task is to predict the reactants given a core product. With the advancement of machine learning, computer-aided synthesis planning has gained increasing interest. Numerous methods have been proposed to solve this problem, with varying levels of dependency on additional chemical knowledge. In this paper, we propose Retroformer, a novel Transformer-based architecture for retrosynthesis prediction that does not rely on any cheminformatics tools for molecule editing. Via the proposed local attention head, the model can jointly encode the molecular sequence and graph, and efficiently exchange information between the local reactive region and the global reaction context. Retroformer sets a new state-of-the-art accuracy for end-to-end template-free retrosynthesis, and improves on molecule and reaction validity over many strong baselines. In addition, its generative procedure is highly interpretable and controllable. Overall, Retroformer pushes the limits of the reaction reasoning ability of deep generative models.
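The local attention head described in the abstract can be pictured as ordinary scaled dot-product attention in which a subset of heads is masked to a predicted reactive region, while the remaining heads keep the global reaction context. The following is a minimal sketch of that idea, not the paper's actual implementation: the function name local_global_attention, the reactive_mask tensor, and the choice of which heads are local are all illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def local_global_attention(q, k, v, reactive_mask, n_local_heads=2):
    """Illustrative multi-head attention where the first `n_local_heads`
    heads attend only within a predicted reactive region.

    q, k, v:        (batch, heads, seq, dim) projected queries/keys/values
    reactive_mask:  (batch, seq) boolean, True for tokens in the reactive
                    region (assumed non-empty for every sequence)
    """
    # Standard scaled dot-product scores: (batch, heads, seq, seq)
    scores = q @ k.transpose(-2, -1) / q.size(-1) ** 0.5

    # Local heads: add -inf to scores for keys outside the reactive region,
    # so after softmax these heads only mix information within that region.
    local_bias = torch.where(reactive_mask[:, None, None, :],
                             0.0, float("-inf"))
    scores[:, :n_local_heads] = scores[:, :n_local_heads] + local_bias

    attn = F.softmax(scores, dim=-1)
    # The remaining (unmasked) heads still see the full sequence,
    # preserving the global reaction context.
    return attn @ v
```

Splitting heads this way lets part of the model's capacity specialize on the reactive center while the rest retains the whole-molecule view, which matches the local/global information exchange the abstract describes.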

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-wan22a,
  title     = {Retroformer: Pushing the Limits of End-to-end Retrosynthesis Transformer},
  author    = {Wan, Yue and Hsieh, Chang-Yu and Liao, Ben and Zhang, Shengyu},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {22475--22490},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/wan22a/wan22a.pdf},
  url       = {https://proceedings.mlr.press/v162/wan22a.html}
}
Endnote
%0 Conference Paper
%T Retroformer: Pushing the Limits of End-to-end Retrosynthesis Transformer
%A Yue Wan
%A Chang-Yu Hsieh
%A Ben Liao
%A Shengyu Zhang
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-wan22a
%I PMLR
%P 22475--22490
%U https://proceedings.mlr.press/v162/wan22a.html
%V 162
APA
Wan, Y., Hsieh, C., Liao, B. & Zhang, S. (2022). Retroformer: Pushing the Limits of End-to-end Retrosynthesis Transformer. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:22475-22490. Available from https://proceedings.mlr.press/v162/wan22a.html.