AnalogGenie-Lite: Enhancing Scalability and Precision in Circuit Topology Discovery through Lightweight Graph Modeling

Jian Gao, Weidong Cao, Xuan Zhang
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:18277-18289, 2025.

Abstract

Sustained performance improvements of integrated circuits (ICs) drive the continuous advancement of nearly all transformative technologies. Since the IC's invention, its performance enhancements have been dominated by semiconductor technology scaling. Yet, as Moore's law tapers off, a crucial question arises: How can we sustain IC performance in the post-Moore era? Creating new circuit topologies has emerged as a promising pathway to address this fundamental need. This work proposes AnalogGenie-Lite, a decoder-only transformer that discovers novel analog IC topologies with significantly enhanced scalability and precision via lightweight graph modeling. AnalogGenie-Lite makes several unique contributions, including concise device-pin representations (i.e., advancing the best prior art from $O\left(n^2\right)$ to $O\left(n\right)$), frequent sub-graph mining, and optimal sequence modeling. Compared to state-of-the-art circuit topology discovery methods, it achieves $5.15\times$ to $71.11\times$ gains in scalability and 23.5% to 33.6% improvements in validity. Case studies on other domains' graphs are also provided to show the broader applicability of the proposed graph modeling approach. Source code: https://github.com/xz-group/AnalogGenie-Lite.
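To build intuition for the $O\left(n^2\right)$ versus $O\left(n\right)$ claim, the sketch below contrasts two generic ways of serializing a small circuit graph into a token sequence: flattening a full pin-adjacency matrix (quadratic in the number of pins) versus listing pins and their net connections (linear for bounded-degree circuits). This is a minimal illustration of the scaling argument only, not the paper's actual encoding; the device and pin names are hypothetical.

```python
# Toy two-transistor circuit: nodes are device pins (hypothetical names),
# edges record which pins share a net.
pins = ["M1.G", "M1.D", "M1.S", "M2.G", "M2.D", "M2.S", "VDD", "GND", "OUT"]
edges = [("M1.D", "OUT"), ("M2.D", "OUT"), ("M1.S", "GND"),
         ("M2.S", "VDD"), ("M1.G", "M2.G")]

# Dense encoding: flatten the full n-by-n pin-adjacency matrix,
# giving n^2 tokens regardless of how sparse the circuit is.
n = len(pins)
dense_len = n * n

# Sparse device-pin encoding: one token per pin plus two per connection.
# Since each pin joins a roughly constant number of nets in real circuits,
# sequence length grows linearly with circuit size.
sparse_len = len(pins) + 2 * len(edges)

print(dense_len, sparse_len)  # 81 vs 19 for this toy circuit
```

Even at nine pins the gap is large (81 vs. 19 tokens), and it widens quadratically as circuits grow, which is why a linear representation directly improves the scalability of sequence models over circuit graphs.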

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-gao25a,
  title     = {{A}nalog{G}enie-Lite: Enhancing Scalability and Precision in Circuit Topology Discovery through Lightweight Graph Modeling},
  author    = {Gao, Jian and Cao, Weidong and Zhang, Xuan},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {18277--18289},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/gao25a/gao25a.pdf},
  url       = {https://proceedings.mlr.press/v267/gao25a.html},
  abstract  = {The sustainable performance improvements of integrated circuits (ICs) drive the continuous advancement of nearly all transformative technologies. Since its invention, IC performance enhancements have been dominated by scaling the semiconductor technology. Yet, as Moore’s law tapers off, a crucial question arises: How can we sustain IC performance in the post-Moore era? Creating new circuit topologies has emerged as a promising pathway to address this fundamental need. This work proposes AnalogGenie-Lite, a decoder-only transformer that discovers novel analog IC topologies with significantly enhanced scalability and precision via lightweight graph modeling. AnalogGenie-Lite makes several unique contributions, including concise device-pin representations (i.e., advancing the best prior art from $O\left(n^2\right)$ to $O\left(n\right)$), frequent sub-graph mining, and optimal sequence modeling. Compared to state-of-the-art circuit topology discovery methods, it achieves $5.15\times$ to $71.11\times$ gains in scalability and 23.5% to 33.6% improvements in validity. Case studies on other domains’ graphs are also provided to show the broader applicability of the proposed graph modeling approach. Source code: https://github.com/xz-group/AnalogGenie-Lite.}
}
Endnote
%0 Conference Paper
%T AnalogGenie-Lite: Enhancing Scalability and Precision in Circuit Topology Discovery through Lightweight Graph Modeling
%A Jian Gao
%A Weidong Cao
%A Xuan Zhang
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-gao25a
%I PMLR
%P 18277--18289
%U https://proceedings.mlr.press/v267/gao25a.html
%V 267
%X The sustainable performance improvements of integrated circuits (ICs) drive the continuous advancement of nearly all transformative technologies. Since its invention, IC performance enhancements have been dominated by scaling the semiconductor technology. Yet, as Moore’s law tapers off, a crucial question arises: How can we sustain IC performance in the post-Moore era? Creating new circuit topologies has emerged as a promising pathway to address this fundamental need. This work proposes AnalogGenie-Lite, a decoder-only transformer that discovers novel analog IC topologies with significantly enhanced scalability and precision via lightweight graph modeling. AnalogGenie-Lite makes several unique contributions, including concise device-pin representations (i.e., advancing the best prior art from $O\left(n^2\right)$ to $O\left(n\right)$), frequent sub-graph mining, and optimal sequence modeling. Compared to state-of-the-art circuit topology discovery methods, it achieves $5.15\times$ to $71.11\times$ gains in scalability and 23.5% to 33.6% improvements in validity. Case studies on other domains’ graphs are also provided to show the broader applicability of the proposed graph modeling approach. Source code: https://github.com/xz-group/AnalogGenie-Lite.
APA
Gao, J., Cao, W., & Zhang, X. (2025). AnalogGenie-Lite: Enhancing Scalability and Precision in Circuit Topology Discovery through Lightweight Graph Modeling. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:18277-18289. Available from https://proceedings.mlr.press/v267/gao25a.html.