DevFormer: A Symmetric Transformer for Context-Aware Device Placement

Haeyeon Kim, Minsu Kim, Federico Berto, Joungho Kim, Jinkyoo Park
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:16541-16566, 2023.

Abstract

In this paper, we present DevFormer, a novel transformer-based architecture for addressing the complex and computationally demanding problem of hardware design optimization. Despite the demonstrated efficacy of transformers in domains including natural language processing and computer vision, their use in hardware design has been limited by the scarcity of offline data. Our approach addresses this limitation by introducing strong inductive biases, such as relative positional embeddings and action-permutation symmetry, that effectively capture the hardware context and enable efficient design optimization with limited offline data. We apply DevFormer to the problem of decoupling capacitor placement and show that it outperforms state-of-the-art methods on both simulated and real hardware, improving performance while reducing the number of components by more than 30%. Finally, we show that our approach achieves promising results on other offline contextual learning-based combinatorial optimization tasks.
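
To make the two inductive biases named in the abstract concrete, the following is a minimal, hypothetical sketch in Python/PyTorch: a self-attention encoder whose attention scores are biased by relative port coordinates, and a per-location scoring head whose selected placement set does not depend on the order in which actions are chosen. This is an illustration of the general ideas only, not the authors' DevFormer implementation; all class and variable names (RelPosSelfAttention, SymmetricPlacementPolicy, and the toy feature dimensions) are invented for this example.

# Hedged sketch: NOT the paper's code. Illustrates relative positional
# embeddings and permutation-symmetric placement scoring with plain PyTorch.
import torch
import torch.nn as nn

class RelPosSelfAttention(nn.Module):
    """Self-attention whose scores are biased by pairwise (dx, dy) offsets,
    so the encoding depends on relative port geometry, not absolute indices."""
    def __init__(self, dim, n_heads=4):
        super().__init__()
        self.n_heads, self.d_head = n_heads, dim // n_heads
        self.qkv = nn.Linear(dim, 3 * dim)
        self.out = nn.Linear(dim, dim)
        self.rel_mlp = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, n_heads))

    def forward(self, x, xy):
        # x: (B, N, dim) candidate-port features; xy: (B, N, 2) port coordinates
        B, N, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q = q.view(B, N, self.n_heads, self.d_head).transpose(1, 2)
        k = k.view(B, N, self.n_heads, self.d_head).transpose(1, 2)
        v = v.view(B, N, self.n_heads, self.d_head).transpose(1, 2)
        scores = (q @ k.transpose(-2, -1)) / self.d_head ** 0.5       # (B, H, N, N)
        rel = xy.unsqueeze(2) - xy.unsqueeze(1)                       # (B, N, N, 2) relative offsets
        scores = scores + self.rel_mlp(rel).permute(0, 3, 1, 2)       # add relative positional bias
        attn = scores.softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(B, N, -1)
        return self.out(out)

class SymmetricPlacementPolicy(nn.Module):
    """Encodes the board context once, then scores each candidate location.
    Because scoring is per-location (no autoregressive ordering here), the
    resulting set of selected placements is invariant to action order."""
    def __init__(self, feat_dim=8, dim=64):
        super().__init__()
        self.embed = nn.Linear(feat_dim, dim)
        self.attn = RelPosSelfAttention(dim)
        self.score = nn.Linear(dim, 1)

    def forward(self, feats, xy):
        h = self.attn(self.embed(feats), xy)
        return self.score(h).squeeze(-1)                              # (B, N) placement logits

if __name__ == "__main__":
    B, N = 2, 20                                     # toy batch: 2 boards, 20 candidate ports
    feats, xy = torch.randn(B, N, 8), torch.rand(B, N, 2)
    logits = SymmetricPlacementPolicy()(feats, xy)
    k = 5                                            # e.g. place 5 decoupling capacitors
    print(logits.topk(k, dim=-1).indices)            # chosen ports, order-independent

Scoring all candidate ports jointly is one simple way to obtain permutation symmetry over the selected set; the paper's actual architecture may realize the same property differently, and this sketch omits details such as residual connections and layer normalization.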

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-kim23h,
  title     = {{D}ev{F}ormer: A Symmetric Transformer for Context-Aware Device Placement},
  author    = {Kim, Haeyeon and Kim, Minsu and Berto, Federico and Kim, Joungho and Park, Jinkyoo},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {16541--16566},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/kim23h/kim23h.pdf},
  url       = {https://proceedings.mlr.press/v202/kim23h.html}
}
Endnote
%0 Conference Paper
%T DevFormer: A Symmetric Transformer for Context-Aware Device Placement
%A Haeyeon Kim
%A Minsu Kim
%A Federico Berto
%A Joungho Kim
%A Jinkyoo Park
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-kim23h
%I PMLR
%P 16541--16566
%U https://proceedings.mlr.press/v202/kim23h.html
%V 202
APA
Kim, H., Kim, M., Berto, F., Kim, J. & Park, J. (2023). DevFormer: A Symmetric Transformer for Context-Aware Device Placement. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:16541-16566. Available from https://proceedings.mlr.press/v202/kim23h.html.