PrefixGPT: Prefix Adder Optimization by a Generative Pre-trained Transformer

Authors

  • Ruogu Ding, Shanghai Jiao Tong University
  • Xin Ning, Shanghai Jiao Tong University
  • Ulf Schlichtmann, Technische Universität München
  • Weikang Qian, Shanghai Jiao Tong University

DOI:

https://doi.org/10.1609/aaai.v40i25.39220

Abstract

Prefix adders are widely used in compute-intensive applications for their high speed. However, designing optimized prefix adders is challenging due to strict design rules and an exponentially large design space. We introduce PrefixGPT, a generative pre-trained Transformer (GPT) that directly generates optimized prefix adders from scratch. Our approach represents an adder's topology as a two-dimensional coordinate sequence and applies a legality mask during generation, ensuring every design is valid by construction. PrefixGPT features a customized decoder-only Transformer architecture. The model is first pre-trained on a corpus of randomly synthesized valid prefix adders to learn design rules and then fine-tuned to navigate the design space for optimized design quality. Compared with existing works, PrefixGPT not only finds a new optimal design with a 7.7% improved area-delay product (ADP) but also exhibits superior exploration quality, lowering the average ADP by up to 79.1%. This demonstrates the potential of GPT-style models to first master complex hardware design principles and then apply them for more efficient design optimization.
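To make the generation scheme concrete, the sketch below illustrates the general idea of legality-masked autoregressive sampling over a two-dimensional coordinate vocabulary. It is a minimal illustration, not the paper's implementation: the bit-width, depth bound, the `is_legal` rule, and the uniform stand-in logits are all hypothetical simplifications of the actual prefix-adder validity constraints.

```python
import math
import random

# Minimal sketch of legality-masked generation over (row, column) coordinates.
# All names and the toy legality rule here are hypothetical; the paper's
# actual prefix-graph constraints are more involved.

WIDTH = 8   # adder bit-width: columns 0..WIDTH-1 (assumed for illustration)
DEPTH = 4   # maximum logic levels: rows 0..DEPTH-1 (assumed for illustration)
VOCAB = [(r, c) for r in range(DEPTH) for c in range(WIDTH)]  # 2-D coordinates

def is_legal(coord, generated):
    """Toy legality rule: no duplicate coordinates, and a node at row r > 0
    needs some already-generated node in row r-1 at a column <= its own.
    This stands in for the real structural rules of a valid prefix graph."""
    r, c = coord
    if coord in generated:
        return False
    if r == 0:
        return True
    return any(pr == r - 1 and pc <= c for pr, pc in generated)

def masked_sample(logits, generated):
    """Set logits of illegal coordinates to -inf, then sample the rest."""
    masked = [l if is_legal(v, generated) else -math.inf
              for v, l in zip(VOCAB, logits)]
    m = max(masked)
    probs = [math.exp(l - m) if l > -math.inf else 0.0 for l in masked]
    total = sum(probs)
    return random.choices(VOCAB, weights=[p / total for p in probs])[0]

# Stand-in for the Transformer: uniform logits over the coordinate vocabulary.
generated = []
for _ in range(10):
    logits = [0.0] * len(VOCAB)   # a real model would produce these per step
    generated.append(masked_sample(logits, generated))
print(generated)  # every sampled coordinate satisfies the legality rule
```

Because the mask removes all probability mass from illegal coordinates before sampling, every completed sequence is valid by construction, which is the property the abstract attributes to PrefixGPT's generation procedure.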

Published

2026-03-14

How to Cite

Ding, R., Ning, X., Schlichtmann, U., & Qian, W. (2026). PrefixGPT: Prefix Adder Optimization by a Generative Pre-trained Transformer. Proceedings of the AAAI Conference on Artificial Intelligence, 40(25), 20808-20815. https://doi.org/10.1609/aaai.v40i25.39220

Section

AAAI Technical Track on Machine Learning II