Cliqueformer: Model-Based Optimization with Structured Transformers

Authors

  • Jakub Grudzien Kuba University of California, Berkeley
  • Pieter Abbeel University of California, Berkeley
  • Sergey Levine University of California, Berkeley

DOI:

https://doi.org/10.1609/aaai.v40i27.39429

Abstract

Large neural networks excel at prediction tasks, but their application to design problems, such as protein engineering or materials discovery, requires solving offline model-based optimization (MBO) problems. Because predictive accuracy may not directly translate to effective design, recent MBO algorithms incorporate reinforcement learning and generative modeling approaches. Meanwhile, theoretical work suggests that exploiting the target function's structure can enhance MBO performance. We present Cliqueformer, a transformer-based architecture that learns the black-box function's structure through functional graphical models (FGM), addressing distribution shift without relying on explicit conservative approaches. Across various domains, including chemical and genetic design tasks, Cliqueformer demonstrates superior performance compared to existing methods.
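The offline MBO setting the abstract refers to can be illustrated with a generic surrogate-plus-gradient-ascent sketch. This is not the Cliqueformer method, and the toy objective and all names below are illustrative: fit a model to a fixed dataset of (design, score) pairs, then optimize a design against the learned model.

```python
import numpy as np

# Toy offline MBO sketch (illustrative only, not the paper's method).
# The true black-box function is hidden at design time; here it is a
# simple quadratic so the example is self-contained.
def true_fn(x):
    return -np.sum((x - 1.0) ** 2, axis=-1)

rng = np.random.default_rng(0)
X = rng.normal(size=(256, 4))   # offline dataset of designs
y = true_fn(X)                  # their logged scores

# Surrogate: least-squares fit on features [x, x^2, 1].
feats = np.concatenate([X, X ** 2, np.ones((len(X), 1))], axis=1)
w, *_ = np.linalg.lstsq(feats, y, rcond=None)

def surrogate(x):
    f = np.concatenate([x, x ** 2, np.ones((len(x), 1))], axis=1)
    return f @ w

def surrogate_grad(x):
    # Gradient of w1·x + w2·x^2 + bias per coordinate: w1 + 2·w2·x.
    d = x.shape[1]
    return w[:d] + 2.0 * w[d:2 * d] * x

# Gradient ascent on the surrogate, starting from the best observed design.
x = X[np.argmax(y)][None, :].copy()
for _ in range(100):
    x += 0.05 * surrogate_grad(x)

print(true_fn(x), surrogate(x))
```

The failure mode the abstract targets appears when ascent steps push `x` far from the training distribution, where the surrogate's scores are unreliable; conservative MBO methods penalize such excursions, whereas Cliqueformer is described as exploiting the target function's structure instead.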

Published

2026-03-14

How to Cite

Kuba, J. G., Abbeel, P., & Levine, S. (2026). Cliqueformer: Model-Based Optimization with Structured Transformers. Proceedings of the AAAI Conference on Artificial Intelligence, 40(27), 22679-22687. https://doi.org/10.1609/aaai.v40i27.39429

Section

AAAI Technical Track on Machine Learning IV