Bridging Classical and Quantum Computing for Next-Generation Language Models

Authors

  • Yi Pan School of Computing, The University of Georgia, GA, USA
  • Hanqi Jiang School of Computing, The University of Georgia, GA, USA
  • Junhao Chen School of Computing, The University of Georgia, GA, USA
  • Yiwei Li School of Computing, The University of Georgia, GA, USA
  • Huaqin Zhao School of Computing, The University of Georgia, GA, USA
  • Lin Zhao Department of Biomedical Engineering, New Jersey Institute of Technology, NJ, USA
  • Yohannes Abate Department of Physics and Astronomy, The University of Georgia, GA, USA
  • Yingfeng Wang Department of Computer Science and Engineering, University of Tennessee at Chattanooga, TN, USA
  • Tianming Liu School of Computing, The University of Georgia, GA, USA

DOI:

https://doi.org/10.1609/aaaiss.v7i1.36909

Abstract

The remarkable success of Transformer architectures in Large Language Models (LLMs) has revolutionized natural language processing, yet the transition to quantum computing for next-generation language models remains an open challenge. While quantum computing promises exponential advantages, a fundamental gap exists between classical deep learning and quantum computing paradigms, particularly given the severe constraints of Noisy Intermediate-Scale Quantum (NISQ) devices, including barren plateaus, limited qubit coherence, and circuit depth restrictions. We present Adaptive Quantum-Classical Fusion (AQCF), the first framework to bridge classical and quantum computing for language models by reimagining Transformer architectures through quantum-classical co-design. Our key insight is that effective bridging requires dynamic adaptation rather than static translation—the framework analyzes input complexity in real-time to orchestrate seamless transitions between classical and quantum processing. AQCF introduces entropy-driven adaptive circuits that circumvent barren plateaus, quantum memory banks that unify classical attention with quantum state-based similarity retrieval, and intelligent fusion controllers that ensure each computational paradigm handles tasks where it naturally excels. This bridging architecture maintains full compatibility with existing classical Transformers while progressively incorporating quantum advantages as they become accessible. Experiments on sentiment analysis demonstrate that AQCF achieves competitive performance while significantly improving quantum resource efficiency, operating successfully within typical NISQ constraints. By establishing a seamless integration pathway from today's classical LLMs to tomorrow's quantum-enhanced models, our framework provides both immediate practical value on current quantum hardware and a clear evolution path toward full Quantum LLMs as technology matures.
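The abstract's central mechanism is an entropy-driven controller that routes each input to either the classical or the quantum path based on its measured complexity. The paper does not publish an API, so the sketch below is purely illustrative: the function names, the Shannon-entropy complexity proxy, and the threshold value are all assumptions, not AQCF's actual implementation.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def route(token_probs, threshold=1.5):
    """Hypothetical fusion-controller sketch: pick a processing path by entropy.

    The idea mirrors the abstract's description: low-entropy (predictable)
    inputs stay on the cheap classical path, while high-entropy inputs are
    escalated to the quantum circuit, conserving scarce NISQ resources.
    The 1.5-bit threshold is an arbitrary illustrative choice.
    """
    h = shannon_entropy(token_probs)
    return "quantum" if h > threshold else "classical"

# A sharply peaked distribution stays classical; a near-uniform one escalates.
print(route([0.9, 0.05, 0.03, 0.02]))   # classical (entropy ~0.62 bits)
print(route([0.25, 0.25, 0.25, 0.25]))  # quantum (entropy = 2.0 bits)
```

In a real system the complexity signal would come from the model itself (e.g. attention or logit statistics) rather than a raw token distribution, but the routing logic above captures the adaptive dispatch the abstract describes.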

Published

2025-11-23

How to Cite

Pan, Y., Jiang, H., Chen, J., Li, Y., Zhao, H., Zhao, L., Abate, Y., Wang, Y., & Liu, T. (2025). Bridging Classical and Quantum Computing for Next-Generation Language Models. Proceedings of the AAAI Symposium Series, 7(1), 381-389. https://doi.org/10.1609/aaaiss.v7i1.36909

Section

First AAAI Symposium on Quantum Information & Machine Learning (QIML): Bridging Quantum Computing and Artificial Intelligence