Probabilities Are All You Need: A Probability-Only Approach to Uncertainty Estimation in Large Language Models

Authors

  • Manh Nguyen Applied Artificial Intelligence Initiative, Deakin University, Australia
  • Sunil Gupta Applied Artificial Intelligence Initiative, Deakin University, Australia
  • Hung Le Applied Artificial Intelligence Initiative, Deakin University, Australia

DOI:

https://doi.org/10.1609/aaai.v40i38.40531

Abstract

Large Language Models (LLMs) exhibit strong performance across various natural language processing (NLP) tasks but remain vulnerable to hallucinations, generating factually incorrect or misleading outputs. Uncertainty estimation, often via predictive entropy, is key to addressing this issue. However, existing methods typically require multiple samples or extra computation to assess semantic entropy. This paper proposes an efficient, training-free uncertainty estimation method that approximates predictive entropy using the responses' top-K probabilities. Moreover, we employ an adaptive mechanism to determine K, enhancing flexibility and filtering out low-confidence probabilities. Experimental results on three free-form question-answering datasets across several LLMs demonstrate that our method outperforms expensive state-of-the-art baselines, contributing to the broader goal of enhancing LLM trustworthiness.
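The abstract's core idea, approximating predictive entropy from the top-K response probabilities with an adaptive, threshold-based choice of K, can be sketched as follows. This is an illustrative reconstruction from the abstract only: the threshold `tau`, the renormalisation step, and the fallback to the single most likely response are all assumptions, not the paper's exact procedure.

```python
import math

def top_k_entropy(probs, tau=0.01):
    """Approximate predictive entropy from top-K probabilities.

    probs: probabilities of candidate responses, sorted in descending
           order (e.g. top-K sequence likelihoods from one decoding pass).
    tau:   confidence threshold for adaptively choosing K; probabilities
           below tau are filtered out as low-confidence.
    NOTE: illustrative sketch based on the abstract, not the authors' code.
    """
    # Adaptive K: keep only probabilities at or above the threshold.
    kept = [p for p in probs if p >= tau]
    if not kept:
        kept = probs[:1]  # always retain the most likely response
    # Renormalise the retained mass, then compute Shannon entropy over it.
    z = sum(kept)
    return -sum((p / z) * math.log(p / z) for p in kept)
```

A higher entropy value signals a flatter distribution over candidate responses, i.e. greater model uncertainty; a single dominant probability drives the estimate toward zero.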

Published

2026-03-14

How to Cite

Nguyen, M., Gupta, S., & Le, H. (2026). Probabilities Are All You Need: A Probability-Only Approach to Uncertainty Estimation in Large Language Models. Proceedings of the AAAI Conference on Artificial Intelligence, 40(38), 32546–32554. https://doi.org/10.1609/aaai.v40i38.40531

Section

AAAI Technical Track on Natural Language Processing III