How Many Experts Are Enough? Towards Optimal Semantic Specialization for Mixture-of-Experts

Authors

  • Sumin Park Korea Advanced Institute of Science & Technology
  • Noseong Park Korea Advanced Institute of Science & Technology

DOI:

https://doi.org/10.1609/aaai.v40i29.39665

Abstract

Finding the optimal configuration of Sparse Mixture-of-Experts (SMoE) that maximizes semantic differentiation among experts is essential for exploiting the full potential of MoE architectures. However, existing SMoE frameworks either rely heavily on hyperparameter tuning or overlook the importance of diversifying semantic roles across experts when adapting the expert pool size. We propose Mixture-of-Experts for Adaptive Semantic Specialization (MASS), a semantic-aware MoE framework for adaptive expert expansion and dynamic routing. MASS introduces two key advancements: (i) a gradient-based semantic drift detector that prompts targeted expert expansion when the existing expert pool lacks the capacity to capture the full semantic diversity of the data, and (ii) an adaptive routing strategy that dynamically adjusts expert usage based on token-level routing confidence mass. We first demonstrate that MASS reliably converges to the optimal point of the cost-performance trade-off with notably improved semantic specialization in a highly controlled synthetic setup. Further empirical results on real-world datasets across language and vision domains show that MASS consistently outperforms a range of strong MoE baselines, demonstrating its domain robustness and enhanced expert specialization.
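The abstract does not spell out how routing by "token-level routing confidence mass" is implemented; as a rough, illustrative sketch only (the function name, threshold value, renormalization step, and PyTorch framing are our assumptions, not the authors' method), one plausible reading is to activate, per token, the smallest expert set whose cumulative router probability exceeds a mass threshold:

    import torch

    def confidence_mass_routing(router_logits, mass_threshold=0.9, max_experts=4):
        # Per token, activate the smallest set of experts whose cumulative routing
        # probability exceeds `mass_threshold`, capped at `max_experts`.
        probs = torch.softmax(router_logits, dim=-1)            # (tokens, experts)
        sorted_p, sorted_idx = probs.sort(dim=-1, descending=True)
        prior_mass = sorted_p.cumsum(dim=-1) - sorted_p         # mass accumulated before each expert
        keep = prior_mass < mass_threshold                      # always keeps the top-1 expert
        keep[..., max_experts:] = False                         # hard cap on experts per token
        mask = torch.zeros_like(probs, dtype=torch.bool).scatter(-1, sorted_idx, keep)
        weights = probs * mask
        weights = weights / weights.sum(dim=-1, keepdim=True)   # renormalize over active experts
        return weights, mask

    # Toy check: tokens with flatter router distributions activate more experts.
    logits = torch.randn(5, 8)
    weights, mask = confidence_mass_routing(logits)
    print(mask.sum(dim=-1))  # experts activated per token

Under this reading, confident tokens use few experts while ambiguous tokens spread computation over more of them, which matches the adaptive expert-usage behavior the abstract describes.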

Published

2026-03-14

How to Cite

Park, S., & Park, N. (2026). How Many Experts Are Enough? Towards Optimal Semantic Specialization for Mixture-of-Experts. Proceedings of the AAAI Conference on Artificial Intelligence, 40(29), 24792-24800. https://doi.org/10.1609/aaai.v40i29.39665

Section

AAAI Technical Track on Machine Learning VI