Relational Programming with Foundational Models
DOI:
https://doi.org/10.1609/aaai.v38i9.28934
Keywords:
KRR: Logic Programming, CV: Visual Reasoning & Symbolic Representations, ML: Deep Neural Architectures and Foundation Models, ML: Neuro-Symbolic Learning, ML: Statistical Relational/Logic Learning, NLP: Information Extraction, RU: Relational Probabilistic Models
Abstract
Foundation models have vast potential to enable diverse AI applications. The powerful yet incomplete nature of these models has spurred a wide range of mechanisms to augment them with capabilities such as in-context learning, information retrieval, and code interpreting. We propose Vieira, a declarative framework that unifies these mechanisms in a general solution for programming with foundation models. Vieira follows a probabilistic relational paradigm and treats foundation models as stateless functions with relational inputs and outputs. It supports neuro-symbolic applications by enabling the seamless combination of such models with logic programs, as well as complex, multi-modal applications by streamlining the composition of diverse sub-models. We implement Vieira by extending the Scallop compiler with a foreign interface that supports foundation models as plugins. We implement plugins for 12 foundation models including GPT, CLIP, and SAM. We evaluate Vieira on 9 challenging tasks that span language, vision, and structured and vector databases. Our evaluation shows that programs in Vieira are concise, can incorporate modern foundation models, and have comparable or better accuracy than competitive baselines.
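To make the abstract's core idea concrete, here is a minimal, hypothetical sketch of what it means to treat a foundation model as a stateless function with relational inputs and outputs. The names and the stub classifier are illustrative assumptions for exposition only; they are not Vieira's or Scallop's actual API, and a real plugin would call a model such as GPT or CLIP in place of the stub.

```python
def classify_stub(text: str) -> str:
    # Stand-in for a foundation-model call (e.g., an LLM-based
    # sentiment classifier); a real system would query the model here.
    return "positive" if "good" in text else "negative"

def relational_model(inputs: set) -> set:
    # The model is treated as a stateless relation: each input tuple
    # maps to an output tuple, with no hidden state between calls.
    # This lets the result be joined with other relations in a
    # logic program, as the relational paradigm described above.
    return {(text, classify_stub(text)) for (text,) in inputs}

reviews = {("a good movie",), ("a dull plot",)}
print(sorted(relational_model(reviews)))
# → [('a dull plot', 'negative'), ('a good movie', 'positive')]
```

Because the wrapped model is a pure set-to-set function, its outputs compose with ordinary relational operators (joins, filters), which is the property the framework exploits when combining sub-models.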
Published
2024-03-24
How to Cite
Li, Z., Huang, J., Liu, J., Zhu, F., Zhao, E., Dodds, W., … Naik, M. (2024). Relational Programming with Foundational Models. Proceedings of the AAAI Conference on Artificial Intelligence, 38(9), 10635–10644. https://doi.org/10.1609/aaai.v38i9.28934
Section
AAAI Technical Track on Knowledge Representation and Reasoning