On Paraconsistent Belief Revision in LP


  • Nicolas Schwind National Institute of Advanced Industrial Science and Technology, Tokyo, Japan
  • Sébastien Konieczny CRIL-CNRS, Université d'Artois, Lens, France
  • Ramón Pino Pérez CRIL-CNRS, Université d'Artois, Lens, France




Knowledge Representation And Reasoning (KRR)


Belief revision aims at incorporating, in a rational way, a new piece of information into the beliefs of an agent. Most works in belief revision assume a classical logic setting, where the beliefs of the agent are consistent. Moreover, the consistency postulate states that the result of the revision should be consistent whenever the new piece of information is consistent. In real applications, however, it may easily happen that (some parts of) the beliefs of the agent are inconsistent. In that case it seems reasonable to use paraconsistent logics to derive sensible conclusions from these inconsistent beliefs. In this context, however, the standard belief revision postulates trivialize the revision process. In this work we discuss how to adapt these postulates when the underlying logic is Priest's LP logic, in order to model a rational change while remaining a conservative extension of AGM/KM belief revision. This requires, in particular, adequately adapting the notion of expansion. We provide a representation theorem and some examples of belief revision operators in this setting.
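To see why LP is a natural base for reasoning from inconsistent beliefs, the following is a minimal sketch (not from the paper; all names are illustrative) of Priest's LP three-valued semantics: truth values T (true), B (both true and false), F (false), with T and B designated, negation swapping T and F while fixing B, and conjunction/disjunction as min/max under the order F < B < T. It shows that explosion fails, i.e. {p, ¬p} does not entail an arbitrary q, so inconsistent premises do not trivialize inference.

```python
from itertools import product

# Truth values of LP, ordered F < B < T for min/max semantics.
# T and B are the designated (acceptable) values.
T, B, F = 2, 1, 0
DESIGNATED = {T, B}

def neg(v):
    return 2 - v          # swaps T and F, fixes B

def conj(v, w):
    return min(v, w)

def disj(v, w):
    return max(v, w)

def entails(premises, conclusion, atoms):
    """LP entailment: every valuation that designates all premises
    must also designate the conclusion. Formulas are represented as
    functions from a valuation (dict atom -> value) to a truth value."""
    for values in product((F, B, T), repeat=len(atoms)):
        val = dict(zip(atoms, values))
        if all(prem(val) in DESIGNATED for prem in premises):
            if conclusion(val) not in DESIGNATED:
                return False
    return True

p = lambda v: v['p']
not_p = lambda v: neg(v['p'])
q = lambda v: v['q']

# Explosion fails: at the valuation p = B, q = F, both p and not-p
# are designated (B) while q is not, so {p, not-p} does not entail q.
print(entails([p, not_p], q, ['p', 'q']))                         # False
# LP still validates classically valid patterns such as disjunction
# introduction: p entails p or q.
print(entails([p], lambda v: disj(v['p'], v['q']), ['p', 'q']))   # True
```

The counter-valuation p = B, q = F is exactly the kind of model that classical logic lacks: it satisfies both p and its negation without satisfying everything, which is why the classical revision postulates need to be reworked in the LP setting.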




How to Cite

Schwind, N., Konieczny, S., & Pino Pérez, R. (2022). On Paraconsistent Belief Revision in LP. Proceedings of the AAAI Conference on Artificial Intelligence, 36(5), 5879-5887. https://doi.org/10.1609/aaai.v36i5.20532
