Epistemic reasoning in AI

Tutorial at IJCAI 2019


In multi-agent systems, intelligent agents should be able to explain their decisions. Indeed, in case of failure, agents need to justify their decisions in an understandable way, in particular to comply with recent laws (e.g. the GDPR in Europe). They also need to make meaningful decisions in order to cooperate with other agents, including humans. To achieve this, agents should model the humans' mental states. For instance, a robot may inform a human of the position of an object if it believes that the human needs that object but does not know where it is. In this tutorial, we will present the latest advances in reasoning about knowledge and beliefs. The tutorial aims to be accessible to a broad audience and is illustrated with the pedagogical tool Hintikka's World, which depicts mental states by means of comic strips and features many simple multi-agent systems such as games. We will discuss several formal tools for modeling such epistemic situations.
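The robot example can be sketched as a model-checking query on a tiny Kripke model: an agent knows a proposition at a world exactly when the proposition holds in every world the agent cannot distinguish from it. The worlds, propositions, and indistinguishability relations below are illustrative assumptions, not taken from the tutorial itself.

```python
# Minimal sketch of epistemic model checking on a two-world Kripke model.
# World names, propositions, and relations are illustrative assumptions.

# Two worlds: in w1 the object is in the kitchen, in w2 it is in the office.
valuation = {
    "w1": {"object_in_kitchen"},
    "w2": {"object_in_office"},
}

# Per-agent indistinguishability relations (equivalence relations on worlds).
# The human cannot tell w1 and w2 apart; the robot can.
indist = {
    "human": {("w1", "w1"), ("w1", "w2"), ("w2", "w1"), ("w2", "w2")},
    "robot": {("w1", "w1"), ("w2", "w2")},
}

def holds(world, prop):
    """Truth of an atomic proposition at a world."""
    return prop in valuation[world]

def knows(agent, prop, world):
    """Agent knows prop at world iff prop holds in every world
    the agent considers possible (cannot distinguish) from it."""
    return all(holds(v, prop) for (u, v) in indist[agent] if u == world)

# At w1, the robot knows where the object is, but the human does not,
# so the robot has grounds to inform the human:
print(knows("robot", "object_in_kitchen", "w1"))  # True
print(knows("human", "object_in_kitchen", "w1"))  # False
```

Tools such as Hintikka's World render exactly this kind of model, with the indistinguishability relations drawn as thought bubbles in comic strips.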


Tristan Charrier defended his PhD on epistemic reasoning in December 2018. He has contributed to the field in several respects: symbolic models, epistemic planning, languages for specifying epistemic situations, and demonstrations of epistemic reasoning. Tristan teaches formal logic (including temporal logics and model checking), algorithms, and programming.
François Schwarzentruber is an associate professor at ENS Rennes (France). His research interests focus on the theory and applications of logic to artificial intelligence, agency, multi-agent systems, and computer science. He has been a PC member of conferences on these topics, such as AAMAS and IJCAI, and a reviewer for journals such as Synthese, Studia Logica, and Theoretical Computer Science. Since 2011, his research has mainly focused on dynamic epistemic logic.
  François Schwarzentruber
  Bâtiment Alfred Sauvy
  École normale supérieure de Rennes
  Campus de Ker Lann
  35170 Bruz
  Tel.: (+33) 2 99 05 93 23
  Fax: (+33) 2 99 05 93 29


Tutorial materials