Epistemic reasoning in AI

Tutorial at IJCAI-ECAI 2018

Abstract

Artificial agents make decisions based on their knowledge about the world and on their knowledge about the knowledge of other agents: the latter is called higher-order knowledge. For instance, consider three autonomous robots on Mars called A, B and C. Agent A decides to explore the region to the North if “A knows that neither B nor C knows that there is sand to the North”. In this tutorial, we will present Dynamic Epistemic Logic, which provides a framework to model such complex epistemic situations and the evolution of knowledge over time. The framework is expressive enough to capture not only public actions (e.g. the broadcast of a message) but also private and semi-private actions (e.g. private messages). The framework will be illustrated with a software tool called Hintikka’s World. In the second part, we will address model checking and epistemic planning, which are standard decision problems for verifying multi-agent systems. We will pinpoint restrictions on actions (e.g. only public actions), discuss succinct models, and give an overview of complexity and decidability results. We will also advocate the use of knowledge-based programs for representing plans that are executed in a decentralized manner.
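In epistemic logic, the robot example above corresponds to the formula K_A(¬K_B sandN ∧ ¬K_C sandN), where sandN stands for “there is sand to the North”. The following minimal sketch (not part of the tutorial materials; the Kripke model, world names and agents’ observations are illustrative assumptions) shows how such a higher-order formula can be checked naively on a small possible-worlds model in Python.

# Naive check of K_A(¬K_B sandN ∧ ¬K_C sandN) on a hand-made Kripke model.
# Worlds w0, w1, w2 and the proposition "sandN" are illustrative assumptions.

# Propositions true at each world (w0 is the actual world).
valuation = {
    "w0": {"sandN"},   # there is sand to the North
    "w1": {"sandN"},
    "w2": set(),       # no sand to the North
}

# Epistemic accessibility: the worlds each agent cannot distinguish.
# A has observed the terrain; B and C have not.
indistinguishable = {
    "A": {"w0": {"w0"}, "w1": {"w1"}, "w2": {"w2"}},
    "B": {"w0": {"w0", "w2"}, "w1": {"w1"}, "w2": {"w0", "w2"}},
    "C": {"w0": {"w0", "w2"}, "w1": {"w1"}, "w2": {"w0", "w2"}},
}

def knows(agent, prop_holds, world):
    # An agent knows a property iff it holds in every world the agent considers possible.
    return all(prop_holds(w) for w in indistinguishable[agent][world])

sand_north = lambda w: "sandN" in valuation[w]

# Evaluate K_A(¬K_B sandN ∧ ¬K_C sandN) at the actual world w0.
result = knows(
    "A",
    lambda w: not knows("B", sand_north, w) and not knows("C", sand_north, w),
    "w0",
)
print(result)  # True: A knows that neither B nor C knows there is sand to the North

This brute-force evaluation over explicit worlds is only a toy; the tutorial’s second part discusses how restrictions on actions and succinct model representations affect the complexity of such model-checking and planning problems.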

Bio

François Schwarzentruber is an associate professor at ENS Rennes (France). His research interests focus on the theory and applications of logic to artificial intelligence, agency, multi-agent systems and computer science. He has served as a PC member for conferences on these topics, such as AAMAS and IJCAI, and has reviewed for journals such as Synthese, Studia Logica and Theoretical Computer Science. Since 2011, his research has mainly focused on dynamic epistemic logic.

Goal of the tutorial

Outline

Tutorial materials