Workshop on epistemic rights and AI policy (with a focus on explainable AI, LLMs and generative AI, and fairness and accountability). Supported by the Eindhoven Center for the Philosophy of Artificial Intelligence (@ephil.ai), the NWO Veni project Explain Yourself!, the Philosophy and Ethics Group at Eindhoven University of Technology (@PhilEthicsTuE) and the Eindhoven Artificial Intelligence Systems Institute. The workshop is organized by Emily Sullivan, Philippe Verreault-Julien and Yeji Streppel.
While the GDPR arguably gave us the 'right to explanation' of AI decisions, we now recognize that significant policy gaps remain surrounding issues of informed advocacy and AI justice. In this interdisciplinary workshop, we will explore whether an 'epistemic rights' perspective on machine learning (ML) and artificial intelligence (AI) systems can help address gaps in AI policy and legislation. Do we have a right to understand how the algorithms powering social media work? Do we have a right against LLMs that 'hallucinate' information about real people and places? As ML and AI systems continue to have large social impacts, it is crucial to address the epistemic injustices that arise from algorithmic decision-making.
This workshop aims to bring together esteemed researchers and experts in the field to discuss and collaboratively explore innovative approaches to contemporary challenges in AI policy, including but not limited to epistemic rights issues. A particular goal is to distill insights from the workshop into a position paper on the topic.
If you have any questions, please email Yeji Streppel and we will send you additional information.
Times are listed in Central European Time (CET).
The workshop will take place at Hotel De Bilderberg.
If travelling by air, the easiest airport to fly into is Schiphol Airport (Amsterdam). Other possibilities are:
Arriving earlier or staying longer? Here are some sights and attractions in the area.