
How to Explain AI
How can we explain just what artificial intelligence is?

Educational intervention to support a better understanding of AI in the community, developed by Citizen Scientists, Artists & AI Experts.

Together with citizens, artists, and AI experts, the participatory project "How to Explain AI" aims to support a better public understanding of AI. The project also explores how this understanding can be positively influenced by new educational interventions.

  • „A Liadl, ans üwa KI“ ("A little song about AI") – Why? Thurs, Oct. 12, 2023, 7:00 PM, Kepler Salon Linz or via live stream

    See Event

About the Project

  • Duration: 07/2022 - 11/2023
  • Funding provided by: The Ludwig Boltzmann Society (LBG)

Artificial intelligence (AI) is rapidly becoming more important in our everyday lives and in the way we interact socially, yet not everyone has the same degree of understanding and awareness. Many members of the general public do not possess the expertise about AI (AI literacy) needed to realistically assess both its opportunities and its risks. To benefit from new technological developments, it is important to spread a basic level of AI literacy.

"How to Explain AI" is a participatory project led by the LIT Robopsychology Lab that explores how we can raise understanding and awareness of AI among the general population: What prior knowledge and ideas do laypeople have about the subject? What questions do citizens have about AI in relation to their everyday lives? And how should answers to these questions be presented so that they generate interest and reach the greatest possible number of people?

To develop an informative and innovative AI explanation format, "How to Explain AI" brings together three groups of co-researchers: citizen scientists, artists, and AI experts. Combining these groups makes it possible to identify relevant questions about AI and to answer them in a well-founded way using new, artistically accompanied forms of knowledge transfer. The objective is to implement one or more concrete interventions to support a better understanding of AI among the general population and to evaluate them in an impact study. The format of the educational intervention can range from a co-creatively designed stage performance to audio-visual installations in public spaces or a poster campaign.

The findings of the intervention impact study, and of the entire participatory research process, will be disseminated to the scientific community as well as to the broader population.


An Overview of Project Phases

  1. At the start of the project, the team, together with its co-researchers, collects information on the current state of knowledge, understanding, and questions about AI.
  2. The team and the involved co-researchers then generate intervention ideas to promote a better understanding of AI among the general population.
  3. With the support of corresponding experts, one of these intervention options is then implemented.
  4. The impact of the intervention is captured in an empirical study. The findings are discussed from both a scholarly and a social perspective and communicated appropriately.
  5. An ongoing evaluation of the participatory process is conducted throughout the project.

Workshop Impressions

„A Liadl, ans üwa KI“

Listen to the Song & Download the Lyrics

„A Liadl, ans üwa KI“ – But Why?

Thurs, Oct. 12, 2023, 7:00 PM, Kepler Salon Linz or via live stream
[Image: A man with a guitar appears brightly lit out of the fog]

Who is involved in the research?

The Project Idea in 3 Minutes