The multimedia installation Faces of AI presents striking results of a large-scale image analysis by the LIT Robopsychology Lab: How is AI visualized in public? Every day we read and hear about new technological achievements, and we see images in newspapers or on the internet that are meant to illustrate Artificial Intelligence. But what is actually being communicated to the public? How is the abstract, inherently non-embodied concept of Artificial Intelligence represented visually? What "explanations" do these often very impressive images of Artificial Intelligence offer in the public discourse? Do they accurately represent what AI is today? Do they spread fear, or a constructive willingness to shape the future? So far, there has been little scientific analysis of media communication about AI and its effects on audiences.
The LIT Robopsychology Lab therefore crawled 20,000 preview images (thumbnails) for the keyword "Artificial Intelligence" from two of the largest stock image providers on the internet (Getty Images and Alamy) and drew a representative sample of 450 images from them. These 450 images were rated on dimensions such as "human likeness", "uncanniness", "beauty" and "realism", and assigned to various categories (e.g. "industrial robot", "humanoid robot", "brain", "network").
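The sampling step described above can be sketched as follows. This is a minimal illustration, not the lab's actual pipeline: the record fields and the use of a seeded simple random sample are assumptions made for the example.

```python
import random

def draw_sample(thumbnails, k=450, seed=42):
    """Draw a reproducible simple random sample of k thumbnails.

    The fixed seed is an illustrative choice so the sample can be
    re-drawn identically; the real study's sampling design may differ.
    """
    rng = random.Random(seed)
    return rng.sample(thumbnails, k)

# Toy stand-in for the crawled corpus of 20,000 thumbnails
corpus = [{"id": i, "keyword": "Artificial Intelligence"} for i in range(20000)]

sample = draw_sample(corpus)
print(len(sample))  # 450
```

Drawing the sample with a fixed seed makes the selection reproducible, which matters when the same 450 images are later rated by multiple people.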
At the Ars Electronica Festival 2021, the results will be presented to a larger audience for the first time. Images can be displayed and grouped by selected labels: for example, only those images that were classified as particularly unrealistic, or only those that show "humanoid robots" but were also rated as beautiful.
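The grouping logic behind the installation could look something like the sketch below. The category names, the rating fields, and the 1-to-7 scale thresholds are assumptions for illustration only, not the installation's actual data model.

```python
def select(images, category=None, min_beauty=None, max_realism=None):
    """Filter rated images by category label and rating thresholds."""
    result = []
    for img in images:
        if category is not None and img["category"] != category:
            continue
        if min_beauty is not None and img["beauty"] < min_beauty:
            continue
        if max_realism is not None and img["realism"] > max_realism:
            continue
        result.append(img)
    return result

# Toy rated dataset (ratings on an assumed 1-7 scale)
images = [
    {"id": 1, "category": "humanoid robot", "beauty": 6, "realism": 2},
    {"id": 2, "category": "brain", "beauty": 3, "realism": 5},
    {"id": 3, "category": "humanoid robot", "beauty": 2, "realism": 6},
]

# "humanoid robots" that were also rated as beautiful
beautiful_humanoids = select(images, category="humanoid robot", min_beauty=5)

# images classified as particularly unrealistic
unrealistic = select(images, max_realism=3)
```

Combining a category filter with rating thresholds in this way reproduces the two example queries from the text: beautiful humanoid robots, and particularly unrealistic images.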
The installation thus raises the question of what impressions these images leave us with, and what alternative ways of visually representing AI there could be.