
IFG@Ars Electronica Festival 2020.

 

Project documentation:

How to become a high-tech anti-discrimination activist collective

Performance lectures & creative workshops
©Adriana Torres Topaga

New technologies have infiltrated all aspects of our lives and offer a vast range of improvements, conveniences and efficiencies to their users. However, contrary to mainstream perceptions, the algorithms propelling such technologies neither function in neutral ways nor treat all people equally. They are biased to the same extent as the structures, institutions and developers producing them. Thus, systemic racism and sexism are inscribed – most often unconsciously – in the modes and outputs of technologies. This project addresses this issue and asks how discrimination in the design and use of technology can be overcome.

Two lecture performances by Safiya Umoja Noble and Lisa Nakamura interrogate how digital media (re)shape perceptions of race, ethnicity, and identity in general. They unpack the systemic racism and sexism inscribed into new technologies and offer perspectives on how we can find alternative, more equitable approaches to tech development.

Four participatory workshops with experts from different fields – Adriaan Odendaal, Adriana Torres Topaga, Andrea Maria Handler, Astrid Mager, Doris Allhutter, Hong Phuc Dang, Karla Zavala, Martyna Lorenc, Nushin Isabelle Yazdani – will guide the transfer from analysis to practice. We intend to initiate a creative conversation and a reciprocal exchange of knowledge between all experts, facilitators and workshop participants.

Target group: activists, software developers, computer scientists, start-up entrepreneurs, students, gamers, software users, users of databases, apps and diverse IT tools, laypersons, and everyone interested in the topic.

The lecture performances will be streamed on Friday, September 11, 2020, 1-3 pm. There will be an on-site stream with limited access at the Kepler Hall, JKU Campus, Linz.  Lectures will be held in English.

The workshops will take place from Friday, September 11, 2020, 10 am to Saturday, September 12, 2020, 6 pm and will be held in English. Please register in advance. Workshop participants may be filmed and photographed; please see the DSGVO (data protection) notice.

Project concept and organisation: JKU’s Institute of Women's and Gender Studies (IFG) is Austria’s only interdisciplinary Gender Studies university department. It is headed by economist Doris Weichselbaumer, who researches racist and gendered discrimination. Philosopher Waltraud Ernst and sociologist Julia Schuster are university assistants with work foci in feminist studies of science & technology, and intersectionality theory, respectively.

Key Facts:

Date:
September 11-12, 2020

Location:
Johannes Kepler University Linz
Altenberger Straße 69
4040 Linz

Preliminary time schedule:

Workshop 1:
Friday, September 11, 10am – 2pm
“When I encountered discriminating IT-systems and did not want to take it anymore”: deconstructing affective entanglements in society-technology relations

Lecture 1:
Friday, September 11, 1pm-1:40pm
Lisa Nakamura, University of Michigan

Lecture 2:
Friday, September 11, 1:40pm-2:40pm
Safiya Umoja Noble, University of California

Workshop 2:
Friday, September 11, 4pm – 8pm
How to create your own AI device with SUSI.AI - An Open Source Platform for Conversational Web

Workshop 3:
Saturday, September 12, 10am – 2pm
Phantom Data in our bodies and imagination

Workshop 4:
Saturday, September 12, 2pm - 6pm
[d/r]econstructing AI: dreams of visionary fiction and zine-making
 

Lecture-Performances:

Lisa Nakamura, University of Michigan
Estranging Digital Racial Terrorism After COVID

Friday, September 11, 1pm-1:40pm

Racism and sexism are purposefully designed into digital networks as fundamentally ordinary experiences. The digital industries maintain business as usual, platforming racial formations that have always enacted everyday violence for users of color while solidifying their monopolies. This talk argues that COVID-19 forced an accelerated migration to digital networks that exposed new audiences to traumatically racist digital events as well as new openings for critique and resistance.  Hundreds of Zoom conference calls hosting college graduations, woman of color organizing meetings, and online classes have been disrupted by swastika and blackface imagery, demonstrating the everydayness of racial violence and its interpenetration into intimate everyday life.  What material, theoretical, and political work can we do to redefine racism and sexism as alien to the network? What opportunities for estranging digital racism from our lifeworlds might surface in our current moment?

©Daryl Marshke | Michigan Photography
Safiya Umoja Noble, University of California
Algorithms of Oppression: How Search Engines Reinforce Racism

Friday, September 11, 1:40pm-2:40pm

The landscape of information is rapidly shifting as new demands are increasing investment in digital technologies. Yet, critical scholars continue to demonstrate how many technologies are shaped by and infused with values that are not impartial, disembodied, or lacking positionality. Technologies hold racial, gender, and class politics. In this talk, Dr. Safiya Noble will discuss her recent book, Algorithms of Oppression, and the impact of technology on the public.

 

©Stella Kalinina

Participatory Workshops:

 

“When I encountered discriminating IT-systems and did not want to take it anymore”: deconstructing affective entanglements in society-technology relations

The workshop will be held in Kepler Gardens (JKU campus), Learning Center, 3rd level (elevator available) – Schulungsraum

 

When using technologies, we often find ourselves in a paradox: we struggle with how they endanger our privacy, restrict our agency by limited options and how they materialize discriminatory worldviews — at the same time, we love engaging with tech that inspires new ways of sociality, co-creation and collectivity. When creating systems, we encounter a similar conundrum: we are urged to capitalize on the available data to make decision processes more efficient and to generate economic prosperity; we want to develop sound technical methods and deliver elegant solutions to real-world problems — at the same time, norms of technical feasibility limit our inventiveness on how computing can contribute to social change and equality.
In recent years, countless cases have shown how algorithms, machine learning and AI objectify existing discrimination and amplify social injustice in terms of sexism, racism, classism and ableism (e.g. Allhutter 2019, West et al. 2019). Many of us users, developers and citizens feel the need to stand up for equality and justice in our digitized world. Yet our own embeddedness in systemic power relations often leaves us puzzled and paralyzed.
This workshop uses the deconstructive method of mind scripting to help us understand the grip that even technologies we reject and technological practices we find questionable may have on us. Using our own memories as an experimental resource, we will explore our affective entanglements in society-technology relations and ask how discrimination and privilege materialize in our sociotechnical everyday practices. This exploration aims at developing collective agency and activisms.

If this resonates with your experience, please drop a line to dallhutt@oeaw.ac.at in addition to your workshop registration.

Allhutter 2019. digitalsts.net/essays/of-working-ontologists-and-high-quality-human-components/

West et al. 2019. ainowinstitute.org/discriminatingsystems.html

Doris Allhutter works at the Institute of Technology Assessment (ITA) of the Austrian Academy of Sciences and teaches at JKU Linz and TU Wien. She researches how social inequality and difference co-emerge with sociotechnical systems and explores how practices of computing are implicitly normative and entrenched in societal power relations.

 

How to create your own AI device with SUSI.AI - An Open Source Platform for Conversational Web

The workshop will be held in Kepler Gardens (JKU campus), Learning Center, 3rd level (elevator available) – Schulungsraum

 

This workshop is about SUSI.AI - an open source conversational framework that provides user freedom and enables users to have complete control over their own data. The goal is to develop an entirely open alternative to assistants like Alexa, Siri or Google Home. The project is based on the principles of privacy and collaboration. It encompasses AI technologies, search engines, API services, and smart devices. Skills can be created in the wiki-like skill editor. SUSI is capable of chat and voice interaction and, by using local data and media or external APIs, of performing actions such as music playback, making to-do lists, setting alarms, streaming podcasts, playing audiobooks, and providing weather, traffic, and other real-time information. The core of the assistant is the SUSI.AI server that holds the "intelligence" and "personality" of SUSI. A local version of the SUSI.AI server also runs on the smart speaker itself, enabling offline functionality.
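
To make the client-server setup described above more concrete, here is a short, purely illustrative Python sketch of how a program could query a SUSI.AI chat server over HTTP. It is not part of the workshop materials; the endpoint path, query parameter and response fields are assumptions based on publicly documented SUSI.AI behaviour and may differ from the current API, so treat it as a sketch rather than a reference.

import requests

SUSI_SERVER = "https://api.susi.ai"  # assumed public instance; a self-hosted server would work the same way

def ask_susi(question: str) -> str:
    """Send a question to the (assumed) chat endpoint and return SUSI's textual reply."""
    response = requests.get(
        f"{SUSI_SERVER}/susi/chat.json",  # assumed endpoint path
        params={"q": question},
        timeout=10,
    )
    response.raise_for_status()
    data = response.json()
    # The reply text is expected in answers[0].actions[*].expression;
    # fall back gracefully if the structure differs.
    try:
        actions = data["answers"][0]["actions"]
        return " ".join(a["expression"] for a in actions if "expression" in a)
    except (KeyError, IndexError):
        return "(no answer found in the response)"

if __name__ == "__main__":
    print(ask_susi("What is the weather in Linz?"))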

The workshop will cover 1) an introduction to SUSI components and current developments, including an overview of SUSI’s technology stack, its wiki-like skill editor and the recently released hardware prototype, followed by 2) a hands-on session where participants work together to create a simple bot, develop new skills and test them on a device. 3) Finally, we will reflect upon artificial intelligence, data bias, and algorithmic discrimination in more general terms and – based on the experiences with SUSI.AI – collectively think about ways of creating more just, non-discriminatory digital technologies in the future.

The workshop will be held in English. There are no programming skills required. Workshop participants are invited to bring their own laptops for the hands-on experiments.

Registration required. Please also e-mail astrid.mager@oeaw.ac.at by September 7, 2020.

Astrid Mager is a senior postdoc at the Institute of Technology Assessment at the Austrian Academy of Sciences (ÖAW) and teaches at the Department of Science and Technology Studies, University of Vienna. She currently works on her habilitation project "Algorithmic Imaginaries. Visions and values in the shaping of search engines" (funded by the Austrian Science Fund, FWF, grant number V511-G29). Twitter: https://twitter.com/astridmager | Web: https://www.astridmager.net/

Hong Phuc Dang is an Open Source advocate with more than a decade of experience in IT development and community building. She currently serves as Vice President of the Open Source Initiative. In 2009 she co-founded FOSSASIA, an organization that strives to improve people’s lives by sharing open technologies and knowledge and by fostering global connections and sustainable production. Twitter: twitter.com/hpdang | LinkedIn: linkedin.com/in/hongphucdang | GitHub: github.com/hpdang

 

Phantom Data in our bodies and imagination

The workshop will be held in Kepler Gardens (JKU campus), Learning Center, 3rd level (elevator available) – Schulungsraum

 

In this workshop, taking a performing arts approach to the question "How to become a high-tech anti-discrimination activist collective", we turn to one of the origins of the creative process: imagination. Our departure point lies in the embodiment of imagination and perceptual processes. With somatic exercises and performative games, we take notice of the data that flows in from our bodies, whether it be sensation, image, emotion or memory. Observing these contents, we address the notion of "default" contents: bits of information that are present but escape our attention and are not communicated, yet affect the outcome. We practice the art of asking questions as a way to call out the default in our imagery.

Assuming that imagination has its own training dataset (our experiences, memories, emotions, knowledge, beliefs) and therefore a specific priming/bias/imprint, we look for practices of becoming aware of HOW we imagine things and what escapes our field of attention. We approach the act of creation not as an expression of individual capability but as the initiation of a dialogue in the collective imaginary. Relating to Ruha Benjamin's account that "imagination is a field of action", we give importance to this stage of the creative process, whether in art or technology, understanding that it defines what the outcome will be.

Registration required. 12 participants max.

LAB ON STAGE – collective for performing arts, space and design strategies

ADRIANA TORRES TOPAGA⎥VISUAL ARTIST⎥DESIGNER⎥RESEARCHER

Adriana studied industrial design and holds a master’s degree in space and design strategies. In her work she explores sensation, public space, constructs of identity, and the relationship between the human body, space and technology in an experimental and interdisciplinary way.

www.puntos.at

MARTYNA LORENC⎥PERFORMER⎥CHOREOGRAPHER⎥COGNITIVE SCIENTIST

Martyna completed an MSc in cognitive science and a BA in contemporary dance & pedagogy. She has worked for many years as a freelance dancer and performer with various artists and creates her own choreographic works in different, often experimental formats. Her pieces draw on interdisciplinary research, especially relating body and imagination.

www.martynalorenc.com

ANDREA MARIA HANDLER⎥PERFORMER⎥MANUAL THERAPIST⎥DANCE PEDAGOGUE

Andrea Maria holds a master’s degree in contemporary dance and a bachelor’s degree in dance pedagogy. Her interest in movement and related creative work arises from a fascination with the bodymind’s inherent capacity for ingenuity and originality. She works with various artists from the field of dance and performance and also focuses on the therapeutic aspects of touch and movement.

Mail: labonstagecollective(at)gmail.com

Web: http://www.puntos.at/LAB_SITE/main/about.html

FB: https://www.facebook.com/labonstage/

IG: https://www.instagram.com/lab.on.stage/

 

[d/r]econstructing AI: dreams of visionary fiction and zine-making

Online workshop via Zoom. Password-protected links will be sent to all registered participants by e-mail before the workshop.

 

With this workshop we want to investigate the structures behind algorithmic decision-making systems. We will discuss why their design is normative rather than neutral, and how AI systems reproduce and reinforce structural discrimination in our society. Using various tools as examples, we will explore the existing power relations and forms of discrimination that are reflected in them.

We then embark on a speculative journey into the future and discuss how we want to design today (with technology and without) to create a more just world for tomorrow. What approaches are already in place? Which artists inspire us? Which values are important to us? We want to collect our thoughts and creative outputs in a zine.

"We have the gift and the responsibility to imagine. And yes, this is a dark age. And a darkness such as this is the perfect setting for our dreams. Visionary fiction is a way to shape dreams of justice – to understand that art is not neutral, that what we dream and create is a practice ground for the futures we need."

—adrienne maree brown

 

 

Nushin Isabelle Yazdani is an interaction and transformation designer, artist and researcher. In her work, she examines the interconnectedness of digital technologies and social justice, artificial intelligence and discrimination from an intersectional feminist perspective. She creates collaboratively with the communities that are directly impacted by the designed outcomes and seeks to explore design processes that dismantle oppressive structures.

At the Education Innovation Lab, Nushin works on transforming the education system and creating innovative learning methods. She is a lecturer at different universities and a member of the Design Justice Network and dgtl fmnsm. Apart from teaching, Nushin also curates and organizes community events at the intersection of technology, art and design.

Internet Teapot

Internet Teapot, by Karla Zavala and Adriaan Odendaal, is a Rotterdam-based collaboration that focuses on speculative and critical design projects and research. The studio stems from shared interests in digital culture, critical theory, and the belief that design can be used in a socially transformative way. internetteapot.com | algorithmsoflatecapitalism.tumblr.com

Karla Zavala is a designer and digital project manager from Peru, specializing in UX, QA, and communications. She holds a BA in Communication for Social Development from the University of Lima and an MA in Media Arts Cultures from Aalborg University.

@karlazavala

Adriaan Odendaal is a multimedia designer and web developer from South Africa. He holds a BA in Visual Studies and Sociology from Stellenbosch University and recently completed an MA in Media Arts Cultures at Aalborg University.

@adriaan_o

©Ars Electronica Linz GmbH & Co KG