Bias & AI

Exhibit designed for the Cyber and the City exhibition, open until October 22nd, 2023.

An interactive game that explores data bias and human responsibility in a playful and engaging manner.

Demo video: BiasObjectGitHub.mp4

An online version of the game can be played here.

Description

Please visit the exhibition for a more comprehensive understanding, or refer to our midterm report.

This repository contains the code for our interactive game, designed as an exhibit to demystify AI and convey an intuitive understanding of bias and human responsibility. Playing the role of an AI developer, the visitor explores how different choices of data influence the results of an AI algorithm. By constructing a dataset free of harmful bias, the visitor learns how bias manifests in data and that the responsibility for avoiding harmful biases lies with humans, not with AI. In this documentation, we provide more details about the exhibit and the code used to build it.

There is also an interview from the local news (in German).

Screens

The game is split into five different screens, each serving a specific purpose in the gameplay and narrative.

(Image: overview of the five game screens)

1. Entrance screen: draws the visitor into the game and gives context about the gameplay. It introduces the game's premise: an AI working as a doorkeeper is failing to let students into the night club.

2. Introduction screen: explains the situation in further detail and shows that the night club, called Clubhaus, is part of TΓΌbingen's student nightlife. It highlights the problem that the AI cannot judge who is a student and therefore cannot determine who should enter the club, and it asks the visitor for help retraining the AI by building up a dataset. This screen sets the stage for the main game and introduces the visitor to the game mechanics.

3. Game screen: the main game, where the visitor engages in a three-step decision process to rebuild the AI by finding datasets that are not discriminatory. In each round, the visitor selects one dataset, which the AI uses to improve its decision-making capabilities.

4. Evaluation screen: shows the solution of the game and embeds the entire game in a real-life perspective. It highlights the impact that biased AI can have on people's lives and encourages players to think critically about the issue.

5. Explanation screen: explains bias in AI in a more scientific manner and briefly reminds the player who is responsible for bias in AI. It provides a more in-depth understanding of the topic and raises awareness about the importance of addressing bias in AI.

Overall, the game screens are carefully crafted to engage the visitor, introduce the game mechanics, and educate them about bias in AI. They are an essential part of the game's narrative and serve to reinforce its message.
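To make the screen flow above concrete, here is a minimal TypeScript sketch of how the five screens could be modeled as an ordered sequence. The names and the nextScreen helper are illustrative assumptions and do not necessarily match the actual page hierarchy in src/pages.

```typescript
// Illustrative only: the real page hierarchy lives in src/pages and may be wired differently.
type Screen = "Entrance" | "Introduction" | "Game" | "Evaluation" | "Explanation";

const screenOrder: Screen[] = ["Entrance", "Introduction", "Game", "Evaluation", "Explanation"];

// Advance to the next screen, staying on the final one once the game ends.
function nextScreen(current: Screen): Screen {
  const index = screenOrder.indexOf(current);
  return screenOrder[Math.min(index + 1, screenOrder.length - 1)];
}
```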

Content

A description of the components used in this React app and how they contribute to the game's logic and visual design.

.
β”œβ”€β”€ public
β”‚   β”œβ”€β”€ datasets                # Dir of datasets of student images (not included)
β”‚   β”œβ”€β”€ emojis                  # Dir of apple emojis as images
β”‚   β”œβ”€β”€ fonts                   # Dir of fonts used, mainly Inter
β”‚   β”œβ”€β”€ images                  # Dir of graphics such as images and svgs
β”‚   └── videos                  # Dir of video files used
β”‚
β”œβ”€β”€ src
β”‚   β”œβ”€β”€ components              # Dir of all react components
β”‚   β”œβ”€β”€ data                    # Dir of all chat related texts
β”‚   β”œβ”€β”€ pages                   # Dir of the main page hierarchy
β”‚   β”œβ”€β”€ stores                  # Dir of stores with main game logic
β”‚   β”œβ”€β”€ styles                  # Dir of global styles and animations
β”‚   β”œβ”€β”€ utils                   # Dir of utility functionality
β”‚   └── app.tsx                 # Starting point of game
β”‚
β”œβ”€β”€ ...                         # Craco, Typescript, Prettier, VSCode, Git and React setup
β”‚
.
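The main game logic lives in the stores directory. As a rough illustration of the three-round dataset-selection flow described above, the following is a minimal sketch in plain TypeScript; the Dataset and GameState shapes, the containsHarmfulBias flag, and the function names are hypothetical and the actual stores in src/stores may use a different state-management approach.

```typescript
// Illustrative only: not the actual store implementation from src/stores.
interface Dataset {
  id: string;
  label: string;
  containsHarmfulBias: boolean; // hypothetical flag for this sketch
}

interface GameState {
  round: number;          // 1..3, one dataset choice per round
  selections: Dataset[];  // datasets chosen so far
}

let state: GameState = { round: 1, selections: [] };

// Record the visitor's choice and advance to the next round (max three).
function selectDataset(dataset: Dataset): GameState {
  state = {
    round: Math.min(state.round + 1, 3),
    selections: [...state.selections, dataset],
  };
  return state;
}

// After the third round, the evaluation screen can check whether the
// assembled dataset is free of harmful bias.
function datasetIsFair(s: GameState): boolean {
  return s.selections.every((d) => !d.containsHarmfulBias);
}
```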

Getting Started

This is a simple React app. To run the project, you must have the following tools installed on your system: Node.js, npm, and Yarn.

Clone the repository:

git clone https://github.com/cyberandthecity/bias.git

Navigate into the project directory:

cd bias

Install the dependencies:

yarn

To run the development build, use the following command:

yarn dev 

This will start a local development server and open the application in your default browser.
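If the scripts section of package.json keeps the conventional Create React App / Craco script names (an assumption; check package.json to confirm), an optimized production bundle can typically be created with:

yarn build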

Process

In February 2023, the Stadtmuseum TΓΌbingen opened the exhibition Cyber and the City about artificial intelligence, curated by master's students from the University of TΓΌbingen. Our group dedicated one and a half years to building an interactive game that explains data bias and human responsibility in AI in an engaging and playful manner. Check out some pictures of our creative process and journey to build this exhibit below.

(Images: impressions from the creative process)

About

This project was developed as part of an Artificial Intelligence seminar at the Department of Theoretical Machine Learning at the University of TΓΌbingen. We would like to thank Prof. Dr. Ulrike von Luxburg and Prof. Dr. Thomas Thiemeyer for their invaluable help and guidance throughout this project. Participants of this project are Julian Petruck, Vanessa Tsingunidis, Katja KΓΆrner, Moritz Kniebel and Jan-Niklas Dihlmann. We are proud to have worked on this project and hope that it will be useful and inspiring for others.

Licensing

This is a non-commercial student project, released under the MIT License. You are free to fork the code and use it for any purpose, including commercial ones, as long as you comply with the terms of the license. Please note that we cannot guarantee the functionality of the code, and we will not take any responsibility or liability for any issues or damages caused by its use. The artwork used for the AI in this project is licensed under its own terms by Gleb Kuznetsov, and it is not covered by the MIT License. Furthermore, the images created by the students who contributed to this project are not allowed to be shared in public or used in any other work without their explicit permission. If you have any questions about the licensing of this project, please don't hesitate to contact us.
