Technology has the power to drive action. And right now, a call to action is needed to eradicate racism. Black lives matter.
We recognize technology alone cannot fix hundreds of years of racial injustice and inequality, but when we put it in the hands of the Black community and their supporters, technology can begin to bridge a gap. To start a dialogue. To identify areas where technology can help pave a road to progress.
This project is an effort to leverage technology to prevent, detect, and remediate bias and misrepresentation in the workplace, products, and society. For corporations to succeed, it is critical to have Black representation at every level.
This is one of three open source projects underway as part of the Call for Code Emb(race) Spot Challenge led by contributors from IBM and Red Hat.
The desired outcome of this effort is an open source technology solution that can have a measurable impact on the problem statements and hills below. That solution can then be put to work within IBM and Red Hat, as well as society at large.
This repository is expected to evolve into a piece of technology that can be created and deployed, similar to the steps for other Call for Code starter kits, such as the kit for Community Cooperation or Crisis Communications.
Engage
- Understand the problem statements in this solution starter GitHub repository.
- Connect with colleagues in Slack to join one of the teams working on solutions to this problem.
Envision
Contribute
- IBMers: Learn about open source and certify before contributing.
- Open issues, submit pull requests, or edit the wiki for this repository.
- Black employees do not advance at the rate and to the levels of influence in the workplace they should
- Real-world bias influences inputs into algorithms, creating algorithmic biases in technology
- Bias is learned and perpetuated in different ways
Black employees do not advance at the rate and to the levels of influence in the workplace they should due to a lack of transparency around opportunities and implicit biases in recruiting, evaluation, mentorship and promotion processes.
- A recruiter can receive data-driven recommendations of non-traditional applicants to reconsider and be confident in advocating for them to hiring managers.
- A performance reviewer can receive real-time feedback and suggestions on possible bias in their evaluations.
- An employee can know how common the microaggressions they face at work are without talking to others.
- A senior manager can quantify and receive recommendations on their team's inclusivity of Black employees in high-visibility discussions and opportunities, and the diversity of their in-office communications, in an easy-to-navigate dashboard.
- Data visualization, machine-learning-based recommendation systems, predictive analytics and bias detection algorithms are powerful tools for workplace use to increase transparency and reduce bias in the hiring, retention, and promotion pipeline.
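As an illustration of the kind of bias detection such tools could build on, the sketch below computes a disparate impact ratio (the "four-fifths rule" commonly used in employment-selection analysis) over promotion records. The field names, group labels, and data are illustrative assumptions, not any real HR schema.

```python
# Sketch: measuring promotion-rate disparity in de-identified HR records.
# The record fields ("group", "promoted") and the 0.8 four-fifths threshold
# are illustrative assumptions, not part of any real dataset.

def selection_rate(records, group):
    """Fraction of a group's members who were promoted."""
    members = [r for r in records if r["group"] == group]
    if not members:
        return 0.0
    return sum(r["promoted"] for r in members) / len(members)

def disparate_impact(records, unprivileged, privileged):
    """Ratio of selection rates; values below ~0.8 suggest adverse impact."""
    priv_rate = selection_rate(records, privileged)
    return selection_rate(records, unprivileged) / priv_rate if priv_rate else 0.0

records = [
    {"group": "A", "promoted": 1}, {"group": "A", "promoted": 1},
    {"group": "A", "promoted": 1}, {"group": "A", "promoted": 0},
    {"group": "B", "promoted": 1}, {"group": "B", "promoted": 0},
    {"group": "B", "promoted": 0}, {"group": "B", "promoted": 0},
]

di = disparate_impact(records, unprivileged="B", privileged="A")
print(f"disparate impact: {di:.2f}")  # 0.25 / 0.75 = 0.33, well below 0.8
```

A production version would pull the same ratio from live pipeline data and surface it in the dashboards described above rather than a print statement.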
See who's already working on this problem and join a team.
- America's Opportunity Gaps: By the Numbers
- Black STEM employees perceive a range of race-related slights and inequities at work
- Talent Matters: The case for reaching out to non-traditional IT talent
At this time, datasets are provided for reference only. Do not include dataset information in any solutions until further notice.
- As a proof of concept and an exemplar for other companies worldwide, we can use all available de-identified, GDPR-compliant IBM HR data, from which we can select the most relevant schema for analytics (with many more columns than the HR dataset IBM previously provided to Kaggle).
- University and high school databases showing graduation and dropout rates, with a strong emphasis on success rates so that they can be emulated in other areas.
Real-world bias influences inputs into algorithms, creating algorithmic biases in technology. The effect of these biases can range from misrepresentation of the expected end users to inequitable practices in decision making algorithms and consequently possible gap widening.
- A data scientist can identify bias in training data and receive corrected data samples to improve training data in less than an hour.
- A developer can test their algorithms for bias on diverse datasets and receive recommendations on how to de-bias their algorithms.
- An end user can access explainable data on an algorithm they are using to understand its implicit assumptions and biases without any coding knowledge.
Reducing algorithmic biases requires a concerted, conscious effort by diverse human-in-the-loop teams, with robust algorithmic evaluation and development. Evaluation metrics, bias-mitigation algorithms, and dashboards all have critical roles in helping teams reduce potential biases in their products.
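One well-known bias-mitigation technique such teams could apply is reweighing (Kamiran and Calders), which is also implemented in the AI Fairness 360 toolkit listed in this problem statement's resources. The minimal pure-Python sketch below computes instance weights so that, under the weights, group membership and outcome become statistically independent; the sample data is invented for illustration.

```python
from collections import Counter

def reweighing_weights(samples):
    """samples: list of (group, label) pairs.
    Returns a weight per (group, label) combination so that, under the
    weights, group membership and label are statistically independent:
    w(g, y) = P(g) * P(y) / P(g, y)."""
    n = len(samples)
    group_counts = Counter(g for g, _ in samples)
    label_counts = Counter(y for _, y in samples)
    joint_counts = Counter(samples)
    return {
        (g, y): (group_counts[g] * label_counts[y]) / (n * joint)
        for (g, y), joint in joint_counts.items()
    }

# Invented example: group "A" receives the favorable label 3 times out of 4,
# group "B" only once out of 4.
samples = [("A", 1)] * 3 + [("A", 0)] + [("B", 1)] + [("B", 0)] * 3
weights = reweighing_weights(samples)
print(weights)  # favorable "B" outcomes upweighted, favorable "A" downweighted
```

Training a classifier with these per-instance weights removes the statistical dependence between the protected attribute and the label without altering any feature values.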
See who's already working on this problem and join a team.
- Bias detectives: the researchers striving to make algorithms fair
- Millions of Black people affected by racial bias in health-care algorithms
- Dissecting racial bias in an algorithm used to manage the health of populations
- How Explainable AI Is Helping Algorithms Avoid Bias
- Ethical dilemmas of AI: fairness, transparency, collaboration, trust, accountability & morality
- This Researcher's Observation Shows The Uncomfortable Bias Of TikTok's Algorithm
- AI Fairness 360: An Extensible Toolkit for Detecting, Understanding, and Mitigating Unwanted Algorithmic Bias
- AI Fairness 360: Resources
At this time, datasets are provided for reference only. Do not include dataset information in any solutions until further notice.
Bias is learned and perpetuated in different ways (e.g. societal beliefs, misrepresentation, ignorance) that consequently create inequitable outcomes across all spheres of life.
- A media content editor (e.g., audio, gaming, movies, TV, comics, news, publications) can incorporate bias detection and remediation into their creative process to reduce racial bias and improve representation for Gen Z.
- A media consumer can track the quantity of, and bias in, their consumption of content from Black creators or about Black people in an easy interface.
- A social media user can understand the historical and societal context of racial bias and cultural appropriation reflected in their posts in real time.
- A doctor can identify their bias and discrepancies in clinical recommendations for Black patients without manually comparing patients' notes.
Mobile and web applications can be used to help individuals tackle their own bias. Bias detection algorithms, machine-learning-based recommendation systems, chatbots, and predictive analytics are powerful tools to underpin these applications. Such technology can be put to impactful use in various forms of media to represent people more fairly and diversely, both qualitatively (by capturing a range of identifiable characteristics, beliefs, personas, interests, and cultures) and quantitatively (by increasing overall and stratified numbers). A by-product of such interventions is the elimination of incorrect, biased stereotypes that have historically plagued media content generation and natural language.
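For the media-consumer hill above, even a very simple analytic makes consumption patterns visible. The sketch below (illustrative field names and log data, not a real API) computes each creator group's share of a consumption log, the kind of number an "easy interface" could chart over time:

```python
from collections import Counter

def consumption_shares(log):
    """log: list of consumed items, each a dict with a 'creator_group' field.
    Returns each group's share of total consumption."""
    counts = Counter(item["creator_group"] for item in log)
    total = sum(counts.values())
    return {group: count / total for group, count in counts.items()}

# Hypothetical viewing log for one user.
log = [
    {"title": "clip 1", "creator_group": "Black creators"},
    {"title": "clip 2", "creator_group": "other"},
    {"title": "clip 3", "creator_group": "other"},
    {"title": "clip 4", "creator_group": "other"},
]
print(consumption_shares(log))  # {'Black creators': 0.25, 'other': 0.75}
```

A real application would derive `creator_group` from platform metadata rather than manual tags, and pair the quantity measure with the bias measures discussed above.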
See who's already working on this problem and join a team.
- Confronting racial bias in video games
- Who Gets To Be A Superhero? Race And Identity In Comics
- Fighting the subconscious biases that lead to health care disparities
- Teens on TikTok have no clue they’re perpetuating racist stereotypes
- They've Gotta Have Us
- The Racial Bias Built Into Photography
- The Unfortunate History of Racial Bias In Photography
- What Hollywood movies do to perpetuate racial stereotypes
- Racial bias in expert quality assessment: A study of newspaper movie reviews
- Crossing the color line: An examination of mediators and a social media intervention for racial bias in selective exposure to movies
- Race and video games
- “Fair Play”: A Videogame Designed to Address Implicit Race Bias Through Active Perspective Taking
- Fair Play Game: Resources
- Racial disparities in automated speech recognition
- Publications as predictors of racial and ethnic differences in NIH research awards
- Interventions designed to reduce implicit prejudices and implicit stereotypes in real world contexts: a systematic review
Find help on the Support page.
This solution starter is made available under the Apache 2 License.