How I'm fighting bias in algorithms | Joy Buolamwini
The Coded Gaze: Algorithmic Bias
Joy Buolamwini, a poet of code, introduces the concept of algorithmic bias and how it can lead to unfairness and exclusionary experiences. She shares her personal experience with facial recognition software and highlights the need for diverse training sets.
Facial Recognition Software
- Joy demonstrates how generic facial recognition software fails to detect her darker-skinned face — yet detects it as soon as she puts on a white mask.
- She explains that computer vision uses machine learning for face detection: a model is trained on a set of example faces. If the training set isn't diverse enough, any face that deviates too much from the established norm will be harder to detect.
- Joy emphasizes the importance of creating full-spectrum training sets that reflect a richer portrait of humanity.
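The talk doesn't include code, but the bias it describes is measurable: run the same detector over images labeled by group and compare detection rates. A minimal sketch (the function name, data, and labels here are illustrative assumptions, not from the talk):

```python
from collections import defaultdict

def detection_rate_by_group(results):
    """Compute the face-detection rate per demographic group.

    results: iterable of (group, detected) pairs, where `detected`
    is True if the detector found a face in that image.
    """
    totals = defaultdict(int)
    hits = defaultdict(int)
    for group, detected in results:
        totals[group] += 1
        if detected:
            hits[group] += 1
    return {g: hits[g] / totals[g] for g in totals}

# Hypothetical results from a detector trained on a narrow dataset.
results = [
    ("lighter-skinned", True), ("lighter-skinned", True),
    ("lighter-skinned", True), ("lighter-skinned", False),
    ("darker-skinned", True), ("darker-skinned", False),
    ("darker-skinned", False), ("darker-skinned", False),
]
rates = detection_rate_by_group(results)
```

A gap between the groups' rates is exactly the kind of evidence a "full-spectrum" training set is meant to eliminate.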
Discriminatory Practices
- Police departments in the US are starting to add facial recognition software to their crime-fighting arsenal. Yet a Georgetown Law report found that one in two adults in the US has their face in a facial recognition network — networks that operate without regulation or accuracy audits.
- Misidentifying a suspect is no laughing matter, and neither is breaching civil liberties.
- Machine learning is being used well beyond computer vision, in decisions such as who gets hired or fired, whether someone receives a loan, and whether someone gets insurance.
Conclusion
- Algorithmic bias can lead to exclusionary experiences and discriminatory practices.
- Creating diverse training sets is crucial for reducing algorithmic bias.
The Importance of Inclusive Coding Practices
This section discusses the importance of creating inclusive code and employing inclusive coding practices to ensure fair outcomes.
Who Codes Matters
- Full-spectrum teams with diverse individuals can check each other's blind spots.
How We Code Matters
- Factoring in fairness as we're developing systems is crucial.
Why We Code Matters
- Using tools of computational creation to unlock immense wealth presents an opportunity to unlock even greater equality if we make social change a priority and not an afterthought.
The "Incoding" Movement
This section introduces the three tenets that make up the "incoding" movement: who codes matters, how we code matters, and why we code matters.
Building Platforms for Identifying Bias
- Building platforms that identify bias by collecting people's experiences is one step toward incoding.
- Auditing existing software is also important.
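The talk calls for audits without prescribing a method. One common sketch — borrowed from the "four-fifths rule" used in US employment-discrimination guidance, not from the talk itself — flags any group whose detection rate falls below a fixed fraction of the best-performing group's rate (all names and numbers below are illustrative assumptions):

```python
def audit_disparity(rates, threshold=0.8):
    """Return groups whose rate is below `threshold` times the
    best group's rate, mapped to their disparity ratio."""
    best = max(rates.values())
    return {g: r / best for g, r in rates.items() if r / best < threshold}

# Hypothetical per-group detection rates from a software audit.
rates = {"group_a": 0.95, "group_b": 0.65}
flagged = audit_disparity(rates)
```

Here `group_b`'s rate is about 68% of `group_a`'s, below the 80% threshold, so the audit would flag it for investigation.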
Creating More Inclusive Training Sets
- Joy imagines a "Selfies for Inclusion" campaign, where anyone can help developers test and build more inclusive training sets.
Thinking Conscientiously About Social Impact
- Thinking more conscientiously about the social impact of the technology that we're developing is necessary for incoding.
Joining the Algorithmic Justice League
This section invites viewers to join the Algorithmic Justice League, where anyone who cares about fairness can help fight against algorithmic bias.
Reporting Bias and Requesting Audits
- On codedgaze.com, viewers can report bias, request audits, become a tester, and join the ongoing conversation, #codedgaze.
Creating a World of Inclusion
- The speaker invites viewers to join her in creating a world where technology works for all of us, not just some of us, and where we value inclusion and center social change.
Conclusion
This section concludes the talk by asking viewers to join the fight against algorithmic bias.
Joining the Fight Against Algorithmic Bias
- The speaker closes by asking viewers directly: will you join me in the fight against algorithmic bias?