How I'm fighting bias in algorithms | Joy Buolamwini

The Coded Gaze: Algorithmic Bias

Joy Buolamwini, a poet of code, introduces the concept of algorithmic bias and how it can lead to unfairness and exclusionary experiences. She shares her personal experience with facial recognition software and highlights the need for diverse training sets.

Facial Recognition Software

  • Joy demonstrates how facial recognition software fails to detect her darker-skinned face, yet detects it as soon as she puts on a white mask.
  • She explains that computer vision systems use machine learning for facial recognition: a training set of example faces teaches the model what a face looks like. If the training set isn't diverse enough, any face that deviates too much from the established norm is harder to detect.
  • Joy emphasizes the importance of creating full-spectrum training sets that reflect a richer portrait of humanity.
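The under-representation problem described above can be made concrete with a quick check of a training set's composition. The function, demographic tags, and threshold below are hypothetical illustrations of the idea, not anything from the talk:

```python
from collections import Counter

def representation_report(labels, threshold=0.15):
    """Flag groups whose share of the training set falls below a minimum.

    `labels` is a hypothetical list of demographic tags, one per training
    image; `threshold` is an illustrative minimum share, not a standard.
    Returns {group: (share, is_under_represented)}.
    """
    counts = Counter(labels)
    total = sum(counts.values())
    return {group: (n / total, n / total < threshold)
            for group, n in counts.items()}

# Toy training-set composition, heavily skewed toward one group.
tags = ["lighter"] * 90 + ["darker"] * 10
report = representation_report(tags)
# "darker" holds a 0.10 share, below the 0.15 threshold, so it is flagged.
```

A real full-spectrum training set would of course need far more than balanced counts, but even this crude check surfaces the skew that makes deviating faces harder to detect.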

Discriminatory Practices

  • Police departments in the US are starting to add facial recognition software to their crime-fighting arsenal. A Georgetown Law report found that one in two US adults has their face in a facial recognition network, with no regulation and no accuracy audits.
  • Misidentifying a suspected criminal is no laughing matter, and neither is breaching civil liberties.
  • Machine learning is spreading beyond computer vision into decision-making processes such as hiring and firing, loan approvals, and insurance coverage.

Conclusion

  • Algorithmic bias can lead to exclusionary experiences and discriminatory practices.
  • Creating diverse training sets is crucial for reducing algorithmic bias.

The Importance of Inclusive Coding Practices

This section discusses the importance of creating inclusive code and employing inclusive coding practices to ensure fair outcomes.

Who Codes Matters

  • Full-spectrum teams with diverse individuals can check each other's blind spots.

How We Code Matters

  • Factoring in fairness as we're developing systems is crucial.

Why We Code Matters

  • The tools of computational creation that unlock immense wealth also present an opportunity to unlock even greater equality, if we make social change a priority rather than an afterthought.

The "Incoding" Movement

This section introduces the three tenets that make up the "incoding" movement: who codes matters, how we code matters, and why we code matters.

Building Platforms for Identifying Bias

  • Building platforms that collect people's experiences of bias can help identify it and advance incoding.
  • Auditing existing software is also important.
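One common form such an audit takes is disaggregated evaluation: running the software on labelled examples and comparing outcomes across groups. The function and data below are a hypothetical sketch of that idea, not a description of any specific audit:

```python
def audit_by_group(results):
    """Compute per-group detection rates from audit results.

    `results` is a hypothetical list of (group, detected) pairs, e.g.
    collected from testers or a labelled benchmark. Returns the fraction
    of faces detected within each group.
    """
    totals, hits = {}, {}
    for group, detected in results:
        totals[group] = totals.get(group, 0) + 1
        hits[group] = hits.get(group, 0) + int(detected)
    return {g: hits[g] / totals[g] for g in totals}

# Illustrative audit data: detection succeeds less often for one group.
data = ([("lighter", True)] * 9 + [("lighter", False)]
        + [("darker", True)] * 6 + [("darker", False)] * 4)
rates = audit_by_group(data)
# rates exposes the gap: 0.9 detection for "lighter" vs 0.6 for "darker".
```

Reporting rates per group, rather than a single overall accuracy, is what makes this kind of audit able to surface the exclusionary behavior a blended number would hide.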

Creating More Inclusive Training Sets

  • A "Selfies for Inclusion" campaign, where viewers help developers test software and build more inclusive training sets, could be helpful.

Thinking Conscientiously About Social Impact

  • Thinking more conscientiously about the social impact of the technology that we're developing is necessary for incoding.

Joining the Algorithmic Justice League

This section invites viewers to join the Algorithmic Justice League, where anyone who cares about fairness can help fight against algorithmic bias.

Reporting Bias and Requesting Audits

  • On codedgaze.com, viewers can report bias, request audits, become a tester, and join the ongoing conversation, #codedgaze.

Creating a World of Inclusion

  • The speaker invites viewers to join her in creating a world where technology works for all of us, not just some of us, and where we value inclusion and center social change.

Conclusion

This section concludes the talk by asking viewers to join the fight against algorithmic bias.

Joining the Fight Against Algorithmic Bias

  • The speaker asks viewers if they will join her in the fight against algorithmic bias.
Channel: TED
Video description

MIT grad student Joy Buolamwini was working with facial analysis software when she noticed a problem: the software didn't detect her face -- because the people who coded the algorithm hadn't taught it to identify a broad range of skin tones and facial structures. Now she's on a mission to fight bias in machine learning, a phenomenon she calls the "coded gaze." It's an eye-opening talk about the need for accountability in coding ... as algorithms take over more and more aspects of our lives.