Algorithmic Justice League

Many people think that because computers are not people, they don’t perpetuate our biases and are more impartial. But what they forget is that humans write the code computers run and choose the data those systems learn from, so human biases are baked into the cake.

The Algorithmic Justice League aims to combat coded bias; its stated mission is to “[highlight] bias through art, media, and science; provide space for people to voice concerns and experiences with coded bias; and develop practices for accountability during the design, development, and deployment of coded systems.”

In the talk above, Joy Buolamwini describes her experience with a facial recognition system that wouldn’t register her face unless she wore a white mask. (Not surprisingly, these systems are often trained on datasets dominated by lighter-skinned faces.) As these technologies become more pervasive, it becomes all the more important to make them inclusive.

Explore: Algorithmic Justice League
