Blog 11

The article “Women in Media Arts: Does AI think like a (white) man?” addresses bias in AI, specifically regarding gender and race. “Gender Shades,” just one of the many artists’ works mentioned in the article, explores how well AI facial recognition works across different races and genders. The research of Joy Buolamwini and Timnit Gebru, the creators of the investigation, led them to conclude that facial recognition has a higher error rate for women, especially women of color, than for other genders and races. They found that developers had trained the programs on incorrect or incomplete data sets, and, in turn, they created the first data set that includes all skin types and can be used to test face-based gender recognition.
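To make the kind of disparity Gender Shades measured more concrete, here is a minimal sketch in Python of a per-group error-rate audit. The records and group labels are invented for illustration and are not the researchers’ actual data or code; the idea is simply that accuracy is tallied separately for each demographic group rather than averaged over everyone.

```python
from collections import defaultdict

# Hypothetical audit records: (demographic group, true gender, predicted gender).
# Gender Shades grouped faces by skin type and gender; these rows are made up.
predictions = [
    ("darker-skinned female",  "female", "male"),
    ("darker-skinned female",  "female", "female"),
    ("darker-skinned male",    "male",   "male"),
    ("lighter-skinned female", "female", "female"),
    ("lighter-skinned male",   "male",   "male"),
    ("lighter-skinned male",   "male",   "male"),
]

# Count total and misclassified examples for each group.
totals = defaultdict(int)
errors = defaultdict(int)
for group, truth, predicted in predictions:
    totals[group] += 1
    if predicted != truth:
        errors[group] += 1

# A large gap in error rate between groups is the disparity the audit exposed.
for group in sorted(totals):
    rate = errors[group] / totals[group]
    print(f"{group}: {rate:.0%} error rate ({errors[group]}/{totals[group]})")
```

A system that looks accurate on average can still fail badly for one group, which is why the audit breaks the results down this way.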

Another pair of artists mentioned in the article, Birgitte Aga and Coral Manton, focus on gendered AI through their project, “Women Reclaiming AI.” The pair learned that AI language assistants are usually developed by teams that lack diversity and, through AI, reinforce stereotypes and traditional gender roles. In their project, they confront these biases and have created a “feminist data set” for future AI systems. The artists mentioned are taking steps to address the shortcomings of AI and raising questions and problems to be solved around equality and AI.

