Looking Outwards 11

Gender Shades

https://ars.electronica.art/aeblog/en/2020/04/10/women-in-media-arts-ai/

As artificial intelligence software that detects, recognizes, and classifies faces becomes increasingly popular, researchers Joy Buolamwini and Timnit Gebru are examining, in their project "Gender Shades," how codified biases in facial recognition software often cause it to misgender people who are not white, or to fail to detect their faces at all. This biased software is often created by male-dominated teams of computer scientists who lack diversity in ethnicity, race, and gender. Additionally, the datasets these teams use to train their programs often lack diversity as well, which is why the software performs poorly on people who are not white or male. To combat this, Buolamwini and Gebru have created a new benchmark dataset drawn from a diverse group of 1270 parliamentarians from African and European countries. This benchmark for gender and racial diversity will help facial recognition systems learn to recognize all faces and to distinguish between genders and ethnicities without bias.
