LO 11

Hannah Wyatt section A

When an AI face-recognition program consistently fails to detect only women of colour, there is clearly a systemic issue. The ARS Electronica Blog dissects this bias in the article “Women in Media Arts”, attributing the fault to the creators themselves: white men. In their project “Gender Shades”, Joy Buolamwini (US) and Timnit Gebru (ETH) investigated how such systems discriminate by gender and skin colour, finding the cause to be incomplete data sets.

The article encourages readers to reflect on the societal consequences: if these people are misrepresented, they may receive fewer opportunities in certain technological fields. Caroline Sinders (US) therefore devised ‘feminist data sets’ to counteract bias in machine learning, gathering art, interviews, and data collected by women. Mary Flanagan’s study “Help Me Know The Truth” measures the broader structural flaws in society by assessing how participants classify random subjects as either ‘victims’ or ‘criminals’.
