Looking Outwards-11

In "Women in Media Arts: Does AI think like a (white) man?", Grubauer presents Ars Electronica's view on AI in support of their project to create "a comprehensive database devoted specifically to women in media art." They did this to give girls and women role models in media art by increasing the presence of women artists in the public consciousness. Such projects play an important role in countering a phenomenon as hegemonic as patriarchy in the Western world; feminist philosophy and sociological research have shown that patriarchal (and white) tendencies permeate cultural logic and societal common sense to the point of influencing even objectivist science. "More and more activists point out the problematic prejudices and distortions of supposedly objective [my italics] algorithms. This has not only to do with the low proportion of female programmers and women in the IT sector in general, but above all with the biased data sets."

The article's mention of Mary Flanagan aptly points out that this is a structural problem of society, the very permeation I mention above; it would be absurd to say an algorithm is inherently sexist or racist, and this claim is likely taken up by confused defenders of the activists who push to make AI more equitable. The rest of the article introduces other women in the field of media art and their work, such as Buolamwini and Gebru on skin-color bias in facial recognition, Aga and Manton on women-empowering AI, and Sinders' Feminist Data Set.

All quotes are from the article linked below.

https://ars.electronica.art/aeblog/en/2020/04/10/women-in-media-arts-ai/
