Looking Outward 11 – Racial Biases in Artificial Intelligence

In this article, Meilan Solly discusses ImageNet Roulette, a project by Trevor Paglen and Kate Crawford created to expose the highly flawed and derogatory nature of AI human categorization. The project took the form of an AI-driven identification tool that, when supplied with an image of a person, would return the category to which that image belongs (according to the algorithm). These categories ranged from neutral identifiers to problematic terms like ‘pervert’, ‘alky (alcoholic)’, and ‘slut’.

While the category names are disturbing in and of themselves, the trends of categorization were far more so. The algorithm generally sorted people of color and women into extremely offensive categories at a disproportionately high rate. While no type of person was entirely safe from a harmful identifier, the disparity was clear. Solly describes a trial of the algorithm by a Twitter user who uploaded photos of himself in various contexts and was only returned the tags “Black, Black African, Negroid, and Negro” (Solly). Paglen and Crawford have since removed ImageNet Roulette from the internet, given that it has “made its point” (Solly); however, it remains on view as an art installation in Milan.

ImageNet Roulette Installation

The implications of this project run deep. Arbitrary categorization of people on its own may have little consequence, but the underlying system to which it alludes is the same system that functions in the background of active AI processes with real world applications. Beyond this, the project makes comments on the concept of privacy online, having used thousands of images sourced from various locations without consent.

Looking Outwards 11

In Women in Media Arts: Does AI think like a (white) man?, Grubauer presents Ars Electronica’s view on AI in support of their project: “a comprehensive database devoted specifically to women in media art.” The database is meant to give girls and women role models in media art by raising the visibility of women artists in the public consciousness. Such projects play an important role in countering a phenomenon as hegemonic as Western patriarchy; feminist philosophy and sociological research have shown that patriarchal (and white) tendencies permeate cultural logic, or societal common sense, to the point of influencing supposedly objective science. “More and more activists point out the problematic prejudices and distortions of supposedly objective [my italics] algorithms. This has not only to do with the low proportion of female programmers and women in the IT sector in general, but above all with the biased data sets.” The article’s mention of Mary Flanagan aptly points out that this is a structural problem of society, the source of the permeation I mention; it would be absurd to say an algorithm is inherently sexist or racist, and that absurd claim is likely what confused detractors mistakenly attribute to the activists who push for making AI more equitable. The rest of the article introduces other women in the field of media art and their work, such as Buolamwini and Gebru on skin-color recognition, Aga and Manton on women-empowering AI, and Sinders’ feminist data set.

All quotes are from the article linked below.

https://ars.electronica.art/aeblog/en/2020/04/10/women-in-media-arts-ai/

Looking Outward – 11

The article “6 Art Projects Prying the Lid Off Online Privacy” discusses the blurring of private versus public on the internet. We don’t realize how much of our data and information is shared without our knowledge. We use digital applications and services that are “free”, when in reality data is the currency of social media. Who actually reads the fine print and understands what they are agreeing to? Many artists have explored this space in connection to identity, privacy, and data collection. One of the art pieces shared in this article is “fbFaces” by Joern Roeder and Jonathan Pirnay. Roeder and Pirnay used a crawler to find public profiles and collect images, Facebook IDs, and names. They then pixelated these images, reflecting that they are no longer people, but a network of data that the people pictured no longer control. This work fosters awareness of how data can be used and what that means for privacy and identity.
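The pixelation step described above can be sketched in a few lines. This is a minimal, dependency-free illustration of the general technique (averaging each block of a grayscale pixel grid), not Roeder and Pirnay’s actual code; the grid representation and block size are assumptions for illustration.

```python
def pixelate(pixels, block=2):
    """Replace each block x block tile of a 2D grayscale grid with its mean,
    so individual features dissolve into coarse blocks of data."""
    h, w = len(pixels), len(pixels[0])
    out = [row[:] for row in pixels]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            # Collect the values in this tile (clipped at the image edges).
            tile = [pixels[y][x]
                    for y in range(by, min(by + block, h))
                    for x in range(bx, min(bx + block, w))]
            mean = sum(tile) // len(tile)
            # Overwrite the whole tile with its average value.
            for y in range(by, min(by + block, h)):
                for x in range(bx, min(bx + block, w)):
                    out[y][x] = mean
    return out
```

Larger block sizes destroy more identifying detail, which is exactly the gesture of the artwork: the faces become unrecoverable aggregates of data.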

Link : https://www.vice.com/en/article/4x4p43/6-art-projects-prying-the-lid-off-online-privacy

Social impacts of NFT

The reading discusses NFTs (non-fungible tokens) and how they impact digital artists and collectors. An NFT adds authenticity to digital artwork like images, music, or videos, and helps establish the artwork’s ownership so that people cannot simply claim it by taking screenshots online. The author embraces this new trend in the digital art market because it “economically legitimizes an emerging artform.”

Andrew Benson, Active Gestures 10. Sold for: $3,049

The reading mentions the artist Andrew Benson, who had experimented with digital video works for years without receiving any financial reward, and had to work at a software company to support his career as an artist. After selling his work as an NFT, however, he could sustain himself solely through art, which liberated him from tedious but necessary day jobs.

Despite their benefits to artists and collectors, NFTs have setbacks. Like Bitcoin, NFTs depend on blockchain technology to make each artwork unique, which demands a tremendous amount of computing power and is not environmentally friendly. Moreover, an NFT does not legally prevent people from stealing artwork; it merely protects the metadata of the artwork rather than the work itself. NFTs are therefore not a complete solution for establishing ownership of artwork.
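The distinction drawn above, protecting the metadata rather than the work itself, can be illustrated with a short sketch. The record fields and hashing scheme here are illustrative assumptions, not the schema of any real token standard; the point is that only a fingerprint of the metadata is fixed on-chain, while the image file it points to can still be copied freely.

```python
import hashlib
import json

def mint_record(title, artist, image_url):
    """Build a token-like record containing a hash of the artwork's metadata.
    Note: only the metadata is fingerprinted; the image at image_url is not
    protected in any way and can still be screenshotted or downloaded."""
    metadata = {"title": title, "artist": artist, "image": image_url}
    # Canonicalize the metadata so the same fields always hash identically.
    canonical = json.dumps(metadata, sort_keys=True).encode()
    digest = hashlib.sha256(canonical).hexdigest()
    return {"metadata": metadata, "metadata_hash": digest}
```

Anyone can verify that a record’s metadata has not been altered by recomputing the hash, but nothing here stops anyone from duplicating the underlying file, which is the limitation the article points out.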

Looking Outwards 11: Societal Impacts of Digital Art

Graham Murtha

Section A

Sebastian Smee’s interpretation of Beeple’s piece “Everydays: The First 5000 Days” suggests that opinion writers on the topic of art must not form opinions of their own, and may only look at the “industry” through a commercial lens. Beeple, for those who don’t know, is a world-famous artist and graphic designer known to his fans for investing 4–5 hours every day in making Cinema 4D artwork. To everyone else, he is the artist who sold the largest NFT and the third most expensive art piece in the world. Smee takes an astonishingly egregious, damning view of Beeple’s art, for two main reasons. His first objection is that because the work is digital and non-tangible, it does not count as art. I guess this means that every movie Smee has ever seen is not art either. His second point is that the sale of this piece only proves that Beeple has succumbed to late-stage capitalism, as if Smee is not paid heavily by the Washington Post to spite artists. What Smee fails to acknowledge about this piece is that 5,000 daily artworks equates to nearly 14 years. I find it inspiring that an artist such as Beeple could love his craft enough to spend 13+ years of his life consistently investing a massive chunk of his time into an unpaid passion. It seems like poetry that Beeple, after almost 14 years, would finally be enabled by Metakovan (the buyer) to make this passion an occupation, a dream shared by millions of artists around the world. “Everydays: The First 5000 Days” is certainly cause for discussion of what “fine art” is these days, and it raises the question of how to categorize digital collage (Beeple’s medium). However, it is an outright ridiculous claim to deny that his work is art in any form, given the immense skill, creativity, and determination that go into Beeple’s 3D digital collage work.

https://www.washingtonpost.com/entertainment/museums/beeple-digital-artwork-sale-perspective/2021/03/15/6afc1540-8369-11eb-81db-b02f0398f49a_story.html

Smee, Sebastian. “Perspective | Beeple’s Digital ‘Artwork’ Sold for More than Any Painting by Titian or Raphael. But as Art, It’s a Great Big Zero.” The Washington Post, WP Company, 17 Mar. 2021.

Looking Outwards 11

The reading discusses how art projects expose racial prejudice in artificial intelligence, both directly and indirectly. One specific example that stood out was ImageNet Roulette, an artificial intelligence classification tool created by artist Trevor Paglen and A.I. researcher Kate Crawford. The AI would give the person it detects in a picture a title. Sometimes the title is objective, like a career, but most of the time the tool judges based on skin color and gives out subjective descriptions. This was an issue because the labels used to teach the A.I. were sourced from lab staff and crowdsourced workers; by categorizing the presented images in terms of race, gender, age, and character, these individuals introduced their own conscious and unconscious opinions and biases into the algorithm. Something I’ve noticed is that artworks that involve testing on people often lack representation of different racial groups. Being a minority, I am often disappointed to see my group go unrepresented in those works.

The physical AI art installation at Milan’s Fondazione Prada Osservatorio

Citation:
Solly, Meilan. “Art Project Shows Racial Biases in Artificial Intelligence System.” Smithsonian Magazine, 24 Sept. 2019. Retrieved November 16, 2022, from https://www.smithsonianmag.com/smart-news/art-project-exposed-racial-biases-artificial-intelligence-system-180973207/

LO-11: How New Media Artists Bridge the Digital Divide

In “How Artists Can Bridge the Digital Divide and Reimagine Humanity,” Chavez addresses the importance of the role that new media artists play in closing the “digital divide” – that is, equipping people around the world with the necessary knowledge as well as skills to move from simply being consumers of digital media and content to being producers of it. As outlined by Chavez, the steps to achieve this goal are as follows: 

Closing the initial gap of access to digital technologies;

Harnessing wonder through creating fun, engaging projects that facilitate interest in participating in a new, digital society;

Creating shareable digital resources, otherwise known as public goods;

Addressing the STEMarts Model.

As it currently stands, an elite minority – those with proper access to new media technologies and the necessary skills to manipulate them – are responsible for creating the vast majority of new digital media content online. In bridging the gap between art and STEM, we expand our understanding of ourselves, our humanity, and our roles in society; as Chavez outlines in her article, by supporting artists who innovate by creating new digital tools and experiences, we encourage diverse communities across the globe to participate in the reimagination of humanity.

https://www.arts.gov/impact/media-arts/arts-technology-scan/essays/how-artists-can-bridge-digital-divide-and-reimagine-humanity

srauch – Blog 11 – Societal Impacts of Digital Art

I found a very interesting episode of Sidedoor, a podcast created by the Smithsonian, that interviewed artist Stephanie Dinkins about her work on how Black stories interact with AI.

Many AIs that are programmed to generate language pull information from the internet. This makes sense – if you’re looking to train an AI to talk, why not use the biggest and most easily accessible database of language in the world? Well, because, as Dinkins articulates, the internet is racist. It’s full of language that, even if it’s not overtly racist, carries the inherent racist biases and assumptions that have become ingrained within the English language. Those biases, by extension, become embedded in the AI. To counter this phenomenon, Dinkins created Not the Only One: an AI that creates a memoir of a Black American family. It draws its language ability not from the internet, but from the familial stories of Dinkins, her mother, and her grandmother. Her work raises valuable questions about where our information is coming from. While we think of computers as completely logical, we can’t forget that they’re a product of human creation, and thus always susceptible to our biases and human errors.