Looking Outwards 11: Societal Impacts of Digital Art

Article Read: https://www.washingtonpost.com/entertainment/museums/beeple-digital-artwork-sale-perspective/2021/03/15/6afc1540-8369-11eb-81db-b02f0398f49a_story.html

This article discusses copyright and the high prices being placed on digital art, which is itself a controversial subject: whether it should be considered art at all. The new wave of digital art, or, as the author argues, “a marketable digital product” rather than art, is being sold for very high prices as a way to signify ownership. The idea that art has become a commodity that will significantly increase in value over time seems to be the main incentive. The author criticizes the work of Beeple, a digital artist who recently sold a digital file titled “Everydays: The First 5000 Days” for $69.3 million, significantly more than paintings by Titian or Raphael have fetched. The author notes that there is no rule for determining the value of art, but argues that nowadays marketing and desire have overtaken the art itself.

Looking Outward 11

link: https://ars.electronica.art/aeblog/en/2020/04/10/women-in-media-arts-ai/

I looked into the article “Women in Media Arts: Does AI think like a (white) man?”, in which the author discusses the discrimination built into artificial intelligence by a predominantly white, male field. This is because the AI is “only as good …  as the data” it is fed. Thus, if the internet is still populated mostly by artwork made by white men or drawn from a white male perspective, that bias will persist in the AI’s algorithms.
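To make the “only as good as the data” point concrete, here is a minimal, hypothetical sketch (my own toy example, not anything from the article): a classifier trained on a data set where one group vastly outnumbers another ends up accurate for the majority group and unreliable for the minority group, even though no line of code is explicitly “biased.”

```python
# Hypothetical demo (my own, not from the article): a model trained mostly on
# data from one group quietly learns that group's rule and fails on the other.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, reversed_rule):
    """Generate toy data; the two groups follow opposite labeling rules."""
    x = rng.normal(size=(n, 1))
    y = (x[:, 0] > 0).astype(int)
    return x, (1 - y) if reversed_rule else y

# Training set: 950 samples from group A, only 50 from group B
xa, ya = make_group(950, reversed_rule=False)
xb, yb = make_group(50, reversed_rule=True)
model = LogisticRegression().fit(np.vstack([xa, xb]), np.hstack([ya, yb]))

# Fresh test data for each group
xa_test, ya_test = make_group(1000, reversed_rule=False)
xb_test, yb_test = make_group(1000, reversed_rule=True)
print("accuracy on group A:", model.score(xa_test, ya_test))  # high
print("accuracy on group B:", model.score(xb_test, yb_test))  # much lower
```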

This bias has been exemplified by AI face recognition software, where the error rate is significantly higher among women of color, revealing the prejudice baked into the AI’s training data. AI teams that lack diversity in gender and race are very likely to produce software that lacks a broader perspective, and thus bears a bias that favors white men.

The article also gives examples of efforts to counteract this bias, such as the Feminist Data Set, an ongoing multi-year project that collects feminist data: artworks, essays, interviews, and books on feminism or written from a feminist perspective. This data set would help push diversity in AI algorithms and broaden the perspective of the data they are trained on.

Looking Outwards 11

Gender Shades

https://ars.electronica.art/aeblog/en/2020/04/10/women-in-media-arts-ai/

As artificial intelligence software that detects, recognizes, and classifies faces becomes increasingly popular, researchers Joy Buolamwini and Timnit Gebru are examining, in their project “Gender Shades”, how codified biases in facial recognition software often misgender people who are not white, or fail to recognize their faces at all. These biased facial recognition systems are often created by male-dominated teams of computer scientists who lack diversity in ethnicity, race, and gender. The data sets that these computer scientists feed their programs also often lack diversity, which is why the software does a poor job of recognizing people who are not white or male. To combat this, Buolamwini and Gebru have created a new benchmark data set drawn from a diverse group of 1,270 parliamentarians from African and European countries. This benchmark for gender and racial diversity will help facial recognition systems learn to recognize all faces and distinguish between genders and ethnicities without bias.
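The kind of audit Gender Shades performs can be sketched very simply. The code below is a hypothetical stand-in, not the actual Gender Shades pipeline: `classify_gender` represents any commercial gender-classification service, and the point is only that error rates are computed per subgroup rather than as a single aggregate, which is what makes the disparity visible.

```python
# Hypothetical sketch of a disaggregated audit in the spirit of Gender Shades.
# `classify_gender` stands in for a commercial gender-classification service;
# each record carries a ground-truth label and a subgroup annotation.
from collections import defaultdict

def subgroup_error_rates(records, classify_gender):
    """records: iterable of dicts with 'image', 'gender', and 'subgroup' keys."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for r in records:
        predicted = classify_gender(r["image"])
        totals[r["subgroup"]] += 1
        if predicted != r["gender"]:
            errors[r["subgroup"]] += 1
    return {group: errors[group] / totals[group] for group in totals}

# A single overall accuracy can look acceptable while one subgroup
# (e.g. darker-skinned women) has a far higher error rate.
```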

Looking Outwards 11

In her article, Sophie Davies sheds light on how the “digital divide” widened during COVID-19 through two paintings: one of a woman reading on an iPad and another of children dreaming of computers. The “digital divide” is the gap in access to computers and the internet. Davies explains that during the pandemic, internet use increased worldwide because of quarantine, but mostly in more developed countries. While the internet allowed many people to sustain incomes and stay socially connected, it also needs to be said that computers and the internet can be as exclusive as they are inclusive. People who cannot access these technologies are quickly left behind, especially during COVID-19, when so much of the world moved online for economic and social life. Those left behind are often minorities, often poor, and often in the least developed countries. One such group is women. Worldwide, women are less likely to own and connect to technology because of gender inequality. As a result, in a workplace that is swiftly becoming dominated by the internet, women are left with lower-valued jobs because they never had the chance to learn the skills to manage technology. The second painting Davies brings up shows how children during the pandemic were academically left behind because they could not afford the technology that would allow them to learn remotely. Both of these groups, due to discrimination or economic disadvantage, are left behind in a world that is quickly moving past them.

“Spanish art show spotlights ‘hidden’ digital divide in pandemic” by Sophie Davies

Vase depicting children dreaming of computers.

Gender and Racial Bias in AI

A societal issue I read about is how artificial intelligence has a problem with gender and racial bias; the article also offers solutions for how to fix it. The article was written by Joy Buolamwini for Time magazine in 2019. Buolamwini is a computer scientist and “poet of code” who uses art and research to illuminate the social impacts of artificial intelligence. She founded the Algorithmic Justice League, an organization working toward more equitable and accountable technology. The article itself is about her MIT thesis, in which she used AI services from Microsoft, IBM, and Amazon and fed them photos of famous black women to see how the AI would identify them. When she evaluated the results, she found that gender-classification error rates for darker-skinned women reached 35%, and that the services failed to correctly classify the faces of Oprah Winfrey, Michelle Obama, and Serena Williams. She also found that when a white mask was used, the computer was then able to classify the face correctly.


The gender and racial bias in AI stems from the fact that women of color are rarely in positions to develop this technology; most AI technology is created by white men. The data sets the AI uses to learn to recognize faces also contain fewer women of color than men or lighter-skinned people. Buolamwini highlights the many organizations trying to combat this bias in AI and calls for governments and police to stop using this technology to identify individuals, as it will misidentify women of color and perpetuate a system of abuse.

“How I’m Fighting Bias in Algorithms,” TED talk by Joy Buolamwini, March 2017

Looking Outwards 11: Societal Impacts of Digital Art

The article is about ImageNet Roulette, a classification tool that uses artificial intelligence to sort and categorize pictures. It was created by Trevor Paglen and Kate Crawford, who hoped their platform would reveal “racist, misogynistic and cruel results” and thereby highlight issues of bias in artificial intelligence. The data set that trains the AI is widely used in the industry and consists of over 14 million images. The article describes an extremely concerning pattern: the AI labeled white individuals with ordinary descriptors about their careers, personalities, and so on, while black participants primarily received results that reflected their race. Fortunately, changes were made to the database, and images and descriptors were removed to reduce the negative biases the software was producing.

https://www.smithsonianmag.com/smart-news/art-project-exposed-racial-biases-artificial-intelligence-system-180973207/
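As a rough illustration of why the training set matters so much (my own hedged sketch, not the artists’ code, and using the standard 1,000-class ImageNet model rather than the “person” categories ImageNet Roulette drew on): a pretrained classifier can only ever answer with labels from its training taxonomy, so any skew in those categories flows directly into its output.

```python
# Hypothetical sketch: an off-the-shelf ImageNet classifier can only output
# labels from its fixed training taxonomy, so dataset bias becomes output bias.
import torch
from PIL import Image
from torchvision import models

weights = models.ResNet50_Weights.DEFAULT      # trained on ImageNet
model = models.resnet50(weights=weights).eval()
preprocess = weights.transforms()              # the matching preprocessing
labels = weights.meta["categories"]            # the fixed set of label strings

def classify(path):
    img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        probs = model(img).softmax(dim=1)[0]
    best = int(probs.argmax())
    return labels[best], float(probs[best])

# classify("portrait.jpg") returns one of ~1,000 ImageNet categories,
# however poorly those categories describe the person in the photo.
```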

Looking Outward – 11: NFTs

Non-fungible Tokens, NFTs, gave the promise of increased income and a secure line of ownership to digital art creators. The idea was presented as a way for digital artists to reap the benefits of income and ownership that physical artists enjoy. This promised to elevate their work and to reduce copyright infringement by establishing a secure chain of ownership and proof of creation. Thus far, NFTs have failed to deliver on this promise, and in their current state, they are at best an experiment in the application of blockchain technology. At worst, they are a fad or even a fraud.

Jonathan Bailey outlines the rise and promise of NFTs in his article “NFTs and Copyright”, published March 16, 2021 on Plagiarism Today. Although NFTs have existed since 2017, they took the world by storm in 2021, with famous artists such as Beeple, CEOs such as Twitter’s Jack Dorsey, and celebrities such as William Shatner selling NFTs for millions of dollars. The word “non-fungible” means that something is not interchangeable, implying uniqueness. However, on the current NFT exchanges, anything can be tokenized, including URLs, images, and tweets. When something is tokenized, a transaction is created on a blockchain. This makes the token unique, but only the token is unique. The original work that the token is linked to can be duplicated again and again.
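To make “only the token is unique” concrete, here is a deliberately simplified, hypothetical model (not any real NFT standard or marketplace code): the on-chain record is just a unique token ID bound to an owner and a URI pointing at the work, and nothing about minting or selling the token touches the work itself.

```python
# Hypothetical, highly simplified model of an NFT-style token registry.
# Only the (token_id -> owner, uri) record is unique; the content behind
# the URI is ordinary data that anyone can still copy.
from dataclasses import dataclass

@dataclass
class Token:
    token_id: int
    owner: str
    uri: str  # link to the work, e.g. an HTTPS or IPFS address

class TokenRegistry:
    def __init__(self):
        self._tokens = {}
        self._next_id = 1

    def mint(self, owner, uri):
        """Create a new, unique token pointing at `uri`.
        Note that nothing verifies `owner` actually created the work."""
        token = Token(self._next_id, owner, uri)
        self._tokens[token.token_id] = token
        self._next_id += 1
        return token

    def transfer(self, token_id, new_owner):
        """A sale updates only this ledger entry; the underlying file
        is untouched and can be duplicated endlessly."""
        self._tokens[token_id].owner = new_owner
```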

Furthermore, the token does not confer ownership of the original work. When the token is sold, the original content can still be copied or owned by someone else. This creates ethical and copyright issues, which arise when people, or at times bots, tokenize and sell content they do not own. Some NFT exchanges attempt to prohibit unauthorized NFT sales, but many do not; this is copyright infringement and theft. A further ethical problem arises when NFTs are misrepresented to unknowing buyers: many buyers believe they own the artwork or its copyright rather than just a transaction on a blockchain. Ultimately, it is a buyer’s responsibility to inform themselves; however, deliberate deception is ethically wrong.

NFTs could achieve their original promise and provide digital artists with income and protection; however, it would take a different implementation than what is now being pursued. In its current form, the NFT market will require artist and buyer protections or it will likely fade as another passing fad.

Looking Outwards 11

NFTs Are Shaking Up the Art World—But They Could Change So Much More

https://time.com/5947720/nft-art/ 

This article discusses the ways the art world could be altered by the growing popularity of NFTs (non-fungible tokens). It lists many advantages, specifically positioning the creators of NFTs as down-on-their-luck digital artists poised to receive notable benefits. These benefits include an increased ability to profit from one’s art, along with the means to make one’s digital art pieces more exclusive and hence considered more valuable. This, however, completely neglects the ways NFTs are actually used and the culture surrounding them. NFTs are not bought because someone likes the art, or at the very least, that is not the primary reason for buying them. People buy NFTs as a gamble to grow their own wealth, and it’s safe to say that the already wealthy are not the ones who stand to lose the most from that gamble. Those who possess social capital (think CEOs, billionaires, celebrities) can easily mint their own NFTs and profit hugely; the reality is that, contrary to what the article implies, very few digital artists are actually gaining financial stability from the practice. As with other assets on the blockchain, it would be ridiculous to pretend that NFTs don’t serve as a tool to further enrich the wealthy, usually at the expense of those outside the 1%. Especially given the state of capitalism in this country, this is, without a doubt, extremely damaging. Though it is mentioned in the article, it is also worth re-acknowledging that NFTs (and anything else on the blockchain) are bad for the environment, since the computer clusters used to mint and trade them are often powered by fossil fuels. This article is definitely biased in favor of NFTs and perhaps unfairly fails to mention their realities. That being said, the title is right: NFTs could change so much more than the art world, and change it for the worse.

Blog 11

NFTs have created vast spaces in which artists can get well-deserved and long-overdue compensation for their labor. Historically, digital art and media have been free to use on the Internet because of their high shareability and the ability to simply copy and paste. NFTs encourage artists to be more creative, since there is now a space for reward and the Internet is limitless. However, NFTs also have significant societal and environmental downsides. Some works get plagiarized and sold, and copyright law is blurred in this new space. NFTs also require a lot of computing power, and the server farms behind them are often powered by fossil fuels. While there may be an assumption that online art is more environmentally friendly, the opposite is actually true.

https://time.com/5947720/nft-art/

Looking Outwards – 11

Impressions from women in media art

In Anna Gruaber’s “Women in Media Arts: Does AI think like a (white) man?”, Gruaber features different artists/activists and their projects related to AI and feminism. Artificial intelligence is becoming a very important topic with regard to diversity and ethics, and activists have started pointing out the problematic prejudices and distortions of supposedly objective algorithms. While the proportion of women in IT professions is low, the real problem is the biased data sets used in AI and machine learning.

Joy Buolamwini and Timnit Gebru are activists who investigate the prejudice of AI recognition systems. In their project “Gender Shades,” the error rate is significantly higher among women, especially those with darker skin. Such errors matter beyond face recognition: information about skin color is extremely important in the context of medical applications.

Mary Flanagan’s project “help me know the truth” shows that a discriminatory algorithm does not come from any sexist or racist nature of the machine, but from the systemically racist structure of our society. “Help me know the truth” creates a perfect stereotype from a digital self-portrait based on findings from cognitive neuroscience.

Caroline Sinders wants to counteract this bias. Her project “Feminist Data Set” is a multi-year art project that combines lectures and workshops to create interventions in the field of machine learning. Through these workshops, Sinders wants to collect feminist data: artworks, essays, interviews, and books on feminism. The project attempts to introduce data collection as a feminist practice.

https://ars.electronica.art/aeblog/en/2020/04/10/women-in-media-arts-ai/