LO 11

Link to the article: https://ars.electronica.art/aeblog/en/2020/04/10/women-in-media-arts-ai/

The article I looked into is “Women in Media Arts: Does AI think like a (white) man?” The article addresses the problem that when an AI system is designed and tested primarily by white men, it can fail even to recognize a woman standing in front of it. In response to this kind of issue, and to increase the visibility of women working in media art, Women in Media Arts – a comprehensive database devoted specifically to women in media art – was created. It aims to raise public awareness and to highlight role models for girls and women.

Looking Outwards 11: Societal Impacts of Digital Art

The reading I chose for this blog post is “Women in Media Arts: Does AI think like a (white) man?” The article focuses on female digital artists and feminism within the digital art realm. Looking at a number of art pieces and projects created by women, it discusses the ways in which AI and other creative practices shed light on the biases of both creators and consumers. One project by Mary Flanagan, entitled “help me know the truth,” explores the way people reinforce their own biases based on others’ physical appearance. Other projects in the article were created to combat the marginalization and discrimination that result from the lack of diversity among those creating digital tools. One example is facial recognition software, which often fails to accurately recognize people of color. This is just one of the many ways human bias influences artificial intelligence, and there are very real and dangerous consequences if these habits are not broken (for example, medical equipment that is only accurate for white male patients).

Article link: https://ars.electronica.art/aeblog/en/2020/04/10/women-in-media-arts-ai/

LO: Societal Impact of Digital Art

For this week’s blog, I read the article “How Artists Can Bridge the Digital Divide and Reimagine Humanity” by Agnes Chavez. The article talks about how art, technology, and science can become tools to increase digital literacy. Chavez categorizes the digital divide into a “first-level” and a “second-level” divide and argues that solving the problem on both levels is essential. The first level is having affordable access to information and devices; the second level is having the knowledge needed to become a producer of digital content. The PASEO festival is an example of how art and technology can support learning and create a positive attitude towards digital media, bridging the divide at the second level. Later, the author discusses creating shareable digital resources and implementing the STEMarts model to support access to digital content. The article outlines multiple methods for bridging the digital divide and highlights the importance of digital technology in connecting people and communities across economic sectors, which I found inspiring.

Chavez, A. “How Artists Can Bridge the Digital Divide and Reimagine Humanity.” Available at: https://www.arts.gov/impact/media-arts/arts-technology-scan/essays/how-artists-can-bridge-digital-divide-and-reimagine-humanity [Accessed November 12, 2021].

Looking Outwards-11

The article I read discusses the digital divide: the phenomenon of poorer people and areas having less access to functional computers, smartphones, and the internet, all of which are becoming more and more necessary in modern education and work. This has been especially relevant during the Covid-19 pandemic; with so many people working or attending school from home, those without internet access or an internet-capable device were severely impacted. The article discusses an exhibition at a museum in Barcelona that focuses not only on the digital divide but also on how it disproportionately affects women and ethnic minorities. As the world continues to become more reliant on digital access, the effects of the digital divide will only grow worse.

https://www.reuters.com/article/us-health-coronavirus-tech/spanish-art-show-spotlights-hidden-digital-divide-in-pandemic-idUSKBN28S0IC

Looking Outwards 11: Societal Impacts of Digital Art

As technology develops, the definition of art and its value keeps evolving. The idea of craftsmanship and delicate human effort is now being replaced by manufacturing and computer technology that can create similar features within seconds from a few lines of code. When I read Sebastian Smee’s criticism of Beeple’s artwork, I could understand why he was so critical of a digital artwork being valued more highly than Michelangelo’s. But I think there should be different standards for work that is digitally produced and work that is physically produced. They carry different values: for example, digital work can showcase newly developed technologies and test their uses in creative ways, even if it lacks the delicacy and contextual depth of traditional art.

https://www.washingtonpost.com/entertainment/museums/beeple-digital-artwork-sale-perspective/2021/03/15/6afc1540-8369-11eb-81db-b02f0398f49a_story.htm

Women in AI: LO-11

https://ars.electronica.art/aeblog/en/2020/04/10/women-in-media-arts-ai/

I started off this week reading about women in media arts. Sadly, the first video, on Gender Shades, did not surprise me in its content. What did surprise me, however, was how badly the computer system failed at identifying darker-skinned women: first, at recognizing them as people at all, and second, at identifying their gender correctly. The digital world needs to grow to be neutral and inclusive, and if testing reveals a gap in recognizing people of a certain gender or race, there is a problem. The computation itself is biased, not by the machine but by the programmers. I feel that testing can sometimes be overlooked when a system seems to work fine on some individuals, but this is a flaw we see commonly, not just in AI but also in women’s healthcare.
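The kind of gap the Gender Shades audit surfaced can be made concrete with a disaggregated evaluation, that is, reporting error rates per demographic group instead of one overall number. Below is a minimal sketch of that idea in Python; the group names and test records are hypothetical stand-ins, not data or code from the actual study.

```python
# Minimal sketch of a disaggregated evaluation: error rate per demographic group.
# The records below are hypothetical stand-ins for a labeled test set.
from collections import defaultdict

# Each record: (group, true_label, predicted_label)
test_results = [
    ("lighter-skinned men",   "man",   "man"),
    ("lighter-skinned women", "woman", "woman"),
    ("darker-skinned men",    "man",   "man"),
    ("darker-skinned women",  "woman", "man"),    # misclassification
    ("darker-skinned women",  "woman", "woman"),
    ("darker-skinned women",  "woman", "man"),    # misclassification
]

totals, errors = defaultdict(int), defaultdict(int)
for group, truth, prediction in test_results:
    totals[group] += 1
    if prediction != truth:
        errors[group] += 1

# A single aggregate accuracy number would hide the disparity that shows up here.
for group, n in totals.items():
    print(f"{group}: {errors[group] / n:.0%} error rate ({errors[group]}/{n})")
```

An overall accuracy score over this same data would look acceptable; only the per-group breakdown exposes the failure mode the videos describe.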

In the Women Reclaiming AI videos, I found the idea of a voice assistant similar to Alexa particularly interesting. I wonder where it could go if those types of questions could be asked by younger girls. Questions about womanhood that could be answered when they need help or clarification could, I think, be pretty revolutionary. I have never tried to ask my Alexa any of those types of questions, but I wonder how current AI on the market would answer. I had never really considered how AI might be biased in speech technology, but I could imagine that, if anything, it probably holds the most bias.

In the European Platform video, I also thought it was interesting to consider how AI is so often imagined in blue, a very masculine color, and how women can find themselves in that space. Something I came across recently concerning women in coding was the story of Margaret Hamilton and her role in getting men to the moon. There is a lot of gender bias in her story: people refused to listen to her and asked why she was even at NASA, when her programming helped get the astronauts home. She was largely overlooked for her astounding work, and I don’t think many people ever come across her name.

LO: NFTs

I read the Plagiarism Today article, which discusses how copyright relates to NFTs. I really like the example used in the article that equates NFTs to limited edition posters. Owning an NFT doesn’t mean no one else can have that art, and it doesn’t give the owner permission to use it for anything they like. An issue right now is that some people are selling art they don’t hold the copyright for. Should this be allowed? NFTs can be a great source of revenue for artists, but if other people are selling their work, that’s theft. Marketplaces are attempting to regulate this kind of problem, but everything is so new that they haven’t had time to catch up. Similarly, there are not yet laws or court rulings that specifically address NFTs, and until that happens a huge amount of fraud could occur in NFT trading. Right now people are making a lot of money from NFTs, so it will be interesting to see how this trend continues to evolve.

https://www.plagiarismtoday.com/2021/03/16/nfts-and-copyright/
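To make the “limited edition poster” comparison above concrete, it helps to look at what an NFT actually records. The sketch below is a hypothetical illustration of an ERC-721-style token and its metadata, not any marketplace’s real data model; the addresses and URIs are placeholders.

```python
# Hypothetical illustration of what an ERC-721-style NFT typically records.
# Addresses and URIs are placeholders, not real on-chain data.

nft_record = {
    "contract": "0x<nft-contract-address>",     # the smart contract that minted the token
    "token_id": 1234,                           # the unique token number
    "owner": "0x<buyer-address>",               # whoever currently holds the token
    "token_uri": "ipfs://<hash>/metadata.json"  # points to metadata, not the artwork itself
}

# ERC-721 metadata conventionally just names and links the work.
metadata = {
    "name": "Example Artwork",
    "description": "A digital work referenced by the token above.",
    "image": "ipfs://<hash>/artwork.png"
}

# Nothing here transfers copyright: anyone can still view or copy the file at the
# image URI, and reproduction rights stay with the artist unless a separate
# license says otherwise, which is exactly the limited-edition-poster point.
print(nft_record["token_uri"], "->", metadata["image"])
```

In other words, the token proves provenance of a pointer to the work, while the underlying rights are governed by ordinary copyright law, which is why the questions the article raises remain so unsettled.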

LO-11

The technological field has been, and still is, filled with stereotypes and biases surrounding people’s gender, race, and sexuality. Only about 20% of professional computer scientists are women, and only 5.8% are African American. Even though the 21st-century US appears to be a progressive utopia, there is still a lot of stigma around technology and its developers.

For this week’s Looking Outwards I chose an article by Meilan Solly called “Art Project Shows Racial Biases in Artificial Intelligence System.” The article addresses racial bias in ImageNet Roulette, an artificial intelligence tool developed by artist Trevor Paglen and A.I. researcher Kate Crawford. The tool was programmed to identify characteristics of a person from a photograph: a photo of John F. Kennedy would be labeled “Politician,” and a photo of Shakira would be labeled “Singer.” That seems impressive, right? But not everything is so simple, so perfect, so equal. When a young African American man, Tabong Kima, uploaded a photograph of himself and his friend, ImageNet Roulette labeled him “Wrongdoer, Offender.” “I might have a bad sense of humor, but I don’t think this is particularly funny,” Kima said on his Twitter page. Some of the tool’s labels were descriptions such as dog, Boy Scout, or hairdresser, while others were rapist, adulteress, loser, and so on. The program seemed to identify white individuals largely in terms of occupation or other functional descriptors, but it classified those with darker skin solely by race and skin color: when an African American man uploaded a picture of himself, the tool could only describe him as “Black,” and when an East Asian woman uploaded a photo of herself, it described her as “Gook, Slant-eye.” The bias and racism here are crystal clear.

The tool was taken off the Internet on September 27th, 2019, because so many offensive and upsetting terms were being used to describe human beings. “We want to show how layers of bias and racism and misogyny move from one system to the next,” Paglen tells the New York Times’ Cade Metz. “The point is to let people see the work that is being done behind the scenes, to see how we are being processed and categorized all the time.” The creators told the press that the point of the project was to expose this bias, but my question would simply be: why? What for? Not everyone knew the point of the tool, so it upset and offended a lot of people online when racist slurs popped up as their descriptions. The article points out how ImageNet Roulette went wrong and how it only increased the stigma around race and technology rather than reducing it. The bias obviously exists, and we shouldn’t prove it by creating more racist technology – we should fix it!

How artificial intelligence classified Julia Carrie Wong’s headshot. Photograph: ImageNet Roulette


Solly, M. (2019, September 24). Art Project Shows Racial Biases in Artificial Intelligence System. Smithsonian.com. Retrieved November 13, 2021, from https://www.smithsonianmag.com/smart-news/art-project-exposed-racial-biases-artificial-intelligence-system-180973207/. 

Looking Outwards 11: Societal Impacts of Digital Art

Sebastian Smee’s article “Beeple’s digital ‘artwork’ sold for more than any painting by Titian or Raphael. But as art, it’s a great big zero.” discusses the recent rise of NFTs. An NFT is a unique digital token that allows people to bid on and claim ownership of a digital file. The article criticizes how the value of NFTs comes not from the work itself but from market manipulation. For instance, graphic designer Beeple recently sold his work “Everydays: The First 5000 Days” for an insanely high price of $69.3 million, when the piece was really just a collage of colorful images. This goes to show the extent to which people are willing to go for the hype, and how drastically the market for artwork has changed.

“Everydays: The First 5000 Days” is a digital file by the artist Beeple, who has been posting images every day since 2007. (Christie’s Images Limited 2020)

Looking Outwards – 11

This article introduces how AI has handled the skin tones and hairstyles of African American people, and how new algorithms and programs are needed to better represent their physical features. After reading it, I realized the importance of addressing ethics and inclusion in modern-day technology. As the tech industry becomes more and more advanced, we have begun to develop programs that can tackle problems much as a human would. Tools like facial recognition and computer-generated art are beginning to create and interpret different races in different ways. There are many races in this world, all with unique characteristics, and we should do everything we can to ensure that our technology carries no bias or subjectivity toward any one race.

Article: AI & Creativity: Addressing Racial Bias in Computer Graphics