Rabbit Hole #2

An Introduction to Data Privacy in the Arts


In 2021, TikTok updated its privacy policy to allow the collection of biometric data on its users, including faceprints and voiceprints. Rather than explicitly informing users of the change, the company simply announced a “privacy update” upon opening the app. Once people found out what the update entailed, they grew concerned, and rightfully so. It marks a significant shift from companies collecting behavioral data on their consumers to something far more invasive, done without consent. Only 36% of Americans trust tech companies’ use of facial recognition technology, and public trust in big tech has been steadily falling in the United States. Yet most of us still click “accept” on any website’s Terms & Conditions without actually knowing what we agree to. There is an apparent disconnect between what we expect from US businesses and what we blindly agree to.

What Is Data Privacy?

Everyone has data – name, age, gender, birthday, interests, browsing habits, and so on. These aspects of a person’s identity fall under the category of personal data. That data is valuable to big corporations, whether they sell it to fund the free services they offer consumers or buy it to advertise their products to potential consumers in a tactical, albeit intrusive, way. In the notorious 2018 Senate hearing on Facebook, Mark Zuckerberg’s retort, when asked how Facebook remains profitable when its service is free, gave us a glimpse into an obvious yet disturbing truth behind the operations of a big tech company: “Senator, we run ads,” followed by a cheeky smirk. In the capitalist sphere we’ve grown all too familiar with in the US, the purpose of data has become the predictive modeling of human behavior.

Data privacy can be defined as the careful handling of data throughout its lifecycle, from creation to deletion, based on its relative importance. The field has grown exponentially in the digital age, covering data management, governance, compliance, consents, notices, and regulatory obligations. The Health Insurance Portability and Accountability Act, better known as HIPAA, is a policy in the healthcare sector that protects against institutions misusing patients’ medical and health data. It prevents healthcare organizations from giving out information on patients’ physical and mental health and requires that the healthcare and health insurance industries protect this information from fraud or theft. The policy prevents insurance companies, pharmaceutical companies, and doctors’ offices from directly targeting people with personalized ads for products or services. HIPAA is one of the few sector-specific federal policies in the United States that protect people from having their personal information abused or exploited by large corporations. With only three states having adopted some form of data privacy law, it’s unclear how soon Congress will propose bipartisan data privacy legislation. For the time being, it’s up to the individual states.


It’s no surprise that the United States government is behind on data privacy regulation. As with many contemporary issues, it often takes a while for the government to build the bipartisan cooperation needed to implement change. Currently, the European Union is leading the data privacy venture with the General Data Protection Regulation (GDPR). Its creators intended it to be the toughest privacy and security law in the world to date. Much like the rights to liberty or free speech, the EU regulation is grounded in the principle that “the protection of natural persons in relation to the processing of personal data is a fundamental right.” Under this regulation, EU citizens have the right to ask companies how their personal data is collected, stored, and used. They also have the right to request that their data be deleted from a company’s database (or CRM). Companies must obtain consent from a consumer before collecting and storing their data. With the exception of HIPAA, there is no centralized regulation that protects US citizens’ personal information from being abused by companies.

The closest piece of legislation we have exists only in California, whose government passed the California Consumer Privacy Act (CCPA) in 2018. Under the CCPA, residents have rights similar to those established in the GDPR, with a few differences; as the name implies, the CCPA applies only to California residents. Similar pieces of state legislation were passed in Virginia (the Consumer Data Protection Act) and Colorado (the Colorado Privacy Act), both in 2021. The biggest hurdle appears to be the extent to which Democrats and Republicans want big companies penalized for mishandling personal consumer data. While Democrats want to enable consumers to hold businesses accountable for abusing their data, Republicans opt for more business-friendly legislation. Without centralized federal regulation, a patchwork of state data privacy laws makes the issue far more complicated and leaves consumers unsure or unaware of their rights.

The Artist Experience

In 2016, Mozilla partnered with Tactical Tech to bring the Glass Room to London and New York City. This immersive exhibit, made to look like a tech store, was created to teach people about who is collecting their data online and why. The Glass Room attempts to demystify the world of data privacy in a vast digital world where one can lose themselves without knowing it. It does so by getting participants to think about how they interact with online platforms and helps visualize and contextualize otherwise abstract ideas.  

“To move through the Glass Room is to be reminded of the many ways we unwittingly submit ourselves and one another to unnecessary surveillance, with devastating consequences.”

– The New York Times

As with any other complex social issue, artists have been quick to act as messengers of social and moral concern. The beauty of the arts is that they can convince people to care about something. Humans are inherently visual creatures, but it sometimes takes more than showing an audience why something is a problem. Some artists have gone so far as to make an audience live a problem, either in an immersive exhibition or through some “alternate universe Black Mirror” experience. One such project is German artist Tobias Leingruber’s Facebook ID card. In 2012, Leingruber created Facebook ID cards for guests at an art event, a concept partly influenced by George Orwell’s 1984. Leingruber’s project offers commentary on how social media pervades our everyday lives. If we allow digital platforms to consume our identity little by little, where is the line drawn? And given the importance we place on having an online presence, is it really unimaginable that one day we will base our value on the digital content we so meticulously curate to achieve some level of social status?

Tobias Leingruber’s Facebook ID Card

Netflix’s Black Mirror episode “Nosedive” also played with this idea in 2016: a woman’s life unravels over her slowly deteriorating “social media score” as she desperately clings to the small amount of social standing she has. While these are two extreme examples (thankfully, in the ten years since, Leingruber’s fake project hasn’t become a reality), the main question that comes to mind is: what happens to us when nothing is private anymore? In the United States, citizens are well aware of their rights as established by the Constitution, yet when a significant data breach occurs at a big company, we aren’t collectively outraged by it. This isn’t to say that civil and constitutional rights aren’t essential. But as the EU established in the GDPR, “the protection of natural persons in relation to the processing of personal data is a fundamental right.” As more of our personal lives become exposed and our habits become exploited by big business, how does the nonprofit sector factor into all this?

Arts Nonprofits’ Adaptability

In a society where technological trends feel almost ephemeral, it’s understandable why many nonprofit arts organizations are slow to adopt up-to-date data collection, storage, and management technologies. Several factors come into play: lack of funding, tech-hesitant boards of directors, and a general lack of awareness of systems that would make the work more efficient. Even many for-profit entities are behind the curve when it comes to having a data strategy. But unlike for-profit businesses, nonprofits could be using data from their communities to create more effective programming tailored to their target audiences’ needs. These efforts are crucial to achieving their missions. One author noted that when organizations lack the data architecture to fully leverage the data available to them, the real opportunity cost is innovation.

Knowing how to use the data is only part of the equation, though.

As if investing in data collection and management alone weren’t a whole new world for nonprofits, we now add the complex issue of consumer privacy into the mix. The lack of US regulation around data privacy raises the question: what responsibility do the private and nonprofit sectors have in this realm? While there is little US legislation for the private sector, there is even less for the nonprofit sector, leaving organizations even more in the dark. Yet data has become more crucial than ever for nonprofits in understanding their target demographics, creating appropriate programming, and producing effective marketing materials. The lack of data architecture and policies leaves nonprofit organizations equally, if not more, vulnerable to data breaches. The Shakespeare Theatre of New Jersey experienced a ransomware attack in 2019, in which hackers disabled the organization’s access to its ticketing system and patron database. Several other breaches have occurred in which hackers gained access to an organization’s funder information, costing organizations tens of thousands of dollars and practically forcing some to shut down.

Because nonprofits don’t operate as traditional “businesses,” and because these kinds of data breaches don’t typically make headlines, this may seem a trivial issue. But as entities that hold personally identifiable information on patrons, donors, employees, and board members, nonprofits have just as much of an obligation to protect their data as for-profit businesses do. The following are starting points nonprofits should strongly consider in developing a long-term strategy for data storage and protection:

  • Understand the legislation (discussed above) that exists to develop protection plans
  • Form a data and cybersecurity governance committee
  • Embrace cloud-based storage software
  • Properly educate employees on data security
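The points above are policy-level, but the last two also have a concrete technical side. As one small illustration (the function name, secret key, and record fields here are hypothetical, not drawn from any source above), a nonprofit preparing patron data for a shared analytics export could pseudonymize direct identifiers, so that the exported dataset contains no raw email addresses even if it is later leaked or shared too widely. A minimal sketch in Python, using only the standard library:

```python
import hmac
import hashlib

# Hypothetical secret kept outside the dataset (e.g., in a cloud key vault),
# never hard-coded in a real deployment as it is here for illustration.
SECRET_KEY = b"replace-with-a-real-secret"

def pseudonymize(email: str) -> str:
    """Replace a patron email with a stable, non-reversible token.

    HMAC-SHA256 with an organization-held secret means the same email
    always maps to the same token (so counts and joins still work),
    but the token cannot be reversed or recomputed without the key.
    """
    return hmac.new(SECRET_KEY, email.strip().lower().encode(), hashlib.sha256).hexdigest()

# A hypothetical patron record being prepared for an analytics export:
record = {"email": "patron@example.org", "tickets_purchased": 4}
safe_record = {
    "patron_id": pseudonymize(record["email"]),  # 64-char hex token
    "tickets_purchased": record["tickets_purchased"],
}
```

This is only a sketch of one practice; a full program would also cover key rotation, access controls, and retention schedules, which are governance questions for the committee suggested above rather than code.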


The arts can be used to inform the public about this issue and make it understandable through a more tangible medium. With the current lack of legislation in the United States, the arts can bring awareness to this issue and empower the public to push the federal government to implement change. All in all, nonprofit arts organizations have the same responsibility as for-profit and government entities to protect personal consumer data and use it ethically. The relationship between people, their data, and business is significantly unbalanced. But artists and arts organizations can help restore the balance by acting in the best interests of the public and using legislation such as the GDPR as a reference.


Alexander, Alistair. “The Glass Room: Big data, privacy and interactive art.” Mob Lab. April 28, 2018.

Crittenden, Elizabeth. “Let’s Talk: TikTok’s Privacy Update And Incubator For Black Creatives, Spotify’s Speech Recognition Technology, And More.” Arts Management & Technology Laboratory. June 22, 2021.

DalleMule, Leandro and Thomas H. Davenport. “What’s Your Data Strategy?” Harvard Business Review. May-June 2017.

“Data Privacy – Definitions, Importance, Legislations / Privacy laws.” Data Privacy Acts. May 12, 2020.

Fang, Jiashun. “What Makes Facial Recognition Controversial?” Arts Management & Technology Laboratory. February 13, 2020.

Frankfurt, Tal. “How To Avoid Security Breaches In The Nonprofit Sector.” Forbes. March 31, 2021.

Fried, Ina and Mike Allen. “Exclusive: Trust in tech craters.” Axios. March 31, 2021.

“General Data Protection Regulation (GDPR).” Accessed April 6, 2022.

Jehl, Laura and Alan Friel. “CCPA and GDPR Comparison Chart.” Baker Hostetler LLP. Accessed April 30, 2022.

“Legislative Preview: Data privacy.” Congressional Quarterly Magazine. February 14, 2022.

Leingruber, Tobias. “FB Bureau Berlin: Get Your Fb Identity Card!!” Free Art and Technology Lab. February 24, 2012.

NBC News. “Senator Asks How Facebook Remains Free, Mark Zuckerberg Smirks: ‘We Run Ads’ | NBC News.” April 10, 2018, 1:00. YouTube video.

“Nosedive.” IMDb. Accessed April 30, 2022.

Pardes, Arielle. “What Is GDPR and Why Should You Care?” Wired. May 24, 2018.

Pyne, Lydia. “A Data Artist’s Guide to Putting People (and Privacy) First.” Hyperallergic. May 6, 2021.

Rippy, Sarah. “Colorado Privacy Act becomes law.” IAPP. July 8, 2021.

Rippy, Sarah. “Virginia passes the Consumer Data Protection Act.” IAPP. March 31, 2021.

Schenker, Dylan. “Artist Explores Online Identity and Privacy With Facebook ID Cards.” VICE. March 5, 2012.

State Of California Department of Justice Office of the Attorney General. “California Consumer Privacy Act (CCPA).” Accessed April 22, 2022.

“The Week in Breach: 12/04/19 – 12/10/19.” DeckerWright Corporation Blog. December 11, 2019.

“What is CRM?” Keap. Accessed May 2, 2022.

Younanzadeh, Emanuel. “Why Your Company’s Data Architecture Is More Important Than the Data Itself.” Forbes. April 13, 2022.
