there is no incognito tab. - contextual research

Sohee Cho
// designer in digital humanities
// creative coder
// perpetual schemer
Thesis Faculty
Richard The
Anna Harsanyi
John Sharp
Barbara Morris

This thesis research began from my curiosity about the effects the current surveillance economy has on individual digital agency and identity. In the final stages of mapping out ideas for my potential project, I was most interested in continuing my previous work on the objectification and commodification of adolescents in the digital space. That project was a reflection on my personal experience with the overwhelming and relentless spread of unwanted content on social media platforms. Working on it, I was able to reckon with the amplified sense of violation I felt through the proliferation and digital permanence of such content, as well as the heightened trauma of losing my agency to its possible resurgence. It was also during this time that I started questioning my relationship with the Internet, automated technologies, and the power structures created by private and public actors in the digital realm.

Prior to my studies in Design and Technology, I was not fully aware of the extent to which I had been unknowingly yet actively participating in the construction of the surveillance economy. Like most people, I enjoyed the convenience that new technologies offered. Within seconds, I could look up the best restaurants near me, share pictures with friends across the globe, or even find the next potential love of my life. Little did I realize that it was my privacy and my data profile that I was trading for these simple luxuries. The technology industry feeds on the billions of data points generated through these mundane activities and uses them to shape not only our personal decision making but also the very grounds on which we form our thoughts and decisions.

In her book Algorithms of Oppression, Safiya Umoja Noble describes the Internet as “the most unregulated social experiment of our time.”[1] It was precisely this unregulated aspect of the experiment that I wanted to examine. The Internet is the arena where actors from every sector come into play to collect vast amounts of data and manipulate behaviors, opinions, and access to information. In doing so, they create a hegemonic narrative that too often harms marginalized people who do not fit into the constructed realities of data-driven algorithms built by a privileged few. By dissecting the power dynamic at play within Google’s search algorithm, Noble draws attention to its design intentions and to the question of whom the algorithm truly serves. Her meticulously documented Google searches highlight the deeply rooted racism and sexism that power seemingly “‘neutral’ and ‘objective’ decision-making tools.”[2]

It was Noble’s series of experiments with the search algorithm that prompted me to question a reality I had not questioned before. I went on Google and searched for ‘Asian girls,’ only to find a first page of results as disappointing as what Noble had encountered nine years earlier, when she began her journey of unveiling the commodification and sexualization of Black girls in the search engine. The finding itself was not surprising, as such representations of minority women are endemic to our society; the real kick in the stomach was what it revealed about how these tech giants classify people and ultimately shape our access to information. By continuously feeding the hegemonic, white-male-centric narrative, companies like Google are not only profiting economically but also heavily influencing the information landscape, and essentially each and every one of us.

In his seminal text Panopticism, Michel Foucault analyzes Jeremy Bentham’s Panopticon as an apparatus of total surveillance made possible by an imbalance of power: an asymmetry of information, the total visibility of those being surveilled, and the unverifiability of the watching power.[3] Though he is examining a literal prison system, Foucault makes a point that applies deeply (perhaps even more so) to the modern surveillance system that governs our online presence. Technology is so deeply embedded in our everyday lives that it has become a genuine challenge to go a day without our smartphones, laptops, and other devices. Even going a few hours without looking at our phones gives us a small sense of accomplishment at having resisted the urge to surrender to them. We give away our personal information almost every second of the day, from the moment we wake up to stop our phone alarms to the few minutes of nighttime scrolling before falling asleep. This is further aggravated by the phenomenon Tim Wu calls ‘attention-whoredom,’ in which we willingly publicize our private lives for instant gratification.[4]

Yet we are not privy to what we are giving away in exchange for the services we use, often free of charge, and the likes we crave. We are not aware of the surveillance states behind the screens of our devices that not only gather information but also enforce structures of power, deliberately meting out rewards for some and punishments for others. In other words, we barely have any idea of who is using our personal data, or where, why, and for what purpose. Simone Browne, explaining Kevin Haggerty and Richard Ericson’s concept of the ‘surveillant assemblage’ (the collection of data through advanced surveillance systems), best captures the visual image of this datafication process. Borrowing the words of Haggerty and Ericson, she describes the condition of the surveilled as the “human body [being] ‘broken down by being abstracted from its territorial setting and then reassembled elsewhere’ to then serve as virtual ‘data doubles.’”[5] The total state of surveillance that Foucault delineates then comes from our ignorance of our own bodies being broken down, and from our lack of access to the virtual data doubles that live out parallel lives somewhere in the algorithmic black box.
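To make the abstraction concrete for myself, I sketched what this breaking down and reassembling might look like in code. The following Python sketch is purely speculative: the sources, categories, and labels are invented stand-ins, not any platform’s actual schema.

```python
# A purely illustrative sketch of Haggerty and Ericson's 'surveillant
# assemblage': discrete, decontextualized events are abstracted from a
# life and reassembled elsewhere into a 'data double'.
# All sources and category labels here are hypothetical.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Event:
    source: str    # which system observed the behavior
    category: str  # the label that system assigned to it

# Fragments of a day, each stripped of its human context.
events = [
    Event("maps_app", "location:new_york"),
    Event("card_issuer", "purchase:groceries"),
    Event("streaming_service", "viewing:drama"),
    Event("search_engine", "query:restaurants"),
    Event("social_platform", "engagement:late_night"),
]

def reassemble(events: list[Event]) -> dict:
    """Reassemble scattered data points into a flattened 'data double'."""
    return dict(Counter(e.category for e in events))

# The 'double' an advertiser might see: a bag of labels, not a person.
print(reassemble(events))
```

The point of the exercise is in the last line: once the events are counted, the territorial setting of each one is gone, and only the aggregate profile remains.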

I became intrigued by this idea of the existence of my ‘data double.’ I began to wonder what kind of person my ‘data double’ was projecting to different agencies and whether it felt like a fair estimation of the person I really am. I started looking more closely at the ads I was being targeted with on the social media platforms I was active on, the never-opened emails sitting in my spam box, and even the physical mail I was getting (shout-out to Capital One for a hell of a determined campaign!). Then I speculated about the kinds of assessments the surveillant assemblage had made, and would make, of me. On the surface level it was quite easy to guess: I was without doubt being profiled as an Asian female in her twenties living in New York who routinely made purchases at Trader Joe’s, Uber Eats, and Everlane and obsessively binge-watched shows on various streaming platforms. I was sometimes targeted with irrelevant ads depending on the searches I made on certain days, but more often I found myself creeped out by content that hit too close to home.

After a couple of days of observing my ad activity, I began to question what all of this, on top of the theoretical readings I had done, meant to me on an individual level. I wanted to further examine how these systems of power operate in concert with one another to create a reality that runs on pseudo-consent. More specifically, I needed a better understanding of the coerced symbiotic relationship with digital infrastructures that I was now a part of. That led to a desire to look into my own data exhaust created by these algorithms and to think about what I could do with a personal data profile I had never intended to create or bargain with. In other words, I wanted to fix this displaced ownership of my data.

I found the central impetus for my project while reading Data Feminism by Catherine D’Ignazio and Lauren F. Klein. The two authors draw insights from intersectional feminism to reimagine the unjust status quo of our surveillance economy, proposing seven principles that set out to challenge unequal power structures:

  1. Examine power.
  2. Challenge power.
  3. Elevate emotion and embodiment.
  4. Rethink binaries and hierarchies.
  5. Embrace pluralism.
  6. Consider context.
  7. Make labor visible.[6]

The first principle is where my research and project currently lie. We need to understand the power that governs and shapes our world by examining it first. It is crucial to understand what these power structures are stripping away from us and to figure out how to reverse engineer that process of deconstruction. As we constantly allow ourselves to be disassembled and reassembled (most of the time without our knowledge), we are subjected to a process that not only dehumanizes us but also provides the means to inflict harm on the marginalized.

My thesis project maps out that vast examination of power but grounds it in personal experience in the hopes of making it more digestible. Throughout the process of making this book, I pose the question of the role of the self not only as the subject of such surveillance but also as a complicit participant in creating the surveillance apparatus. I also speculate on the types of artifacts that can be made from the personal data points I have already put out into the world. In doing so, I hope to uncover how these algorithms affect me in direct, material ways. Ultimately, the goal is to understand and reflect on where I fit in this ongoing narrative of the surveillance economy, not only as an individual but also as a constituent of socially marginalized groups.

there is no incognito tab. seeks to manifest my research in a tangible physical form. It is a series of visual poems that explore my relationship with the power structures that perpetuate the cycle of digital surveillance in this age of the data economy. The core questions I want to answer center on the loss and flattening of my digital identity as corporate and governmental institutions algorithmically scatter me into multitudes of data points. By delving into the personal data exhaust created by a diverse array of systems and algorithms, I want to understand what gets stripped away and to speculate on a series of exercises that undo the scattering and rehumanize the practice of datafication.
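As a gesture toward what one such exercise might look like, here is a speculative Python sketch that takes the flattened labels of a data double and re-situates them in first-person language. Every label, template, and phrasing is an invented stand-in for illustration, not the method of the finished work.

```python
# A speculative sketch of one 'undoing' exercise: turning the flattened
# labels of a data double back into first-person lines of a poem.
# All labels and templates below are invented for illustration.
import random

data_double = ["location:new_york", "purchase:groceries",
               "viewing:drama", "query:restaurants"]

templates = [
    "they say i am {label}, but i remember the walk there",
    "{label} is a point on their map, not the morning it came from",
    "i was {label} for a moment; the rest of me never uploaded",
]

def rehumanize(labels: list[str]) -> str:
    """Re-situate each datafied label in a line of first-person language."""
    lines = []
    for label in labels:
        # Strip the machine prefix ("location:", "purchase:", ...) and
        # keep only the human-readable remainder.
        noun = label.split(":", 1)[1].replace("_", " ")
        lines.append(random.choice(templates).format(label=noun))
    return "\n".join(lines)

print(rehumanize(data_double))
```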

1. Safiya Umoja Noble, Algorithms of Oppression: How Search Engines Reinforce Racism (New York: New York University Press, 2018), 6.
2. Noble, 2.
3. Michel Foucault, Discipline and Punish: The Birth of the Prison (New York: Vintage Books, 1995), 207-208.
4. Tim Wu, The Attention Merchants: The Epic Scramble to Get Inside Our Heads (New York: Penguin Random House, 2016), 306.
5. Simone Browne, Dark Matters: On the Surveillance of Blackness (Durham: Duke University Press, 2015), 16.
6. Catherine D’Ignazio and Lauren F. Klein, Data Feminism (Cambridge, MA: MIT Press, 2020), 17-18.