We Keep Our Data Safe
Resisting biometric surveillance requires more than opting out.
Having data about our bodies tracked is so much a part of our daily lives that we sometimes forget it can be used against us.
When the Supreme Court overturned Roe v. Wade in 2022, a deluge of articles warned people to delete their menstrual tracking apps. While these technologies can be useful for those seeking (or avoiding) pregnancy, the apps have come under fire in the past for sharing user data with social media sites. With the changing legal landscape, privacy advocates pointed out that this data could be used to criminalize people seeking abortions. Proponents of forced birth also recognize this potential, and are working to ensure that such data remains open to search warrants. Turning this data into a site of political contestation raises questions of collective response beyond the limits of bodily autonomy.
It's become normalized for data about our bodies and actions to be collected, quantified, and sold on the market. Facial recognition and body scans have become routine conditions of travel, and we hand over health information from blood pressure to sexual activity in order to access care. Colleges and universities are subjecting students to biometric exam proctoring, while Amazon requires delivery drivers to consent to AI camera monitoring. Even attending a concert or a sporting event can subject us to facial scanning and data mining. Given this landscape, the notion of "opting out" is starting to feel impossible.
So many of these systems have become necessary, and it can be useful to have this type of information about ourselves. At some point the cost of having our behaviors predicted by an algorithm can start to feel worth it. For someone struggling to get pregnant or desperate to avoid it, the exchange of privacy for the information needed to manage their reproduction may make a lot of sense. During these sharp moments of political crisis, however, we start to feel how thin the membrane can be between the benefits and the criminalizing potential of these technologies.
New technologies aren't simply neutral tools; they're created by and for existing power structures. Predictive policing, for example, uses existing data sets of past crimes to predict where crime will occur and who is likely to be a perpetrator. That existing data is extracted from those already criminalized, creating a self-fulfilling feedback loop. In this way, as poet and scholar Jackie Wang discusses in her 2018 book, Carceral Capitalism, the datafication of bodily control meant that "biological and cultural racism was eventually supplanted by statistical racism," converting white supremacist social structures into algorithms.
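The feedback loop can be made concrete with a small simulation. The sketch below is purely illustrative, with hypothetical district names and numbers, not a model of any real system: two districts have identical underlying offense rates, but because patrols are allocated according to a skewed arrest history, the skewed record keeps "confirming" itself.

```python
# Toy illustration of the predictive-policing feedback loop described above.
# All names and numbers are hypothetical assumptions for this sketch.
import random

random.seed(0)

# Two districts with IDENTICAL underlying rates of detectable offenses.
TRUE_RATE = {"district_a": 0.3, "district_b": 0.3}

# But the historical arrest record is skewed, because district_a was
# patrolled more heavily in the past.
arrests = {"district_a": 60, "district_b": 20}

for year in range(10):
    total = sum(arrests.values())
    # The "predictive" step: allocate 100 patrols in proportion to past arrests.
    patrols = {d: round(100 * n / total) for d, n in arrests.items()}
    # Arrests can only happen where patrols are sent, so each year's new data
    # mirrors patrol intensity rather than the (equal) underlying offense rates.
    for d, p in patrols.items():
        arrests[d] += sum(random.random() < TRUE_RATE[d] for _ in range(p))

# The initial skew is reproduced year after year: district_a keeps a roughly
# 3-to-1 lead in recorded crime despite identical true rates.
print(arrests)
```

Nothing in the loop ever consults the true offense rates; the algorithm only ever sees, and reinforces, the record produced by past enforcement.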
The question here is what to do with these technologies when they're used to criminalize, coerce labor, and extract profit, and whether they can be repurposed or should be dismantled. For many, this can mean relying on individual hacks to evade or confuse surveillance. There are limits, though, to relying solely on these autonomous strategies if we're thinking about building the conditions for collective liberation. Taking a longer view of the logics and structures of our data-saturated society can help us better understand what got us here, and which strategies can be used to avoid reifying these same systems.
Tracking Bodies All the Time
The specific possibilities of AI-generated surveillance and mobile health apps may not have existed before now, but the logics and parameters on which these technologies rely have a lengthy history. Many of the concepts used to build our current system were codified in the early 20th century as part of the massive collection of biometric data that fueled the eugenics movement. Eugenics rests on collecting extensive data about the body and measuring it against an ideal "norm." For those interested in tracing a critical history from the eugenics movement to our current moment, this suggests a continuity in two major goals of biometric data collection: measuring who is dangerous and determining how labor or profit can be extracted from people.
The academic branch of eugenics, often tied to Francis Galton, who coined the term, heavily shaped emerging fields like psychology, social work, and criminology. Eugenicists invented statistical concepts and measures in an attempt to determine the ideal social body, creating judgments of some individuals as better, "healthier," or more "normal" than others. This included taking the social construct of race and trying to turn it into objective biological or evolutionary fact.
Alongside this project of scientific white supremacy, eugenicists were intensely concerned with whether people would conform to many different constructs of "normal." These included expressions of sexuality and gender performance, and framings of disability and "fitness" around who could be a "productive worker."
Measuring people for productivity and surveilling them for control predates the rise of the eugenics movement, with roots in slavery and colonization. In her book Decolonizing Methodologies, scholar Linda Tuhiwai Smith discusses how these notions of scientific measurement required "pre-existing views of the 'other' around science, philosophy, imperialism, classification, and 'regimes of truth'" that could be adapted to "new conceptions of rationalism, individualism and capitalism." Regarding these deeper histories, in her 2015 book, Dark Matters, Simone Browne writes that the conceptual and power structures of slavery solidified around the accumulation of people as a countable, surveillable subject, and Blackness as a "saleable commodity."
Even the specific technologies of the Nazi Holocaust, often remembered as the most famous and horrific example of how a regime can mobilize eugenics, were workshopped and fine-tuned earlier, in colonial projects in Africa and in the United States' own sterilization programs. After World War II, it generally fell out of fashion for academics and public figures to openly name eugenics as their explicit research apparatus or political logic. But these measurements, value judgments, and systems of control continue to texture our world, shaping who can access care and who gets policed.
Algorithmic Justice Beyond Autonomy
For those of us committed to challenging these legacies, the question becomes: How do we respond tactically to the criminalizing potential of biometric surveillance without fleeing into the fiction of personal responsibility? What systems and technologies can we build that aren't merely protecting ourselves individually from the worst effects of this extraction and criminalization but are also building a different world together?
We can't fully retreat to bodily autonomy. Although this can be a powerful and important political frame in many contexts, we can never be individual units moving as autonomous bubbles through the world. We are highly entangled with each other, our histories, our environments, and the more-than-human world; we are affected by and affect each other beyond the scope of what is legible to the variables in tracking software. And although the criminalizing strains of these systems might come for us differently, they are coming for all of us, whether or not we have a uterus, precarious legal status, nonnormative gender expression, or any other variable that can be criminalized.
There's great work being done by algorithmic accountability groups to address the harms caused by biased biometric algorithms, such as the notorious failure of facial recognition software to detect darker-skinned faces, or the use of AI to deepen medical discrimination. In addition to this, we need to continue pushing beyond accuracy as the horizon of our demands. Even if we had a universal data set fully quantifying and tracking every human on the planet, is that algorithmic justice? If the systems that created these technologies, and the purposes for which they were designed, are fundamentally oppressive, more efficient or accurate tech does not lead to a more just world.
After all, August Vollmer, the eugenicist "father of modern policing" who founded the country's first academic criminology program at the University of California, Berkeley, with proposed courses like "Race Degeneration," was obsessed with gathering perfect data in order to determine a criminal "type." If we are to focus on pushing for better algorithms, we must ask how this is different from what the eugenics movement wanted. Perfecting statistics does not necessarily intervene in the fundamental variables against which the "normal" is judged, or the power structures in which this biometric data is designed and deployed.
So much of the data for these systems is crowdsourced, which allows complicity to be dispersed. Given this, we have responsibilities as the building blocks that make up the algorithm. As artist and poet Manuel Abreu asks in a 2014 article for The New Inquiry, how do we engage with these structures when "[o]ur banal activities are the source from which algorithms automatically generate kill lists made up of nodes that deviate from the cluster of normal activity patterns" and "algorithms make complicity incalculable"? When we are embedded in these systems, our answer has to be something beyond opting out.
One response could be to treat data security as community care. COINTELPRO and other domestic surveillance programs teach us that taking sensible precautions with our own communication practices can reduce harm to people in our community. There can be deep solidarity, along the lines of "we keep us safe," in small-scale measures such as using an encrypted messaging app or browser, and having politically or legally sensitive conversations face-to-face, with physical distance from your phone.
A contradiction of political organizing is that it often requires a level of visibility to build community and draw more people into the work, but visibility attracts repression. Various projects offer tools and resources to help people grapple with these conundrums, including guides to surveillance self-defense written for abortion access activists, workers, and patients.
Another response is to intervene through collective action. In 2018, members of Mijente, an immigrant justice organization, created the #NoTechForICE campaign to confront the surveillance firm Palantir, whose tendrils stretch from collaborating with U.S. Immigration and Customs Enforcement on deportations to supporting the Israel Defense Forces in surveilling and bombing Gaza. Mijente aimed to put pressure on companies and cities to break contracts with Palantir, reducing its ability to recruit and retain workers and bringing this surveillance under greater public scrutiny.
Workers and unions have also organized to create solidarity in workplaces. One of the grievances that kicked off the West Virginia teachers' wildcat strike in 2018 was a requirement that teachers use a wellness-tracking app and perform a certain amount of movement or face higher health insurance costs. People could have individually opted out of this ableist, fatphobic, and generally invasive requirement. Instead, they collectively withdrew their labor to change the conditions of possibility through worker power.
Fundamentally, the lesson is that in order to confront these systems, we must build something greater than individual autonomy. In Dark Matters, Browne writes about what she calls "dark sousveillance": reversing surveillance and turning the gaze on those in power as a tactic to keep oneself out of sight in the flight from enslavement to freedom. That flight required networks and histories far in excess of the autonomous self, entailing the creation of entire social networks, languages, technologies, and relationships outside of, and directly opposed to, the systems of power doing the surveilling.
Our response to a world trending toward alienation from each other cannot be based solely on personal privacy. Any engagement with these systems that takes solidarity seriously must remember that, whether we find ourselves shoulder to shoulder or embedded in an algorithm, our liberation is bound up with each other's.
Juliet Kunkel
is an independent writer, "scholar" (ambivalently), and general troublemaker who learned far more from her comrades in movement work than from the Ph.D.