Reproductive Justice in the Post-Dobbs Moment
This essay was conceptualised and written based on the work of team members Avantika Tiwari, Shreya Bali, Nandini Chami, and Anita Gurumurthy at IT for Change.
1. Our Data Bodies in the Post-Dobbs Moment
In June 2022, the Supreme Court of the United States decided in Dobbs v. Jackson Women's Health Organization that the right to abortion was not conferred by the Constitution. The judgment's nightmarish consequence – a pervasive state-backed reproductive surveillance machine trampling upon women’s bodily privacy and autonomy – seemed straight out of the pages of The Handmaid’s Tale.
It is increasingly evident that these risks are exacerbated in the surveillance capitalism epoch that we inhabit.
Digital rights activists have repeatedly cautioned that myriad data footprints on digital apps – from Google search results to geolocation information, digital payments records, menstruapp (menstrual health app) histories, and even social media exchanges – could be weaponised by law enforcement agencies seeking to identify and prosecute women suspected of seeking abortion. This might seem alarmist, but recent incidents in the United States indicate how well-founded such fears are. In August 2022, police in the state of Nebraska used Facebook messages to investigate whether a 41-year-old woman had helped her 17-year-old daughter abort an unwanted pregnancy, an act that became illegal post the Dobbs judgment as the state’s stringent anti-abortion law automatically came into effect. Instances of law enforcement agencies in the United States purchasing personal data (geolocation) from brokers such as Venntel and Babel Street without warrants/ court orders have also already come to light. In this context, Wired’s grim forecast for a post-Dobbs dystopia seems to be an imminent reality: “Apps and app stores could be targeted for regulation by states seeking to aggressively limit residents’ access to abortion. Many sexual health apps currently provide secure and encrypted services alongside direct instructions on how to self-manage abortions. While CDA Section 230 would immunize companies against most liability, it would not stop states’ efforts to abrogate that immunity.”
Anxiety about the risk of law enforcement agencies using methods of questionable legality – such as the illegal sourcing of personal health data from menstruapps and other data aggregators – to investigate and prosecute persons who are suspected of obtaining an abortion therefore seems justified. This threat is heightened because of the lack of comprehensive personal data protection legislation in the United States and the failure of traditional federal laws on health information privacy to cover the data collection practices of menstruapps.
In this scenario, the solution that many on the progressive, liberal side of the abortion rights debate have sought lies in immediately preventing the consolidation of a state-corporate nexus of pervasive reproductive surveillance. Their strategy for change includes strengthening and expanding the current legislative proposal for a ‘My Body, My Data’ Act to build the first set of national standards for the effective protection of a person’s reproductive health data, followed by comprehensive privacy legislation.
In this line of thinking, our datafied bodies are an extension of our corporeal bodies and, therefore, the solution to data extractivism lies in extending to users the highest standards of bodily privacy, such that the already weakened ‘sense of [reproductive] choice’ that individuals have recourse to is not further eroded after the Dobbs judgment.
But in order to preserve such a “sense of [reproductive] choice”, is it sufficient to argue for the protection of our data bodies from within the liberal framework of individual privacy? Here, it is critical for us to learn from the historical mistakes of the mainstream abortion rights movement in the United States. When faced with a strong right-wing ‘pro-life’ discourse, the narrow framing of the ‘pro-choice’ position as limited to the right to bodily privacy completely erased the systemic issue of ‘reproductive justice’ from the public debate. Reproductive justice, simply put, recognises that the freedom of reproductive choices – including the right to have/ not have children – can be meaningfully exercised only in a society where access to affordable, quality healthcare and basic social care services that enable children to be raised in a healthy, safe, and flourishing environment are equally available to all individuals.
From this starting point, when we think about our data bodies in the post-Dobbs moment, it becomes evident that we need to move beyond a narrow focus on self-determination of the terms on which our sexed and gendered bodies are datafied on the digital matrix by the market. Strategies focused on enhancing user control/ consumer rights over one’s own data are emblematic of such an approach. However, placing limits on the state’s dominion over our data bodies, as conceptualised in a liberal personal data protection framework, fails to address the fundamental inequalities underpinning the gendered digital economy. We therefore need to move towards a new framework – one that responds to the pervasive datafication of our sexed and gendered bodies in the age of FemTech by preserving and furthering the reproductive justice agenda, going beyond merely policing the borders of permissible use and abuse of data, which can and does serve as an instrument against women.
2. FemTech: The structural violence in the interpellation of our data bodies into the capitalist matrix
A number of studies, including our recent research at IT for Change, demonstrate that FemTech applications, especially menstruapps, have become the vehicles of a new data governmentality through which surveillance capitalism assimilates gendered bodies into its workings. A predatory market in FemTech seeks huge payoffs through easy access to (data about) menstruating bodies. Menstruapps pursue new markets and feed Big Pharma with valuable resources, at the high moral cost of invading the intimate and eroding human rights.
Privacy policies and practices of popular menstruapps tend to be vague and ineffectual. Representing a scenario of no-holds-barred data extractivism, their overbroad consent clauses not only lack purpose limitations on data use, but also operate on the assumption that de-identification and anonymisation are sufficient safeguards to protect individual users from any potential harms that may arise in downstream third-party data sharing or data re-use. This is a blinkered view, as it fails to take into account the risks of profiling that persist in the re-use of datasets from which personal identifiers are removed. For example, de-identified location information can still help pinpoint abortion facilities with a high amount of interstate traffic – which may be useful to states that might implement legal penalties for those who facilitate out-of-state abortions.
Another major issue that is overlooked is that once their data is part of an anonymised, aggregate pool, users simply lose control over its combining and recombining in secondary uses – a testimony to the ever-proliferating, yet unregulated, data markets in which big players have a huge stake. For instance, take the case of the popular menstruapp Flo. Flo’s privacy policy states: “We may aggregate, anonymise or de-identify your Personal Data so that it cannot reasonably be used to identify you. Such data is no longer Personal Data. We may share such data with our partners or research institutions.” Flo’s announcements of such collaborations in the public domain reveal that ‘partners’ have included the bio-pharmaceutical company Myovant Sciences and the pharmaceutical and life sciences company Bayer AG.
Apps such as Flo claim that by using only anonymised user data stripped of personal identifiers for the generation of data-based intelligence, they are eliminating privacy risks completely. What they sidestep is the fact that the re-use of anonymised data may still enable intrusive algorithmic profiling and behavioural manipulation by the market.
Women in the Global South face a double whammy as far as such data extractivism is concerned. Oftentimes, their governments have not enacted legislation that protects them from the harmful impacts of behavioural data profiling by transnational digital corporations. Even where laws for personal data protection do exist, the de facto flows of data away from the countries of the South into the enclosures of Northern corporations undermine the justiciability of user rights in the event of any abuse. A few years ago, Facebook struck a deal with the India-based app Maya to access sensitive personal information of Maya’s users. How would Indian app users hold Facebook to account for cascading rights violations in future downstream uses of their data, and what might this mean for women’s autonomy elsewhere in the world?
Even in jurisdictions with a high level of legal protection for individual privacy and personal data, such as the European Union, menstruapps have failed to uphold all legally guaranteed rights of their data subjects. Taking advantage of the contractual model of upholding privacy rights endorsed by mainstream personal data protection legislation, they offer users the choice to part with their data through a complex notice-and-consent regime. This ‘free choice’ of volunteering to share one’s own data is produced and engineered in the digital marketplace of desires and demands, where the surrender of personal data to platforms for ‘free’ services seems like a small, even smart, trade-off to the consumer.
As scholar Catherine D’Ignazio reflected in an interview with IT for Change for the research on menstruapps, this assimilation “… perpetuates geographical inequalities and colonial inequalities. So many of these [menstruapps] are designed in Silicon Valley, and [this is reflected in the] cultural constructions [of race and gender that] are presented in the apps. Often, it doesn’t even work for the people from the US, [such as] the non-binary folks”. These apps isolate women’s bodies from their social contexts in the process of generating data about menstrual health, conflating datafication for the market with new-age discourses of empowerment through self-tracking and quantification. The datafied cultural landscape subtly pushes for a self-quantifying dynamic where self-documentation (digitally logging/ journaling our lives) and self-branding (becoming social influencers) are two sides of the same coin.
3. From protecting Data Bodies to safeguarding sovereignty rights in the social resource of data
In the post-Dobbs moment, reclaiming our data bodies from the matrix of capital requires us to move beyond a narrow focus on the protection of sensitive personal data (such as data about sexual and reproductive health) towards a feminist data sovereignty approach. The data sovereignty approach ensures that the social resource of aggregate, anonymised data about sexual and reproductive health needs and behaviour is not instrumentalised for capital accumulation. To do so, we need to ensure that data is deployed towards enabling the community of users, and indeed non-users, to gain collective value and shared insight, with the guarantee of privacy, dignity and autonomy for all; in other words, leveraging the data commons for reproductive justice.
This calls for action on a range of fronts in the specific realm of menstruapps. First, business models that profile sensitive and intimate information about sexuality and reproductive health for downstream market research should not be permitted to function. This will automatically ensure that menstruapps limit their data collection to the information essential to providing customised informational advisories, steering clear of unbounded personal information gathering. Equally importantly, sectoral legislation must be introduced for the FemTech domain to ensure that menstruapp providers are held accountable for their data collection and third-party data sharing practices, as well as for downstream harms and data breaches. Finally, we need a new data governance regime that not only upholds individual data subjects’ control over their personal data, but also takes on the more complex challenge of leveraging data as a social knowledge commons for public value and benefit, so that people’s claim to datafied intelligence is not mediated by market-based frameworks.