Digital surveillance experts were already stressed enough about law enforcement’s use of facial recognition tools and large troves of data from license plate scanners, biometric databases and phone location tracking services. Now, the prospect of a world where Roe v. Wade is overturned has triggered a fresh wave of anxiety about government use of personal data.

With some states likely to outlaw most abortions, and prosecutors getting more sophisticated about how they use digital tools, could people’s personal health decisions become subject to state-level surveillance?

Albert Fox Cahn, executive director of the Surveillance Technology Oversight Project, told me he’s freaked out. “We’re going to see all of the tools that are being developed as a way to optimize our healthcare now being repurposed into some sort of ‘Handmaid’s Tale’-style tracking device,” Cahn said.

Health data shared with providers through telemedicine apps or online patient portals is protected under the Health Insurance Portability and Accountability Act, or HIPAA. But a wide variety of apps, from period trackers to heart-rate monitors, now hold huge amounts of information about your body without being protected as medical records.

“It’s not just about those who are seeking abortion care — it’s anyone who is afraid of being wrongly charged with having an abortion simply for having a miscarriage,” Cahn said.

Law enforcement officials already have access to reams of data about people through traffic cameras, facial recognition cameras placed in public spaces and the aggregated mobile phone data they can buy from third-party data brokers. As I reported for Pros in our Morning Cybersecurity newsletter this morning, digital surveillance experts are warning that this seemingly innocuous data could be weaponized against anyone traveling to an abortion clinic, buying abortion pills online or even just searching for information about abortion rights groups’ advocacy work.

Some of advocates’ fears have already been realized: Earlier this week, Motherboard reported that data broker SafeGraph was selling aggregated location data about people who visited Planned Parenthood locations. And in 2017, prosecutors relied on search history extracted from a woman’s phone about buying abortion pills online to charge her with the murder of her stillborn fetus.

In a post-Roe world, advocates warn, cases like these will only intensify, especially as more law enforcement agencies take up facial recognition technology and consumers turn to smartwatches and health apps to log their menstrual cycles or the first few weeks of their pregnancies.

The possible uses of surveillance technology to crack down on abortions are endless: Police officers could place facial recognition cameras outside a clinic to identify whoever visits. Prosecutors could demand that health apps hand over data about certain users to help determine whether they’ve had an abortion. Public institutions like libraries and universities, along with social media sites, could decide to crack down on abortion information for fear of violating state laws.

Not helping matters: Americans became more comfortable using online services to track their health during the pandemic, and even as pandemic restrictions ease, people are still relying on consumer-facing apps that don’t fall under HIPAA’s purview to log their health information.
And privacy researchers have found that period tracking apps collect more data about users than just their cycles, including information about their sexual habits and medication intake.

At the same time, surveillance tools like facial recognition have become go-to tools for police and other law enforcement officials in recent years. For example, controversial facial recognition vendor Clearview AI, which built its database by scraping public social media photos, made its name working with police forces and is now expanding to work with private businesses. And the Government Accountability Office found in a report released in July that 20 of the 42 federal agencies that employ law enforcement officers rely on a facial recognition system.

Right now, this is all a worst-case scenario for privacy advocates. There’s no opinion overturning Roe yet; it’s not clear how sophisticated state prosecutors will be; people could delete their apps. But one big reason privacy advocates worry is that Washington policymakers have struggled to pass legislation regulating the sale of Americans’ private data, or to lay out rules for how law enforcement can use surveillance in its investigations.

“There are data brokers that are selling information on the pregnancy status of literally every American,” Cahn said. “And really, it’s up to Congress and state lawmakers to decide whether that’s information we want to use against pregnant people in a court of law.”