On the last day of 2022, researchers released a study with an eye-opening finding: if you want to know what’s happening inside a room, it’s possible to use WiFi signals the way a ship uses sonar, to sketch out a picture of where people are standing and how they’re posing.

While this kind of imaging has been possible for years with purpose-built sensing technology like radar, advancements in AI make it possible to use common tools like WiFi antennas in the same way, and open up the possibility of new methods of tracking people’s movements.

It’ll likely be a long time before something like this can happen in your home — the tracking requires access to both the transmitter and receiver antennas to pick up those signals. But it does introduce a new question for regulators right now: If and when this starts happening, how are they supposed to think about this data?

Does an outline of you standing in a room, created and enhanced by AI, without any of your identifying features, belong to you? Is this data considered biometrics, which lawmakers treat as a category of sensitive data? And how would you realistically opt out of something like this if you don’t even know it’s being done in a place like a mall or a parking lot?

None of this is a live issue yet. But it shows why future-proofing tech regulations is so difficult — technology often moves so quickly that by the time regulators can pass laws or make rules, its capacities may have advanced beyond what the legislation was intended for.

It’s one reason privacy advocates are so concerned about the American Data Privacy and Protection Act, which Congress failed to pass in 2022 but is likely to be reintroduced this year.

The main worry stems from the bill’s compromise on the issue of preemption, which would prevent states from passing any privacy laws that the federal standard already touches on — and would invalidate any existing laws that fall under that umbrella.

Proponents of state-based privacy laws argue that states are where the “future-proofing” is most likely to happen. They argue that Congress moves too slowly to deal with tech advances — and also that tech companies often move quickly to dodge violations once major federal laws are set in stone, finding workarounds or developing new tracking methods that fall outside the laws’ coverage.

States, meanwhile, can respond more quickly to new kinds of data privacy violations — and any federal law that preempts state law would cut off those quicker responses.

Tech industry groups like the idea of preemption because it creates a simpler national landscape for data privacy. They argue that multiple state privacy laws are too confusing to follow, and that the more laws that get added, the more complex and expensive it becomes for companies to stay on the right side of all of them. TechNet, a group that includes Apple, Amazon, Google and Meta, argues that a single federal standard would give businesses certainty about privacy regulations.

The downside, say privacy advocates, is that it also means the legal regime around tech can simply get stuck.

When it comes to technology leapfrogging Congress, there’s plenty of history. The Health Insurance Portability and Accountability Act, for example, was passed in 1996 and sets privacy requirements for health information held by medical providers. But it failed to foresee the rise of health apps and websites, where people provide medical information outside of those formal channels — information that can be shared and sold without violating the law.
Even when Congress does recognize the need to update older legislation, getting new regulations passed can be difficult. Last year, Sen. Ed Markey (D-Mass.) called for an update to the Children’s Online Privacy Protection Act he introduced in 1998, calling attention to growing concerns about tech’s effect on kids’ mental health. While the update passed out of committee, it didn’t make it to a floor vote.

Meanwhile, the states have moved on the issue: California passed a law last September with stricter rules on children’s online privacy, and Illinois has landmark legislation on how people’s biometric information can be used.

Not every supporter of privacy laws sees the states as quite so crucial. During the debate over preemption, Rep. Frank Pallone (D-N.J.), the former chair of the House Energy & Commerce Committee and one of the co-sponsors behind ADPPA, noted that while states can pass privacy legislation, most states haven’t taken action, and that a federal bill is the best shot at getting the entire country covered.

The bill was drafted with the help of some key privacy experts, who acknowledge that states are likely to be quicker to pay attention to tech advances. The proposed law allows for that to happen, to an extent: Even with the preemption clause, the federal bill allows states to pass privacy laws on issues the national standard doesn’t cover.

“If new issues come up three years, five years from now, and it’s something that is not covered by ADPPA, then it’s not going to get preempted and states will have latitude to innovate there,” said David Brody, who advised on ADPPA’s drafting and is the managing attorney of the Digital Justice Initiative at the Lawyers’ Committee for Civil Rights.