How (not) to future-proof the law

From: POLITICO's Digital Future Daily - Tuesday, Jan 31, 2023, 09:33 pm
How the next wave of technology is upending the global economy and its power structures
 

By Alfred Ng

With help from Derek Robertson


Could WiFi-enabled tools be tracking us in more ways than intended? | Ryan Emberley/Invision for Apple/AP Images

On the last day of 2022, researchers released a study with an eye-opening finding: if you want to know what’s happening inside a room, it’s possible to use WiFi signals the way a ship uses sonar, to sketch out a picture of where people are standing and how they’re posing.

While this kind of imaging has been possible for years with purpose-built sensing technology like radar, advances in AI now make it possible to use common tools like WiFi antennas in the same way, opening up new methods of tracking people’s movements.
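For the technically curious, the basic idea is that a neural network takes the WiFi channel measurements recorded between a few transmit and receive antennas and predicts coarse heatmaps of where a person’s body parts are. The sketch below is purely illustrative; the antenna counts, layer sizes and names are assumptions, not the researchers’ actual model.

# A minimal, hypothetical sketch (PyTorch): map WiFi channel measurements to
# coarse human-pose heatmaps. Antenna counts, layer sizes and names are
# illustrative assumptions, not the study's actual architecture.
import torch
import torch.nn as nn

class WiFiPoseNet(nn.Module):
    def __init__(self, n_tx=3, n_rx=3, n_subcarriers=30, n_keypoints=17):
        super().__init__()
        # Treat each measurement as a 2-channel "image": signal amplitude and
        # phase for every transmit/receive antenna pair and frequency subcarrier.
        self.encoder = nn.Sequential(
            nn.Conv2d(2, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
        )
        # Decode the antenna-domain features into a small spatial heatmap,
        # one channel per body keypoint (head, shoulders, elbows, ...).
        self.decoder = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * n_tx * n_rx * n_subcarriers, 256), nn.ReLU(),
            nn.Linear(256, n_keypoints * 24 * 24),
        )
        self.n_keypoints = n_keypoints

    def forward(self, csi):
        # csi: (batch, 2, tx*rx pairs, subcarriers) -> (batch, keypoints, 24, 24)
        features = self.encoder(csi)
        heatmaps = self.decoder(features)
        return heatmaps.view(-1, self.n_keypoints, 24, 24)

# Toy usage with random numbers standing in for real channel measurements.
model = WiFiPoseNet()
fake_csi = torch.randn(1, 2, 9, 30)   # 3x3 antenna pairs, 30 subcarriers
print(model(fake_csi).shape)          # torch.Size([1, 17, 24, 24])

In research like this, such a network is typically trained on WiFi recordings paired with camera-derived pose labels; the point of the sketch is simply that ordinary antenna measurements can feed a standard computer-vision-style model.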

It’ll likely be a long time before something like this can happen in your home — the tracking requires access to both the transmitter and receiver antennas to pick up those signals. But it does introduce a new question for regulators right now: If and when this starts happening, how are they supposed to think about this data?

Does an outline of you standing in a room, created and enhanced by AI, without any of your identifying features, belong to you? Is this data considered biometrics, which lawmakers consider a category of sensitive data? And how would you realistically opt out of something like this if you don’t even know it’s being done in a place like a mall or a parking lot?

None of this is a live issue yet. But it shows why future-proofing tech regulations is so difficult — technology often moves so quickly that by the time regulators can pass laws, or make rules, its capacities may have advanced beyond what the legislation was intended for.

It’s one reason privacy advocates are so concerned about the American Data Privacy and Protection Act (ADPPA), which Congress failed to pass in 2022 but is likely to be reintroduced this year.

The main worry stems from the bill’s compromise on the issue of preemption, which would prevent states from passing any privacy laws that the federal standard already touches on — and would invalidate any existing laws that fall under that umbrella.

Proponents of state-based privacy laws argue that states are where the “future-proofing” is most likely to happen. They argue that Congress moves too slowly to deal with tech advances — and also that tech companies often move quickly to dodge violations once major federal laws are set in stone, finding workarounds or developing new tracking methods that fall outside the laws’ coverage. States, meanwhile, can respond more quickly to new kinds of data privacy violations — and any federal law that preempts state law would exclude these quicker responses.

Tech industry groups like the idea of preemption because it creates a simpler national landscape for data privacy. In their view, multiple state privacy laws are too confusing to follow, and the more laws that get added, the more complex and expensive it gets for companies to stay on the right side of all of them. TechNet, a group that includes Apple, Amazon, Google and Meta, argues that a single federal standard would give businesses certainty about privacy regulations.

The downside, say privacy advocates, is that it also means the legal regime around tech can just get stuck.

When it comes to technology leapfrogging Congress, there’s plenty of history. The Health Insurance Portability and Accountability Act, for example, was passed in 1996 and sets privacy requirements for health information handled by medical providers. But it failed to foresee the rise of health apps and websites, where people provide medical information outside of those formal channels — information that can be shared and sold without violating the law.

Even when Congress does recognize the need to update older legislation, getting new rules passed can be difficult. Last year, Sen. Ed Markey (D-Mass.) called for an update to the Children’s Online Privacy Protection Act he introduced in 1998, citing growing concerns about tech’s effect on kids’ mental health. While the update passed out of committee, it didn’t make it to a floor vote.

Meanwhile, the states have moved on the issue: California passed stricter rules on children’s online privacy last September, and Illinois has landmark legislation on how people’s biometric information can be used.

Not every supporter of privacy laws sees the states as quite so crucial. During the debate over preemption, Rep. Frank Pallone (D-N.J.), the former chair of the House Energy and Commerce Committee and one of the co-sponsors of ADPPA, noted that while states can pass privacy legislation, most states haven’t taken action, and that a federal bill is the best shot at getting the entire country covered.

The bill was drafted with the help of some key privacy experts, who acknowledge that states are likely to be quicker to pay attention to tech advances. The proposed law allows for that, to an extent: Even with the preemption clause, states could still pass privacy laws covering areas the national standard doesn’t.

“If new issues come up three years, five years from now, and it’s something that is not covered by ADPPA, then it’s not going to get preempted and states will have latitude to innovate there,” said David Brody, who advised on ADPPA’s drafting and is the managing attorney of the Digital Justice Initiative at the Lawyers’ Committee for Civil Rights.

 


 
 
the (ai-generated) sound of music

First, images and text — can AI master music, too?

Late last week, Google published a research paper showing off its newest AI-powered generative model: MusicLM, a model that generates quasi-professional-sounding music from text prompts, much as tools like ChatGPT or Stable Diffusion generate text and images.
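For a rough sense of how text-to-music systems like this work: the prompt is turned into an embedding, a sequence model then generates discrete audio tokens one at a time conditioned on that embedding, and a neural codec turns the tokens back into sound. The sketch below is a hypothetical toy version of that skeleton; none of the names or sizes come from Google’s paper.

# A heavily simplified, hypothetical sketch (PyTorch) of text-conditioned
# audio-token generation. Names, sizes and the sampling loop are illustrative
# assumptions, not MusicLM's actual architecture.
import torch
import torch.nn as nn

VOCAB = 1024     # size of a discrete audio-token codebook
DIM = 512        # width of the model and of the text-prompt embedding

class ToyMusicGenerator(nn.Module):
    def __init__(self):
        super().__init__()
        self.token_emb = nn.Embedding(VOCAB, DIM)
        layer = nn.TransformerDecoderLayer(d_model=DIM, nhead=8, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers=2)
        self.head = nn.Linear(DIM, VOCAB)

    def forward(self, audio_tokens, text_embedding):
        # Predict a distribution over the next audio token, attending to the
        # tokens generated so far and to the prompt embedding.
        x = self.token_emb(audio_tokens)       # (batch, seq, DIM)
        memory = text_embedding.unsqueeze(1)   # (batch, 1, DIM)
        return self.head(self.decoder(x, memory))

# Toy sampling loop: a randomly faked prompt embedding stands in for a real
# text encoder, and the sampled tokens would normally be decoded into audio
# by a neural codec.
model = ToyMusicGenerator()
prompt = torch.randn(1, DIM)
tokens = torch.zeros(1, 1, dtype=torch.long)  # start token
for _ in range(16):
    logits = model(tokens, prompt)[:, -1]
    next_token = torch.multinomial(torch.softmax(logits, dim=-1), 1)
    tokens = torch.cat([tokens, next_token], dim=1)
print(tokens.shape)  # torch.Size([1, 17])

A production system also needs a trained text encoder and an audio codec to turn those tokens back into a waveform; the sketch only shows the conditioning-and-sampling skeleton.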

You’re going to want to click through to this one: The paper includes full examples of the prompts in question, from “The main soundtrack of an arcade game. It is fast-paced and upbeat, with a catchy electric guitar riff. The music is repetitive and easy to remember, but with unexpected sounds, like cymbal crashes or drum rolls” (remarkably accurate, and indeed pretty catchy) to “Epic soundtrack using orchestral instruments. The piece builds tension, creates a sense of urgency. An a cappella chorus sing in unison, it creates a sense of power and strength” (a little more work to do on the fidelity of this one, but impressively ominous nonetheless).

Much like MusicLM’s counterparts in other mediums, the tool isn’t remotely strong enough to put its human competitors out of business yet. But also like those counterparts, the impressive early results show how it could be a powerful tool for creative types with limited resources. — Derek Robertson

more clarity on when, where, and whether robots are allowed to kill you


A DJI Mavic 3 drone flies past a U.S. government surveillance tower near the U.S.-Mexico border on September 27, 2022 in Yuma, Arizona. | John Moore/Getty Images

Last week DFD tackled the evergreen question: “Should a robot be allowed to kill you?”

And late yesterday afternoon, the author of that dispatch, POLITICO’s Matt Berg, teamed up with his colleague Alexander Ward in National Security Daily for another report on the U.S. and autonomous weapons on the battlefield.

The Department of Defense issued guidance yesterday clarifying exactly when, where, and how autonomous weapons are authorized, shedding light on a policy gray area between the emerging worlds of drone warfare and AI. The authors describe a slew of exemptions in the new DoD policy that make autonomous weapon use “much easier,” according to Zak Kallenborn, a policy fellow at George Mason University.

Kallenborn then makes the wonky policy debate more concrete by invoking man’s best friend: a scenario in which, as the authors write, “A robotic dog carrying supplies… could carry a weapon to defend itself without approval.”

“If Spot happens to wander near an enemy tank formation, Spot could fight. So long as Spot doesn't target humans,” Kallenborn said. “Of course, clear offensive uses like turning Spot into a robo-suicide bomber would require approval, but there's a lot of vagueness there.” — Derek Robertson

tweet of the day

wait what???

the future in 5 links

Stay in touch with the whole team: Ben Schreckinger (bschreckinger@politico.com); Derek Robertson (drobertson@politico.com); Steve Heuser (sheuser@politico.com); and Benton Ives (bives@politico.com). Follow us @DigitalFuture on Twitter.

If you’ve had this newsletter forwarded to you, you can sign up and read our mission statement at the links provided.

 


 
 
 

 

