The skewed geography of AI

From: POLITICO's Digital Future Daily - Thursday, Jul 20, 2023, 09:32 pm
How the next wave of technology is upending the global economy and its power structures

By Derek Robertson

With help from Mohar Chatterjee

The OpenAI and ChatGPT logo. | AFP via Getty Images

As the AI industry takes off, what kind of signature will it leave on the U.S. map?

Optimists see it as a democratizing force, putting power into the hands of small companies, entrepreneurs, and creative types wherever they may be.

But that’s not really how the rest of the tech boom has gone. Almost paradoxically, the huge spread of the internet and social media to every corner of society has vastly concentrated wealth in Silicon Valley — and there’s a growing fear that AI will do much the same thing.

This worry has percolated into politics over the past decade, with figures like Ro Khanna pushing ideas for spreading tech wealth to the American heartland, and Congress authorizing a series of “Regional Technology and Innovation Hubs” in last year’s CHIPS and Science Act.

The idea is to boost tech innovation across the country in various non-coastal regions. But how is that really supposed to work?

To help get a grip on the problem and propose some solutions as the AI economy grows, the Brookings Institution is out with a new report on how America can broaden the benefits of AI, getting them to areas outside Mark Zuckerberg's and Jeff Bezos' zip codes.

Co-authored by Mark Muro, Julian Jacobs, and Sifan Liu, the report documents how the benefits of the generative AI boom have concentrated in Silicon Valley, and then makes a strong argument that Washington needs to sharply increase its investment in regional programs to boost nonprofit and university cooperation and spending on AI.

In an interview with DFD this week, Muro emphasized the urgency of the issue: “It's a national problem if the United States isn't unlocking enough of its talent, and ideas, and university resources across the country,” he said.

One of the most eye-popping new stats in Brookings’ report is on the recent concentration of the generative AI industry. The Bay Area accounted for a full quarter of generative AI job postings in May 2023, with “early adopter” hubs like New York and Seattle taking another third — leaving the rest of the country mostly in the dust, especially when considering the additional concentration of investment and high-impact research.

To fix the problem, Brookings argues for more federal cash to be appropriated for universities and nonprofit, NGO-style consortiums. The report points approvingly to efforts like the Alabama Artificial Intelligence Center of Excellence, based in Auburn, Alabama, which pulls together resources from eight of the state’s colleges and universities along with regional data center AUBix.

In the Brookings proposal, universities serve two key functions: first, providing the human talent necessary to compete in such a bleeding-edge industry, and second, supplying the raw technical infrastructure necessary to power it. Muro pointed to the University of Florida’s recent collaboration with NVIDIA on a $70 million supercomputer as a prime example.

From the perspective of regions trying to benefit, AI has some unique advantages and unique challenges as a technology.

On one hand, generative AI tools are ultimately just software — something that anyone with enough know-how can theoretically devise, whether they’re in the East Bay or eastern Germany.

On the other hand, it takes lots of computational power to build the most sophisticated AI models, and with it a whole lot of money. Just look at the most-cited AI papers of 2022: They’re dominated by private-sector players, and the most influential university researchers tend to be at Bay Area institutions like the University of California, Berkeley, or Stanford.

“There clearly will need to be some degree of democratization of access to data for training and computing,” said Muro, nodding to research and recommendations the National Artificial Intelligence Research Resource Task Force has already made on the topic. “Right now, those are two of the most binding constraints… without those, places will devolve to dependency on big tech, or disenfranchisement.”

One point of the Brookings report is simply to get Congress to honor its own promise to pay for these ideas. The think tank was involved in the “Regional Technology and Innovation Hubs” program included in last year’s CHIPS and Science Act — which so far has barely been funded.

Of the $10 billion set aside for those hubs in the bill, Congress ended up appropriating only $500 million for the program amid the debt limit negotiations.

Though politics has so far bottled up the funding, Muro suggests that politics could also, eventually, help turn the faucet back on.

“There is a degree of self-interest among members of Congress in trying to build more diverse economic and AI clusters,” he said. “Because business as usual looks to be leading towards yet another hyper-concentrated, Bay Area-centric outcome.”

 

quantum's foray into national security

The Senate version of the 2024 defense policy bill contains amendments meant to ease the way for quantum research with national security applications.

Those provisions include a public-private talent exchange program, a Defense Department fellowship program, and a mandate to bring the DOD’s quantum research under the umbrella of 2018’s National Quantum Initiative Act, which is up for reauthorization this year.

Sens. Maggie Hassan (D-N.H.) and John Thune (R-S.D.) are the lawmakers behind the latest quantum research amendments. The bipartisan amendments are meant to “ensure that national security priorities that the Defense Department identifies are part of the entire government’s quantum efforts,” Hassan said.

The House also has quantum on the brain, though it’s taking a slightly different approach. The House version of the 2024 defense policy bill — called the National Defense Authorization Act — includes a quantum pilot program that would see the DOD enter research partnerships with one or more companies in the private sector in addition to a federally funded research and development center.

The House passed its version of the NDAA last week. Senate leaders expect to pass a final version of their defense policy bill before they leave for the August recess. The Senate and the House will then need to reconcile their versions before the bill can become law. — Mohar Chatterjee

british ai legislation?

More AI news from the United Kingdom — POLITICO’s Morning Technology U.K. newsletter reported today that the British government is strongly hinting it’ll soon unveil comprehensive AI legislation.

POLITICO’s Tom Bristow wrote that the U.K. Department for Science, Innovation and Technology’s Secretary of State Chloe Smith gave the “strongest hint yet” in the House of Commons that the government could propose legislation this fall, following a period of “monitoring” the development of AI outlined in a recent DSIT white paper. In a blog post published yesterday, the U.K.’s four agencies tasked with enforcing digital regulations made the case that they already have considerable statutory authority to regulate AI.

In light of which… if the U.K. does come forward with its own legislative proposal on AI, don’t expect anything as sweeping as the European Union’s AI Act, Tom writes — in keeping with the country’s stated goal of serving as an intermediary of sorts between the EU’s and the United States’ regulatory environments. — Derek Robertson

Tweet of the Day

The future of technology is hard to predict, McKinsey edition

the future in 5 links

Stay in touch with the whole team: Ben Schreckinger (bschreckinger@politico.com); Derek Robertson (drobertson@politico.com); Mohar Chatterjee (mchatterjee@politico.com); and Steve Heuser (sheuser@politico.com). Follow us @DigitalFuture on Twitter.

If you’ve had this newsletter forwarded to you, you can sign up and read our mission statement at the links provided.

 
