As regulators worldwide hash out their response to the rapid advance of artificial intelligence, it appears they have found an early target: the microchips that power the most advanced systems like OpenAI’s ChatGPT or Google’s Bard.

French regulators raided the office of a major chipmaker last week, reportedly market leader Nvidia, and EU regulators are said to be in the early stages of investigating whether chipmakers have taken steps to illegally hamper competition.

The moves in Europe are the latest sign that policy fights over the virtual future are set to increasingly focus on its physical underpinnings, in particular the vast quantities of raw computing power needed to make AI work. There are fears that the biggest chipmakers — or dominant players in other facets of computing infrastructure — could become something like the Standard Oil of AI if they are able to further entrench their positions in the technology’s supply chain.

And it’s not just the EU that’s concerned. At POLITICO’s Tech & AI Summit Wednesday, Federal Trade Commission Chair Lina Khan made clear that antitrust issues with AI are a priority, including a focus on the technology’s building blocks:

“I met with some VCs the other month, many of whom are investing in startups that are using AI tools, and several shared that one of their big concerns is that right now, some of these tools can be acquired relatively cheaply, but the fact that a handful of companies control some of the raw materials and critical inputs gives them pause,” Khan told POLITICO’s Josh Sisco. “In three months or six months, they could wake up and face dramatically jacked-up prices, or potentially coercive terms where access to certain key base models or other AI technologies is being conditioned on reupping your cloud contract or some other part of the business.
And so those are all dynamics that we're aware of and trying to stay vigilant.”

As DFD noted last spring, much of the initial antitrust concern that followed the sudden explosion in popularity of ChatGPT focused on the data needed to train new AI models, because of the small number of companies that possess enough high-quality data to do it well.

But AI policy researcher Jai Vipra told DFD there are several reasons why computing power is emerging as a top early concern for AI competition. For one, it's easier for regulators to understand than data, which is abstract and opaque, making it difficult to know how AI firms are using it. "Right now, compute seems like a more concrete input to regulate," said Vipra, who co-authored a "Computational Power and AI" report released last week by the AI Now Institute, a New York-based think tank focused on the technology’s social implications.

And — due largely to a shortage of the advanced chips needed to train and run the most advanced AIs — computational power has become the most important bottleneck to AI adoption, lending it a sense of urgency. “It’s the constraint right now,” Vipra said.

Then there’s the extraordinary run-up in Nvidia’s stock. The value of the American company has more than tripled this past year, making it the first chipmaker to reach a trillion-dollar valuation. That sent a flashing red signal to pay attention to it, Vipra said.

The early focus on computing power does not mean that the data used to train AI will evade fights over market structure forever. “I don't think there is enough evidence in the public domain yet,” Vipra said. “Once it does come out, I think it’s going to be a really big issue.”