A disconcerting window into just how deeply AI could interfere with human livelihoods opened in Washington yesterday. The setting was, of all places, the Federal Trade Commission. The FTC’s virtual roundtable on the creative economy and generative AI held immediate interest for the tech industry, since FTC Chair Lina Khan has been pushing the agency forward as a potential AI regulator while the rest of Washington largely spins its wheels on the issue. On the regulatory front, there weren’t many surprises from the hearing; it was a listening session, not a press conference. But for anyone trying to get a grip on just how broadly generative AI might impact human striving, some of the testimony was eye-opening, even for those of us who’ve been watching it for a while. The lineup included models, writers, musicians and voice actors, who offered some unsettling twists on the fast-evolving picture of what, exactly, generative AI models are doing to creative work.
- In the modeling industry, Sara Ziff, founder and executive director of the Model Alliance, raised concerns about models being asked to undergo 3D body scans without much transparency about how those scans would be used. She also raised the alarm about companies turning to AI-generated models to fill diversity quotas instead of hiring people of color. “Earlier this year, Levi's announced that the company is creating AI-generated models to increase the number and diversity of their models,” Ziff said. “There is a real risk that AI may be used to deceive investors and consumers into believing that a company engages in fair and equitable hiring practices and is diverse and inclusive when they are not.”
- In the music industry, “the increasing scale of machine-generated music dilutes the market and makes it more difficult for consumers to find the artists they want to hear,” said Jen Jacobsen, executive director of the Artists Rights Alliance. “Musicians’ work is being stolen from them and then used to create AI-generated tracks that directly compete with them,” she said. That competition concern has already become familiar in the AI debate and has echoes in other industries (including mainstream media, where AI-filled junk sites are already pulling in advertising dollars).
- The hearing offered some real-world examples of how AI is already being used to deliberately confuse consumers: Jacobsen pointed to a hacked podcast episode in which AI-generated voices purporting to be from the band Avenged Sevenfold told fans that its upcoming performances would be canceled, and to an AI version of Tom Hanks promoting a dental plan without the actor’s consent.
- Creators raised concerns over needing to constantly “opt out” of having their work caught in the digital dragnet of AI development. Compound that with the very obvious worry that AI-generated voices, books, illustrations and music could elbow aside human artists in the long run, and you can see why the creative industry is freaked out.
So who’s going to solve the problem? The FTC has already said it wants to discourage tools that would allow the kind of “deepfake” deception that fools people into thinking they’re listening to a famous actor or favorite band without the artist’s consent. But the hearing indicated the agency is looking further: “Copyright is not and cannot be the only tool to address the deeply personal concerns creators hold about how their works are used,” FTC Commissioner Rebecca Slaughter said. “There are powerful tools we can use on behalf of creators, workers and consumers,” she added.

Outside the regulatory world, some of the burden of negotiating fair terms for artists in the AI era has fallen on unions. The Writers Guild of America recently concluded a 148-day strike upon reaching a historic agreement with Hollywood studios over the use of AI. But “the solution cannot merely be the bargaining of replacement and remuneration, if the job opportunities are replaced wholesale,” said John Painting of the American Federation of Musicians of the United States and Canada. “The solution needs to be wider than the traditional paths we’ve all taken, owing to the cultural damage that this problem yields.”

On Capitol Hill, there’s already a piece of legislation in the House, sponsored by Rep. Deborah Ross, that would grant more bargaining power to artists. The bill would create an antitrust exemption allowing artists to band together to negotiate licensing terms with major streaming platforms and generative AI developers.

Whatever happens next, the solution is likely to be as multi-pronged as the worries that are arising. FTC Commissioner Alvaro Bedoya, who has been an outspoken advocate for data privacy and digital rights in tech policy circles for years, said the FTC’s mandate was always meant to be flexible enough to deal with innovation in unfair methods of competition. “When I hear about new writers, young writers, worried that ‘The moment I arrive, I’m gonna be asked to feed my scripts in to train a new AI’; when I hear about background actors, young actors — how lots of future actors are discovered but who are the least powerful, least experienced, least savvy of all actors — being forced to get scanned, in the nude sometimes, or other really uncomfortable situations, it strikes me as more than innovative,” Bedoya said. “It fills me with concern.”