With Mark Zuckerberg threatened with contempt of Congress charges by the GOP, and concerns mounting about AI deepfakes in the 2024 election, the messy fight over social media’s role in politics took another turn Thursday with new research suggesting that Facebook is not making Americans more partisan.

The findings, the result of a collaboration between outside researchers and Facebook parent company Meta, cut against the grain of critics, as well as some of the company’s past internal findings, which blame the platform’s content algorithms and other features for worsening political polarization.

Many experts caution that it’s nearly impossible to quantify social media’s role in politics: The biggest platforms, like Facebook, are an unprecedented combination of real-time news, campaign messaging, advertising and public conversation. They’ve become targets of both the right, which accuses them of squelching conservative views, and the left, which sees them as vehicles for right-wing misinformation.

The studies released Thursday tried to tease out the influence of particular factors, such as Facebook’s algorithm for serving up content to users. Two studies published in the journal Science that examined the effects of Facebook’s algorithm and reshare feature during the fall of 2020 found that both features increased user engagement, but neither affected people’s existing political attitudes or polarization. A separate study published in the journal Nature found that reducing users’ exposure to sources that echo their existing beliefs didn’t affect their political attitudes either.
Meta trumpeted the results in a memo circulated ahead of the studies’ release: “Despite the common assertions that social media is ‘destroying democracy,’” the company wrote, “the evidence in these and many other studies shows something very different.”

Social media critics, many of whom have spent years sounding the alarm about the ways the platforms have changed American politics, suggested the studies were too limited, and too close to Meta itself, to be persuasive. Among them were Frances Haugen, the former Facebook product manager who leaked internal company files in 2021, and Jesse Lehrich, co-founder of Accountable Tech, an advocacy group focused on information controls for social media.

A fourth study, also published in Science, found that a cluster of news sources consumed by conservatives produced most of the misinformation flagged by the platform’s third-party fact-checking system. (A study co-author, Sandra González-Bailón of the University of Pennsylvania, declined to provide a list of those sources.)

The studies were the result of a collaboration between Meta and 17 outside researchers from universities including Northeastern, Stanford and Princeton. An independent rapporteur tasked with evaluating the collaboration vouched for the soundness of its results, but said its framework gave Meta influence over the ways in which outside researchers evaluated its platforms. “Meta set the agenda in ways that affected the overall independence of the researchers,” wrote Michael Wagner, a professor at the University of Wisconsin-Madison’s School of Journalism and Mass Communication.

In the years since Trump’s election, liberals and the establishment-minded have generally decried the free-wheeling information environment on social media, arguing that it is a breeding ground for dangerous disinformation and extremism.
Populists and conservatives have resisted efforts to rein in the online information ecosystem, arguing that such efforts give liberal-leaning institutions cover to censor politically inconvenient facts and opinions.

By undermining arguments that blame social media for polarization, while affirming that conservative-linked sources produce the lion’s share of flagged misinformation, the new batch of studies is unlikely to put those arguments to rest.

Billionaire real estate developer and philanthropist Frank McCourt, a Meta critic who is working on alternative social media models, said the studies do not address the most fundamental civic issues created by concentrating power over information flow in the hands of for-profit businesses. “You get what you optimize for,” McCourt said, “and social media platforms are not optimizing for a healthy society.”

Katie Harbath, who served as Facebook’s public policy director during the 2020 campaign, said that “more research is needed,” and that ongoing updates to Facebook’s algorithm mean research from 2020 may already be out of date. “Algorithms are always changing, and so while this is a very helpful snapshot, it is just that — a snapshot,” wrote Harbath, who is now the director of technology and democracy at the International Republican Institute, in an email. “This is why transparency is important.”

Rebecca Kern contributed to this report.