AUSTIN, Texas — A multi-university academic team working in collaboration with Meta has published in Nature and Science the first findings from an unprecedented project examining the impact of social media on the 2020 U.S. election.
The study, co-led by researchers Talia Stroud of The University of Texas at Austin and Joshua Tucker of New York University, found that algorithms have a tremendous impact on what people see in their feeds. Although changing fundamental parts of the algorithm affected the content people saw, it did not affect participants’ political attitudes.
“We now know just how influential the algorithm is in shaping people’s on-platform experiences, but we also know that changing the algorithm for even a few months isn’t likely to change people’s political attitudes,” Stroud and Tucker said in a joint statement. “What we don’t know is why. It could be because the length of time for which the algorithms were changed wasn’t long enough, or these platforms have been around for decades already, or that while Facebook and Instagram are influential sources of information, they are not people’s only sources.”
The findings are the result of years of research into the impact of social media on American democracy, using data collected in the context of the 2020 U.S. election. Before the election, Meta partnered with Stroud, UT Austin communication studies professor and founder and director of the Center for Media Engagement, and Tucker, New York University politics professor, co-founder and co-director of the Center for Social Media and Politics, and director of the Jordan Center for the Advanced Study of Russia, to understand more about the platforms' impact.
This came amid claims that Facebook and Instagram — through their algorithms and platform design — had influenced people’s political beliefs in the 2016 election. The resulting study represents the most comprehensive research project to date examining social media in American democracy.
The first four published papers document the ways that algorithmic changes affect what people see. For example, switching from Facebook’s personalized algorithm to a chronological feed, where users saw the newest content first, significantly increased content from moderate friends and from sources with ideologically mixed audiences. The chronological feed also increased the amount of political and untrustworthy content relative to the default algorithmic feed, while decreasing uncivil content.
The analysis also showed significant ideological segregation in the political news people see on Facebook and Instagram, meaning many political news URLs were seen, and engaged with, primarily by conservatives or liberals, but not both.
To uncover these findings, the team conducted three experiments with consenting participants during the 2020 election period, in which they altered different aspects of Meta’s algorithms for three months. The changes included removing reshared content, deprioritizing content from like-minded sources, and switching from an algorithmically ranked feed to a chronological one.
None of these substantial changes had significant effects on outcomes such as political polarization, ideological extremity, or how people evaluated candidates.
The academic team worked with Meta researchers to design the experimental studies. Participants answered survey questions and agreed to share data about their on-platform behavior. The team also analyzed platform-wide phenomena based on the behavior of all adult U.S. users of the platform. Platform-wide data was made available to the academic researchers only in aggregated form to protect user privacy.
Meta could not restrict or censor findings, and the lead academic authors had final say over writing and research decisions.
Additional papers from the project will be publicly released after completing the peer-review process. They will provide insight into the content circulating on the platforms, people’s behavior, and the interaction between the two.
For more information about the studies, visit the Moody College of Communication.