Researchers Examine 'Like-Minded Sources' on Social Media

Study looks at Facebook users’ news feeds, evaluates if “echo chambers” play a role.

Brendan Nyhan, the James O. Freedman Presidential Professor, has studied the effect of content from like-minded sources on Facebook users. (Photo by Katie Lenhart)

People often debate whether social media creates “echo chambers” by showing users content that matches their politics and in turn increases polarization. A new study published today in the journal Nature reports that reducing Facebook users’ exposure to content from politically “like-minded” sources had no measurable effect on their political beliefs or attitudes during the 2020 U.S. presidential election.

The findings are part of a broader research project examining the role of social media in U.S. democracy. Known as the U.S. 2020 Facebook and Instagram Election Study, the project is the first of its kind, providing social scientists with social media data that had previously been largely inaccessible.

Seventeen academics from U.S. colleges and universities, including government professor Brendan Nyhan at Dartmouth, teamed up with Meta, the parent company of Facebook, to conduct independent research on what people see on social media and how it affects them. To protect against conflicts of interest, the project built in several safeguards, including pre-registering the experiments. Meta could not restrict or censor findings, and the academic lead authors had final say over writing and research decisions.

The study included two main components. The first measured how often all U.S. adult Facebook users were exposed, from June to September 2020, to content from like-minded sources, which the researchers define as sources on the same side of the political aisle as the user.

The results showed that the median Facebook user received 50.4% of their content from like-minded sources, 14.7% from sources with the opposite political leaning (“cross-cutting” sources), and the remainder from friends, groups, or users that were neither like-minded nor cross-cutting (near the middle of the political spectrum).

When the authors broke down these exposure levels, they found that 20.6% of U.S. adult Facebook users got 75% or more of their exposures from like-minded sources; 30.6% got between 50% and 75%; 25.6% got between 25% and 50%; and 23.1% got less than 25%.
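For readers who want to see how such a breakdown works, the banding is a simple binning exercise. Below is a minimal sketch in Python; the function name and the per-user shares are invented for illustration and are not the study's code or data:

```python
# Hypothetical sketch: bin users by the share of their feed exposures
# that came from like-minded sources, mirroring the bands reported in
# the study. All data below are invented for illustration.
from collections import Counter

def exposure_band(share):
    """Map a like-minded exposure share (0.0-1.0) to a reporting band."""
    if share >= 0.75:
        return "75% or more"
    if share >= 0.50:
        return "50% to 75%"
    if share >= 0.25:
        return "25% to 50%"
    return "0% to 25%"

# Made-up per-user shares of exposures from like-minded sources.
shares = [0.82, 0.51, 0.47, 0.12, 0.66, 0.91, 0.33, 0.08]

counts = Counter(exposure_band(s) for s in shares)
for band in ("75% or more", "50% to 75%", "25% to 50%", "0% to 25%"):
    print(f"{band}: {counts[band] / len(shares):.1%} of users")
```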

The second component of the study was a multiwave experiment among 23,377 U.S. adult users of Facebook who opted in via informed consent. Participants were assigned to either a treatment or control group and were asked to complete five surveys on their political attitudes and behaviors before and after the 2020 presidential election.

In the treatment condition, content of any type from politically like-minded sources was reduced by approximately one-third on average in participants’ Facebook feeds from September to December 2020.

In the control group, 53.7% of the content in participants' Facebook news feeds came from like-minded sources, compared with 36.2% for the treatment group. On an average day, 143 of 267 views were from like-minded sources in the control group, versus 92 of 255 in the treatment group.

Exposure to cross-cutting sources increased from 20.7% in the control group to 27.9% in the treatment group. However, exposure to sources that were neither like-minded nor cross-cutting increased more (from 25.6% to 35.9%). Respondents in the treatment group were also exposed to less uncivil content.

The reduction in exposure to like-minded sources also reduced total engagement with content from like-minded sources. However, the rate at which treatment-group participants engaged with the like-minded content they did see increased, highlighting how human behavior can at least partially offset algorithmic changes.
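To see how a higher engagement rate can partly offset reduced exposure, consider a toy calculation. Only the daily view counts (143 versus 92) come from the article; the engagement rates below are assumed for illustration:

```python
# Toy calculation: total engagement can fall by less than exposure
# falls if the engagement *rate* rises. The daily view counts come
# from the article; the engagement rates are assumed.
control_views, treatment_views = 143, 92   # like-minded views per day
control_rate, treatment_rate = 0.05, 0.07  # assumed engagement rates

control_total = control_views * control_rate        # 7.15 engagements/day
treatment_total = treatment_views * treatment_rate  # 6.44 engagements/day

print(f"Views fell {1 - treatment_views / control_views:.0%}, "
      f"engagements fell {1 - treatment_total / control_total:.0%}")
# -> Views fell 36%, engagements fell only 10%: behavior offsets the algorithm.
```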

Finally, reduced exposure to content from like-minded sources had no measurable effects on numerous political attitudes, including polarization in people’s feelings toward the parties, ideological extremity, and their belief in false claims.

The authors also found no evidence that these effects varied based on user characteristics such as party affiliation or years on the platform.

“These findings do not mean that there is no reason to be concerned about social media in general or Facebook in particular,” says Nyhan, the James O. Freedman Presidential Professor in the Department of Government and one of the four lead academic authors on the study. “There are many other concerns we could have about the ways social media platforms could contribute to extremism, even though exposure to content from like-minded sources did not seem to make people more polarized in the study we conducted.”

“We need greater data transparency that enables further research into what’s happening on social media platforms and its impacts,” says Nyhan. “We hope our evidence serves as the first piece of the puzzle and not the last.”

The release of this study coincides with the publication of three other studies today in Science that are part of the broader project with Meta.

The other studies examine the effects of using a reverse-chronological feed instead of an algorithmic feed; the effects of exposure to reshared Facebook content during the election; and how exposure to political news content on Facebook is ideologically segregated.

Nyhan is a co-author on all three related papers. The research team plans to publish additional findings from the U.S. 2020 Facebook and Instagram Election Study in the future.