- Barberá, Pablo;
- Chen, Annie;
- Allcott, Hunt;
- Brown, Taylor;
- Crespo-Tenorio, Adriana;
- Dimmery, Drew;
- Freelon, Deen;
- Gentzkow, Matthew;
- González-Bailón, Sandra;
- Guess, Andrew;
- Kennedy, Edward;
- Kim, Young;
- Lazer, David;
- Malhotra, Neil;
- Moehler, Devra;
- Pan, Jennifer;
- Thomas, Daniel;
- Tromble, Rebekah;
- Rivera, Carlos;
- Wilkins, Arjun;
- Xiong, Beixian;
- de Jonge, Chad;
- Franco, Annie;
- Mason, Winter;
- Stroud, Natalie;
- Tucker, Joshua;
- Nyhan, Brendan;
- Settle, Jaime;
- Thorson, Emily;
- Wojcieszak, Magdalena
Many critics raise concerns about the prevalence of echo chambers on social media and their potential role in increasing political polarization. However, the lack of available data and the challenges of conducting large-scale field experiments have made it difficult to assess the scope of the problem [1,2]. Here we present data from 2020 for the entire population of active adult Facebook users in the USA showing that content from like-minded sources constitutes the majority of what people see on the platform, although political information and news represent only a small fraction of these exposures. To evaluate a potential response to concerns about the effects of echo chambers, we conducted a multi-wave field experiment on Facebook among 23,377 users for whom we reduced exposure to content from like-minded sources during the 2020 US presidential election by about one-third. We found that the intervention increased their exposure to content from cross-cutting sources and decreased exposure to uncivil language, but had no measurable effects on eight preregistered attitudinal measures such as affective polarization, ideological extremity, candidate evaluations and belief in false claims. These precisely estimated results suggest that although exposure to content from like-minded sources on social media is common, reducing its prevalence during the 2020 US presidential election did not correspondingly reduce polarization in beliefs or attitudes.