For all the blame Facebook has received for fostering extreme political polarization on its ubiquitous apps, new research suggests that the problem may not strictly be a function of the algorithm.
When the researchers altered the kind of content these Facebook users were receiving, presumably to make it more diverse, the change over the three-month period "did not significantly alter levels of issue polarization, affective polarization, political knowledge, or other key attitudes," the authors wrote.
"However, the data clearly indicate that Facebook users are much more likely to see content from like-minded sources than they are to see content from cross-cutting sources."
The researchers agree that the polarization problem exists on Facebook; the question is whether the algorithm is making it worse.