A pair of studies published Thursday in the journal Science offers evidence not only that misinformation on social media changes minds, but that a small group of committed “supersharers,” predominantly older Republican women, were responsible for the vast majority of the “fake news” during the period studied.
The studies, by researchers at MIT, Ben-Gurion University, Cambridge and Northeastern, were independently conducted but complement each other well.
In the MIT study led by Jennifer Allen, the researchers point out that misinformation has often been blamed for vaccine hesitancy in 2020 and beyond, but that the phenomenon remains poorly documented. And understandably so: Not only is data from the social media world immense and complex, but the companies involved are reluctant to take part in studies that may paint them as the primary vector for misinformation and other data warfare. Few doubt that they are, but that is not the same as scientific verification.
The study first shows that exposure to vaccine misinformation (in 2021 and 2022, when the researchers collected their data), particularly anything that claims a negative health effect, does indeed reduce people’s intent to get a vaccine. (And intent, previous studies show, correlates with actual vaccination.)
Second, the study showed that articles flagged by moderators at the time as misinformation had a greater effect on vaccine hesitancy than non-flagged content — so, well done flagging. Except for the fact that the volume of unflagged misinformation was vastly, vastly greater than the flagged stuff. So even though it had a lesser effect per piece, its overall influence was likely far greater in aggregate.
This kind of misinformation, they clarified, was more like big news outlets publishing misleading stories that wrongly characterized risks or studies. For example, who remembers the Chicago Tribune headline “A healthy doctor died two weeks after getting a COVID vaccine; CDC is investigating why”? As commentators in the journal point out, there was no evidence the vaccine had anything to do with his death. Yet despite being seriously misleading, the story was never flagged as misinformation, and the headline went on to be viewed some 55 million times, reaching six times as many people as saw all the flagged material combined.
Figure showing the volume of non-flagged misinformation vastly outweighing flagged stories. Image Credits: Allen et al.
“This conflicts with the common wisdom that fake news on Facebook was responsible for low U.S. vaccine uptake,” Allen told TechCrunch. “It might be the case that Facebook usership is correlated with lower vaccine uptake (as other research has found) but it might be that this ‘gray area’ content that is driving the effect — not the outlandishly false stuff.”
The finding, then, is that while tamping down on blatantly false information is helpful and justified, it addressed only a tiny fraction of the toxic farrago social media users were then swimming in.
And who were the swimmers who were spreading that misinformation the most? It’s a natural question, but beyond the scope of Allen’s study.
In the second study published Thursday, a multi-university group reached the rather shocking conclusion that 2,107 registered U.S. voters accounted for spreading 80% of the “fake news” (which term they adopt) during the 2020 election.
It’s a large claim, but the study’s analysis backs it up pretty convincingly. The researchers looked at the activity of 664,391 voters matched to active X (then Twitter) users, and found a subset of them who were massively over-represented in terms of spreading false and misleading information.
These 2,107 users exerted (with algorithmic help) an enormously outsized network effect in promoting and sharing links to politics-flavored fake news. The data show that one in 20 American voters followed one of these supersharers, putting them massively out front of average users in reach. On a given day, about 7% of all political news linked to specious news sites, but 80% of those links came from these few individuals. People were also much more likely to interact with their posts.
Yet these were no state-sponsored plants or bot farms. “Supersharers’ massive volume did not seem automated but was rather generated through manual and persistent retweeting,” the researchers wrote. (Co-author Nir Grinberg clarified to me that “we cannot be 100% sure that supersharers are not sock puppets, but from using state-of-the-art bot detection tools, analyzing temporal patterns and app use they do not seem automated.”)
They compared the supersharers with two other sets of users: a random sample and the heaviest sharers of non-fake political news. They found that these fake newsmongers tend to fit a particular demographic: older, female, white and overwhelmingly Republican.
Figure showing the demographics of supersharers (purple) compared with others (grey, whole panel; yellow, non-fake news sharers; magenta, ordinary fake news sharers). Image Credits: Baribi-Bartov et al.
Supersharers skewed female, at about 60% compared with the panel’s even split, and were significantly though not dramatically more likely to be white than the already largely white group at large. But they skewed far older (58 on average, versus 41 for the panel as a whole), and some 65% were Republican, compared with about 28% of the Twitter population at the time.
The demographics are certainly revealing, though keep in mind that even a large and highly significant majority is not all. Millions, not 2,107, retweeted that Chicago Tribune article. And even supersharers, the Science comment article points out, “are diverse, including political pundits, media personalities, contrarians, and antivaxxers with personal, financial, and political motives for spreading untrustworthy content.” It’s not just older ladies in red states, though they do figure prominently. Very prominently.
As Baribi-Bartov et al. darkly conclude, “These findings highlight a vulnerability of social media for democracy, where a small group of people distort the political reality for many.”
One is reminded of Margaret Mead’s famous saying: “Never doubt that a small group of thoughtful, committed citizens can change the world. Indeed, it is the only thing that ever has.” Somehow I doubt this is what she had in mind.