A study has found that small extremist groups heavily influenced parents in mainstream online communities
One of the most disturbing things about the COVID-19 pandemic has been how many people have fallen victim to dangerous health misinformation — to the point where it has harmed families and affected public health on the whole.
Now, researchers are analyzing how this misinformation gets created and then popularized, and the results thus far can help us learn to recognize and prevent future misinformation.
A new study published this week in the journal IEEE Access, conducted by researchers at George Washington University, took a close look at how COVID-19 misinformation spread on Facebook during the early months of the pandemic, and at what we can learn about preventing similar occurrences in the future.
“By studying social media at an unprecedented scale, we have uncovered why mainstream communities such as parents have become flooded with misinformation during the pandemic, and where it comes from,” said Neil Johnson, a professor of physics at George Washington University and an author of the study.
The researchers found that two types of communities were responsible for the spread of misinformation: alternative health groups (which usually focus on positive messaging) and anti-vaccination groups. Both would post conspiracy theories and false information in large, mainstream parenting forums that otherwise offered more reliable information.
Other misinformation flowed between these groups as well, including falsehoods about climate change, fluoride, chemtrails, and 5G. As the groups shared more of this content, the bonds between them strengthened.
The study examined a number of parenting groups on Facebook, which included almost 100 million users.
“Our study reveals the machinery of how online misinformation ‘ticks’ and suggests a completely new strategy for stopping it, one that could ultimately help public health efforts to control the spread of COVID-19,” Johnson said.
The paper explains that many social media platforms try to control misinformation by moderating the largest Facebook groups, but the researchers found that the misinformation actually originates in much smaller extremist groups that fly under the radar of many of Facebook’s policing techniques. Stopping misinformation from spreading may well require new strategies that interrupt its flow from smaller, more clandestine communities into large, mainstream forums.
“Our results call into question any moderation approaches that focus on the largest and hence seemingly most visible communities, as opposed to the smaller ones that are better embedded,” Johnson said. “Clearly, combatting online conspiracy theories and misinformation cannot be achieved without considering these multi-community sources and conduits.”
What can we as parents learn from the study? That while large, mainstream parenting websites and news sites are reliable sources of information, even large parenting forums on social media platforms can contain inaccurate information. And as always: get your health information and COVID-19 information from sources like the CDC and your family doctor.