
Do Social Media 'Community Guidelines' Help Curb The Spread Of COVID Misinformation?

by Elizabeth Broadbent

Search for “Mercola” on YouTube. He’s one of the “Disinformation Dozen” that the Center for Countering Digital Hate named as a top spreader of COVID-19 misinformation. In fact, with 394K YouTube subscribers, conspiracy-haunted Mercola is the most prolific anti-vax videographer. You’ll find his channel immediately. His pinned post is titled, “Introducing Dr. Mercola’s New Book, The Truth About COVID-19.” Spoiler alert: by “truth,” Mercola means “this conspiracy theory I ginned up in my basement using string and news print-outs and a desperate hatred for Bill Gates.”

“The technocratic overlords continue to control the pandemic narrative and take advantage of the chaos they have created,” Mercola claims in his trailer, while showing a COVID-19 vaccine. The word “misinformation” appears against a backdrop of what are clearly rising death toll numbers. “They are now releasing… tracking technology,” he claims (clear code for “vaccine microchip!”). When he says “Joining the fight for the future” (Fight the Future: the subtitle of the first X-Files movie, which is only the second-worst X-Files movie), the trailer shows people dropping their masks.

This is some anti-vax shit, yo.

No, he never outright says that the pandemic is fake or you shouldn’t wear masks or that the virus was manufactured. It’s carefully crafted not to say anything that directly violates YouTube’s COVID-19 Medical Misinformation Policy. However, it’s clearly advertising something that does, and linking people to a source that violates those policies. Um… shouldn’t this violate something somewhere?

It doesn’t.

His advertising video for “The Truth About COVID-19” contains a screenshot of an Amazon review saying that, “Dr. Mercola uncovers the true facts behind the worldwide PLANdemic.” He’s still not really violating policy… but then YouTube recommended that I watch “Vitamin D and COVID 19: The Evidence for Prevention and Treatment of Coronavirus.” Frighteningly, and frustratingly — since Professor Roger Seheult, MD (who is somehow a legit medical professional) never claims Vitamin D will prevent or cure all COVID-19 infections — he’s within YouTube’s content guidelines. The same applies to the mega-wack “COVID-19 and Zinc,” which is recommended when I watch Dr. Mercola’s book trailer. Dr. Joseph Campbell (who doesn’t bother with “MD,” yet has over a million subscribers) never says outright that zinc will prevent or cure every COVID-19 infection, so it’s all good.

Except it’s so far from all good.

If this whackjob COVID-19 misinformation is allowed, YouTube’s guidelines on COVID-19 misinformation need to be rewritten. Stat.

It’s All About The Algorithm

We know social media plays a huge role in spreading vaccine misinformation, which is, you know, actively killing Americans. But Facebook and YouTube aren’t just letting misinformation slip through their enforcement cracks. Their algorithms actively recommend more. They’re creating echo chambers, which manufacture vaccine-resistant hordes, who then cough the Delta variant through your local Target.

Dr. Joseph Campbell’s zinc video led me to this video… which, in direct violation of YouTube’s content guidelines, proposes Ivermectin (an anti-parasitic agent used primarily in animals to treat things like roundworms) as a cure for COVID-19. The video’s subtitled, “Ivermectin and the odds of hospitalization due to COVID-19: evidence from a quasi-experimental analysis based on a public intervention in Mexico City.” And that evidence says: dewormer works! (I should not need to inform you that dewormer does not treat COVID-19, but here we are).

That leads me to another video also promoting Ivermectin… but with a disclaimer. “I will not be giving any of my own opinions,” Dr. Campbell says, “at least I’ll be trying hard not to… and because we’re going to be talking about therapeutics and a drug, I have to tell you, it’s in the rules I have to tell you, that you must never take any drugs based on anything I say in any of my videos, always go to your own prescriber, this is for educational purposes only.”

He’s actively trying to work around the Community Guidelines and failing.

He’s still conning your Nana into downing horse dewormer.

Then we’re on to another video about Ivermectin, then “Vaccines for Children”: don’t vax kids because, well, they don’t really get COVID-19 at the same rates as adults. This skirts the content guideline which states you can’t post “claims that children cannot or do not contract COVID-19.”

It’s all about the skirting.

Community Guidelines need to be reviewed, and they need to be reviewed now.

What About Facebook and COVID-19 Misinformation?

The Washington Post reports that in June, advocacy group Avaaz ran an experiment to see how well/how depressingly horribly Facebook’s algorithm worked on anti-vax material. It set up two accounts, which were recommended 109 anti-vaccine pages in only two days. Facebook says it’s removed 18 million pieces of COVID-19 misinformation since last year. But it’s clearly not enough, considering:

I typed in “covid vaccine.”

I touched the search bar again; it recommended the hashtag #covidvaccinesideeffects.

That led me, about five posts down, to this, from FluMist Facts (by “facts,” they mean “conspiracy theories about Big Pharma, which are obviously conspiracy theories because they use the words ‘Big Pharma.’”). This insinuates that COVID-19 vaccines are dangerous, and sort of violates Facebook’s policy against “claims about the safety or serious side effects of COVID-19 vaccines,” which falls under its “Community Standards” guidelines.

Then, a few posts down, there’s this. You know, claiming death as a #covidvaccinesideeffect. This definitely violates Facebook policy, since it’s coupled with that hashtag.

I didn’t go looking for any of this material. It was recommended by the algorithm based on one search. Of course, not everything the search recommended was anti-vax. But it contained lots and lots of COVID-19 misinformation. One of the Avaaz experiments worked much like mine: it started with searching for “vaccine.” Another liked an anti-vaccine page. The recommendations poured in from there, according to The Washington Post.

Clearly Facebook needs more enforcement… and needs to reconsider its algorithm.

Then There’s The Shielded Misinformation

They’re wily critters, those anti-vaxxers. The Washington Post reports that one anti-vax group on Facebook called itself “Dance Party” — and boasted over 40K members when Zuck busted it. They were using “pizza” as a code name for “Pfizer.”

Who knows how much of this is out there? Before she was called out and blocked, Erin Elizabeth used this strategy on Instagram, blanking out the words “Covid” and “vaccine” on her photos so they would slide past content guidelines. Here’s an example, which is hashtagged with her company’s name, #healthnutnews:

And while there’s a link asking people to visit the CDC for more information on vaccines, this is allowed to stay — because the words are crossed out.

We Need New Rules On COVID-19 Misinformation

YouTube’s guidelines are being skirted — or aren’t enforced. Facebook’s aren’t being enforced either. They need to implement better enforcement of COVID-19 misinformation policies, and they need to do it soon. Part of that falls on us: I reported every single anti-vax video I found. These social media platforms need us to report the misinformation we find. If we don’t help, we put them at a significant disadvantage.

But both platforms need a serious redo of their algorithms. Why, when I search “covid vaccine,” am I getting a recommendation for “#covidvaccinesideeffects,” and then seeing the above posts? Both algorithms are like misinformation rabbit holes: find one opening, and you’re tumbling like Alice in Wonderland, except Wonderland is a dark place full of microchips and vaccine death and an evil Bill Gates intent on world domination (WTF is up with the Bill Gates haters?!).

Neither platform is doing enough. Both platforms need fixing. And they need fixing badly, or COVID-19 misinformation will continue to spread.

And people will keep dying.
