Social media platforms are being used to downplay the threat of the coronavirus and push back on COVID-19 restrictions in the lead-up to the 2020 U.S. election.
In a global pandemic, inaccurate information not only misleads but could also be a matter of life and death if people start taking unproven drugs, ignoring public health advice or refusing a coronavirus vaccine when one becomes available.
“A very dangerous element of all of this misinformation is distrust in institutions, in media and in democracy,” said Luca Nicotra, a disinformation researcher with non-profit research and activism foundation Avaaz.
“And this has very clear effects, for instance on vaccination rates. We have already seen how Facebook and other social media have promoted the rise of the anti-vaccination movement all around the world.”
A study by his organization found that content from the top 10 websites spreading health misinformation had almost four times as many views on Facebook as websites providing evidence-based information, such as those of public health institutions like the World Health Organization and the Centers for Disease Control and Prevention.
Nicotra says this has a lot to do with Facebook’s business model.
“Facebook is not a neutral platform. So basically, every time a user logs in, its algorithm decides what you see from the thousands of posts of all the pages you like or the friends you have. It selects the one that it believes will keep you in the platform the most,” he said.
“And Facebook knows this: (CEO Mark) Zuckerberg himself has said that they know its algorithm, if left unchecked, will promote divisive, sensationalist content and disinformation in a user’s timeline.”
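To make the mechanism Nicotra describes concrete, here is a minimal sketch in Python of an engagement-ranked feed. It is an illustrative assumption from start to finish: the function names, post fields and scoring stand in for Facebook’s actual, far more complex ranking system, which has not been published.

    # Illustrative sketch only; field names and scoring are hypothetical,
    # not Facebook's actual ranking system.

    def predicted_engagement(post):
        # Stand-in for a model that predicts how long a post
        # will keep this user on the platform.
        return post["predicted_time_on_platform"]

    def rank_feed(candidate_posts, limit=25):
        # From thousands of candidate posts by friends and liked pages,
        # surface the handful the model expects to be most engaging.
        ranked = sorted(candidate_posts, key=predicted_engagement, reverse=True)
        return ranked[:limit]

Nothing in this sketch checks whether a post is true; it optimizes only for predicted engagement, which is exactly the gap critics like Nicotra point to.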
Despite all evidence to the contrary, strong rhetoric downplaying the risks associated with COVID-19 has been endorsed at the highest levels of the U.S. government.
According to a study by Cornell University, President Donald Trump has been the world’s biggest driver of COVID-19 misinformation during the pandemic.
A team from the Cornell Alliance for Science analyzed 38 million articles published by English-language, traditional media worldwide between Jan. 1 and May 26, 2020.
And misinformation is increasingly spilling offline and into the streets, in the form of protests and sometimes aggressive refusals to follow social distancing restrictions.
In April, thousands of people gathered at Michigan’s state capitol to protest executive orders issued by Gov. Gretchen Whitmer that shut down most of the state.
Trump openly encouraged such protests, tweeting, “LIBERATE MICHIGAN!”
Members of a group known as the Wolverine Watchmen, said to have been motivated by Whitmer’s actions to limit the spread of COVID-19, have been arrested on conspiracy charges, accused of plotting to kidnap the Michigan governor.
Trump has admitted to downplaying the pandemic, continuing to do so even after he was diagnosed with COVID-19 — fuelling the growing coronavirus-denial movement.
“His success in responding, or reacting personally, to COVID is now being fed into those conspiracies as well: that it proves that it’s a hoax, that it’s not nearly as serious as we were told it was,” said Barbara Perry, the director of Ontario Tech University’s Centre on Hate, Bias and Extremism.
And because Facebook’s algorithm is designed to keep people on its platform for as long as possible, it tends to amplify what keeps them engaged: sensational posts, often full of false information.
“So Facebook’s responsibility then comes from the inaction on not constraining the algorithm (from going into) these black holes,” Nicotra said. “That, really, in the best case, radicalizes people. In the worst case, during a global pandemic like the one we are in the middle of, really, it puts people’s lives in danger.”
Facebook has not responded to Global News’ request for comment, but it has made an effort to label posts with warning notices about coronavirus misinformation, including posts by politicians.
But advocates say it’s not enough.
One idea put forward by Nicotra’s foundation is that when Facebook deems a post false or dangerous, it should not only add a warning to the original post but also notify everyone who shared it that what they shared is untrue.
Nicotra says there is also a push to have the algorithm downgrade posts that are verified false, so that their reach is automatically reduced.
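A minimal sketch of what those two proposals could look like together, again assuming hypothetical field names and a made-up demotion factor rather than anything Facebook has published:

    # Illustrative sketch only; the demotion factor and fields are assumptions.

    FACT_CHECK_DEMOTION = 0.2  # hypothetical: cut a debunked post's reach to 20%

    def adjusted_score(post, base_score):
        # Once fact-checkers verify a post as false, automatically
        # shrink its ranking score so it reaches far fewer timelines.
        if post.get("verified_false"):
            return base_score * FACT_CHECK_DEMOTION
        return base_score

    def notify_sharers(post_id, users_who_shared):
        # Retroactively tell everyone who shared the post that it
        # has since been rated false.
        for user in users_who_shared:
            print(f"To {user}: a post you shared ({post_id}) was rated false by fact-checkers.")

The design choice advocates are pressing for is that both steps happen automatically once a fact-check lands, rather than leaving the original shares untouched.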
And as the U.S. election draws closer and important COVID-19 regulations are debated, access to fact- and science-based information is more important than ever.