Social media sites shouldn’t ban misleading content, UK scientists say

Calls for social media sites to remove misleading content – for example about vaccines, climate change and 5G technology – should be rejected, according to the UK’s top science academy.

After investigating the sources and impact of online misinformation, the Royal Society concluded that removing the false claims and offending accounts would do little to limit their harmful effects. Instead, the report said, bans could drive misinformation into “the hardest-to-reach corners of the internet and exacerbate feelings of distrust in authorities”.

In the UK there have been calls from across the political spectrum for Twitter, Facebook and other platforms to remove anti-vax messages. However, “suppressing claims outside the consensus may seem desirable, but it can hamper the scientific process and force genuinely malicious content underground,” said Frank Kelly, a professor of mathematics at the University of Cambridge, who chaired the Royal Society inquiry.

He added that removing content and moving users away from mainstream platforms makes it harder for scientists to engage with people such as anti-vaxxers. “A more nuanced, sustainable and targeted approach is needed,” he said.

While illegal content that incites violence, racism or child sexual abuse should be removed, legal content that goes against scientific consensus should not be banned, according to the report. Instead, there should be far-reaching action to “build collective resilience” so people can detect harmful misinformation and react against it.

“We need new strategies to ensure that high-quality information can compete in the online attention economy,” said Gina Neff, professor of technology and society at the University of Oxford and co-author of the report. “That means investing in lifelong information literacy programs, provenance enhancement technologies, and mechanisms for sharing data between platforms and researchers.”

The well-informed majority can act as a “collective intelligence” protecting against misinformation and flagging inaccuracies when they encounter them, said Sir Nigel Shadbolt, executive chairman of Britain’s Open Data Institute and another co-author. “Many eyes can provide in-depth examination of content, as we see on Wikipedia,” he added.

Some fears about the amplification of misinformation on the internet – such as the existence of “echo chambers” and “filter bubbles”, which cause people to encounter only information that reinforces their own beliefs – have been exaggerated, according to the report.

While the internet has led to a vast proliferation of all kinds of information, the vast majority of Britons hold opinions close to those of mainstream science, according to a YouGov poll commissioned for the report. The proportions of the 2,000 participants agreeing that Covid vaccines are unsafe were 7% for the BioNTech/Pfizer jab and 11% for the Oxford-AstraZeneca jab, while 90% said human activity is changing the climate.

Vaccination opponents would eventually have to face evidence that their opposition to Covid shots is wrong, Shadbolt said: “The great natural experiment on the efficacy and safety of being vaccinated is the best evidence we have. For [anti-vaxxers] the evidence is not good.”