Press release
EXPERT COMMENT: Pseudoscience is taking over social media – and putting us all at risk
In light of a new study that argues YouTube is not only promoting climate denial content, but allowing deniers to hijack scientific terms, Dr Santosh Vijaykumar, Vice Chancellor's Senior Research Fellow in Digital Health at Northumbria, discusses the role of social media in spreading conspiracy theories.
Search for “climate change” on YouTube and before long you’ll likely find a video that denies it exists. In fact, when it comes to shaping the online conversation around climate change, a new study suggests that deniers and conspiracy theorists might hold an edge over those believing in science. Researchers found evidence that most YouTube videos relating to climate change oppose the scientific consensus that it’s primarily caused by human activities.
The study highlights the key role of social media use in the spread of scientific misinformation. And it suggests scientists and those who support them need to be more active in developing creative and compelling ways to communicate their findings. But more importantly, we need to be worried about the effects that maliciously manipulated scientific information can have on our behaviour, individually and as a society.
The recent study by Joachim Allgaier of RWTH Aachen University in Germany analysed the content of a randomised sample of 200 YouTube videos related to climate change. He found that a majority (107) of the videos either denied that climate change was caused by humans or claimed that climate change was a conspiracy.
The videos peddling the conspiracy theories received the highest number of views. And those spreading these conspiracy theories used terms like “geoengineering” to make it seem like their claims had a scientific basis when, in fact, they did not.
Health misinformation
Climate change is far from the only area where online misinformation about science triumphs over scientifically valid facts. Take infectious diseases, and perhaps the best-known example: the measles-mumps-rubella (MMR) vaccine. Despite large amounts of online information about the vaccine’s safety, false claims that it has harmful effects have spread widely and resulted in plummeting levels of vaccination in many countries around the world.
But it’s not just well-known conspiracy theories that are causing a problem. In May 2018, one troublemaker came into his own at the height of the Nipah virus outbreak that eventually claimed 17 lives in the southern Indian state of Kerala. He duplicated the letterhead of the District Medical Officer and spread a message claiming that Nipah was spreading through chicken meat.
In reality, the scientifically established view is that the fruit bat is the host for the virus. As the unfounded rumour went viral on WhatsApp in Kerala and neighbouring states like Tamil Nadu, consumers became wary of eating chicken, which sent the incomes of local chicken traders into a tailspin.
The effects of misinformation surrounding the MMR vaccine and Nipah virus on human behaviour should not be surprising given we know that our memory is malleable. Our recollection of original facts can be replaced with new, false ones. We also know conspiracy theories have a powerful appeal as they can help people make sense of events or issues they feel they have no control over.
This problem is complicated further by the personalisation algorithms underlying social media. These tend to feed us content consistent with our beliefs and clicking patterns, helping to strengthen the acceptance of misinformation. Someone who is sceptical about climate change might be given an increasing stream of content denying it is caused by humans, making them less likely to take personal action or vote to tackle the issue.
Further rapid advances in digital technologies will also ensure that misinformation arrives in unexpected formats and with varying levels of sophistication. Duplicating an official’s letterhead or strategically using key words to manipulate online search engines is the tip of the iceberg. The emergence of artificial intelligence-related developments such as DeepFakes – highly realistic doctored videos – is likely to make it a lot harder to spot misinformation.
So how do we tackle this problem? The challenge is made greater by the fact that simply providing corrective scientific information can reinforce people’s awareness of the falsehoods. We also have to overcome resistance from people’s ideological beliefs and biases.
Social media companies are trying to develop institutional mechanisms to contain the spread of misinformation. Responding to the new research, a YouTube spokesperson said: “Since this study was conducted in 2018, we’ve made hundreds of changes to our platform and the results of this study do not accurately reflect the way that YouTube works today … These changes have already reduced views from recommendations of this type of content by 50% in the US.”
Other companies have recruited fact checkers in large numbers, awarded research grants to academics (including myself) to study misinformation, and blocked search terms for topics where misinformation could have harmful health effects.
But the continuing prominence of scientific misinformation on social media suggests these measures are not enough. As a result, governments around the world are taking action, ranging from passing new legislation to ordering internet shutdowns, much to the ire of freedom-of-speech activists.
Scientists need to get involved
Another possible solution may be to hone people’s ability to think critically so they can tell the difference between actual scientific information and conspiracy theories. For example, a district in Kerala has launched a data literacy initiative across nearly 150 public schools, trying to empower children with the skills to differentiate between authentic and fake information. It’s early days, but there is already anecdotal evidence that this approach can make a difference.
Scientists also need to get more involved in the fight to make sure their work isn’t dismissed or misused, as in the case of terms like “geoengineering” being hijacked by YouTube climate deniers. Conspiracy theories ride on the appeal of certainties – however fake – whereas uncertainty is inherent to the scientific process. But in the case of the scientific consensus on climate change, which sees up to 99% of climate scientists agreeing that humans are responsible, we have something as close to certainty as science comes.
Scientists need to make the most of this agreement and communicate with the public using innovative and persuasive strategies. This includes creating social media content of their own, not only to shift beliefs but also to influence behaviours. Otherwise, their voices, however highly trusted, will continue to be drowned out by the frequency and ferocity of content produced by those with no concrete evidence.
This article was originally published on The Conversation. You can read it here.
Northumbria is a research-rich, business-focused, professional university with a global reputation for academic excellence. To find out more about our courses go to www.northumbria.ac.uk
If you have a media enquiry please contact our Media and Communications team at media.communications@northumbria.ac.uk or call 0191 227 4604.