- A recent study finds that misinformation about the new coronavirus spread differently across countries, but the misunderstanding of 5G technology was consistent.
- Among the search topics examined, the myth linking 5G to COVID-19 spread fastest.
- Dispelling myths and encouraging people to fact-check sources could help build trust with the public.
The year 2020 brought the COVID-19 pandemic as well as a pandemic of misinformation.
From the first reported case in Wuhan, China, scientists have worked around the clock to gather information about this new coronavirus. In a year, we have learned a lot about the structure of the new coronavirus, how it spreads, and ways to reduce transmission.
But with new information comes misinformation. There have been many potentially dangerous theories related to COVID-19, ranging from the new coronavirus being human-made to the idea that injecting bleach or other disinfectants could protect against infection.
With the coincidental rollout of 5G technology, rumors have also linked the new technology to the new coronavirus.
Factors behind the spread of misinformation
The COVID-19 pandemic resulted in widespread lockdowns across the world in 2020. With billions of people stuck at home, many turned increasingly to social media, which has played a pivotal role in the spread of misinformation.
According to an October 2020 study in Scientific Reports, some social media sites, such as Gab, have a far higher proportion of articles from questionable sources circulating than other platforms such as Reddit. Engagement with the content on social media platforms also varied, with Reddit users reducing the impact of unreliable information and Gab users amplifying its influence.
Not all misinformation is shared maliciously. A July 2020 modeling study in Telematics and Informatics found that people shared COVID-19 articles, even false ones, because they were trying to stay informed, help others stay informed, connect with others, or pass the time.
One particular social media platform, Twitter, has become a double-edged sword regarding coronavirus news. A 2020 commentary in the Canadian Journal of Emergency Medicine suggests that Twitter helps rapidly disseminate new information. Still, constant bad news can result in burnout, or push users to seek out more optimistic information that may be false.
But who is more likely to share articles from dubious sources? A 2016 study in PNAS found that like-minded individuals tend to share more articles with each other, but this can lead to polarized groups when article sharing involves conspiracy theories or science news.
Sharing articles containing inaccurate information was most common among conservatives and people over the age of 65 years, suggests a 2019 study in Science Advances that examined fake news surrounding the 2016 United States election.
To investigate how misinformation spreads worldwide, an international team of researchers explored what types of misinformation were more likely to be shared with others, and the patterns in how that misinformation spread. Their findings appear in the Journal of Medical Internet Research.
Common misinformation terms
Using the World Health Organization (WHO) website, researchers compiled a list of words falsely associated with causing, treating, or preventing COVID-19. The scientists also included “hydroxychloroquine,” even though it was not part of the WHO new coronavirus mythbuster page at the start of the study.
The authors focused on four misinformation topics that claimed:
- drinking alcohol, specifically wine, increases immunity to COVID-19
- sun exposure prevents the spread of COVID-19, or it is less likely to spread in hot, sunny areas
- home remedies may prevent or cure COVID-19
- COVID-19 spreads via 5G cellular networks
From December 2019 to October 2020, the team used Google Trends to track how frequently people searched for these terms in eight countries spanning five continents: Nigeria, Kenya, South Africa, the U.S., the United Kingdom, India, Australia, and Canada.
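The paper does not spell out the tooling behind these queries, but as a rough illustration of how weekly search interest can be pulled from Google Trends, the sketch below uses the unofficial pytrends Python library. The keyword, country code, and date range are illustrative assumptions, not the study's own query list.

```python
# Minimal sketch using the unofficial pytrends library (pip install pytrends).
# The keyword and country code are hypothetical examples, not the study's queries.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=0)

# Weekly search interest for a 5G/coronavirus term in the United Kingdom
# over roughly the study window (December 2019 to October 2020).
pytrends.build_payload(
    kw_list=["5G coronavirus"],
    timeframe="2019-12-01 2020-10-31",
    geo="GB",
)

interest = pytrends.interest_over_time()  # DataFrame, each term scaled 0 to 100
print(interest.head())
```

Google Trends reports relative interest, scaling each series so that its peak equals 100, which matters when comparing countries with very different absolute search volumes.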
5G myth spread fastest
The researchers observed that searches related to the new coronavirus and 5G started at different times but, in six of the countries, peaked during the same week beginning April 5. In the U.K. and South Africa, searches peaked a week earlier.
The volume of 5G-related searches also doubled faster than that of the other search terms.
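The paper quantifies this with its own statistical methods; purely to illustrate the idea, the hypothetical helper below estimates how many weeks a weekly interest series takes to double from its first nonzero value. It is not the authors' metric.

```python
import pandas as pd


def weeks_to_double(interest: pd.Series):
    """Rough illustration: weeks from the first nonzero interest value to the
    first week at which interest is at least twice that value, or None."""
    nonzero = interest[interest > 0]
    if nonzero.empty:
        return None
    start_week, start_value = nonzero.index[0], nonzero.iloc[0]
    doubled = nonzero[nonzero >= 2 * start_value]
    if doubled.empty:
        return None
    return (doubled.index[0] - start_week).days / 7


# Made-up weekly values: interest doubles within two weeks.
weeks = pd.date_range("2020-03-01", periods=5, freq="W")
print(weeks_to_double(pd.Series([0, 10, 15, 25, 60], index=weeks)))  # 2.0
```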
Searches for hydroxychloroquine displayed a unique pattern, with three distinct peaks. This was likely a reflection of the ongoing discussions over several months about the drug’s possible benefits.
Searches for ginger and coronavirus occurred in several countries, including the U.S., the U.K., Canada, Australia, and India, during the week of January 19, 2020.
The remaining countries showed no searches for these terms until February or March, and Nigeria registered no searches for ginger and coronavirus over two consecutive weeks. However, the authors note that this may reflect Google's scaling algorithm, which reports each term's popularity relative to its peak, rather than a genuine absence of searches in those weeks.
Searches about the sun's effect on the new coronavirus began during the week of January 19, 2020, in several countries, although Kenya did not show any searches on the topic until a month later. Compared with other countries, searches for coronavirus and the sun doubled more slowly in Canada.
Search trends for wine concerning the new coronavirus were inconsistent across countries. Scientists excluded Nigeria and Kenya from the analysis because of low search volumes.
The U.S. had the earliest searches, during the week of January 12, 2020, with a peak in April. The researchers noted no obvious clustering of peak weeks across countries, with peaks spread from March 15 to April 12.
“This study illustrates that neighboring countries can have different misinformation experiences related to similar topics, which can impact control of COVID-19 in these countries,” concluded the authors.
Limitations of the study
While the study tracked how often people searched for topics tied to new coronavirus misinformation, the researchers could not deduce whether people actually believed the misinformation. The authors suggest that further studies would be needed to determine why a person looked up a particular search term.
Other limitations include variable internet access across countries: the authors note that less than 10% of Nigeria's population has access, compared with more than 90% in the U.K. Another limitation was the choice of search terms, which may have excluded relevant content or included noise.
Lastly, the researchers point out that it would be helpful to know the characteristics of people who tend to share articles with inaccurate or false data. This could help in developing future intervention strategies.
“Although monitoring misinformation-seeking behavior via Google Trends is one pathway for identifying belief prevalence and trends, we should monitor information flow across multiple platforms including social media sites, such as Facebook, Twitter, and Instagram, and messaging apps such as WhatsApp.”
Strategies to reduce misinformation
Dispelling new coronavirus rumors is a collective effort.
A 2017 study in Science Communication found that the Centers for Disease Control and Prevention (CDC) reduced misinformation about the Zika virus when it corrected inaccurate information, and doing so did not harm the agency's credibility.
A July 2020 study in Progress in Disaster Science suggests fact-checking is critical for the average person to avoid misinformation and make informed health decisions.
Addressing common myths about COVID-19 may ultimately help the public trust medical experts, reduce the use of ineffective therapies, and potentially overcome vaccine hesitancy.