'No sense to them': Tackling COVID-19 vaccine misinformation on social media

Growing misinformation

The scale of the problem is evident in Facebook data showing it removed more than 12 million pieces of content on Facebook and Instagram (which it also owns) between March and October this year for containing misinformation that may lead to imminent physical harm, such as content relating to fake preventative measures or exaggerated cures.

During the same period the social media giant displayed warnings on about 167 million pieces of content on Facebook. The warnings are based on articles written by its fact-checking partners.

The recent approval of COVID-19 vaccines has led social media users to focus on misinformation about the vaccines' efficacy and safety.

This includes misinformation about side-effects as well as conspiracy theories, such as the unfounded claims that the vaccines are being used to insert microchips into people and that Microsoft founder and philanthropist Bill Gates is spreading COVID-19 to profit from the vaccine.

It's given a boost to existing anti-vaccination sentiment. The Age and The Sydney Morning Herald tracked three anti-vaccination Facebook groups in Australia using the Facebook-owned tool CrowdTangle and found that over the last 12 months the groups have recorded 22,000 likes, a growth of 57 per cent.

The social media platforms know the surfacing of misinformation is a problem and are scrambling to tackle the issue.

Facebook has made anti-vaccination groups, such as those tracked by The Age and The Sydney Morning Herald, difficult to find by removing them from search results on the platform.

It's a practice known as "shadow banning", where the platforms try to limit the spread of misinformation about COVID-19 online by making content more difficult to find rather than de-platforming it completely.

The platforms have also restricted the use of hashtags used by misinformation spreaders, such as #covidisahoax and #vaccinescauseautism.

If you type these hashtags into Facebook, TikTok or Instagram, all you get is a message advising that posts using the hashtag have been "temporarily hidden as some content in those posts goes against our community standards".

Ninety-year-old Margaret Keenan became the first patient in the world to receive the Pfizer-BioNTech COVID-19 vaccine outside of a medical trial after it was approved by the UK regulator. Credit: Getty Images

Facebook this week started sending notifications to users who shared, commented on or liked posts which contain misinformation about COVID-19. The notifications provide these users with links to trustworthy sources on the virus.

"Our position on vaccine misinformation is clear: we remove false claims about the safety, efficacy, ingredients or side-effects of the vaccines, including conspiracy theories, and continue to remove COVID-19 misinformation that could lead to imminent physical harm," says Josh Machin, head of public policy for Facebook Australia.

Twitter announced on Friday that when someone in Australia searches for certain keywords associated with vaccines on its platform, a prompt will direct them to the Department of Health's information resources on vaccination and its Twitter account.

From Monday, Twitter will start removing the most harmful misinformation and begin to label tweets that contain potentially misleading information on the vaccines, in the same way it labels political tweets that are factually incorrect.

TikTok also released new guidelines this week detailing how users will be directed to relevant and trusted information from public health experts when they search for COVID-19 misinformation.

TikTok's head of trust and safety for the Asia-Pacific region, Arjun Narayan, says misinformation in itself is not new and has existed through the ages.

"It's just a given everything is on social media these days, everything is digital, so a lot of the societal fault lines now manifest on social media," he says. "Misinformation survives and thrives in an information vacuum and the best antidote ... is countering that with accurate information."

Narayan says TikTok's proactive detection algorithms and its team of over 1000 content moderators around the world also remove misinformation about COVID-19 from the video platform in Australia.

"Any medical misinformation which poses a threat to public interest, which creates a health hazard, we do not allow for that kind of content on the platform," he says. "So when it comes to dangerous conspiracy theories, we have zero tolerance for that."

The social media platforms' efforts are not a simple matter of altruism. The British government has announced it will introduce laws next year under which Facebook, Instagram, Twitter and TikTok will be fined more than £18 million ($31.6 million) if they allow users to post child exploitation material, terrorist content or anti-vaccination disinformation.

Disinformation is distinguished from misinformation in that it is spread with deliberate intent to deceive.

In Australia different agencies deal with each of these issues, with the eSafety Commissioner having regulatory oversight over cyber-bullying material, image-based abuse and child exploitation material, while the Therapeutic Goods Administration has powers to take action against illegal advertising of therapeutic products.

"There is currently no broad regulation of online misinformation in Australia," a spokesperson for the Australian Communications and Media Authority (ACMA) says.

The government has asked the ACMA to oversee the development of a voluntary code of practice on disinformation and news quality for digital platforms but it is not expected to be in place until next year.


Dr Belinda Barnet, a senior lecturer at Swinburne University of Technology, says the social media platforms need to do more.

"If a piece of content containing information about vaccination, for example, starts to go viral it needs to be immediately fact-checked," she says. "It's in their capacity to do this: they know immediately which content is going viral and has been shared a thousand times."

Barnet says misinformation is increasing in Australia and shadow banning is limited in its efficacy.

"The people it doesn't catch, this particular policy, is the people that already belong to these groups, so if you are already part of an anti-vaccination group, you can immediately see this [misinformation] content and any content related to it," she says.

Barnet is also concerned the platforms' strategy does not prevent high-profile social media users spreading misinformation, such as celebrity chef and anti-vaxxer Pete Evans, who suggested sunlight could be the best vaccine, and politician Mark Latham, who last week posted on Twitter that the University of Queensland's COVID-19 vaccine was deliberately implanted with the HIV-AIDS virus.

"We'll have a problem on our hands, not as big as America, but as the government has already pointed out when it comes to the rollout of the vaccination, there will be people who believe this misinformation," Barnet says.

The risk is that by drawing attention to misinformation and calling it out, we unintentionally amplify something that might otherwise go unnoticed and ignored.

Associate Professor Adam Dunn at the University of Sydney has been studying misinformation on social media related to vaccinations for the past five years and published research in July in the American Journal of Public Health looking at 21.7 million vaccine-related tweets.

The research found that for typical Twitter users, the vast majority of the content they see or engage with is not critical of vaccination or promoting misinformation. Only about 5 per cent of social media users belong to communities where vaccine-critical content is more common, and the tiniest fraction of users are posting or passing along vaccine-critical content.

"Misinformation makes up a tiny proportion of what most people see, so it seems a massive stretch to suggest that it could be changing their beliefs and decisions," Dunn says. "We're worrying too much about people being anti-vaccine; what we need to worry about first is making sure that everybody who needs access to the vaccines has access to the vaccines."

However, Mrozinski believes it is important misinformation is called out and limited in its reach: "By the time it is being seen by hundreds of millions of people the damage has been done, that's how the word spreads."

Mrozinski admits it is at times a hassle, as he and other health professionals are subjected to onslaughts from anti-vaxxers when they post on social media, but he is determined to continue.

"People who are against things always seem to shout louder and make the most noise," he says.


