Covid-19 pandemic driving a ‘downgrading of social discourse’, researcher says


The Delta outbreak has unleashed a greater willingness to use derogatory and offensive slurs as part of public discourse, according to a University of Auckland research fellow.

WhatsApp, Facebook and Instagram apps on an Apple iPhone X smartphone screen close-up.

Photo: 123rf

The Disinformation Project lead Kate Hannah said terms like Nazism were being thrown around in a carefree manner.

But where does the apparent recent decline in standards stem from?

Those tuning into Covid-19 briefings on social media will find posts and videos accompanied by a flood of anti-vax sentiment.

Many of the comments come from new, anonymous accounts.

So could they be bots – in other words, software programs that imitate humans?

Hannah said that was unlikely.

“That is very much grassroots-driven,” she said.

“It’s organised via the platform Telegram usually by real individuals who are encouraging others to participate in this and obviously there’s a real sense of feedback.”

Those flooding the comments section could see the impact they were having in real time, providing a dopamine hit at a time when many felt quite low, she said.

However, it also had the side-effect of turning others off what could be important public health messaging.

But many of those behind the comments would genuinely hold their beliefs, Hannah said.

What had also been noteworthy was the coarsening of public debate since the August outbreak.

“We’ve really witnessed a downgrading of social discourse – so an acceptability of really vulgar, obscene, denigrating, rude, misogynistic, racist terminology just being used.”

Terms such as Nazism, communism and authoritarianism were casually being thrown around, including by those who should know better.

Last week a prominent member of the public service compared some of the country’s leading scientists to Nazi physician and war criminal Josef Mengele, and a National MP posted on Facebook: “Generations before us have fought the tyranny of socialism, now it’s our turn”.

Brainbox Institute director and researcher Tom Barraclough said it was important to remember behind the comments were real people.

“I think there’s real risk when we’re talking about stuff like this, that we see everything as being an intentional disinformation campaign to do harm. When it’s actually much more likely that there’s shades of grey through all of this and for many people, pushing political messages online in a very loud and co-ordinated way, it may actually be that they don’t have any sort of coherent political ideology.”

Stepping away from the computer and reaching out to people was key.

“Engage in conversations with people and try to understand where they are coming from and treat them as if their concerns are sincerely held and are coming from a place of pro-social concern,” Barraclough said.

But what could not be ignored was the minority of nefarious bad-faith actors trying to drive such debate, he said.

Social media platforms had more work to do to address the discourse they were allowing, but no one should expect or want them to act as the arbiters of free speech, he said.

Hannah said while bots were not behind the torrent of comments on Facebook livestreams, they were still playing a role in proliferating misinformation.

“We did observe a couple of weeks ago that a petition that was started by a New Zealand-based organisation on Change.org around vaccine mandates very quickly got picked up by different foreign language Twitter accounts who were retweeting it at volume, which really did suggest a bot farm.”

University of Auckland Business School associate professor Arvind Tripathi said misinformation was not new as rumour and innuendo had always existed.

But social media had allowed for misinformation on steroids, with the platforms' own algorithms facilitating that, he said.

“So with the social media algorithm the idea is that the things that you like or comment on or that you show you have a preference for, they will bombard you with more of those things,” Tripathi said.

Barraclough put it another way: “People need to understand that what they’re seeing on social media platforms like Facebook, Twitter and YouTube is not a representative picture of what everyone is saying or thinking.

“The platforms use AI recommender systems that mean that they will mainly show you what they think you want to see.

“Social media isn’t like a window to look outside, it’s more like having someone bring you things they think you want to see. That’s crucial to understand when it comes to navigating what you see online.”
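The engagement-driven amplification Tripathi and Barraclough describe can be illustrated with a toy example. This is a minimal sketch of engagement-weighted ranking, not any platform's actual system; the topic labels and weighting scheme here are purely hypothetical:

```python
from collections import Counter

def rank_feed(posts, engagement_history):
    """Score each post by how often the user engaged with its topic.

    posts: list of (post_id, topic) tuples.
    engagement_history: list of topics the user previously liked or commented on.
    Returns the posts sorted so the most-engaged topics come first.
    """
    # Count past engagements per topic; the counts act as preference weights.
    weights = Counter(engagement_history)
    # Posts on topics the user interacted with most rank highest, so the
    # feed shows more of whatever the user already responds to.
    return sorted(posts, key=lambda post: weights[post[1]], reverse=True)

# A user whose history is dominated by one topic sees that topic ranked first.
history = ["topic_x", "topic_x", "topic_y"]
feed = rank_feed(
    [("a", "topic_z"), ("b", "topic_x"), ("c", "topic_y")],
    history,
)
```

Even in this simplified form, the feedback loop is visible: every like or comment tilts the ranking further toward the same material, which is why the feed is not a representative sample of what everyone is saying.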
