America's growing fake news problem on social media, in …

Posted: February 21, 2022 at 5:50 pm

America's fake news problem is getting worse, not better.

According to an analysis released by NewsGuard and first reported by Axios's Sara Fischer on Tuesday, websites that provide unreliable news increased their share of social media interactions this year. In 2019, 8 percent of engagement with the 100 top-performing news sources on social media went to dubious sources. In 2020, that number more than doubled to 17 percent.

NewsGuard, which rates news websites according to reliability, found that people are engaging with a lot more news this year than they were last year. Engagement with the top 100 US news sources (meaning likes, shares, and comments on Facebook and Twitter) went from 8.6 billion reactions to 16.3 billion reactions between 2019 and 2020. That makes sense given, well, everything that has happened in 2020. There has been a lot of news, and due to pandemic-related factors such as unemployment and lockdowns, people have a lot of time on their hands to read stuff online.

But an increasing amount of the news people are seeing is problematic, inaccurate, or suspicious. And that's something to worry about. The analysis found that the Daily Wire, the outlet founded by right-wing commentator Ben Shapiro, saw 2.5 times more interactions this year than last.

The blossoming of false and unreliable news on the internet is a cultural, political, and technological phenomenon that's hard to get your head around, let alone tackle. Conspiracy theories, misinformation, and disinformation run rampant on the internet, and it's often difficult for people to tell what is true and what's not. Social media companies are not exactly doing a bang-up job of addressing the problem, either.

Right-wing content, in particular, thrives on platforms such as Facebook. But just because someone sees certain content doesn't necessarily mean they are particularly influenced by it, and figuring out just how powerful certain messages are can be complicated. Over the summer, Kevin Roose at the New York Times reported on what he described as a parallel media universe of super-conservative content on Facebook, noting that right-leaning pages and posts on the platform consistently get more interactions and shares than more liberal and mainstream ones. (Though just because someone likes a news post doesn't mean they actually read it.)

As Recode's Rebecca Heilweil pointed out at the time, it's hard to know what's happening on Facebook just by engagement:

There's now a running debate among academics, analytics experts, and observers like Roose around what we know about what's happening on Facebook and why. Dartmouth political scientist Brendan Nyhan recently argued that likes, comments, and shares are just a small part of what people actually see on Facebook, and that it's difficult to draw conclusions from these interactions alone or to know what they might mean for an election.

Still, the trend is concerning. Social media is making political polarization worse in America, and it's often the case that people no longer agree on even basic facts. What people consume shapes what they see: basically, someone clicks on a certain article, and algorithms start to predict what else they might like in alignment with that. And the further down the rabbit hole they go, the more they begin to seek out that media, often winding up in an information bubble.
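The feedback loop described above can be illustrated with a minimal sketch. This is a hypothetical toy recommender, not any platform's actual algorithm: it simply boosts articles whose topic the user has already clicked on, so each click narrows what gets surfaced next.

```python
# Toy illustration (not a real platform's algorithm): an engagement-based
# recommender that ranks articles by how often the user clicked the same
# topic, producing the "information bubble" effect described above.
from collections import Counter

def recommend(articles, click_history, top_n=3):
    """Rank articles by prior engagement with their topic."""
    topic_weight = Counter(a["topic"] for a in click_history)
    # Topics the user never clicked score 0 and sink to the bottom.
    ranked = sorted(articles, key=lambda a: topic_weight[a["topic"]],
                    reverse=True)
    return ranked[:top_n]

articles = [
    {"title": "Election claims", "topic": "politics"},
    {"title": "Vaccine rumor", "topic": "health"},
    {"title": "Local weather", "topic": "weather"},
    {"title": "More election claims", "topic": "politics"},
]

# After two clicks on politics stories, politics dominates the feed.
clicks = [articles[0], articles[3]]
feed = recommend(articles, clicks)
print([a["topic"] for a in feed])  # ['politics', 'politics', 'health']
```

Real ranking systems weigh many more signals, but the core dynamic is the same: optimizing for predicted engagement tends to reinforce whatever the user already engages with.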

Republicans have spent years complaining that social media companies are biased against them and that their content is being censored and removed. President Donald Trump has often lashed out against tech companies with unfounded claims of bias. He and his administration have also attempted to undercut and scrap Section 230, a law that basically says social media companies are allowed to police their platforms however they want and aren't liable for the content third parties post on them. (Recode's Sara Morrison has a full explainer on Section 230.)

Rather than bias toward a certain political leaning, social media algorithms are often biased toward outrage: they push content that people have an emotional reaction to and are likely to engage with. The NewsGuard data and other research show that people are increasingly being drawn to unreliable content, often unreliable content with a conservative bent. And that content can influence all sorts of attitudes and cause confusion on even basic facts.

The New York Times recently took a look at Georgia and how misinformation and unreliable news are playing a role in the US Senate runoffs there. A conservative local news network called Star News Group announced it would launch the Georgia Star in November, and NewsGuard's analysis found that the website has published misleading information about the presidential election and the Senate races. One story making false claims about Georgia's presidential election results reached up to 650,000 people on Facebook.

Combating fake and misleading news would require efforts from multiple stakeholders. Yet Facebook recently rolled back changes to its algorithm that promoted news from reliable sources. Given the pace at which the problem is growing, the matter is likely to worsen without intervention.
