
Networks, Feeds & the Infodemic: Virality, Distrust, and the Social Web

  • Writer: VeroVeri
  • Jun 10
  • 3 min read

Updated: Jul 8

Part 3 of The Evolution of Misinformation

Timeline of social-media misinformation: 2004 Facebook launch, 2009 retweet button, 2018 viral post, 2020 infodemic globe, VeroVeri shield.
Brand icons shown are for illustration only. No affiliation or endorsement is implied.
September 23, 2020: Speaking from Geneva, the World Health Organization warned that COVID-19 was riding alongside an “infodemic”: a flood of true and false claims moving through social feeds “faster and further” than the virus itself. It was the same alarm bell TV critics rang decades earlier, but this time, every phone scroll added to the noise. Years of instantaneous sharing had flattened our reflex to pause at shocking information; the once-jarring question “Can that be real?” now floated past with a weary swipe.

When the Algorithm Became Editor

Facebook’s News Feed (2006) and Twitter’s Retweet button (2009) replaced chronology with engagement. A 2018 study by three MIT scholars, published in Science, examined roughly 126,000 Twitter cascades and found that false news travels “farther, faster, deeper, and more broadly” than the truth, by an order of magnitude in some categories.


The shift to engagement-based ranking meant virality was rewarded regardless of accuracy. Infinite scroll made consumption seamless. The old tension between speed and scrutiny? Gone. Speed won.


Desensitization thread: What once shocked us into attention now simply scrolls by.

Echo Chambers & the Slow Leak of Trust

By 2021, Pew Research reported that 48% of U.S. adults said they “sometimes” or “often” got their news from social media. Meanwhile, Gallup reported that trust in mass media was stuck at 34%, under the headline “Americans’ Trust in Media Remains Near Record Low.”

Social platforms had become a dominant gateway to information, yet the trust floor was collapsing. Mass repetition made even accurate updates feel suspicious. People stopped trusting individual claims and began to distrust the entire process.


Desensitization thread: Repetition breeds fatigue. Fatigue turns to cynicism.

Supply Meets Demand: Conspiracy Uptake

In 2021, a study published in Nature Human Behaviour found that just a single exposure to COVID-19 vaccine misinformation could reduce vaccination intent by 6 to 8 percentage points.


Every share made disbelief more socially acceptable. The mechanics of social media (likes, comments, shares) and human nature rewarded emotionally charged falsehoods. Misinformation no longer needed to fool the system. It was the system.


Desensitization thread: With each click, doubt becomes more familiar than trust.

Case Vignette: 5G Arson Rumors

In April 2020, the British newspaper The Guardian reported on conspiracy theories linking 5G antenna towers to the COVID-19 pandemic. The theories spread rapidly across Facebook and YouTube. A false post encouraging people to act “before it’s too late” reached millions within 48 hours. By mid-April, UK police had recorded at least 40 attacks on towers, including one that served an NHS COVID-19 hospital.


Desensitization thread: A video of burning cell towers doesn’t shock; it validates a fear.

The COVID-19 Infodemic

During the early phase of the pandemic, WHO’s EPI‑WIN team documented a rapidly growing number of false claims circulating online, often outpacing official guidance.

False claims consistently outpaced corrections. Platforms removed many, but often after they had already gone viral. For some audiences, the corrections never arrived.


Desensitization thread: The question isn’t “Is this true?” but rather “Is anything true enough to matter?”

Why the Platform Era Matters

Visuals once carried authority. Speed once impressed. Now, volume dominates. Each post is just another tile in a wall of noise. Trust erodes, not because people are careless, but because even careful people can’t keep up.

Built on our VALID™ Framework, the VeroVeri information audit screens out weak signals, allowing only substantiated, traceable content to reach your stakeholders.

Next: Synthetic Words, Synthetic Worlds (2022–Today)

How large language models and deepfakes blur reality, and why proactive, machine-assisted/human-driven verification is now essential.
