A Loud Minority Darkens the World: The Problem of "Numbers" vs. "Exposure" in Toxic Comments

December 18, 2025, 00:41

1) Feeling that "social media is too toxic" does not mean you are overly sensitive

Open your timeline and you find angry assertions, mockery, personal attacks, and sloppy conspiracy theories. Just a few minutes of browsing can drain your mood. When experiences like this pile up, you start to think, "The internet is beyond saving" or "Everyone has become aggressive."


However, let's take a moment to break the issue down.
What we see every day is the "total volume of posts" and the "exposure that catches our eye," not necessarily the actual "number of people" making such posts.


A study published in PNAS Nexus and covered by Phys.org suggests that this "estimate of the numbers" may be further off than we imagine. (Phys.org)



2) The study looked at "how many people are toxic," not "how much toxicity there is"

The research team surveyed 1,090 U.S. adults (recruited via CloudResearch Connect) and compared people's estimates of the "proportion of users making harmful posts" against actual platform data from existing large-scale studies. (Phys.org)


Here's the point.
The study doesn't deny that "social media appears toxic."
Rather, it asks whether the "feeling of toxicity" has quietly turned into the perception that "many people are toxic."



3) Shocking gap: 43% vs 3%, 47% vs 8.5%

The results were quite extreme.

  • Participants estimated that, on average, **43%** of Reddit users post "severely toxic comments." Existing platform data puts the actual figure at about **3%**. (Phys.org)

  • For Facebook, participants estimated that **47%** of users share "false news." Existing research puts the actual figure at about **8.5%**. (Phys.org)

In the study's terms, that is an overestimation of roughly 13-fold for Reddit and 5-fold for Facebook. (Phys.org)
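
As a quick sanity check, here is the arithmetic on the rounded figures quoted above, a minimal sketch only: the study's own ~13x and ~5x multipliers were presumably computed from unrounded estimates, so small differences from these rounded inputs are expected.

```python
# Overestimation factor = estimated share / measured share.
# Percentages are the rounded figures quoted in this article.
pairs = {
    "Reddit (severely toxic commenters)": (43.0, 3.0),
    "Facebook (false-news sharers)": (47.0, 8.5),
}
for label, (estimated, actual) in pairs.items():
    factor = estimated / actual
    print(f"{label}: {estimated}% vs {actual}% -> ~{factor:.1f}x overestimated")
```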


This is where the dynamic of "even a few can look like many if they are highly active" comes into play.
If the toxicity comes mostly from a "small number of heavy posters," the frequency of encounters skyrockets, and our brains estimate that "there are many such people."



4) The interesting point: even people who can identify toxicity still misjudge the numbers

The counterargument "isn't it just that people can't identify toxic content?" is a fair one. However, the study found that even when participants could reliably identify toxic posts in a signal detection task, the overestimation of **how many people post them** persisted. (Phys.org)


In other words, the issue is not just literacy (the ability to tell toxic from benign).
The human mechanism for estimating prevalence is itself prone to error in the social media environment.
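
To make "signal detection task" concrete, here is a minimal sketch with hypothetical numbers (not the study's data): sensitivity d' measures how well someone separates toxic from benign posts, and it is computed from hit and false-alarm rates, independently of how prevalent they believe toxic posters to be.

```python
from statistics import NormalDist

z = NormalDist().inv_cdf  # inverse of the standard normal CDF

# Hypothetical participant: flags 85% of truly toxic posts (hits)
# and wrongly flags 15% of benign posts (false alarms).
hit_rate = 0.85
false_alarm_rate = 0.15

d_prime = z(hit_rate) - z(false_alarm_rate)
print(f"d' = {d_prime:.2f}")  # ~2.07: clearly able to discriminate

# The study's point: high d' (good discrimination) can coexist with a
# badly inflated estimate of *how many users* produce toxic posts,
# because prevalence estimation is a separate judgment.
```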



5) Why a few appear to be "many": three mechanisms that distort perception

From here, let's organize things so the research findings connect with everyday experience.


Mechanism A: Bias in post volume (one person can look like dozens)
Most people do not post. Meanwhile, some post a great deal. If toxicity is concentrated in that group, the headcount may be small, yet the feed seems to carry "the same hostile atmosphere all the time."


Mechanism B: Algorithmic amplification of exposure
Anger, conflict, and strong assertions attract reactions easily. If what gets reactions is displayed more often, the "encounter rate" climbs even when the underlying proportion is low.


Mechanism C: Memory bias (strong discomfort is memorable)
A single piercing insult sticks in memory better than 100 ordinary interactions. As a result, the sample in our heads becomes biased, and we estimate "many."
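
These mechanisms compound, and a toy simulation makes that visible. This is a minimal sketch under assumed parameters: the 3% figure echoes the Reddit data above, but the posting rates and the engagement boost are invented for illustration.

```python
import random

random.seed(0)

N_USERS = 10_000
TOXIC_SHARE = 0.03      # 3% of users are toxic (the Reddit figure above)
POSTS_NORMAL = 1        # posts per day for a typical user (assumed)
POSTS_TOXIC = 15        # toxic heavy posters post far more (assumed)
ENGAGEMENT_BOOST = 3.0  # ranking weight for outrage-driven posts (assumed)

# Build the day's posts: (is_toxic, ranking_weight) per post.
posts = []
for i in range(N_USERS):
    toxic = i < N_USERS * TOXIC_SHARE
    count = POSTS_TOXIC if toxic else POSTS_NORMAL
    weight = ENGAGEMENT_BOOST if toxic else 1.0
    posts += [(toxic, weight)] * count

# Mechanism A alone: share of toxic posts by raw volume.
raw_share = sum(is_toxic for is_toxic, _ in posts) / len(posts)

# Mechanism B on top: a feed that samples posts in proportion to
# their engagement weight.
feed = random.choices(posts, weights=[w for _, w in posts], k=5_000)
feed_share = sum(is_toxic for is_toxic, _ in feed) / len(feed)

print(f"toxic users:           {TOXIC_SHARE:.0%}")   # 3%
print(f"toxic share of posts:  {raw_share:.0%}")     # ~32%
print(f"toxic share of a feed: {feed_share:.0%}")    # ~58%
```

With these invented numbers, 3% of users produce roughly a third of all posts and more than half of an engagement-weighted feed, which is exactly the gap between "how many people" and "how much exposure." Mechanism C then makes that biased sample even stickier in memory.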



6) Correcting misunderstandings can "actually" change mood and societal perception

More importantly, the study reported that when participants were told the actual proportions (that harmful posters are a minority),

  • they felt more positive,

  • their sense that "society is in moral decline" weakened,

  • and related misconceptions, such as "most people do not want less harmful content," also decreased. (Phys.org)


Discussions about social media often collapse into a binary of "should be regulated" vs. "freedom of expression." Before that debate, though, there is a mechanism by which **our assumptions (our estimates of the numbers) darken our view of society**. That is the core of this research.



7) Social media reactions: The "proportion" debate sparks both agreement and disagreement

This topic also sparked a lively debate on social media, dividing people into those who found it "convincing" and those who thought "that's not the point."


Reaction ① "Of course. A few hyperactive users shape the atmosphere"

On Hacker News, comments arguing that most bad content comes from a "small, highly toxic and active minority (plus bots)," and that moderation therefore works, gained support. (Hacker News)


On LinkedIn, comments also appeared along the lines of "The difference between 43% and 3% is huge. When one person posts a lot, it looks like dozens of different people are angry." (LinkedIn)


Reaction ② "The issue is 'exposure (reach),' not 'numbers'"

Meanwhile, on Hacker News, notable comments pointed out that "if a few nodes are connected to the entire network, the whole thing can appear toxic even if the proportion is low; what matters is how much gets pushed." (Hacker News)


In short, the counterargument is that the main issue is not "few or many," but rather the design of distribution and dissemination.


Reaction ③ "Isn't the definition too narrow?"

On Hacker News, there were also questions like "If the study's definition of 'toxicity' covers mainly blatant things like insults and threats, doesn't it fail to explain the overall 'bad atmosphere' we feel?" (Hacker News)


Reaction ④ "The discussion jumps to KYC and governance"

A typical tug-of-war has emerged here as well, with proposals like "reduce trolls with KYC (identity verification)" countered by concerns over privacy and abuse of power. (Hacker News)



8) The "realistic" implications this study suggests

This study is not saying "social media is fine." Even a small number of actors can cause significant harm, and misinformation can be fatal in certain domains.


However, at least the following implications are practical.

  • Individuals: do not equate what you happen to encounter with what the majority is doing (this alone reduces mental wear).

  • Platforms: rethink how to handle a small number of high-frequency accounts (visualize posting frequency, create less tiring reporting pathways, revise reach design).

  • Society: the very pessimism of "the internet is over" can breed silence, which in turn lets "toxic voices" stand out, a vicious cycle. Educational interventions that share the reality of the proportions might therefore be surprisingly effective. (Phys.org)


Reference Article

Online trolls are not as numerous as people think
Source: https://phys.org/news/2025-12-online-trolls-people.html
