The More You Delete, the More It Spreads? The Risk Loop of Censorship and Dissemination: Crisis Management Techniques in the Platform Era

November 5, 2025, 00:26

Introduction: Why "Deleting" Fuels the Fire

Internet moderation is no longer a simple matter of "delete it and it's over." The act of deletion simultaneously creates a social fact of its own: the fact that something was deleted. When human psychology (the forbidden-fruit effect), network structure (propagation between clusters), and algorithms (topic boosting) overlap, deletion turns from suppression into an accelerator of spread. This is what this article calls the "hidden dynamics of censorship."


1. Three Amplification Loops

(1) Attention Loop
Deletion or warnings become signals that "something happened." Many people focus on "why it was deleted," leading to secondary reports and posts. On social media, the label "censored" itself holds spreading value.


(2) Solidarity Loop
Supporters of the content form a group the moment they feel it has been unfairly treated. The community evolves into a "restoration device" through external link collections, archive spreading, mirror distribution, and volunteer translations.


(3) Transboundary Loop
The tighter the restrictions in one place, the more activity migrates to another: the open web, messengers, video sites, overseas social media, and even offline. This "externalization" of moderation may temporarily lower visibility, but it can actually increase total reach.


2. "Strength" Is Not Justice: The Concept of Proportionality

A common misconception is that "immediate deletion is best if it's dangerous." However, when considering the total damage (reach × credibility × duration), "excessive strength" tends to be counterproductive.

  • Labeling (warnings and source presentation): Maintain visibility while adjusting the recipient's trust level.

  • Downgrade (suppress search and recommendations): Quietly lower discoverability to avoid triggering a backlash.

  • Time-lag Response: Process gradually once the most "flammable" moment has passed.

  • Co-presence of Counter-narratives: Instead of deleting, place evidence and counterarguments side by side to lower the recipient's decision-making cost.

Conclusion: The optimal solution is not to "delete strongly" but to "handle wisely." Proportionality, gradation, and accountability are key; a minimal sketch of such an intervention ladder follows.
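To make the proportionality idea concrete, the ladder above can be expressed as a graded choice of the weakest intervention expected to achieve the goal. The sketch below is illustrative only; the action names, harm scores, and thresholds are assumptions, not the article's or any platform's actual policy.

```python
from enum import Enum

class Action(Enum):
    COUNTER_NARRATIVE = "place context and counterarguments alongside the post"
    LABEL = "attach a warning label and link to sources"
    DOWNRANK = "quietly suppress search and recommendation visibility"
    DELAYED_REVIEW = "queue for re-review after the peak attention window"
    REMOVE = "remove the content and notify the author"

def choose_action(harm_score: float, confidence: float) -> Action:
    """Pick the weakest intervention expected to achieve the goal.

    harm_score: estimated severity of harm, 0.0 to 1.0 (assumed input)
    confidence: certainty of the violation judgment, 0.0 to 1.0 (assumed input)
    All thresholds are illustrative, not calibrated values.
    """
    if harm_score >= 0.9 and confidence >= 0.8:
        return Action.REMOVE           # clear, urgent harm: block immediately
    if harm_score >= 0.6:
        return Action.DOWNRANK         # lower discoverability without a loud signal
    if confidence < 0.5:
        return Action.DELAYED_REVIEW   # uncertain: wait out the most flammable moment
    if harm_score >= 0.3:
        return Action.LABEL            # keep it visible, adjust the reader's trust level
    return Action.COUNTER_NARRATIVE    # lowest rung: add context only

print(choose_action(harm_score=0.4, confidence=0.7).value)
```

In practice the thresholds would be tuned per domain and re-checked against the indicators discussed in section 6.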


3. Transparency Is "Troublesome" but Cost-effective

Transparency incurs costs. Creating notices, presenting evidence, handling appeals, publishing records... Nonetheless, the cost of "opacity" is even higher. Opacity breeds suspicions of "arbitrariness," "political motives," and "shadow banning," leading to chronic platform distrust.
As a minimal setup, the following is recommended; a data-structure sketch follows the list.

  • Template for Grounds: Prepare templates for terms, violation patterns, and judgment grounds.

  • Visible Appeal Process: Clearly indicate a concise pathway and SLA guidelines.

  • Knowledge for Prevention: Publish FAQs and case studies so that what has been learned stays visible.
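One way to realize the "template for grounds" and a visible appeal path is to treat every notice as a small structured record. The field names, example values, and the seven-day SLA below are hypothetical, offered only as a sketch of what such a template could carry.

```python
from dataclasses import dataclass, field
from datetime import timedelta

@dataclass
class ModerationNotice:
    """A minimal, user-facing record of why an action was taken (field names are hypothetical)."""
    action: str                # e.g. "label", "downrank", "remove"
    policy_clause: str         # which term or guideline was applied
    violation_pattern: str     # the recognized pattern, in plain language
    evidence_summary: str      # what was observed, without exposing reporters
    appeal_url: str            # one clear path to contest the decision
    appeal_sla: timedelta = field(default=timedelta(days=7))  # response-time guideline

notice = ModerationNotice(
    action="downrank",
    policy_clause="Community Guidelines 4.2 (misleading health claims)",  # hypothetical clause
    violation_pattern="unverified claim presented as established fact",
    evidence_summary="the three cited sources could not be located",
    appeal_url="https://example.com/appeals/12345",                       # hypothetical URL
)
print(notice.appeal_sla.days, "day appeal SLA")
```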


4. Reactions on Social Media (Typical Trends)

Observing discussions on social media around this theme, the following "patterns" appear repeatedly. The distribution of reactions is summarized below without quoting individual posters or specific posts.

  • Tech/Developer Community: "Without disclosure of algorithmic deduction criteria and training data, there's no reproducibility. Transparency in A/B testing is needed."

  • Creators/Streamers: "Measures to 'quietly make it invisible' are the hardest. Without explanations, improvements can't be made."

  • Researchers/OSINT Community: "Deletion hinders the preservation of primary information. A 'record frame' for verification purposes is needed."

  • Civil Rights Advocates/Lawyers: "While protecting freedom of expression, harm prevention is also necessary. Design a balance of proportionality and transparency."

  • Political/Social Activists: "Is only information inconvenient to those in power being deleted? Auditing and appeal channels are essential."

  • Platform Practitioners: "There is no perfect answer. We can only aim for 'minimizing total cost' of misjudgment risk and neglect risk."


Overall, there is strong opposition to "lack of explanation" and "excessive strength." Meanwhile, in cases of clear harm such as incitement to violence or fraud, there is some understanding of swift blocking. In other words, what is required is a prescription differentiated by domain and risk.


5. Learning "Handling" Through Cases

  • Misinformation (Uncertain Truth, Rapid Spread): Instead of immediate deletion, first provide context and suppress discoverability. Re-evaluate once primary information is confirmed.

  • Hate/Harassment: Prioritize protection of the parties involved. Immediate deletion, account sanctions, and contact channels. Keep explanations concise.

  • Crime Promotion/Self-harm: High urgency. Block based on terms and laws, report, and provide resource guidance.

  • Political Discourse: Emphasize proportionality. Place counterarguments and fact-checks side by side, with transparent labeling.

  • Copyright: Make the notice and counter-notice procedure visible, and include safeguards against abuse. (A playbook sketch covering these cases follows this list.)
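These prescriptions can be kept consistent by encoding them as a lookup table keyed by case type, so reviewers start from the same default and escalate deliberately. The categories and action labels below paraphrase the list above and are illustrative, not a shipped policy.

```python
# Each playbook entry: default action, whether counter-context is shown, and urgency.
# Categories and labels paraphrase the cases above; they are illustrative only.
PLAYBOOKS = {
    "misinformation":   {"default": "label_and_downrank",  "counter_context": True,  "urgent": False},
    "hate_harassment":  {"default": "remove_and_sanction", "counter_context": False, "urgent": True},
    "crime_self_harm":  {"default": "remove_report_refer", "counter_context": False, "urgent": True},
    "political_speech": {"default": "label_only",          "counter_context": True,  "urgent": False},
    "copyright":        {"default": "notice_and_takedown", "counter_context": False, "urgent": False},
}

def playbook_for(case_type: str) -> dict:
    """Fall back to the gentlest handling when the case type is unknown."""
    return PLAYBOOKS.get(case_type, {"default": "label_only", "counter_context": True, "urgent": False})

print(playbook_for("misinformation"))
```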


6. Implementing: Key Points in Product Design

  • Friction Design: Slightly increase the "cost" of spreading (e.g., confirmation dialog before resharing).

  • Observation Indicators (KPIs); a computation sketch follows this list

    • Backfire Ratio: Ratio of positive to negative mentions after a response

    • Controversy Half-life: Half-life of the topic

    • Migration Rate: Rate of movement to other platforms

    • Appeal Turnaround: Average processing time for appeals

  • Experiment Culture: Always start rollouts on a small scale. Share lessons learned from failures in public memos.

  • Human × AI Hybrid: Pipeline of automatic detection → human secondary review → explanation generation.
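The four indicators can be computed from routine moderation logs. The functions below are a minimal sketch under assumed input shapes (sentiment-tagged mentions, hourly mention counts, and appeal timestamps); they are not a reference implementation of any particular platform's metrics.

```python
from datetime import datetime
from typing import List, Tuple

def backfire_ratio(mentions: List[str]) -> float:
    """Ratio of positive to negative mentions observed after an intervention."""
    pos = sum(1 for m in mentions if m == "positive")
    neg = sum(1 for m in mentions if m == "negative")
    return pos / neg if neg else float("inf")

def controversy_half_life(hourly_counts: List[int]) -> int:
    """Hours from the peak until mention volume first falls to half the peak."""
    if not hourly_counts:
        return 0
    peak_idx = max(range(len(hourly_counts)), key=lambda i: hourly_counts[i])
    peak = hourly_counts[peak_idx]
    for hours_after, count in enumerate(hourly_counts[peak_idx:]):
        if count <= peak / 2:
            return hours_after
    return len(hourly_counts) - peak_idx  # the topic has not decayed yet

def migration_rate(offsite_links: int, follow_up_posts: int) -> float:
    """Share of follow-up posts that direct readers to other platforms."""
    return offsite_links / follow_up_posts if follow_up_posts else 0.0

def appeal_turnaround(appeals: List[Tuple[datetime, datetime]]) -> float:
    """Average hours between appeal submission and resolution."""
    if not appeals:
        return 0.0
    hours = [(done - opened).total_seconds() / 3600 for opened, done in appeals]
    return sum(hours) / len(hours)
```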


7. When Deletion Is Still Necessary

In urgent, high-risk scenarios, do not hesitate to block immediately. The key points are the after-the-fact explanation and archive management. Where possible, secure evidence for auditing, notify the parties involved, and present prevention measures at the same time. How the "aftermath" of a deletion is handled is what prevents the next backlash.
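For the "aftermath" work, one lightweight pattern is to write an append-only audit entry at removal time that records the grounds, a fingerprint of the archived evidence, and whether the author was notified. The structure and hash choice below are assumptions, not a mandated format.

```python
import hashlib
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class RemovalAuditEntry:
    """One append-only record per removal; field names are hypothetical."""
    content_id: str
    policy_clause: str        # grounds cited in the removal notice
    evidence_sha256: str      # fingerprint of the archived evidence, not the content itself
    author_notified: bool     # whether the post-removal explanation was sent
    prevention_note: str      # what was communicated to help avoid repeat violations
    removed_at: str           # ISO 8601 timestamp, UTC

def record_removal(content_id: str, evidence: bytes,
                   policy_clause: str, prevention_note: str) -> str:
    entry = RemovalAuditEntry(
        content_id=content_id,
        policy_clause=policy_clause,
        evidence_sha256=hashlib.sha256(evidence).hexdigest(),
        author_notified=True,
        prevention_note=prevention_note,
        removed_at=datetime.now(timezone.utc).isoformat(),
    )
    return json.dumps(asdict(entry))  # append this line to a write-once audit log

print(record_removal("post-123", b"archived snapshot bytes",
                     "ToS 7.1 (incitement)", "link to the relevant policy FAQ"))
```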


Conclusion: Designing the Way Things Are Seen

Censorship and moderation are not a binary choice between "delete" and "keep." They are a comprehensive craft of designing how things are seen: visibility, context, timing, and explanation. Intelligence over strength, dialogue over silence, predictability over opacity. Only when these hidden dynamics are turned into allies will total harm decrease and trust be restored.



Appendix: Operational Checklist (Shortened Version)

  • Objective: Is it clear what is being protected and what is being reduced (is "harm" defined)?

  • Strength: Can the objective be achieved with minimal intervention?

  • Transparency: Are grounds, notices, and appeals prepared?

  • Timing: Is the design able to avoid the most flammable moments?

  • Context: Is it possible to place educational labels or counter-narratives?

  • Evaluation: Is the Backfire Ratio being monitored?

  • Learning: Are failure cases being published and shared?


Reference Article

When Raising Your Voice Feels Risky: New Research Unveils the Hidden Dynamics of Self-Censorship
Source: https://phys.org/news/2025-10-risky-hidden-dynamics-censorship.html
