dark forest · internet culture · private communities

The Dark Forest Theory of the Internet: Why We Went Quiet

kndred Team

The Public Square Is Empty

Something happened to the internet, and if you have been online long enough, you can feel it even if you cannot name it. The public internet — the one made of open blogs, forum threads, comment sections, and personal websites — has gone quiet. Not because people stopped having things to say, but because they stopped saying them in public.

In 2019, the writer and Kickstarter co-founder Yancey Strickler published an essay called "The Dark Forest Theory of the Internet." The idea, borrowed from The Dark Forest, the second novel in Liu Cixin's Three-Body trilogy, goes like this: in the dark forest of the universe, any civilization that reveals itself is immediately destroyed. So all intelligent life stays silent. Strickler argued that the same dynamic was playing out online. The public internet had become so hostile — so overrun with bots, trolls, grifters, engagement farmers, surveillance systems, and bad-faith actors — that anyone with something genuine to say had retreated into private channels.

Real conversation moved to group chats, Discord servers, private Slack communities, Substacks with paid subscribers, and encrypted messaging apps. The public timeline became a wasteland of performative content, ragebait, and AI-generated noise. The people with the most interesting things to say went dark.

How the Internet Became Hostile to Honesty

This did not happen overnight. It was the result of several converging forces that made public speech steadily more dangerous and less rewarding.

Bots and spam scaled beyond recognition. By some estimates, nearly half of all internet traffic is now generated by bots. On social media, the proportion is staggering. When you post something publicly, the majority of your "engagement" may not even come from humans. The signal-to-noise ratio collapsed, and with it, the incentive to contribute signal.

Context collapse made vulnerability lethal. danah boyd and others have written extensively about context collapse — the phenomenon where content intended for one audience is suddenly exposed to every audience simultaneously. A joke you made to your friends becomes a scandal when it reaches strangers. An honest, nuanced take becomes a cancellation target when stripped of context. People learned, often painfully, that anything you say publicly can and will be used against you. The rational response is to say nothing, or to say only the safest possible thing.

Surveillance became the business model. Every public post is harvested, analyzed, and monetized. Your words train AI models. Your opinions build advertising profiles. Your vulnerabilities become data points. The entire public internet became a surveillance apparatus, and people who valued their privacy — or simply did not want to be the product — withdrew.

Outrage economies punished good faith. Algorithmic amplification systematically rewarded the most extreme, provocative, and inflammatory content. Good-faith discussion, nuance, and genuine vulnerability were structurally disadvantaged. The people who thrived in the public square were those willing to perform — to be louder, meaner, more reductive. The thoughtful left.

The Great Retreat

The evidence of this retreat is everywhere. Blog comment sections were shut down across nearly every major publication. By some counts, personal blogging declined by roughly 40% between 2014 and 2020. Facebook Groups, once thriving public communities, shifted heavily toward private and secret groups. Discord went from a gaming voice chat tool to the default platform for every niche community on the internet, specifically because it is private and invitation-only.

Even on the platforms that remain public, the behavior has changed. People curate aggressively. They post less frequently. They share less personal content. A 2022 study from Pew Research Center found that the majority of tweets come from a tiny minority of highly active users, while most people either lurk or have abandoned the platform entirely. The same pattern appears across platforms. The public internet is dominated by a shrinking pool of power users and an expanding army of bots, while actual humans retreat to the shadows.

"The dark forest is a place of retreat. It's where we go to be ourselves, to think out loud, to be vulnerable without fear of attack. The tragedy is that these spaces are invisible to the broader internet. The best conversations are happening where no one can find them." — Yancey Strickler

The Problem with Going Fully Private

The retreat to private spaces is rational. It is also a loss. Fully private communities have their own failure modes:

Echo chambers by default. When you hand-pick your community, you tend to select for agreement. Private group chats and Discord servers often become insular, reinforcing existing views rather than challenging them. The diversity of thought that used to characterize the open internet is harder to find when every space is curated.

Discovery becomes impossible. On the open internet, you could stumble across a stranger's blog post and discover a mind that changed your life. In the dark forest, that stranger's thoughts are locked inside a Discord server you will never find. Serendipitous connection — one of the internet's original superpowers — withers when everything is private.

The barrier to entry is social, not intellectual. To join most private communities, you need to know someone who is already inside. This recreates the same social-graph gatekeeping that makes traditional social networks exclusionary. If you are new, if you are shy, if you do not already have a network, you are locked out of the best conversations.

Communities calcify. Without new members and new perspectives, private communities stagnate. The conversations become repetitive. The culture becomes self-referential. What started as a refuge from the toxic public internet becomes its own kind of echo chamber.

A Middle Ground: Semi-Private, Interest-Based Spaces

The dark forest theory identifies a real problem, but total privacy is not the answer. What the internet needs is a middle ground — spaces that are protected from the worst pathologies of the public internet but open enough to enable discovery and serendipity.

The characteristics of this middle ground look something like this:

Small by design. Not 100,000-member subreddits, but rooms of 10-50 people who share a genuine intellectual overlap. Small enough for real conversation. Small enough that everyone is a participant, not a lurker.

Organized by ideas, not identity. Instead of joining a community based on who you know, you join based on what you think about. This preserves the open internet's superpower — connecting strangers — while filtering for the kind of people you actually want to talk to.

Semi-private. Not locked behind invitation codes, but not visible to the entire internet either. You gain access by demonstrating genuine interest in the topic — through your own writing, ideas, or creative output — not by knowing the right people.

No audience mechanics. No follower counts, no likes, no retweets. Remove the performance incentives and you remove the trolls, the grifters, and the engagement farmers. When there is no audience to perform for, people default to honesty.

This is the model that platforms like kndred are exploring: concept-based chat rooms where membership is determined by the overlap between your genuine interests (as revealed by your own writing and notes) and the room's topic. You do not apply. You do not need an invitation. The platform reads what you have written, identifies the concepts you engage with deeply, and surfaces rooms where people are discussing those same concepts.
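The matching idea described above can be sketched as a simple set-overlap score. Everything here is illustrative: the concept sets, the Jaccard similarity measure, and the threshold are assumptions for the sake of the sketch, not kndred's actual algorithm.

```python
def jaccard(a: set[str], b: set[str]) -> float:
    """Jaccard similarity: size of the intersection over size of the union."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def match_rooms(user_concepts: set[str],
                rooms: dict[str, set[str]],
                threshold: float = 0.2) -> list[str]:
    """Return room names whose topic concepts overlap the user's, best first."""
    scored = [(jaccard(user_concepts, concepts), name)
              for name, concepts in rooms.items()]
    return [name for score, name in sorted(scored, reverse=True)
            if score >= threshold]

# Hypothetical concept sets extracted from a user's writing and two rooms.
user = {"dark forest", "privacy", "community design", "discovery"}
rooms = {
    "semi-private communities": {"privacy", "community design", "moderation"},
    "gardening": {"soil", "compost", "perennials"},
}
print(match_rooms(user, rooms))  # ['semi-private communities']
```

The point of the sketch is the shape of the filter: access follows from demonstrated overlap in ideas, not from an invitation graph.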

The Future of Online Community Is Not Open or Closed — It Is Filtered

The dark forest is a symptom of a design failure. The public internet was not built to handle the scale, hostility, and commercial exploitation that it now endures. But the answer is not to abandon the idea of open community. The answer is to build better filters.

Not filters based on who you know (that recreates social hierarchies). Not filters based on self-selected labels (people are bad at describing themselves). But filters based on what you actually think about, write about, and care about — surfaced through intelligent analysis of your own output.

The dark forest theory is a diagnosis, not a destiny. The internet can still be a place where strangers meet, ideas collide, and genuine connection forms. But it will take new kinds of platforms — ones designed around ideas rather than identities, conversations rather than content, and depth rather than scale.

The people who went quiet did not stop thinking. They are waiting for a place where it is worth speaking again.