Tags: algorithms, recommendation systems, intellectual matching, creation vs consumption, semantic matching, AI

Beyond the Algorithm: Finding People Who Actually Think Like You

kndred Team

The Algorithm Knows What You Watch. It Does Not Know Who You Are.

You have spent the last hour on YouTube watching videos about woodworking, the philosophy of Martin Heidegger, and street food in Osaka. The algorithm has learned that you will click on videos about woodworking, Heidegger, and Japanese street food. What the algorithm does not know — what it cannot know from consumption data alone — is that these three interests are connected in your mind by a single thread: the idea that mastery of a physical craft is a form of philosophical practice, and that Japanese artisan culture embodies this more purely than any Western tradition.

That connecting thread is who you are. The three separate data points are what the algorithm sees. And no amount of consumption data will ever bridge that gap — because consumption is a weak, noisy, ambiguous signal for intellectual identity.

This is the fundamental limitation of every recommendation algorithm currently deployed at scale. They are extraordinarily good at predicting what you will click on, watch, or buy next. They are structurally incapable of understanding why — and the "why" is where genuine human connection lives.

How Recommendation Algorithms Actually Work

Most people interact with recommendation algorithms every day without understanding what they actually optimize for. The short answer: they optimize for engagement, not understanding.

Collaborative filtering — the foundation of Netflix, Spotify, and Amazon recommendations — works by finding users whose behavior patterns are similar to yours and recommending what those users consumed. "People who bought X also bought Y." This is useful for product recommendations but says nothing about shared intellectual life. People who bought the same book may have bought it for entirely different reasons. The algorithm cannot tell the difference between someone who read Thinking, Fast and Slow out of genuine interest in cognitive psychology and someone who bought it because it was on a "must-read for CEOs" list.
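The "people who bought X also bought Y" logic can be sketched in a few lines. This is a minimal user-based collaborative filter; the names, book titles, and purchase flags are invented for illustration, and real systems operate on millions of users with far more sophisticated models.

```python
from math import sqrt

# Rows: users; columns: items (1 = purchased/consumed, 0 = not).
ratings = {
    "alice": {"thinking_fast_and_slow": 1, "godel_escher_bach": 1, "sapiens": 0},
    "bob":   {"thinking_fast_and_slow": 1, "godel_escher_bach": 1, "sapiens": 1},
    "carol": {"thinking_fast_and_slow": 0, "godel_escher_bach": 0, "sapiens": 1},
}

def cosine(u, v):
    """Cosine similarity between two users' consumption vectors."""
    items = set(u) | set(v)
    dot = sum(u.get(i, 0) * v.get(i, 0) for i in items)
    nu = sqrt(sum(x * x for x in u.values()))
    nv = sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def recommend(target, ratings):
    """Recommend items consumed by the most behaviorally similar user."""
    others = [name for name in ratings if name != target]
    best = max(others, key=lambda name: cosine(ratings[target], ratings[name]))
    return [i for i, v in ratings[best].items()
            if v and not ratings[target].get(i)]

print(recommend("alice", ratings))  # bob is most similar -> ['sapiens']
```

Note what the similarity function sees: two identical purchase vectors, regardless of whether the purchases were driven by curiosity or by a CEO reading list. The "why" never enters the computation.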

Content-based filtering analyzes the attributes of items you have engaged with and recommends similar items. If you watched a video about neural networks, you get more videos about neural networks. This is better than collaborative filtering for surfacing relevant content, but it operates at the keyword and category level, not the conceptual level. It cannot see the connections between your interests — the thread that links your fascination with neural networks to your interest in Buddhist meditation to your reading about mycelial networks.
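Content-based filtering's keyword-level blindness is easy to see in a sketch. Here items are represented as keyword sets and similarity is Jaccard overlap; the video titles and tags are invented. The conceptual thread between neural networks and meditation is simply invisible at this level.

```python
# Items represented by keyword sets; similarity = Jaccard overlap.
videos = {
    "intro_to_neural_nets":  {"neural networks", "deep learning", "AI"},
    "backprop_explained":    {"neural networks", "gradients", "deep learning"},
    "zen_meditation_basics": {"buddhism", "meditation", "mindfulness"},
}

def jaccard(a, b):
    """Overlap between two keyword sets (0.0 = disjoint, 1.0 = identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def similar_to(watched, videos):
    """Rank the other videos by keyword overlap with the watched one."""
    candidates = [t for t in videos if t != watched]
    return sorted(candidates,
                  key=lambda t: jaccard(videos[watched], videos[t]),
                  reverse=True)

# Watching a neural-network video surfaces more neural-network videos;
# zen_meditation_basics scores 0.0 despite any deeper conceptual link.
print(similar_to("intro_to_neural_nets", videos))
```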

Engagement optimization is the layer on top of both approaches. Whatever the underlying model recommends, the final ranking is determined by what is most likely to keep you on the platform. This systematically favors content that triggers emotional reactions (outrage, excitement, anxiety) over content that rewards careful thought. The algorithm does not care whether you feel intellectually enriched after watching a video. It cares whether you watch the next one.
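The effect of the engagement layer can be shown with two invented candidates. The scoring formula below (relevance times predicted engagement probability) is a simplification for illustration, not any platform's actual ranking function, but it captures the structural bias: a mediocre-but-provocative item outranks a relevant-but-demanding one.

```python
# (title, relevance_score, predicted_engagement_probability) -- all invented.
candidates = [
    ("careful_essay_on_heidegger",  0.9, 0.10),
    ("outrage_clip_about_heidegger", 0.6, 0.55),
]

def final_rank(candidates):
    """Rank by relevance weighted by predicted engagement."""
    return sorted(candidates, key=lambda c: c[1] * c[2], reverse=True)

ranked = final_rank(candidates)
print(ranked[0][0])  # the outrage clip wins: 0.6 * 0.55 > 0.9 * 0.10
```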

The Consumption Trap

Every major platform builds its understanding of you from what you consume: what you watch, click, like, share, and linger on. This creates a fundamental distortion in how the platform sees you, because consumption is a deeply unreliable signal for identity.

You consume out of habit. A significant portion of what you watch, read, and scroll through is driven by habit and inertia rather than genuine interest. The algorithm interprets this habitual consumption as preference, creating a feedback loop that reinforces behaviors you may not even enjoy.

You consume out of boredom. Late-night scrolling is not an expression of who you are. It is an expression of dopamine-seeking in the absence of better options. But the algorithm treats it identically to your most intentional, engaged viewing.

You consume reactively. Much of social media consumption is reactive — responding to what the algorithm surfaces rather than actively seeking what you care about. You did not decide to watch a 45-second video about a man washing a carpet. The algorithm put it in your feed and you did not scroll past it fast enough. This tells the algorithm something about your attention thresholds. It tells it nothing about your mind.

You consume things you disagree with. Engagement with content you find outrageous, wrong, or offensive is indistinguishable from engagement with content you find fascinating and valuable. The algorithm measures the click, the view duration, the scroll pause — not the emotional valence or intellectual quality of your response.

The result is that the algorithmic profile of you — the model that determines what you see, what is recommended to you, and (on platforms with social features) who you are connected with — is a distorted, shallow, consumption-driven caricature of your actual intellectual life. The algorithm knows your behavioral patterns. It does not know your ideas.

Creation as a Better Signal

If consumption is a weak signal for who you are, creation is a strong one. What you write, sketch, annotate, and build is a fundamentally more honest representation of your intellectual life than what you scroll past on a feed.

Consider the difference in intentionality. You scroll through 500 pieces of content per day without conscious thought. You write a page of notes about something that fascinates you with full conscious engagement. The 500 pieces of consumed content tell the algorithm about your attention patterns. The one page of notes tells a semantic analysis system about your mind.

This is not a hypothetical distinction. Research on the psychology of creation versus consumption consistently shows that creative output is a more reliable predictor of personality, values, and intellectual interests than consumption behavior. What you choose to spend time creating reveals what you truly care about — because creation requires effort, and humans do not invest effort in things they are indifferent to.

From "People Who Watched This Also Watched" to "People Who Think About This Also Think About"

The next evolution in matching people is moving from consumption-based signals to creation-based signals. Instead of "people who watched X also watched Y," the matching logic becomes "people who wrote about X also write about Y" — and at a much deeper level, "people whose conceptual landscape overlaps with yours."

This is what semantic embedding technology enables. When your writing is processed through an embedding model, the result is not a list of topics. It is a high-dimensional representation of the conceptual space your thinking occupies. That representation captures not just what you write about but how your ideas connect — the specific intersections, the recurring themes, the unique combinations that define your intellectual fingerprint.
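The matching step over such representations can be sketched as follows. A real system would embed each person's writing with a trained language model into hundreds or thousands of dimensions; the 4-dimensional vectors below are toy stand-ins invented for illustration.

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

# Hypothetical "conceptual fingerprints" derived from each person's writing.
profiles = {
    "writer_a": [0.8, 0.7, 0.1, 0.0],  # craft-as-philosophy, embodiment
    "writer_b": [0.7, 0.8, 0.0, 0.1],  # similar conceptual territory
    "writer_c": [0.0, 0.1, 0.9, 0.8],  # different territory entirely
}

def best_match(name, profiles):
    """Find the person whose conceptual fingerprint is closest."""
    others = [n for n in profiles if n != name]
    return max(others, key=lambda n: cosine(profiles[name], profiles[n]))

print(best_match("writer_a", profiles))  # -> writer_b
```

The mechanics are the same cosine similarity used in collaborative filtering; what changes is the input. The vectors here are derived from what people wrote, not what they clicked, so proximity in the space means proximity of ideas.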

Two people whose embedding profiles are similar are not just people who share a topic tag. They are people whose thinking occupies similar conceptual territory — who ask similar questions, draw similar connections, and explore similar intersections. The matching is not "you both like philosophy." It is "you both keep returning to the tension between embodied cognition and computational theories of mind, particularly as it applies to creative practice." That is a fundamentally different kind of match, and it produces a fundamentally different kind of connection.

What This Looks Like in Practice

kndred is built around this principle. Instead of building a profile of you from your consumption behavior, the platform asks you to share what you have created — your notes, essays, documents, and writing. The AI analyzes this content to extract your conceptual landscape and matches you with concept-based rooms where people share genuine intellectual overlap.
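One way such concept-based rooms could form is by grouping people whose writing embeddings exceed a similarity threshold. kndred's actual pipeline is not public; the greedy grouping, the vectors, and the threshold below are all hypothetical, chosen only to make the idea concrete.

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

# Toy writing-derived embeddings (invented for illustration).
profiles = {
    "ana":  [0.9, 0.8, 0.1],
    "ben":  [0.8, 0.9, 0.0],
    "cleo": [0.1, 0.0, 0.9],
}

def form_rooms(profiles, threshold=0.8):
    """Greedily place each person in the first room where they resonate
    with every existing member; otherwise open a new room."""
    rooms = []  # each room is a list of member names
    for name, vec in profiles.items():
        for room in rooms:
            if all(cosine(vec, profiles[m]) >= threshold for m in room):
                room.append(name)
                break
        else:
            rooms.append([name])
    return rooms

print(form_rooms(profiles))  # -> [['ana', 'ben'], ['cleo']]
```

Requiring resonance with every member of a room is one deliberate (and here, assumed) design choice: it keeps rooms conceptually tight rather than letting them drift into loose topic buckets.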

The experience is qualitatively different from algorithmic recommendation. You do not scroll through a feed of content optimized for your attention. You enter rooms of people who have independently arrived at similar ideas — who have been thinking about the same questions from their own unique angles. The conversation starts at a depth that would take months to reach on a conventional platform, because everyone in the room has already demonstrated deep engagement with the topic.

This is what the internet promised and social media failed to deliver: not just connecting everyone to everyone, but connecting each person to the specific humans who share their most unusual, specific, and personally meaningful intellectual interests. The algorithm was never going to get there, because the algorithm was never looking at the right data.

Beyond Recommendation

The recommendation algorithm was a breakthrough for content discovery. It solved the problem of "what should I watch next?" But it was never designed to solve the problem of "who should I talk to?" — and when platforms tried to repurpose recommendation logic for social connection (People You May Know, suggested follows, recommended communities), the results were shallow and unsatisfying.

Finding people who actually think like you requires a different approach: one that looks at what you create rather than what you consume, that operates at the conceptual level rather than the keyword level, and that optimizes for intellectual resonance rather than engagement metrics.

The algorithm knows what you watch. Your writing knows who you are. And the gap between those two things is where a better kind of online connection becomes possible.