The power of social media echo chambers


At 3000 words, today's newsletter is longer than usual. But it's an important topic I feel strongly about. Anyone who follows me on Twitter/X knows how much I can't stand extreme, binary thinking. Today's newsletter is a breakdown of why that happens on social media and what you can do about it.

Also, today's newsletter is sponsored by one of my favorite tools for ensuring I don't fall prey to sensationalist news stories. Ground News aggregates all news reports on a story, compares headlines, and identifies click-bait bias.

We wouldn't need a tool like this if people read the full article and took the time to think critically, but they don't, which is why we're at a critical point in how humanity stays informed.

Ground News isn't a fact-checker. They make it easier for you to check the facts, see if anyone is trying to manipulate you, and help you *actually* become informed.

Check out my link here for 40% off the Vantage plan.

Ground News is user-supported, so the subscription fees ensure the company never has to get into bed with either side of the media-bias divide.

Now, on to today's newsletter...

Echo chambers result from our brains trying to process information efficiently.

Social media algorithms amplify this effect, detecting patterns in how we interact with information and serving up more of what confirms our existing views.

This algorithmic bias transforms casual social media use into a self-reinforcing echo chamber that shapes how we process new information, for better or worse—and usually, it's for worse.

Thinking is hard, which is why so few do it

Every organism seeks maximum benefit for minimum effort.

This natural tendency extends beyond physical actions to how we manage our emotions, learn new things, and even consume information on social media.

The easiest path to an echo chamber is taking the easy path.

We end up in echo chambers because challenging established beliefs and applying critical thought to new information requires more cognitive effort than accepting information that confirms our current thought patterns.

Since social media is in the business of keeping you on the platform for as long as possible, the algorithms detect your confirmation biases and exploit them. This creates what social scientists call "filter bubbles"—personalized echo chambers that limit exposure to opposing views and overload you with perspectives you've already bought into.

You never challenge what you think about, so your thinking becomes weak.

Nuance is the natural predator of social media

I won't go into my specific stance here, but back in 2022, I tweeted nuanced responses on three hot-button issues: abortion, gun rights, and student loans. I discovered a fascinating yet troubling pattern in modern discourse: nuanced positions receive disproportionate backlash from all sides.

You get twice the hate and half the support because staunch supporters on both sides of the issue see your perspective as an attack on theirs—even though all nuance does is step away from the extreme thought patterns that dominate confirmation-bias-fueled echo chambers and exercise critical thinking.

Social scientists call this the "nuance penalty": half the support you'd get from taking a hard stance, coupled with twice the criticism from both camps who view any acknowledgment of their opposition as a betrayal. But why?

No one wants to be alone

In sociology, a concept known as "homophily" describes the tendency of individuals to associate or bond with people similar to themselves.

In everyday conversation, you've likely never described a group of friends as "homophilic." Instead, you probably think of it as something along the lines of “birds of a feather flock together.” It's the same idea.

For the flock to move in sync, they not only have to fly in the same direction but also think about flying in the same direction. Otherwise, one of them might unconsciously drift and disrupt the entire formation.

Of course, I don't know how birds think, but anyone who's ever been part of a team knows what happens if one player isn't on the same page as everyone else. Mistakes happen; if they happen too often or one mistake is too severe, the offender loses his starting role on the team. Hell, maybe he's cut from the team altogether.

In our quest to belong, we have stopped challenging ideas that go against the flock because no one wants to be alone. This tendency is natural, but it’s also dangerous.

The ramifications of these filter bubbles extend beyond the individuals inside them.

Social media platforms have fundamentally changed how information is delivered and consumed. Their algorithms optimize for engagement because engagement keeps users on the platform.

More specifically, engagement with things that confirm and validate our worldview makes us feel superior and right, and not only does that feel good, but it’s also easy. So, we insulate ourselves in social media echo chambers driven by the algorithm on our platform of choice.

While this is good for the platform's profitability, it creates and reinforces echo chambers that algorithmically amplify content matching our existing biases. This leads to what researchers call "algorithmic echo chambers"—where social media decides what we see rather than our own curiosity, and this creates a destructive feedback loop.

The mind-numbing power of algorithmic amplification

When you engage with content on social media that matches your views, the algorithm sees this as a preference—and to be fair, it is.

The problem is that, in the quest to keep you on the platform, it only shows you more content like what you've already seen.

This creates a cycle of confirmation bias where the social media platform shows you more similar information, strengthening the echo chamber effect.
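To make the mechanics concrete, here's a minimal sketch of that feedback loop in Python. It isn't any platform's real code; the content leanings, engagement probabilities, and learning rate are all invented numbers, but the drift they produce is the point.

```python
import random

random.seed(42)

# Toy model (not any platform's actual algorithm): two content "leanings,"
# a user who engages more with agreeable posts, and a feed that adapts.
user_lean = 1.0        # the user's existing view
feed_mix = 0.5         # probability the feed serves lean-(+1) content
learning_rate = 0.05   # how quickly the feed chases engagement

for _ in range(200):
    post_lean = 1.0 if random.random() < feed_mix else -1.0
    # Assumption: agreeable posts get engagement 80% of the time, others 20%.
    p_engage = 0.8 if post_lean == user_lean else 0.2
    if random.random() < p_engage:
        # Reward whatever earned the click: nudge the mix toward that leaning.
        feed_mix += learning_rate if post_lean == 1.0 else -learning_rate
        feed_mix = min(max(feed_mix, 0.0), 1.0)

print(f"Feed now matches the user's view {feed_mix:.0%} of the time")
```

Run it and the mix drifts to roughly 100%. No one set out to build an echo chamber; optimizing for engagement built one anyway.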

You can convince yourself of anything if you ignore information that doesn't support your conclusion and only cherry-pick data confirming your feelings.

The idea that we are the average of the five people we spend the most time around is more than just a popular saying used to warn people about the company they keep.

For the relationship to continue without friction, people must be similar enough in their ideals that they don't find themselves in frequent or intense disagreement.

No one wants to spend time around someone who makes them feel terrible, and most people don't feel better about themselves when their perspectives and beliefs are attacked or challenged in a hostile way.

The problem is that this approach makes us resistant to new ideas, people, or perspectives.

I’m not saying we all need to get into heated political arguments. I’m saying that the world we create on social media makes those arguments more explosive and destructive because we’ve spent the majority of our days in an environment that reinforces those ideas without question. As a result, by the time we interact with people who don’t think like us, we’ve lost the ability to navigate personal differences and maintain harmonious relationships.

We’ve all seen the stories of people ending relationships or cutting off family members because of who the person voted for. Imagine, for a second, placing a person’s election choice over maintaining your family.

And this example isn’t hyperbole. You can read more about it here. If you don’t, here’s another depressing takeaway from the article:

“A recent Reuters/Ipsos poll found that 13 percent of Americans had ended relationships over the presidential election. Some 16 percent of the 6,426 respondents said they've stopped talking to a family member or friend over the results.”

Social media echo chambers are unique because there is a monetary incentive to build them. In the internet's "cost-per-mille" social media era, you get locked into an echo chamber before you even know it's being built.

The culture and design of social media simultaneously encourage the attack of different opinions while discouraging nuanced critical thinking. Everyone—from individuals to media organizations—wants to make a hot take that gets shared, grows their influence, and pads their bank accounts. Soon, this quest for digital status becomes an addiction.

This approach leads to an echo chamber padded with confirmation bias, where our attention, emotions, and cognitive abilities are both guardians and prisoners. Understanding the methodology of social media platforms is the first step to breaking out of the echo chamber.

Real-life interactions inoculate you against online toxicity

While you are ultimately the gatekeeper of what you let in your mind and the attitude you take towards people, I believe that a lot of the vitriol on social media exists because people aren't aware of how their social media consumption influences their values.

More specifically, we aren't aware of how the echo chamber's amplified pressure of social conformity intentionally adds fuel to that fire. Echo chambers are dangerous feedback loops that amplify flaws in our thinking while insulating us from correction.

In the public sphere, away from a computer or the World Wide Web, people hear contrary ideas and can consider or ignore them (unless you divorce someone or stop talking to a family member). In most cases, people don't change their minds, and that's fine. Everyone is entitled to their own opinion, but at least they get experience interacting with people who think differently.

However, social media echo chambers are a different beast. They work on a devious feedback loop. You see more of everything you agree with and less of everything you disagree with.

Social media platforms X (formerly known as Twitter) and Facebook are specifically designed this way, as the longer you spend on the platform, the more money they can make from advertisers. The problem with this arrangement is that your cognition weakens if you never do anything to challenge it.

And as it weakens, your thinking becomes more extreme and less resilient.

What echo chambers and inbreeding have in common

Think of this like genetic inbreeding. Morals and ethics aside, the problem with having children with relatives is that your genetic material is too similar. The result is that weaker, detrimental genes aren't removed, so those traits propagate through the gene pool. Harmful recessive genes, normally masked, get expressed.

The Habsburg Dynasty is a stark example of a genetic echo chamber. Through generations of inbreeding to maintain "pure" royal bloodlines, they concentrated harmful recessive traits. The most famous was the "Habsburg Jaw": a severe underbite and elongated jaw that made breathing, eating, and speaking difficult. By the dynasty's end, Charles II of Spain was so genetically compromised he couldn't speak properly, struggled to eat, and was infertile, ending the Spanish Habsburg line.

Similarly, social media echo chambers concentrate harmful ideas by limiting exposure to intellectual diversity. Just as royal families only married "acceptable" nobles, social media users often engage with "acceptable" viewpoints. And just as genetic inbreeding amplifies harmful traits generation by generation, echo chambers strengthen flawed ideas through repeated exposure and limited outside influence.

You have to deliberately seek diverse perspectives to prevent intellectual stagnation and the amplification of flawed ideas. Echo chambers compound intellectual weaknesses the way inbreeding concentrates harmful traits.

A research paper titled "The echo chamber effect on social media" shows this effect is strongest on platforms like Facebook and Twitter, where algorithms and network structures create what scientists call "homophilic clusters"—groups that share and reinforce the same perspectives while filtering out contradictory views.

On these social networks, groups naturally form around shared beliefs. Combining this tendency with platform algorithms optimized for engagement gives you the perfect recipe for an echo chamber that demolishes critical thinking. Only engaging with content that matches your views seems innocent, but that only trains the algorithm to show you more similar content, creating a self-reinforcing cycle.

On Facebook and Twitter/X, users primarily connect with and share information among those holding similar views on topics ranging from vaccines to politics. The platform's architecture amplifies this effect—posts spread rapidly within ideological clusters but rarely bridge opposing viewpoints.

This selective exposure explains why political communication and personal beliefs have become increasingly extreme over time: without outside challenge, weak ideas don't die but rather multiply.

The problem with an echo chamber is that it allows weak ideas to flourish, unchecked by the natural selection pressures of intellectual debate and rigor. Because these weaker, fallacious perspectives and ideas never get challenged, they never improve, but they never die either. What happens then is the intellectual equivalent of inbreeding.

When a population lacks genetic diversity, detrimental mutations get passed on rather than phased out by evolutionary pressures. Similarly, flawed arguments and beliefs perpetuate unchecked within echo chambers due to an absence of ideological diversity and dissent. Just as inbreeding multiplies the concentration of undesirable genes over generations, echo chambers compound irrationality over time.

With no infusion of fresh philosophies or counter-evidence, absurdities amplify exponentially. Individuals within these closed-loop systems grow more adamant in their convictions even as their worldviews deviate further from reality. They sink deeper into the delusion that theirs is the only valid mode of thinking.

Instead of the Habsburg Jaw, the result of generations of inbreeding in the European Habsburg dynasty that prevented many from breathing or eating correctly, you get QAnon followers. Instead of the Blue Fugates of Kentucky, a family whose genetic condition gave them blue skin that inbreeding never weeded out, you get flat-earthers whose absurd beliefs only intensify in their isolated online communities.

The design of social media platforms encourages us to connect primarily with like-minded peers and amplify the points we agree with to signal that we belong to a tribe of other people with similar thoughts. In doing so, we lose the intellectual diversity needed for our beliefs to develop without distortion.

The result is communities where absurd ideas flourish unchecked, validated by the group's mutual agreement rather than their merit. Eventually, members of the group become so convinced of the perspective's validity that they will not open themselves up to facts, and they will ridicule and cast out members of the community who do.

These poorly formed ideas spread through a social network like a virus. The researchers who studied these social media echo chambers used epidemic modeling to track the spread of information, misinformation, and disinformation, measuring how quickly ideas "infect" different groups.

The findings are striking: information travels up to six times faster within ideologically similar clusters than between opposing viewpoints. The speed difference makes sense; without friction that might introduce cognitive dissonance, ideas propagate through the network with frightening speed.
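For a rough feel of what that kind of modeling looks like, here's a toy simulation. It's my own illustration with invented network sizes and probabilities, not the paper's code: two ideological clusters are densely wired internally with only a few bridging ties, and an idea "infects" neighbors with a fixed probability each step.

```python
import random

random.seed(1)

# Toy epidemic-style spread (illustrative numbers, not the study's model):
# two clusters of 50 users; ties form inside a cluster far more often
# than across it (homophily).
def make_network(n_per_side=50, p_inside=0.2, p_bridge=0.002):
    nodes = list(range(2 * n_per_side))
    side = {v: 0 if v < n_per_side else 1 for v in nodes}
    edges = {v: set() for v in nodes}
    for a in nodes:
        for b in nodes:
            if a < b:
                p = p_inside if side[a] == side[b] else p_bridge
                if random.random() < p:
                    edges[a].add(b)
                    edges[b].add(a)
    return side, edges

side, edges = make_network()
infected = {0}  # the idea starts with a single user in cluster 0

for step in range(1, 11):
    newly = set()
    for v in infected:
        for nb in edges[v]:
            # Assumption: a 30% chance the idea jumps across any tie per step.
            if nb not in infected and random.random() < 0.3:
                newly.add(nb)
    infected |= newly
    own = sum(1 for v in infected if side[v] == 0)
    other = len(infected) - own
    print(f"step {step}: own cluster {own}/50, other cluster {other}/50")
```

The idea saturates its home cluster within a few steps while the other cluster lags well behind; the handful of bridging ties are the bottleneck.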

No, the government can't make hurricanes...

Most recently, we witnessed large portions of the American population believe that the government can generate and control the direction of hurricanes. The idea didn't need to be founded in science, and many of these people never stopped to research whether the concept was sensible because it already aligned with a previously held belief. How those beliefs came to be is a topic for another essay. Still, echo chambers are more alluring when they confirm your opinions.

On Facebook and Twitter, content typically reaches an audience that shares the original poster's beliefs. When researchers mapped these "influence sets"—the total group reached by a piece of information—they found clear ideological boundaries.

A pro-vaccine post, for example, primarily reaches other pro-vaccine users, while anti-vaccine content circulates in its own separate network; this makes sense and, by itself, isn't a problem. The problem is that ideological talking points become the basis of interactions in the digital realm rather than different groups interacting based on topics and themes and then discovering their ideologies.

Consider what happens when you join a new group of people in real life, for example, at a boxing gym. You form social groups around a goal or topic that isn't political or ideological and focus on connecting through those activities.

While certain activities are more attractive to people with certain political leanings, there's no guarantee that everyone will think the same. Therefore, there's a huge chance you'll not only have to interact with people who see the world differently than you but also end up liking and learning from them.

Social activist Daryl Davis once said, "Let's say you have a group of 20 people who are anti-racist, and all you do is talk about how bad racism is. Well, what good is that group doing? All you're doing is preaching to the choir. If you and I agree, I'm not accomplishing anything by trying to convince you of what you already know.

"You can resolve this by inviting someone to the table who disagrees with you so you can understand why they have that point of view. Then, perhaps, you can figure out a solution to dissuade their fears."

Talking to people is part of this, but interacting with them in person is more important. Social media robs us of this ability, so individuals become more radical in their thoughts, and those radical thoughts augment group polarization. With everyone on social media, there may not be an easy solution. Still, there are some interesting ways we can short-circuit this dynamic.

A cool thing about Reddit

Compared to most social media platforms, Reddit's structure is interesting and produces different results we can all learn from.

Its topic-based communities, rather than friend networks, allow more cross-pollination of ideas. Ideas are challenged, and weaker ones are killed—or, at the very least, downvoted. This structure is similar to the way we meet people in real life.

We seek out people who share similar interests and build a relationship around the topic, and in most cases, only venture into ideology and politics after we've established some type of rapport that can stabilize the relationship enough to keep it going despite any differences—provided they aren't too drastic.

It doesn't always work out this way, and friendships have ended over ideological and political differences, but this is WAY more common now, in the era of social media, than ever before. When I was growing up, we were taught never to discuss politics, religion, or sex in unfamiliar company. There was likely a good reason behind this, but social media has flipped it around.

Social media algorithms, on the other hand, connect you with people almost exclusively based on what they believe. Or, at least, what you think they believe. And because of that, explosive arguments start when a person deviates from the expected line of thinking.

The surest way to make someone an extremist is to surround them with people who only agree. Modern network technology has allowed this to happen, which explains why we are more divided and divisive across the cultural and political spectrum despite our society being more connected than ever before. This network effect explains why beliefs become more entrenched over time.

Each share, like, or repost acts as a vote of confidence, gradually pushing groups toward their ideological edges. Without the counterbalance of opposing views that generate enough cognitive dissonance to at least make someone question a long-held belief, extreme positions become the new normal.
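Opinion-dynamics researchers study this entrenchment with "bounded confidence" models: agents only update their views after encountering someone who already thinks roughly like them. Here's a toy Deffuant-style version with invented parameters. It isn't a model of any real platform, but it shows how shrinking the range of voices you'll listen to turns consensus into separate camps.

```python
import random

random.seed(7)

# Toy bounded-confidence model (invented parameters, purely illustrative).
# Agents meet in random pairs and only compromise when their opinions are
# already within `bound` of each other: the analog of a feed that only
# shows you people who roughly agree with you.
def simulate(bound, n_agents=100, meetings=20000, pull=0.5):
    opinions = [random.uniform(-1, 1) for _ in range(n_agents)]
    for _ in range(meetings):
        a, b = random.sample(range(n_agents), 2)
        if abs(opinions[a] - opinions[b]) < bound:
            midpoint = (opinions[a] + opinions[b]) / 2
            opinions[a] += pull * (midpoint - opinions[a])
            opinions[b] += pull * (midpoint - opinions[b])
    return opinions

for bound in (2.0, 0.4):
    ops = simulate(bound)
    print(f"confidence bound {bound}: opinion spread {max(ops) - min(ops):.2f}")
```

With a wide bound (everyone hears everyone), the group converges toward a single moderate position. With a narrow bound, it freezes into distant, entrenched camps: the "new normal" of extreme positions described above.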

Beliefs build echo chambers, while echo chambers reinforce beliefs.

It's a vicious cycle, but what can we do about it, and how can we defend ourselves against the insidious effects of groupthink and confirmation bias?

Don't be intellectually lazy

The ease of modern life hasn't just affected our physical capabilities; it's fundamentally altered how we think. When information is abundant but thinking is shallow, we default to "low-resistance pathways": quick judgments over analysis. This mental laziness manifests as:

1. Emotional reactions replacing reasoned responses

2. Immediate gratification overriding long-term thinking

3. Dismissal of contradictory evidence

4. Oversimplified solutions to complex problems

Simple ideas spread fastest within ideologically similar groups, while challenging concepts rarely bridge divides. If an unpopular idea is too complex to attack, opponents will create a strawman of that argument instead. That strawman, being easier to attack (even though it misrepresents the original idea), will spread more easily through the network.

Nuance is the enemy of engagement because it requires effort and restraint—the former to understand the true nature of the problem and the latter to prevent initial emotional reactions from polluting one's ability to think.

The only way to take back control of our thinking is through our own effort. There is no external incentive to think more clearly or be less reactive, and no outside force can motivate someone to think harder.

You've got to invest in yourself to think and feel better.

Today's sponsor is one of the best tools I've found for this

Tools like Ground News represent a technological counter to these echo chamber effects. Their app and website aggregate news from across the political spectrum, highlight how different outlets cover the same story, and deliberately expose you to diverse perspectives.

Their "Blind Spot" feature is my favorite. If you want to escape an echo chamber, you first have to realize that you're inside of one. A major reason that many people end up in echo chambers is that they don't realize one is being built around them. Media outlets are not neutral. They have a bias, which is reflected in how they cover a story and whether they cover it at all in the first place.

Ground News does not report the news, per se. Rather, it summarizes a story and then shows what percentage of the coverage comes from left-, right-, and center-biased publications. It allows you to view a story's different headlines in one place, assess the factuality of each source, and even identify who owns it.

Ground News employs a sophisticated rating system to help users understand potential bias in their news consumption. The platform aggregates assessments from three independent monitoring organizations—AllSides, Ad Fontes Media, and Media Bias Fact Check—each using different methodologies, including editorial reviews, blind surveys, and content analysis.

These organizations examine word choice, story selection, correction policies, and sourcing practices to provide a comprehensive view of each publication's political leanings and factual reliability. Ground News simply reports their findings on every news story in an easy-to-understand format.

Bias manifests not just in how stories are told but also in which stories are told in the first place. Ground News' Blind Spot feature reveals when stories receive heavy coverage from one political perspective while being largely ignored by others.

By tracking coverage patterns and credibility metrics, the platform helps users understand what they're reading and what they might be missing entirely. This transparency makes it harder for echo chambers to form unnoticed while giving you the tools to escape once you realize what's happening.

This systematic approach to diverse viewpoints matters because it bypasses our natural tendency toward confirmation bias. When we see how different outlets frame the same events, we're forced to confront the limitations of any single perspective. This is why I reached out to Ground News to sponsor this article. Their app and website help reverse-engineer the echo chamber effect: instead of allowing algorithms to narrow our exposure, they deliberately broaden it.

If you subscribe to Ground News using the link below, you’ll get 40% off their Vantage Plan. They are subscriber-supported, so by subscribing, you can directly support my newsletter and contribute to keeping the media transparent.
