I’ve come up with a word.

I don’t know whether it will catch on, but I suspect it might—because I think it names something most people already recognize, something that feels especially relevant in the world we’re living in now.

The word is wrongfidence.

I spend much of my professional life helping people identify beliefs that quietly limit their growth—conclusions that once made sense, but now get in the way. Some beliefs, though, are the most challenging to address—precisely because they don’t feel like they need addressing at all. They are settled. They are givens. They are the immutable laws of our internal universe.

The problem is that feeling settled doesn’t make a belief accurate. Sometimes our most entrenched beliefs are quietly misaligned with reality. And when beliefs no longer match the world we’re actually living in, that mismatch doesn’t stay abstract. Eventually, reality pushes back—through conflict, failure, loss, or surprise. When that happens, the hardest situations to navigate are the ones we don’t believe should exist at all. If something can’t be true, we’re unprepared to respond to it.

Wrongfidence is simply the most economical way I’ve found to describe that pattern.

Most people intuitively understand what it points to. They’ve encountered it in conversations, in families, in institutions, online, and—if they’re honest—occasionally in themselves. Words that click this quickly tend to be useful, because they help us notice patterns in others and in ourselves.

What Wrongfidence Is

Wrongfidence is what happens when confidence outpaces accuracy—when certainty becomes so set that learning, correction, or connection can’t get through. The belief may be sincere. The conviction may feel earned. But the confidence is doing more work than the truth can sustain.

Wrongfidence isn’t ignorance. Ignorance is simply not knowing. Wrongfidence is knowing for sure—and being wrong anyway.

It’s also not stupidity, arrogance, or bad faith. In fact, one of the most unsettling things about wrongfidence is that it often shows up in people who are intelligent, educated, and articulate. This helps explain why it can be so persuasive, and so hard to dislodge.

Examples of How Wrongfidence Takes Shape

There’s a small, funny illustration of this in the movie Hitch. A character seeking dating advice humbly admits he has a lot to work on—but very confidently insists that dancing is not one of those areas. When asked to demonstrate, it turns out he’s an objectively terrible dancer.

The joke lands because the posture is familiar: genuine self-awareness in some areas, paired with unearned certainty in another.

That’s wrongfidence in a nutshell.

There’s another path into wrongfidence that looks very different.

In trauma, certainty can form not from excess confidence, but from pain. Emotionally overwhelming or harmful experiences can harden into absolute beliefs about safety, trust, or predictability: People can’t be trusted. If I let my guard down, I’ll get hurt. This is just how the world is.

These beliefs often begin as protection—an understandable overcorrection meant to prevent future harm. But over time, they can calcify into a kind of wrongfidence of their own: conclusions held with absolute certainty, even when they quietly distort present reality.

What once helped someone survive can later limit their ability to connect, adapt, or feel safe in situations that no longer match the original threat.

From the outside, this can look like stubbornness. From the inside, it often feels like survival.

That sense of certainty isn’t accidental—it’s stabilizing for most of us, even without the extreme distortions of trauma.

We all have our reasons—shaped by experiences, values, and incentives—for holding onto these core, stabilizing beliefs. That’s why, even when we’re wrong, some beliefs endure.

There’s research backing this up. A 2025 study by Gordon Pennycook and colleagues found that belief in conspiracy theories is less about intelligence or education and more about overconfidence—specifically, how certain people feel about their own conclusions. Participants who endorsed false conspiracies consistently overestimated both how accurate they were and how many other people agreed with them. In other words, confidence—not competence—was doing most of the work.

This helps explain why simply providing better information so often fails. When confidence has already sealed the system, facts don’t arrive as data. They arrive as opposition. Evidence feels like attack. Questions feel like disrespect.

Wrongfidence is a closed loop: confidence reinforcing itself, with no obvious place for new information to enter.

Insight is the beginning of change. Growth almost always starts with some version of I don’t know, I might be wrong, or that surprises me. Those moments create openings—small cracks where new information, reflection, or connection can enter.

Wrongfidence is harder to work with because it closes those openings. When certainty feels complete and settled, there’s nothing to examine. No pause. No curiosity. That’s what makes these beliefs so resistant to change—not their content, but their posture.

When Confidence Outruns Correction

This also helps explain how highly visible, highly influential people can become powerful amplifiers of false ideas. When someone is rewarded for decisiveness, insulated from correction, and surrounded by affirmation, confidence can quietly detach from accuracy without immediately being noticed.

That’s how figures like Elon Musk—clearly intelligent, clearly capable—can drift into peddling conspiracy theories while retaining an aura of credibility. The issue isn’t lack of intelligence. It’s a surplus of certainty. When confidence becomes self-reinforcing, correction starts to feel unnecessary, even insulting.

Wrongfidence doesn’t just preserve error. It actively resists growth.

When correction is filtered out at scale, the effects don’t stay abstract—they show up most clearly in relationships. It makes conversations brittle. It turns disagreement into threat. It makes apology feel optional and curiosity feel naïve. In families, organizations, and societies, wrongfidence blocks repair—because repair requires uncertainty, and uncertainty feels unsafe when confidence has become part of identity.

Perhaps the most uncomfortable part of wrongfidence, though, is that it isn’t just something other people have.

Wrongfidence is easiest to spot from the outside—and most dangerous when it lives in us.

It shows up when changing our mind would mean losing face. When past certainty feels like sunk cost. When confidence becomes a substitute for safety. When “I’m sure” quietly replaces “I might be wrong.”

Why Give It a Name?

Because wrongfidence is usually invisible to the person holding it, it’s hard to address without first being able to see it. Giving it a name doesn’t solve it—but it makes it visible.

That’s why naming wrongfidence matters.

In therapy, naming a pattern creates enough distance to examine it. Once certainty can be interrogated rather than defended, change becomes possible.

We can talk about the process instead of attacking the conclusion. We can slow down without immediately conceding identity. We can ask not “Who’s wrong?” but “What is this certainty doing for me—or for us?”

So this post is the first place the word appears—here, on this small blog, in February 2026. Not because the phenomenon is new, but because it seemed to need a name. Modern life relentlessly rewards confidence, while offering very few incentives for public revision, humility, or graceful change.

Wrongfidence thrives invisibly in those conditions. But once it can be seen for what it is, space opens for something else: curiosity, when it’s okay to ask questions; humility, when it’s not confused with weakness; growth, when certainty loosens its grip just enough to let learning back in.

Wrongfidence names the problem.

What we do after that—that’s where the real work begins.
