Often Wrong But Never In Doubt
Are we living in a post-truth world? The American left and its media allies think so.
Last night, President Trump announced preliminary agreements with Iran that, had a Democratic president secured them, would have triggered a ticker-tape parade in New York City and talk of adding him to Mount Rushmore. Instead, people rushed to their keyboards and MSNOW studios to tell America how badly Trump’s team had screwed the pooch and how America had really lost, because there was no regime change and Obama’s JCPOA was working, even though a half century of evidence points to the opposite.
This morning, the IRGC claimed to be in total control of the Strait of Hormuz, even as the US blockade remains in force and American warships freely transit the passage. The immediate reaction of anti-Trump people to this “news” will be, “Trump is lying and you are a MAGA propagandist!” But Trump and his team have executed measurable things that can be known; the IRGC has not. The IRGC doesn’t have the power to control the thermostat in its own offices.
A gargantuan amount of energy these days goes toward trying to make something that is into something that isn’t, and something that isn’t into something that is. People feel comfortable making declarative statements that have zero basis in fact, and analogies are drawn between people, groups, and things that are simply non sequiturs. It seems we are a society that prefers to be driven by spin and narrative rather than fact and truth: a true post-truth environment.
I did some digging last night (yes, using AI to research it) and discovered that this isn’t just a vague cultural irritation; it maps cleanly onto several well-documented psychological, sociological, and media-driven dynamics that have been studied for years and, in many cases, are accelerating.
The “post-truth” condition, a term that gained prominence around 2016, describes an environment where objective facts have less influence on public opinion than appeals to emotion, identity, and narrative. That doesn’t mean facts have disappeared; it means they’ve been demoted. People are no longer primarily asking, “Is this true?” but rather, “Does this fit what I already believe, or want to believe?”
Layered on top of that is something from cognitive psychology known as motivated reasoning. Humans are not neutral processors of information. We tend to accept evidence that confirms our existing views and to reject or reinterpret evidence that challenges them. Motivated reasoning goes a step further than confirmation bias: it actively recruits our intelligence to defend conclusions we’ve already reached emotionally or ideologically. In other words, people aren’t just wrong; they are often quite skilled at constructing arguments that justify being wrong.
Then there’s social identity theory. People derive a sense of self from belonging to political, cultural, religious, or other groups. Once an idea becomes tied to identity, disagreeing with it is no longer a matter of evidence; it becomes a perceived attack on the person or the group. At that point, facts become negotiable, but allegiance is not. That’s when you start to see confident, declarative statements with little or no factual grounding, because the goal is not accuracy; it is signaling loyalty.
The point about analogies that don’t hold, the non sequiturs, is also well grounded. This is often referred to as false equivalence or category error. In a high-speed information environment, analogies are powerful but easily abused rhetorical shortcuts. If two things share even a superficial similarity, they can be rhetorically linked in ways that collapse important distinctions. Once that analogy takes hold in a narrative, it can be very difficult to dislodge, even if it is fundamentally flawed.

Coupled with that is a linguistic component that scholars call “semantic drift” or “concept creep,” in which words expand or shift in meaning over time. Terms like “violence,” “trauma,” “racism,” and “authoritarianism” can be stretched to cover an ever-expanding range of phenomena, and once definitions become fluid and the boundaries of meaning are no longer fixed, it becomes much easier to make claims untethered to reality.
Of course, an ideologically motivated media and information ecosystem amplifies all of this. Social platforms, cable news, and even traditional outlets increasingly operate on engagement-driven (i.e., ratings) incentives, not truth-driven ones. Content that provokes outrage, fear, or tribal validation spreads faster and farther than careful, nuanced analysis. This creates what some researchers call “outrage cycles” or “attention markets,” where the most emotionally satisfying version of a story often wins, regardless of its accuracy.
Thinkers like Hannah Arendt warned that when societies lose a shared commitment to factual truth, the danger is not that everyone believes lies, but that no one believes anything with confidence anymore. At that point, narratives compete not on truth, but on usefulness and emotional appeal.
If you are seeing the same thing I am, you’re not imagining it. It is the convergence of human cognitive wiring, identity politics, media incentives, and shifting language. The result is a culture where the line between what is and what is said to be becomes increasingly negotiable, and where confidence often substitutes for correctness.
And that is when the “often wrong, but never in doubt” crowd takes center stage.


