The world of 2025 is a highly confusing place. For years, if not decades, the news has been a morass, frequently presenting as “news” what the average person clearly understands to be propaganda, then denouncing and shouting down anyone who dares to question the Newspeak. It can be both upsetting and disorienting.
What is happening?
It’s not so much some overarching conspiracy, for the most part. Some of it certainly is, but the vast majority is news organizations following the dictum, “If it bleeds, it leads.” Certain reference sites, like Snopes and Wikipedia, frequently engage in “gray propaganda”: gently appearing to tell you one thing while, in a very carefully curated way, actually telling you the opposite.
But how can the average consumer wade through the haze? Below, I briefly present the method I rely on, for the most part, in my own writing.
In an era of information overload and competing narratives, the average news consumer faces a challenging question: how should we evaluate new information when we already hold prior beliefs about a subject? The answer does not lie in abandoning skepticism, nor blindly accepting every claim at face value, but in applying a mathematical framework that has served scientists and intelligence analysts for centuries: Bayesian inference.
The Bayesian Approach: Updating Beliefs With Evidence
Named after 18th-century mathematician Thomas Bayes, Bayesian inference provides a structured method for updating our confidence in a hypothesis as new evidence emerges. Unlike binary “true or false” thinking, Bayesian reasoning recognizes that most real-world claims exist on a spectrum of probability. We start with a prior belief — our initial assessment of how likely something is to be true — and systematically adjust that belief as we encounter new information.
The fundamental insight is deceptively simple: the credibility we assign to new information should depend on both the quality of that information and what we already know about the subject. Strong evidence should shift our beliefs significantly, while weak or contradictory evidence should barely move the needle. Recent research has shown that humans can be understood as performing Bayesian inference with systematic biases, suggesting our cognitive processes follow probabilistic rather than purely logical patterns.
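To make the update rule concrete, here is a minimal Python sketch of Bayes’ theorem in its odds form. Every number in it is invented purely for illustration:

```python
# A minimal sketch of a single Bayesian update, using the odds form of
# Bayes' theorem: posterior odds = prior odds * likelihood ratio.

def bayesian_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior probability of a claim after seeing evidence."""
    prior_odds = prior / (1 - prior)
    likelihood_ratio = p_evidence_if_true / p_evidence_if_false
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Illustrative numbers: you think a claim is 30% likely, then see evidence
# you'd expect 80% of the time if it were true but only 20% if it were false.
posterior = bayesian_update(0.30, 0.80, 0.20)
print(round(posterior, 3))  # 0.632
```

The odds form is used here because it makes the key quantity explicit: how many times more likely the evidence is under the claim than under its negation.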
Prior Probabilities: What You Think Before The News Breaks
Before evaluating any news story, Bayesian thinking requires honest assessment of your starting position. What did you believe before this new information appeared? This “prior probability” shouldn’t be arbitrary — it should reflect your accumulated knowledge, the base rates of similar claims, and the historical track record of comparable situations.
For instance, if a news outlet reports that a politician has been caught in a scandal, your prior probability should consider: How common are such scandals generally? What is this politician’s past record? What is the news source’s track record on similar stories? A claim that would be extraordinary for one politician might be entirely mundane for another, and Bayesian reasoning accounts for this context.

The challenge is that humans often have poorly calibrated priors. We overestimate the likelihood of dramatic events, underestimate mundane explanations, and let confirmation bias inflate our confidence in beliefs that align with our preferences. Studies have demonstrated that cognitive biases can distort public understanding and contribute to the rapid dissemination of false narratives, with misinformation spreading faster than accurate news because it aligns with existing beliefs and triggers emotional reactions. Bayesian thinking forces us to make these priors explicit rather than leaving them as un-examined assumptions.
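One way to keep a prior honest is to anchor it in a base rate rather than a gut feeling. The sketch below does exactly that; the base rate and the track-record adjustment are entirely made-up numbers, not real statistics:

```python
# A deliberately crude sketch of anchoring a prior in a base rate.
# Every number here is invented for illustration only.

# Suppose roughly 5% of politicians face a genuine scandal in a given year,
# and this politician has twice the historical rate of credible allegations.
base_rate = 0.05
track_record_multiplier = 2.0

# Scale the base rate by the track record, capped at certainty.
prior = min(base_rate * track_record_multiplier, 1.0)
print(prior)  # 0.1
```

The point is not the arithmetic but the discipline: writing the prior down forces you to say where it came from.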
Evaluating the Evidence: Likelihood Ratios
Once you’ve established your prior belief, the next step is evaluating how much the new evidence should shift that belief. This is where likelihood ratios enter the picture. Ask yourself: if the claim were true, how likely would I be to see this specific evidence? Conversely, if the claim were false, how likely would I be to see this evidence anyway?
Consider a news report citing “anonymous sources” claiming a major policy shift. If the policy shift were real, would we expect to see anonymous leaks? Almost certainly — major policy changes rarely remain entirely secret until they are formally announced. But if the policy shift were not happening, might we still see such reports? Also yes — media organizations sometimes run with unreliable tips, and disinformation campaigns deliberately plant false stories.
The key is that strong evidence is evidence we would expect to see if the claim is true, but not expect to see if the claim is false. Weak evidence is information that would be equally likely under either scenario. A photograph of an event is stronger evidence than an anonymous quote about the event. A leaked internal document is stronger than a second-hand account. Research on misinformation receptivity conceptualizes the problem as weighing the reliability of incoming information against the reliability of prior beliefs.
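The comparison above can be made concrete with likelihood ratios. The specific probabilities below are assumptions chosen for the sake of the example, not measured values:

```python
# Likelihood ratio: how much more likely the evidence is if the claim is
# true than if it is false. The probabilities below are assumed.

def likelihood_ratio(p_if_true, p_if_false):
    return p_if_true / p_if_false

# An anonymous quote appears nearly as often whether or not the claim is
# true, so it barely moves the needle.
anonymous_quote = likelihood_ratio(0.70, 0.50)
print(round(anonymous_quote, 1))  # 1.4

# A photograph of the event is very likely if it happened, unlikely if not.
photograph = likelihood_ratio(0.90, 0.05)
print(round(photograph, 1))  # 18.0
```

A ratio near 1 means the evidence is nearly useless; the further from 1, the harder it should push your belief.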
Common Pitfalls: Where Bayesian Reasoning Goes Wrong
Even when applying Bayesian principles, news consumers make predictable errors. Confirmation bias leads us to treat evidence supporting our existing views as stronger than it actually is, while dismissing contradictory evidence as weak or suspect. Studies show that people fail to update enough when truly strong evidence appears, remaining anchored to their priors even when they shouldn’t be.
Another common mistake is ignoring base rates—the background frequency of events. The base rate fallacy causes people to focus on specific case information while neglecting crucial statistical context. Dramatic claims about rare events require dramatically strong evidence, because the prior probability is low to begin with. A report of political corruption in a notoriously corrupt system requires less evidence to be credible than the same report in a historically clean government.
Media coverage frequently falls prey to this fallacy. A person shown a steady stream of news stories about a particular crime may greatly overestimate how common that crime is, even if it is actually quite rare.
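A small worked example, again with invented numbers, shows why a low base rate dominates even a fairly diagnostic report:

```python
# Base rate fallacy, worked through: even a fairly reliable report about a
# rare event leaves the posterior probability modest. Numbers are invented.

def posterior(prior, p_report_if_true, p_report_if_false):
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * (p_report_if_true / p_report_if_false)
    return posterior_odds / (1 + posterior_odds)

# An event with a 1-in-1000 base rate, reported by a source that reports it
# 90% of the time when it is real and 5% of the time when it is not.
p = posterior(0.001, 0.90, 0.05)
print(round(p, 3))  # 0.018
```

Even an 18-to-1 likelihood ratio lifts a one-in-a-thousand event to only about a 2% probability, which is exactly the intuition the base rate fallacy destroys.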
Practical Application: A Daily Discipline
Applying Bayesian inference to news consumption doesn’t require complex mathematics. It requires disciplined thinking: acknowledge your starting beliefs honestly, evaluate evidence quality rigorously, and update your confidence proportionally. When multiple independent sources corroborate a story, your confidence should increase substantially. When evidence is ambiguous or sources are unreliable, your beliefs should barely shift.
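The discipline above can be sketched as a sequence of updates. Note the assumption baked into the loop: multiplying likelihood ratios is only valid when the sources are genuinely independent, not all repeating the same leak. The numbers are illustrative:

```python
# Sequential updating across multiple corroborating sources. Multiplying
# likelihood ratios assumes the sources are genuinely independent.

def update(belief, lr):
    odds = belief / (1 - belief) * lr
    return odds / (1 + odds)

belief = 0.20  # initial skepticism toward the story
for lr in (3.0, 3.0, 3.0):  # three independent sources, each with LR = 3
    belief = update(belief, lr)
print(round(belief, 3))  # 0.871
```

Three modestly reliable but independent sources move a skeptical 20% prior to roughly 87% confidence, which matches the intuition that corroboration, not any single report, is what should change your mind.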
The Bayesian framework doesn’t eliminate uncertainty—it manages it. In a media environment designed to generate clicks through certainty and outrage, thinking probabilistically is an act of intellectual resistance. It allows you to remain open to new information while maintaining appropriate skepticism, to change your mind when evidence warrants it, and to resist manipulation by those who exploit cognitive biases.
The news will always be noisy, biased, and incomplete. Bayesian thinking provides a rational method for navigating that noise without succumbing either to cynical dismissal of all information or credulous acceptance of comfortable narratives.
Conclusion
As I point out above, Bayesian methods are not foolproof – they can still lead to mistakes. Overall, though, they are a good yardstick to start from. Why is this important? Because if you are reading this in the United States, you have the ability to effect change by voting – and if your thinking is skewed by those seeking to manipulate you, you need to understand exactly how they are trying to do it, because your vote counts.
This stuff seriously impacts your personal “bottom line”.


