The Unreliability of Rationally Formed Beliefs

Imagine there are ten people trying to guess the number of balls in a jar. Broadly speaking, these people could be divided into three groups, each using a different method to figure it out. Despite these groupings, ten different answers are given. Each person feels that their method produced the correct answer, that the other methods are in error, and that the other people claiming to have used the same method as them applied it incorrectly.

A few things are clear. First, assuming someone got the right answer, the probability that any given person got the right answer is only 10%. Second, no one can point to the fact that their reasoning seems clearly correct to them, and others’ reasoning clearly in error, to justify thinking they got the right answer, since this perception is shared by all ten people, including those who got it wrong. They would need some further independent justification; without it, each should simply note that one answer seems most reasonable to them, but that there is only a 10% chance they are right and a 90% chance they are wrong.
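To make the arithmetic concrete, here is a minimal simulation sketch. The trial count, the 1–1000 range of possible answers, and the idea of following one fixed guesser are my own illustrative assumptions, not anything from the jar story itself: when ten distinct answers are given and exactly one is correct, the fixed guesser turns out to be right about 10% of the time, no matter how confident every guesser feels.

```python
import random

# Minimal sketch of the jar example. Assumptions (mine, for illustration only):
# the true count lies between 1 and 1000, ten people give ten distinct answers,
# and exactly one answer is correct. Every guesser feels equally sure of their
# method, but that feeling carries no information about who is right.

TRIALS = 10_000
hits = 0

for _ in range(TRIALS):
    true_count = random.randint(1, 1000)  # the actual number of balls
    # nine distinct wrong answers plus the one right answer
    wrong = random.sample([n for n in range(1, 1001) if n != true_count], 9)
    guesses = wrong + [true_count]
    random.shuffle(guesses)  # symmetry: no guesser is privileged over another
    if guesses[0] == true_count:  # follow one fixed guesser across trials
        hits += 1

print(hits / TRIALS)  # comes out near 0.10
```

The shuffle encodes the symmetry assumption: nothing observable distinguishes one guesser’s confidence from another’s, so confidence alone cannot raise anyone’s odds above one in ten.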

There are important areas of human thought that seem somewhat analogous to this. The odds of someone being correct are not always as low as 10%, but they are clearly too low to warrant confidence, and this makes clear that the feeling that your line of thought is obviously right and others’ obviously in error does not track truth very well in these domains. Some obvious examples are religious faith and moral intuitions. If we think of these as “methods”, it is obvious that they are unreliable methods in the sense that different individuals and populations using the same methods produce very different results.

Methods of knowledge where this sort of analogy clearly does not apply include simple perception and basic mathematical intuition. These methods produce basically the same results for everyone and so can be considered reliable in the relevant sense.

None of this is very concerning to me. What does bother me is the potential for this sort of analysis to be applied, in the domains of social science and philosophy, to a method we can broadly call rationality: trying to look at the relevant evidence and infer the most probable conclusion. Comparing individuals within cultures, and people across cultures at different places and times, it is clear that many people who thought they were rationally forming beliefs about philosophic and social scientific questions were wrong. So unless we have some additional reason for confidence, we have to conclude that if we are such a person, there is a high chance, for many questions greater than half, that we are wrong. This will be true no matter how clearly we seem right and others seem wrong, so long as, so far as we can tell, the people who think we are wrong have similar feelings about their own line of reasoning.

One thing we would hope to be able to appeal to in order to get out of this is knowing lots of facts or following some more specific methodology, like “the scientific method”. But a quick look at highly informed laymen, intellectuals, and people who claim to be taking a scientific approach to social science or philosophy, even within current Western culture, let alone across other cultures, makes it clear that a great diversity of opinion exists even among this subset of people. This problem is even more severe if, like me, you think most intellectuals and social scientists have failed in their use of reason and consequently believe lots of false things.

We might be tempted to say that our preferred belief was produced by a more correct understanding and application of the scientific method and by a superior selection of relevant facts. We might even feel that people who disagree with us would agree if they could see our facts and understand our method. We all know this rarely happens in debate when we attempt to do exactly that, but we can blame that on the irrational mental state of the people we disagree with. Of course, lots of people think all these things while holding a diverse range of views on many topics, so feeling this way does not help us justify trusting our own reason more than theirs when it is applied to these sorts of issues.

If this argument is valid, it may not shake our feeling that our reasoning on these sorts of topics is trustworthy. But it would imply that this feeling is implicitly a belief that we have a special ability, which most people lack, to tell good arguments from bad ones and form beliefs accordingly, and that this belief is not only unjustified but, so far as the evidence we have indicates, probably false.

This argument is fairly simple, and it seems like the sort of idea certain people would have thought worth discussing, so I assume it has a name. I do not know its name, though, and my hope is that someone who reads this will be able to tell me what it is called so that I can see what sort of responses people have thought up to it. Beyond that, the point of this article was just to explain the argument as clearly as I could without trying to rebut it myself.

12 thoughts on “The Unreliability of Rationally Formed Beliefs”

  1. I’d argue that the most popular topics to argue over are those that are most arguable — where the arguments for the different sides are pretty comparable in plausibility.

    I came up with this idea in 2009 when it was popular to argue over whether Peyton Manning or Tom Brady was the best NFL quarterback. The evidence was pretty evenly split so it was fun to argue over. Since then, Brady has clearly emerged as the greatest of his time and probably of all time, so the argument has died down.


  2. The closest person to touch on this stuff is George Soros, with regard to uncertainty and “reflexivity” in markets. The idea behind reflexivity is that human behaviour alters stocks and the stocks alter human behaviour. If 10,000,000 aliens came to Earth and started buying Dell stock, no amount of technical or fundamental analysis would have predicted such a radical increase in the stock price. At some point, when the Dell stock gets high enough, people get satisfied with their gains and sell, which is the stock affecting human behaviour. Alien or human behaviour can’t influence the hard sciences in the same way.

    Ultimately, I think that the question of human fallibility comes down to proof. People make mistakes in computer science and mathematics all the time; they just have inequalities and program malfunctions that make it easier to correct those errors. The social sciences (and to a lesser extent, economics and psychology) do not have these same undeniable and rapid proofs.

    Unfortunately though, this concept of “inherent human fallibility” does not have a proper name, which is a shame.


  3. I cannot find a direct reference to this analysis of Nietzsche (e.g. The Cambridge Companion to Critical Theory, chap. 6: Adorno’s Aesthetic Theory), but I am citing here the Heterodox Out Loud podcast, 27 Jan 2022:
    19:20 – Chris Martin: “..CRT is actually a bit more of a psychological theory, it comes from critical theory which has its roots to some degree in Friedrich Nietzsche, the German philosopher who said that our cognition is shaped by our self-interest. So sometimes we believe certain things are true not because we have enough evidence but because it is convenient for our self-interest. So we’re not really reliable perceivers of the world, we are not even reliable thinkers.”
    21:30 – Zach Rausch: “I also have not heard of CRT being originated from Friedrich Nietzsche, which I think is a really interesting point and how it is fundamentally about; we’re not really rational thinkers and we are biased towards ourselves and our own group interests, and that can cause problems when developing laws and other ideas.”


  4. I don’t think there is a name for this argument. What you’re arguing against could be called “arguing from intuition” or “the rationalistic fallacy,” but those are pretty niche terms for it. I think awareness of the fallacy of rationalistic arguments is too low for arguments against it to have become name-worthy. Though, that’s just my only-somewhat-informed take on it.

    As for the solution to this problem, I don’t know for sure. I think not granting legitimacy to anyone who doesn’t use quantitative data as the foundation of their belief system is a start, though.


  5. There is always a “right” answer. The grey areas are what result when trying to support the wrong answer.

    In the debate over quarterbacks, only one can be the best. What you are really debating is the specific, limited criteria to determine that.

    And if you do set the criteria, and one of them is not something like “current professional American football quarterback”, then you need to apply those criteria to ALL people who have a chance to meet them.

    For example, the UCLA “men’s” college basketball team went 100 games without a loss. The Connecticut “women’s” basketball team broke that record. Is that an apples-to-apples comparison?


    • Stop trying to sound smart. What you’re saying is almost completely unrelated to what Sean was talking about, and it isn’t even cohesive. Plus, what I think you’re trying to say isn’t even in dispute. What Sean (the author) is talking about is the mostly-lolbertarian method of arguing things from intuition, such as, “Assume X and Y, therefore Z.”

      To your credit, you have gone on to say that you’re “decades old,” so I’m going to guess you’re a boomer and all the nonsense you spew is from your dementia.


        • No, you’re a fucking retard and everyone else can see it. You make generalized claims about everything without providing any evidence for them, and you can’t even form a coherent comment. Most of what you say doesn’t even make sense. You come to this blog completely unfamiliar with the author or even the author’s former work on this very blog, you leave your shitty comments that are almost entirely unrelated to what the author is talking about, and you certainly say them without evidence. If you’re going to misunderstand or ignore what the author is saying, you shouldn’t be here being an eyesore and a distraction to the people who actually understand what the author is talking about and don’t ignore it. So, how about YOU get your nose out of Sean’s butt.

          TL;DR
          Okay, boomer.


        • You still haven’t cited anything for your claim that, “Are any of those studies rigorously tested and repeated? No. Not too long ago some organization decided to test a bunch of studies by exactly repeating them. The new results weren’t even close to the originals,” either. So, I’m going to ask you again.

          Why is it that you can’t post evidence for these claims you’re making? Who established these supposed standards you’re referring to? How do you know these standards are accepted as scientific and as the consensus? What stops these standards from disqualifying other things? How is your “lived experience” more statistically meaningful than the lived experience of the hundreds of thousands of people surveyed, studied, and analyzed in all the studies you’re trying to write off? What evidence do you have that these studies fail to replicate?

          If you’re unable to post any citations, you’re a fraud. We can debate on these things, but you have to actually be willing to consider the evidence I provide. And if you think I have no legitimacy, well buddy, that can go both ways. I’m not going to waste my time talking to some sub-80IQ boomer on these day-1 lolbertarianism non-arguments. Again, if you do not give any citations (and your citations can’t just be some dumb news articles, they have to be raw data sets or scientific journals), I will not consider what you have to say and you will have shown to everyone else you have no actual basis for your knowledge.

          TL;DR
          Post citations or you’re a retard.

