Imagine ten people trying to guess the number of balls in a jar. Broadly speaking, they fall into three groups, each group using a different method to figure out the answer. Despite these groupings, ten different answers are given. Each person feels that their method produced the correct answer, that the other methods are in error, and that the other people who claim to have used the same method applied it incorrectly.
A few things are clear. First, since the ten answers are all different, at most one of them can be correct; so, assuming someone got the right answer, the probability that any given person is that someone is only 10%. Second, no one can justify confidence in their own answer by pointing out that their reasoning seems clearly correct to them and everyone else's seems clearly in error, since everyone has this same perception, including the nine who are wrong. Each person would need some further, independent justification. Without it, they should simply note that one answer seems most reasonable to them, while accepting that there is only a 10% chance they are right and a 90% chance they are wrong.
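To make the symmetry explicit, here is a minimal simulation sketch (my own illustration, not part of the original thought experiment): stipulate that exactly one of the ten guessers is correct, adopt the perspective of a randomly chosen guesser, and count how often that guesser turns out to be the right one. Since every guesser's inner sense of being correct is the same, nothing distinguishes "you" from the others, and your credence should match the base rate.

```python
import random

def trial(n_guessers: int = 10) -> bool:
    """One round of the jar scenario: ten distinct answers, exactly one correct."""
    correct_index = random.randrange(n_guessers)  # the one guesser who happens to be right
    # "You" are one of the guessers. Your method feels obviously right to you,
    # but that feeling is shared by all ten, so it carries no information here.
    you = random.randrange(n_guessers)
    return you == correct_index

n_trials = 100_000
wins = sum(trial() for _ in range(n_trials))
print(f"You were the correct guesser in {wins / n_trials:.1%} of trials")  # ~10%
```

The point of the sketch is just that, absent independent evidence that your method is better, the honest probability assignment is the symmetric one.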
There are important areas of human thought that seem somewhat analogous to this. The odds of someone being correct are not always as low as 10%, but they are clearly too low to warrant confidence, and they make clear that the feeling that your line of thought is obviously right and others' obviously in error doesn't track truth very well in these domains. Two obvious examples are religious faith and moral intuition. If we think of these as "methods," it is obvious that they are unreliable in the relevant sense: different individuals and populations using the same methods produce very different results.
Methods of knowledge to which this sort of analogy clearly does not apply include simple perception and basic mathematical intuition. These produce essentially the same results for everyone and so can be considered reliable in the relevant sense.
None of this is very concerning to me. What does bother me is the potential for this sort of analysis to apply to a method we can broadly call rationality, that is, looking at the relevant evidence and inferring the most probable conclusion, in the domains of social science and philosophy. Comparing individuals within a culture, and people across cultures at different places and times, it is clear that many people who thought they were rationally forming beliefs about philosophical and social scientific questions were wrong. So unless we have some additional reason for confidence, we have to conclude that if we are such a person, there is a high chance, for many questions greater than half, that we are wrong. This holds no matter how clearly we seem right and others seem wrong, so long as, so far as we can tell, the people who think we are wrong have similar feelings about their own lines of reasoning.
One escape we might hope for is to appeal to knowing lots of facts or to following some more specific methodology like "the scientific method." But a quick look at highly informed laymen, intellectuals, and people who claim to take a scientific approach to social science or philosophy, even within current Western culture, let alone across other cultures, makes it clear that a great diversity of opinion exists even among this subset of people. The problem is even more severe if, like me, you think most intellectuals and social scientists have failed in their use of reason and consequently believe many false things.
We might be tempted to say that our preferred belief was produced by a more correct understanding and application of the scientific method and by a superior selection of relevant facts. We might even feel that people who disagree with us would come to agree if they could see our facts and understand our method. We all know this rarely happens in debate, when we attempt exactly that, but we can blame the failure on the irrational mental state of the people we disagree with. Of course, lots of people think all these things while holding a diverse range of views on many topics, so feeling this way does not help us justify trusting our own reason over theirs on these sorts of issues.
If this argument is valid, it may not shake our feeling that our reasoning on these topics is trustworthy. But it would imply that this feeling amounts to an implicit belief that we have a special ability, which most people lack, to tell good arguments from bad and to form true beliefs, and that this belief is not only unjustified but, so far as the evidence indicates, probably false.
This argument is fairly simple, and it seems like the sort of idea that others would have thought worth discussing, so I assume it has a name. I don't know that name, though, and my hope is that someone who reads this will be able to tell me what it is, so that I can see what responses people have come up with. Beyond that, the point of this article was simply to explain the argument as clearly as I could, without attempting to rebut it myself.