Suppose you have a lot of reasons to believe a thing, and no good reasons to not believe it. Should this make you more or less likely to believe it, relative to a case where the considerations are a bit more mixed?
At first glance, it appears you should believe it more, since things with lots of good reasons for them and no reasons against them tend to be true.
However, often when people have many reasons for a thing and no reasons against it, it is because they have been collecting them, probably unintentionally. Humans seem to do this when they have a belief they care about.
For instance, when I was younger I could have told you fifteen reasons that logging old growth forests in Tasmania was harmful. They were more economically valuable as a tourism attraction, and the logging was perpetuating corruption, and the forests harbored endangered species, and so on. Somehow, coincidentally, almost all of the considerations aligned.
For another instance, vegetarians often think that vegetarianism is very easy, and healthy, and moral, and good for the environment, and more enjoyable. Meat eaters often think that vegetarianism is inconvenient, and unhealthy, and not morally important, and worse for the environment in certain ways, and unpleasant. It's even less likely that all of the considerations would align, but in opposite directions depending on who performs the supposedly unbiased analysis. I think here both groups are aware of some people on the other side doing this.
This seems common enough that if you find yourself with a collection of considerations all pointing in one direction, you should be somewhat worried.
On the other hand, often you have lots of aligned reasons because there is some fundamental reason behind them all. For instance, if you can't prove a statement in math, because every different way you can think of to try to prove it fails, this may be because it is false. Here you expect all of the evidence to point in one direction.
A less clear case is whether exercise is good. It seems there are lots of different good reasons to exercise. But all of them go through you being healthy, so this is not so surprising — you will look better, you will feel better, you will live for longer, you will be more sane and happy.
Some situations are more conducive than others to evidence all pointing one way rather than coming out in different directions. If the question is whether something is on net good, and it has a variety of effects, probably some should point in different directions. If the 'considerations' are a number of somewhat noisy measurements of a hidden quantity, then if the first measurement is high, probably the others will be too.
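This contrast can be sketched numerically. In a minimal simulation (all the specific numbers here are illustrative assumptions, not anything from the post), independent effects of a policy split roughly evenly in sign, while noisy measurements of a single hidden quantity mostly share that quantity's sign:

```python
import random

random.seed(0)

def independent_effects(n=10):
    # Each effect of the policy is an independent draw, so its sign
    # is essentially a coin flip.
    return [random.gauss(0, 1) for _ in range(n)]

def noisy_measurements(n=10, hidden=2.0, noise=1.0):
    # Each 'consideration' is the same hidden quantity plus noise,
    # so the measurements tend to share the hidden quantity's sign.
    return [hidden + random.gauss(0, noise) for _ in range(n)]

effects = independent_effects()
measures = noisy_measurements()

print(sum(e > 0 for e in effects), "of", len(effects), "independent effects positive")
print(sum(m > 0 for m in measures), "of", len(measures), "measurements positive")
```

Run repeatedly, the measurements almost always agree in sign while the independent effects rarely do, which is the asymmetry the paragraph describes.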
In whatever situation, you should also expect some things to come out with all of the evidence pointing in one direction, by chance. You might also expect all of the considerations to come out one way due to a selection effect in combination with chance. For instance, if you thought of a particular example because it was the case which you think is most overwhelmingly lopsided, then it was selected for being lopsided, and so this is less surprising than if another random belief had this character.
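Both the chance and the selection effects can be made concrete with a rough sketch (the numbers are made-up assumptions): with n independent 50/50 considerations, all n align by chance with probability 2^(1-n), but if you scan many questions and report the most lopsided one, lopsidedness is exactly what you selected for:

```python
import random

random.seed(0)

def considerations(n=8):
    # n independent coin-flip considerations: +1 for, -1 against.
    return [random.choice([1, -1]) for _ in range(n)]

# Chance alone: all n considerations align with probability 2 ** (1 - n).
n = 8
print("P(all aligned by chance) =", 2 ** (1 - n))  # 1/128 for n = 8

# Selection effect: survey many questions and report the most lopsided one.
questions = [considerations() for _ in range(200)]
most_lopsided = max(questions, key=lambda q: abs(sum(q)))
print("most lopsided question's net score:", abs(sum(most_lopsided)))
```

Any single question being fully one-sided is rare, but the most lopsided of a few hundred questions will reliably look striking, so noticing it is weak evidence of anything beyond the selection.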
I think if you find yourself with lots of aligned beliefs like this, you should consider asking yourself:
- Is there some root consideration pushing all the other considerations one way?
- Am I motivated to believe this thing?
- Is this a kind of situation where I should expect all of the evidence to point in one direction?
This seems similar to the distinction between confidence inside and outside an argument, with your own understanding playing the role of the argument. The general pattern is to treat your cognitive tools as potentially flawed.
I don’t think we should be particularly suspicious (epistemically) of our thinking when it suggests one-sided conclusions, as opposed to more balanced ones, since balanced conclusions can also be flawed. Instead it’s instrumentally useful to pay more attention to the conclusions that could be overturned, and whose change would be influential. One-sidedness suggests a particularly easy direction of investigation, but it might not be a very efficient one, since most such conclusions are simply true and you won’t succeed in finding important flaws in your reasoning.
On the other hand, if one has exerted conscious effort to find reasons against, and still can’t find any that come close to plausibility, we might return to treating the one-sidedness as a strong reason to believe.
(To take a comically extreme example, there is, after all, no good reason to believe the Nazis continue to plot from inside the hollow Earth, and many reasons not to.
This is not because we’ve only bothered to accumulate evidence that suggested they really were defeated in 1945, but because they actually were – and the Earth isn’t hollow.)
Still, an excellent warning to re-examine the filters we use for acquiring our base information, much like my strategy of deliberately trying to avoid any cocooning in information sources.
This reminds me of the zen buddhist cause and effect parable about the farmer, his son, and a horse. Or perhaps the evolutionary story of life on earth throughout the history of extinction events. Being a moral relativist, I don’t assume my perceptions of right and wrong to be right or wrong. I embrace the paradoxical phenomena of being a violent monkey somehow self aware in an infinite universe. I find awe and wonder to be helpful, and scientific certitude a misnomer; you know, like military intelligence. It seems our reason so often involves subjective, oxymoronic self-interest that blinds us to an objective purpose to being alive. Forgive me if I go off on irrational tangents, I don’t have the benefit of higher education to temper my imagination.
Finding lots of aligned beliefs related to a randomly chosen proposition is suspicious. Finding lots of aligned beliefs related to a proposition for which you have noticed lots of aligned beliefs is much less suspicious. Just the fact that you noticed it means that it’s probably going to be unusual.
I think this captures a lot of the idea:
http://lesswrong.com/lw/gz/policy_debates_should_not_appear_onesided/