I often look away when I order food from a place where I can see it being prepared, because I expect to see things that will make me doubt the safety of dining there. Similarly, I prefer to sleep when I am sick, watch loud TV while on airplanes, and buy foods and drinks rather than make them myself, where I would see how they are made.
This is all, of course, irrational. If I expect that opening my eyes will show me evidence that will make me believe X, then I already believe X. Or I should, if I am rational and expect to remain so. In these cases I don’t expect to remain rational. I quite reasonably expect that if I receive particular pieces of evidence I will update too much, so I should not believe what I expect to believe in the future, conditional on collecting evidence. I should avoid the evidence.
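To state that consistency requirement formally (this is a standard identity, sometimes called conservation of expected evidence, offered here as a gloss on the argument rather than something from the original post): for any partition of possible observations $e$,

$$P(X) = \sum_{e} P(e)\,P(X \mid e).$$

A rational agent's current belief already equals the probability-weighted average of the beliefs she expects to hold after looking, so she cannot expect looking to move her belief in any particular direction on net.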
The ‘avoid evidence’ solution doesn’t seem like a very good one though. If I recognize that updating so much is irrational in time to avoid the evidence, why don’t I just recognize it when I get the evidence, and not update so much?
Perhaps I am just full of irrational fears that I can't control by mere will and reasoning. I don't think that's quite it, though. Intuitively, the problem seems to be that while I believe I will vastly overweight any evidence that my sandwich is dangerous, when I actually see the rashy hand go into the lettuce or whatever, it's hard to judge whether this isn't one of the rare occasions when I should be concerned. The specific piece of evidence looks different every time, so it's hard to convince myself that a novel event that looks bad in the moment really belongs to the reference class of evidence that looks bad but isn't dangerous.
Do other people behave this way? How should they behave instead? How do you fix this?
Kaj Sotala has a fine post about this phenomenon: http://lesswrong.com/lw/72d/strategic_ignorance_and_plausible_deniability/
It is indeed very difficult to allow yourself to see information that might challenge your sense of self-efficacy, your feeling of safety, or some other cherished belief.
I don’t know wholly what to do about this, but I have some ideas:
* Use known “mental checklist” techniques to realize when you’re doing this, learn what it feels like, and say “Oh, that’s what that feels like. Better not do that.”
* Imagine the worst case scenario of every possible decision, and then, since this is probably an overly optimistic guess at what will happen, keep imagining worse scenarios until you can’t anymore. Then, imagine even worse scenarios. Then you might have hit the point where reality will actually have a chance of being either better or worse than your prediction.
* If the fear itself is irrational, write down the real consequences, and not just the affect associated with them in your mind. In cognitive-behavioral therapy, this tends to make you realize how stupid your emotions are.
Grognor: The worst-case-scenario trick only applies to the planning fallacy, not life in general; people both overrate and underrate external risks, whereas they're uniformly overconfident about plans they're enacting. (For instance, an anxious person might imagine that drinking expired milk could lead to a fatal hemorrhage, substantially overestimating the plausible risks.)
You could just go with the flow, and become a closeted eccentric who cooks all her own food and never flies anywhere.
Avoiding evidence should not be confused with avoiding stimuli, even when they are correlated. Engaging in debate with people I expect to hold different opinions from mine, on topics that are important to me, is almost always a reliable predictor of a drop in my mood and well-being.
If you are instinctively afraid of something (flying, heights) or disgusted by something (icky food), the stimuli will create negative emotions in your brain that are usually not subject to the volitional parts of your frontal lobes.
It is just as rational to avoid these stimuli as it is to avoid touching something hot when you don't want to be burned. Maybe one day we will re-engineer our brains so that our volition is the "admin" of our emotions, but this is not currently true for most of us.
Deciding to face facts that might be bad consumes more energy than most people realize. (See my summary on decision fatigue here: http://tinyurl.com/7lnoxne) Pick your battles. Face the important facts, and don’t trouble yourself about listening to loud television on planes.
Develop a more proportionate response to the bacteria likely present in your food. Make a fermented food at home: if you can manage something like kombucha, where the bacterial growth is obvious, you'll worry less about bacteria overall. I don't think your other examples are similar. Sleeping when you are sick is what we evolved to do; eating, worrying, and so on all take up energy better used to fight infection. In flight, we are basically at the mercy of the crew, so it makes perfect sense to put your attention elsewhere.
One thing I should have been clearer about: the _specific_ connection between decision fatigue and the excessive revision of priors. I think this problem resolves when you make a careful decision that integrates the new evidence. Decision fatigue (or the threat of it) leads either to decision avoidance or to impulsivity, and over-updating is impulsivity expressed in the revision of priors in response to alarming new evidence. That's probably a more on-point answer to your questions.
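To illustrate what over-updating means here (a hedged sketch with made-up numbers, not something from the comment itself): Bayes' rule fixes how far new evidence may legitimately move a belief,

$$P(\text{danger} \mid \text{alarming sight}) = \frac{P(\text{alarming sight} \mid \text{danger})\,P(\text{danger})}{P(\text{alarming sight})}.$$

If dangerous food has a base rate of, say, 1 in 10,000, and alarming-looking preparation is only a few times likelier in dangerous kitchens than in safe ones, the posterior stays on the order of 1 in 2,000; treating the sandwich as near-certainly unsafe is the impulsive over-update.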
Fast Food Nation:
“During one experiment in the early 1970s people were served an oddly tinted meal of steak and french fries that appeared normal beneath colored lights. Everyone thought the meal tasted fine until the lighting was changed. Once it became apparent that the steak was actually blue and the fries were green, some people became ill.”
The part about food preparation gave me pause. Our stomachs rely on heuristics rather than rationality, and I suspect that your looking away is better explained as a hack to get around those heuristics than as a purely irrational reaction in itself.
Just a thought.