Monthly Archives: November 2009

Extremes of reliability and zealotry

Opinions and actions are spread across continua. Those at the ends are sometimes called ‘extremist’, ‘fanatical’, ‘fundamentalist’ or ‘zealous’. These are insults, or invitations to treat their supporters without seriousness. Other times the far reaches of a continuum are admired as ‘sticking to one’s principles’, ‘consistent’, ‘loyal’, ‘dedicated’, ‘committed’. Claims of certainty and crossing your heart and hoping to die are also looked upon well. So what’s the difference? Obviously the correct answers to some questions are at the ends of spectrums, while others, such as optimal trade-offs, tend to have more central values. Is this what determines our like or dislike for centrism and extremism? Let’s look at some examples from my understanding of popular opinion.

Things you should be extremist on:

  • What’s the worth of a human life?
  • At what degree of temptation should you cheat on your partner? Break the law? Break a promise?
  • How long should a marriage last?
  • How much does average IQ differ across races?
  • How much should a pedophile be willing to pay for you to let them have a child?

Things you should not be extremist on:

  • How closely should we follow a single political principle, such as libertarianism or communism?
  • What proportion of situations should you analyse in terms of a single theory?
  • How much of your sacred text is literally true? How much should it influence your life?
  • To what degree should one principle, such as utilitarianism, define your ethical views?
  • To what degree should you rely on reasoned thought for opinions?
  • How much of your time should you devote to a single activity (with the exception perhaps of looking after your family)?

I can’t see that the first list contains fewer trade-offs than the second list. In fact it probably has more. So what’s the pattern?

The one I see is whether commitment is to an impersonal idea or to a group or person. If you take a centrist position on your personal and group loyalties you are something between flaky and treacherous. You are not supposed to trade off friends. On the other hand strong commitment to a policy position, theory, type of analysis, ethical standpoint, or other impersonal influence on behavior is unbalanced, biased, radical, dangerous, and consists of seeing everything as nails. It’s worse to belong to an edge political party than a central one, but worse to be undecided (central) on which group you belong to than to pick one and support it loyally.

This seems to make sense evolutionarily, as it is important for humans to have loyal associates, and not important for them to have associates who are committed above all else to something abstract that they might sacrifice your welfare for at any time. Ideas do not have babies with you or share their mammoth. Ideas are handy of course, but you want your associates to use them flexibly in the pursuit of upholding their social commitments, rather than using their social commitments flexibly in the pursuit of other principles.

What about sticking to one’s principles? That seems a praiseworthy non-human-related extreme. Can you be praised for sticking to any principles though? No. Principles about loyalty, compassion, and honesty are good for instance, but principles like ‘always work when you can, regardless of what your wife thinks about it’, ‘always walk on the left hand side of telegraph poles’, and even committed utilitarianism impress few. Again it’s all about absolutes of reliability to others.

Why evaluate everything ASAP?

When I was young my brother and I used to play a game where we would page through books of wild animals or foreign places and on each page pick our favorite one, just for the pleasure of choosing. Driving in the country, looking in a museum, or window shopping in a mall, people point out to one another which things they like or dislike. We choose favorite colors, places, Dr Who incarnations, reality TV contenders, political personalities, reasons for disregarding postmodernism. On hearing the news, meeting new people, or learning gossip, the next step is usually to make a value judgment about the parties involved. Of all the characteristics everything has, the one we are itching to establish first is our own judgment of value.

At first this seems to make sense. How much we like something is key to how we respond to it. However plenty of the things we so keenly evaluate we can’t easily respond to, and don’t try to, apart from voicing our opinion. Do we waste so much time and thought judging things we can’t influence as a sad byproduct of judging things we can influence? It sometimes looks like we even seek out things to evaluate – a costly byproduct that would be! But perhaps in the distant past we hardly came across anything so far away that we couldn’t influence it? That seems wrong; while we saw less we could also influence less.

The only good explanation I can think of is that the response we do often make to our judgments – voicing them – is the purpose. Why would we want to voice evaluations so? One explanation is that it is in the hope that someone else will fix things for us, but this explanation faces the same problem as our original explanation; most of the things that I can’t affect are no easier for my friends to affect. No matter how much I tell my brother that I like pterodactyls most, he doesn’t do a thing. I don’t get my Dr Who and that cloud I thought was pretty evaporated before anyone lifted a finger.

A last theory is that it’s all about hinting to others what sort of people we are. To that end, irrelevant preferences should be quite useful, just as our more obviously image-improving activities, such as careful dressing, are. A signaling theory makes the things furthest from our influence better to demonstrate opinions on, as we aren’t constrained by the need to make useful decisions about them. I often make inferences about people from their expressed valuations, as I think everyone does. The knowledge of this presumably influences people’s expressed valuations, as people are almost ubiquitously sensitive to having inferences made about them. If these two things are true, it should be hard for expressed evaluations not to become somewhat repurposed for signaling, and there is little reason to restrict them to topics in reach.

Do what your parents say?

Should you feel compelled to pay any heed to what your parents want in your adult choices? I used to say a definite ‘no’. My mother said I should do whatever I liked, and I vowed to ignore her. From a preference utilitarian perspective, I guess that virtually all aspects of a person’s lifestyle make much more difference to that person than to their parents. If you feel a sense of obligation in return for your parents giving you life, why? You made no agreement, and your parents took their chances in full knowledge you might grow up to be anyone.

However, what if fewer parents would take their chances if there were a greater risk of their children turning out less satisfactory to them? The biggest effect of taking your parents’ preferences into account more could be via increasing other prospective parents’ perception that children are worth having. It may be a small effect, but the value of life is high.

I’m not sure how much of a difference the expected agreeableness of children makes to people’s choices to have them. At first it may seem negligible. Most people seem to like their children a lot regardless of what they do. However if a person were guaranteed that their child would grow up to be exactly the opposite of what they admire, I would be surprised if there were no effect, so I must expect some gradient. I haven’t seen any data on this except my mother’s (joking?) claim that she would’ve aborted me had she thought I would be an economist. I’m not about to give up economics, but I do visit sometimes, and I painted the new living room and helped with my grandmother’s gardening since getting here this time. See how great descendants are? I would be interested if anyone has better data.

Why is poor communication popular?

A great insight expressed in one sentence seems obvious, no matter how many people spent how much of history not coming up with it. The same insight alluded to and digressed from for hours on end seems like a fantastic mountain of understanding. Why is this? I think Paul Gowder’s explanation for people liking bad books probably extends to partially explain this:

Why do people read bad books…[and] why do so many…end up praising them? …

1. The sunk cost fallacy. You get fifty, a hundred pages into Atlas Shrugged or something and you’ve bled so much — you’ve invested so much into getting through this book, tortured yourself with so much bad writing and so many stupid ideas! How horrible would it be to waste all that effort! Better grind on and finish. Or so we tell ourselves. Because we’re irrational.

2. Cognitive dissonance. You’ve read all of Of Grammatology! Holy shit that was unpleasant…You’re not sure whether you actually learned anything enlightening, or whether old Jacques was just spitting jive. But wait! You’re a rational person! You’d be a fool if you’d spent a hundred hours and endless tears trying to make sense of that stuff and it turned out to be nonsense. Therefore, it must be very wise and you should defend it and demand others read it! Or so we tell ourselves. Because we’re irrational.

Hat tip to Mike Blume. I’ll add:

3. Less concise works can easily be designed to cheat quality heuristics. You are always better off guessing whether a work was good or bad than admitting you didn’t understand it well and can’t remember most of it, because that does not distinguish you from the stupid people who didn’t understand or remember it because they don’t understand or remember anything. If you are going to guess, you use heuristics. Many of the same things that make writing less comprehensible also lead people to guess it is insightful: complexity, length, difficult words. If you can’t follow well enough to confidently compress it into the one sentence version you would have thought obvious, you will likely guess that it contains more than one sentence worth of interesting content.

Why you should listen to your heart

Follow your heart… Trust your instincts… Listen to your feelings… You know deep inside what is right…etc

– Most people

Humans continually urge and celebrate others’ trust of their ‘heart’ over their ‘head’. Why?

One explanation is that it’s just good advice. I admit I haven’t seen any research on this, though if it were true I would expect to have seen some evidence. If overly emotional people did better on IQ tests, for instance, we would probably have heard about it, but perhaps hearts aren’t good at that sort of question. They also aren’t good at engineering or cooking or anything else concrete and testable that I can think of, except socializing. More people struggle against their inclination to do what they feel like than struggle to do more of it. Perhaps you say it isn’t their heart that likes masturbating and reading Reddit, but that really makes the advice ‘do what you feel like, if it’s admirable to me’, which is pretty vacant. Perhaps listening to your heart means doing what you want to do in the long term, rather than those things society would have you do, which are called ‘reason’ because society has bothered making up reasons for them. This seems far-fetched though.

Another explanation is that we want to listen to our own hearts, i.e. do whatever we feel like without having to think of explanations agreeable to other people. We promote the general principle to justify using it to our hearts’ content. However if we are doing this to fool others, it would be strange for our strategy to include begging others to follow it too. Similarly if you want to defect in prisoners’ dilemmas, you don’t go around preaching that principle. A better explanation would explain our telling others, not our following it.

Another explanation is that this is only one side of the coin. The other half the time we compel people to listen to reason, to think in the long term, to avoid foolish whims. This seems less common to me, especially outside intellectual social groups, but perhaps I just notice it less because it doesn’t strike me as bad advice.

My favorite explanation at the moment is that we always do what our hearts tell us, but explain it in terms of abstract fabrications when our hearts’ interests do not align with those we are explaining to. Rationalization is only necessary for bad news. Have you ever said to someone, ‘I really would love to go with you, but I must submit to sensibility and work on this coursework tonight, and in fact every night for the foreseeable future’? We dearly want to do whatever our listener would have, but are often forced by sensible considerations to do something else. It never happens the other way around. ‘I’m going to stay in tonight because I would just love to, though I appreciate in sensibleness I should socialize more’. Any option that needs reasons is to be avoided. ‘Do what your heart tells you’ means ‘Do what you are telling me your heart tells you’, or translated further, ‘Do what my heart tells you’.