RationalIrrationality

In “Why People Are Irrational about Politics”, Michael Huemer argues that “it’s sometimes rational to be irrational”.

What does he mean? Well, it makes sense if you distinguish two meanings of “rationality”:

  • epistemic rationality: forming beliefs in the way most likely to arrive at the truth
  • instrumental rationality: believing and acting in the way that best serves your interests

And sometimes those two clash - i.e. it can be in your interest to be stubborn and close-minded, and to hold irrational beliefs (see also SelectivelyOpenMinded).

Among others, this is a reason to be suspicious about voting (see AboutVoting).

Nice quote:

The problem of political irrationality is the greatest social problem humanity faces. It is a greater problem than crime, drug addiction, or even world poverty, because it is a problem that prevents us from solving other problems. Before we can solve the problem of poverty, we must first have correct beliefs about poverty, about what causes it, what reduces it, and what the side effects of alternative policies are. If our beliefs about those things are being guided by the social group we want to fit into, the self-image we want to maintain, the desire to avoid admitting to having been wrong in the past, and so on, then it would be pure accident if enough of us were to actually form correct beliefs to solve the problem. Analogy: suppose your doctor, after diagnosing your illness, picks a medical procedure to perform on you from a hat. You would be lucky if the procedure chosen didn’t worsen your condition.

I believe this is pretty relevant to online communities too.


See ThinkingGoo and PassagesOfPerspective for more on belief formation.

CategoryReasoning

Discussion

I remember Tocqueville writing about how in “times of trial” (such as the American Revolution), men tended to elect awesome leaders, and in “normal times”, men tended to elect pretty average leaders. It makes sense when you consider that unless there’s something really serious going on, people have no reason to pay too much attention to all the aspects of politics.

This page should probably be made a bit more about online communities, and less about politics.

(This was originally linked to in ThinkingGoo, but it may deserve its own page)

Something seems wrong with this page. It reminds me of the constructivists saying “if only we could drop the concept of truth we could all live peacefully together”. - If only we could all be rational then all problems could be solved.

The point is that rationality, like ethics, only works within a reference system. If you do not share the reference system (e.g. goals or values), then the results will differ.

It is not irrational to have different goals. It is irrational to expect people to share the same views. It seems irrational to try to solve problems by unifying rationality.

I don’t interpret this as meaning “If only we could all be rational then all problems could be solved.” - there’s also an element of “we can’t expect everybody to be fully rational, because there are these forces working against it, so let’s be cautious of systems like voting that only work if everybody is fully rational.”

I don’t know about constructivists (I don’t think I’m one), but it seems more straightforward to me that “being rational about the decisions we make as a group” is overall something positive. The original essay isn’t so much arguing in favour of rationality (as I imagine the constructivists would do about dropping the concept of “truth”) as finding some specific instances where we’re not being rational.

Also, even if we aren’t rational, it’s better for us to be aware that we aren’t rational, rather than believing that we are rational and anybody who thinks differently from us is stupid. Being SelectivelyOpenMinded isn’t really “epistemically rational” (i.e. it’s not the best way to get to the truth), but it’s ok when you’re open and straightforward about it. So it’s not as much about unifying rationality as recognizing irrationality.

To link it to the other topic at hand, MetaPhysics: it’s one thing to believe that it’s a proven fact that one’s date of birth determines one’s character, and that “established science” is just ignoring the evidence, and another thing to accept that the zodiac is probably bunk, but that at least it makes one pay attention to the right things. So believing in the zodiac may be wrong but useful. (The application to religion is left as an exercise to the reader.)

I’m glad this page is here, but the concepts are not clear in my mind.

Here are the clear & distinct ideas I see eye-to-eye with:

  • Joining a particular group or transforming your life to some end is often a package deal: You’re going to have to / be expected to accept a lot of beliefs, all at once, and really change your head.
    • This can represent a conflict between your self-interest (joining this group,) and what you understand to be true.
    • Specifically, the case is often brought up: Joining the wealthy. The wealthy have a lot of money (self-interest,); They also tend to have views on things that can be a little askew. (BusyPeopleLackPerspective.)
    • What makes sense to do for your self-interest (“Instrumental rationality”) can be in conflict with what you have learned and understand (“Epistemic rationality.”)
    • Hence the phrase: “Rational Irrationality.”

There’s a book called “Global Brain” that I read recently; Chapter 8 is called “Reality is a Shared Hallucination.” It details many times in which the individual’s rational self-interest (specifically, desire to conform) trumps the individual’s concept of what is true (what is observed with the individual’s own eyes.)

Here’s one specific example he gave:

There was a study (Solomon Asch, early 1950s) where he made cards with two lines: some with one longer than the other, and some where the lines were the same length. He prepped 9 volunteers to say that most of the equal lengths were the same size, and most of the different lengths were different sized. But there were a few where he told them all: always get this one wrong. Then he’d introduce a 10th volunteer, who didn’t know about the experiment.

He’d ask everyone “same” or “different,” and 75% of the time, when they got to the ones that were tricks, (where the 9 volunteers were colluding to give the false answer,) the 10th volunteer would just go with whatever the group said. “What’s this?” (pointing to different sized lines.) “They’re equal!” 75% of the time, the 10th person goes right along: “They’re equal!”

After the experiment was over, he’d ask the 10th person: Did the peer pressure make you feel uncomfortable? Many (the book doesn’t indicate minority or majority) said that they actually saw what they said.

He gives a ton of other examples.

Continuing with the remaining ideas:

  • “Among others, this is a reason to be suspicious about voting (see AboutVoting).” – I suspect this one makes Helmut uncomfortable, because Helmut passionately defends voting. I don’t think I agree that RationalIrrationality (the conflict between Instrumental Rationality and Epistemic Rationality) is “a reason to be suspicious about voting.” It would only be suspicious if voting were a mechanism to answer “What is true?” Rather, voting is only a mechanism to answer: “What shall we do?”
  • The idea that political irrationality is the greatest social problem humanity faces. – I disagree with this. I think we can solve the problems of poverty without having correct beliefs about poverty. What this line of thought seems to argue is that if you don’t have near universal correct understanding about a problem, you can’t solve it. It seems like the waterfall model, all over again.

What else could possibly work? Evolution. (And, take his quotation, and apply it to nature trying to solve its problems, and the error is clear there, as well.) Try 10,000 things, and then see what worked. Mutate and reapply. Different niches likely have different fundamental problems, actually. That is: Poverty in Orlando may well have fundamental differences from Poverty in Uganda, but similarities to Poverty in (Wherever.) It’s possible that the correct way to approach the problem is to make a detailed model of poverty and its solutions, and to apply the carefully prepared solution. And it’s good that there are people who try that route: Again, that’s part of the evolutionary answer to the situation. But it’s also possible that there are some kids on the street setting up lemonade stands, and some churches out there giving people hope, and some Hollywood stars visiting the area drawing tourists, and, … (and so on, and so forth,) and that one of these may be particularly effective, and so on.

If the only way were to form proper beliefs, and administer the right serum, then it seems more likely to me that political irrationality would be the greatest social problem we face.

This all said: I think political irrationality is definitely a problem; I do wish people had more backbone and thought more about what they do and do not know. :) We all know that confidence in opinion is a sign of strength and so on and so forth, and people are happy to parade soldiers on the way to kill for no good reason. I’m watching it in my country right now.

My personal lessons from this are:

  • Guard your thoughts, guard your thinking, be skeptical of knowledge.
  • When you are taking a course of action, think, speak, and act assured of certainty. (“The snake, knowing itself, strikes quickly.”)

George Bush’s actions, in this light, stand explained. The problem isn’t that the process was wrong or bad; The problem is that the majority of the people in the US are uneducated, throughout the ranks. This causes the US to act in a way that is comparable to a psychopath.

I agree with Helmut that we can’t seek solutions by trying to unify rationality, by trying to “get us all on the same page.”

As for the paper itself:

  • I think I agree with (A) “Miscalculation.” Or rather, “multiple interpretation.” I think that the events in the world can be interpreted in a zillion different ways.
  • I think people are irrational / rational as I described in ThinkingGoo: There are clearly chunks of logic and reason in there. But the general framework is basically irrational.
  • I agree with the practices encouraged by the paper: We should give serious time to objections to our arguments. We should think before making serious political claims during opinion formation. We should hesitate before believing something that it is in our self-interest to believe. We should take into account the irrationality of others, and not just go along with it. We should be fair-minded to other perspectives.

I suspect that Helmut would agree with the general gist of the argument, though not the particulars of the phrasings.
