Changing Minds

I’ve long felt that most people do not change their minds about things that are important to them, including areas like religion and politics. Most people seem to pick a set of beliefs sometime in early adulthood or before and stick to them. It’s not that people are necessarily closed-minded as such. It’s just that there are major aspects of life for which no argument can be sufficiently convincing in practice.

This has always just been a pet belief of mine. I’ve rarely managed to convince people that it is correct, which of course just reinforces my belief. Oddly, however, there was recently some scientific research which somewhat supports it. The theory is, essentially, that people disregard facts that disagree with their beliefs in order to reduce cognitive dissonance. The research was based on facts seen in news reports, so one interpretation of it is that people don’t believe the news, and that it doesn’t say anything about personal conversations. Still, I find it interesting, and of course, if true, it has implications for the ideal of an informed voting population.

Here is a Google Docs link to the study. Here is a Boston Globe article. I don’t expect you to be convinced, unless of course you already are.


15 responses to “Changing Minds”

  1. fche

    When even the study/article peddles “facts” that are too vaguely stated to be true or false, no wonder people don’t take the purported data seriously.

    Maybe what is needed in news is rigour – challenged by the need to communicate compactly.

  2. Ian Lance Taylor

    I guess the question is whether more rigor, whatever that would look like, would actually make any difference. You can see the fake news reports used in the study starting on page 39. The correction paragraphs seem to me to be about as rigorous as one could expect from a news article, but those are the corrections which, according to the study, actually reinforced the contradictory beliefs. Note that to make their point they had to use articles on controversial subjects, as otherwise there would not have been any strong existing beliefs to override the facts.

    Of course there is a real sense in which all sociology seems made up. It’s really hard to extrapolate from these little experiments to any real notion of how people behave. In this case, since the study confirms my pre-existing bias, I would prefer to believe that it has some truth to it.

  3. Simetrical

    I always thought this was obvious. There are some collections of beliefs that are more or less interchangeable from an empirical perspective, in that no easily-attainable evidence will favor one much over the other. People spend far too much time arguing about these beliefs in the false hope that they’ll convince anyone.

    The exact beliefs might vary based on the person. For instance, a typical person will not easily find strong evidence for or against evolution — the evidence in favor exists and is quite strong, but it’s complicated. (Dendrochronology seems like the best easy evidence in favor, since everyone knows trees grow one ring per year.) Some religious beliefs are totally unprovable and unfalsifiable, and so are basically all moral beliefs.

    Some secular-humanist sorts try to pick beliefs from these equivalence classes by Occam’s razor. That only pushes off the problem, though, since others can just deny Occam’s razor as a general principle. (As I do — although of course it’s a useful heuristic in many cases.) Occam’s razor can’t be justified by empirical evidence, since it’s the foundation of empiricism, so that would be circular. Plus, it’s not rigorous at all, and you can argue about how to apply it.

    A lot of beliefs are complicated and involve both evidence-based and non-evidence-based parts. I’ve been thinking about global warming a lot in this respect, for example. Non-experts can’t realistically understand the evidence in favor of it, but they have to make a decision anyway, because the stakes are high and they will bear the consequences either way. The decision people make seems to be based largely on their attitude toward science, scientists, nature/the environment, technological progress, and so on — not specific evidence. I guess there’s no better way to formulate an opinion on questions like this. (Trusting the experts is not useful as a general strategy, and indeed people widely ignore what economists say, for instance.)

    Of course, most people *say* that their beliefs are based on evidence or sound reasoning, not more-or-less arbitrary basic attitudes and beliefs. But this is untenable on any contentious issue, because there are people on the other side who are just as smart and well-informed as you. It’s amazing how readily almost everyone will assume that anyone who disagrees with them is stupid, ignorant, or deluded, without explaining why those labels apply any less to themselves.

    When I argue about anything that touches on these beliefs, my goal is generally to either figure out what those beliefs are in more detail, or to discuss particular details that are actually provable one way or the other, or to convince the other person that the differences between our beliefs are largely arbitrary. Not many people play along with this strategy, though, and it’s exhausting, so now I tend not to argue about this sort of thing on the Internet. When I do, it tends to be with select people, like you, who I can trust to be reasonable. (But beware echo chambers . . .)

  4. Dan

    Until a few years ago, I was pretty comfortable with my set of “early adulthood beliefs”. Naturally, I considered myself open-minded and critical. It was actually these sorts of cognitive bias results that led me to question and eventually discard many of my beliefs (mostly political in nature). The trend I’ve noticed is that the views I’m most likely to regret later are those which evoke strong emotional reactions. Affective support is a great crutch for poorly-justified beliefs.

    Edsger Dijkstra said “Perfecting oneself is as much unlearning as it is learning.” In one sense, unlearning is the easier of the two – learning requires external input and the formation of new concepts, while unlearning can be accomplished incrementally by habitually questioning one’s beliefs. On the other hand, unlearning isn’t as emotionally rewarding by default. Confidence and righteousness feel good, while uncertainty is neutral at best, disorienting and distressing at worst. Epistemic honesty contains no artificial sweeteners.

    Generalizing from one example, my personal experience doesn’t give me much hope for the “informed voter” ideal you mentioned. The study you cite (and others in the same vein) suggests that directly confronting secure beliefs isn’t effective. The only remaining strategy I know of for spreading true beliefs is first convincing people to loosen their epistemic grip, but doing so seems to incur a psychological toll. We’re incentivized to prefer “knowing” to not knowing.

    Of course, I could be wrong.

  5. Ian Lance Taylor

    Thanks for the thoughtful comments.

    I think the strongest argument in favor of evolution is the “Panda’s Thumb” style arguments: cases where animals are not well designed for their environment, but there is a clear inheritance from ancestral features. This does of course rely on an argument from authority for the basic facts, but for most of us that is true for all facts.

    Climate change is a good example. Our society has gotten remarkably complex, and it seems inevitable that we are going to face increasingly complex issues. The essentials of carbon dioxide and the greenhouse effect are, in my opinion, easy enough to grasp, and as I’ve written before, anybody who wants to seriously argue against climate change needs to produce a solid argument for why the greenhouse effect is irrelevant. But it’s clear that many people refuse to believe that the climate is changing, and as far as I can tell there is no way that they will ever be convinced. It is possible with some effort to explain away all the facts of the matter. This is not an encouraging sign for our future.
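
    (To illustrate how graspable the essentials are, here is a back-of-the-envelope sketch of the standard zero-dimensional energy-balance model, with textbook constants; it ignores everything except radiation:)

    ```python
    # Zero-dimensional energy balance: Earth's equilibrium temperature
    # if it had no greenhouse effect at all.
    S0 = 1361.0       # solar constant, W/m^2
    albedo = 0.30     # fraction of sunlight reflected back to space
    sigma = 5.670e-8  # Stefan-Boltzmann constant, W/m^2/K^4

    # Absorbed sunlight, averaged over the sphere, must equal thermal
    # emission: S0 * (1 - albedo) / 4 = sigma * T**4
    T_bare = (S0 * (1 - albedo) / (4 * sigma)) ** 0.25

    print(f"Equilibrium temperature without greenhouse: {T_bare:.0f} K")
    print("Observed mean surface temperature:           288 K")
    # The ~33 K gap is the greenhouse effect. A serious argument against
    # climate change has to explain why more CO2 leaves that gap alone.
    ```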

    I agree that for many people living with uncertainty seems to be difficult. After all, there is very little any one person can do to affect complex aspects of our society, besides vote, and voting is a small thing and is only effective when one is part of a large group. So both avoiding uncertainty and seeking to effect change require adopting the beliefs of a larger group, and then that group serves as another bulwark against changing your mind. Hmmm.

  6. Simetrical

    I misspoke about evolution — I meant that the best easy evidence in favor of an old Earth (more than about 6,000 years old) is dendrochronology. There are fairly small groups of trees that can be matched up to get a continuous sequence of tree rings dating back about 10,000 years. I’d think this is hard to argue against, although I haven’t had a chance to try it yet. I don’t know of any particularly easy arguments for evolution, offhand.
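
    (The matching step is mechanical, which is part of why it is hard to argue against. Here is a toy sketch of crossdating, the sliding-correlation technique used to splice overlapping ring sequences; the data below is invented, and real dendrochronology adds detrending and significance tests:)

    ```python
    import numpy as np

    def crossdate(master, floating):
        """Slide a floating ring-width series along a master chronology
        and return the offset (in years) with the highest correlation."""
        n = len(floating)
        best_offset, best_r = 0, -1.0
        for offset in range(len(master) - n + 1):
            r = np.corrcoef(master[offset:offset + n], floating)[0, 1]
            if r > best_r:
                best_offset, best_r = offset, r
        return best_offset, best_r

    # Invented data: a 100-year master chronology, plus a 20-ring
    # fragment from an older tree that overlaps years 40-59.
    rng = np.random.default_rng(0)
    master = rng.normal(1.0, 0.3, 100)                  # ring widths
    fragment = master[40:60] + rng.normal(0, 0.05, 20)  # noisy overlap

    offset, r = crossdate(master, fragment)
    print(f"Fragment matches at year {offset} (r = {r:.2f})")
    ```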

    In case I wasn’t clear about climate change, I was saying that there’s no good way for a layman to formulate an opinion in either direction on what action to take. Even if you accept that global warming is actually occurring and is mostly due to carbon emissions, it doesn’t follow that cutting carbon emissions will help enough to justify the cost. There are economists who disagree with that, and not just crackpots as far as I can tell (e.g., see the Copenhagen Consensus). Which ones are right?

  7. Dan

    I agree that the course of action with regard to global warming is far from clear, even assuming warming. It’s a lot easier to analyze current warming trends than it is to project their implications for the global climate 50 years from now. And the speculation I’ve seen on this topic appears very one-sided, almost like it’s taboo to consider that there may be positive consequences to warming. Freeman Dyson is an interesting exception.

    Another thing that frustrates me about the debate over carbon emissions is how frequently ocean acidification is neglected, despite its relatively straightforward relationship with atmospheric CO2. At least slowly rising temperatures give time for gradual adaptation, but losing entire coral reefs could be quite an ecological shock. The existence of a second reason to reduce CO2 emissions that’s nearly independent of the first should be a major factor in the debate, but it seems like disjoint support is commonly seen as weakening a position, rather than strengthening it. Yet another bizarre quirk of human perception.
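
    (To make “relatively straightforward” concrete, here is the first-order chemistry as a sketch. The constants are approximate textbook values, and the calculation is for pure water, so it ignores seawater’s buffering, but it shows the direct link from atmospheric CO2 to acidity:)

    ```python
    import math

    def ph_from_co2(p_co2_atm, k_h=3.3e-2, k_a1=4.45e-7):
        """pH of pure water equilibrated with CO2 at the given partial
        pressure (atm). k_h: Henry's constant for CO2 at 25 C,
        mol/(L*atm); k_a1: first dissociation constant of carbonic
        acid. Both are approximate. Assumes [H+] == [HCO3-]."""
        co2_aq = k_h * p_co2_atm      # dissolved CO2, mol/L
        h = math.sqrt(k_a1 * co2_aq)  # CO2 + H2O <-> H+ + HCO3-
        return -math.log10(h)

    print(f"pH at 280 ppm CO2 (preindustrial): {ph_from_co2(280e-6):.2f}")
    print(f"pH at 390 ppm CO2 (modern):        {ph_from_co2(390e-6):.2f}")
    # Seawater is buffered, so it sits near pH 8.1 and has dropped only
    # about 0.1 so far, but the direction and mechanism are the same.
    ```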

    Simetrical, earlier you dismissed Occam’s razor, but I think you’re going to have a very hard time convincing someone to pay attention to evidence if they’re not employing some form of it. This is because someone who rejects Occam’s razor can simply keep appending explanations to the data. You might point out dendrochronology to Bob the creationist, but Bob can simply reply that it’s evidence that trees accumulated rings more rapidly when the world was fresh.

    You can already witness that type of answer to questions like, “What about parallax and redshift measurements of the light from distant stars/galaxies? These things are clearly further than 6000 light-years away.” Bob replies: “Oh, well God simply created the universe with that light already on its way here, to give us plenty of starlight from day 1.” Or he could just say that it’s evidence that light moved faster when the world was fresh. The razor may not be easily formalized (though Solomonoff induction is an interesting candidate), but it’s absolutely fundamental to sane belief formation. Without it, you can just explain everything with “a wizard did it” and to hell with any contradictory evidence: a wizard did that too.
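
    (For the flavor of that formalization: Solomonoff-style induction weights each hypothesis by two to the power of minus its description length. A toy version, with invented hypotheses and made-up bit lengths standing in for real programs:)

    ```python
    # Toy simplicity prior in the spirit of Solomonoff induction: each
    # hypothesis gets weight 2**(-description_length). The hypotheses
    # and bit lengths below are invented for illustration; the real
    # thing runs over programs and is uncomputable.
    hypotheses = {
        "trees grow one ring per year": 10,
        "rings grew faster when the world was fresh": 25,
        "a wizard tunes every ring to mimic annual growth": 40,
    }

    weights = {h: 2.0 ** -bits for h, bits in hypotheses.items()}
    total = sum(weights.values())
    for h, w in sorted(weights.items(), key=lambda kv: -kv[1]):
        print(f"{w / total:.9f}  {h}")
    # Every escape hatch bolted onto a theory lengthens its description
    # and shrinks its prior weight exponentially.
    ```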

  8. Simetrical

    It’s taboo to consider the positive consequences of global warming because it’s entangled with moral issues. In circles where it’s taken for granted that carbon cutting is necessary, someone who says otherwise is viewed as perpetuating a falsehood that’s causing tremendous suffering. It’s the same reason why it’s taboo in many circles to suggest that there are innate differences between races or sexes, or (in other circles) to suggest that God doesn’t exist or homosexuality is okay, and so on.

    In all cases, when there’s a dispute about an issue where one side perceives the other as causing major harm or suffering, the latter will tend to get upset about the former even stating its position openly. In some cases, opponents of a position will try to avoid even dignifying the position with a real response, if they think their position is secure enough: I see this often with those attacking Holocaust denial and sexism/racism.

    Ocean acidification is often neglected in public discussions of global warming. On the other hand, climate engineering is often neglected too. I don’t think anyone disputes that we could not only halt but entirely reverse warming via cloud whitening and such, much faster and for much less direct cost than carbon cutting, but somehow it’s rarely even contemplated except as Plan B. And if it is, often the disadvantages (requires ongoing maintenance, doesn’t help acidification, has side effects) are stated as though those refuted the idea completely, without even attempting to quantify them and compare them against the disadvantages of carbon cutting. I have various theories about why this is, but this post is fairly long already, since I wrote the second half before the first . . .

    Whether Occam’s razor is fundamental to sane belief formation really depends on what you mean by “sane”. I would say a belief is more sane — or let’s say more rational — if someone acting on it will be able to more easily achieve his goals. It’s not always obvious when this holds, but the same is true of Occam’s razor.

    For instance, if you lose your keys, it’s generally more rational to think that you just put them down somewhere than that aliens abducted them. Why? Because the first theory would predict that if you looked around a bit you’d find them, while if you believed the second theory you wouldn’t bother looking. The first theory will let you find your keys fairly often in practice, so those who believe in it can legitimately claim superiority over the ones who believe the second theory.

    On the other hand, suppose you believe that super-powerful aliens abducted your keys and then placed them somewhere so as to make it look exactly like you forgot them there. This explanation is more complicated than the obvious one, but it predicts nothing different, so it’s hard to say why it should be irrational. There’s no practical difference to anyone whether you believe the first or the second, so on what basis do you claim one is superior to the other, taken in isolation?

    The same goes for a religious person who believes that God created the world a few thousand years ago and made it look old. (You can still try to convince a young-Earth creationist that at least the Earth looks old — many of them think that the idea of an old Earth is completely groundless and only made up by scientists so they can deny God.) And it also goes for conspiracy theorists who say that some conspiracy controls something or other, but it’s covered up, so you’ll never be able to prove it one way or the other.

    These beliefs produce essentially identical predictions to what Occam’s razor would, so I don’t think there’s any grounds for calling them irrational or insane. For practical purposes, they’re identical to what Occam’s razor says. Of course, other beliefs commonly associated with these might be irrational, but not just because they violate Occam’s razor — because they produce worse predictions.

    For instance, Bob’s theory that light moved faster when the world was fresh would make testable predictions that could be easily disproven. Even if he leaves himself an escape route so that his theory is compatible with any observation, you can still claim victory, because you predicted better: you gave a specific prediction that was true, while he made no prediction and so was merely right by default. But if up front, he says he agrees with every prediction of modern science, and would be as surprised as you if the scientific theory were proven wrong, but he thinks the theory actually is wrong and it’s just that God makes the theory appear correct in every way — I wouldn’t call that irrational. Inelegant, maybe.

    Rationality should be about being right in practice, after all, not conforming to particular theoretical rules. Otherwise, what’s the point?

  9. Dan

    For points meant to apply in practice, it seems like you’re focusing on an impractical edge case, where the more complex hypothesis happens to make identical predictions.

    In practice, if I observed someone consistently offering complex hypotheses that result in identical predictions to simpler hypotheses, I’d be forced to conclude that they’re working backwards from the simpler hypotheses, while lying or self-deluding as to the origin of their beliefs. This is just a counting argument – there are many more complex hypotheses, so it would be a huge coincidence for their predictions to line up perfectly.
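
    (A toy version of the counting argument, with hypotheses stood in for by random tables of yes/no predictions; the point is just how fast exact agreement becomes a coincidence:)

    ```python
    import random

    # If a complex hypothesis were formed without peeking at the simple
    # one, the chance of agreeing on all k independent yes/no
    # predictions is 2**-k.
    random.seed(1)
    k, trials = 20, 100_000
    simple = [random.randrange(2) for _ in range(k)]
    matches = sum(
        all(random.randrange(2) == s for s in simple)
        for _ in range(trials)
    )
    print(f"Exact agreement in {matches} of {trials:,} trials "
          f"(expected rate 2**-{k} = {2 ** -k:.1e})")
    # When every prediction lines up anyway, the complex hypothesis was
    # almost certainly worked backwards from the simple one.
    ```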

    It’s also very difficult for this condition to hold, as reality is heavily interconnected. Take your key abduction scenario. Are the predictions really exactly the same? If I thought aliens had abducted my keys, I might well be concerned that exposure to radiation in space had damaged the electronics on my access fob. So I bring a spare with me to work that day, in case aliens ruined my keys. Sane? (This leaves aside a host of other predictions, like a massive increase in my estimate that we’ll encounter other intelligent life in the universe, which in turn could affect decisions like “Should I donate to SETI?”).

    In your “Theory is wrong but God makes it appear right” scenario, I’m not sure what it would mean for our scientific theories to “appear” correct while actually being wrong. Science doesn’t prescribe an ultimate ontology; it merely describes the behavior of the systems we observe in ever-finer detail. Again, if Bob’s God theory makes exactly the same predictions, I suspect it’s not the true origin of those predictions.

    But suppose we give Bob the benefit of the doubt that his complex hypotheses are independent of the simpler ones, and we’re sure they make identical predictions (assuming this is even possible). It seems like we’re no longer talking about two separate hypotheses. What we’d have is disjoint “Occam evidence” for one set of predictions. You mentioned “equivalence classes of beliefs” previously, and asserted that Occam’s razor means favoring the simplest one from the class. This seems like a mischaracterization of the razor to me – if the beliefs A and B are really equivalent, then there’s simply no need to choose one or the other; you’d sum them instead as the prior for theory AB.
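
    (Concretely, with invented numbers: if A and B predict identically, data can shift the weight of the class {A, B} against a rival C, but it can never change the ratio between A and B:)

    ```python
    # Bayes' rule with equal likelihoods for A and B (identical
    # predictions) and a rival theory C. Priors and likelihoods are
    # invented for illustration.
    p_a, p_b, p_c = 0.50, 0.10, 0.40
    lik_ab, lik_c = 0.9, 0.2          # P(data | theory)

    evidence = (p_a + p_b) * lik_ab + p_c * lik_c
    post_a = p_a * lik_ab / evidence
    post_b = p_b * lik_ab / evidence
    post_c = p_c * lik_c / evidence

    print(f"P(A|D) = {post_a:.3f}, P(B|D) = {post_b:.3f}, "
          f"P(C|D) = {post_c:.3f}")
    print(f"A:B ratio unchanged at {post_a / post_b:.1f}:1")
    # Since nothing ever separates A from B, treating them as one
    # theory with prior p_a + p_b loses no information.
    ```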

    The rub is that in practice, this never happens. If the complex theory is actually a theory and not just word soup, its predictions will diverge. And since simpler theories tend to perform better in practice, how is rejecting the razor as circular anything other than conforming to a particular theoretical rule?

  10. Dan

    Neat coincidence – right after posting my last comment, I encountered this post, which is making essentially the same point about identical predictions. (See this comment if the terminology makes it appear otherwise).

  11. Ian Lance Taylor

    Thanks for the comments.

    I share the concerns about ocean acidification; in fact, that is an argument that cloud seeding and the like are not sufficient to address increasing carbon dioxide levels. Still, I think we should seriously consider cloud seeding nevertheless, and indeed I’m surprised that a country like China is not already doing it.

    I view Occam’s Razor as a pragmatic statement about a pragmatic world. We can never actually know what is true, but we must decide in finite time with incomplete information. Occam’s Razor is a useful guide, though it is not always correct.

    In my opinion, economic arguments that trying to stop climate change is not worth the cost almost always radically underestimate the economic growth which will result from mitigation technologies. That is, they look only at the cost to existing industries, not at the economic benefits from new industries. I’m sure you know that the Copenhagen Consensus has been criticized on several fronts; in particular, they implicitly argued that money spent on climate change would not be spent on other things, and that argument is false on its face.

  12. Simetrical

    Dan:

    There’s an important special case where a complicated theory will closely (not exactly — I should have made that clearer) match the predictions of a simple theory, and that’s when the complicated theory involves an intelligent agent trying to make the simple theory appear true. This is in fact a pretty common belief — it’s what all conspiracy theorists basically say, and many theists. If I believe that the Illuminati rule the world and they just make it look like governments do, and that the Illuminati is very good at it, then this will naturally replicate the predictions of the simpler theory.

    You’re right that the replication won’t be exact. For instance, my Illuminati belief would lead me to assign much higher probability than you would to the hypothesis that the Illuminati conspiracy will be uncovered. Likewise, if I believed that God was hiding and making it look like scientific explanations are correct, I’d still assign much higher probability to the coming of the Messiah. In both cases, though, the probability assigned would be quite low in absolute terms, not enough to count much against the theory if they don’t come true.

    So the theories don’t make exactly the same predictions, and you have to distinguish them. But they do make very similar predictions, so in practice, there’s nothing particularly wrong with the more complicated theory. Since rationality is about winning (yeah, I’ve read a lot of Less Wrong too . . .), a theory that makes you win only very slightly less should be considered only very slightly less rational.

    (You might say that a theist loses a lot, what with wasting their time in church and whatnot. But they only lose that if they’re wrong, and assuming that in order to prove their belief irrational is circular. By the same token, an atheist will lose a lot if he’s wrong, but there’s no clear evidence of net loss, so it can’t be counted against the belief when gauging whether it makes you win or not.)

    Ian:

    The Copenhagen Consensus has certainly been criticized; otherwise there’d be no dispute. 🙂 However, recommendations such as the one it makes (along the lines of “don’t bother cutting carbon until we have an easy market replacement anyway”) are endorsed by a number of reputable economists, including three Nobel Prize winners in the case of the Copenhagen Consensus on Climate. Since the experts themselves disagree, I don’t see any way for laymen to reach a good conclusion on the issue. You can’t just count the number of experts on each side, and it’s not reasonable to expect that you can reach a reliably good conclusion by evaluating the arguments directly. You just have to decide which argument sounds better or who you trust more, and hope for the best.

  13. Ian Lance Taylor

    My concern about the Copenhagen Consensus is that, as far as I can tell, they asked the wrong question. They asked what should be done if there were $50 billion to spend. They didn’t consider the potential benefits to the economy of developing new technology.

    That said, you’re right that it’s pretty hard to know what to do when experts disagree. One useful thing you can do is see who is paying the experts in question. And at some point you really do just have to count experts, at least experts who appear to be unbiased. A strong majority on one side of the question is not proof, but it’s hard to know what else you can do.

  14. Simetrical

    New technology will be developed either way. It’s just a question of which technology. You’d create jobs and technology if the government required massive investment in shampoo too, but more wealth would be created absent such requirements. The Copenhagen Consensus recommends spending the money on different technological investments; those will benefit the economy too. But in the end, I don’t have an informed opinion on questions of economics.

    I don’t agree that counting experts is ever useful if there are significant numbers on both sides. It does help to look at what ulterior motives they might have (conscious or unconscious), including who’s paying them. For instance, climate studies sponsored by oil companies are suspect — but so are those sponsored by governments. Governments are pressured on the one side by groups with economic interests in the outcome (on both sides), and also by popular opinion. According to one survey, health researchers report more interference from government sponsors than private sponsors.

    Ideological motives can be just as powerful as more straightforward conflict of interest, too. For instance, this New York Times article about a study reporting a drop in maternal deaths says “some advocates for women’s health tried to pressure The Lancet into delaying publication of the new findings, fearing that good news would detract from the urgency of their cause”. This is a recurrent pattern in science that affects policy: the scientists are aiming to achieve policy goals, and don’t want the public to have access to research that could encourage them to deemphasize those goals. It’s the same motive that led the East Anglia researchers to try to give as little of their data or methodology as possible to climate skeptics. They might not outright lie, but they could choose not to submit articles for publication, or look harder for flaws in research with the wrong conclusions, understate uncertainties to popular audiences, and so on.

    So it’s a complicated story. You can’t get away with following government research, because governments are biased too. You can’t get away with following independent research, because independent researchers are biased too. You certainly can’t just follow the majority, even if you have reliable data on what the majority thinks, because it’s easy for a majority of published research to contain systemic errors. You just have to pick a side and hope.

    (But, by the way, I don’t know of any reason to think a strong majority of economists supports carbon cutting, or any majority at all. Do you know of any?)

  15. ncm

    I know of plenty of economists who support carbon reductions. It’s been a topic of active debate whether taxation or complicated market tricks would be more effective or politically viable.

    Back to the original topic, I have seen people change their minds in the face of evidence, or simply logical argumentation. I’ve done it myself. So, more interesting than the question of whether it occurs is what is needed for it to occur. Part of that is the qualities of the person changing, perhaps scrupulousness chief among them, but also versatility of mind, raw intelligence, and courage. On the other hand, some people change in the face of falsehoods and fears; Robert Heinlein became increasingly disconnected from reality during his third marriage.
