
Why it’s so hard to change your mind, and anyone else’s

by Pop on September 11, 2010


We all say we’re open-minded…

One of the more fascinating areas I’ve studied is “self-affirmation.” We all think of ourselves as good people, and when something challenges that identity, we put up psychological roadblocks to combat the threat.

That makes it really hard to change your beliefs. This matters in pretty much any arena you enter—personal finance, economics, politics, who should wash the dishes. We’ve all argued with someone who stuck to their guns in the face of mounting evidence against their position. And yet, I bet we’ve all been on the other side of that argument too.

Why the heck is it so hard to admit we’re wrong?

You’re less likely to change your mind if you have a low sense of self-worth.

About a year ago, political science professors Brendan Nyhan from the University of Michigan and Jason Reifler from Georgia State drew on psychological research to see if improving someone’s image of himself could make him more open to changing his mind.

To do this, they sampled a group of Republicans in 2008, a little while after the “surge” of American troops in Iraq and the subsequent reduction of insurgent attacks. (They sampled Democrats too, but the Republican results were more interesting.)

Before the survey, the researchers asked one group of Republicans to pick the moral value from a list that mattered most to them and write about a time they embodied it. The other group was simply asked to write down what they had eaten in the last two days.

They then showed the Republicans a graph clearly indicating that insurgent attacks had decreased as the number of troops in Iraq increased, and asked whether they supported troop withdrawal.

The Republicans who did not do the self-affirmation exercise became more opposed to troop withdrawal, consistent with the GOP position at the time. But the Republicans who did write about a value they embodied became more open to withdrawing troops.

Feeling better about yourself makes you more comfortable with changing your mind. The logical extension is that the crummier you feel about yourself, the less likely you are to cave. Maybe that’s yet another reason it’s so hard to sell a lousy stock and “lock in your losses.”

After we choose a belief system, we don’t accept evidence that challenges that belief.

Most scientists believe in the threat of global warming. That’s just a fact. And yet, public opinion is sharply divided over whether carbon emissions have warmed, or could warm, the Earth over time. In fact, public opinion is sharply divided over whether that scientific consensus even exists.

A group of law professors surveyed individuals on several divisive issues, such as global warming and gun control, and showed them evidence that seemed to support or contradict their positions. They were each given a set of opinions by experts and asked to interpret what the scientific consensus was. For example, someone might see a series of statements by scientists stating that global warming was a real threat along with one or two scientists who called global warming bunk.

No matter the evidence shown, those who identified with individualistic, pro-commerce values were more likely to see a lack of consensus on “anti-commerce” risks like global warming, whereas those who identified with egalitarian values were more likely to see consensus.

In other words, each group took the same evidence and used it to support its pre-existing beliefs. So the next time you’re arguing with your pigheaded pro- or anti-global-warming neighbor, keep in mind that he may not be stubborn; he may simply not interpret the evidence the same way you do.

We adopt the beliefs of people we identify with.

I’m an equal-opportunity political hater. I support some Republican positions and some Democratic positions, but I find the ways in which Democrats and Republicans defend their positions ridiculous. It amounts to a bunch of rhetoric and name-calling and never leads to progress on policy issues.

Beyond that, ever notice that every Republican and Democrat seems to support nearly the entire slate of their party’s policy positions? Why should being a pro-lifer have any correlation with supporting tax cuts? Why would someone who’s pro-choice also support gun control? To quote Mugatu from Zoolander, I feel like I’m taking crazy pills!

It turns out, we’re deeply susceptible to the values of our peers or, at least, whomever we identify as peers. In a study in the 1950s, researchers asked students from opposing Ivy League schools to watch a tape of a football game and count the number of violations they thought were committed by each team. The students consistently saw fewer violations committed by their own team than by the opposition.

Other scientists have run tests asking members of opposing teams to rate their teammates’ relative strengths against the opposition. Of course, those assigned to the red team suddenly find everyone wearing red to be faster, stronger, taller, and generally better than everyone wearing blue.

This wouldn’t be so worrying if it stopped at football games. But unfortunately, the same tendency regularly translates to issues that influence society’s well-being.

In a paper published in the journal Nature, a Yale Law professor proposed a couple of solutions to help break down the team mentality that prevents a rational interpretation of facts. One idea was to present facts in a way that doesn’t conflict with the opposing team’s values. For example, evidence of global warming doesn’t have to be “anti-business.” It could instead be a reason to develop nuclear power plants.

Another idea was to have opinions in support of an anti-Republican or anti-Democratic issue come from fellow Democrats or Republicans. If a Republican comes out in support of the theory of global warming, other Republicans become more accepting of evidence that supports it. In effect, they can change their minds on an issue, without having to leave their team.

Keep all of this in mind the next time you find yourself on opposing sides of an issue, whether at home or in the workplace. If you’re in a big meeting at work to decide whether to pursue a new business venture, remember that at some point the discussion will cease to be about the facts and start to be about each side’s sense of self-worth.

Managing the emotional and psychological parts of the discussion will become even more important than presenting yet another set of facts that support your side. And hopefully, you’ll be more aware of your own susceptibility to throwing reason out the window.


Comments

Jan September 12, 2010 at 11:36 am

I think people who choose not to debate (rather than argue) have limited self worth, but people of strong convictions often have very high self worth.
If you change your mind easily, you never really had convictions to begin with.

Jacq @ Single Mom Rich Mom September 12, 2010 at 11:54 am

Along the lines of Jan’s comment above, I wonder if people with strong convictions really do have very high self-esteem or if they’re like the Wizard of Oz, hiding behind a facade and trying to fool themselves and convincing us in the process.
Pop, not sure if you’ve read it, but a really good book along these lines was “Mistakes were made (but not by me): Why we justify foolish beliefs, bad decisions, and hurtful acts”. Very entertaining read on confirmation bias etc. Here’s the link:
http://www.amazon.com/Mistakes-Were-Made-But-Not/dp/0156033909/ref=sr_1_1?ie=UTF8&s=books&qid=1284306298&sr=8-1

In the meantime, I usually just fall back on this Walt Whitman quote whenever someone dogmatic calls me to task for changing my mind:
“Do I contradict myself? Very well, then I contradict myself, I am large, I contain multitudes.”

Pop September 12, 2010 at 1:40 pm

Thanks for pointing that book out, Jacq. I haven’t read it yet.

I guess I’d have to hear an example of what you guys mean by “strong conviction”, but in general I think I’d rather have beliefs that are open to change in light of new facts.

Someone might start with the conviction that gun control is not a solution to curbing violence, for example. But maybe he or she wasn’t aware that countries that have outlawed hand-guns saw a 90% reduction in murders. (This isn’t a commentary on gun control, and I’m making these numbers up.)

The key question is whether that person decides to re-evaluate their position in light of this new fact or if they draw inward and become defensive. I’m guessing that the knee-jerk reaction of most people (including me) is to tend toward the latter, which is a serious problem.

Rob Bennett September 13, 2010 at 3:27 pm

My belief is that this is why it takes years for us to recover from a period of overvaluation in the stock market. All overvaluation is irrational. But it takes years (sometimes decades!) for us to acknowledge the mistake. It would be best for all of us to acknowledge the realities and move on. But we stretch out these painful periods because moving on requires a change in fundamental beliefs about how stock investing works.

Rob

understatementjones September 14, 2010 at 12:59 pm

Keep in mind that your own beliefs are as susceptible as anyone else’s. Having worked in politics for quite some time, I can tell you that you’re factually wrong about whether and how politics works. But “equal opportunity political hat[ing]” is its own belief system with its own proponents, and you’re susceptible both to squeezing facts to fit that framework and to identifying with folks who espouse the same beliefs. That’s the problem with a lot of behavioral econ – there are no clear ways around many of these behavioral traps, and there’s a real danger that we’ll assume our self-knowledge protects us, even in the face of evidence to the contrary.

jim September 14, 2010 at 2:25 pm

“ever notice that every Republican and Democrat seems to support nearly the entire slate of policy positions of their parties? ”

No I have never noticed that generalization to be true. In fact the opposite is more often true from what I’ve seen. People tend to vote the “closest” party to their views and a 100% perfect alignment with party platform is not so common.

Example: In one poll asking if Roe V Wade was a “good thing”, 40% of Republicans agreed and 21% of Democrats disagreed. In another poll 31% of Republicans declared themselves ‘pro-choice’ and 25% of Democrats said they were ‘pro-life’. And this is just ONE of the many issues that make up a party platform.

Pop September 14, 2010 at 2:59 pm

@understatementjones I completely agree that I’m subject to the same emotional influences that taint everyone’s reason.

@Jim I meant Democratic and Republican lawmakers, but point taken. I don’t normally think of non-politicians as “Republicans” or “Democrats,” even if they’re registered with a certain party. But I’m willing to bet that the more time people spend in political discussions with fellow party members, or the more involved they get in politics, the more their own views begin to align with the party slate.

jim September 14, 2010 at 9:50 pm

“I meant Democratic and Republican lawmakers”

Ok, I gotcha. Yes, that’s a different matter. Lawmakers have taken to voting entirely along partisan lines 99% of the time.

Kevin S September 17, 2010 at 6:52 pm

I disagree that managing emotion is the biggest part of influencing decision making.

The biggest part of influencing decision making is taking emotion completely out of the equation. This is done by taking advantage of people’s propensity to accept the recommendations of those who are perceived to be experts.

In my experience, you are more likely to change your mind if you have a LOW sense of self worth. Those with low self worth are more likely to give their decision making authority over to others.

Facts can be manipulated to support any conclusion, even scientific facts. Everyone has an agenda, even those who portray themselves to be unbiased purveyors of facts. Forty years ago medical facts were trotted out to support the idea that smoking was healthy. A modern example is liar loans. Supposedly intelligent people were convinced that since there is a positive correlation between home ownership and healthy communities, if everybody owns a home all communities will be healthy. This was used to support the idea that everybody should own a home, and in turn was used to justify crazy loan terms.

There are not enough of us who understand that correlation is not causation, and there are too many of us who are willing to accept the recommendations of experts – even when the facts support a conclusion that makes no sense.

It all goes back to what my mom used to say, “It’s good to be open minded, but don’t be so open minded that all your brains fall out.”

Scientist October 1, 2010 at 8:56 am

I’m an academic scientist. In theory, the results of our current work should be independent of our former work, but this rarely seems to be the case. Sometimes we strain and force things to fit preconceived notions. I haven’t yet figured out who is immune to this: I know both young and senior scientists who are guilty, and others who are quite honest. You can hear it in the excessively antagonistic remarks made at conferences, and the unduly defensive or dismissive responses that follow. If our livelihoods and those of our students and postdocs didn’t hinge quite so much on grant renewals at 3-5 y cycles, things might (maybe?) be better.

I want to second the endorsement of “Mistakes Were Made.” It’s a truly outstanding book–you’ll probably recognize the authors. It’s effectively a rigorous, nuanced follow-up to the conformity theme of “Influence.”
