Why people make stupid decisions: Behavioral Economics

by Stephanie Chasteen on December 15, 2009

One of the things that’s puzzling to anyone, and especially to us logic-oriented scientists, is how people can look at strong evidence and seemingly ignore it. They go with their gut, or with what they think they know, instead of the data staring them in the face.

This is the basis of a huge amount of work in what is called behavioral economics — or, the psychology of why we make the economic decisions that we do.  There’s a great article and radio piece on NPR about Daniel Kahneman’s work in economics, which won him the Nobel Prize in 2002.  For instance, we have the illusion of validity (we have too much confidence in our own judgment), or the anchoring effect (we’re unduly influenced by numbers that we’re exposed to, such as a “compare to” price on an item).

Here are some classic examples, as written in an email from Nathan Lasry:

1- A group is given the price of an object they must buy. The same object can be purchased $5 cheaper across town (remember, this is $5 in the late ’70s and early ’80s, so it was worth much more than $5 today). The question: Would you drive across town to get the object?

Most people said YES to driving across town IF they were saving $5 on a $15 calculator.
Most people said NO to driving across town IF they were saving $5 on a $125 coat.

The trouble? When you walk into the grocery store to spend that $5, it really doesn’t matter where it came from…
This result does not fit at all with classical economic theory, which portrays humans as ‘Spock-like’ rational agents who would place an absolute value on driving across town.
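To make the arithmetic explicit, here’s a minimal sketch. Only the $5, $15, and $125 figures come from the example above; the trip-cost figure is a made-up number purely for illustration.

```python
# A minimal sketch of the arithmetic behind the calculator/coat example.
# Only the $5, $15, and $125 figures come from the example above; the
# trip cost is a hypothetical number for illustration.

TRIP_COST = 3.00  # assumed dollar value a "rational" agent puts on the drive

scenarios = {
    "calculator": {"price": 15.0, "saving": 5.0},
    "coat": {"price": 125.0, "saving": 5.0},
}

for name, s in scenarios.items():
    fraction_saved = s["saving"] / s["price"]
    rational_choice = "drive" if s["saving"] > TRIP_COST else "don't drive"
    print(f"{name:>10}: save ${s['saving']:.2f} "
          f"({fraction_saved:.0%} of the price) -> rational agent: {rational_choice}")

# A classical agent weighs the same $5 against the same trip cost in both cases
# and gives the same answer; people instead respond to the 33% vs. 4% framing.
```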

2- In another interesting example, people were asked:
Do you prefer getting $1000 with 100% certainty, or a 50% chance of receiving $2500? Most will choose the certain $1000, although the expected value of the second option is higher ($1250). This is OK from a strictly rational perspective, because these folks are willing to pay $250 as ‘insurance’. So you can call them ‘risk averse’.
BUT
The same people are then asked to choose between a certain loss of $1000 and a 50% chance of either losing nothing or losing $2500. Most will choose the riskier 50% alternative. So the SAME ‘risk averse’ people in the first example become ‘risk seeking’ in the second.
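Just to spell out the expectation values, here’s a quick sketch. It is plain arithmetic on the numbers quoted above, not a model of the original experiments.

```python
# Quick check of the expectation values in the two framings above,
# using only the numbers quoted in the example.

def expected_value(outcomes):
    """outcomes: list of (probability, payoff) pairs."""
    return sum(p * x for p, x in outcomes)

# Gain framing: certain $1000 vs. 50% chance of $2500
certain_gain = expected_value([(1.0, 1000)])           # 1000.0
gamble_gain = expected_value([(0.5, 2500), (0.5, 0)])  # 1250.0

# Loss framing: certain -$1000 vs. 50% chance of -$2500
certain_loss = expected_value([(1.0, -1000)])            # -1000.0
gamble_loss = expected_value([(0.5, -2500), (0.5, 0)])   # -1250.0

print(certain_gain, gamble_gain)  # most people take the certain $1000 (the lower-EV option)
print(certain_loss, gamble_loss)  # most people take the gamble (again the lower-EV option)
```

In both framings the majority choice gives up $250 of expected value, but in opposite directions, which is exactly the inconsistency the example is pointing at.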

These, and all sorts of other biases, are outlined in a great book I’m listening to right now, How We Decide. A lot of the themes from this book keep cropping up in my favorite podcast, Radio Lab, especially their recent episode on Choice. If you find this stuff interesting, check out the work of Baba Shiv, who sticks his subjects into MRI machines to see the hardwiring underlying how emotions affect our decisions. He was the one who did the famous study showing that people not only rated the same wine more highly when told it was expensive, but actually had a better subjective experience of the wine based on their expectations. And here is a TED talk by Dan Ariely on how our irrationality is predictable, and we can be encouraged or discouraged from cheating with some simple manipulations, like being reminded of an honor code, or replacing Cokes with dollars. He calls this our “buggy moral code.”

Another book that comes highly recommended is Kluge: The Haphazard Construction of the Human Mind.
Here’s some stuff covered in that book (as written in an email from Bill Goffe):

halo effect: attractive people are seen as better teachers, they earn more, etc. (presumably halo effects from things other than people have an impact too)

priming: what is in your mind when a second topic comes up is likely to color how you view or judge that second topic. Marcus gives this example: you ask undergrads how happy they are and how many dates they had last month. If the happiness question is asked first, there is no correlation between the answers. If you first ask about dates, there is a correlation between the answers.

anchoring and adjustment (a variation on priming): a number that people have in mind influences their estimate of something entirely different. One example: add 400 to the last three digits of your phone number. Then, when did Attila the Hun’s rampage end? If the sum was less than 600, the median guess was A.D. 629; if the sum was between 1,200 and 1,399, the median guess was A.D. 979.

mere familiarity: people prefer what they know. Marcus reports one study (now done in 12 languages) showing that people prefer the letters in their own names. Another study told half the participants that feeding alley cats was legal and the other half that it was illegal. Yet most favored the current policy, whichever it happened to be.

threat: the more we are threatened, the more we cling to our beliefs. I could imagine that this comes up in the physics classroom when beliefs about mechanics are challenged.

confirmation bias: we tend to place more weight on evidence that supports our beliefs than on evidence that doesn’t (I think this one is widely known); the flip side is “motivated reasoning.”

This examination of the irrationality of people’s economic behavior was apparently pretty controversial stuff in economics, whose models assumed that humans are essentially rational and logical decision makers who will make the choices that benefit them the most.

But there’s probably another reason for economists’ resistance. An imperfectly rational human being challenges a really important idea: the notion that markets work well because individuals can be counted on to make the best choice for themselves.

“Merely accepting the fact that people do not necessarily make the best decisions for themselves is politically very explosive. The moment that you admit that, you have to start protecting people,” Kahneman says.

In other words, if the human brain is hard-wired to make serious errors, that implies all kinds of things about the need for regulation and protection.

In our own work in educational research and reform, this has many implications as well. After all, we’re often presenting faculty with data and information about how students learn best, and meeting great resistance.

Comments

Denise Shull December 18, 2009 at 4:24 pm

The clue here lies in the neuroeconomics – and the neuroscience on social and affective processing. Understanding the foundational role of these “non-logic” dimensions makes most of these tendencies (which is what they should be called, imo, rather than “biases”) rather understandable.

DKS

Captain Skellett December 23, 2009 at 12:33 pm

Ugh! Tell me about it! I go completely non-rational with sales. The numbers get so confusing, it’s easier to just buy things than to think about whether it’s a rational decision or not!

I spent about half an hour in the photo album section of a store today because they had 50% off a second item, it took me THAT LONG to realise I had ALREADY bought a photo album somewhere else and really didn’t need two more. Ridiculous!
