Behavioral economics gets inside your brain.
As we explored in earlier posts, the individual in neoclassical thought is a fairly limited creature. Even the most distinguished individuals in society (e.g. judges) can be reduced to self-interested, utility-maximizing beings. (Posner, 2010) Classical thought, in contrast, treats individuals as complicated units who want to improve not only their own happiness and enjoyment, but also that of those around them. The attainment of these virtues, however, varies by time, place, civilization and institutional development. (Smith, 1759, at VII.IV.36)
Behavioral economics – perhaps the newest school of economic thought – falls somewhere between these two poles. Like Smith, behavioralists grant individuals a value set that extends beyond selfishness. But like the neoclassical conception of a time-transcendent homo economicus that maximizes utility, behavioralists outline a time-transcendent homo behavioralis, one subject to cognitive and computational problems that can be modeled. And unlike Marxist or institutional perspectives (which go deep into social context), the behavioralist school goes deep into the individual.
Essence of the Behavioral Model
Individuals are not perfectly rational: our preferences are not always well-ordered, and our cognition and calculation abilities are limited (Simon, 1996, at 26-30) – which in turn limits our ability to correctly assign probabilities to events. (Jolls et al., 1998, at 1477)
Simon works with two distinctions: the internal versus the external environment, and procedural versus substantive rationality. In neoclassical economics, the economic actor’s inner environment does nothing but define an economic goal (i.e. profit or utility maximization, which is in turn determined by rationality and optimizing postulates); the actor then adapts to the outer environment (competitor firms, the government, etc.), which is characterized by cost and revenue curves.
The behavioral account differs: it notes that the external environment is incredibly complex. Humans confront the complexity of the world by structuring their inner environment. They use lists and rules of thumb to make “good enough” decisions – decisions that may not optimize utility functions in the rational-choice / perfect-information sense, but which satisfice. Firms, for their part, use cost accounting and management science (such as inventory control) to deal with complexity, including simply defining for productive units what they should seek to maximize. (Simon, 1996, at 27-32) These “good enough” decisions are useful on average, but err much of the time. (Jolls et al., 1998, at 1477) In sum, the external environment makes true substantive rationality difficult, so individuals structure their inner environment to be as procedurally rational as possible.
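Simon’s satisficing rule can be sketched as a simple search procedure: instead of scanning every option for the maximum, stop at the first option that clears an aspiration threshold. The job options and threshold below are hypothetical illustrations, not anything from Simon’s text.

```python
def satisfice(options, aspiration):
    """Return the first option whose payoff clears the aspiration level.

    Unlike full optimization, this never examines the whole option set:
    search stops as soon as a "good enough" choice appears.
    """
    for option, payoff in options:
        if payoff >= aspiration:
            return option
    # Nothing cleared the bar: fall back to the best of what was seen.
    return max(options, key=lambda pair: pair[1])[0]

def optimize(options):
    """The neoclassical benchmark: exhaustively pick the maximum payoff."""
    return max(options, key=lambda pair: pair[1])[0]

options = [("job A", 60), ("job B", 85), ("job C", 95)]
print(satisfice(options, aspiration=80))  # "job B" -- good enough, search stops
print(optimize(options))                  # "job C" -- the true maximum
```

The gap between the two answers is the point: the satisficer economizes on search and computation, at the price of sometimes missing the optimum.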
Human behavior takes place in a cognitive universe characterized by bounded rationality, bounded willpower and bounded self-interest. We respond to bounded rationality with calculation shortcuts like availability and unavailability heuristics. Something that happened recently or seems particularly salient (or evocative in our value system) will be assigned a high probability, while something that is perhaps quite likely but which we have not seen recently may be deemed unlikely. Some low-probability events are discounted to “no probability” events; yet equally improbable events can come to be seen as highly probable through the mechanism of outrage. These phenomena lead to systematic under- or over-reaction. Cognitive neuroscience suggests that these shortcuts stem from a “dialogue” between the intuitive and fast-reacting part of our brain (the amygdala) and the slower fact-processing center (the prefrontal cortex). (Sunstein, 2009, at 5-6, 22-23, 51)
Biases also play an important shortcut role in producing judgment errors in a world of bounded rationality.
- Hindsight bias leads us to assign higher probabilities to events ex post than we do ex ante.
- Omission bias leads us to more readily excuse errors that were caused by non-action than by action (i.e. commission).
- We are often overoptimistic about our future prospects, and have difficulty knowing how we will feel in the future about our future experiences. (Jolls et al., 1998, at 1548-1549)
- Self-serving bias leads us to assume states of the world most conducive to our interests. For instance, behavioral studies of teacher contract negotiations found that teachers’ unions systematically used higher-than-average reference wages from adjacent school districts. This upward bias was said to exist not merely as a matter of game strategy: the unions apparently believed that the higher-than-average wage was the average wage.[i] (Jolls et al., 1998, at 1502)
Bounded rationality also affects our decision-making more generally. We can develop attachments to resources we inherit or are given, and demand much higher prices for selling them than we would be willing to pay for them had we not received them. This is known as loss aversion or endowment effects, and can prevent the type of market clearing we would predict from neoclassical economics. (Jolls et al., 1998, at 1549-1550)
Bounded willpower leaves us unable to avoid actions (e.g. smoking or desserts) that our future selves would have us avoid. This is due not only to high discount rates but to hyperbolic discounting, i.e. time-inconsistent discounting. The example given is a criminal’s or addict’s (self-destructive) enjoyment today that does not come at the expense of the future self, because the future self is in effect a different person who will not see cause-effect patterns in prior activity. This has led to the theory in criminal justice that small but immediate punishments are superior to long punishments stretched over many years. (Jolls et al., 1998, at 1539-1540)
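The contrast between consistent (exponential) and inconsistent (hyperbolic) discounting can be made concrete with the standard textbook functional forms, value / (1 + k·delay) versus value · δ^delay. The amounts, delays, and parameter values below are illustrative choices, not from the cited article.

```python
def hyperbolic(amount, delay, k=1.0):
    """Hyperbolic discounting: present value falls off as 1 / (1 + k * delay)."""
    return amount / (1 + k * delay)

def exponential(amount, delay, delta=0.9):
    """Exponential (neoclassical) discounting: present value falls off as delta ** delay."""
    return amount * delta ** delay

# Choice 1: $100 today versus $110 tomorrow.
# Choice 2: the identical pair, shifted 30 days into the future.
for discount in (hyperbolic, exponential):
    prefers_sooner_now = discount(100, 0) > discount(110, 1)
    prefers_sooner_later = discount(100, 30) > discount(110, 31)
    print(discount.__name__, prefers_sooner_now, prefers_sooner_later)
```

The hyperbolic agent grabs the $100 today but, asked about the same pair a month out, prefers to wait for the $110 – a preference reversal. The exponential agent ranks the two pairs identically, which is exactly why time inconsistency cannot be captured by a high discount rate alone.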
Bounded self-interest leads us to behave fairly (even when this does not optimize our own income, as in ultimatum games) or spitefully (i.e. wanting to see retribution against those not seen as behaving fairly). Sunstein and colleagues invoke the example of divorces, where a rational response might be to seek out-of-court settlements or post-judgment settlements. Most divorcing couples, however, are so acrimonious that they do not even speak, and so never initiate a bargaining process. This is not just an instance of information asymmetry leading to market failure, as in neoclassical economics, but a willful refusal to even begin collecting information. (Jolls et al., 1998, at 1495)
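The ultimatum-game result mentioned above can be sketched with a single decision rule: a behavioral responder rejects “unfair” offers even though rejection pays zero. The fairness threshold here is a hypothetical stand-in for experimentally observed rejection behavior, not a parameter from the cited article.

```python
def responder_accepts(offer, pie=100, fairness_threshold=0.3):
    """A behavioral responder rejects "unfair" splits even at a cost to herself.

    A purely self-interested responder would accept any positive offer,
    since rejecting leaves both players with nothing. The threshold is a
    hypothetical stand-in for observed rejection rates.
    """
    return offer >= fairness_threshold * pie

# A rational-choice responder would pocket the $1; a behavioral one spites the proposer.
print(responder_accepts(1))    # False: $1 out of $100 is rejected, both get nothing
print(responder_accepts(40))   # True: a near-even split is accepted
```

Knowing this, a proposer who wants the deal to go through must offer far more than the penny that rational-choice theory predicts – which is what ultimatum-game experiments consistently find.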
[i] The problem here seems to be the difficulty of actually getting inside respondents’ heads to know what they truly believe. The authors write that “strategic behavior does not seem to provide a strong explanation for the empirical evidence on school-district negotiations. There was no incentive in the study (as there would often be in negotiations with the opposing party) for parties to choose comparable districts in a strategic manner.” But this ignores the fact that bargainers habitually stay “on message,” even when there is no obvious incentive to do so. Such individuals would probably deem it disadvantageous for researchers to report that union officials don’t believe the reference numbers they put on the bargaining table.