More Skepticism of Behavioral Social Science
Peter Klein
Here at O&M we have been somewhat skeptical of the behavioral social science literature. Sure, in laboratory experiments, people often behave in ways inconsistent with “rational” behavior (as defined by neoclassical economics). Yes, people seem to use various rules of thumb in making complex decisions. And yet, it’s not clear that the huge literature on such biases and heuristics tells us much we don’t already know.
An interesting essay by Steven Poole argues that the behavioralists’ claims are overstated, mainly because they rely on a narrow, superficial notion of rationality as the benchmark case. Contemporary psychology suggests that people interpret the questions posed in laboratory experiments in a nuanced, contextual manner, such that their seemingly “irrational” answers are actually reasonable.
There are many other good reasons to give ‘wrong’ answers to questions that are designed to reveal cognitive biases. The cognitive psychologist Jonathan St B T Evans was one of the first to propose a ‘dual-process’ picture of reasoning in the 1980s, but he resists talk of ‘System 1’ and ‘System 2’ as though they are entirely discrete, and argues against the automatic inference from bias to irrationality. . . . In general, Evans concludes that a ‘strictly logical’ answer will be less ‘adaptive to everyday needs’ for most people in many such cases of deductive reasoning. ‘A related finding,’ he continues, ‘is that, even though people may be told to assume the premises of arguments are true, they are reluctant to draw conclusions if they personally do not believe the premises. In real life, of course, it makes perfect sense to base your reasoning only on information that you believe to be true.’ In any contest between what ‘makes perfect sense’ in normal life and what is defined as ‘rational’ by economists or logicians, you might think it rational, according to a more generous meaning of that term, to prefer the former. Evans concludes: ‘It is far from clear that such biases should be regarded as evidence of irrationality.’
Poole also argues strongly against the liberal-paternalist “nudges” advocated by Cass Sunstein and Richard Thaler, noting that “there is something troubling about the way in which [nudging] is able to marginalise political discussion.” Moreover, “nudge politics is at odds with public reason itself: its viability depends precisely on the public not overcoming their biases.” Poole concludes:
[T]here is less reason than many think to doubt humans’ ability to be reasonable. The dissenting critiques of the cognitive-bias literature argue that people are not, in fact, as individually irrational as the present cultural climate assumes. And proponents of debiasing argue that we can each become more rational with practice. But even if we each acted as irrationally as often as the most pessimistic picture implies, that would be no cause to flatten democratic deliberation into the weighted engineering of consumer choices, as nudge politics seeks to do. On the contrary, public reason is our best hope for survival. Even a reasoned argument to the effect that human rationality is fatally compromised is itself an exercise in rationality. Albeit rather a perverse, and – we might suppose – ultimately self-defeating one.
Worth a read. Even climate-change skepticism gets a nod, in a form consistent with some reflections here.