Seven Deadly Sins of Contemporary Quantitative Political Analysis
22 November 2010 at 8:31 am Peter G. Klein 2 comments
| Peter Klein |
The rational-choice revolution in political science — universally acknowledged and generally respected, though not always loved — has led to an explosion of quantitative empirical research (making political science, like some strands of sociology, look more and more like neoclassical economics). Philip Schrodt warns, however, against these seven deadly sins:
- Kitchen-sink models that ignore the effects of collinearity (see the sketch after this list);
- Pre-scientific explanation in the absence of prediction;
- Reanalyzing the same data sets until they scream;
- Using complex methods without understanding the underlying assumptions;
- Interpreting frequentist statistics as if they were Bayesian;
- Linear statistical monoculture at the expense of alternative structures;
- Confusing statistical controls and experimental controls.
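On the first sin, here is a minimal illustrative sketch — not from Schrodt's paper, and using simulated data and hypothetical variable names — of how adding a nearly collinear regressor to a kitchen-sink OLS model inflates the standard error on the coefficient you care about, even though nothing about the underlying relationship has changed:

```python
# Toy illustration (my own, not Schrodt's): near-collinear regressors and OLS precision.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)   # x2 is almost a copy of x1
y = 1.0 + 2.0 * x1 + rng.normal(size=n)    # the true model involves x1 only

# Parsimonious model: x1 alone
lean = sm.OLS(y, sm.add_constant(x1)).fit()

# "Kitchen-sink" model: x1 and the nearly redundant x2 together
sink = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()

print("SE of x1 coefficient, lean model:        ", lean.bse[1])
print("SE of x1 coefficient, kitchen-sink model:", sink.bse[1])  # dramatically larger
```

With x1 and x2 correlated this tightly, the second standard error is larger by roughly the square root of the variance inflation factor — here on the order of twentyfold — which is exactly the kind of lost precision that kitchen-sink specifications tend to hide.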
The economics literature does somewhat better on sins 4-7, though it is clearly susceptible to 1 and 3. (I’m not a logical positivist, so 2 isn’t a sin for me.) In any case, the paper is worth reading, particularly for graduate students across the social sciences.
Here are commentaries by Andrew Gelman and Matt Blackwell. Oh, and Schrodt maintains that “[t]he answer to these problems is solid, thoughtful, original work driven by an appreciation of both theory and data. Not postmodernism.” Take that, performativitarians! The paper also includes some historical and philosophical perspective, with “a review of how we got to this point from the perspective of 17th through 20th century philosophy of science, and . . . suggestions for changes in philosophical and pedagogical approaches that might serve to correct some of these problems.”
Entry filed under: - Klein -, Methods/Methodology/Theory of Science.
1. Tom | 22 November 2010 at 12:20 pm
I loved this article, and Gelman’s response summarizes my own feeling about data analysis perfectly:
“As in many rants of this sort (my own not excepted), there is an inherent tension between two feelings:
1. The despair that people are using methods that are too simple for the phenomena they are trying to understand.
2. The near-certain feeling that many people are using models too complicated for them to understand.”
2. Rafe | 22 November 2010 at 9:52 pm
A couple of interesting pieces on the dangers of mathematical formalism and nice neat models.
This is by an environmental physicist and agricultural scientist who was very well read on a wide front including the history and philosophy of science. He was concerned about the way researchers in his field were turning to models and forgetting about the world outside the window.
Click to access Philip_on_soils__science___models.pdf
“We examine the last half-century of natural science, with special reference to its ethos and to changing public attitudes to the autonomy and accountability of the scientific community. The content of soil science places it uneasily between natural science on the one hand and the world of professional practice on the other…A disturbing aspect is that computer modeling has largely supplanted laboratory experimentation and field observation as the research activity of students.”
This one explains how mathematicians can build a nice model that completely misses the point of the scientific theory it is supposed to illuminate.
Click to access Schwartz_on_mathematics.pdf
“Give a mathematician a situation which is the least bit ill-defined — he will first of all make it well defined. Perhaps appropriately, but perhaps inappropriately…with the danger that…the mathematician turns the scientist’s theoretical assumptions, i.e., convenient points of analytical emphasis, into axioms, and then takes these axioms literally. This brings with it the danger that he may also persuade the scientist to take these axioms literally…In this way, mathematics has often succeeded in proving, for instance, that the fundamental objects of the scientist’s calculations do not exist.”