Klaas Sijtsma is an experienced psychometrician and former rector magnificus of Tilburg University. In “Never waste a good crisis”, Sijtsma discusses academic fraud (and in particular the infamous Stapel case, the fallout of which he had to deal with as dean of the School of Social and Behavioral Sciences at Tilburg University) and questionable research practices (henceforth QRPs). Importantly, Sijtsma also offers some refreshingly direct suggestions on how the reliability of psychological science may be improved.
Solicit Statistical Advice!
Before outlining the contents of the book, I would like to focus on Sijtsma’s main recommendation. The argument is straightforward: statistics is difficult, and empirical researchers who wish to analyze their data ought to solicit professional statistical advice rather than improvising on their own. When I asked Klaas Sijtsma about this in person, he elaborated (I am paraphrasing here):
My recommendation was inspired by daily life. If my car breaks down then I take it to a mechanic — I do not try to fix it myself. And if I have an issue with my teeth I go and see a dentist rather than my neighbor.
To this I can add an anecdote of my own. As a graduate student in the cognitive department I oversaw an empirical study conducted by four undergraduates. When the data needed to be analyzed, the undergrads visited the methods department and solicited statistical advice. The statistician on duty expressed surprise at the intended analysis plan (a standard ANOVA on average response times) and recommended a linear mixed effects model instead. I quickly convinced the undergrads that we should keep things simple and stick to the analysis that everybody else in the field was using for these kinds of experimental designs. In hindsight, I believe that the statistician was right and I was wrong.
Klaas makes his case forcefully, clearly, and repeatedly. Consider the following characteristic fragments from “Never waste a good crisis”:
There is little if any excuse for using obsolete, sub-optimal, or inadequate statistical methods or using a method irresponsibly, and inquiring minds find out that better options exist even when they lack the skills to apply them. In the latter case, a statistician can offer solutions. (p. 44)
and
My proposition is that researchers need to use methodology and statistics, but that methodology and statistics are not their profession; they practice skills in methodology and statistics only on the side. This causes QRPs. (p. 140)
and
Query: Imagine a world in which researchers master statistics and data analysis well. Will QRPs be a problem?
My hypothesis is: No, or hardly.
Bringing in statistical professionals has the added benefit of ameliorating methodological inertia, that is, the tendency to practice the methods that one was once taught rather than state-of-the-art methods that may be more appropriate and provide more insight.
Sijtsma argues that certain other remedies that have been proposed (e.g., lowering the α-level to .005; the use of Bayesian inference) are largely ineffective; the core problem is the pervasive amateurism. This amateurism is not overcome with a few graduate courses here and there. A professional statistician engages with data analysis eight hours a day, every working day. The gap in expertise is much deeper than empirical researchers may suspect.
“Begone, Statisticians!”
Who will stand against Sijtsma’s recommendations? It seems wasteful that so many resources are invested in data collection while the data analysis is treated almost as an afterthought. Moreover, one would hope that when a professional statistician is brought in, it will become clearer what statistical questions need to be addressed in the first place. However, individual researchers probably don’t like meddling from methodologists, especially not at first. My daughter’s favorite phrase is “I want to do it myself!” and I can see why researchers would like to maintain control and ownership over their analysis, even if it is suboptimal.
There are probably many other reasons why researchers insist on conducting their own statistical analyses and generally shun expert advice (there are exceptions, of course). Perhaps it is hubris, or the inability to admit that one may need help. It may also be that the prospect of having to debate the external statistician on methodological minutiae is relatively unpleasant (“Welcome to hell. In your first millennium here you will be forced to discuss violations of sphericity with an external statistician, who will become increasingly aware that you lack any methodological knowledge whatsoever. In the next millennia, you will have to define a p-value. Once you recall the correct definition and provide a compelling argument for why it is a useful measure to report, you will be free to leave. Best of luck.”). There may also be considerable time pressure. And finally, what if the external statistician proposes to adopt a convoluted, state-of-the-art method that the researchers themselves can neither execute nor explain?
Now there are fields that regularly work with external statistical advisors; for instance, professional statisticians working at hospitals may assist doctors in drawing the most appropriate conclusions from their data. It is not immediately evident to me that such external consultation has greatly reduced QRPs (the work by Ben Goldacre suggests that it has not). Moreover, in medicine the p-value still reigns supreme, and there is little room for alternative analysis procedures. In general, the medical field does not strike me as a hotbed of methodological innovation (I am happy to be corrected here).
Contents of the Book
“Never waste a good crisis” consists of the following seven chapters:
Chapter 1: Why this book?
Chapter 2: Fraud and questionable research practices
Chapter 3: Learning from data fraud
Chapter 4: Investigating data fabrication and falsification
Chapter 5: Confirmation and exploration
Chapter 6: Causes of questionable research practices
Chapter 7: Reducing questionable research practices
Chapters 1–4 feature details of the Stapel case. The point of Chapter 6 is to underscore that “Statistics is difficult”. Finally, Chapter 7 discusses possible improvements to the status quo, and Sijtsma ends by recommending that data are publicly shared and that researchers seek statistical consultation. The book will appeal to anybody interested in open science, and to all methodologists. Sijtsma ruthlessly points out problems but then provides concrete solutions as well. As a prototypical member of the book’s target audience, I finished the book in a few sittings. Highly recommended.
Disclaimer
Klaas Sijtsma is my collaborator and colleague. Had I disliked the book I would not have reviewed it.
References
Sijtsma, K. (2023). Never waste a good crisis: Lessons learned from data fraud and questionable research practices. Boca Raton (FL): CRC Press.