In economics, the term "Dutch disease" describes the decline of the manufacturing sector after revenues from natural resources strengthen the nation's currency, making its other exports more expensive for foreign buyers. That happened to the Netherlands after the discovery of a large natural gas field.
"Double Dutch" refers to unintelligible or garbled speech. "Dutch courage" is liquor-induced. A "Dutch book" guarantees that a bookie wins money no matter what happens. Likewise, "Dutch treat" or "Dutch party" is an idiom that plays on a supposedly negative cultural characteristic of the Netherlands: an economical or parsimonious attitude. Ever since the naval battles of the 17th century, the Dutch have been a verbal target in English. Now we can add "Dutch data" or "Dutch research" to the list. Within a relatively short time-frame, several Dutch professors were forced to leave their academic positions because of misconduct. They were hung out to dry after investigations revealed that data had been fabricated or "massaged" to strengthen outcomes.
Loss of scientific integrity covers a full range of activities. Fraud and plagiarism sit at the higher end of the scale; fishing for significant and publishable results sits at the lower extreme. Deliberately making up or massaging data to "find" significant results may be a rare form of scientific malpractice. These extremes, however, are part of a wider crisis in research. Honorary authorship and ghostwriting are, respectively, overt and covert practices for landing publications in top international journals, bringing in research grants, and training PhD students. Some essentials of research are lost in the process. For example, reproducibility and accuracy are supposedly basic principles of scientific research, yet funding is directed away from replication of experiments and towards "original" or "applied" studies.
In a 2005 PLoS Medicine essay, titled "Why Most Published Research Findings Are False", epidemiologist John Ioannidis argues that small sample sizes, small effect sizes, and "flexibility" in the research process contribute to a high rate of false positives among published research claims in biomedicine. A counterattack would require a change in scientific mentality that might be difficult to achieve. Therefore, high hopes are set on the methodology of randomized controlled trials and on some form of registration of data collections or networks of investigators.
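Ioannidis's core point can be made with simple arithmetic: if only a minority of tested hypotheses are true and studies are underpowered, most "significant" findings are false positives. The sketch below simulates this under assumed, illustrative parameter values (10% true hypotheses, 35% power, alpha of 0.05); none of these numbers come from the essay itself.

```python
import random

random.seed(42)

def simulate_ppv(n_studies=100_000, prior_true=0.10, power=0.35, alpha=0.05):
    """Return the share of 'significant' findings that reflect a true effect.

    prior_true: fraction of tested hypotheses that are actually true (assumed)
    power:      chance a true effect reaches significance (low for small samples)
    alpha:      false-positive rate when the null hypothesis is true
    """
    true_pos = false_pos = 0
    for _ in range(n_studies):
        if random.random() < prior_true:
            if random.random() < power:   # true effect detected
                true_pos += 1
        elif random.random() < alpha:     # null rejected by chance
            false_pos += 1
    return true_pos / (true_pos + false_pos)

# Analytically: 0.10 * 0.35 / (0.10 * 0.35 + 0.90 * 0.05) ≈ 0.44,
# so under these assumptions most positive findings are false.
print(round(simulate_ppv(), 2))
```

Increasing `power` (larger samples) or `prior_true` (better-motivated hypotheses) raises the positive predictive value, which is exactly the lever the essay points at.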
Enhanced research standards may also improve the situation. This website is intended to contribute to better statistical analyses. In a series of comments on Dutch journal articles and other Dutch research reports, some common mistakes are demonstrated. To quote Clarice Weinberg (2001), editor of Epidemiology, the journal whose founding editor K.J. Rothman began discouraging the use of P-values:
"It is time to stop blaming the tools, and turn our attention to the investigators who misuse them."
A 'blame and shame' approach will perhaps not be effective in the short run (and could harm my career), but will hopefully attract the attention of graduate students and PhD candidates.
Not a modern pillory
This is not a crusade against individual researchers, but an effort to improve data analyses. Contrary to popular belief, this website is not a contemporary pillory. It is intended as a modern showcase of common errors in statistical analyses. And whenever a rectification is published, I will gladly remove the item from this showcase. So keep me posted.
On a personal note: my field is social psychiatry, so feel invited to join in and add examples of "Dutch research" from other fields. Your views and other comments are welcome, but anonymous contributions will be discarded.