Book review: Ontspoorde Wetenschap
On the tube in London, I pulled the book Ontspoorde Wetenschap out of my bag to lament to my travel companion that I was reading this very interesting Dutch book about research fraud, but had nobody to discuss it with, because it was in Dutch and nobody else in London had read it.
Immediately upon spotting the book, a guy walked over to us across the train carriage and said excitedly, in a Dutch accent: “I’ve just finished that book – it’s great!” He told me what his favourite part was, recommended Ben Goldacre as a British equivalent (“but he’s trying to be too entertaining”), and then got off at the next stop.
Most of my blog readers can’t read Dutch and aren’t riding the tube with me, so I’ll try to summarize why it was so interesting – and yet also incredibly frustrating.
Frank van Kolfschooten is a Dutch science journalist who wrote a book about research fraud in 1993 called Valse Vooruitgang (“False Progress”). In 2011, research fraud in the Netherlands hit national and international media with the case of Diederik Stapel, who had been making up entire datasets for years, across many high-profile publications and affecting many collaborators. It was time for Van Kolfschooten to write a sequel. That book is Ontspoorde Wetenschap (“Derailed Science”), published in 2012.
The entire book consists of only four chapters. The two longest chapters are each about 100 pages long and consist of many, many, many examples of research fraud. Some cases took up just a few paragraphs, others several pages, but regardless of length, I could only read a few cases at a time because it was so frustrating. Not because of the way it was written – it was easy to read – but because the cases were so often avoidable, or had reached points where something could have been done but wasn’t.
Van Kolfschooten found his cases by carrying out a large nationwide survey. Responding was voluntary, so he admits that these cases are only the tip of the iceberg, and that he doesn’t know how big that iceberg actually is. But even this small sample contains far too many cases. I wish the book had been shorter, simply because that would have meant fewer cases of fraud to write about. And these were only cases where the perpetrator, victim, or whistle-blower was based in the Netherlands. There are many others.
There were recurring themes: In a few cases, journals had not retracted articles that had been shown to be fraudulent. In two cases, a researcher accused of fraud insisted that they had lost the only existing copy of their raw data in a computer crash. There were examples of senior researchers being given too much independence without oversight, leaving department heads unaware of inappropriate research practices. Pressure to publish, and the (perceived) need for a long list of publications – preferably in high-impact journals – was often cited, both by the researchers who committed fraud and by those trying to analyse what went wrong. Journals were also to blame, for prioritising hype over sound science and neglecting negative results, technical articles, and replications.
Between his previous book and this most recent one, the Netherlands has set up a national board for research integrity, and an international body of publishers and editors has set up the Committee on Publication Ethics (COPE). There are now at least systems in place to report fraud, but as the long list of recent examples in this book shows, this doesn’t seem to have discouraged it.
One of the book’s conclusions gave me some hope, though. In many cases, the fraud could have been avoided or picked up sooner if only the raw data had been made available to peer reviewers and others. This was one of the main recurring themes in the book, and one that Van Kolfschooten also highlights in his conclusion. I work for a publisher that publishes the underlying research data alongside the articles, because it makes authors think about what they are submitting and gives reviewers a chance to check the data if they have any doubts. There are other journals like this, and an increasing number of data repositories, data submission guidelines, and mandates. It was good to see a book full of examples of the negative effects of not sharing data, but still depressing that there were so many examples to be found.
If you can read Dutch, I highly recommend this book (in small doses, with breaks to restore your faith in the integrity of the general research population). If you can’t read Dutch, my anonymous co-passenger on the London Underground recommends you read Ben Goldacre as an alternative.