[Added 12/24/2010: There have been several comments on this post highlighting some of the controversies around this topic. "CFS" has been the subject of a lot of recent research, with very conflicting results. The comment by Richard Smith mentioned above was made in March 2010, and a lot of research has been reported since then. Hopefully we will soon find out the truth and get closer to providing a cure for our patients. This post is NOT about appraising the evidence in the "CFS" literature, and thus it is NOT a commentary on the Science study mentioned above. It is about the problems with the peer-review process in general as identified by a former editor of a major journal, a tentative exploration of an alternative model, and the barriers to such a model. The statements in the paragraph above referring to CFS and XMRV are there only to provide context. For the purposes of this post, it could just as well have been another condition and a different study.]
Richard Smith points out the problems with the current peer-review process:
- Faith-based, not evidence-based
- Slow
- Expensive
- Largely a lottery
- Poor at detecting errors and fraud
- Stifles innovation
- Biased
He suggests that we move away from our bias toward top journals, abandon the traditional peer-review process, and use a "publish and then filter" process instead.
This got me thinking about how this could work (a rough data-model sketch follows the list below):
- A central resource for online hosting of all research articles in each area of biomedical science. We would not have multiple journals competing and catering to the same audience.
- There would be some kind of simple review process to filter out "junk" and "spam" publications.
- The articles would need to include all the necessary raw data so anyone could rerun the statistical tests and verify the results.
- There would be a robust authentication scheme for authors.
- Each article would have a place for commenting, much like a blog, but you would need to be authenticated before submitting your comments. There would be no anonymous comments.
- Readers, after logging in, could rate each article on various criteria, e.g. study design, practical value, etc.
- The comments could also be rated up or down.
- It would be possible to track how many times an article was cited, tweeted, or posted on Facebook, and how many times it was downloaded, favorited, etc.
- Other studies on the same topic would also be linked from the article making it easy to find all the studies in one place.
- Part of the publication process would be to search for all the previously published related articles in this central repository and provide links to all of these.
- Viewers could see a timeline of how the literature on a specific topic developed.
- Over time, some studies, authors, and commentators would rise to the top.
- There would be a robust search and tagging system.
- Some articles could be accompanied by "editorials".
- Every time the IRB at an institution approved a protocol, it would create an entry in this central repository. Investigators would have to provide their data and a short summary at the end of the study even if they did not write it up fully. This would address the problem of publication bias toward positive studies and make meta-analyses more complete. If they did not provide this information, their ratings would go down.
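To make this list a little more concrete, here is a minimal sketch of the kind of data model that could sit behind such a repository. It is written in Python purely for illustration; every class and field name here (Author, Article, Rating, average_score, and so on) is hypothetical and not taken from any existing system.

```python
from __future__ import annotations

from dataclasses import dataclass, field
from datetime import date
from statistics import mean


@dataclass
class Author:
    """An authenticated identity; no anonymous contributions."""
    user_id: str   # e.g. an institutional or ORCID-style identifier
    name: str


@dataclass
class Rating:
    rater: Author
    criterion: str   # e.g. "study design", "practical value"
    score: int       # e.g. 1-5


@dataclass
class Comment:
    author: Author   # commenters must be logged in
    text: str
    votes: int = 0   # comments themselves can be rated up or down


@dataclass
class Article:
    title: str
    authors: list[Author]
    registered_on: date   # entry created when the IRB approves the protocol
    raw_data_url: str     # raw data included so anyone can rerun the statistics
    tags: list[str] = field(default_factory=list)
    related: list[Article] = field(default_factory=list)   # links to other studies on the topic
    comments: list[Comment] = field(default_factory=list)
    ratings: list[Rating] = field(default_factory=list)
    citations: int = 0    # a fuller version would also count tweets, shares, etc.
    downloads: int = 0

    def average_score(self, criterion: str) -> float:
        """Aggregate reader ratings on one criterion, e.g. 'study design'."""
        scores = [r.score for r in self.ratings if r.criterion == criterion]
        return mean(scores) if scores else 0.0
```

Simple aggregates like this, together with the citation and download counters, are what would let some studies, authors, and commentators rise to the top over time.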
Most of this functionality already exists - just look at YouTube, eBay, Amazon, etc. It would not take a lot to get this working. The problem is breaking down the traditions and existing norms. How can you replace the thrill and ego boost that authors get from having their article accepted by a "top-tier" journal? Would the really big multi-center, randomized, double-blinded trials with positive results get submitted to this central resource instead of to a top-tier journal? Would universities change their criteria for promotion and tenure?
We need to break down the walled gardens of some of our "top" journals and level the playing field, but it will be an uphill battle.
[Added 12/24/2010 - Looking at some of the comments on this post, there is clearly a lot of energy surrounding the research on "CFS". Would it not be easier for folks studying this condition if all the studies on "CFS" and its possible connection to XMRV were published in the same repository, so they would not have to go to multiple journals and databases to find this information, all the raw data was available, the pros and cons of each study were transparently viewable, and authenticated users could post comments in unmoderated fashion (as on this blog post) to add to the richness of the discussion? Why do we need so many barriers to collaboratively finding solutions to such vexing problems?]