Peer reviewed article ‘wrong’ shock

Dave Johns at Slate has written an interesting piece on self-publication and peer review, using a recent study about ‘social contagion’ as a peg.  I wrote a post back in March about some of the pitfalls of peer review.  In it, I discussed the Social Science Research Network – a self/early-publication site used by academics in politics, economics, social statistics and so on – and suggested that the names and institutions of a paper’s authors, along with the number of citations it picks up, could be used as signals of its academic worth.

The social contagion paper Johns is complaining about was published on SSRN before being picked up by pretty much every media organisation going (including the Washington Post, the New York Times, the Guardian and so on).  It claims that divorce is catching.  It has been downloaded 779 times (download rank 9,270 – for comparison, a really high number of downloads would be around 100,000) but has never been cited.  The authors are high-profile and come from well-respected institutions (Brown, UC San Diego and Harvard), yet their paper has failed to attract any academic attention – either good or bad.  I’m tempted to say that citation signalling is doing its job: the paper is insufficiently interesting even to attract criticism, so we should take care not to over-rely on its findings.  Still, it was only published at the end of 2009, so I am reluctant to bang my gavel just yet.

Perhaps this is a simple case of bad science journalism.  A (flawed) paper is picked up by (bored) journalists and instantly becomes fact.  The paper has since been picked apart by writers and bloggers, so the record is slowly being set straight.  Maybe we shouldn’t worry?  Not so fast.  Fowler and Christakis also published a very similar piece on social contagion (this time of obesity) in a famous and respected peer reviewed journal, the New England Journal of Medicine.  That social contagion article has picked up no fewer than 239 citations.  So it’s peer reviewed (that’s one signal) and well cited (another).  Can we feel confident in it?  Dave Johns says not, and plenty of other people, including some in my blogroll, agree with him.  Johns relies quite heavily on a (peer reviewed, naturally!) paper by Russell Lyons which critiques Fowler and Christakis’ approach.  I find Lyons’ paper extremely persuasive – but I’m no-one special.  How can a journalist know who to believe and what to print?

What can we conclude?  That journalists shouldn’t write about things they don’t understand?  That every news outlet should employ a statistician as a fact-checker?  It would be nice… unlikely, but nice!  Of course, academics should take care when speaking to journalists, but part of the problem here is that researchers are under pressure to demonstrate the ‘impact’ of their research.  A key plank of ‘impact’ is dissemination of research findings in such a way that they can be picked up by users, such as policy-makers.  The media is an important intermediary – have a look at ESRC’s impact evaluation pages to see what I mean.  Academics are pretty much encouraged to talk to the press, and the end results are not at all surprising.

In fact, I’m not sure that the end results are particularly terrible either.  Peer reviewed but problematic statistics were published in simplified form in newspapers all over the world.  This made other researchers mad enough to do something about it and has resulted in a slew of public criticism.  Yes, it’s unfortunate that millions of newspaper readers still think that obesity is contagious, but it’s hardly the end of the world.  And, whichever way you look at it, people really shouldn’t believe everything they read in newspapers, should they?  The problem remains, though, that peer review has been somewhat ineffective… so is it really worth the candle?


2 Responses to Peer reviewed article ‘wrong’ shock

  1. Pete says:

    Great blog, Vic. It rang a bell when I read it, so I went to a little stash of book reviews I keep with the intention of one day buying the books, or at least following them up. There’s a review of Christakis and Fowler’s book ‘Connected: the surprising power of social networks and how they shape our lives’, released in 2009. The review is from the FT, which basically describes the book’s propositions as given fact. Lazy journalism indeed.
