If you watch people at the cinema, you may notice that they basically fall into two groups: those who can’t wait to jump out of their seats and get back into the daylight (or the pub) as soon as the film ends and the titles begin, and those who remain in their seats to watch the credits. The latter group are sometimes rewarded for their patience with extra scenes or even ‘out-takes’, showing the actors getting key scenes or lines wrong and then falling about laughing. I am one of the people who stay behind until the end, and who later watch those documentaries about how films are made. For me, there’s value in ‘the story behind the story’.
It can be equally fascinating to hear the stories behind research stories; to find out what went on behind the scenes, especially where those goings-on didn’t make it into the published paper. In his reflections on the pioneering research studies he led in the field of neonatal care, Bill Silverman (2003) tells of practitioners who would hold the opaque allocation envelopes up to the light in the hope of reading what was inside, so that they could choose the intervention they thought would best suit the babies they were caring for. This demonstrated excellence in people-centred care and practice, but unfortunately added an element of unwanted bias to what was supposed to be a randomised controlled trial. Consequently, he and his wife spent a couple of days wrapping the allocation papers in black paper and resealing the envelopes in order to prevent this.
Not all hidden research stories are discussed as openly and honestly as this by the researchers themselves. For instance, it was Menticoglou and Hall (2002), rather than the original researchers, who pointed out that there was a strong element of behavioural bias in the Canadian post-term trial (Hannah et al 1992). This is often the case; the study itself looks rather good on paper, and then an insightful reviewer comes along and points out that there may have been other stories going on underneath it all which put the results in a different light.
Sometimes it is the practitioners who are involved in the day-to-day practicalities of the trial (rather than the people whose names are on the publications) who offer insight into the hidden story. One midwife who took part in one of the UK third stage trials told me that:
“I know it said that we had training in physiological third stage … but it wasn’t much … lots of the midwives carried on as if they were still doing it that way, just without the synt … that was what they were comfortable with. You can understand it, they had been actively managing third stage for years … they knew it prevented PPH, and then suddenly they come along and say we have to do it another way…”
The emergence of “rapid response” facilities such as that on the BMJ website – where people can raise questions relating to the story of the study, and sometimes get an answer from the researchers themselves – was an important step forward in seeing what goes on behind the scenes. Hamilton and Kessler’s (2004) suggestion that papers should include an ‘honesty box’ was a great one, but even the now common ‘conflict of interest’ declaration can hide important forms of bias. I think that there’s still more we can do. Perhaps one day those of us who wait for the film credits and are rewarded with another part of the story will also find better ways of sharing the stories behind the research.
Hamilton WT, Kessler D (2004). BMJ papers could include honesty box for research warts. BMJ 328: 1320.
Hannah ME, Hannah WJ, Hellmann J et al (1992). Induction of labor as compared with serial antenatal monitoring in post-term pregnancy: a randomized controlled trial. New England Journal of Medicine 326(24): 1587-92.
Menticoglou S, Hall PF (2002). Routine induction of labour at 41 weeks gestation: nonsensus consensus. British Journal of Obstetrics and Gynaecology 109: 485-91.
Silverman WA (2003). Personal reflections on lessons learned from randomized trials involving newborn infants, 1951 to 1967. James Lind Library.