
Education And Debate

Systematic Reviews: Some examples

BMJ 1994; 309 doi: https://doi.org/10.1136/bmj.309.6956.719 (Published 17 September 1994) Cite this as: BMJ 1994;309:719
  P Knipschild
  Department of Epidemiology, University of Limburg, PO Box 616, 6200 MD Maastricht, Netherlands.

    Reviewing the literature is a scientific inquiry that needs a clear design to preclude bias. It is a real enterprise if one aims at completeness of the literature on a certain subject. Going through refereed English language journals is not enough. On line databases are helpful, but mainly as a starting point. This article gives examples of systematic reviews on vitamin C and the common cold, pyridoxine against the premenstrual syndrome, homeopathy, and physiotherapy.

    You will have heard of Maastricht - in 1992 the European Union treaty was signed there. Some people dislike Maastricht because it seems to stand for the ideal of the United States of Europe, but many of us in Maastricht do not even know what the treaty is about. What we like is to sit together and enjoy our Burgundian way of living. Maastricht is one big sidewalk cafe.

    People in my town are very inventive in finding reasons for painting the town red. Of the many festivals that we have, carnival in the late winter is definitely number one. Anyone born and bred in Maastricht would not dream of escaping the noise, the jigging, and the many beers. For almost a week we live in sin, and after that we are so depleted that we need a few days off to recover. Nearly every good carnivalist gets a sore throat, a stuffy nose, and other signs of a common cold: it is a marker that we have done our duty.

    A carnival trial?

    Some years ago a doctor who was not from Maastricht asked me to help set up a preventive trial on vitamin C and the common cold. I immediately thought of making it a carnival trial. So I suggested, “Take 200 carnivalists and randomise them to placebo or vitamin C before the carnival storm breaks. You will have the answer right away.”

    But then I had to think more professionally. “Wait a minute,” I said, “did you review the literature first?” I explained to him that it was wrong to begin a new trial if you have not done a thorough literature search. Off he went, but he was back a week later. “Did you know,” he started, “that double Nobel laureate Linus Pauling never has a common cold because he marinates himself in vitamin C?” I told my visitor that case reports could not convince me anymore. “It is trials that I want,” I added, “trials and nothing less.”

    Example of a non-systematic review

    Then my visitor showed me Pauling's 1986 book, How to Live Longer and Feel Better.1 “Here it is,” he said, “chapter 13 tells you all you want to know about trials of vitamin C and the common cold: Pauling refers to more than 30 to prove his point. And nearly all are positive.”

    The chapter in the book was an updated version of Pauling's earlier bestseller, Vitamin C and the Common Cold. It is a good example of an extensive but non-systematic review of the literature. It does not tell the reader anything about the design of his survey of trials. For a start, what were the admission criteria for his studies, and where did he look for them? Was the methodological quality assessed blindly, or at least independently of the outcome? And how did Pauling decide whether the result of a certain trial, and the combined result of the better ones, was positive, negative, or in between?

    The hazard of a haphazard review is obvious. Probably all of us are prejudiced and tend to focus on what we like to see. And, even worse, some tend to dismiss anything that does not suit their purpose. This makes it worthwhile to set certain rules before starting a review process. Reviews are scientific inquiries and they need a clear design to preclude bias.

    An exhaustive search

    Some colleagues and I, wanting to outsmart Pauling and his supporters, made a plan and started a new and exhaustive search.2 Of course this included Medline from 1966 up to 1991. Our literature computer cranked out lots of studies, among them 22 controlled trials. Next we checked the references of these studies, which yielded 15 additional trials. Then we checked the references of the references, which yielded another nine. The third check gave us only one extra, bringing the total to 47. (If you do the same with Embase you find 15+16+11+2=44 trials, all of which you have already found with Medline and the three checks of references.)
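
    To keep track of how these rounds add up, here is a minimal sketch in Python; the counts come straight from the paragraph above, while the round by round loop is only an illustration, not a description of our actual search procedure.

        # A minimal sketch of the cumulative yield of the snowball search
        # described above. The counts are taken from the text; the loop
        # structure is illustrative only.
        medline_rounds = [22, 15, 9, 1]   # Medline hits, then three successive checks of references
        embase_rounds = [15, 16, 11, 2]   # the same exercise starting from Embase

        total = 0
        for round_no, found in enumerate(medline_rounds):
            label = "Medline search" if round_no == 0 else f"reference check {round_no}"
            total += found
            print(f"{label}: +{found} trials (running total {total})")
        # Medline search: +22 trials (running total 22)
        # reference check 1: +15 trials (running total 37)
        # reference check 2: +9 trials (running total 46)
        # reference check 3: +1 trials (running total 47)

        print(f"Embase route: {sum(embase_rounds)} trials, all already found via Medline")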

    What happens if you still do not stop there? We went on searching Index Medicus from 1940 to 1965 manually, checked through Current Contents, bent over textbooks on vitamins (including Pauling's book), wrote and talked to researchers who had done interesting trials, went to special libraries such as Hoffmann-La Roche's “World of Vitamins” in Switzerland, and told everyone that we were after vitamin C and the common cold. By doing this we added another 14 trials, bringing our total to 61.

    We feel that our collection is still far from complete. With the Medline search we got only 36% of the studies, but checking the references, and the references of the references was very rewarding: this provided 75%. Only fanatic collectors can do much better.

    Next we graded every trial according to its methodological soundness, independently of the results, of course.3 On a scale from 0 to 12, 15 of the 61 trials scored seven or more points. Interestingly, only one of these 15 trials was not in Medline.
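
    For readers who want to see what grading “independently of the results” can look like in practice, here is a minimal sketch; the items and their weights are hypothetical, and only the 0 to 12 scale and the cut off of seven points are taken from the text.

        # A minimal sketch of scoring trials on methodological items alone,
        # blinded to their results, and then keeping the better ones.
        # Item names and weights are hypothetical; the 0-12 scale and the
        # seven point cutoff follow the text.
        ITEM_POINTS = {
            "adequate randomisation": 3,
            "placebo control": 3,
            "double blinding": 3,
            "dropouts described": 3,
        }

        def methods_score(methods_section):
            """Score design features only; the results section is never read."""
            return sum(p for item, p in ITEM_POINTS.items() if item in methods_section)

        trials = {  # toy data: trial id -> features found in its methods section
            "trial A": {"adequate randomisation", "placebo control", "double blinding"},
            "trial B": {"placebo control"},
        }
        best = [t for t, methods in trials.items() if methods_score(methods) >= 7]
        print(best)  # ['trial A']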

    Vitamin C for your cold?

    To know what vitamin C can do to a common cold, you should of course rely heavily on the results of the best studies. It does not make sense to combine the top 15 with the other 46 and do a statistical precision (cumulatively pooled) meta-analysis. Nor is such pooling sensible if there are large differences between trials in the choice of patients, interventions, and measurements of effect.
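
    For those unfamiliar with such pooling, the sketch below shows one common form, fixed effect inverse variance (precision) weighting, with invented effect estimates; it is an illustration of the technique only, not an analysis of the vitamin C trials, and it is exactly this kind of calculation that misleads when trials differ greatly in patients, interventions, and outcome measures.

        # A minimal sketch of fixed effect, inverse variance ("precision") pooling.
        # The effect estimates and standard errors are invented for illustration.
        from math import sqrt

        trials = [
            (-0.10, 0.08),   # (effect estimate, standard error), hypothetical
            (-0.05, 0.12),
            (0.02, 0.15),
        ]

        weights = [1 / se ** 2 for _, se in trials]   # precision = 1 / variance
        pooled = sum(w * est for (est, _), w in zip(trials, weights)) / sum(weights)
        pooled_se = sqrt(1 / sum(weights))
        print(f"pooled effect {pooled:.3f}, standard error {pooled_se:.3f}")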

    After a review of the literature,3, 4 here is my conclusion: vitamin C, even in gram quantities per day, cannot prevent a cold. On the other hand, if you already have a cold, a megadose of, say, 1 g vitamin C may slightly decrease the duration and severity of your cold (perhaps by 10%).

    What about Pauling's review? He did not mention five of my top 15 studies (all published before 1986), and two others were referred to only in passing. The other eight trials were discussed in his book, but some of them could not show a preventive effect because they considered only the therapeutic benefit of vitamin C. Two preventive trials that showed no effect were “unfortunately flawed,” according to Pauling. Yet, if you read his colourful story instead of a rather dull systematic review, it all sounds very convincing. He ends his case: “Catching a cold and letting it run its course is a sign that you are not taking enough vitamin C.”

    Pauling also emphasises an important side effect of taking a megadose of vitamin C (and other nutrients) every day. He believes that, if you do this, together with a few other healthful practices, from youth or middle age, you can “extend your life and years of well being by 25 or even 35 years.”

    New trial

    If you want to do a new trial, there are at least two reasons why you should do a systematic review first. One reason is that you can learn a lot from earlier studies. Talking to the authors of earlier studies is especially useful; it should be part of the preparation of a new trial. They will tell you about things that went wrong but cannot be found in their papers. It prevents you from making the same mistakes.

    The other reason for a review is that sometimes a new trial can add little to what is already known. Never believe beforehand that you are the first to study a certain subject. Many honest investigators missed earlier research or at least did not refer to it in their publications. Here is one example.

    Some patients, and doctors for that matter,5 believe that pyridoxine (vitamin B-6) works against the premenstrual syndrome. What does the literature say? Researchers from Oxford published a trial on this subject in 1989, referring to five earlier trials.6 By then I knew of six other trials (including two on premenstrual mastalgia), of which two were large, well performed trials. One year later, researchers from Philadelphia also published a report on pyridoxine and the premenstrual syndrome.7 They did not refer to nine of the 12 trials published before 1990, which included four well performed trials.8

    Would these researchers have started their studies if they had known of all other published and ongoing trials? Every important trial that they had missed showed an ambiguous or negative result.

    Foreign languages

    There is much grey literature around, especially for research that does not concern specialist, mainstream medicine from Anglo-American countries. Of course you must include publications in less famous, non-refereed journals (and even “internal” reports) in your systematic review9 - you are the referee. Some papers are of high quality, but their authors are not yet familiar with the idea of writing a report for one of the well known English language journals.

    In the meantime, good reviewers should know their languages, or at least have people around who are not afraid of reading “Eine Placebo-kontrollierte Doppelblindstudie” or “Une étude randomisée à double insu face au placebo.” There is nothing fundamentally wrong with publications in “foreign” languages.

    Grey literature

    I recently discussed the importance of searching the grey literature on alternative medicine.10 It is a real enterprise that takes its toll in blood, sweat, and tears. One of the examples I gave was homoeopathy.

    Helped by alternative researchers, my colleagues and I turned many libraries upside down to get a collection of 107 controlled trials (published before May 1991). Medline yielded only 18 publications. Checking the references and the references of the references increased the number to 30, still not more than 28%. For homoeopathy, other sources such as congress reports and dissertations (from Germany and France) were more fruitful.2

    Most (61%) of the trials that we could find on homoeopathy were published in languages other than English. We graded all the trials for their methodological quality on a scale of 0 to 100; 16 scored 60 or more points. Only three of the better (full) publications were in English.11

    The physiotherapy literature

    Several years ago my department began to study physiotherapy. For dubious reasons the Dutch government considered classical physiotherapy to be mainstream and manual therapy to be alternative medicine. In our trial manual therapy seemed more effective for patients with persistent spine problems.12

    We started, however, by searching the literature.13 The search really got out of control, partly because further trials on back pain, shoulder stiffness, and ankle sprains were also initiated. In the end we asked everybody to help us find controlled trials on exercise therapy, manipulation, or physical applications. Of course we searched Medline and Embase, but we also glanced at many unindexed journals, books, and congress reports. And we checked references endlessly.

    So far we have found about 750 randomised clinical trials (and another 750 controlled studies without random allocation). Most were done among patients with back pain (175), knee problems (114), lung dysfunction (77), shoulder stiffness (51), stroke (45), and ankle problems (43). Trials have been published in almost every journal that you can think of. Again, many of the trials that we found were not in Medline or Embase. An unindexed journal called Physiotherapy was second (with 23 trials) after Archives of Physical Medicine and Rehabilitation (38 trials). Half of all trials were published in English, but this proportion may decrease now that we are trying harder to find publications in other languages.

    As the trials come in, a special group grades the studies blindly according to their methodological quality. Unfortunately, the quality seems to be low. On a scale from 0 to 100 the median is only 40, and only 2% of the trials score 60 points or more. I am well aware that efficacy studies are more difficult to do for physiotherapy than for drugs, but one can do much better. To improve clinical research on physiotherapy my university has started a special doctoral programme for physiotherapists who are interested in a research career.

    Summing up

    I presented several examples of reviews. The first was on vitamin C and the common cold, for which there is already a megadose of literature. One extensive but non-systematic review seems to be seriously biased towards the ideas of the author. The example also shows that searching with Medline yields only a limited number of publications. However, checking the references, and the references of the references, is very rewarding.

    One of my short examples was about pyridoxine in the treatment of patients with a premenstrual syndrome. I wondered whether the researchers of recently published trials would have started their studies if they had known of all published or ongoing trials. Next I argued, using homoeopathy as an example, that good researchers do not restrict their reviews to refereed papers, or papers that are written in English. It takes less time and money to translate a paper than to do a new trial.

    Finally I told you about our large collection of trials on physiotherapy. Of course, we also use them to write reviews on physiotherapeutic topics. The BMJ published some of them, showing interest in systematic reviews.14, 15 The Cochrane Centre (now the Cochrane Collaboration) invited my colleagues and me to help with its enormous enterprise to have every old and new trial computerised - we will be glad to do that for physiotherapy.

    Ideally, all published and ongoing research on a certain subject is stored in Oxford or in one of the many collaborating centres for free use. Such a dynamic vademecum, which should be checked by every investigator who thinks of doing a clinical trial, is needed; the Cochrane Collaboration deserves all the support it can get.

    I thank the many people in the Department of Epidemiology, University of Limburg, who actually spent more time searching the literature than I did. The project was supported by several grants from the Dutch Ministry of Welfare, Public Health, and Cultural Affairs.

    References

    1.
    2.
    3.
    4.
    5.
    6.
    7.
    8.
    9.
    10.
    11.
    12.
    13.
    14.
    15.