
Editorial

Indian Pediatrics 2001; 38: 1349-1353  

Using Evidence to Improve Clinical Practice


Doing what’s best for people and communities is the aim of health professionals, but are our decisions based on reliable research? Evidence-based medicine aims for health care decisions that draw on relevant reliable research findings from across the world. Randomized controlled trials help us delineate the relatively modest comparative advantages we anticipate for many health care interventions, and systematic reviews of randomized controlled trials summarize these data using methods that minimize bias and enhance reliability(1,2). The Cochrane Collaboration is an international organization that has done much to highlight the need for reliable reviews to inform practice, and has now generated some 2000 reviews across many health care specialties.

In the early 1990s, some policy makers and clinicians expected evidence-based approaches to lead a revolution in care, with a cheaper, more efficient service. In the event, evidence-based medicine has had a substantive impact on the way health professionals think and act, rather than serving as a prescription for clinical reform. Evidence-based approaches are now central to undergraduate and postgraduate medical training, and it is clear that health professionals need to learn new skills in interpreting and using evidence in clinical practice.

Expectations and Realities

Why then have we not seen a revolution in health care with substantive changes in what we do? The mechanism for change is appealing: take a health problem, carry out systematic reviews about benefits and harms, use the information to decide best practice, then implement it. The problem is that the process for change rests on three assumptions: that the knowledge is there, hidden away in the existing literature; that knowledge is pure in some way and does not need interpreting; and that knowledge determines behavior. Unfortunately, experience suggests that none of these assumptions holds true.

Systematic reviews generate new knowledge when they bring together many small studies. For example, a Cochrane Review of amodiaquine for malaria, drawing on many small studies mainly in Africa, showed convincingly that the drug was more effective than chloroquine(3,4). Systematic reviews may also demonstrate uncertainty: for example, growth monitoring in children is widely practiced in developing countries, but the trial from George and colleagues showed no benefit, and the systematic review of the topic raised doubts about whether the practice is worthwhile(5,6). Similarly, the World Bank promotes routine anthelmintic drugs for school children every six months: the trial from Awasthi and colleagues highlighted the potential of the intervention, and the systematic review underlined this but concluded that further research was needed before countries invested their scarce resources in large community programs(7,8). Taking an example relevant to hospital doctors, the systematic review of steroids for tuberculous meningitis suggested potential benefit, but as yet this remains unproven(9).

The second assumption made is that research evidence is somehow pure, and that people producing systematic reviews are engaged in a process of presenting information impartially for the reader to decide. But systematic reviews are like any other form of research, subject to bias and influence from the researchers and others. As we have seen recently, people have views about what data should be taken into account and how it should be presented. Cochrane reviewers and editors could not agree on what data to include and how the findings should be interpreted in a review of breast cancer screening(10,11). Exactly the same data in a meta-analysis can be interpreted in different ways: we showed a meta-analysis of artemether versus quinine for severe malaria to a clinician and a pharmacist working in the same hospital in Zimbabwe, and they reached diametrically opposite conclusions about which drug was best (PG, personal observations)(12).

The third assumption is that knowledge leads providers to change their practice, and yet we all know this is simplistic. Decisions about health care are based on habit, past training, what the Professor does and what the health service allows; and, at times, may depend on an individual clinician’s preferences, including the benefits to him or her from the sale of drugs or diagnostics arising from the consultation(13).

Changing for the Better

Systematic reviews rarely provide simple answers to best practice, and it is naive to expect this. What they have done is to shake up the medical profession globally: we have accepted "best-guess" as sufficient to make decisions about other people’s lives for years, but the evidence-based approach is forcing us to be honest about our uncertainty. We need this to move forward. We have to admit what we do not know in order to delineate the research needed to find the answers. The simple fact that people are questioning practices is a good thing, as it helps clinicians to learn and move medicine forward.

Systematic reviews sometimes provide clear messages. Where current practice varies from this research evidence, this provides a mandate for change. Organizations, Ministries and clinicians are grappling with how to identify these areas, and how to bring about the change required to make care more compatible with the research evidence. Active players include the World Health Organization’s technical departments and Essential Drugs Program; national governments in several developing countries; and individual hospital and healthcare providers scattered across the world. The Effective Health Care Alliance Program is a network of such people, and our collective experience suggests the following three practical points may help people and organizations that want to translate research into policy and practice.

Focus

Translating research into policy and practice needs a focus. Suitable topics should show benefit over existing interventions, and be cheap and easy to implement. For example, reduced osmolarity oral rehydration solution seems to be more effective for diarrhea in children, and a change in formula is currently being considered by WHO and UNICEF; and caregiver support during labor seems to help both mother and infant(14,15). So people responsible for health care need to stay in touch with research evidence and seek out the reviews that have important implications for policies and guidelines.

Change is also a priority for interventions which are widely practised but where the evidence suggests no benefit. Change in practice is even more important if these interventions use resources, have the potential to do harm, or are unpleasant for the patient. For example, routine episiotomy for vaginal delivery is still widely practised in some developing countries, despite evidence that it causes harm, prolongs hospital stay, and is painful(16,17). Human albumin is widely used to treat shock, yet it is expensive, has no benefit over saline, and may even be associated with a higher mortality(18). These are target areas where change could improve care and reduce costs.

Understand the Science

Policy makers and clinicians commonly view their own populations and patients as special, somehow different from everyone else, so that evidence from reviews does not apply to them. There are dangers here: if this were the case, why bother to do research in the first place? However, such comments often reflect that evidence-based specialists have not communicated the science behind their methods well enough for individuals to critically appraise systematic reviews themselves and use the information in their own setting.

Policy makers, clinicians and user group agents need to understand research evidence and have skills in interpreting and using it. People need to understand where the messages come from and how they are generated, particularly as they may change over time. We have found that engaging people in conducting systematic reviews is a good way of helping them understand what evidence is, and that formal training in critical appraisal of systematic reviews is central to interpreting them.

Table I. Strategy to Influence Midwifery Practice in Hospitals
The purpose of the Better Births Initiative is to improve maternity care by:
1. Identifying specific changes that are achievable, and could dramatically improve women’s experiences during labor.
Specific changes identified include
(a) Encouraging a partner, friend, relative or lay carer to support women during labor.
(b) Stopping routine procedures that are of no proven benefit, particularly if they are embarrassing or uncomfortable (for example, shaving, supine position for birth).
(c) Avoiding making interventions routine where there is no evidence of benefit. This includes: routine enemas, routine restriction to bed, routine intravenous fluids, and routine episiotomy.
2. Developing and testing innovative methods to bring about these changes.
3. Developing an agreed strategy (the Better Births Initiative) which is simple, accessible and applicable to low-income countries.
4. Implementing the strategy in local spheres of influence.
5. Encouraging others to adopt the package.

 

Change is a Social Process

Health care provision takes place within social networks and organizations that are like living organisms, with complicated politics and random happenings influencing every event and decision. It is clear from systematic reviews of behavior change that effective interventions for change in practice should be multi-dimensional and focused(19). Providers need small group, interactive educational interventions that engage them in the process of change, encourage critical thinking about current practice, and increase individual capability(20).

More important to remember is that change is likely to be incremental and random(21). Classic linear models that describe change as a logical and planned process are inappropriately applied to health professional behavior. The origins of changes in practice are more likely to be small experiments that, if successful, are accepted and transformed into action. Complexity science acknowledges the complicated nature of health systems and the interactions between those working within them, and suggests that an organic approach to change is best(22). So we need to keep trying different approaches to change, and not give up when things do not work, since change cannot be predicted. What is important is to find approaches that do seem to be effective, and look at ways to apply them in other settings.

With collaborators, we have been exploring ways to make childbirth care in developing countries more evidence-based. The ‘Better Births Initiative’ is an innovative approach using attractive, interactive materials (including workbooks, posters, video presentations, and self-audit) drawing on the principles of evidence-based medicine and relevant systematic reviews. This is being tested in educational workshops with midwives in South Africa (Table I)(23). Early results suggest that complexity theory holds true: change happens sometimes, it cannot be predicted, and it takes time.

We believe that translating evidence into practice is a complex process to institutionalize, but it needs us all to understand the principles and to see where each of us as individuals has an opportunity to influence change and do our job better.

Acknowledgements

We thank the Department for International Development, UK, for funding. The views expressed are entirely those of the authors.

Competing interests: We are both employed through a U.K. Government funded research grant that aims to promote evidence-based approaches in middle and low income countries.

Paul Garner,
Helen Smith,

Cochrane Infectious Diseases Group,
Effective Health Care Alliance Program,
Liverpool School of Tropical Medicine, U.K.


Correspondence to:
Professor Paul Garner,
International Health Division,
Liverpool School of Tropical Medicine, 
Pembroke Place, Liverpool L3 5QA, U.K.,
E-mail: [email protected]

Key Messages

• Health professionals need to understand the principles of evidence-based medicine and be able to critically appraise systematic reviews.

• For effective change, we need to focus on common interventions where practice appears to vary from the evidence.

• Changing practice is chaotic. We need a variety of approaches and cannot expect success every time.


References


1. Collins R, Peto R, Gray R, Parish S. Large-scale randomized evidence: Trials and overviews. In: Oxford Textbook of Medicine, 3rd edn. Eds. Weatherall DJ, Ledingham JGG, Warrell DA, New York, Oxford University Press, 1996.

2. Clarke M, Oxman AD. Cochrane Reviewers’ Handbook 4.1.2 [updated March 2001]. In: The Cochrane Library, Issue 2, 2001. Oxford, Update Software. Updated quarterly.

3. Olliaro P, Nevill C, Ringwald P, Mussano P, Garner P, Brasseur P. Systematic review of amodiaquine treatment in uncomplicated malaria. Lancet 1996; 348: 1196-1201.

4. Olliaro P, Mussano P. Amodiaquine for treating malaria (Cochrane Review). In: The Cochrane Library, Issue 2, 2001. Oxford, Update Software.

5. George SM, Latham MC, Abel R, Ethirajan N, Frongillo EA. Evaluation of effectiveness of good growth monitoring in South Indian villages. Lancet 1993; 342: 348-352.

6. Panpanich R, Garner P. Growth monitoring in children (Cochrane Review). In: The Cochrane Library, Issue 2, 2001. Oxford, Update Software.

7. Awasthi S, Pande V, Fletcher R. Effectiveness and cost-effectiveness of albendazole in improving nutritional status of pre-school children in the urban slums. Indian Pediatr 2000; 37: 19-29.

8. Dickson R, Awasthi S, Demellweek C, Williamson P. Anthelmintic drugs for treating worms in children: Effects on growth and cognitive performance (Cochrane Review). In: The Cochrane Library, Issue 2, 2001. Oxford, Update Software.

9. Prasad K, Volmink J, Menon GR. Steroids for treating tuberculous meningitis (Cochrane Review). In: The Cochrane Library, Issue 2, 2001. Oxford, Update Software.

10. Olsen O, Gotzsche PC. Cochrane review on screening for breast cancer with mammography. Lancet 2001; 358: 1340-1342.

11. Horton R. Screening mammography-an overview revisited. Lancet 2001; 358: 1284-1285.

12. McIntosh HM, Olliaro P. Artemisinin derivatives for treating severe malaria (Cochrane Review). In: The Cochrane Library, Issue 2, 2001. Oxford, Update Software.

13. Garner P, Kale R, Dickson R, Dans T, Salinas R. Implementing research findings in developing countries. BMJ 1998; 317: 531-553.

14. Kim Y, Hahn S, Garner P. Reduced osmolarity oral rehydration solution for treating dehydration caused by acute diarrhea in children (Cochrane Review). In: The Cochrane Library, Issue 2, 2001. Oxford, Update Software.

15. Hodnett ED. Caregiver support for women during childbirth (Cochrane Review). In: The Cochrane Library, Issue 2, 2001. Oxford, Update Software.

16. Carroli G, Belizan J. Episiotomy for vaginal birth (Cochrane Review). In: The Cochrane Library, Issue 2, 2001. Oxford, Update Software.

17. Maduma-Butshe A, Dyall A, Garner P. Routine episiotomy in developing countries. BMJ 1998; 316: 1179.

18. The Albumin Reviewers (Alderson P, Bunn F, Lefebvre C, Li Wan Po A, Li L, Roberts I, Schierhout G). Human albumin solution for resuscitation and volume expansion in critically ill patients (Cochrane Review). In: The Cochrane Library, Issue 2, 2001. Oxford, Update Software.

19. NHS Centre for Reviews and Dissemination. Getting research findings into practice. Effective Health Care Bulletin, Volume 5(1). York, University of York, 1999.

20. Fraser S, Greenhalgh T. Coping with complexity: Educating for capability. BMJ 2001; 323: 799-803.

21. Plsek P, Greenhalgh T. The challenge of complexity in health care. BMJ 2001; 323: 625-628.

22. Committee on Quality of Health Care in America. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington DC, National Academy Press, 2001.

23. http://www.liv.ac.uk/lstm/bbimainpage.html
