Sunday, December 10, 2006

Over-reliance on RCTs leading to "evidence-based paralysis"?

Via DB's Medical Rants, retired doc has posted about the limitations of the randomized controlled trial (RCT) in addressing certain patient, diagnostic, and therapeutic situations. Both posts note the use of the term "evidence-based paralysis" in a letter to the editor of the Archives of Internal Medicine this past summer.

The letter to the editor (full-text access requires subscription) comments that there is a lack of RCT data to support tight glycemic control in type 2 diabetes; it was written in response to this study by Ziemer et al., which explored the use of computerized reminders and feedback to prompt closer clinician management of patients' HbA1c levels. In their response to the letter, the authors of the original study note, "...we are concerned about 'RCTomyopia' (belief that clinical action can be justified only by randomized, controlled trials) and 'evidence-based paralysis' (unwillingness to take action without incontrovertible proof from controlled trials)."

The RCT design is expensive and challenging to execute well; it is not feasible for some clinical questions, and its results may not apply to the care of a specific patient (e.g., because patients like that one were excluded from the original trial). Future JMLA cases will discuss potential reasons for the lack of RCT data to address a given clinical situation; retired doc's post is a great lead-in to that discussion.

Related links:
The term "evidence-based paralysis" is also used here and here, and on a few other sites found via this Google search.

Also, via the Evidence-Based Nursing and Midwifery blog: an interview with Dr. Michael Ashby, Director of the Centre for Palliative Care at Melbourne University in Australia (published in HLA NEWS, the national newsletter of Health Libraries Australia), notes the potential for "therapeutic paralysis" when RCT support is demanded for every clinical practice decision.

New: Journal of Medical Case Reports

In line with this month's JMLA case study focus on how case reports can be used to further medical knowledge, BioMed Central announced in December 2006 that it will launch a new title, the Journal of Medical Case Reports.

From the press release:
"Case reports that expand the field of medicine are of interest and value to clinicians. Publishing case reports helps fulfil the need to gather more comprehensive data about individual cases. This potentially valuable resource is currently neglected by any medical journals, so Journal of Medical Case Reports will provide an important outlet for these studies." says Professor Deborah Saltman, BioMed Central's Editorial Director for Medicine.
The journal is currently accepting submissions for its first issue, which is anticipated in early 2007. With a 2000-word limit, the editorial team is seeking original case reports of general interest in medicine, including:
1. Unreported or unusual side effects or adverse interactions involving medications
2. Unexpected or unusual presentations of a disease
3. New associations or variations in disease processes
4. Presentations, diagnoses and/or management of new and emerging diseases
5. An unexpected association between diseases or symptoms
6. An unexpected event in the course of observing or treating a patient
7. Findings that shed new light on the possible pathogenesis of a disease or an adverse effect

Monday, December 04, 2006

World Health Organization recommendations for guideline development

The BioMed Central journal Health Research Policy and Systems is publishing a new series of reviews from the World Health Organization, outlining a WHO subcommittee's recommendations for the components of the practice guideline development process.

From the introductory article by Oxman et al:
In 2005 the World Health Organisation (WHO) asked its Advisory Committee on Health Research (ACHR) for advice on ways in which WHO can improve the use of research evidence in the development of recommendations, including guidelines and policies. The ACHR established the Subcommittee on the Use of Research Evidence (SURE) to collect background documentation and consult widely among WHO staff, international experts and end users of WHO recommendations to inform its advice to WHO. We have prepared a series of reviews of methods that are used in the development of guidelines as part of this background documentation.
The series will eventually include 16 reviews; currently, seven are posted on the journal's site:
1. Guidelines for guidelines
2. Priority setting
3. Group composition and consultation process
4. Managing conflicts of interests
5. Group processes
6. Determining which outcomes are important
7. Deciding what evidence to include

The seventh installment is probably the closest to the decisions we're making in answering the question for this issue's case. The authors' key points on selecting the evidence used to address a guideline topic include:
* The most important type of evidence for informing global recommendations is evidence of the effects of the options (interventions or actions) that are considered in a recommendation. This evidence is essential, but not sufficient for making recommendations about what to do. Other types of required evidence are largely context specific.
* The study designs to be included in a review should be dictated by the interventions and outcomes being considered. A decision about how broad a range of study designs to consider should be made in relationship to the characteristics of the interventions being considered, what evidence is available, and the time and resources available.
* There is uncertainty regarding what study designs to include for some specific types of questions, particularly for questions regarding population interventions, harmful effects and interventions where there is only limited human evidence.
* Decisions about the range of study designs to include should be made explicitly.
* Great caution should be taken to avoid confusing a lack of evidence with evidence of no effect, and to acknowledge uncertainty.
* Expert opinion is not a type of study design and should not be used as evidence. The evidence (experience or observations) that is the basis of expert opinions should be identified and appraised in a systematic and transparent way.
Future installments in the series will address other components relevant to our question, including searching for evidence, synthesizing research, presenting results, and grading evidence.