
Changing Physician Behavior: What Works?

Publication
Article
The American Journal of Managed Care, January 2015
Volume 21
Issue 1

The authors evaluate methods for implementing clinical research findings and guidelines to change physician practice patterns in surgical and general practice.

ABSTRACT

Objectives

There are various interventions for guideline implementation in clinical practice, but the effects of these interventions are generally unclear. We conducted a systematic review to identify effective methods of implementing clinical research findings and clinical guidelines to change physician practice patterns in surgical and general practice.

Study Design

Systematic review of reviews.

Methods

We searched electronic databases (MEDLINE, EMBASE, and PubMed) for systematic reviews published in English that evaluated the effectiveness of different implementation methods. Two reviewers independently assessed eligibility for inclusion and methodological quality, and extracted relevant data.

Results

Fourteen reviews covering a wide range of interventions were identified. The intervention methods used included audit and feedback, computerized decision support systems, continuing medical education, financial incentives, local opinion leaders, marketing, passive dissemination of information, patient-mediated interventions, reminders, and multifaceted interventions. Active approaches, such as academic detailing, led to greater effects than traditional passive approaches. According to the findings of 3 reviews, 71% of the studies included in those reviews showed a positive change in physician behavior when physicians were exposed to active educational methods and multifaceted interventions.

Conclusions

Active forms of continuing medical education and multifaceted interventions were found to be the most effective methods for implementing guidelines into general practice. Additionally, active approaches to changing physician performance were shown to improve practice to a greater extent than traditional passive methods. Further primary research is necessary to evaluate the effectiveness of these methods in a surgical setting.

Am J Manag Care. 2015;21(1):75-84

This paper compares implementation methods that are currently dispersed across the literature and across specialties. Findings show that commonly used passive interventions in practice today (eg, printed educational material) are less effective than active methods (eg, continuing medical education workshops and tailored multifaceted interventions).

  • Our study indicates that practices should focus on implementation of active methods to change physician behavior and limit use of passive dissemination of educational material or formal didactic conferences.
  • Future research should focus on testing and adapting these implementation methods to specific environments, such as surgery.
  • The cost-effectiveness of these interventions should be studied in future research.

There is increasing recognition of the difficulty in translating research evidence and clinical guidelines into practice, which has resulted in the development of many active dissemination and implementation strategies.1 Although there is a substantial amount of primary research evidence concerning the effectiveness of various implementation methods, it is extensively dispersed amongst medical specialties.1

Research evidence should ideally inform the development of clinical practice guidelines (CPGs).2 CPGs are systematically developed and updated, and are evidence-based.3 They provide physicians with a framework for diagnosing, assessing, and treating clinical conditions commonly encountered in practice, and are developed to promote best practices for patient populations. The implementation of these guidelines is important to help improve the quality and consistency of care in clinical situations by changing physician practice patterns.3

The Figure demonstrates the steps involved in implementing research findings into medical practice. After dissemination of CPGs, there are 6 main factors specific to healthcare providers that affect guideline adoption into practice and physician behavior: guideline implementation, characteristics of practice, laws and incentives, patient characteristics/problems, social norms, and knowledge and skills.4,5 Understanding the relative effectiveness of guideline implementation methods is necessary for changing physician behavior for the better and improving patient outcomes.5

Overall, just a small number of reviews unify different methods and evaluate the implementation methods proven most successful in changing physician behavior.2 Moreover, there is a lack of research on intervention methods specific to surgery.1 We have undertaken a systematic review to inform changing physician practice patterns by evaluating methods for implementing clinical research and guidelines. More specifically, we addressed the following research question: in surgical and general practice, through what methods are clinical research results, as well as guidelines, best implemented to change physician practice patterns? A secondary focus of this study was to determine implementation methods shown to be effective in a surgical setting, specifically in orthopedics.

METHODS

Assessment of Eligibility

Reviews fulfilling the following criteria (1-4 and either condition 5 or 6) were eligible for inclusion: 1) topic linking research/guideline to practice, 2) education or other implementation method of guidelines, 3) systematic reviews, 4) English language, 5) all surgery types and postoperative care, and 6) general practice (hospital and private).

Reviews were excluded for the following reasons: 1) published before 1970; 2) guidelines publications; 3) conference and letter reviews; 4) reviews focused on nonsurgical topics (ie, those that are too specialized to be considered general practice, as they only focus on a specific condition [eg, stroke/cardiology, urology, gynecology, pathology/bacterial infection, dentistry, rehab, nursing, palliative care, pharmaceuticals/analgesics, psychiatric, vaccination, and diabetes]); and 5) literature reviews.

Recent changes in medicine and technology have rendered any evidence published before 1970 of limited relevance to current and future practice. Reviews were screened independently in duplicate at the title, abstract, and full-text stages based on the eligibility criteria. All disagreements were resolved by a consensus process that required the reviewers to discuss the rationale for their decisions.

Identification of Reviews

In order to identify appropriate search terms, 2 reviewers (whose anonymity will be maintained) each conducted an independent preliminary electronic search and selected 5 to 6 reviews appropriate to the topic. Key terms were identified from these reviews to develop a broad search strategy.

We searched the electronic databases EMBASE, MEDLINE, and PubMed for relevant articles published prior to October 6, 2012, using a combination of the identified search terms (eAppendix Table 1, available at www.ajmc.com).

Critical Appraisal Score

The methodological quality of each included review was independently graded, using an adapted version of the AMSTAR critical appraisal tool.6 The AMSTAR tool has been shown to have good construct validity.6 A copy of this adapted appraisal score tool can be found in eAppendix Table 2.

For the critical appraisal scores, a “yes” to an assessment question was given a score of 1. The maximum score for the adapted AMSTAR questionnaire was 11, and the minimum score was 0. We specified that a score of 9 or higher indicated high methodological quality, 5 to 8 moderate quality, and 4 or less low quality. The reviewers resolved discrepancies for each item through discussion and re-evaluation of the study methodology until a consensus was reached.
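For illustration, the quality classification described above reduces to a simple threshold rule. A minimal sketch (the function name and boundary handling are ours, not part of the study methodology):

```python
def amstar_quality(score: int) -> str:
    """Map an adapted AMSTAR score (0-11) to the quality
    categories used in this review: >=9 high, 5-8 moderate,
    <=4 low."""
    if not 0 <= score <= 11:
        raise ValueError("adapted AMSTAR scores range from 0 to 11")
    if score >= 9:
        return "high"
    if score >= 5:
        return "moderate"
    return "low"
```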

Assessment of Agreement

Inter-rater agreement for each screening step was assessed using a weighted kappa (κ) statistic. Cohen’s κ values less than 0 were rated as less than chance agreement; 0.01-0.20, slight agreement; 0.21-0.40, fair agreement; 0.41-0.60, moderate agreement; 0.61-0.80, substantial agreement; and greater than 0.80, high agreement.7

The inter-rater agreement for the critical appraisal score was determined using an intraclass correlation coefficient (ICC; 2-way mixed model, single measure). ICC values were interpreted as follows: 0 to 0.2, poor agreement; 0.3 to 0.4, fair agreement; 0.5 to 0.6, moderate agreement; 0.7 to 0.8, strong agreement; and >0.8, high agreement.

All statistical analyses were conducted using SPSS v18.0 (IBM Corp, Armonk, New York).
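The interpretive scale for κ above is a straightforward threshold lookup; a sketch of the mapping as used in this review (function name and handling of boundary values are our own choices):

```python
def kappa_label(k: float) -> str:
    """Interpret a Cohen's weighted kappa value using the
    thresholds adopted in this review."""
    if k < 0:
        return "less than chance agreement"
    if k <= 0.20:
        return "slight agreement"
    if k <= 0.40:
        return "fair agreement"
    if k <= 0.60:
        return "moderate agreement"
    if k <= 0.80:
        return "substantial agreement"
    return "high agreement"
```

Applying this to the reported screening kappas (0.480, 0.726, 0.560) yields moderate, substantial, and moderate agreement, respectively.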

Data Collection and Data Abstraction

Two reviewers independently abstracted data from the full-text articles using a previously piloted data abstraction form. They abstracted information on the form of implementation method, the subject focus group (eg, general physicians, surgeons, medical students), the number of included studies, outcome measures (physician performance outcomes and patient outcomes), the definition of effectiveness, and the most and least effective implementation methods recommended by the review.

After data abstraction, the reviews were grouped based on implementation method: audit and feedback, continuing medical education, other interventions, and comparison of interventions. The “other interventions” category included reviews that analyzed incentives, decision support systems, journals, and printed educational materials (PEMs) as single interventions. The last category consisted of reviews that compared the effectiveness of 2 or more interventions.

Finally, a spectrum of least effective to most effective intervention methods was created to summarize the findings of this paper. To create this spectrum, the relative effectiveness of the different implementation methods was compared by considering the AMSTAR scores of the reviews. Results from studies with high methodological quality were given greater weight than results of studies with lower methodological quality.8

Statistical Analysis

Descriptive statistics were presented for all systematic reviews that reported them. For reviews that compared 2 or more interventions, we reported adjusted relative risk (ARR) and adjusted risk difference (ARD) values if these were included in the review. For reviews assessing a single intervention, we presented the percentage of physicians complying with guidelines or the percentage of studies included in the review that showed a significant change in the outcome measure. Lastly, implementation methods were classified as highly effective, moderately effective, or ineffective based on the descriptions given by the authors of the original reviews.

RESULTS

Included Studies

Our literature search identified 1592 potentially relevant citations, of which 14 reviews were included (eAppendix Figure).

Inter-rater agreement was moderate for the title screening stage (κ = 0.480; 95% CI, 0.439-0.520), substantial for the abstract screening stage (κ = 0.726; 95% CI, 0.661-0.791), and moderate for the full-text screening stage (κ = 0.560; 95% CI, 0.454-0.665).

Study Quality

We judged 5 reviews to be of high methodological quality,9-13 5 reviews to be of moderate quality,14-18 and the remaining 4 reviews to be of low quality (Table 1).2,5,19,20 There was high agreement between reviewers for the critical appraisal score (ICC = 0.946; 95% CI, 0.848-0.974).

Study Characteristics

Various intervention methods were used, including audit and feedback, computerized decision support systems, continuing medical education, financial incentives, local opinion leaders, marketing, passive dissemination of information, patient-mediated interventions, reminders, and multifaceted interventions. These interventions and additional terminology used in this paper are defined in Table 2. Table 1 also summarizes some of the general study findings. Six reviews had qualitative measures and did not define specific parameters for how they identified implementation methods as highly effective, moderately effective, or ineffective.2,5,14,16,18,20

Audit and Feedback

Three reviews discussed the effectiveness of audit and feedback in changing physician performance.10,17,20 The outcome measures were based on patient outcomes (eg, blood pressure) and physician performance (eg, prescribing). Two reviews by Jamtvedt et al (118 studies) found multifaceted interventions that included audit and feedback to be more effective (the ARR of physician compliance ranged from 0.99 to 1.30) than no intervention or audit or feedback alone.10,17 In one review, Jamtvedt et al reported that the ARD of audit and feedback versus no intervention varied from −0.16 (a 16% decrease in compliance) to 0.70 (a 70% increase in compliance) across studies; thus, the results were inconclusive.10 The review by Mugford et al20 showed that feedback alone is most effective when presented close to the time of clinical decision making.
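Before adjustment, relative risk and risk difference are simple functions of the compliance proportions in the intervention and comparison groups; a hypothetical sketch (the function name and example figures are ours, not drawn from the included studies):

```python
def risk_measures(p_intervention: float, p_control: float):
    """Return (relative risk, risk difference) for guideline-compliance
    proportions in intervention and control groups. These are the
    unadjusted analogues of the ARR/ARD reported in the reviews."""
    rr = p_intervention / p_control   # relative risk: ratio of proportions
    rd = p_intervention - p_control   # risk difference: absolute change
    return rr, rd

# Hypothetical example: 65% compliance with audit and feedback
# vs 50% without gives rr ~ 1.3 and rd ~ 0.15 (a 15-point absolute gain).
rr, rd = risk_measures(0.65, 0.50)
```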

Continuing Medical Education

Four reviews discussed the effectiveness of active forms of continuing medical education (CME) in changing physician performance and patient outcomes.11,12,14,15 Interventions were considered effective if they led to improved professional performance (ie, prescribing behavior that complies with evidence-based guidelines) or improved patient health outcomes. The majority of the studies (58%) in a review by Davis et al reported that active forms of CME improved physician performance, with physicians meeting practice objectives over periods ranging from 30 days to 1 year or longer.11 In 77% of studies in this review, internet-based CME, such as webinars and online teaching modules, improved or maintained physician performance.11 Two of the reviews (64 studies) evaluated multifaceted implementation methods that included CME.14,15 Multifaceted activities, which included active interventions such as reminders, patient-mediated interventions, outreach visits, and use of opinion leaders, were shown to be effective in 70% of the included studies.15 See Table 3 for specific intervention examples and recommendations for effective methods. Additionally, learning linked to clinical practice and self-directed multifaceted active educational methods both resulted in improved physician performance.14

Furthermore, reviews showed that formal didactic conferences and passive forms of CME, such as brochures or PEMs, are the least effective methods for change and, at best, create small changes within practice.15,16 Other forms of passive dissemination, such as mailing PEMs to clinicians, were also deemed ineffective in changing physician behavior when used alone.19 One review (32 studies) reported that PEMs generally had a small effect, and in comparison to education workshops, the risk difference (in this case, based on physician outcomes of smoking cessation activities for patients) was +0.5% in favor of PEMs.12 However, PEMs may be effective for raising awareness about a specific behavior change.19 It is important to recognize that these passive approaches represent the most common approaches adopted by various healthcare organizations. To ensure that practice changes, specific strategies to apply research-based recommendations, including active interventions, need to be implemented.19

Other Interventions

Two reviews discussed other intervention methods, including incentives and decision support systems, in changing physician performance and patient outcomes.9,13 In one review13 (42 studies), incentives proved effective in the form of payment for each service (70% of studies in the review), payment for providing care for a patient or specific population (69% of studies), or payment for a change in the activity or quality of care (85% of studies).13 One review (70 studies) found that decision support systems improved patient outcomes when they were activated automatically as part of the clinician workflow.9 Reportedly, these systems significantly improved clinical practice in 68% of studies.9

Comparison of Interventions

Five reviews compared the effectiveness of several intervention methods in changing physician performance and patient outcomes.2,5,16,18,19 Four of the reviews compared multiple intervention methods (including audit and feedback, reminders, and CME) and determined that multifaceted interventions were the most effective in changing physician practice patterns.2,5,16,18 Multifaceted interventions included a combination of active interventions: audit and feedback, reminders, local consensus or marketing, academic outreach, and interactive education. The final review, by Grimshaw et al, concluded on the basis of qualitative data that multifaceted interventions are more likely to be effective than single interventions.19

Additionally, interactive educational methods were identified in 3 reviews as highly effective single interventions for changing physician practice patterns.2,16,18 Interactive educational methods, or active forms of CME, are nondidactic (rather than lecture-based) learning; they focus on facilitating physician discussion and link the educational experience to the physician’s clinical cases.16,18 Reminders (concurrent and automatic) were also recommended due to consistent positive results.5,16,18,19 Conversely, didactic lecture-based CME and passive dissemination of PEMs were identified as weak interventions for changing physician practice patterns.2,5,16,18,19

Taking into account the methodological quality score of the review based on AMSTAR, multifaceted interventions and active forms of CME were rated the most effective implementation methods to change physician behavior for a desired outcome.

DISCUSSION

Various implementation methods are utilized to try to change physician behavior, and implementing the most effective ones is crucial to success. Our findings provide a comparison of relative effectiveness of various interventions, indicating that active forms of CME and multifaceted interventions are the most effective. In general, active approaches to changing physician performance have been shown to improve practice to a greater extent than traditional passive methods. Active CME approaches include academic detailing, outreach programs, and workshops (more details are provided in Table 3).

In addition, clinical decision support systems, reminders, and patient-mediated interventions are noneducational active methods that help physicians change their practice to meet guidelines. In contrast, passive interventions, including passive dissemination of information, PEMs, and didactic formal educational programs, proved useful for creating awareness, but not for changing physician behavior. Several reviews suggest that although less expensive, passive interventions are clearly less effective.21 Overall, the evidence supports implementing active, multifaceted intervention methods to most effectively change physician performance.

Our findings are similar to those of other systematic reviews that covered the implementation of guidelines and research into physician practice. Grimshaw et al22 also noted that multifaceted interventions are more effective than a single intervention; however, they stated that the effect did not increase incrementally with the number of components in a multifaceted intervention. The intervention components must be selected to align with the specific needs of a given clinical practice.23 Furthermore, Grimshaw et al22 reported that the majority of interventions resulted in modest to moderate improvements in care. There is an imperfect evidence base to support decisions about which implementation strategies and forms of guideline dissemination are likely to be most effective under various circumstances. Grimshaw et al22 outline several factors to consider, including potential clinical areas that would benefit from effectiveness activities, the likely benefits and costs required to introduce guidelines, and the likely benefits and costs resulting from any changes in provider behavior.

Strengths and Limitations

This paper has several important strengths. First, the quality of each review was taken into consideration so that our eventual conclusions about the most effective interventions would be compelling. Second, the electronic searches, study selection, quality assessment, and data abstraction were all conducted in duplicate to increase reliability and to reduce the likelihood of expectation and reporting bias in the findings.

Although the results of this study may be relevant to implementing clinical guidelines in practice, some limitations must be noted. First, the included reviews are highly heterogeneous, meaning that the interventions and outcomes differed considerably. For instance, high, moderate, and low effectiveness are defined by different parameters in each review. Indeed, some studies do not report these parameters and only make a qualitative judgment on the effectiveness of interventions. Additionally, many of the included reviews highlight heterogeneity as a limitation to making conclusive recommendations when comparing all techniques that attempt to change physician behavior. We attempted to account for this heterogeneity by comparing results of similar interventions and presenting all review findings individually, as the heterogeneity precluded pooling of the data. Furthermore, a risk of publication bias may exist because only reviews published in English were included. However, attempts were made to include articles from numerous countries that utilized various methodologies (although it must be noted that results from lower-quality studies were given less weight in our evaluation).

Future Directions

The included reviews focused on clinicians generally, so these results may not be generalizable to all specialties, such as surgery. A secondary focus of this study was to determine specific implementation methods proven effective in a surgical setting. However, no studies were identified that examined guideline implementation techniques in surgery, specifically orthopedics. Thus, this is a sparse area of research, and future studies might focus on finding the most effective educational programs for implementing guidelines in surgical practice.

Furthermore, only 1 review covered in this paper provided a cost-effectiveness analysis for CME,18 which concluded that a CME program for physicians on angiotensin-converting enzyme inhibitors cost $2062 per life-year saved.18 Five other reviews recommended that cost-effectiveness be included when analyzing implementation methods.2,5,10,16,19 Grimshaw et al, in a comprehensive review, also noted that the quality of usable data on the financial cost of each implementation method is relatively low in current research.22 Implementing research findings incurs costs, and, in some circumstances, these costs can outweigh the potential benefits.22 Decisions about implementation methods for a clinical practice should therefore consider cost alongside other factors.

Lastly, 1 review included a subset analysis to determine if CME had long-term effects on physician behavior.11 In this review, 50% of the studies showed that CME interventions, including follow-up for a year, resulted in long-term changes to physician performance.11 Other reviews recorded that few studies measured the extended effects of behavior changes,14 although implementation methods with long-term effects are clearly beneficial because they sustain positive change in physicians’ practices. Future research should include a study of long-term follow-up to determine if implementation interventions are enduring.

CONCLUSIONS

Multifaceted implementation methods and active forms of CME are highly effective in changing physician practice, while PEMs were shown to be the least effective. Given the challenges in this systematic review, additional longitudinal reviews, which monitor subjects over an extended period of time to offer clear insights into the long-term effects on practice, as well as studies of the cost-effectiveness of the implementation methods, are required. Additionally, focused research is required to examine guideline implementation methods specific to medical fields such as orthopedic surgery.

Author Affiliations: Clinical Epidemiology and Biostatistics (FM, CR, NS) and the Department of Surgery, Division of Orthopaedic Surgery (MB), McMaster University, Hamilton, ON, Canada.

Source of Funding: No funding was received for this study. Dr Bhandari is funded, in part, by a Canada Research Chair.

Author Disclosures: The authors report no relationship or financial interest with any entity that would pose a conflict of interest with the subject matter of this article.

Authorship Information: Concept and design (FM, CR, MB); acquisition of data (FM, CR, NS); analysis and interpretation of data (FM, CR, NS, MB); drafting of the manuscript (FM, CR, MB); critical revision of the manuscript for important intellectual content (FM, CR, NS, MB); statistical analysis (FM, CR, NS, MB); administrative, technical, or logistic support (NS); supervision (MB).

Address correspondence to: Mohit Bhandari, MD, PhD, FRCSC, 293 Wellington St N, Ste 110, Hamilton, ON L8L 8E7. E-mail: bhandam@mcmaster.ca.

REFERENCES

1. Ceccato NE, Ferris LE, Manuel D, Grimshaw JM. Adopting health behavior change theory throughout the clinical practice guideline process. J Contin Educ Health Prof. 2007;27(4):201-207.

2. Yen BM. Engaging physicians to change practice. J Clin Outcomes Manag. 2006;13(2):103-110.

3. Turner T, Misso M, Harris C, Green S. Development of evidence-based clinical practice guidelines (CPGs): comparing approaches. Implement Sci. 2008;3:45.

4. Smith WR. Evidence for the effectiveness of techniques to change physician behavior. Chest. 2000;118(2)(suppl):8S-17S.

5. Davis DA, Taylor-Vaisey A. Translating guidelines into practice: a systematic review of theoretic concepts, practical experience and research evidence in the adoption of clinical practice guidelines. CMAJ. 1997;157(4):408-416.

6. Shea BJ, Grimshaw JM, Wells GA, et al. Development of AMSTAR: a measurement tool to assess the methodological quality of systematic reviews. BMC Med Res Methodol. 2007;7:10.

7. Cohen J. Weighted kappa: nominal scale agreement with provision for scaled disagreement or partial credit. Psychol Bull. 1968;70(4):213-220.

8. Guyatt G, Rennie D, Meade MO, Cook DJ. Users’ Guides to the Medical Literature: A Manual for Evidence-based Clinical Practice. 2nd ed. New York, NY: McGraw-Hill Education, LLC; 2008.

9. Kawamoto K, Houlihan CA, Balas EA, Lobach DF. Improving clinical practice using clinical decision support systems: a systematic review of trials to identify features critical to success. BMJ. 2005;330(7494):765.

10. Jamtvedt G, Young JM, Kristoffersen DT, O’Brien MA, Oxman AD. Audit and feedback: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2006;(2):CD000259.

11. Davis D, Galbraith R; American College of Chest Physicians Health and Science Policy Committee. Continuing medical education effect on practice performance: effectiveness of continuing medical education: American College of Chest Physicians Evidence-Based Educational Guidelines. Chest. 2009;135(3)(suppl):42S-48S.

12. Farmer AP, Légaré F, Turcot L, et al. Printed educational materials: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2008;(3):CD004398.

13. Flodgren G, Eccles MP, Shepperd S, Scott A, Parmelli E, Beyer FR. An overview of reviews evaluating the effectiveness of financial incentives in changing healthcare professional behaviours and patient outcomes. Cochrane Database Syst Rev. 2011;7:CD009255.

14. Cantillon P, Jones R. Does continuing medical education in general practice make a difference? BMJ. 1999;318(7193):1276-1279.

15. Davis DA, Thomson MA, Oxman AD, Haynes RB. Changing physician performance: a systematic review of the effect of continuing medical education strategies. JAMA. 1995;274(9):700-705.

16. Bero LA, Grilli R, Grimshaw JM, Harvey E, Oxman AD, Thomson MA. Closing the gap between research and practice: an overview of systematic reviews of interventions to promote the implementation of research findings. The Cochrane Effective Practice and Organization of Care Review Group. BMJ. 1998;317(7156):465-468.

17. Jamtvedt G, Young JM, Kristoffersen DT, O’Brien MA, Oxman AD. Does telling people what they have been doing change what they do? a systematic review of the effects of audit and feedback. Qual Saf Health Care. 2006;15(6):433-436.

18. Bloom BS. Effects of continuing medical education on improving physician clinical care and patient health: a review of systematic reviews. Int J Technol Assess Health Care. 2005;21(3):380-385.

19. Grimshaw JM, Eccles MP, Walker AE, Thomas RE. Changing physicians’ behavior: what works and thoughts on getting more things to work. J Contin Educ Health Prof. 2002;22(4):237-243.

20. Mugford M, Banfield P, O’Hanlon M. Effects of feedback of information on clinical practice: a review. BMJ. 1991;303(6799):398-402.

21. NHS Centre for Reviews and Dissemination. Getting evidence into practice. Eff Health Care. 1999;5(1):1-16.

22. Grimshaw JM, Thomas RE, MacLennan G, et al. Effectiveness and efficiency of guideline dissemination and implementation strategies. Health Technol Assess. 2004;8(6):iii-iv,1-72. Review.

23. Haynes B, Haines A. Barriers and bridges to evidence based clinical practice. BMJ. 1998;317(7153):273-276.
