Assessing Medical Home Mechanisms: Certification, Asthma Education, and Outcomes

Publication
Article
The American Journal of Managed Care, March 2018
Volume 24
Issue 3

Using statewide quality data for medical home–eligible clinics, we tested asthma education as a clinical mechanism whereby medical homes achieve better asthma outcomes.

ABSTRACT

Objectives: Patient-centered medical homes (PCMHs) represent a widespread model of healthcare transformation. Despite evidence that PCMHs can improve care quality, the mechanisms by which they improve outcomes are relatively unexamined. We aimed to assess the mechanisms linking certification as a Health Care Home (HCH), a statewide PCMH initiative, with asthma care quality and outcomes. We compared direct certification effects versus indirect clinical effects (via improved care process).

Study Design: This was an observational study using statewide patient-level data on asthma care quality and asthma outcomes.

Methods: This study examined care quality for 296,662 adults and children with asthma in 501 HCH-certified and non-HCH clinics in Minnesota from 2010 to 2013. Using endogenous treatment effects models, we assessed the effects of HCH certification on care process (patient education using asthma action plans [AAPs]) and outcomes (asthma controlled; having no exacerbations) and asthma education’s effect on outcomes. We used logistic regression to formally decompose direct (certification) versus indirect (via education/AAPs) effects.

Results: Adults’ adjusted rates of process and outcomes targets were double for HCH versus non-HCH clinics; children’s rates were also significantly higher for HCHs. Tests of the indirect/care process effect showed that rates of meeting outcomes targets were 7 to 9 times higher with education using an AAP. Decomposition indicated that the indirect effect (via education/AAPs) constituted 16% to 35% of the total HCH effect on outcomes.

Conclusions: HCHs were associated with better asthma care and outcomes. Asthma education with AAPs also was associated with better outcomes despite being a minority of HCHs’ total effect. These findings suggest that HCHs improve outcomes partially via increased care management activity, but also via other mechanisms (eg, electronic health records, registries).

Am J Manag Care. 2018;24(3):e79-e85

Takeaway Points

  • We used statewide quality reporting data in medical home and non–medical home clinics to assess whether education using asthma action plans (AAPs) is a mechanism linking medical home certification to better asthma control and lower risk of exacerbation.
  • Using endogenous treatment effects models, we confirmed that medical homes and education using AAPs are associated with better outcomes. We then decomposed medical home effects via clinical (education/AAP) versus other mechanisms, and we found that patient education using AAPs accounted for 16% to 35% of medical homes’ association with outcomes.
  • Healthcare redesign requires both clinical and nonclinical transformation to improve outcomes. These findings also support patient education using AAPs.

Patient-centered medical homes (PCMHs) represent a type of payment and delivery system reform that emphasizes improved access, primary care–based teams, and care continuity and coordination.1,2 Evidence suggests that medical homes may be associated with higher quality of care and lower resource use and costs.3-11 However, although certain medical home components (including electronic health record [EHR] adoption, team-based care, financial incentives, state support, and others) may improve care,12-14 the mechanisms by which medical homes may result in better outcomes remain untested.

Two mechanisms that might link medical homes to better asthma outcomes are: 1) organizational transformation, including workforce management, team structure, and use of registries or EHRs to track patient populations,12,14,15 and 2) patient-centered care management.16 Although the former might require qualitative methods or provider surveys to assess, care management is observable via certain process measures in secondary data, making it amenable to research in understanding medical homes’ mechanisms.

One condition with evidence of benefit in medical homes is asthma,9,17 which afflicts more than 25 million people in the United States.18 Beyond its symptom burden, the condition is associated with 1.75 million visits to the emergency department (ED) and more than 400,000 hospitalizations annually.19,20 Proper asthma care includes routine follow-up visits and specialty referrals as needed, assessment and self-monitoring, addressing environmental irritants and triggers, pharmacotherapy, and patient-provider partnerships in striving for treatment goals and asthma control.21

Within asthma care, a notable care management tool is asthma education by a clinician, often including an asthma action plan (AAP), which is a written or electronic plan focusing on daily treatment, early detection, crisis management, and treatment of asthma exacerbations. AAPs are a standard of care recommended by several scientific and professional organizations.21 Although highly variable,22,23 AAPs have evidence of benefit in improving provider counseling and patient outcomes, particularly as part of comprehensive asthma education.23-30 As such, education using an AAP serves as a process measure for asthma quality, although AAP use remains low.31,32 Education with an AAP may represent a mechanism of coordinated care and self-management linking medical homes with better outcomes. Yet, medical home–based improvements in asthma outcomes, if present, are likely not solely due to care process via education and AAPs; they also likely reflect organizational and structural changes, including primary care physician–led teams, improved access, and population assessment and patient tracking.12-14

In this study, we used statewide clinic-reported patient-level data on adult and pediatric patients with asthma at primary care clinics certified as Health Care Homes (HCHs), versus those not HCH-certified, under a statewide initiative. Our research questions were: 1) what are the effects of being in HCH versus non-HCH clinics on asthma care quality process (education/AAP) and outcomes measures (having asthma controlled; being free from exacerbations)?; 2) is education with an AAP associated with better asthma outcomes?; and 3) what are the direct (via HCH certification) and indirect (via education/AAP) effects on asthma outcomes, and how do they compare?

METHODS

This was an observational study; clinics self-selected into HCH certification. The study was approved by the University of Minnesota Institutional Review Board.

Health Care Home Initiative

The Minnesota HCH initiative was part of a larger 2008 health reform law.9 Minnesota developed its own HCH certification process for primary care clinics. Certification, conducted by the Minnesota Department of Health, is based on 5 standards: access/communication, participant registry/tracking participant care activity, care coordination, care plan, and performance reporting/quality improvement.

The HCH initiative included payment reform via additional reimbursement for care coordination, based on tiers of patient complexity. However, financial incentives do not appear to be a primary reason for certification: In a 2012 survey, most HCH clinics had not conducted a financial analysis before seeking certification.9,33

Data

Data came from the HCH certification database and MN Community Measurement (MNCM). Under the Statewide Quality Reporting and Measurement System (SQRMS), data are collected by clinics and submitted directly to MNCM, the measures developer and data steward. These data include records about patients from all payers and insurance types, including commercial insurance, Minnesota Health Care Programs (MHCP; Minnesota’s Medicaid/SCHIP and low-income insurance programs), Medicare, self-pay, and uninsured.

SQRMS is a separate initiative from HCH; participation in SQRMS does not mean a clinic will seek HCH certification, and certification does not require high quality scores. Incentives for SQRMS participation are separate from those for HCH (eg, Minnesota’s Quality Incentive Payment System includes payments for meeting quality targets or improvement over time).

Sample

Only primary care clinics that were located in Minnesota and eligible for HCH certification were included. Based on available years and data for quality measures, our sample (depending on the year and measure) consisted of between 64,000 and 80,000 patients in more than 500 HCH and non-HCH clinics reporting data on asthma quality from 2010 to 2013. Data from 2009 were not used for quality measurement (no clinics were HCH-certified in 2009), but they were used to estimate self-selection into HCH certification.

Measures

“HCH certification” is a yearly indicator of whether a patient’s clinic was HCH-certified for at least part of that year (“HCH”) or was an HCH-eligible primary care clinic that either chose not to become certified or had not yet completed certification during the year (“non-HCH”). A clinic could be non-HCH one year and certified the next. Under SQRMS, clinics themselves reported enrollee-level data for their own patients (ie, patients were attributed directly by the clinics that cared for them) and submitted quality measures based on their EHRs.

Asthma quality data were available from 2010 to 2013. They consisted of a process measure and 2 outcomes measures, based on an observation period of July 1 of the previous year to June 30 of the current year. The process measure pertained to having asthma education and an AAP in the patient’s chart at any time during the observation period. The AAP contained information on medication doses and purposes, on recognizing exacerbations and what to do when they occurred, and on patient triggers. For outcomes measures, having “asthma well controlled” came from the most recent asthma control tool during the observation period: for those 17 years and older, the latest Asthma Control Test or Childhood Asthma Control Test score of 20 or greater or the latest Asthma Control Questionnaire score of 0.75 or lower; for those younger than 17 years, the latest Asthma Therapy Assessment Questionnaire score of 0. “Not having elevated risk of exacerbation” indicated having fewer than 2 inpatient hospitalizations or ED visits summed across the observation period.
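To make these definitions concrete, the sketch below shows in Python how the 2 outcome flags could be computed from per-patient fields; it is illustrative only, and the field names (age, act_score, acq_score, ataq_score, hospitalizations, ed_visits) are assumptions rather than the SQRMS/MNCM specification.

```python
# Illustrative only: hypothetical field names, not the SQRMS/MNCM implementation.

def asthma_well_controlled(age, act_score=None, acq_score=None, ataq_score=None):
    """Flag "asthma well controlled" from the most recent asthma control tool."""
    if age >= 17:
        # Latest ACT/Childhood ACT score of 20 or greater, or ACQ of 0.75 or lower
        if act_score is not None and act_score >= 20:
            return True
        return acq_score is not None and acq_score <= 0.75
    # Younger than 17 years: latest ATAQ score of 0
    return ataq_score == 0

def not_elevated_exacerbation_risk(hospitalizations, ed_visits):
    """Fewer than 2 inpatient hospitalizations or ED visits over the period."""
    return (hospitalizations + ed_visits) < 2

# Example usage
print(asthma_well_controlled(age=34, act_score=21))                     # True
print(not_elevated_exacerbation_risk(hospitalizations=1, ed_visits=1))  # False
```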

Patient covariates included sex, insurance (commercial, Medicare, MHCP, self-pay/uninsured), distance to clinic, and child/adult status. Clinic characteristics, including rurality, came from the HCH certification database. For rurality, we classified zip codes based on rural-urban commuting area codes34: urban/metropolitan, micropolitan, small town, and rural.

Analysis

We analyzed the total, direct, and indirect effects of HCHs and AAPs using the endogenous treatment effects (eteffects) procedure in Stata 14 (StataCorp LP; College Station, Texas)35 to estimate direct and indirect effects and the ldecomp procedure in Stata36 to formally decompose the total, direct, and indirect effects of HCHs and AAPs.

Eteffects models use a probit model for treatment (treatment model), whose residuals are then used in models predicting outcomes (outcomes models) for treated and nontreated groups. This was done to help address endogeneity and ensure that there were comparable observations between “treated” and “nontreated” groups (the overlap condition). We employed 2 sets of eteffects models. The probit treatment models, where treatments were 1) the direct effect (HCH as treatment) and 2) the indirect effect (education/AAP as treatment), included clinic rurality, square root of patient volume, clinic membership in a medical group, and a selection bias correction term (see below). Outcomes models for both were probit models predicting asthma outcomes (not being at risk of exacerbation and having asthma well controlled), controlling for patient sex, insurance, and distance to clinic (same zip code, <10 miles, 10-20 miles, >20 miles).
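As a rough Python illustration of this two-step logic (a simplified control-function sketch under assumptions, not the authors' code and not Stata's exact eteffects estimator, which uses generalized residuals), the example below fits a probit treatment model, carries its residual into separate outcome probits for treated and untreated patients, and averages each model's predictions over the full sample to obtain potential-outcome means; the data file and all variable names are hypothetical.

```python
# Simplified control-function sketch; hypothetical file and variable names.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("asthma_quality.csv")  # hypothetical patient-level extract

# 1) Treatment model: probit of HCH certification on clinic-level covariates
#    (rurality, square root of volume, medical group membership, selection term).
Z = sm.add_constant(df[["rural", "sqrt_volume", "medical_group", "mills_lambda"]])
treat = sm.Probit(df["hch"], Z).fit(disp=False)
df["cf_resid"] = df["hch"] - treat.predict(Z)  # simple residual as control function

# 2) Outcome models: separate probits by treatment arm, patient covariates plus residual.
x_cols = ["female", "medicare", "mhcp", "selfpay",
          "dist_lt10", "dist_10_20", "dist_gt20", "cf_resid"]
fits = {}
for arm in (0, 1):
    sub = df[df["hch"] == arm]
    fits[arm] = sm.Probit(sub["asthma_controlled"],
                          sm.add_constant(sub[x_cols])).fit(disp=False)

# 3) Potential-outcome means: predict each arm's model for everyone and average.
X_all = sm.add_constant(df[x_cols])
po_means = {arm: float(fits[arm].predict(X_all).mean()) for arm in (0, 1)}
print(po_means)  # adjusted probabilities under non-HCH (0) versus HCH (1)
```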

Treatment models for the eteffects procedure employed an inverse Mills ratio to adjust for the fact that clinics self-selected into applying for HCH certification.37 This selection term was calculated yearly, because certification, variables associated with certification, and their effects on certification could vary across years. The selection term was calculated from a model estimating the probability, each year, that a clinic would be HCH-certified during the year, as regressed on clinic-level measures from the prior year, including average rates of meeting diabetes care quality and vascular care quality targets for each clinic, number of providers, number of patients, membership in a medical group with 10 or more clinics, and proportion of patients insured by MHCP, Medicare, or uninsured/self-pay. The selection term was imputed for clinics that had not participated in quality reporting in the prior year via regression-adjusted imputation, with an imputed-lambda indicator included in the models.
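A minimal sketch of the yearly selection term, assuming hypothetical clinic-level variable names: it fits a probit of certification in a given year on the prior year's clinic measures and returns the inverse Mills ratio, lambda = phi(xb)/Phi(xb), for inclusion in the treatment models.

```python
# Illustrative only; variable names are assumptions, not the study's data dictionary.
import numpy as np
from scipy.stats import norm
import statsmodels.api as sm

def yearly_selection_term(clinics, year):
    """Probit of HCH certification in `year` on prior-year clinic measures,
    returning the inverse Mills ratio lambda = phi(xb) / Phi(xb) per clinic."""
    d = clinics[clinics["year"] == year]
    X = sm.add_constant(d[["diabetes_quality_lag", "vascular_quality_lag",
                           "n_providers_lag", "n_patients_lag", "large_group_lag",
                           "pct_mhcp_lag", "pct_medicare_lag", "pct_selfpay_lag"]])
    probit = sm.Probit(d["hch_certified"], X).fit(disp=False)
    xb = X.to_numpy() @ probit.params.to_numpy()   # linear index
    return norm.pdf(xb) / norm.cdf(xb)             # one lambda per clinic-year
```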

Based on the eteffects models, we present adjusted potential outcomes means for probabilities of, separately, HCH (direct) and AAP (indirect) effects on asthma outcomes. To formally decompose the direct and indirect (via AAPs) effects of HCH on asthma outcomes, we applied the ldecomp procedure in Stata,36 used for decomposing such effects of categorical variables in logistic regression models, with standard errors obtained via the bootstrap method. Based on this analysis, we present computed odds ratios (ORs) with 95% CIs, as well as log ORs (where direct and indirect effects sum to total effect), allowing us to present direct/indirect effects as percentages of the total effect.
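As a small arithmetic illustration of how the percentages are derived: on the log odds scale the direct and indirect components sum to the total effect, so each component's share is its log OR divided by the total log OR. The numbers below are placeholders, not estimates from this study.

```python
import math

# Hypothetical odds ratios for the direct (HCH) and indirect (via AAP) paths
or_direct, or_indirect = 3.0, 1.4

log_total = math.log(or_direct) + math.log(or_indirect)  # total effect, log OR scale
share_indirect = math.log(or_indirect) / log_total
share_direct = math.log(or_direct) / log_total

print(f"indirect share = {share_indirect:.0%}, direct share = {share_direct:.0%}")
# -> indirect share = 23%, direct share = 77%
```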

RESULTS

Table 1 displays unadjusted patient characteristics at HCH and non-HCH clinics by adult/child status. Because HCH status could change (clinics could be non-HCH one year, then HCH the next), numbers represent the share of person-years in each category. Generally, adults and children in HCHs were seen at larger clinics and had higher rates of meeting asthma targets, public insurance or self-pay, urban residence, and residence located 20 or fewer miles from the clinic (but not in the same zip code).

Being served by an HCH clinic was significantly positively associated with the process measure (asthma education/AAP) and with the outcomes measures (free from exacerbation; having asthma controlled) based on potential outcomes means from eteffects models (Table 2). Among adults, estimates for HCHs were often double those of non-HCH clinics (eg, the rate with an AAP was 66.4% [95% CI, 65.9%-67.0%] when seen in an HCH versus 31.8% [95% CI, 31.3%-32.2%] for non-HCH clinics), with similar rates and magnitudes of difference for the outcomes measures. For children, rates were generally higher, and less disparate between HCH and non-HCH clinics, but still diverged significantly: rates of AAPs and positive outcomes were approximately 14 to 16 percentage points higher for HCH than non-HCH clinics.

Assessment of the “indirect” path showed a strong association of AAPs with positive outcomes measures (potential outcomes means from eteffects models) (Table 3). Rates of no exacerbation and having asthma controlled were between 8.2% and 13.6% with no AAP in place (95% CIs, 8.0%-8.4% and 13.4%-13.7%, respectively) and between 76.9% and 94.6% with an AAP in place (95% CIs, 76.3%-77.6% and 94.5%-94.8%, respectively). When divided by AAP, rates were slightly higher for adults but were generally very close within clinic type (HCH, non-HCH).

We also assessed the possible mechanisms linking HCHs with asthma outcomes by decomposing the total, direct (HCH only), and indirect (via AAPs) associations in a series of models (Table 4). Although ORs are presented, we also present the log ORs and the calculated percent of the total effect attributable to direct and indirect effects in the far right column. For adults, the indirect effect (via AAPs) accounted for 16% of the total association with not having exacerbations and 34% with having asthma controlled. Conversely, 84% and 66% were accounted for by the direct HCH effect, respectively. For children, the indirect effect (via AAPs) accounted for 35% of the total effect on having no exacerbation and 28% of the total effect on having asthma controlled (with the direct HCH effect accounting for 65% and 72% of the total effect, respectively).

DISCUSSION

Our findings provide evidence of improved asthma care quality process and outcomes in HCHs versus eligible but noncertified primary care clinics. Moreover, they support a better understanding of the relationships between those variables: Improved process in clinical care management (via patient education with AAP) is associated with better outcomes and represents a modest, but relatively consistent, mechanism that links being in an HCH-certified clinic with better outcomes. The direct HCH effect may be associated with the other HCH mechanisms, such as patient registries and health records, clinical decision support, and patient portals.15,38

The findings can also be interpreted as describing structural and clinical aspects of PCMHs.39 Structural aspects include registries, care plans, and coordination meetings ensuring that providers have current and correct information about patients for better delivery of clinical care, whereas AAPs involve patient care to facilitate behavioral change. The PCMH direct effect may represent structural factors, whereas the indirect effect may represent clinical factors. Further research on how the quality of implementation of PCMH structures affects the functioning of PCMH clinical components would be useful; there are likely multiple aspects of transformation (structural, clinical, or otherwise)40 involved in improving outcomes. Other care models and settings that integrate additional elements (eg, accountable care organizations) are likely more complex; intersectoral, financial, and other mechanisms may be required to fully understand outcomes.

These findings may be understood in a broader context of care management. Unlike many process measures, such as glycated hemoglobin testing or cancer screening, asthma education and AAPs imply discussion or care planning and care management, rather than laboratory testing. Similar measures for other conditions (documentation of decision aids for tests or medications, diabetes self-management education, medication reconciliation in multimorbidity, or other evidence-based tools in care management) may deserve further attention in quality measurement and reporting.

Other notable findings include the fact that the HCH effect on asthma care process and outcomes was fairly large. This is likely a function of comparing the HCH effect with usual care (non-HCH clinics). The more that usual care for a condition has been a focus of clinical practice improvement and public policy, the smaller the HCH–usual care difference is likely to be.41 Because asthma became a focus of quality measurement and public reporting concurrently with HCH development, the HCH–usual care gap is likely to be large. The large HCH effect can then be interpreted as HCHs being quicker to adopt evidence-based asthma practices than non-HCHs. In the long run, however, the HCH effect is likely to attenuate as all clinics become more likely to adopt practices like registries and AAPs. After the initial improvement in care, rates of improvement will likely decrease as best practices are more widely implemented.42

Our results are consistent with evidence supporting better asthma outcomes with asthma education and AAPs,23,25,28,30 but they also add to this literature in at least 2 ways. First, we assessed process and outcome measures in asthma among adults and children separately, and we found similar but not identical results, with differences varying across assessment of direct and indirect associations. Second, compared with prior studies using trial designs, we employed data from a statewide policy initiative; our large observational approach allowed for a larger sample size and a potentially better approximation of effectiveness across settings. This was made possible by the statewide patient-level SQRMS quality measures; because there is no procedural code for AAPs, other large analyses of administrative claims data are largely precluded. This unique capability to assess statewide, yet patient-level, trends in care quality speaks to the strength of our data and the usefulness of our analysis. Moreover, we employed robust methods to address selection bias and endogeneity.

Limitations

This study had limitations. Most notably, as a nonrandomized comparison, unaddressed selection bias or confounding due to unobservable factors may remain. Moreover, factors affecting certification changed over time. Thus, propensity score matching would require scoring, matching, and comparing on a yearly basis. We used the inverse Mills ratio within eteffects models across years, but residual confounding remains a threat to validity.

Our analyses also excluded potentially relevant covariates, such as patient comorbidity or clinic case mix; this may have resulted in unobserved confounding. Additionally, some patients might be attributed by 2 or more clinics in SQRMS. However, persons seen at multiple clinics within a system should still have had complete data submitted, and we are not aware of evidence or theory suggesting that patients might be systematically seen at 1 type of clinic but meet quality targets at another. Although a separate initiative, SQRMS participation may be associated with greater propensity for certification, affecting generalizability.

Regarding clinical interpretation, the specific type and format of AAPs used can vary: Although reporting required specific pieces of information in the plans, as well as education, fidelity in practice is likely less than perfect. These variations may have implications for the effectiveness of education with AAPs and their association with outcomes, but we were not able to assess this with our data.

An additional issue was that although the comparison of direct HCH-only effects with indirect HCH effects (through education/AAPs) provided some information about the care systems/clinical practice distinction, we did not further decompose the direct effect of HCHs across care systems improvement, improved registries, and other HCH features. Work elsewhere suggests that these components are not equal in shaping benefits of medical homes,12,14 so further understanding of what composes the direct effect would be useful in fully understanding that mechanism.

A final issue was the temporal ordering of asthma care measures: Asthma education/AAP happened at any time during each observation year (July 1 of the prior year to June 30 of the observation year); risk of exacerbation represents a reported count of acute use for that same year. As such, the causal relationship between these 2 measures could be more complex than specified here. However, we are encouraged by the fact that findings were similar for having asthma controlled (which was based on the most recent visit and thus more likely to happen after education/AAP). Although this similarity lends some support for the exacerbation finding, we cannot guarantee a single causal direction.

Despite its limitations, this study adds to the literatures on HCHs, AAPs, and potentially care management in establishing associations between payment and care delivery reform, quality process of care measures, and asthma outcomes.

CONCLUSIONS

HCH certification was associated with greater likelihood of receiving education/AAP and positive asthma outcomes, and AAPs were themselves associated with better asthma outcomes. The decomposition of their effects indicated that AAPs accounted for a small but robust portion of the total HCH effect on outcomes. HCHs may offer better care management via AAPs and similar processes, and these processes may be responsible for better outcomes for asthma and other chronic conditions.

Acknowledgments

This research was supported by a contract from the Minnesota Department of Health to evaluate Minnesota’s Health Care Home Initiative. The authors gratefully acknowledge assistance by the Minnesota Department of Health, Minnesota Department of Human Services, MN Community Measurement, and Robert Kreiger.

Author Affiliations: Division of Health Policy and Management, University of Minnesota (NDS, DRW), Minneapolis, MN; Children’s Hospital and Clinic - Minneapolis (MF), Minneapolis, MN.

Source of Funding: This research was supported by a contract from the Minnesota Department of Health to evaluate Minnesota’s Health Care Home Initiative.

Author Disclosures: The authors report no relationship or financial interest with any entity that would pose a conflict of interest with the subject matter of this article.

Authorship Information: Concept and design (NDS, DRW); acquisition of data (MF, DRW); analysis and interpretation of data (NDS, DRW); drafting of the manuscript (NDS); critical revision of the manuscript for important intellectual content (NDS, MF, DRW); statistical analysis (NDS, DRW); obtaining funding (MF, DRW); and supervision (DRW).

Address Correspondence to: Nathan D. Shippee, PhD, Division of Health Policy and Management, University of Minnesota, 420 Delaware St SE, Minneapolis, MN 55455. Email: nshippee@umn.edu.

REFERENCES

1. Stange KC, Nutting PA, Miller WL, et al. Defining and measuring the patient-centered medical home. J Gen Intern Med. 2010;25(6):601-612. doi: 10.1007/s11606-010-1291-3.

2. Primary Care Medical Home. The Joint Commission website. jointcommission.org/accreditation/pchi.aspx. Published 2014. Accessed July 8, 2014.

3. DeVries A, Li CH, Sridhar G, Hummel JR, Breidbart S, Barron JJ. Impact of medical homes on quality, healthcare utilization, and costs. Am J Manag Care. 2012;18(9):534-544.

4. Nelson KM, Helfrich C, Sun H, et al. Implementation of the patient-centered medical home in the Veterans Health Administration: associations with patient satisfaction, quality of care, staff burnout, and hospital and emergency department use. JAMA Intern Med. 2014;174(8):1350-1358. doi: 10.1001/jamainternmed.2014.2488.

5. Nielsen M, Olayiwola JN, Grundy P, Grumbach K. The Patient-Centered Medical Home’s Impact on Cost & Quality: An Annual Update of the Evidence, 2012-2013. Patient-Centered Primary Care Collaborative website. pcpcc.org/resource/medical-homes-impact-cost-quality. Published January 2014. Accessed February 8, 2018.

6. Paustian ML, Alexander JA, El Reda DK, Wise CG, Green LA, Fetters MD. Partial and incremental PCMH practice transformation: implications for quality and costs. Health Serv Res. 2014;49(1):52-74. doi: 10.1111/1475-6773.12085.

7. Stevens GD, Shi L, Vane C, Peters AL. Do experiences consistent with a medical-home model improve diabetes care measures reported by adult Medicaid patients? Diabetes Care. 2014;37(9):2565-2571. doi: 10.2337/dc14-0440.

8. Garcia-Huidobro D, Shippee N, Joseph-DiCaprio J, O’Brien JM, Svetaz MV. Effect of patient-centered medical home on preventive services for adolescents and young adults. Pediatrics. 2016;137(6):e20153813. doi: 10.1542/peds.2015-3813.

9. Wholey DR, Finch M, White KM, et al. Evaluation of Health Care Homes: 2010-2012. Report to the Minnesota Legislature. Minnesota Department of Health website. health.state.mn.us/divs/opa/2014hchevalrpt.pdf. Published January 2014. Accessed February 8, 2018.

10. Higgins S, Chawla R, Colombo C, Snyder R, Nigam S. Medical homes and cost and utilization among high-risk patients. Am J Manag Care. 2014;20(3):e61-e71.

11. Neal J, Chawla R, Colombo CM, Snyder RL, Nigam S. Medical homes: cost effects of utilization by chronically ill patients. Am J Manag Care. 2015;21(1):e51-e61.

12. Gao Y, Nocon RS, Gunter KE, et al. Characteristics associated with patient-centered medical home capability in health centers: a cross-sectional analysis. J Gen Intern Med. 2016;31(9):1041-1051. doi: 10.1007/s11606-016-3729-8.

13. Kern LM, Edwards A, Kaushal R. The patient-centered medical home, electronic health records, and quality of care. Ann Intern Med. 2014;160(11):741-749. doi: 10.7326/M13-1798.

14. Marcus Thygeson N, Solberg LI, Asche SE, Fontaine P, Gregory Pawlson L, Scholle SH. Using fuzzy set qualitative comparative analysis (fs/QCA) to explore the relationship between medical “homeness” and quality. Health Serv Res. 2012;47(1 pt 1):22-45. doi: 10.1111/j.1475-6773.2011.01303.x.

15. King J, Patel V, Jamoom E, DesRoches C. The role of health IT and delivery system reform in facilitating advanced care delivery. Am J Manag Care. 2016;22(4):258-265.

16. Meterko M, Wright S, Lin H, Lowy E, Cleary PD. Mortality among patients with acute myocardial infarction: the influences of patient-centered care and evidence-based medicine. Health Serv Res. 2010;45(5 pt 1):1188-1204. doi: 10.1111/j.1475-6773.2010.01138.x.

17. Driscoll DL, Hiratsuka V, Johnston JM, et al. Process and outcomes of patient-centered medical care with Alaska Native people at Southcentral Foundation. Ann Fam Med. 2013;11(suppl 1):S41-S49. doi: 10.1370/afm.1474.

18. Moorman JE, Akinbami LJ, Bailey CM, et al. National surveillance of asthma: United States, 2001-2010. Vital Health Stat 3. 2012;(35):1-58.

19. Mehal JM, Holman RC, Steiner CA, Bartholomew ML, Singleton RJ. Epidemiology of asthma hospitalizations among American Indian and Alaska Native people and the general United States population. Chest. 2014;146(3):624-632. doi: 10.1378/chest.14-0183.

20. Akinbami LJ, Moorman JE, Liu X. Asthma prevalence, health care use, and mortality: United States, 2005-2009. Natl Health Stat Report. 2011;(32):1-14.

21. Williams SG, Schmidt DK, Redd SC, Storms W; National Asthma Education and Prevention Plan. Key clinical activities for quality asthma care: recommendations of the National Asthma Education and Prevention Program. MMWR Recomm Rep. 2003;52(RR-6):1-8.

22. Gupta S, Wan FT, Ducharme FM, Chignell MH, Lougheed MD, Straus SE. Asthma action plans are highly variable and do not conform to best visual design practices. Ann Allergy Asthma Immunol. 2012;108(4):260-265.e2. doi: 10.1016/j.anai.2012.01.018.

23. Gibson PG, Powell H. Written action plans for asthma: an evidence-based review of the key components. Thorax. 2004;59(2):94-99.

24. Patel MR, Valerio MA, Sanders G, Thomas LJ, Clark NM. Asthma action plans and patient satisfaction among women with asthma. Chest. 2012;142(5):1143-1149. doi: 10.1378/chest.11-1700.

25. Kessler KR. Relationship between the use of asthma action plans and asthma exacerbations in children with asthma: a systematic review. J Asthma Allergy Educ. 2011;2(1):11-21.

26. Agrawal SK, Singh M, Mathew JL, Malhi P. Efficacy of an individualized written home-management plan in the control of moderate persistent asthma: a randomized, controlled trial. Acta Paediatr. 2005;94(12):1742-1746. doi: 10.1080/08035250510039973.

27. Fassl BA, Nkoy FL, Stone BL, et al. The Joint Commission Children’s Asthma Care quality measures and asthma readmissions. Pediatrics. 2012;130(3):482-491. doi: 10.1542/peds.2011-3318.

28. Gibson PG, Powell H, Coughlan J, et al. Self-management education and regular practitioner review for adults with asthma. Cochrane Database Syst Rev. 2003;(1):CD001117. doi: 10.1002/14651858.CD001117.

29. Yin HS, Gupta RS, Tomopoulos S, et al. A low-literacy asthma action plan to improve provider asthma counseling: a randomized study. Pediatrics. 2016;137(1):e20150468. doi: 10.1542/peds.2015-0468.

30. Kuhn L, Reeves K, Taylor Y, et al. Planning for action: the impact of an asthma action plan decision support tool integrated into an electronic health record (EHR) at a large health care system. J Am Board Fam Med. 2015;28(3):382-393. doi: 10.3122/jabfm.2015.03.140248.

31. Morse RB, Hall M, Fieldston ES, et al. Hospital-level compliance with asthma care quality measures at children’s hospitals and subsequent asthma-related outcomes. JAMA. 2011;306(13):1454-1460. doi: 10.1001/jama.2011.1385.

32. Peters SP, Jones CA, Haselkorn T, Mink DR, Valacer DJ, Weiss ST. Real-world Evaluation of Asthma Control and Treatment (REACT): findings from a national web-based survey. J Allergy Clin Immunol. 2007;119(6):1454-1461. doi: 10.1016/j.jaci.2007.03.022.

33. Wholey D, Finch M, Shippee ND, et al. Evaluation of the State of Minnesota’s Health Care Homes Initiative: Evaluation Report for Years 2010-2014. Minnesota Department of Health website. health.state.mn.us/healthreform/homes/legreport/docs/hch2016report.pdf. Published December 2015. Accessed February 8, 2018.

34. Rural-urban commuting area codes. Economic Research Service website. ers.usda.gov/data-products/rural-urban-commuting-area-codes.aspx. Published 2010. Updated October 12, 2016. Accessed March 14, 2017.

35. StataCorp. Stata Release 14 Treatment-Effects Reference Manual: Potential Outcomes/Counterfactual Outcomes. College Station, TX: StataCorp LP; 2015.

36. Buis ML. Direct and indirect effects in a logit model. Stata J. 2010;10(1):11-29.

37. Heckman JJ. Sample selection bias as a specification error. Econometrica. 1979;47(1):153-161. doi: 10.2307/1912352.

38. Soderberg K, Rajamani S, Wholey DR, LaVenture M. Health reform in Minnesota: an analysis of complementary initiatives implementing electronic health record technology and care coordination. Online J Public Health Inform. 2016;8(3):e204. doi: 10.5210/ojphi.v8i3.7094.

39. Bond GR, Salyers MP, Rollins AL, Moser LL. Future Developments in Assertive Community Treatment. Indianapolis, IN: ACT Center of Indiana, Indiana University–Purdue University Indianapolis; 2008.

40. Solimeo SL, Hein M, Paez M, Ono S, Lampman M, Stewart GL. Medical homes require more than an EMR and aligned incentives. Am J Manag Care. 2013;19(2):132-140.

41. Song Y. Two randomized screening trials with prostate cancer mortality, two interim results: a consideration of usual-care window in screening trials. Contemp Clin Trials. 2010;31(4):378-380. doi: 10.1016/j.cct.2010.04.005.

42. Campbell SM, Reeves D, Kontopantelis E, Sibbald B, Roland M. Effects of pay for performance on the quality of primary care in England. N Engl J Med. 2009;361(4):368-378. doi: 10.1056/NEJMsa0807651.
