Objectives:
Although the Patient-Centered Medical Home (PCMH) model is being implemented across the country to transform primary care, it is not yet clear whether this model actually improves patients’ experiences with healthcare. Our objective was to measure patients’ experiences over time in practices that transformed into PCMHs.
Study Design:
We conducted a prospective study, using 2 serial cross-sectional samples, in a multipayer community.
Methods:
We surveyed 715 patients: 346 at baseline, when practices had just completed transformation, and 369 at follow-up, a median of 15 months later. These patients received care from 120 primary care providers at 10 ambulatory practices (20 sites) that achieved Level III PCMH recognition, as defined by the National Committee for Quality Assurance. We measured patient experience, as defined by the 7 domains of the Clinician and Group Consumer Assessment of Healthcare Providers and Systems (CG-CAHPS) Adult Primary Care Questionnaire.
Results:
Patients’ self-reported experience with access to care improved significantly over time, with 61% of respondents giving access to care the highest rating at baseline versus 69% at follow-up (P = .02). There were no significant changes over time for the other domains.
Conclusions:
The PCMH was associated with improvements in patients’ experience with access to care but not other domains of care. This study, which took place in a multipayer community, is one of the first to find a positive effect of the PCMH on patient experience.
Am J Manag Care. 2013;19(5):403-410
Understanding patients’ experience of healthcare is critically important.1 If patients are not satisfied and engaged with their healthcare providers, then healthcare is unlikely to be successful in improving health. Aspects of patients’ experience of healthcare include their perceptions of: the quality of their relationships with providers, the quality of disease management, access to care, and communication with office staff.2
Efforts are under way across the country to improve healthcare, in part through improving patients’ experience.3 One of the most common approaches, with more than 100 demonstration projects under way, is the Patient-Centered Medical Home (PCMH).4 The PCMH is being promoted as a strategy for improving patient-centeredness, improving quality, and decreasing cost.3 This model is also being used to test alternative reimbursement strategies for primary care.5,6
The PCMH is a set of practice standards that emphasize coordination of care and management of chronic disease over time.2 PCMH standards developed by the National Committee for Quality Assurance (NCQA) include processes of care for: optimizing access to care and patient communication, patient registries, care management, patient self-management support, electronic prescribing, tracking of tests and referrals, and performance reporting.2 Practices receive NCQA recognition when they have implemented a specified number and pattern of processes of care, with different levels of recognition depending on the number and pattern implemented.2
Whether these processes translate into better experiences for patients is not yet clear, in part because evaluations of most demonstration projects are still unfolding.4,7 A few studies have considered the effects of the PCMH model on patient experience over time, but they were conducted in specific populations or settings that limit generalizability. A recent systematic review found a small positive effect of the PCMH on patient satisfaction; however, this finding was driven mostly by studies in pediatric and geriatric populations.8 Among studies in general adult populations, 1 previous study found a positive effect9,10 and another found no effect.11 The first of these studies took place in an integrated delivery system that provided both the organization and financing of healthcare,9,10 while the other took place in multiple communities around the country, coordinated by a national organization.11
We sought to measure patients’ experiences over time in primary care practices serving general adult populations that transformed into PCMHs in a multipayer community12 with extensive locally driven quality improvement efforts.
METHODS
Overview
We conducted a prospective study, using 2 serial cross-sectional samples of patients receiving primary care in the Hudson Valley region of New York. The Institutional Review Board of Weill Cornell Medical College approved the study. In a separate paper, we describe baseline patient experience compared with national benchmarks.13 In this paper, we measure whether patient experience changes over time, as practices gain experience with the PCMH.
Setting
The 7 counties of the Hudson Valley are located immediately north of New York City. Physicians in this region provide healthcare with fee-for-service reimbursement from multiple payers. The average practice size is 4 physicians.14
This study took place in the context of an initiative led by THINC,15 a nonprofit organization that convened 6 health plans and 1 large employer to provide financial incentives for physicians to implement the PCMH model. Financial incentives ranged from $2.00 to $10.00 per member per month for achieving PCMH Level III, as defined by the National Committee for Quality Assurance’s (NCQA’s) 2008 criteria.16
PCMH transformation took place at 12 adult primary care practices and 1 pediatric primary care practice. The physicians in these practices are members of the Taconic Independent Practice Association (IPA).17 Practices were assisted in their transformation by the Taconic IPA, as well as by 2 external consulting groups. The lead physicians from each practice met at least monthly as a Medical Council to coordinate their efforts and share best practices. Practice transformation consisted of systematically reviewing the NCQA tool, documenting PCMH processes that were already in place, and selecting and implementing additional processes that were of interest to the practice. Practices were permitted to vary in which aspects of the PCMH they implemented.
Practice-based needs assessments began in January 2009, and actual transformation began in March 2009. All practices submitted their applications to NCQA and were awarded Level III recognition (the highest of 3 levels). The median submission date was December 2009 (range August 2009-January 2010).
Sampling and Recruitment
We excluded the 1 pediatric practice that had undergone transformation, because the patient experience tool (described below) was not applicable to a pediatric population. We excluded 2 adult solo practices that delayed their medical home implementation, leaving 10 practices in total (20 sites).
Recruitment took place in the waiting rooms of the practices, because this was viewed by the Medical Council as the most patient-centered strategy. (The alternative, providing the research team with patients’ contact information without patients’ explicit approval to release such information, was not viewed as patient-centered.) We prepared 1-page information sheets, which had English and Spanish versions on each side and invited patients to participate. We provided each practice with the same fixed number of information sheets, which practice staff then distributed to consecutive patients. Patients who opted to participate provided their own contact information and the name of their primary care doctor to confirm receipt of primary care at one of the participating practices. The information sheets described the study broadly as a patient experience survey and did not name the PCMH per se. Patients were offered a $5 incentive for participation.
Baseline patient experience data were collected from November 2009 to February 2010. Follow-up data were collected (in a separate sample of patients) from February 2011 to August 2011. The median duration between rounds of the survey was 15 months. We sampled patients at 2 time points, because we sought to include patients with recent visits to their primary care physicians, in order to avoid recall bias.
Measurement of Patient Experience
We measured patient experience, which is a measure of patient-centeredness that is broader than patient satisfaction and includes reports from patients on what they did or did not experience in their interactions with the healthcare system.18 We based our survey tool on the 2007 Clinician & Group Consumer Assessment of Healthcare Providers and Systems (CG-CAHPS) Adult Primary Care Questionnaire.19 We included the 35 questions from CG-CAHPS and added 14 questions to address concepts included in the PCMH model that were not explicitly covered in the CG-CAHPS. These 14 questions were derived from questions in the Ambulatory Care Experiences Survey (ACES),20 the American College of Physicians Center for Practice Innovation Clinician and Staff Survey (unpublished), the Commonwealth Fund Quality of Health Care Patient Survey,21 and the Commonwealth Fund International Health Policy Survey.22 The final survey contained 49 questions.
Survey Administration
Telephone surveys were administered by the Cornell Survey Research Institute (SRI). Cornell SRI attempted to contact each patient up to 5 times. If they did not reach a patient after 5 attempts, the patient was removed from the potential respondent pool. Outreach to patients was stratified by practice; data collection stopped for a given practice if the target number of completed surveys (N = 40) was reached.
Analysis
We used descriptive statistics to characterize the patients in the sample. We compared the baseline and follow-up samples, using χ2 or Fisher’s exact tests.
We applied analytical guidelines published by the national CG-CAHPS team23 to aggregate survey responses into 7 domains (which are not mutually exclusive): access to care, communication and relationships, disease management, doctor communication, follow-up of test results, office staff, and overall rating of the doctor. For each question with a 6-point scale, we calculated the proportion of patients who gave the most favorable response. We then averaged this proportion across the questions within each domain to yield the average proportion of patients who gave the most favorable response for that domain. We compared baseline and follow-up by domain using 2-sample tests of proportions. We also analyzed the data at the question level, again using 2-sample tests of proportions to measure change over time.
All data analyses were conducted with SAS version 9.3 (SAS Institute Inc, Cary, North Carolina), except for the 2-sample tests of proportions, which were conducted using Stata/IC 12.0 (Stata Corp LP, College Station, Texas). We considered P values <.05 to be significant.
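For readers who want to reproduce this type of aggregation, a minimal sketch is shown below (written in Python for illustration only; the study’s analyses were run in SAS and Stata). The question identifiers, domain groupings, and response coding are hypothetical placeholders rather than the actual CG-CAHPS item structure.

from statistics import mean

# Hypothetical mapping of survey questions to domains; the published CG-CAHPS
# guidelines define the actual (non-mutually exclusive) groupings.
DOMAIN_MAP = {
    "access_to_care": ["q6", "q8", "q10"],
    "office_staff": ["q24", "q25"],
}

def top_box_share(responses, top=6):
    # Proportion of respondents giving the most favorable (top-box) answer
    # on a 6-point scale; missing responses (None) are excluded.
    answered = [r for r in responses if r is not None]
    return sum(1 for r in answered if r == top) / len(answered)

def domain_scores(survey):
    # survey maps question ID -> list of responses (1-6 or None).
    # Returns the average top-box proportion across each domain's questions.
    return {domain: mean(top_box_share(survey[q]) for q in questions)
            for domain, questions in DOMAIN_MAP.items()}

Under this approach, the baseline and follow-up samples would each be scored separately, and the resulting domain-level proportions compared with a 2-sample test of proportions.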
RESULTS
At follow-up, we distributed 2706 patient forms across 10 practices. Of those, 1602 (59%) signed up to participate (Figure). We made outreach attempts to 1034 of those (65%). We reached our target sample size with 423 patients completing the telephone protocol (40%). Of those, 387 (92%) completed the survey. Response rates were similar across practices (Table 1).
We excluded 10 patients who were not able to confirm that they had seen their primary care physician on the day they were recruited, and we excluded 8 patients who were not able to confirm that they had seen their primary care physician in the previous 12 months. Thus, the final sample size for the follow-up survey was 369 patients. Recruitment for the baseline data was similar,13 contributing 346 patients from the 10 practices also surveyed at follow-up.
The patients in the study were cared for by a total of 120 unique primary care providers: 80 at baseline and 95 at follow-up, including 55 whose patients contributed data in both time periods (80 + 95 − 55 = 120).
Approximately two-thirds of patients at follow-up (69%) were female, and the median age category was 45 to 64 years (Table 2). Most (84%) were white, and the median educational level was some college. Approximately half (48%) reported excellent or very good health. Approximately half (52%) had visited a healthcare provider at least 3 times for the same problem in the previous year. The patients’ most common chronic diseases were hypertension, gastroesophageal reflux disease, and arthritis.
There were no differences between patients in the 2 time periods in terms of gender, age, race, education, overall health, frequency of visits, frequency of having any chronic condition, or frequency of each of 10 specific chronic conditions (Table 2).
We found that patients’ experience with access to care improved significantly over time, with 69% of respondents giving the most favorable rating at follow-up, up from 61% (P = .02). There was no significant change over time in any of the other 6 domains of care (Table 3). There was a trend toward improved experience with office staff, with 72% of respondents giving the most favorable rating at baseline and 78% doing so at follow-up (P = .06). There was also a trend toward worsening experience with follow-up of test results, with 76% giving the most favorable rating at baseline and 69% doing so at follow-up (P = .06).
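As an approximate check, the access-to-care comparison can be reproduced with a 2-sample test of proportions. The counts in the sketch below are back-calculated from the reported percentages and sample sizes (61% of 346 at baseline, 69% of 369 at follow-up), not taken from the raw survey data.

from statsmodels.stats.proportion import proportions_ztest

# Approximate top-box counts reconstructed from the reported percentages.
top_box = [211, 255]   # ~61% of 346 (baseline), ~69% of 369 (follow-up)
n = [346, 369]
z, p = proportions_ztest(top_box, n)
print(round(z, 2), round(p, 3))   # approximately z = -2.28, P = .02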
When we looked at the question level, we found that improvements in patients’ experience with access to care were driven by improvements in the availability of appointments for urgent problems and by decreases in wait time to be seen once at the office (Table 4). We also found significant improvements at the question level in patients’ perceptions of how much time doctors spend with them and in the helpfulness of the office staff (Table 4).
DISCUSSION
We found that patients’ experience with access to care improved over time within practices that transformed into PCMHs. We found an absolute improvement of 8 percentage points, and a relative improvement of 13%. Improvements in access to care were driven by patients’ experiences with greater availability of appointments for urgent medical problems and by decreased waiting time once in the doctor’s office. Patient experience did not change significantly over time for any other domain of care.
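In concrete terms, the relative improvement is the absolute change divided by the baseline value: (69% − 61%) / 61% = 8/61 ≈ 13%.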
Access to care, or being able to receive care in a timely manner from an appropriate provider, has been highlighted by many organizations as an important element of patient-centered care, even among those who have health insurance.24 It is believed that timely outpatient care—including during and after regular office hours—may avert unnecessary and more expensive emergency department utilization.24 Indeed, many practices across the country are experimenting with “open access” scheduling, which leaves open appointments for same-day visits when patients call with urgent problems.25 Some of the practices in this community had employed that model as part of their medical home transformation.
Access to care may have changed more than other dimensions of care for several reasons. Access to care was the dimension that patients were least satisfied with at baseline and thus had the most room for improvement. The work flow modifications needed to improve access to care may also have more rapid effects than other dimensions of care.
We found a trend toward improvement in experience with office staff (P = .06). Within that domain, 1 of the questions did show significant improvement over time (P <.01), with patients reporting that clerks and receptionists were more helpful at follow-up. Previous work has shown that staff in medical offices can rapidly and substantially affect retention of patients by being more polite, listening more attentively, and taking more responsibility for their actions.26 Although we do not have detailed information on how staff members may have changed in this study, these types of changes could occur within the time frame observed.
We also found a trend toward worsening in patient experience with follow-up of test results. Failure to follow up all test results has been recognized as a common problem in clinical medicine and a challenging one to address. A recent systematic review identified 19 studies quantifying the rate of failure to follow up test results, which ranged from 9% to 62% for laboratory tests and from 1% to 36% for radiology tests.27 PCMH standards require practices to have methods in place to track test results, but those standards may be just the beginning of quality improvement efforts in this area.
Several other dimensions of care may have suffered from a ceiling effect, in that it is difficult to demonstrate improvement upon something that is already very good. Possible interpretations of the lack of findings in other dimensions include: 1) the intervention did not have an effect; 2) a small effect was present, but a larger sample size would have been needed to detect it; 3) the survey tool we used was not sensitive enough to the small changes that occurred; or 4) an effect was emerging, but a longer follow-up period would have been needed to detect it.
This study builds on previous ones in this area. A demonstration project in the Seattle-based Group Health Cooperative found favorable effects of the PCMH model on patient experience, particularly for access, care coordination, and patient activation.9,10 The absolute improvement in patient experience with access to care in that study was 3.48 percentage points on a 100-point scale at 12 months and 2.84 percentage points at 24 months.9,10 That project was based in a large clinic within an integrated delivery system that combines organization and financing of healthcare.10 In contrast, the National Demonstration Project (NDP) found no effect of the PCMH model on patient experience over 26 months, across 8 domains of care including access.11 The 31 NDP practices were located across the United States and had been selected from among more than 300 practices that had applied to participate.11
Exact definitions of “medical homes” and the exact survey instruments used to assess patient experience have varied in the previous studies in this field. Group Health and the NDP each developed their own definitions of the medical home.10,11 We used the definition developed by NCQA, which is being adopted widely nationally4 and which may facilitate direct comparisons with other communities with ongoing evaluations. Group Health primarily used the Ambulatory Care Experiences Survey (ACES) to measure patient experience, while the NDP used its own tool, which it had developed from multiple previously published instruments.11 We used the CG-CAHPS tool, which was developed from the ACES, and which is emerging as a national standard for measurement of ambulatory patient experience.28
This study had several limitations. First, our baseline period of data collection was not strictly preintervention; rather, it occurred during medical home transformation. This may have limited our ability to find any rapid effects that might have taken place at the beginning of transformation. Second, it is possible that access to care decreased during transformation and then rebounded, which we cannot determine from these data. We do not have detailed data on the specific actions the practices took to increase access to care. Third, our study did not have a concurrent control group, which limits our ability to rule out secular trends. Finally, our study did not survey the same patients twice but instead used 2 different samples of patients. Although the 2 samples might have had different attitudes toward healthcare, their demographic characteristics were very similar.
In conclusion, we found that patients’ experience with access to care improved significantly over time in practices that had undergone transformation into PCMHs. We did not find changes over time in other domains of care. This represents one of the first studies to find an effect of the PCMH on patient experience in a community with multiple payers, fee-for-service reimbursement, and locally driven quality improvement.
Author Affiliations: From Department of Public Health (LMK, RVD, AE, RK), Department of Medicine (LMK, RK), Department of Pediatrics (RK), Weill Cornell Medical College, New York, NY; Health Information Technology Evaluation Collaborative (LMK, RVD, AE, RK), New York, NY; NewYork-Presbyterian Hospital (RK), New York, NY.
Funding Source: This work was supported by the Commonwealth Fund (grant #20080473) and the Taconic Independent Practice Association.
Author Disclosures: The authors (LMK, RVD, AE, RK) report no relationship or financial interest with any entity that would pose a conflict of interest with the subject matter of this article.
Authorship Information: Concept and design (LMK, RVD, RK); acquisition of data (LMK, RVD, RK); analysis and interpretation of data (LMK, AE, RK); drafting of the manuscript (LMK, AE); critical revision of the manuscript for important intellectual content (LMK, RVD, RK); statistical analysis (AE, RK); obtaining funding (LMK, RK); administrative, technical, or logistic support (RVD); and supervision (LMK, RK).
Address correspondence to: Lisa M. Kern, MD, MPH, Department of Public Health, Weill Cornell Medical College, 425 E 61st St, New York, NY 10065. E-mail: lmk2003@med.cornell.edu.
REFERENCES
1. Safran DG. Defining the future of primary care: what can we learn from patients? Ann Intern Med. 2003;138(3):248-255.
2. National Committee for Quality Assurance. Patient-Centered Medical Home, 2011. www.ncqa.org/tabid/631/Default.aspx. Accessed January 3, 2013.
3. Patient-Centered Primary Care Collaborative, 2012. http://www.pcpcc.net/. Accessed January 2, 2013.
4. Bitton A, Martin C, Landon BE. A nationwide survey of patient centered medical home demonstration projects. J Gen Intern Med. 2010;25(6):584-592.
5. Berenson RA, Rich EC. US approaches to physician payment: the deconstruction of primary care. J Gen Intern Med. 2010;25(6):613-618.
6. Berenson RA, Rich EC. How to buy a medical home? policy options and practical questions. J Gen Intern Med. 2010;25(6):619-624.
7. Barr MS. The need to test the patient-centered medical home. JAMA. 2008;300(7):834-835.
8. Jackson GL, Powers BJ, Chatterjee R, et al. The patient-centered medical home: a systematic review [published online 2012]. Ann Intern Med. 2013;158(3):169-178.
9. Reid RJ, Coleman K, Johnson EA, et al. The group health medical home at year two: cost savings, higher patient satisfaction, and less burnout for providers. Health Aff (Millwood). 2010;29(5):835-843.
10. Reid RJ, Fishman PA, Yu O, et al. Patient-centered medical home demonstration: a prospective, quasi-experimental, before and after evaluation. Am J Manag Care. 2009;15(9):e71-e87.
11. Jaen CR, Ferrer RL, Miller WL, et al. Patient outcomes at 26 months in the patient-centered medical home National Demonstration Project. Ann Fam Med. 2010;8(suppl 1):S57-S67, S92.
12. Stuard SS, Blair AJ. Interval examination: regional transformation of care delivery in the Hudson Valley. J Gen Intern Med. 2011;26(11):1371-1373.
13. Kern LM, Dhopeshwarkar R, Edwards A, Kaushal R. Patient experience at the time of practice transformation into Patient-Centered Medical Homes. Int J Person Centered Med. In press.
14. Kern LM, Barron Y, Blair AJ 3rd, et al. Electronic result viewing and quality of care in small group practices. J Gen Intern Med. 2008;23(4):405-410.
15. THINC: Taconic Health Information Network and Community. www.thinc.org. Accessed January 2, 2013.
16. National Committee for Quality Assurance. Standards and guidelines for Physician Practice Connections - Patient-Centered Medical Home (PPC-PCMH), 2008. www.ncqa.org/Portals/0/Programs/Recognition/PCMH_Overview_Apr01.pdf. Accessed January 2, 2013.
17. Taconic IPA. www.taconicipa.com. Accessed January 2, 2013.
18. Browne K, Roseman D, Shaller D, Edgman-Levitan S. Analysis & commentary: measuring patient experience as a strategy for improving primary care. Health Aff (Millwood). 2010;29(5):921-925.
19. Agency for Healthcare Research and Quality. CAHPS Clinician & Group Surveys. www.cahps.ahrq.gov/clinician_group/. Accessed January 2, 2013.
20. Safran DG, Karp M, Coltin K, et al. Measuring patients’ experiences with individual primary care physicians: results of a statewide demonstration project. J Gen Intern Med. 2006;21(1):13-21.
21. Beal AC, Doty MM, Hernandez SE, Shea KK, Davis K; The Commonwealth Fund. Closing the divide: how medical homes promote equity in health care: results from the Commonwealth Fund 2006 Health Care Quality Survey. The Commonwealth Fund, 2007. http://www.commonwealthfund.org/Publications/Fund-Reports/2007/Jun/Closingthe-Divide--How-Medical-Homes-Promote-Equity-in-Health-Care--Results-From-The-Commonwealth-F.aspx. Accessed January 2, 2013.
22. Schoen C, Osborn R, Doty MM, Bishop M, Peugh J, Murukutla N. Toward higher-performance health systems: adults’ health care experiences in seven countries, 2007. Health Aff (Millwood). 2007;26(6):w717-w734.
23. Agency for Healthcare Research and Quality. What’s available for the CAHPS Clinician & Group Surveys - Clinician & Group Survey and Reporting Kit, 2008. https://cahps.ahrq.gov/clinician_group/cgsurvey/whatsavailableforcahps-cgsurveys.pdf. Accessed January 2, 2013.
24. Berry LL, Seiders K, Wilder SS. Innovations in access to care: a patient-centered approach. Ann Intern Med. 2003;139(7):568-574.
25. Rose KD, Ross JS, Horwitz LI. Advanced access scheduling outcomes: a systematic review. Arch Intern Med. 2011;171(13):1150-1159.
26. Baker SK. Improving service and increasing patient satisfaction. Fam Pract Manag. 1998;5(7):29-34.
27. Callen JL, Westbrook JI, Georgiou A, Li J. Failure to follow-up test results for ambulatory patients: a systematic review. J Gen Intern Med. 2012;27(10):1334-1348.
28. National Committee for Quality Assurance. NCQA’s new distinction in patient experience reporting, 2011. http://www.ncqa.org/tabid/1429/Default.aspx. Accessed January 2, 2013.