
Are Healthcare Quality "Report Cards" Reaching Consumers? Awareness in the Chronically Ill Population

Publication
Article
The American Journal of Managed Care, March 2015
Volume 21
Issue 3

Using longitudinal data, this paper examines to what extent comparative quality information in the form of healthcare "report cards" is reaching chronically ill adults.

ABSTRACT

Background: Significant investments have been made to provide comparative healthcare quality information (CQI) to the public, but whether these efforts are increasing awareness of CQI is unknown.

Objectives: To provide regional estimates of change in awareness of CQI among the chronically ill population residing in 14 geographic regions of the United States between 2008 and 2012, and to examine the correlation of this change with changes in the availability of quality reports.

Study Design: Data from 2 waves (2008 and 2012) of a random-digit dial survey of 11,896 adults with chronic illness.

Methods: Regression-adjusted change in the percentage of respondents aware of physician and hospital CQI, and Pearson correlations between regional change in awareness of CQI and regional change in availability of quality reports.

Results: While the number of reports on both hospital quality and physician quality increased between 2008 and 2012, there was a significant change in awareness only for physician CQI (12.8% to 16.2%; regression-adjusted change of 3.7 percentage points; P <.05). No significant correlation was found between the change in awareness of CQI and the change in availability of hospital quality reports or physician quality reports.

Conclusions: Awareness of physician CQI among the chronically ill increased modestly between 2008 and 2012, but no significant increase in awareness of hospital quality was observed. As efforts to report CQI accelerate, more attention to approaches to dissemination may be warranted in order to increase awareness in the chronically ill population.

Am J Manag Care. 2015;21(3):236-244

  • Federal investment in providing comparative quality information (CQI) has been substantial and continues to grow now that CMS’s “Physician Compare” website is online.
  • There is little information about consumer awareness, especially among those with chronic conditions.
  • This is the first study to examine awareness of CQI with longitudinal data.
  • Our results suggest that awareness of healthcare quality reports remains low, with modest increases between 2008 and 2012.
  • There is also substantial variation across regions in terms of baseline levels of awareness and longitudinal changes.

Public reporting of healthcare provider quality measures has emerged as a major reform strategy. Reporting efforts have been propelled by an increasing recognition that healthcare markets vary on the dimensions of prices and quality, may lack transparency on these dimensions, and may fail to foster incentives that will lead to better outcomes.1-4 The availability of quality information may lead to consumer “shopping” for higher-quality providers (“consumer pathway”), incentivizing providers to improve the quality of care they provide. However, this strategy assumes awareness and use of quality reports by consumers.5,6

Harris and Buntin (2008) define comparative quality information (CQI) as including “all materials disseminated by public and private entities that are intended to inform consumers’ decisions in choosing healthcare providers.”7 This includes quality information that may be seen by consumers at their workplace, gathered from traditional printed sources (eg, newspapers, books, magazines, journals), or via the Internet (eg, social media, Web searches, e-mail).

The variety and scope of CQI has grown substantially, with information coming from many sources offering a range of measures, and making quality comparisons among provider entities.8 There is regional variability in CQI accessible to consumers in terms of content, credibility, and applicability of information to local consumer needs.9 A substantial proportion of CQI is targeted toward the chronically ill,9 a population group that is the focus of the chronic care model and the patient-centered medical home model,10,11 with particular emphasis on disease self-management. Despite the abundance of CQI, it is unclear whether consumer awareness has kept pace with its growth, especially over the last few years.

The federal and state governments, regional coalitions, and employers and health plans have invested significant resources in making CQI available to the public.12 While quality information for Medicare-certified hospitals has been publicly available to consumers since 2005, the Affordable Care Act mandated that CMS report performance data about Medicare-enrolled physicians on the “Physician Compare” website, which is now available online.13 Despite these expanding policy initiatives, little empirical information on consumer awareness of CQI exists. Some large nationally representative surveys that preceded many current reporting efforts found that less than 1 in 5 consumers are aware of hospital or physician CQI.14-16 A more recent study found higher awareness (69%) of online physician ratings17; however, none of these studies looked at the awareness of CQI among chronically ill patients, which is a high-priority group for healthcare reform. Moreover, data on regional variation in consumer awareness of CQI is unavailable, as is information on whether the awareness among the chronically ill is related to the variation in the amount of CQI available by type of chronic disease.

We provide overall and regional estimates of awareness of CQI among the chronically ill population residing in 14 regions of the United States for 2 different periods: 2008 and 2012. These regions are sites for multi-stakeholder alliances in Aligning Forces for Quality (AF4Q), a national program funded by the Robert Wood Johnson Foundation (RWJF), and aimed at realizing community-wide improvements in health and healthcare delivery.18 We also correlated regional estimates of availability of quality reports with regional awareness of these reports.

METHODS

Data

We used information from 2 data sources: the AF4Q Community Quality Reporting Tracking Database (AF4QTD) and the Aligning Forces for Quality Consumer Survey (AF4QCS). The AF4QTD tracks the number and contents of physician and hospital quality public reports released in the 14 AF4Q regions. Research staff members systematically review the websites of the public and private organizations sponsoring public reports in these regions, and they conduct interviews with program personnel at those organizations to request additional details and to gather specific information about the reports.9

The first wave of the AF4QCS was a random-digit dial (RDD) survey of chronically ill adults (18 years or older) conducted between June 2007 and August 2008 in the 14 AF4Q regions across the nation, which include whole states as well as metropolitan and rural areas.18 All respondents had visited healthcare professionals during the previous 2 years for the care of at least 1 of the following 5 conditions: diabetes, hypertension, asthma, chronic heart disease, and depression.19 The same respondents were contacted again in the second wave, fielded between July 2011 and November 2012. To compensate for attrition and to account for potential change in the population between the 2 periods, additional respondents were surveyed in the second wave using RDD.

Our study sample includes 7139 individuals from the first wave and 8362 from the second wave. The response rate of the first wave was 27.6% based on the American Association for Public Opinion Research (AAPOR) method and 45.8% based on the Council of American Survey Research Organizations (CASRO) method. In the second wave, the panel response rate was 63.3%; the overall response rate (calculated as the weighted average of the panel and new RDD samples) was 39.7% based on the AAPOR method and 42.1% based on the CASRO method.
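To make the weighted-average calculation concrete, a minimal sketch is shown below; only the 63.3% panel rate comes from the text, and the component sample sizes and fresh-RDD rate are hypothetical placeholders rather than figures reported in this article.

```python
# Sketch of the wave 2 overall response rate as a weighted average of the
# panel and fresh-RDD components (illustrative inputs only).
panel_rr, rdd_rr = 0.633, 0.30        # panel rate (reported); fresh-RDD rate (assumed)
panel_n, rdd_n = 5_000, 7_000         # eligible sample sizes (assumed)

overall_rr = (panel_rr * panel_n + rdd_rr * rdd_n) / (panel_n + rdd_n)
print(f"Overall response rate: {overall_rr:.1%}")
```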

Our response rates are in line with the general declining trend in survey response rates.20-22 Recent studies have shown substantial variation in reported response rates depending on the method of calculation23,24 and a lack of association between response rates and nonresponse bias.20,24 Moreover, surveys with lower response rates do not necessarily yield significantly different estimates,25-27 and using extra resources to achieve a better response rate may actually introduce bias.28-30 We compared the AF4QCS to the 2008 and 2011 National Health Interview Survey, which has a greater than 90% response rate because of its continuous annual sampling design and personal household interviews by trained staff. No significant differences were found in respondents’ demographic characteristics or the prevalence of chronic conditions.

Availability of Physician and Hospital CQI

The variation in the availability of physician and hospital CQI is measured by the number of physician quality reports and the number of hospital quality reports accessible to the public, using information at 2 points in time from the AF4QTD.9 Reports produced by local and regional health plans were included, but only if they were available to the public and not to plan enrollees exclusively. Additionally, we used a directory created by the RWJF to systematically capture physician and hospital quality reports that are national in scope.31 The database includes reports produced by a variety of organizations, including CMS, the Leapfrog Group, the National Committee for Quality Assurance, and Bridges to Excellence, in addition to a number of health plans that operate in many markets nationally. These reports vary significantly in the percentage of hospitals and doctors included, as well as in whether participation in the report is voluntary. Accordingly, we excluded from our counts reports that provide comparative scores for only a narrow or selected group of hospitals or physicians, sometimes because individual hospitals or physician practices decide voluntarily whether to provide data. We also estimated the percentage of the population with access to a health plan-sponsored report using plan enrollment estimates from Interstudy.
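As a rough illustration of the plan-based access measure, the sketch below computes the share of a region's population enrolled in plans that publish a public report. All plan names, enrollment counts, and the regional population are invented for the example and do not come from the AF4QTD or Interstudy data.

```python
# Hypothetical illustration of the "percentage of the population with access
# to a health plan-sponsored report" measure described above.
region_population = 3_000_000
plan_enrollment = {"Plan A": 450_000, "Plan B": 300_000, "Plan C": 250_000}
plans_with_public_report = {"Plan A", "Plan C"}   # plans sponsoring a public report

covered = sum(n for plan, n in plan_enrollment.items()
              if plan in plans_with_public_report)
pct_with_access = covered / region_population
print(f"Population with access: {pct_with_access:.1%}")   # 23.3% in this example
```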

Awareness of Physician and Hospital CQI

The awareness of physician and hospital CQI is measured by responses to consecutive survey questions in the AF4QCS. The first question asked respondents whether they had seen information comparing physicians, hospitals, or health plans during the past 12 months. Among those who answered yes, subsequent questions separately asked whether the respondent had seen information comparing physician quality and hospital quality during the past 12 months. The response was coded as 1 if the respondent said yes to both the first and the relevant subsequent question, and 0 if the respondent said no to either. Survey questions focused on comparative information or reports that provided data for all or most providers that might be selected by consumers in a particular healthcare market. It should be noted that “seeing” is a specific yet important aspect of “awareness”: respondents may be aware of CQI without having seen it, but the survey question was designed to be specific (“seen”) rather than general (“aware of”) to limit the possibility of socially desirable responses and to reduce measurement error. Moreover, actual visual exposure to CQI is more likely to have a direct and meaningful impact on individuals.
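A minimal sketch of this two-step coding is shown below; the variable names (saw_any_cqi, saw_md_cqi) are hypothetical stand-ins for the actual survey items, and the handling of missing follow-up responses is an assumption rather than a rule documented in the article.

```python
# Illustrative coding of the binary awareness outcome from the screener item
# and the provider-specific follow-up item, per the rule described above:
# 1 = yes to both questions, 0 = no to either question.
import numpy as np
import pandas as pd

def code_awareness(saw_any: pd.Series, saw_specific: pd.Series) -> pd.Series:
    aware = ((saw_any == 1) & (saw_specific == 1)).astype(float)
    # Assumption: leave the outcome missing if the screener is missing, or if
    # the screener was "yes" but the follow-up item was not answered.
    aware[saw_any.isna() | ((saw_any == 1) & saw_specific.isna())] = np.nan
    return aware

# Toy data: respondent 3 skipped the follow-up after answering "no" to the screener.
df = pd.DataFrame({"saw_any_cqi": [1, 1, 0, 1], "saw_md_cqi": [1, 0, np.nan, np.nan]})
df["aware_md_cqi"] = code_awareness(df["saw_any_cqi"], df["saw_md_cqi"])
print(df)
```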

Statistical Analysis

We calculated the percentage of respondents who were aware of physician and hospital CQI. To understand whether certain regions differed from the others in terms of population-level awareness, for each of the 2 periods we conducted a 2-sample proportion z test for each region versus all other regions combined. We then calculated the regression-adjusted change in awareness over the 2 waves of the AF4QCS for all regions combined and for each region separately; the (linear) regression adjustments allowed us to control for sociodemographic characteristics of survey respondents. To assess whether awareness varied with the likelihood that a report card contained measures associated with a respondent’s specific health conditions, we conducted sensitivity analyses using a subsample of respondents with diabetes and a subsample of respondents with depression, since the majority of physician report cards contain diabetes measures while few contain depression measures. Finally, to examine whether there was an association between the amount of available CQI and awareness of CQI, we calculated Pearson correlation coefficients between the change in population awareness of CQI and (1) the change in the regional availability of publicly available quality reports and (2) the change in the percentage of the population with access to a health plan-sponsored report.
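A compact sketch of these three analytic steps (the region-versus-rest proportion z test, the regression-adjusted wave change, and the region-level Pearson correlations) is given below using standard Python libraries. The file names, variable names, and covariate list are hypothetical; this is an illustration of the described methods, not the authors' actual code.

```python
# Minimal sketch of the analyses described above (assumed variable names).
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.proportion import proportions_ztest
from scipy.stats import pearsonr

df = pd.read_csv("af4q_consumer_survey.csv")           # person-level survey file (hypothetical)
df = df.dropna(subset=["aware_md_cqi"])                # keep respondents with a coded outcome

# 1) Two-sample proportion z test: one region vs. all other regions combined.
in_region = df["region"] == "Minnesota"
counts = [df.loc[in_region, "aware_md_cqi"].sum(), df.loc[~in_region, "aware_md_cqi"].sum()]
nobs = [in_region.sum(), (~in_region).sum()]
z_stat, p_val = proportions_ztest(counts, nobs)

# 2) Regression-adjusted change in awareness across waves (linear probability model,
#    controlling for sociodemographic characteristics).
X = sm.add_constant(df[["wave2", "age", "female", "college", "income", "insured"]])
lpm = sm.OLS(df["aware_md_cqi"], X, missing="drop").fit(cov_type="HC1")
adjusted_change = lpm.params["wave2"]                  # change expressed as a proportion

# 3) Pearson correlation between regional change in awareness and regional change
#    in the number of publicly available physician quality reports (14 regions).
regional = pd.read_csv("regional_changes.csv")         # one row per AF4Q region (hypothetical)
r, p_r = pearsonr(regional["d_awareness_md"], regional["d_md_reports"])

print(z_stat, p_val, adjusted_change, r, p_r)
```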

RESULTS

Study Population

Table 1 shows the demographic profile of survey respondents in 2012. Most of the sample respondents were older, white, female, college-educated, and had some form of public insurance, consistent with other studies of the adult chronically ill.32 About half of sample respondents had 2 or more chronic conditions, with hypertension and diabetes being the most prevalent. About 9% of our 2012 sample was uninsured, and most had an annual household income between $30,000 and $60,000.

Availability of Comparative Quality Reports

Figures 1 and 2 show the availability of comparative quality reports for AF4Q regions in June 2007 and August 2011. The total number of publicly available hospital quality reports went up from 76 in June 2007 to 87 in August 2011. Over this period, there were increases in the number of publicly available physician reports (16 to 41), quality reports with diabetes measures (9 to 25), and quality reports with depression measures (5 to 7). The estimated percentage of the overall population that had access to health plan-sponsored quality reports went up for hospital CQI (from 29.2% to 42.4%) and for physician CQI (from 21.5% to 40.4%) between June 2007 and August 2011 (not shown in figures). While all AF4Q regions had at least 1 publicly accessible hospital quality report at baseline (June 2007), 6 sites (Cincinnati, Cleveland, Kansas City, Memphis, Puget Sound, and Willamette Valley) did not have a publicly available physician quality report at baseline. Regional availability of quality reports with diabetes measures was higher than that of quality reports with depression measures, both at baseline and in August 2011.

Awareness of Comparative Quality Reports

The change in awareness of CQI among the chronically ill population between 2008 and 2012 was 0.54 percentage points for hospital CQI (25.5% to 26.5%) and 3.47 percentage points for physician CQI (12.8% to 16.2%). Although the level of awareness of CQI increased slightly between 2008 and 2012 for both types of quality reports, the increase was statistically significant (P <.05) only for awareness of physician CQI (Table 2). Among the subpopulations with diabetes or depression, awareness of physician CQI increased by 2.32 percentage points (14.1% to 16.1%; P <.05) and by 3.6 percentage points (12.5% to 16%; P <.01), respectively (Table 3).

There was significant regional variation in awareness of CQI for both periods and for both types of quality reports (Table 2). For instance, in 2008, awareness of physician CQI ranged from 6.9% (Maine) to 19.3% (Detroit, Michigan), and awareness of hospital CQI ranged from 17% (Humboldt County, California) to 34.7% (Cleveland, Ohio). Similarly, in 2012, awareness of physician CQI ranged from 9.8% (Humboldt County, California) to 22.6% (Detroit, Michigan), and awareness of hospital CQI ranged from 19.2% (Humboldt County, California) to 39% (Detroit, Michigan). None of the AF4Q sites showed a statistically significant change (at P <.05) in awareness of hospital CQI between 2008 and 2012. Six AF4Q sites showed statistically significant (P <.05) increases in regional awareness of physician CQI between 2008 and 2012: Maine (7.31 percentage points); Minnesota (7.13 percentage points); the Puget Sound region, Washington (5.76 percentage points); Cleveland (4.57 percentage points); South Central Pennsylvania (4.34 percentage points); and Willamette Valley, Oregon (3.87 percentage points) (Table 2). Among patients with diabetes, only Detroit (9.77 percentage points) showed a statistically significant change (P <.05) in regional awareness of physician CQI between 2008 and 2012. Among patients with depression, only Minnesota (11.45 percentage points) and Cleveland (9.39 percentage points) had statistically significant (P <.05) increases in awareness of physician CQI (Table 3).

Relationship Between Availability of CQI and Awareness

None of the correlations between the change in regional availability of publicly available quality reports and the change in awareness of CQI was statistically significant (physician quality reports [-0.09; P = .74]; hospital quality reports [-0.18; P = .54]; reports with diabetes measures [0.03; P = .91]; and reports with depression measures [0.35; P = .22]). The correlation between the change in the percentage of the regional population aware of quality reports and the change in access to health plan-sponsored quality reports was not significant for physician quality reports (0.5226; P <.10), but was significantly negative for hospital quality reports (-0.6035; P <.05).

DISCUSSION

Efforts to inform consumers using public reports of comparative hospital and physician quality have grown significantly in recent years.8,9 The Affordable Care Act continues this trend, mandating that CMS create a “Physician Compare” website to go along with its existing “Hospital Compare,” “Nursing Home Compare,” “Home Health Compare,” and “Dialysis Facility Compare” websites.13 While payers might use this CQI in provider selection and contracting, and providers might use publicly reported CQI to engage in quality improvement efforts, publicly available CQI is also intended to inform consumers’ choices of healthcare providers and to improve their interaction with providers. There is little information about the extent to which the consumer population is aware of CQI and how such awareness has changed over time.

Our results suggest that awareness of CQI is modest among chronically ill adults. About a quarter of this population is aware of hospital quality reports, with little change between 2008 and 2012. Awareness of physician quality information grew at a greater rate, though from a lower baseline, starting at 12.8% in 2008 and reaching 16.2% of the chronically ill adult population in 2012 (P <.05). Awareness of CQI varies substantially across communities, with awareness of physician quality reports ranging from about 10% to 23% in 2012 in the communities we examined. For patients with specific chronic conditions, such as diabetes and depression, 2012 awareness rates are similar to these overall values.

It is not clear what percentage of the population must be aware of CQI for it to have the intended impact, and impact can be measured in a number of ways. For example, the “consumer pathway” posits that one outcome is a shift in patients across providers: providers with scores significantly worse than those of their peers lose patients, while providers with scores significantly better than those of their peers gain patients. This pathway assumes that the existing provider capacity and insurance arrangements in a given market allow patients to switch freely across providers. Another expected outcome is that overall average and individual provider-level quality will improve as a result of publication of CQI, especially when a sizable part of the population is aware of the reports. Although there is some theoretical literature regarding the impact of consumer awareness on the quality of nursing homes,33,34 to our knowledge, no empirical study has examined what percentage of the consumer population must be aware of CQI to affect these quality outcomes.

There are several reasons why awareness of CQI may be lower than expected. First, while consumers might find the information contained in CQI reports useful, they may remain unaware of it if dissemination is poor or passive. Early efforts concentrated on populating reports with scientifically sound measures that would be acceptable to payers and providers, and often paid less attention to dissemination, relying on distribution of print copies to small subsets of the overall population by employers, community groups, or patient advocacy groups35 (eg, disease-specific associations). Contemporary CQI reports are housed on websites, but beyond initial news media coverage or press releases by report sponsors, there is often little marketing or dissemination of CQI. Although some consumers might find CQI valuable when they face a specific healthcare decision, such as the choice of a hospital or doctor, dissemination may not coincide with the period when consumers are making these decisions. Additionally, research has shown that general Internet searching techniques do not always point consumers to available CQI reports.36

A second explanation may relate to consumers’ satisfaction with their existing providers. Our sample included consumers with at least 1 of 5 chronic conditions, most of whom were being actively treated by healthcare professionals. If these individuals are satisfied with their existing providers, they may be less likely to be aware of the CQI available to them. Third, awareness may be lower in markets where there are fewer options for proactively switching healthcare providers, or among subpopulations that are less inclined to “shop” for healthcare, such as those with lower levels of education or income.37 There are many possible explanations for why the levels of awareness we estimated, and the changes in these levels, are not higher among the chronically ill adult population. Future research may help identify the key factors.

Limitations

First, measuring the “dose” or “quantity” of CQI at the community level, at a given point in time and over time, is challenging and subject to measurement error. Second, our measurement efforts focused on report availability and content rather than report dissemination. Third, our awareness estimates are generated from survey questions asking chronically ill consumers whether they had seen CQI, so they rely on self-report and recall. Fourth, our study focused on a sample of adults with 1 or more of 5 common chronic illnesses, so our estimates do not generalize to the entire adult population. Finally, we report regional variation in CQI for 14 communities that were part of the AF4Q demonstration program, one goal of which was to produce and disseminate CQI, so these communities may not be representative of other regions.

CONCLUSIONS

As CMS continues to invest resources in providing physician CQI as part of its “Physician Compare” website, and as other payers and community groups consider whether to continue disseminating CQI, it is important to monitor changes in awareness of CQI, and to link awareness to other important outcomes such as consumer satisfaction with healthcare providers, and provider improvement on standardized quality measures. In addition, it is important to study the dissemination of CQI and to examine whether some approaches to dissemination lead to greater awareness than others.

Author Affiliations: Department of Health Policy and Administration, Center for Health Care and Policy Research, The Pennsylvania State University (DPS, YS, NB), University Park, PA; Division of Health Policy and Management, School of Public Health, University of Minnesota (JBC), Minneapolis, MN.

Source of Funding: The study was funded as a part of an evaluation grant for the Aligning Forces for Quality (AF4Q) program by the Robert Wood Johnson Foundation. The Robert Wood Johnson Foundation had no role in the design and conduct of the study; collection, management, analysis, and interpretation of data; preparation, review, or approval of the manuscript; or decision to submit the manuscript for publication.

Author Disclosures: The authors report no relationship or financial interest with any entity that would pose a conflict of interest with the subject matter of this article.

Authorship Information: Concept and design (DPS, YS, NB); acquisition of data (DPS, YS, JBC); analysis and interpretation of data (DPS, YS, NB, JBC); drafting of the manuscript (DPS, YS, NB, JBC); critical revision of the manuscript for important intellectual content (DPS, YS, NB, JBC); statistical analysis (DPS, YS, NB); and obtaining funding (DPS).

Address correspondence to: Dennis P. Scanlon, PhD, Department of Health Policy and Administration, The Pennsylvania State University, 504 Donald H. Ford Building, University Park, PA 16802. E-mail: dpscanlon@psu.edu.

REFERENCES

1. Anderson GF, Reinhardt UE, Hussey PS, Petrosyan V. It’s the prices, stupid: why the United States is so different from other countries. Health Aff (Millwood). 2003;22(3):89-105.

2. Bentley TGK, Effros RM, Palar K, Keeler EB. Waste in the U.S. health care system: a conceptual framework. Milbank Q. 2008;86(4):629-659.

3. McGlynn EA, Asch SM, Adams J, et al. The quality of health care delivered to adults in the United States. N Engl J Med. 2003;348(26):2635-2645.

4. The state of health care quality 2009. National Committee for Quality Assurance website. http://www.ncqa.org/Portals/0/Newsroom/SOHC/SOHC_2009.pdf. Published 2009. Accessed August 11, 2011.

5. Hibbard JH, Greene J, Sofaer S, Firminger K, Hirsh J. An experiment shows that a well-designed report on costs and quality can help consumers choose high-value health care. Health Aff (Millwood). 2012;31(3):560-568.

6. Sinaiko AD, Eastman D, Rosenthal MB. How report cards on physicians, physician groups, and hospitals can have greater impact on consumer choices. Health Aff (Millwood). 2012;31(3):602-611.

7. Harris KM, Buntin MB, The RAND Corporation. Choosing a health care provider: the role of quality information. Research Synthesis Report No. 14 from the Robert Wood Johnson Foundation. California Health Policy Forum website. http://www.cahpf.org/GoDocUserFiles/530.Choosing%20a%20healthcare%20provider.pdf. Published May 2008. Accessed November 20, 2012.

8. Roski J, Kim MG. Current efforts of regional and national performance measurement initiatives around the United States. Am J Med Qual. 2010;25(4):249-254.

9. Christianson JB, Volmar KM, Alexander J, Scanlon DP. A report card on provider report cards: current status of the health care transparency movement. J Gen Intern Med. 2010;25(11):1235-1241.

10. Coleman K, Austin BT, Brach C, Wagner EH. Evidence on the Chronic Care Model in the new millennium. Health Aff (Millwood). 2009;28(1):75-85.

11. Berenson RA, Hammons T, Gans DN, et al. A house is not a home: keeping patients at the center of practice redesign. Health Aff (Millwood). 2008;27(5):1219-1230.

12. Health policy brief: public reporting on quality and costs. Health Affairs website. http://www.healthaffairs.org/healthpolicybriefs/brief.php?brief_id=65. Published March 08, 2012. Accessed June 12, 2013.

13. Physician Compare Fact Sheet 2013 Redesign. CMS website. http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/physician-compare-initiative/Downloads/Physician_Compare_Fact_Sheet_2013_Redesign.pdf. Published May 2013. Accessed June 23, 2013.

14. 2008 update on consumers’ views of patient safety and quality information. Kaiser Family Foundation website. http://kff.org/healthreform/poll-finding/2008-update-on-consumers-views-of-patient-2/. Published September 30, 2008. Accessed July 15, 2013.

15. National survey on consumers’ experiences with patient safety and quality information. Kaiser Family Foundation website. http://www.kff.org/kaiserpolls/pomr111704pkg.cfm. Published October 31, 2004. Accessed July 15, 2013.

16. Fox S, Jones S. The social life of health information. Pew Research Center website. http://www.pewinternet.org/Reports/2009/8-The-Social-Life-of-Health-Information.aspx. Published June 11, 2009. Accessed August 22, 2013.

17. Hanauer DA, Zheng K, Singer DC, Gebremariam A, Davis MM. Public awareness, perception, and use of online physician rating sites. JAMA. 2014;311(7):734-735.

18. Scanlon DP, Beich J, Alexander JA, et al. The Aligning Forces for Quality Initiative: background and evolution from 2005 to 2012. Am J Manag Care. 2012;18(6 suppl):s115-s125.

19. Maeng DD, Martsolf GR, Scanlon DP, Christianson JB. Care coordination for the chronically ill: understanding the patient’s perspective. Health Serv Res. 2012;47(5):1960-1979.

20. Johnson TP, Wislar JS. Response rates and nonresponse errors in surveys. JAMA. 2012;307(17):1805-1806.

21. Bunin GR, Spector LG, Olshan AF, et al. Secular trends in response rates for controls selected by random digit dialing in childhood cancer studies: a report from the Children’s Oncology Group. Am J Epidemiol. 2007;166(1):109-116.

22. Cull WL, O’Connor KG, Sharp S, Tang SF. Response rates and response bias for 50 surveys of pediatricians. Health Serv Res. 2005;40(1):213-226.

23. Martsolf GR, Schofield RE, Johnson DR, Scanlon DP. Editors and researchers beware: calculating response rates in random digit dial health surveys. Health Serv Res. 2013;48(2, pt 1):665-676.

24. Davern M. Nonresponse rates are a problematic indicator of nonresponse bias in survey research. Health Serv Res. 2013;48(3):905-912.

25. Keeter S, Kennedy C, Dimock M, Best J, Craighill P. Gauging the impact of growing nonresponse on estimates from a national RDD Telephone Survey. Public Opinion Quarterly. 2006;70(4):759-779.

26. Holle R, Hochadel M, Reitmeir P, Meisinger C, Wichman HE; KORA Group. Prolonged recruitment efforts in health surveys: effects on response, costs, and potential bias. Epidemiology. 2006;17(6):639-643.

27. Blumberg S, Davis K, Khare M, Martinez M. The effect of survey follow-up on nonresponse bias: Joint Canada/United States Survey of Health, 2002-03. Paper presented at: Annual Meeting of the American Association for Public Opinion Research; May 12-15, 2005; Miami Beach, FL.

28. Davern M, McAlpine D, Beebe TJ, Ziegenfuss J, Rockwood T, Call KT. Are lower response rates hazardous to your health survey? An analysis of three state health surveys. Health Serv Res. 2010;45(5, pt 1):1324-1344.

29. Groves RM, Peytcheva E. Building dynamic survey cost models using survey paradata. Presented at: 62nd Annual American Association for Public Opinion Research Conference; May 17-20, 2007; Anaheim, CA.

30. Groves RM. Nonresponse rates and nonresponse bias in household surveys. Public Opinion Quarterly. 2006;70(5):646-675.

31. Comparing health care quality: a national directory. Robert Wood Johnson Foundation website. http://www.rwjf.org/en/research-publications/find-rwjf-research/2013/09/national-directory.html. Published September 2013. Accessed January 28, 2014.

32. Egede LE. Major depression in individuals with chronic medical disorders: prevalence, correlates and association with health resource utilization, lost productivity and functional disability. Gen Hosp Psychiatry. 2007;29(5):409-416.

33. Hirth RA. Consumer information and competition between nonprofit and for-profit nursing homes. J Health Econ. 1999;18(2):219-240.

34. Mukamel DB, Weimer DL, Mushlin AI. Interpreting market share changes as evidence for effectiveness of quality report cards. Med Care. 2007;45(12):1227-1232.

35. Abraham J, Feldman R, Carlin C. Understanding employee awareness of health care quality information: how can employers benefit? Health Serv Res. 2004;39(6, pt 1):1799-1815.

36. Sick B, Abraham JM. Seek and ye shall find: consumer search for objective health care cost and quality information. Am J Med Qual. 2011;26(6):433-440.

37. Hibbard JH, Berkman N, McCormack LA, Jael E. The impact of a CAHPS report on employee knowledge, beliefs, and decisions. Med Care Res Rev. 2002;59(1):104-116.
