
Measurement Properties of Questionnaires Measuring Continuity of Care: A Systematic Review

  • Annemarie A. Uijen ,

    A.Uijen@elg.umcn.nl

    Affiliation Radboud University Nijmegen Medical Centre, Department of Primary and Community Care, Nijmegen, The Netherlands

  • Claire W. Heinst,

    Affiliation Radboud University Nijmegen Medical Centre, Department of Primary and Community Care, Nijmegen, The Netherlands

  • Francois G. Schellevis,

    Affiliations Netherlands Institute for Health Services Research (NIVEL), Utrecht, The Netherlands, Department of General Practice and the EMGO Institute for Health and Care Research, VU University Medical Centre, Amsterdam, The Netherlands

  • Wil J.H.M. van den Bosch,

    Affiliation Radboud University Nijmegen Medical Centre, Department of Primary and Community Care, Nijmegen, The Netherlands

  • Floris A. van de Laar,

    Affiliation Radboud University Nijmegen Medical Centre, Department of Primary and Community Care, Nijmegen, The Netherlands

  • Caroline B. Terwee,

    Affiliation Department of Epidemiology and Biostatistics and the EMGO Institute for Health and Care Research, VU University Medical Centre, Amsterdam, The Netherlands

  • Henk J. Schers

    Affiliation Radboud University Nijmegen Medical Centre, Department of Primary and Community Care, Nijmegen, The Netherlands

Abstract

Background

Continuity of care is widely acknowledged as a core value in family medicine. In this systematic review, we aimed to identify the instruments measuring continuity of care and to assess the quality of their measurement properties.

Methods

We performed a systematic review using the PubMed, Embase and PsycINFO databases, with an extensive search strategy including ‘continuity of care’, ‘coordination of care’, ‘integration of care’, ‘patient centered care’, ‘case management’ and their linguistic variations. We searched from 1995 to October 2011 and included articles describing the development and/or evaluation of the measurement properties of instruments measuring one or more dimensions of continuity of care: (1) care from the same provider who knows and follows the patient (personal continuity), (2) communication and cooperation between care providers in one care setting (team continuity), and (3) communication and cooperation between care providers in different care settings (cross-boundary continuity). We assessed the methodological quality of the measurement properties of each instrument using the COSMIN checklist.

Results

We included 24 articles describing the development and/or evaluation of 21 instruments. Ten instruments measured all three dimensions of continuity of care. Instruments were developed for different groups of patients or providers. For most instruments, three or four of the six measurement properties were assessed (mostly internal consistency, content validity, structural validity and construct validity). Six instruments scored positive on the quality of at least three of six measurement properties.

Conclusions

Most included instruments have problems with either the number or quality of their assessed measurement properties or the ability to measure all three dimensions of continuity of care. Based on the results of this review, we recommend the use of one of the four most promising instruments, depending on the target population: the Diabetes Continuity of Care Questionnaire, Alberta Continuity of Services Scale-Mental Health, Heart Continuity of Care Questionnaire, and Nijmegen Continuity Questionnaire.

Introduction

Continuity of care is an important characteristic of good health care. [1]–[4] In the literature, continuity often refers to the extent to which care is provided by the same person (personal continuity). Personal continuity is relatively easy to measure, as it can be expressed as an index based on duration of provider relationship, density of visits, dispersion of providers or sequence of providers [5].

Since the 1990s, however, continuity of care has increasingly been seen as a multidimensional concept. [6] Besides personal continuity, it also includes the seamless provision of care by a group of professionals in the medical home (team continuity), and continuity between different care settings, e.g. general practice and specialist care (cross-boundary continuity). [6]–[8] As more and more care providers become involved in individual patient care, the communication and cooperation aspects of care become increasingly important.

Measuring continuity of care in its multidimensional meaning requires a robust measurement instrument. Reviews have shown that many instruments have been developed over time. [9]–[13] These reviews, however, did not include recent publications and focused solely on one concept. As we found that other concepts, such as coordination and integration of care, show great overlap with continuity of care [6], the limited continuity scope seems too narrow for a complete overview of instruments. Moreover, existing reviews have not systematically appraised the measurement properties of the instruments found. Therefore, we performed a systematic review to identify the instruments measuring continuity of care, to assess the dimensions of continuity in those instruments, and to evaluate their measurement properties.

Methods

Search Strategy

We searched the computerized bibliographic databases PubMed, Embase and PsycINFO from 1995 to October 2011. We chose 1995 as the starting point because the multidimensional concept only emerged from then on. [6] It would therefore be very unlikely that relevant instruments developed before 1995 used multidimensional definitions of continuity of care. We used the keywords ‘continuity of care’, ‘coordination of care’, ‘integration of care’, ‘patient centered care’, ‘case management’ and their linguistic variations in combination with a search filter developed for finding studies on measurement properties of measurement instruments (see Appendix S1). [14] We restricted our search to English- or Dutch-language articles. Reference lists were screened to identify additional relevant studies.

Selection Criteria

We included all articles describing the development and/or evaluation of the measurement properties of an instrument measuring - what we define in this review as - continuity of care [6]–[8]: (1) care from the same provider who knows and follows the patient (personal continuity), (2) communication and cooperation between care providers in one care setting (team continuity), and (3) communication and cooperation between care providers in different care settings (cross-boundary continuity). Instruments measuring only one or two of these dimensions were also included. We excluded instruments based on a single item or index, as well as instruments that also measure other concepts besides these three dimensions of continuity of care.

Two reviewers (AU and CH) independently screened titles, abstracts and reference lists of the studies retrieved by the literature search. If there was any doubt as to whether an article met the inclusion criteria, the reviewers reached consensus. Two independent reviewers (AU and CH) then assessed the full-text articles against the inclusion and exclusion criteria. If necessary, a third independent reviewer (HS) was consulted.

Data Extraction

Data extraction and assessment of measurement properties and methodological quality were performed independently by two reviewers (AU and CH). In case of disagreement, a third reviewer (CT) made the decision. One of the identified measurement instruments was developed and validated by AU [15]; [16], so CH and CT scored this instrument. All instruments were questionnaires with predefined answer categories. The following data were extracted:

  1. Dimensions of continuity of care. For each questionnaire we identified which dimensions of continuity of care (personal, team and/or cross-boundary continuity) are measured.
  2. Measurement properties. We describe the measurement properties of each questionnaire divided over three domains, according to the COSMIN taxonomy [17]: (1) reliability (including internal consistency, reliability, measurement error), (2) validity (including content validity, structural validity and hypothesis testing (construct validity)), and (3) responsiveness. These measurement properties are defined in Table 1. In addition, interpretability is also described. Interpretability is the degree to which one can assign qualitative meaning to quantitative scores. [17] This means that investigators should provide information about clinically meaningful differences in scores between subgroups, floor and ceiling effects, and the minimal important change. [18] Interpretability is not a measurement property, but an important characteristic of a measurement instrument [17].
  3. Quality assessment. Assessment of the methodological quality of the included studies was carried out using the COSMIN checklist. [19] This checklist consists of nine boxes with methodological standards for how each measurement property should be assessed. [20] Each item was rated on a 4-point scale (poor, fair, good or excellent). An overall score for the methodological quality of a study was determined by taking the lowest rating of any of the items in the nine boxes.
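The "lowest rating counts" rule above can be sketched as a few lines of code. This is an illustrative sketch only (the function name and data representation are our own, not part of COSMIN): item ratings within a box are ordered poor < fair < good < excellent, and the overall box score is the minimum under that ordering.

```python
# Illustrative sketch (not the authors' code) of the COSMIN
# "lowest rating counts" rule for one box of the checklist.
RATING_ORDER = ["poor", "fair", "good", "excellent"]  # worst to best

def overall_box_score(item_ratings):
    """Return the lowest rating among the items of one COSMIN box."""
    return min(item_ratings, key=RATING_ORDER.index)

# A box with mostly excellent items but one poor item scores "poor" overall.
print(overall_box_score(["excellent", "good", "excellent", "poor"]))  # -> poor
```

This makes explicit that a single weak item determines the overall methodological quality score of a study, however strong the other items are.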

Best Evidence Synthesis – Levels of Evidence

Some studies evaluated the same measurement properties for a specific questionnaire. To determine the overall quality of each measurement property established in different studies we combined the results of the different studies for each questionnaire, taking into account the number of studies, the methodological quality of the studies and the direction (positive or negative) and consistency of their results.

The overall rating for a measurement property could fall into 8 different categories (+++, ++, +, +/−, ?, −, −− or −−−) [21]; [22] (Table 2). For example, when two studies of the same questionnaire show good methodological quality in evaluating ‘reliability’, the overall rating would be either ‘+++’ or ‘−−−’ (Table 2), depending on the result (positive or negative) of the measurement property, for which we used criteria based on Terwee et al. [23] (Table 1). These criteria were derived from existing guidelines and consensus within the research group of Terwee et al.

Table 2. Levels of evidence for the overall quality of the measurement property [22].

https://doi.org/10.1371/journal.pone.0042256.t002

In this case, when both studies showed intraclass correlation coefficient (ICC) <0.70, the overall rating would be ‘−−−’. This means that there is strong evidence (multiple studies of good methodological quality) for low levels of reliability. However, when there is only one study of fair methodological quality showing ICC>0.70, the overall rating would be ‘+’. When one study shows ICC>0.70, while another study shows ICC<0.70, the overall rating would be ‘+/−’. When there are only studies of poor methodological quality, the overall rating would be ‘?’, independent of the result of the measurement property.
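The worked examples above can be condensed into a small decision procedure. The sketch below is an assumed simplification of the Table 2 criteria (the function name, the three-symbol evidence levels and the boolean "positive result" encoding, e.g. ICC > 0.70 for reliability, are our own illustration, not the published scoring rules in full):

```python
# Illustrative sketch of the levels-of-evidence synthesis described above.
def synthesize(studies):
    """studies: list of (quality, positive_result) tuples, where quality is
    "poor", "fair" or "good" and positive_result is a bool (e.g. ICC > 0.70)."""
    usable = [s for s in studies if s[0] != "poor"]
    if not usable:
        return "?"                       # only poor-quality evidence
    results = {positive for _, positive in usable}
    if len(results) == 2:
        return "+/-"                     # conflicting findings
    positive = results.pop()
    n_good = sum(1 for quality, _ in usable if quality == "good")
    if len(usable) > 1 and n_good == len(usable):
        level = 3                        # strong: multiple good-quality studies
    elif n_good >= 1:
        level = 2                        # moderate evidence
    else:
        level = 1                        # limited: fair-quality studies only
    return ("+" if positive else "-") * level

print(synthesize([("good", False), ("good", False)]))  # -> ---
print(synthesize([("fair", True)]))                    # -> +
print(synthesize([("good", True), ("fair", False)]))   # -> +/-
print(synthesize([("poor", True)]))                    # -> ?
```

The four calls reproduce the four cases in the paragraph above: strong negative evidence, limited positive evidence, conflicting evidence, and unknown.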

Results

The search strategy yielded 4749 articles from PubMed, 2366 from Embase and 349 from PsycINFO (Figure 1). From these searches, we included 23 articles in this review. We included one additional article, not yet published at the time of the search, which describes the validation of an included measurement instrument. [16] Reference tracking did not result in additional articles. In total, we included 24 articles describing the development and/or evaluation of 21 questionnaires measuring continuity of care [15]; [16]; [24]–[45].

Figure 1. Search strategy resulting in 4749 articles from PubMed, 2366 articles from Embase and 349 articles from PsycInfo.

https://doi.org/10.1371/journal.pone.0042256.g001

Table 3 presents an overview of the identified questionnaires. Seventeen questionnaires measured continuity of care from the perspective of the patient [15]; [16]; [24]–[27]; [29]–[35]; [37]–[41]; [43]–[45], and four from the perspective of the care provider/program director [28]; [36]; [42]. Of the instruments measuring continuity from the perspective of the patient, three were developed for diabetic patients [29]; [33]; [44], three for patients with a mental illness [24]; [30]; [37]; [41]; [43], two for patients with cancer [38]; [45], two for previously hospitalised patients [26]; [35], two for patients with complex and chronic care needs [32]; [40], one for patients with heart failure or atrial fibrillation [34]; [39], one for users of welfare services [25], one for patients visiting their family practice physician [31], one for patients living at home [27] and one for patients in general regardless of morbidity or care setting [15]; [16].

Ten instruments measured aspects of personal, team and cross-boundary continuity [15]; [16]; [24]; [26]; [30]–[35]; [37]; [39]; [41]; [44], while eleven instruments measured only one or two of these dimensions [25]; [27]–[29]; [36]; [38]; [40]; [42]; [43]; [45].

Most questionnaires were originally developed in English, except for the Dutch questionnaires of Casparie et al. [27] and Uijen et al. [15]; [16], the Chinese questionnaire of Wei et al. [44], and the Swedish questionnaire of Ahgren et al. [25].

Table 4 presents a description of the study populations. Eight of the instruments were developed and/or evaluated solely in primary care populations [27]; [31]–[33]; [40]; [41]; [43]; [44], eight solely in secondary care populations [26]; [34]–[36]; [38]; [39]; [42]; [45] and five in both primary and secondary care populations [15]; [16]; [24]; [25]; [28]–[30]; [37].

The methodological quality of the studies is presented in Table 5 for each questionnaire and measurement property. Most studies assessed the internal consistency, content validity, structural validity and construct validity of the instruments, although frequently the methodological quality of the studies regarding these measurement properties was fair or poor. The reliability and measurement error were only assessed in a minority of the studies and the methodological quality regarding these measurement properties was often fair or poor. Cross-cultural validity, criterion validity and responsiveness were not assessed in any of the studies.

Table 5. Methodological quality of each article per measurement property and instrument (COSMIN Checklist).

https://doi.org/10.1371/journal.pone.0042256.t005

The synthesis of results per questionnaire and the accompanying level of evidence are presented in Table 6. Six instruments (CPCI [31], CCI [26], CPCQ [40], HCCQ [34]; [39], CCCQ [45] and NCQ [15]; [16]) scored positive on the quality of at least three measurement properties. Information regarding the interpretability of the instruments was missing in most studies.

Table 6. Quality of measurement properties and the interpretability per instrument.

https://doi.org/10.1371/journal.pone.0042256.t006

Discussion

In this systematic review we found 21 instruments measuring - what we define as - continuity of care. Six of these instruments we would probably not have found had we focused our review solely on continuity of care instead of also taking into account related concepts such as coordination and integration. [25]; [28]; [31]; [36]; [40]; [45] The CPCQ and CCCQ aim to measure ‘coordination of care’ [40]; [45], the CSI and the instrument of Ahgren et al. measure ‘integration of care’ [25]; [28], the CRP-PIM measures ‘communication among care providers’ [36] and the CPCI measures ‘attributes of primary care’ [31].

Most included instruments have problems with either the ability to measure all three dimensions of continuity of care or the number or quality of their assessed measurement properties.

Only about half of the questionnaires measured all three dimensions of continuity of care (personal, team and cross-boundary continuity). For most instruments, three or four measurement properties were assessed (mostly internal consistency, content validity, structural validity and construct validity). Only six instruments (CPCI [31], CCI [26], CPCQ [40], HCCQ [34]; [39], CCCQ [45] and NCQ [15]; [16]) scored positive on the quality of at least three measurement properties. These findings do not mean that the other questionnaires are of poor quality, but imply that studies of high methodological quality are needed to properly assess their measurement properties.

Strengths and Limitations

One of the strengths of this review is that our search not only focused on the concept of ‘continuity of care’, but also took into account the related concepts ‘coordination of care’, ‘integration of care’, ‘case management’ and ‘patient centered care’. This resulted in the inclusion of instruments which measure the same aspects of care but are defined in different ways.

To our knowledge, this is the first review on measurement instruments for continuity of care that systematically appraised the measurement properties of the instruments found. This allows us to compare the instruments on the quality of their measurement properties.

We used a robust and standardized method to assess the quality of the measurement properties, which contributes considerably to the continuity knowledge base.

A limitation of this study is that we searched from 1995 onwards. Measurement instruments developed before this time were not included in our review. However, because of the changing definitions of continuity over time, we consider it very unlikely that we missed relevant instruments [6].

Another limitation is that the raters had to make a large number of judgements on each study and each measurement instrument. Although the COSMIN checklist [19] and the quality criteria for the measurement properties [23] are defined as objectively as possible, different raters could come to different judgements. For this reason, two reviewers assessed the measurement properties and methodological quality of the studies, and in case of disagreement a third reviewer was consulted.

Comparison with Existing Literature

Previous reviews have identified many instruments measuring continuity of care or one of its related concepts, such as patient centered care or integrated care. [9]–[13] Most reviews limited their search to only one concept. We found only one review, identifying measures of integrated care, that broadened its search to concepts such as continuity of care, care coordination and seamless care, but this review did not systematically appraise the measurement properties of the instruments. [13] Most instruments included in previous reviews were not included in our review, for several reasons. Some studies did not describe the development or evaluation of the measurement properties at all, some did not measure - what we define in this review as - continuity of care, and some measured a much broader concept than continuity of care (e.g. all key areas of primary care, including accessibility and thoroughness of physical examination).

We found no review assessing the quality of the measurement properties of the included instruments. Hudon et al. systematically assessed the quality of the included articles, i.e. whether all relevant information such as characteristics of the study population was described. [10] However, the quality of the measurement properties was not assessed.

Implications for Practice and Research

The decision about which instrument to use will depend on the characteristics of the study population, the ability and desire to measure all three dimensions of continuity, the population in which the instrument was developed and/or validated, the quality of the measurement properties and the interpretability of the instrument.

For a comprehensive measurement of continuity of care in diabetic patients, we recommend the DCCQ [44], as the two other questionnaires for diabetic patients (DCCS [29] and ECC-DM [33]) either do not measure all three dimensions of continuity of care or show lower quality of their measurement properties and interpretability.

For patients with a mental illness, we recommend the ACSS-MH [24]; [30]; [37]. The two other questionnaires available for patients with a mental illness (CONNECT [43] and CONTINU-UM [41]) are only validated in primary care, do not measure all three dimensions of continuity of care or show lower quality of their measurement properties and interpretability.

For patients with heart failure or atrial fibrillation, we only found the HCCQ [34]; [39]. As this instrument measures personal, team and cross-boundary continuity and shows good quality of its measurement properties, it seems a suitable questionnaire for this patient group.

For patients with a chronic illness, irrespective of the type of illness, we found the CPCI [31], VCC [27], CPCQ [40], the instrument of Gulliford et al. [32] and the NCQ [15]; [16]. For a comprehensive measurement of continuity of care, the NCQ is the only questionnaire that has been validated in both primary and secondary care, and it shows the highest quality of measurement properties and interpretability.

The instruments developed to measure continuity for patients with cancer (CCCQ [45] and the instrument of King et al. [38]), patients previously hospitalized (CCI [26] and PCCQ [35]), and users of welfare services (instrument of Ahgren et al. [25]) all have problems regarding the limited number of dimensions of continuity measured, the limited quality of the measurement properties or the low interpretability of the instrument. The instruments developed to measure continuity of care from the perspective of the provider (CCPS-I [42], CCPS-P [42], CRP-PIM [36] and CSI [28]) need to be used with caution because of the limited quality of the measurement properties and interpretability.

For future research, we believe it is especially important to further evaluate the measurement properties and interpretability of the promising DCCQ, ACSS-MH, HCCQ and NCQ. For none of these instruments has responsiveness been evaluated, although this is an important characteristic of a questionnaire, especially when it is used to measure change in continuity of care. As the DCCQ and NCQ were originally developed in Chinese and Dutch, respectively, their cross-cultural validity needs to be evaluated.

Supporting Information

Author Contributions

Conceived and designed the experiments: AAU FGS WJHMB FAL CBT HJS. Performed the experiments: AAU CWH CBT. Analyzed the data: AAU CWH CBT HJS. Wrote the paper: AAU CWH FGS WJHMB FAL CBT HJS.

References

  1. Adair CE, McDougall GM, Mitton CR, Joyce AS, Wild TC, et al. (2005) Continuity of care and health outcomes among persons with severe mental illness. Psychiatr Serv 56: 1061–1069.
  2. Hanninen J, Takala J, Keinanen-Kiukaanniemi S (2001) Good continuity of care may improve quality of life in type 2 diabetes. Diabetes Res Clin Pract 51: 21–27.
  3. Moore C, Wisnivesky J, Williams S, McGinn T (2003) Medical errors related to discontinuity of care from an inpatient to an outpatient setting. J Gen Intern Med 18: 646–651.
  4. Stange KC, Ferrer RL (2009) The paradox of primary care. Ann Fam Med 7: 293–299.
  5. Jee SH, Cabana MD (2006) Indices for continuity of care: a systematic review of the literature. Med Care Res Rev 63: 158–188.
  6. Uijen AA, Schers HJ, Schellevis FG, van den Bosch WJ (2012) How unique is continuity of care? A review of continuity and related concepts. Fam Pract 29: 264–271.
  7. Haggerty JL, Reid RJ, Freeman GK, Starfield BH, Adair CE, McKendry R (2003) Continuity of care: a multidisciplinary review. BMJ 327: 1219–1221.
  8. Reid R, Haggerty J, McKendry R (2002) Defusing the confusion: concepts and measures of continuity of health care. University of British Columbia, Canada: Prepared for the Canadian Health Services Research Foundation, the Canadian Institute for Health Information and the Advisory Committee on Health Services of the Federal/Provincial/Territorial Deputy Ministers of Health. Canadian Health Services Research Website. Available: http://www.chsrf.ca/Migrated/PDF/ResearchReports/CommissionedResearch/cr_contcare_e.pdf. Accessed 2012 Jul 10.
  9. Adair CE, McDougall GM, Beckie A, Joyce A, Mitton C, Wild CT, Gordon A, Costigan N (2003) History and measurement of continuity of care in mental health services and evidence of its role in outcomes. Psychiatr Serv 54: 1351–1356.
  10. Hudon C, Fortin M, Haggerty JL, Lambert M, Poitras ME (2011) Measuring patients’ perceptions of patient-centered care: a systematic review of tools for family medicine. Ann Fam Med 9: 155–164.
  11. Lawrence M, Kinn S (2011) Defining and measuring patient-centred care: an example from a mixed-methods systematic review of the stroke literature. Health Expect. doi: 10.1111/j.1369-7625.2011.00683.x. [Epub ahead of print].
  12. Mead N, Bower P (2000) Patient-centredness: a conceptual framework and review of the empirical literature. Soc Sci Med 51: 1087–1110.
  13. Strandberg-Larsen M, Krasnik A (2009) Measurement of integrated healthcare delivery: a systematic review of methods and future research directions. Int J Integr Care 9: e01.
  14. Terwee CB, Jansma EP, Riphagen II, de Vet HC (2009) Development of a methodological PubMed search filter for finding studies on measurement properties of measurement instruments. Qual Life Res 18: 1115–1123.
  15. Uijen AA, Schellevis FG, van den Bosch WJ, Mokkink HG, van Weel C, Schers HJ (2011) Nijmegen Continuity Questionnaire: development and testing of a questionnaire that measures continuity of care. J Clin Epidemiol 64: 1391–1399.
  16. Uijen AA, Schers HJ, Schellevis FG, Mokkink HGA, van Weel C, van den Bosch WJHM (2012) Measuring continuity of care: psychometric properties of the Nijmegen Continuity Questionnaire. Br J Gen Pract 62: e949–e957.
  17. Mokkink LB, Terwee CB, Patrick DL, Alonso J, Stratford PW, Knol DL, Bouter LM, de Vet HC (2010) The COSMIN study reached international consensus on taxonomy, terminology, and definitions of measurement properties for health-related patient-reported outcomes. J Clin Epidemiol 63: 737–745.
  18. Mokkink LB, Terwee CB, Knol DL, Stratford PW, Alonso J, Patrick DL, Bouter LM, de Vet HC (2010) The COSMIN checklist for evaluating the methodological quality of studies on measurement properties: a clarification of its content. BMC Med Res Methodol 10: 22.
  19. Mokkink LB, Terwee CB, Patrick DL, Alonso J, Stratford PW (2010) The COSMIN checklist for assessing the methodological quality of studies on measurement properties of health status measurement instruments: an international Delphi study. Qual Life Res 19: 539–549.
  20. Terwee CB, Mokkink LB, Knol DL, Ostelo RW, Bouter LM (2011) Rating the methodological quality in systematic reviews of studies on measurement properties: a scoring system for the COSMIN checklist. Qual Life Res.
  21. Furlan AD, Pennick V, Bombardier C, van Tulder M (2009) 2009 updated method guidelines for systematic reviews in the Cochrane Back Review Group. Spine (Phila Pa 1976) 34: 1929–1941.
  22. van Tulder M, Furlan A, Bombardier C, Bouter L (2003) Updated method guidelines for systematic reviews in the Cochrane Collaboration Back Review Group. Spine (Phila Pa 1976) 28: 1290–1299.
  23. Terwee CB, Bot SD, de Boer MR, van der Windt DA, Knol DL, Dekker J, et al. (2007) Quality criteria were proposed for measurement properties of health status questionnaires. J Clin Epidemiol 60: 34–42.
  24. Adair CE, Wild TC, Joyce A, McDougall G, Gordon A (2004) Continuity of mental health services study of Alberta: a research program on continuity of mental health care. University of Calgary, Canada.
  25. Ahgren B, Axelsson SB, Axelsson R (2009) Evaluating intersectoral collaboration: a model for assessment by service users. Int J Integr Care 9: e03.
  26. Bull MJ, Luo D, Maruyama GM (2000) Measuring continuity of elders’ posthospital care. J Nurs Meas 8: 41–60.
  27. Casparie AF, Foets M, Raaijmakers MF, de Bakker DH, Schneider MJ, et al. (1998) Onderzoeksprogramma Kwaliteit van Zorg: vragenlijst continuïteit van zorg vanuit cliëntperspectief VCC: handleiding en vragenlijsten [Quality of Care research programme: questionnaire on continuity of care from the client perspective (VCC): manual and questionnaires]. Utrecht, The Netherlands: NIVEL.
  28. Dobrow MJ, Paszat L, Golden B, Brown AD, Holowaty E (2009) Measuring integration of cancer services to support performance improvement: the CSI survey. Healthc Policy 5: 35–53.
  29. Dolovich LR, Nair KM, Ciliska DK, Lee HN, Birch S, Gafni A, Hunt DL (2004) The Diabetes Continuity of Care Scale: the development and initial evaluation of a questionnaire that measures continuity of care from the patient perspective. Health Soc Care Community 12: 475–487.
  30. Durbin J, Goering P, Streiner DL, Pink G (2004) Continuity of care: validation of a new self-report measure for individuals using mental health services. J Behav Health Serv Res 31: 279–296.
  31. Flocke SA (1997) Measuring attributes of primary care: development of a new instrument. J Fam Pract 45: 64–74.
  32. Gulliford M, Cowie L, Morgan M (2011) Relational and management continuity survey in patients with multiple long-term conditions. J Health Serv Res Policy 16: 67–74.
  33. Gulliford MC, Naithani S, Morgan M (2006) Measuring continuity of care in diabetes mellitus: an experience-based measure. Ann Fam Med 4: 548–555.
  34. Hadjistavropoulos HD, Biem HJ, Kowalyk KM (2004) Measurement of continuity of care in cardiac patients: reliability and validity of an in-person questionnaire. Can J Cardiol 20: 883–891.
  35. Hadjistavropoulos H, Biem H, Sharpe D, Bourgault-Fagnou M, Janzen J (2008) Patient perceptions of hospital discharge: reliability and validity of a Patient Continuity of Care Questionnaire. Int J Qual Health Care 20: 314–323.
  36. Hess BJ, Lynn LA, Holmboe ES, Lipner RS (2009) Toward better care coordination through improved communication with referring physicians. Acad Med 84: S109–S112.
  37. Joyce AS, Adair CE, Wild TC, McDougall GM, Gordon A, Costigan N, Pasmeny G (2010) Continuity of care: validation of a self-report measure to assess client perceptions of mental health service delivery. Community Ment Health J 46: 192–208.
  38. King M, Jones L, Richardson A, Murad S, Irving A, Aslett H (2008) The relationship between patients’ experiences of continuity of cancer care and health outcomes: a mixed methods study. Br J Cancer 98: 529–536.
  39. Kowalyk KM, Hadjistavropoulos HD, Biem HJ (2004) Measuring continuity of care for cardiac patients: development of a patient self-report questionnaire. Can J Cardiol 20: 205–212.
  40. McGuiness C, Sibthorpe B (2003) Development and initial validation of a measure of coordination of health care. Int J Qual Health Care 15: 309–318.
  41. Rose D, Sweeney A, Leese M, Clement S, Jones IR (2009) Developing a user-generated measure of continuity of care: brief report. Acta Psychiatr Scand 119: 320–324.
  42. Schaefer JA, Cronkite R, Ingudomnukul E (2004) Assessing continuity of care practices in substance use disorder treatment programs. J Stud Alcohol 65: 513–520.
  43. Ware NC, Dickey B, Tugenberg T, McHorney CA (2003) CONNECT: a measure of continuity of care in mental health services. Ment Health Serv Res 5: 209–221.
  44. Wei X, Barnsley J, Zakus D, Cockerill R, Glazier R, Sun X (2008) Assessing continuity of care in a community diabetes program: initial questionnaire development and validation. J Clin Epidemiol 61: 925–931.
  45. Young JM, Walsh J, Butow PN, Solomon MJ, Shaw J (2011) Measuring cancer care coordination: development and validation of a questionnaire for patients. BMC Cancer 11: 298.