ORIGINAL ARTICLE
J Edu Health Promot 2020,  9:27

Contextualization and psychometrics of interprofessional collaboration checklist in Iranian community health-care setting


1 Department of Medical Education, Tehran University of Medical Sciences, Tehran, Iran; Department of Clinical Science and Education, Karolinska Institute, Soder Hospital, Stockholm, Sweden; Department of National Agency for Strategic Research in Medical Education, Tehran, Iran
2 Department of Medical Education; Department of Community Medicine, Tehran University of Medical Sciences, Tehran, Iran
3 Department of Medical Education, Tehran University of Medical Sciences, Tehran, Iran

Date of Submission: 22-Jul-2019
Date of Acceptance: 01-Sep-2019
Date of Web Publication: 28-Feb-2020

Correspondence Address:
MS. Maryam Karbasi Motlagh
Department of Medical Education, Tehran University of Medical Sciences, Tehran
Iran

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/jehp.jehp_427_19

  Abstract 

INTRODUCTION: Assessment of interprofessional collaboration (IPC) in the community health-care setting has usually been neglected owing to the lack of standard tools and assessors. In the present study, the IPC checklist extracted from the CanMEDS collaborator toolkit for teaching and assessing the collaborator role was contextualized for Iranian community health care.
MATERIALS AND METHODS: An instrument for IPC assessment was extracted from the CanMEDS toolkit. Using Chavez's toolkit, face and content validity were examined through two Delphi rounds by 12 experts of TUMS, and quantitative content validity was expressed as the content validity index (CVI) and content validity ratio (CVR). Construct validity was studied by confirmatory factor analysis in LISREL. To check reliability, Cronbach's alpha was calculated, and another 12 experts completed the checklist after watching a standard video about IPC, in a test-retest process with a 2-week interval.
RESULTS: Face and qualitative content validity were confirmed using the Delphi method. CVI and CVR were calculated as 0.61 and 0.86, respectively. In the factor analysis, χ2/df and RMSEA were 1.363 and 0.036, and CFI, IFI, GFI, and AGFI were all >0.7; hence, construct validity was confirmed. Cronbach's alpha was 0.953, indicating internal consistency, and a test-retest coefficient of 0.918 confirmed reliability.
CONCLUSION: As an assessment tool for evaluating IPC in the community health setting, the CanMEDS-based checklist is not only valid and reliable in the Iranian context but also easy for respondents to use, owing to its reasonable number of items.

Keywords: Collaboration, community, contextualization, interprofessional, interprofessional collaboration, psychometrics


How to cite this article:
Shirazi M, Shariati M, Zarghi N, Karbasi Motlagh M. Contextualization and psychometrics of interprofessional collaboration checklist in Iranian community health-care setting. J Edu Health Promot 2020;9:27

How to cite this URL:
Shirazi M, Shariati M, Zarghi N, Karbasi Motlagh M. Contextualization and psychometrics of interprofessional collaboration checklist in Iranian community health-care setting. J Edu Health Promot [serial online] 2020 [cited 2020 Jul 8];9:27. Available from: http://www.jehp.net/text.asp?2020/9/1/27/279795


  Introduction


Testing interprofessional collaboration (IPC) means assessing how different health-care professionals collaborate to provide quality care to the client by sharing knowledge and working together with other professions/learners, clients/patients, family members, and communities.[1],[2],[3] Working interprofessionally becomes possible when it is taught and applied in practice; unfortunately, it has been neglected in health professions curricula.[4] Today, hierarchical systems resulting from isolated educational disciplines leave health-care settings deficient in IPC knowledge and experience, that is, in learning to work with each other interprofessionally. Moreover, patient safety may be at risk when effective collaboration and communication between professions are lacking.[1],[4],[5] It has been suggested that interprofessional education (IPE), as the basis of IPC, should be introduced from the start of undergraduate health professions curricula.[4] Although a few curricula have achieved IPE and IPC through effective teaching and testing with valid and reliable tools and assessors, testing has generally been neglected.[6]

Testing IPC is as crucial as teaching it across health-care fields, whether in clinical or community health-care settings. Researchers have considered it far more in the clinical setting than in the community health-care setting,[4],[7],[8] even though they also emphasize moving IPC into the community setting, where different professions likewise have to collaborate. According to Reeves (2000), community-based interprofessional education not only brings student satisfaction but also improves students' communication skills. Furthermore, interprofessional education in community settings can help develop collaboration and mutual understanding between health-care team members. Health-care providers and stakeholders have also reported better interaction and collaboration with both clients and the health-care team, which leads to higher satisfaction.[9] Although community-based medical education covers both IPE and IPC, it has usually concentrated on teaching rather than assessment; moreover, assessment of IPC with valid and reliable tools and assessors has not been carried out in the community setting.[6],[10]

To assess IPC in the community setting, both a standard rater/assessor and a valid and reliable instrument are necessary. Several instruments have been developed for testing IPC in different fields; however, because of the context-bound nature of IPC, i.e., its dependence on the practice environment, they should be contextualized.[10] Some scholars have developed tools for reflecting on experiences in the community health field as self-reports; however, such tools cannot test performance by an external evaluator using a standardized checklist.[11],[12]

For assessing IPC performance in the community health setting, the CanMEDS collaborator toolkit for teaching and assessing the collaborator role was selected as the framework of the present study. It is not only clear and concise but also assesses the necessary competencies in four domains: roles and responsibilities, team leadership, conflict management, and team functioning. It also assesses the collaboration competencies collectively, so individual competencies are not regarded in isolation when using this framework. The objectives of the CanMEDS framework can be applied as teaching tools, assessment tools, and resource tools.[10] Although it is a standard tool, it should be contextualized when applied in any new field. It is also noteworthy that this tool is applicable in both clinical and community health-care settings, and that the instrument developed through this study is the first checklist extracted from CanMEDS to be applied for IPC assessment in the community health-care setting. Some researchers have found IPC to be a challenging and complicated concept in qualitative studies; nevertheless, some international bodies have tried to prepare a framework for its teaching and assessment.

The present study aimed to check the psychometric properties of the IPC checklist extracted from the CanMEDS collaborator toolkit for teaching and assessing the collaborator role in the Iranian community health context.


  Materials and Methods


This is a descriptive-analytic study confirming the psychometric properties of the IPC checklist. The IPC checklist extracted from the CanMEDS collaborator toolkit for teaching and assessing the collaborator role[10] was applied for testing IPC. It was contextualized and validated, after obtaining permission from the developers, from February 2017 to June 2018 at Tehran University of Medical Sciences in the health-care setting. The checklist included 18 items testing four domains of team functioning, team leadership, roles and responsibilities, and conflict management, scaled on a 4-point Likert-like scale: not applicable = 1, less than expected = 2, expected = 3, and observed more than expected = 4. A total of 24 people participated in this study, including general physicians, MSc nurses in education, critical care, and management, BSc midwives, and sports medicine specialists. Convenience (accessible) sampling was chosen because of the limited number of experts in the IPC field. All participants were not only familiar with IPC concepts but also worked and did research in this field.
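To make the structure of the contextualized checklist concrete, the following minimal Python sketch lays out the four domains and the 4-point scale described above. It is illustrative only: the per-domain item counts and the simple total-score convention are assumptions, since the individual CanMEDS-derived items are not reproduced in this article.

```python
# Minimal sketch of the checklist layout described above, for illustration only.
# The four domain names and the 4-point scale follow the text; the per-domain
# item counts and the scoring convention are placeholders, not the actual
# CanMEDS-derived items.

LIKERT = {
    1: "not applicable",
    2: "less than expected",
    3: "expected",
    4: "observed more than expected",
}

DOMAIN_ITEM_COUNTS = {
    "team functioning": 5,            # placeholder split of the 18 items
    "team leadership": 4,
    "roles and responsibilities": 5,
    "conflict management": 4,
}
assert sum(DOMAIN_ITEM_COUNTS.values()) == 18


def total_score(ratings):
    """Sum the 4-point ratings given by one rater across all domains.

    `ratings` maps each domain name to a list of item ratings (1-4).
    """
    assert all(r in LIKERT for items in ratings.values() for r in items)
    return sum(sum(items) for items in ratings.values())
```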

The contextualization process was conducted based on the modified form of the toolkit on translating and adapting instruments by Chavez.[13] To check validity, the instrument was first translated separately by two bilingual expert translators familiar with IPC concepts, and the two translations were integrated into one Persian copy. An expert panel then checked the accuracy of the translation. Two Delphi rounds[14] were carried out to assess face validity and qualitative and quantitative content validity, i.e., the content validity index (CVI) and content validity ratio (CVR) using the Lawshe method.[15] In the first Delphi round, the translated instrument was sent by e-mail to the experts: twelve people familiar with IPC, including general physicians, MSc nurses in education, critical care, and management, BSc midwives, and sports medicine specialists. Face and content validity were examined by considering the experts' comments on whether the statements could be understood well (qualitative) and by scoring the items for CVI and CVR (quantitative). In the second Delphi round, the final copy was sent back to the experts for final confirmation; the final copy was an 18-statement tool. For assessing reliability, participants watched a 30-min valid and reliable film[5] about IPC; after watching this simulated IPC video, twelve experts completed the checklist. According to the literature, the minimum interval for test-retest is 2 weeks;[13],[16] hence, after a 15-day interval, the participants filled out the checklist again after watching the same video, and the test-retest coefficient was calculated to confirm reliability. The instrument was then back-translated by bilingual translators different from those in the previous step, and the final instrument was compatible with the original one. The data were analyzed in SPSS version 16 (SPSS Inc., Chicago, IL, USA) using Cronbach's alpha for internal consistency, and LISREL 8.5 was used for the confirmatory factor analysis (with KMO and Bartlett tests) to confirm construct validity. Ethical considerations followed the rules and regulations of TUMS: informed consent was obtained from participants, and the TUMS ethics committee approved the study (IRB code: 9221486002).
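As a rough illustration of the quantitative content validity step, the sketch below computes Lawshe's content validity ratio per item and a simple content validity index across items. The coding of panel responses (counts of experts rating an item essential/relevant) is an assumption about how the rating forms were scored, and the example numbers are invented.

```python
def cvr(n_essential, n_experts):
    """Lawshe's content validity ratio: CVR = (n_e - N/2) / (N/2)."""
    half = n_experts / 2
    return (n_essential - half) / half


def cvi(relevance_counts, n_experts):
    """Average proportion of experts rating each item as relevant (a simple scale-level CVI)."""
    return sum(c / n_experts for c in relevance_counts) / len(relevance_counts)


# With the panel size used here (N = 12), an item marked essential by
# 10 of 12 experts gives CVR = 0.67, above Lawshe's critical value of
# 0.56 for 12 panelists (invented counts, for illustration only).
print(round(cvr(10, 12), 2))               # 0.67
print(round(cvi([10, 11, 9, 12], 12), 2))  # 0.88 for four hypothetical items
```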


  Results


Twenty-four participants (12 for the Delphi rounds and 12 for testing reliability) were included in this study. Demographic data of the participants are shown in [Table 1]. The response rate was 100%. The results are presented in the following sections:
Table 1: Demographic characteristics of subjects


Validity

For content validity, two Delphi rounds with a 2-3-week interval yielded agreement higher than 90%; hence, qualitative content validity, i.e., agreement of the items with the aims and objectives, was confirmed. For quantitative content validity, CVI and CVR were calculated as 0.61 and 0.86, respectively. According to Lawshe, with 12 panelists, content validity is confirmed when the result is >0.56. The total CVI of the instrument was 0.61; therefore, quantitative content validity was also confirmed. Face validity of the developed instrument was assessed as well. [Table 2] shows the quantitative content validity (CVI and CVR) of the developed instrument.
Table 2: Content validity (content validity index and content validity ratio)


Factor analysis was carried out to confirm construct validity. The data were collected from 24 experts who were familiar with IPC concepts and who were also working and doing research on IPC. Owing to the scarcity of people in this newly emerging field, only 24 sets of data were available for the factor analysis. Confirmatory factor analysis was carried out using LISREL. According to the literature,[17] if χ2/df < 3, RMSEA ≈ 0.1, and the CFI, IFI, GFI, and AGFI indices are >0.7, construct validity can be confirmed. [Table 3] presents the results of the confirmatory factor analysis, by which the construct validity of the IPC instrument was confirmed.
Table 3: Confirmatory factor analysis
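The decision rule cited above can be written down as a short check. In the sketch below, χ2/df and RMSEA are the values reported in this study, while the CFI, IFI, GFI, and AGFI values are illustrative placeholders (the text states only that they exceeded 0.7); in practice these indices would be read from the LISREL output.

```python
# Fit indices: chi2/df and RMSEA as reported; the remaining values are
# placeholders consistent with the ">0.7" statement in the text.
fit = {
    "chi2_df": 1.363,
    "RMSEA": 0.036,
    "CFI": 0.95,
    "IFI": 0.95,
    "GFI": 0.80,
    "AGFI": 0.75,
}


def construct_validity_ok(f):
    """Apply the criteria cited from the literature[17]."""
    return (
        f["chi2_df"] < 3
        and f["RMSEA"] <= 0.1
        and all(f[k] > 0.7 for k in ("CFI", "IFI", "GFI", "AGFI"))
    )


print(construct_validity_ok(fit))  # True for the reported/assumed indices
```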


Reliability

To assess reliability, Cronbach's alpha was calculated as 0.953, which is higher than 0.7; that is, the IPC instrument is reliable for application in the Iranian context and shows good internal consistency. Test-retest analysis was also performed to confirm reliability: twelve participants completed the checklist after watching the standard video on two occasions with a 2-week interval, and the correlation coefficient was 0.918.
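For readers who wish to reproduce these two coefficients on their own data, the sketch below computes Cronbach's alpha from a raters-by-items matrix and a test-retest correlation from two administrations. The use of Pearson's correlation on total scores is an assumption (the article reports only a "correlation coefficient"), and the data are random placeholders with the study's dimensions (12 raters, 18 items).

```python
import numpy as np
from scipy import stats


def cronbach_alpha(scores):
    """Cronbach's alpha for a raters-by-items matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)."""
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)


def test_retest(first, second):
    """Pearson correlation between total scores of two administrations
    (an assumption; the article does not name the correlation type)."""
    r, _ = stats.pearsonr(first.sum(axis=1), second.sum(axis=1))
    return r


# Placeholder data with the study's dimensions: 12 raters x 18 items, ratings 1-4.
rng = np.random.default_rng(0)
first = rng.integers(1, 5, size=(12, 18)).astype(float)
second = np.clip(first + rng.integers(-1, 2, size=first.shape), 1, 4)

print(cronbach_alpha(first), test_retest(first, second))
```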


  Discussion


Testing IPC is a challenging issue in health-care settings; it depends on both the assessors/raters and a valid and reliable instrument. This study aimed to contextualize the checklist extracted from the CanMEDS collaborator toolkit for teaching and assessing the collaborator role. The results are discussed in two sections, validity and reliability. The subjects included in the study were people involved with IPC who were working and doing research in this field. Interestingly, their fields of study differed, so a multidisciplinary team carried out this research. In the present study, face, content, and construct validity were confirmed, and the checklist proved reliable as well; hence, it is a valid and reliable tool for the assessment of IPC in the Iranian context.

Face and content validity were confirmed both qualitatively, following two Delphi rounds, and quantitatively, by calculating CVI and CVR. Construct validity, both exploratory and confirmatory, was confirmed using factor analysis, which yielded four latent areas that all test IPC collectively.

According to the literature, the majority of IPC assessment tools have been applied in settings such as clinical, inpatient, outpatient, emergency medicine, and education,[18] whereas the community health-care setting has received little attention.

On the other hand, the theoretical framework of the present study is the CanMEDS competencies, which can be applied as both a teaching and an assessment tool,[10] whereas other tools use other IPC competency frameworks through which only assessment of IPC can be achieved. Therefore, the CanMEDS collaborator toolkit for teaching and assessing the collaborator role seems to be a better toolkit for testing IPC, because its items fully match the objectives to be taught. In developing the checklist, all objectives were considered as assessment items; hence, the assessment can be designed based on the aims and objectives. We also designed our scenario for IPC assessment based on the CanMEDS competencies, and participants then completed the extracted checklists. Consequently, all elements are aligned with each other, i.e., all domains of IPC, including team functioning, team leadership, roles and responsibilities, and conflict management, could be evaluated as far as possible.

A systematic review by Shrader et al. reviewed quantitative tools for assessing IPC. The IPC assessment tools were divided into four categories based on the Kirkpatrick assessment levels, and 19 of the 36 instruments were classified in the third level, i.e., behavioral change.[18] As we sought to assess performance and behavioral change in the health-care context, we focused on the tools allocated to the third level of the Kirkpatrick pyramid.[19] These tools are discussed here in comparison with the IPC checklist extracted from the CanMEDS collaborator toolkit for teaching and assessing the collaborator role, which was contextualized in the present study.

All of these tools can be applied in all health-care settings. Eight were self-report evaluations, five assessed IPC through an external evaluator, and the rest evaluated the culture and climate of IPC. Except for the self-reports, the instruments were completed either by teams or by individuals, and they included 5-59 items. Interestingly, two tools, for novice and expert raters, developed through a Ph.D. dissertation, were completed after watching a live simulated scenario, as in our study.[20]

Five instruments used an external evaluator to rate individuals as part of an interprofessional team. External observers/raters can be faculty members, preceptors working in the health-care setting, standardized patients, peers, and all other people involved in 360-degree evaluation.[5],[20],[21],[22],[23] The CanMEDS IPC checklist developed in the current study can likewise be used by external evaluators to assess how well team members perform interprofessionally.

Comparing Shrader et al. with the present study, five IPC tools have been completed by an external evaluator. According to D'Amour et al., assessing IPC requires both a standard instrument and a standard rater/assessor; in this study, we contextualized an instrument by which an external assessor can evaluate IPC.[2] The number of items in an instrument can also influence respondents when completing the form.[24] It is preferable to choose a reasonable number of items for the relevant context, provided validity and reliability can be supported. In the present study, 18 items yielded good validity and reliability coefficients.

Keshmiri et al.[5] applied the interprofessional collaborator assessment rubric (ICAR), developed and contextualized by Curran et al.,[6] for testing IPC, whereas we applied the CanMEDS collaborator toolkit for teaching and assessing the collaborator role. Like Keshmiri et al.,[5] the present study applied the modified toolkit on translating and adapting instruments.[13] They also used two Delphi rounds for qualitative content validity; however, they did not report quantitative values of CVI and CVR. We convened an expert panel to check the translation and the compatibility of the Persian copy with the English original before running the Delphi rounds, and our Delphi rounds covered both qualitative and quantitative content validity.

Similarly, both studies used a standard simulated video to check reliability, owing to the lack of an IPC context in Iranian settings. They reported a Cronbach's alpha of 0.71 and a test-retest coefficient of 0.76, whereas in this study internal consistency was 0.953 and the test-retest coefficient was 0.918 (P = 0.000). The CanMEDS IPC checklist therefore seems statistically more valid and reliable in the Iranian context, in both clinical and community health-care settings. The ICAR, with 31 items in 6 domains, seems too long for community health-care settings; some of its domains have been merged in the CanMEDS IPC checklist, making it briefer and more user-friendly, especially in the community health-care setting.

Owing to the lack of experts in the field of IPC in Iran, the sample size of the study was small. As there is no established IPC setting in Iran, a standardized simulated video was used to check the reliability of the instrument.


  Conclusion


According to the results, the CanMEDS IPC checklist is a valid and reliable tool for testing IPC in the Iranian context, especially in the community health-care setting. As the CanMEDS collaborator toolkit for teaching and assessing the collaborator role can be applied to both teaching and testing, it is recommended that this framework be applied for both teaching and testing in clinical and community health-care settings alike.

Acknowledgment

The authors offer their special thanks to all participants for their valuable help in conducting this study. They also appreciate NASR for supporting the study financially.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.

 
  References

1. WHO Study Group on Multiprofessional Education of Health Personnel. Learning Together to Work Together for Health: Report of a WHO Study Group on Multiprofessional Education of Health Personnel: The Team Approach. World Health Organization; 1988.
2. D'Amour D, Ferrada-Videla M, San Martin Rodriguez L, Beaulieu MD. The conceptual basis for interprofessional collaboration: Core concepts and theoretical frameworks. J Interprof Care 2005;19 Suppl 1:116-31.
3. Interprofessional Education Collaborative Expert Panel. Core Competencies for Interprofessional Collaborative Practice: Report of an Expert Panel. Interprofessional Education Collaborative Expert Panel; 2011.
4. Khan NS, Shahnaz SI, Gomathi KG. Currently available tools and teaching strategies for the interprofessional education of students in health professions: Literature review. Sultan Qaboos Univ Med J 2016;16:e277-85.
5. Keshmiri F, Ponzer S, Sohrabpour A, Farahmand S, Shahi F, Bagheri-Hariri S, et al. Contextualization and validation of the interprofessional collaborator assessment rubric (ICAR) through simulation: Pilot investigation. Med J Islam Repub Iran 2016;30:403.
6. Curran V, Hollett A, Casimiro LM, McCarthy P, Banfield V, Hall P, et al. Development and validation of the interprofessional collaborator assessment rubric (ICAR). J Interprof Care 2011;25:339-44.
7. Park S, Khan NF, Hampshire M, Knox R, Malpass A, Thomas J, et al. A BEME systematic review of UK undergraduate medical education in the general practice setting: BEME guide no 32. Med Teach 2015;37:611-30.
8. Hammick M, Freeth D, Koppel I, Reeves S, Barr H. A best evidence systematic review of interprofessional education: BEME guide no 9. Med Teach 2007;29:735-51.
9. Reeves S. Community-based interprofessional education for medical, nursing and dental students. Health Soc Care Community 2000;8:269-76.
10. Glover Takahashi S, Dawn M, Richardson D. The CanMEDS Toolkit for Teaching and Assessing the Collaborator Role. Ottawa: The Royal College of Physicians and Surgeons of Canada; 2012.
11. Cappiello JD, Joy J, Smith P, Orgren RA. The SEARCH project: Acquainting students in the health professions with interprofessional care. J Allied Health 2015;44:91-5.
12. Arndell C, Proffitt B, Disco M, Clithero A. Street outreach and shelter care elective for senior health professional students: An interprofessional educational model for addressing the needs of vulnerable populations. Educ Health (Abingdon) 2014;27:99-102.
13. Chávez LM, Canino G. Toolkit on Translating and Adapting Instruments; 2005. Available from: http://www.hsri.org/files/uploads/publications/PN54_Translating_and_Adapting.pdf.
14. Tomasik T. Reliability and validity of the Delphi method in guideline development for family physicians. Qual Prim Care 2010;18:317-26.
15. Lawshe CH. A quantitative approach to content validity. Pers Psychol 1975;28:563-75.
16. Taymoori P, Moeini B, Lubans D, Bharami M. Development and psychometric testing of the adolescent healthy lifestyle questionnaire. J Educ Health Promot 2012;1:20.
17. Bryant FB, Yarnold PR. Principal-components analysis and exploratory and confirmatory factor analysis; 1995.
18. Shrader S, Farland MZ, Danielson J, Sicat B, Umland EM. A systematic review of assessment tools measuring interprofessional education outcomes relevant to pharmacy education. Am J Pharm Educ 2017;81:119.
19. Barr H, Koppel I, Reeves S, Hammick M, Freeth DS. Effective Interprofessional Education: Argument, Assumption and Evidence (Promoting Partnership for Health). John Wiley & Sons; 2008.
20. Chiu CJ. Development and Validation of Performance Assessment Tools for Interprofessional Communication and Teamwork (PACT); 2014.
21. Careau E, Vincent C, Swaine BR. Observed interprofessional collaboration (OIPC) during interdisciplinary team meetings: Development and validation of a tool in a rehabilitation setting. J Res Interprof Pract Educ 2014;4.
22. Thistlethwaite J, Dallest K, Moran M, Dunston R, Roberts C, Eley D, et al. Introducing the individual Teamwork Observation and Feedback Tool (iTOFT): Development and description of a new interprofessional teamwork measure. J Interprof Care 2016;30:526-8.
23. Curran V, Casimiro L, Banfield V, Hall P, Gierman T, Lackie K, et al. Interprofessional Collaborator Assessment Rubric. Academic Health Council of Canada; 2010.
24. Artino AR Jr., La Rochelle JS, Dezee KJ, Gehlbach H. Developing questionnaires for educational research: AMEE guide no 87. Med Teach 2014;36:463-74.



 
 