BRIEF COMMUNICATION
Year : 2016  |  Volume : 29  |  Issue : 3  |  Page : 244-249

Assessment of community-based training of medical undergraduates: Development and validation of a competency-based questionnaire


1 Department of Community Medicine, Indira Gandhi Medical College and Research Institute, Puducherry, Tamil Nadu; Department of Operational Research, International Union Against Tuberculosis and Lung Disease (The Union), South-East Asia Office, New Delhi, India
2 Department of Community Medicine, Velammal Medical College Hospital and Research Institute, Madurai, Tamil Nadu, India
3 Department of Preventive and Social Medicine, Jawaharlal Institute of Postgraduate Medical Education and Research; Department of Community Medicine, Pondicherry Institute of Medical Sciences, Puducherry, Tamil Nadu, India
4 Department of Preventive and Social Medicine, Jawaharlal Institute of Postgraduate Medical Education and Research, Puducherry, Tamil Nadu, India
5 Department of Community Medicine, Indira Gandhi Medical College and Research Institute, Puducherry, Tamil Nadu, India

Date of Web Publication: 11-Apr-2017

Correspondence Address:
Hemant Deepak Shewade
Department of Operational Research, International Union Against Tuberculosis and Lung Disease (The Union), South-East Asia Office, C.6 Qutub Institutional Area, New Delhi - 110 016
India

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/1357-6283.204218

  Abstract 

Background: The global shift toward competency-based education and assessment is also applicable to community-based training (CBT) of undergraduate medical students, and a tool is needed to assess competencies related to CBT. This study aimed to develop a tool that uses a competency-based approach to evaluate the CBT of medical undergraduates. Methods: A preliminary draft of the questionnaire was prepared by the investigators based on a conceptual framework. Using the Delphi technique, this draft was further developed by a specialist panel (n = 8) into a self-administered questionnaire. After pretesting with students, it was administered to medical undergraduates (n = 178) who had recently completed Community Medicine. Item analysis and exploratory factor analysis, using principal component analysis, were performed. Reliability was assessed by calculating Cronbach's alpha, convergent validity by correlating the scores with Community Medicine university examination scores, and construct validity by describing the percentage variance explained by the components. Results: The 74-item questionnaire developed through the Delphi technique was further abridged to a 58-item questionnaire. Cronbach's alpha values of the 74- and 58-item questionnaires were 0.96 and 0.95, respectively; convergent validity was 0.07 and 0.09, respectively; and the percentage variance explained by the components was 69.3% and 70.1%, respectively. Agreement between the scores of the two versions was 0.76. Discussion: The authors developed a questionnaire that can be used for competency-based assessment in community-based undergraduate medical education. It is a valuable addition to the existing assessment methods and can guide experts in the need-based design of curriculum and teaching/training methodology.

Keywords: Community-based teaching, community-based training, competency-based, Delphi technique, factor analysis, questionnaire, tool, undergraduate medical education


How to cite this article:
Shewade HD, Jeyashree K, Kalaiselvi S, Palanivel C, Panigrahi KC. Assessment of community-based training of medical undergraduates: Development and validation of a competency-based questionnaire. Educ Health 2016;29:244-9



Background


A community-based training (CBT) program, a vital part of undergraduate medical education (UGME), is an instructional program carried out in the community outside the teaching hospital.[1] In India, CBT is managed by the Department of Community Medicine or Preventive and Social Medicine. CBT was conceived with the understanding that health care is not bound within the ivory tower of tertiary teaching hospitals. There is limited literature available on CBT models in India. The Reorientation of Medical Education scheme, though not as successful as it was conceived to be, is worthy of mention. Further, individual institutions have established their own models for CBT.

UGME programs worldwide have been shifting from the years-in-training approach to a competency-based training approach.[2] The Medical Council of India, which regulates medical education in India, has recommended a shift toward a competency-based approach in its Vision 2015 document.[3] However, no competency-based tools are available for evaluating CBT. We therefore aimed to develop a tool that uses a competency-based approach to evaluate the CBT of medical undergraduates.


Methods


Study setting

We conducted the study in two government-funded medical colleges (teaching hospitals) in Puducherry, India, between June 2013 and January 2014. The medical graduation course, or Bachelor of Medicine and Bachelor of Surgery (MBBS), consists of four and a half years (nine semesters) followed by one year of internship. The subject of Community Medicine is taught from the first year through the 7th semester. While theory sessions are conducted throughout, community-based clinical postings with the Department of Community Medicine begin in the 3rd semester. There are three clinical postings, each of 4 weeks' duration, in rural/urban health training centers. Students also participate in the Family Health and Advisory Program in the 4th and 5th semesters, in which they follow a family (in the community they serve) through weekly home visits. Through CBT, students are exposed to four components of Community Medicine: family medicine, epidemiology, health promotion, and health management.[4]

Questionnaire development and pretesting

Based on a review of competencies covered by CBT programs, we developed a conceptual framework using six core competencies, adapted it to the Indian context, and prepared a preliminary list of major domains with items under each domain.[5] We purposively selected a panel of specialists (n = 8) in Community Medicine and used the Delphi technique to further develop the questionnaire and arrive at a consensus. The specialists were asked to rate each item (on a scale from 0 = poor to 4 = good) under the headings of relevance, sensitivity, specificity, and understandability. The iterative process continued until the predefined consensus point, when 70% of the specialists gave a rating of three or above for each item under relevance, sensitivity, and specificity.
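To make the consensus rule concrete, the following is a minimal Python sketch (not the authors' code) of how one Delphi round could be scored against the 70% threshold for a single criterion such as relevance; the item texts, ratings, and function names are hypothetical.

def item_reaches_consensus(ratings, cutoff=3, required_share=0.70):
    """True when at least 70% of specialists rate the item `cutoff` (3) or above."""
    high = sum(1 for r in ratings if r >= cutoff)
    return high / len(ratings) >= required_share

# Hypothetical relevance ratings (0-4) from an 8-member panel for two items
relevance_round = {
    "elicits a family history during a home visit": [4, 3, 3, 4, 3, 2, 4, 3],  # 7/8 rate >= 3
    "interprets a spot map of disease clustering": [2, 3, 1, 2, 3, 2, 2, 3],   # 3/8 rate >= 3
}

for item, ratings in relevance_round.items():
    verdict = "consensus reached" if item_reaches_consensus(ratings) else "revise for next round"
    print(f"{item}: {verdict}")

In the actual process, an item would have to meet this threshold separately for relevance, sensitivity, and specificity before being retained.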

The questionnaire was then pretested with three students who were not part of the study group. All MBBS students in the two medical colleges who were due to appear for the final examination in Community Medicine in December 2013 (7th semester) completed the questionnaire in January 2014. The Institute Research Committee of Indira Gandhi Medical College and Research Institute, Puducherry, approved the study.

Data entry and analysis

We double entered and validated the data using EpiData software (version 3.1, EpiData Association, Odense, Denmark). We analyzed the data using Stata (version 12.1, StataCorp LP, USA).

We ensured face and content validity through the Delphi process and pretesting with students. We assessed convergent validity using the correlation between questionnaire scores and final Community Medicine university examination scores, setting a positive correlation (intraclass correlation coefficient [ICC] >0.6) as the criterion for acceptable validity. To determine internal consistency, we calculated Cronbach's alpha for the questionnaire overall and for each major domain separately, considering alpha values >0.70 as indicating good internal consistency. We developed an abridged version of the questionnaire after item analysis followed by data reduction. After testing the assumptions for factor analysis, we created a correlation matrix, checked sampling adequacy using the Kaiser-Meyer-Olkin (KMO) statistic (>0.60), and applied Bartlett's test of sphericity (P < 0.05). We performed data reduction through exploratory factor analysis (EFA) using the principal components factor extraction method.[6],[7],[8] We planned to select factors/domains based on an eigenvalue >1.0 (Kaiser method) and a break in the scree plot. We carried out varimax rotation and abridged the scale by retaining items with factor loadings above 0.50. In cases where the inclusion of an item in a domain under EFA did not concur with the Delphi consensus, we went with the latter.
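As a rough illustration of the analysis plan above, the Python sketch below computes Cronbach's alpha, the KMO statistic, Bartlett's test of sphericity, and a varimax-rotated principal-components-style extraction on a simulated respondent-by-item matrix. The authors ran their analysis in Stata; this reconstruction assumes the numpy, pandas, and factor_analyzer packages and uses made-up data, so the printed values are not meaningful in themselves.

import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import (calculate_bartlett_sphericity,
                                              calculate_kmo)

# Simulated stand-in data: 178 respondents x 20 Likert-type items (0-4)
rng = np.random.default_rng(0)
responses = pd.DataFrame(rng.integers(0, 5, size=(178, 20)),
                         columns=[f"item{i + 1}" for i in range(20)])

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

print("Cronbach's alpha:", round(cronbach_alpha(responses), 2))   # >0.70 taken as good

# Assumption checks before factor analysis
chi_square, p_value = calculate_bartlett_sphericity(responses)     # want P < 0.05
_, kmo_overall = calculate_kmo(responses)                           # want > 0.60
print(f"Bartlett chi-square = {chi_square:.1f}, P = {p_value:.4f}; KMO = {kmo_overall:.2f}")

# Exploratory factor analysis: principal-components-style extraction + varimax rotation
fa = FactorAnalyzer(n_factors=6, rotation="varimax", method="principal")
fa.fit(responses)
eigenvalues, _ = fa.get_eigenvalues()
print("Factors with eigenvalue > 1 (Kaiser criterion):", int((eigenvalues > 1).sum()))

# Retain items whose highest absolute loading on any factor exceeds 0.50
loadings = pd.DataFrame(fa.loadings_, index=responses.columns)
retained = loadings[loadings.abs().max(axis=1) > 0.50].index.tolist()
print("Items retained for the abridged scale:", retained)

With real questionnaire data, the scree plot and the per-domain alpha values would also be inspected before deciding how many factors and items to keep.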


Results


The seven-domain, 74-item self-administered questionnaire obtained after the Delphi consensus and student pretesting is available from the authors. A total of 178 students completed the questionnaire. The ICC for convergent validity was 0.07, indicating a lack of correlation. The overall Cronbach's alpha was 0.96, and all domains had good within-domain correlation; the domain-wise Cronbach's alpha values are summarized in [Table 1]. The KMO measure was 0.87, indicating sampling adequacy, and Bartlett's test of sphericity was statistically significant (Chi-square = 9111.9, P < 0.0001).
Table 1: Domain-wise Cronbach's alpha value for the 74-item questionnaire on self-assessed competencies in community-based training of undergraduate medical students, Puducherry, India (2013-2014)



The final questionnaire after data reduction had six domains and 58 items [Appendix 1]. The Cronbach's alpha value and convergent validity (ICC) for the 58-item questionnaire were 0.95 and 0.09, respectively, and construct validity (percentage variance explained by the components) was 70.1%. Agreement between the scores of the 74-item and 58-item questionnaires was 0.76 (ICC). The steps involved in questionnaire development and the number of items at each step are described in detail in [Table 2].
Table 2: Steps followed in developing the questionnaire on self-assessed competencies in community-based training of undergraduate medical students, Puducherry, India (2013-2014)

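As a side note on the version-agreement statistic reported above, the snippet below shows one way an intraclass correlation between full and abridged questionnaire scores could be computed in Python using the pingouin package. This is an illustration under stated assumptions, not the authors' analysis: the student scores are invented and are expressed as percentages so that the two versions are on a comparable scale.

import pandas as pd
import pingouin as pg

# Invented percentage scores for five students on both questionnaire versions
scores = pd.DataFrame({
    "student": ["s1", "s2", "s3", "s4", "s5"] * 2,
    "version": ["74-item"] * 5 + ["58-item"] * 5,
    "score":   [71.0, 62.5, 81.1, 66.9, 76.0,    # 74-item totals, rescaled to %
                69.5, 61.2, 82.4, 64.8, 74.1],   # 58-item totals, rescaled to %
})

# Each questionnaire version is treated as a "rater" of the same student
icc = pg.intraclass_corr(data=scores, targets="student",
                         raters="version", ratings="score")
print(icc[["Type", "Description", "ICC"]])  # choose the ICC form that matches the study design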



Discussion


This is the first study, from India or elsewhere, to develop a tool for competency-based evaluation of CBT in UGME. Overall, we used a conceptual framework and a Delphi process, aided by statistical methods, to develop the questionnaire. We developed a reliable and valid (face/content/construct validity) 58-item questionnaire using the Delphi technique followed by pretesting, item analysis, and data reduction. Our study did not include an assessment of concurrent validity because the literature search did not reveal any other tool measuring the same construct.

The scores obtained with our tool did not correlate with the university examination scores (convergent validity). This might be because the university examination is theory (knowledge) based, whereas our tool is competency based. However, the university scores were the closest available measure based on a construct similar to that of our tool.

There are a number of implications of our study. First, since the 58-item questionnaire is shorter and correlated well with the 74-item questionnaire, it can be used for competency-based evaluation of CBT. Second, the Delphi panel was diverse, with sufficient local representation; therefore, the competencies identified in the 74-item questionnaire may provide the basis for developing valid curricula for CBT. Third, of the 74 competencies in the 74-item questionnaire, 41 (55%) pertained to “Family Medicine” or “Public Health Administration at primary health care level.” This calls for changes in teaching hospitals' infrastructure and in the work culture of Community Medicine faculty, so that their primary role focuses on family medicine and community health administration.[9] However, there could be challenges in implementing competency-based CBT, and UGME in general, such as curriculum design, faculty training, student assessment, and systematic institutional change.[10]

Among the strengths of our study is the addition to the pool of tools available for assessment in medical education. Although other competency classifications are available for medical education in general, we used a classification specific to CBT [5] to develop the conceptual framework. Delphi panelists had the opportunity to modify their judgment based on feedback without being overtly influenced by others in the group. On the other hand, there were some study limitations. First, this was a self-assessment. Second, students from one of the two medical colleges (n = 70) were excluded from the analysis of convergent validity because their consent forms were signed but not stored along with the completed questionnaires; since the questionnaire carried no identifier information, it was not possible to link a questionnaire score with the university examination score of a student from this college. Third, we did not assess test-retest reliability. The validity, reliability, and exploratory factor analysis of the 58-item questionnaire were not studied by administering it to a different study group, which is a potential area for future research. Future research in other settings may also include confirmatory factor analysis to verify the factor structure of the items.


Conclusion


We developed a 58-item questionnaire that can be used for competency-based assessment in community-based undergraduate medical education. This tool can be viewed as an opportunity to enter the realm of competency-based assessment in community-based UGME in India. It is a valuable addition to the existing assessment methods in India and can guide experts in the need-based design of curriculum and teaching/training methodology.

Acknowledgment

We acknowledge the contribution of the Delphi specialist panel, who took time from their busy schedules to contribute toward the development of the questionnaire: Prof. Amarjeet Singh, Department of Community Medicine, PGIMER, Chandigarh, India; Prof. Amol Dongre, Department of Community Medicine, SMVMCH, Puducherry, India; Dr. Diwakar Mohan, Public Health Specialist, Johns Hopkins School of Public Health, Baltimore, USA; Prof. Gautam Roy, Dr. Mahalakshmy T, and Dr. Palanivel C, Department of Preventive and Social Medicine, JIPMER, Puducherry, India; Dr. Himanshu Negandhi, Indian Institute of Public Health, New Delhi, India; and Dr. Star Pala, Department of Community Medicine, NEIGRIHMS, Shillong, India. We also acknowledge the support of the students who helped us during pretesting of the questionnaire. We thank the Department for International Development (DFID), UK, for funding the Global Operational Research Fellowship Programme at the International Union Against Tuberculosis and Lung Disease (The Union), Paris, France, in which Hemant Deepak Shewade works as an operational research fellow.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.

 
References

1. Deutsch S, Noble J, editors. Community-based Teaching: A Guide to Developing Education Programs for Medical Students and Residents in the Practitioner's Office. Philadelphia: ACP Press; 1997.
2. Long DM. Competency-based residency training: The next advance in graduate medical education. Acad Med 2000;75:1178-83.
3. Medical Council of India. Vision 2015. New Delhi, India: Medical Council of India; 2011.
4. Kumar R. Development of community medicine sub-specialities. Indian J Community Med 2005;30:43.
5. Ladhani Z, Scherpbier AJ, Stevens FC. Competencies for undergraduate community-based education for the health professions – A systematic review. Med Teach 2012;34:733-43.
6. Williams B, Brown T, Onsman A. Exploratory factor analysis: A five-step guide for novices. Australas J Paramed 2012;8.
7. Abdi H, Williams LJ. Principal component analysis. Wiley Interdiscip Rev Comput Stat 2010;2:433-59.
8. Crawford IM, Lomas RA. Factor analysis – A tool for data reduction. Eur J Mark 1980;14:414-21.
9. Shewade HD, Jeyashree K, Chinnakali P. Reviving community medicine in India: The need to perform our primary role. Int J Med Public Health 2014;4:29-32.
10. Harris P, Snell L, Talbot M, Harden RM. Competency-based medical education: Implications for undergraduate programs. Med Teach 2010;32:646-50.



 
 