ORIGINAL RESEARCH PAPER
Year: 2010 | Volume: 23 | Issue: 3 | Page: 425
On-line Capacity-Building Program on "Analysis of Data" for Medical Educators in the South Asia Region: A Qualitative Exploration of our Experience
AR Dongre1, TV Chacko2, S Banu2, S Bhandary3, RA Sahasrabudhe4, S Philip5, PR Deshmukh6
1 Sri Manakula Vinayagar Medical College and Hospital, Puducherry, India
2 PSG-FAIMER Regional Institute, Coimbatore, India
3 Patan Academy of Health Sciences, Lalitpur, Nepal
4 Bharati Vidyapeeth University Medical College, Pune, Maharashtra, India
5 T.D.Medical College, Alappuzha, India
6 Mahatma Gandhi Institute of Medical Sciences, Sewagram, India
Date of Submission: 24-Nov-2009
Date of Acceptance: 31-Jul-2010
Date of Web Publication: 30-Nov-2010
A R Dongre
Department of Community Medicine, Sri Manakula Vinayagar Medical College and Hospital, Madagedipeth, Puducherry - 605 107
Source of Support: None, Conflict of Interest: None
Background and Objective: In medical education, using the World Wide Web is a new approach for building the capacity of faculty. However, there is little information available on medical education researchers' needs and their collective learning outcomes in such on-line environments. Hence, the present study attempted: 1) to identify needs for capacity-building of fellows in a faculty development program on the topic of data analysis; and 2) to describe, analyze and understand the collective learning outcomes of the fellows during this need-based on-line session.
Material and Methods: The present research is based on quantitative (on-line survey for needs assessment) and qualitative (contents of e-mails exchanged in listserv discussion) data which were generated during the October 2009 Mentoring and Learning (M-L) Web discussion on the topic of data analysis. The data sources were shared e-mail responses during the process of planning and executing the M-L Web discussion. Content analysis was undertaken and the categories of discussion were presented as a simple non-hierarchical typology which represents the collective learning of the project fellows.
Results: We identified the types of learning needs on the topic 'Analysis of Data' to be addressed for faculty development in the field of education research. This need-based M-L Web discussion could then facilitate collective learning on such topics as basic concepts in statistics, tests of significance, Likert scale analysis, bivariate correlation, simple regression analysis, and content analysis of qualitative data.
Conclusions: Steps like identifying the learning needs for an on-line M-L Web discussion, addressing the immediate needs of learners and creating a flexible reflective learning environment on the M-L Web facilitated the collective learning of the fellows on the topic of data analysis. Our outcomes can be useful in the design of on-line pedagogical strategies for supporting research in medical education.
Keywords: Needs assessment, faculty development, on-line discussion
How to cite this article:
Dongre A R, Chacko T V, Banu S, Bhandary S, Sahasrabudhe R A, Philip S, Deshmukh P R. On-line Capacity-Building Program on "Analysis of Data" for Medical Educators in the South Asia Region: A Qualitative Exploration of our Experience. Educ Health 2010;23:425.
Background and Introduction
The World Wide Web is increasingly being used for delivering on-line learning programs [1]. Background on the faculty development program called 'Mentoring and Learning Web' (M-L Web) of the Regional Institute Fellows of the Foundation for Advancement of International Medical Education and Research (FAIMER) has been reported earlier [2]. It is an on-line component of the faculty leadership development program of The FAIMER Institute, which begins with a residential session at the regional center followed by an 11-month on-line M-L Web discussion, conducted through a listserv, on topics of interest stated by fellows. This series of month-long intersession activities is moderated by fellows and regional faculty members [3]. As part of this, and as a requirement of their capacity-building efforts, first-year fellows carry out a Curriculum Innovation Project (CIP), conduct data analysis and communicate their successes at their home institutions.
Due to a lack of sound statistical background, most medical faculty taking up educational research are prone to errors in data analysis and its interpretation [4]. Hence, to ensure timely technical help in the analysis of CIP-generated data, an M-L Web discussion on the topic 'Analysis of Data' was undertaken on the listserv of the PSG-FAIMER Regional Institute (PSG-FRI), Coimbatore, India during October 2009 [5]. As the M-L Web is a new approach, little information was available on participants' collective learning outcomes in this particular on-line environment. The relationship between educational design and its outcomes in networked learning is seen as a matter of concern [6]. Hence, the present study attempted to identify learning needs for capacity-building of fellows in data analysis and to describe, analyze and understand the collective learning outcomes of the fellows and their different learning styles during this need-based on-line session. This paper adds to the body of knowledge of how and what is experienced and understood in need-based sessions on data analysis, and it attempts to bring collective learning in virtual space into focus.
Material and Methods
The present research is based on quantitative (an on-line survey for needs assessment) and qualitative (the contents of e-mails exchanged in the listserv discussion) data generated during shared conversations on the M-L Web during October 2009 on the topic of data analysis. All the fellows had met face-to-face during an onsite session, where they were oriented to the process of listserv discussion.
Learning needs assessment for the M-L Web discussion: To identify the capacity-building needs of the fellows with regard to data analysis for their CIPs, a team of four discussion moderators conferred on-line one month before the scheduled discussion and reviewed all sixteen CIPs of first-year fellows. They examined the objectives, study designs and evaluation plans of the CIPs, and drafted the learning objectives of the session. To further refine these learning objectives so that they reflected the fellows' immediate learning needs, an on-line survey was mailed on the listserv, using a structured questionnaire covering quantitative data analysis and five open-ended questions on qualitative data analysis. Finally, based on the survey findings and the review of the CIP proposals, the M-L Web discussion objectives for the month were refined and finalized. It was also decided to remain focused on the fellows' current data analysis needs and queries, so as to facilitate successful implementation of their CIPs. A total of 75 e-mails were exchanged between members of the planning group, of which two-thirds occurred before the M-L Web discussion month.
Discussion process: The shared listserv discussion on 'Analysis of Data' was based on adult learning principles [7]. It was facilitated during October 2009, with the month divided into weekly focus themes, each led by a moderator. The first half of the month was devoted to capacity-building in quantitative methods and the second half to qualitative methods. In each half, the first week was devoted to an overview of the methods, given by the first-year fellow moderators. The second week was devoted to the specific learning needs of the fellows, focusing on data analysis issues arising from the implementation of their CIPs. Moderation of this week was led by second-year fellows who had completed their CIPs the previous year and were professionally trained in qualitative and quantitative data analysis. Fellows were encouraged to reflect, express free and spontaneous responses, raise questions and share their problems and experiences. Moderators gave prompt feedback on whether fellows had understood a concept and on whether they agreed with fellows' interpretations. This activity was supervised by the two faculty members of PSG-FRI who originated the process.
Data collection, analysis and reporting: We stored the text messages of all e-mails exchanged during the discussion period. The e-mail responses satisfy the criterion of 'low inference descriptors', as participants do their own transcribing [8]. During listserv communication, fellows contributed pieces of information to the data analysis discussion. In the first two weeks, 93 e-mails were exchanged, generating 24 pages of information (9,739 words) on quantitative data analysis. In the latter two weeks, 61 e-mails were exchanged, resulting in 19 pages of information (7,810 words) on qualitative analysis.
The responses to the initial needs assessment survey were also quantified. For e-mails relating to the discussion, a manual content analysis was done to identify and retain the participants' collective learning [9]. Descriptive categories were formed from the contents of the e-mails to characterize the fellows' understanding. The e-mail responses had characteristics of both speech and writing. Hence, the units of analysis were sentences and paragraphs, identified using topic sentences as the beginning of units and concluding, summary and transition statements (where used) as their completion. The categories of discussion on analysis of quantitative and qualitative data are presented as a simple non-hierarchical typology which represents the fellows' conceptions of the phenomenon [10]. The contents of personal e-mails, e-mail attachments of published papers and other social mails exchanged during the discussion period were excluded from the analysis. Italic text appearing in the analysis signifies direct quotes. The first author performed the content analysis, and the second, third and last authors reviewed it. Any disagreements were resolved through discussion. We followed the 'Consolidated Criteria for Reporting Qualitative Research' (COREQ) guidelines while reporting the present work [11].
Written consent was obtained from the fellows at the beginning of the FAIMER fellowship to document and analyze listserv discussions for themes, contents and the number of submissions associated with particular themes or contents. The results of the analysis of the e-mail discussions were also member-checked with the fellows for correctness at the time of the end-of-week summaries, as well as at the time of submission of the 'Scholarly Report', which is a requirement of the fellowship.
Results
There were 30 fellows (14 from year 2008 and 16 from year 2009) from different states of India, and from Nepal and Malaysia. All were medical educators from different disciplines of medicine with more than three years of teaching experience. Four fellows (two each from years 2008 and 2009) were the moderators, under the supervision of two PSG-FRI faculty members. Overall, the discussion was participatory, factual and reflective in nature.
Capacity-building needs identified for the M-L Web discussion: During the planning stage, after reviewing the CIPs of the 2009 fellows, the major topics that emerged for inclusion in the discussions were: basic concepts of statistics (normal distribution, variables); tests of significance; Likert scale analysis; and content analysis of qualitative data. As found in the pre-discussion on-line survey, among fellows who attempted the competency-based exercises, only 4 (22.2%) were competent to test normality assumptions, 11 (52.4%) could do Likert scale analysis and 12 (60%) could apply appropriate tests of significance (Table 1).
Table 1: Needs assessment for on-line capacity-building in quantitative data analysis
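To make the Likert-scale competency assessed above concrete, the following is a minimal sketch of a basic Likert summary. The items, responses and scale are invented for illustration and are not taken from the study's survey; because Likert responses are ordinal, the median and mode are reported per item rather than the mean.

```python
# Illustrative sketch with invented data (not the study's survey):
# summarizing Likert items by median and mode, the usual descriptive
# choices for ordinal responses on a 1-5 agreement scale.
from statistics import median, mode

# Hypothetical feedback items; 1 = strongly disagree ... 5 = strongly agree.
items = {
    "The session met my learning needs": [4, 5, 4, 3, 5, 4, 4],
    "I can now choose a test of significance": [3, 4, 3, 2, 4, 3, 5],
}

# Per-item ordinal summary.
summary = {
    item: {"median": median(scores), "mode": mode(scores)}
    for item, scores in items.items()
}

for item, stats in summary.items():
    print(f"{item}: median={stats['median']}, mode={stats['mode']}")
```

Reporting median and mode side by side also makes skewed response distributions visible, which a mean on an ordinal scale would mask.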
Twenty of 26 fellows responded to open-ended questions on qualitative data analysis, as summarized in Table 2. These responses pointed to the need for discussion of such topics as: methods in qualitative research; negative attitudes toward, and lack of faith in, qualitative research; apprehension about its subjective nature; and lack of experience in handling such datasets and in reporting qualitative research. Of the fellows, 19 had been using qualitative methods in their work settings, mainly in the form of responses to open-ended questions. Sixteen fellows reported not having received any formal training in these methods. Relatedly, 12 fellows were not comfortable with the reporting of qualitative data. Barriers in this regard included: the time-consuming nature of the analysis; its subjective nature; and the perceived 'superior' nature of quantitative research.
Table 2: Reasons for using qualitative research methods and barriers in its applications
Areas of collective learning during the discussion on quantitative data analysis: Seventeen fellows (70.8%) participated in this discussion, with at least one posting each. Four categories emerged from the analysis of the M-L Web discussion. Descriptive statistics consisted of: best practices for data collection and processing; basic concepts like the normal distribution; and development of a codebook before data entry. Inferential statistics entailed: analysis of variance; factor analysis of Likert scale items; bivariate correlation; and simple regression analysis. Queries to moderators pertaining to CIPs included issues related to: types of variables; selection of tests of significance; and analysis using Likert scales. The fourth category consisted of exercises based on an actual dataset. These exercises were related to: the development of a codebook; creating a derived variable; analyzing a contingency table; selecting a test of significance; and performing simple regression analysis.
Areas of collective learning during the discussion on qualitative data analysis: Twelve fellows (50%) participated in this discussion, with at least one posting each. Again, four categories emerged. First, types of qualitative research methods, such as participatory tools and techniques, in-depth techniques (e.g. focus group discussion) and systematic techniques, were discussed. The second category of discussion was the analysis of qualitative data, in which the steps of content analysis and the various methods of ensuring the trustworthiness and validity of data were discussed. During this discussion, PowerPoint presentations on transcription, coding and reporting of qualitative research were shared. Third, queries about CIPs related to the procedure for and reporting of focus group discussions, the analysis of open-ended questions, and the analysis of free list and pile sort data. As part of this, a discussion on 'ethical issues in qualitative research' took place. The fourth category of discussion was manual versus computer-aided content analysis and the scope of software in qualitative data analysis.
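The contrast drawn above between manual and computer-aided content analysis can be illustrated with a minimal sketch: free-text responses are assigned to descriptive categories through a keyword codebook. The codebook, categories and responses below are invented for illustration and are not the study's actual coding scheme.

```python
# Minimal, hypothetical sketch of computer-aided content analysis:
# free-text responses are tallied against descriptive categories using
# a keyword codebook. All categories, keywords and responses are invented.
from collections import Counter

codebook = {
    "methods": ["focus group", "fgd", "interview", "free list"],
    "analysis": ["coding", "transcription", "content analysis"],
    "barriers": ["time-consuming", "subjective", "inferior"],
}

responses = [
    "I plan to run a focus group and analyse it by content analysis.",
    "Qualitative work feels subjective and time-consuming to me.",
    "Transcription and coding of interviews is my main difficulty.",
]

# Count how many responses touch each category (at most once per response).
tally = Counter()
for text in responses:
    lowered = text.lower()
    for category, keywords in codebook.items():
        if any(word in lowered for word in keywords):
            tally[category] += 1

print(dict(tally))
```

A keyword tally of this kind only approximates the judgment involved in manual coding, which is precisely the manual-versus-software trade-off the fellows discussed.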
Fellows’ feedback reflecting different ways of learning: Twelve fellows provided feedback on their learning experiences through the M-L Web discussions. Fellows learned collaboratively by articulating and sharing their ideas and expertise through discussion or by performing an activity or task. One fellow expressed that the description and discussion were very elaborate and the examples and attachments (exercises based on dataset instructions) were self-explanatory.
Fellows could link their learning from on-line contributions to their own practice. A remark of one fellow illustrated this. I am going to use techniques like FGD and open-ended questions in trying to assess what the students expect from this (CIP) activity and what they actually gain at the end of it. I shall use all the knowledge gained during this discussion for undertaking the research project, collecting data and analyzing it.
Fellows described both positive and negative experiences. One said that the entire discussion provided framework on how to go about in analyzing and presenting the data that was collected, but I am still not comfortable with computer-aided content analysis (of qualitative data).
Fellows perceived ‘data analysis’ as a difficult topic but felt they could learn it in the present on-line learning style. Related to this, a comment was: the team has been able to manage a rather difficult topic to (an) easy flowing and interesting one. Fellows expressed their wish to communicate in the future for further clarifications in data analysis and its reporting.
Discussion
Overall, the M-L Web brought together learners who were geographically scattered, and the listserv discussion remained focused on the fellows' learning needs. The M-L Web discussion offered learning on such topics as basic concepts in statistics (normal distribution, variables), tests of significance, Likert scale analysis, bivariate correlation, simple regression analysis, and content analysis of qualitative data. There was discussion of perceived barriers to qualitative research methods, such as its time-consuming nature, its subjectivity and the perceived 'superior' nature of quantitative research. This timely discussion and feedback on the topic of data analysis is expected to benefit first-year fellows who are currently working on their CIPs, offering support in study design, implementation and analysis plans and the production of technically robust research work. The exploration of e-mail contents identified the fellows' learning needs in data analysis, and provided an example of collective learning outcomes and different ways of learning in an on-line environment. Similarly, Watland found this line of inquiry useful for researching the networked management learning phenomena of on-line tutor support to improve education [12].
Recently, Swift et al. emphasized the importance of teaching statistics to medical undergraduates and its relevance to their future careers [13]. In India, basic statistics is taught as part of the undergraduate curriculum, but the topic is often neglected as it is perceived as difficult to understand [14]. Statistics becomes more important later, at the time of completing a dissertation during post-graduate studies. Thereafter, busy medical educators find it difficult to attend workshops on biostatistics because of their preoccupation with routine professional and social responsibilities.
Interestingly, the response rate was high for the discussion on quantitative data analysis in the first half of the month but relatively low for qualitative data analysis in the second half. This could be because qualitative research was perceived as 'inferior' to quantitative work, or because a lack of exposure to qualitative methods prevented fellows from asking clarifying questions about the subject. The latest trend in the field of research is the combined use of quantitative and qualitative research methods, i.e. mixed-method design within a single study. It is in this area that the greatest abuses of qualitative data are occurring, largely because methodological principles are not followed [15]. The present approach was an important way of finding out how the development of collective understanding within the domain of data analysis can be facilitated by accommodating different learning styles in an on-line environment for medical education researchers.
This was our first experience at the PSG-FAIMER Regional Institute listserv addressing the topic ‘Analysis of Data’. The initial response has been very encouraging. Our experience has shown that problem-based/need-based discussion helps to elicit better responses on the listserv. However, more experience and follow-up are required to be able to comment on the outcomes of the present on-line discussion in terms of better analysis of CIPs and their scientific reporting.
Since on-line learning is a relatively new technique, such early sharing of experiences from the M-L Web discussion becomes important. On-line learning is a flexible and convenient method of faculty capacity-building and overcomes the limitations of geographical isolation [16]. At the same time, the potential weaknesses of on-line learning should be kept in mind, such as its impersonal approach, the lack of spontaneous response compared with classroom teaching, the fear of being overloaded with information and links, the need for special equipment and skills, and access to internet services [17]. Active learning may not occur in an on-line environment unless interaction is deliberately planned and encouraged by the instructor [18].
In conclusion, steps such as identifying the learning needs for an on-line M-L Web discussion, addressing the learners' immediate needs and creating a flexible, reflective learning environment on the M-L Web facilitated the collective learning of the fellows on the topic of data analysis. These outcomes are useful for better understanding how to design on-line pedagogical strategies to support research in medical education.
Acknowledgement
We would like to thank all PSG-FAIMER Fellows who actively participated in the on-line discussion on 'Analysis of Data' facilitated in October 2009. The findings and conclusions do not necessarily reflect the opinion of the Foundation for Advancement of International Medical Education and Research (FAIMER).
References
1. McKimm J, Jollie C, Cantillon P. Web based learning. In: Cantillon P, Hutchinson L, Wood D, editors. ABC of learning and teaching in Medicine. BMJ Books; 2003. p. 42-45.
2. Anshu, Bansal P, Mennin SG, Burdick WP, Singh T. Online faculty development for medical educators: Experience of South Asian Program. Education for Health. 2008; 21(3):175. Available from: http://www.educationforhealth.net/articles/subviewnew.asp?ArticleID=175
3. Burdick WP, Morahan P, Norcini JJ. Slowing the brain drain: FAIMER education programs. Medical Teacher. 2006; 28:631-634.
4. Ercan I, Yazici B, Yang Y, Özkaya G, Cangur S, Ediz B, Kan I. Misusage of statistics in medical research. European Journal of General Medicine. 2007; 4(3):128-134. Available from http://www.ejgm.org/arsiv/2007/3/6.pdf
5. PSG-FAIMER Regional Institute (PSG-FRI). Retrieved June 21, 2010 from: http://psg.faimerfri.org/
6. Jones C, Asensio M, Goodyear P. Networked learning in higher education: Practitioners' perspectives. The Association for Learning Technology Journal. 2000; 8(2):18-28.
7. Kaufman DM. Applying educational theory in practice. In: Cantillon P, Hutchinson L, Wood D, editors. ABC of learning and teaching in Medicine. BMJ Books; 2003. p.1.
8. Silverman D. Interpreting qualitative data. London: SAGE publications Ltd; 2006.
9. Qualitative content analysis. Retrieved November 1, 2009 from: www.southalabama.edu/coe/bset/johnson/lectures/lec17.pdf
10. Donald R. 15 methods of data analysis in qualitative research. Retrieved December 12, 2008 from: http://qualitativeresearch.ratcliffs.net/15methods.pdf
11. Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. International Journal for Quality in Health Care. 2007; 19:349-357.
12. Watland PA. On Phenomenography and researching online tutor support. Retrieved April 8, 2010 from: http://www.networkedlearningconference.org.uk/past/nlc2006/abstracts/pdfs/06Watland.pdf
13. Swift L, Miles S, Price GM, Shepstone L, Leinster SJ. Do doctors need statistics? Doctors’ use of and attitude to probability and statistics. Statistics in Medicine. 2009; 28(15):1969-1981.
14. Dongre AR, Deshmukh PR, Garg BS. Formative exploration of students’ perception about Community Medicine teaching at Mahatma Gandhi Institute of Medical Sciences, Sewagram, India. Online Journal of Health and Allied Sciences. 2008; 7(3):1-5.
15. Morse JM. Evolving trends in qualitative research: Advances in Mixed-Method design. Qualitative Health Research. 2005; 15(5):583-585.
16. Berge ZL. Components of on-line classroom. In: Weiss RE, Knowlton DS, Speck BW, editors. New directions for teaching and learning: Principles of effective teaching in the on-line classroom. San Francisco: Jossey Bass; 2000. p. 23-28.
17. Moore MG, Kearsley G. Distance education: A systems view. CA: Wadsworth Publishing Company; 1996.
18. Richardson JET. The concept and methods of phenomenographic research. Review of Educational Research. 1999; 69(1):53-82.