ORIGINAL RESEARCH ARTICLE
Year : 2015  |  Volume : 28  |  Issue : 3  |  Page : 187-193

Effectiveness of an online Problem-Based learning curriculum for training family medical doctors in Brazil


1 Center for Educational Development in Health, School of Public Health of Ceará, Fortaleza, Ceará, Brazil
2 Department of Psychology, Institute of Medical Education Research Rotterdam, Erasmus Medical Center, Education and Child Studies, Faculty of Social Sciences, Erasmus University Rotterdam, The Netherlands
3 Department of Clinical Medicine, Federal University of Ceará, Ceará, Brazil
4 Department of Psychology, Education and Child Studies, Faculty of Social Sciences, Erasmus University Rotterdam, The Netherlands

Date of Web Publication: 11-Mar-2016

Correspondence Address:
Jose Batista Cisne Tomaz
Av. Beira Mar, 4344, Apto. 1102, Mucuripe, Fortaleza-Ce CEP: 60.165-121
Brazil

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/1357-6283.178605

  Abstract 

Background: Problem-based learning (PBL) and distance education (DE) have been combined as educational approaches in higher education. This combination has been called distributed PBL; in health professions education it has been called online PBL (OPBL). However, more research on the effectiveness of OPBL is needed. The present study aims to evaluate the effectiveness of an OPBL curriculum for training family medical doctors in Brazil. Methods: We used a pretest–posttest control group design. Thirty family physician participants were non-randomly assigned to the experimental group and the same number to the control group. Three instruments were used for collecting data: A multiple choice question knowledge test, an Objective Structured Clinical Examination (OSCE) assessing the ability to apply the Mini-Mental State Examination (MMSE) and a test based on clinical cases assessing the ability to make an adequate differential diagnosis of dementia. Multivariate analysis of variance (MANOVA) and univariate tests were conducted to determine whether the difference between the two groups was significant. Effect sizes were measured with Cohen's d. Results: A total of 50 participants completed the study. The results show significant effects of the course on participants' knowledge and diagnostic skills. Discussion: The results may indicate that innovative pedagogical approaches such as PBL can be effective in an online environment in a low-resource context, with the advantages of a DE approach.

Keywords: Competency-based curriculum, distance education, effectiveness, Family Health, online problem-based learning, web-based education


How to cite this article:
Tomaz JB, Mamede S, Filho JM, Roriz Filho JS, van der Molen HT. Effectiveness of an online Problem-Based learning curriculum for training family medical doctors in Brazil. Educ Health 2015;28:187-93



Background


Problem-based learning (PBL) has become a well-established constructivist educational approach in higher education. In the medical education field, PBL has been presented as a useful alternative to conventional instruction. PBL is defined as an approach to learning and education in which students deal with problems in small groups under the supervision of a tutor.[1] Its roots lie in the philosophies of rationalism and American functionalism, and it is strongly influenced by cognitive psychology.[2] As an educational approach, PBL should include six main components: The problem, small group discussion, the tutor, individual study, assessment, and the blocks and units of the curriculum.[3]

In short, PBL works as follows: Participants working in small groups (tutorial groups) analyse and reflect on a problem situation presented to them; from this analysis they identify their key knowledge gaps and establish what they need to learn (learning goals) in order to solve the problem.[1] In its probably best-known format, this approach consists of seven steps [1],[3] [Figure 1].
Figure 1: PBL process


The effectiveness of face-to-face PBL is well established. Problem-based curricula provide a student-centered learning environment and encourage an inquisitive style of learning in students as opposed to the rote memorization and short-term learning strategies stimulated by traditional programmes.[4] PBL enhances intrinsic interest in subject matter and appears to enhance self-directed learning skills.[2] Recall of information, causal reasoning and collaborative learning construction seem to take place in PBL during the tutorial group.[5] PBL positively affects typical PBL-related competencies, such as the interpersonal skills and cognitive domains, and the more general work-related skills, such as the ability to work more efficiently.[6]

The combination of PBL and distance education (DE) has typically been called distributed problem-based learning (dPBL),[7] and in health professions education has been called online PBL (OPBL). In OPBL learning is mediated through computer technology, and a 'virtual' learning environment is provided to enable students to collaborate in small groups. The conceptual foundation for OPBL is the PBL theoretical background, processes, methods, and outcomes.[8]

Thus far, to our knowledge, research on the effectiveness of OPBL in low-resource settings is scarce. Most studies of OPBL have been conducted in developed countries and have focused on program evaluation and case reports. Few studies have used a pretest–posttest control group design to assess the effectiveness of OPBL on students' knowledge and skills acquisition,[9],[10],[11],[12] and these studies were outside the health field. This design is used in the present study.

The findings of the few studies on the effectiveness of OPBL in the health field suggest that OPBL is as effective as text-based learning approaches in improving students' learning and their learning environment in small group discussion.[13] In addition, virtual collaborative learning was found to be as effective as conventional PBL in the acquisition of clinical reasoning skills, although it was less well accepted by the participants (fourth-year medical students) than traditional PBL sessions.[14]

More research is needed on the effectiveness of OPBL on knowledge and skills acquisition, particularly in a low-resource context. The present study was intended to contribute to a better understanding of this issue. It focuses on a postgraduate course, “Clinical Approach for Elderly with Dementia,” offered by the School of Public Health of Ceará in Fortaleza, Brazil. We expected the course to increase participants' knowledge and diagnostic skills in dealing with the elderly with dementia.

The aim of the present study was to evaluate the effectiveness of an OPBL curriculum in health professions education in a low-resource setting. It also aimed to support our decision to use an innovative pedagogical approach, with the advantages of DE, to train family physicians in our low-resource context.


Intervention


The course "Clinical Approach for Elderly with Dementia"

The design of the curriculum of the course, "Clinical Approach for Elderly with Dementia," was based on the model called the Design Approach to Competency-Based Curriculum[15] or outcome-based curriculum development.[16] According to this model, a list of competences and learning objectives specifies the intended outcomes of the course, and the educational strategies are then derived from this list.

We considered the specific competences of physicians working in primary care in our context. A multidisciplinary team composed of a specialist in distance education, computer technicians (a webmaster and a web designer) and content specialists (a geriatrician and a public health specialist) was responsible for the design of the curriculum.

The course was 120 hours long: 100 hours at a distance and 20 hours face-to-face. The curriculum was divided into five sequential units. Course activities were conducted at a mean rate of 10 hours per week, over a total duration of 12 weeks (3 months).

The didactic approach was based on the seven-step PBL model [1] adapted to an online, web-based context [Figure 1].

In summary, OPBL worked as follows in the course. Virtual tutorial groups (VTGs) were created, each formed by 10 to 15 participants supervised by a facilitator. All participants were family physicians from the family health teams (FHT) of the State of Ceará, Brazil. The VTGs were the main strategy for achieving the cognitive learning objectives and for working through the seven-step approach. The VTGs were conducted in virtual forums in which the students could communicate with each other asynchronously. The virtual forum worked as a whiteboard on which the group recorded ideas and hypotheses, acquired information and pursued learning issues.[17] Synchronous communication tools (chats) were also used when needed to complement the discussion or to conclude the problem solving. Just as in face-to-face tutorial groups, there were three phases in the VTG (problem analysis, individual study and problem solving), and the students had to follow the seven steps proposed for PBL. The complete cycle took about eight hours (four hours for the analysis of the problem and four hours to solve it). Students were required to attend a four-hour workshop at the beginning of the course to learn how the VTG works.

The clinical problems posed to the groups were loosely structured to allow the learners greater latitude in their inquiries and to stimulate them to gather more information and generate multiple hypotheses about the problem's cause and management.[17] Course planners prepared the problems in advance, taking into account the context in which the physicians actually worked. The facilitator's main role was to promote the learning process among the students, to ensure the proper implementation of the OPBL cycle and to encourage good interactions among the students.[17]

Complementary educational strategies, including clinical skills training, team and individual projects and community practice, were used to support the development of the competences. A Course Guide, including general guidelines for study at a distance, and Study Guides for each unit were provided in the Learning Management System (LMS) (MOODLE®) to support learning.

The following DE tools were used in the course: A website, the LMS (MOODLE®), virtual forums, chat and email. In addition, video-lectures were available on the website, and a CD-ROM with the video-lectures and texts was distributed to the participants.


Methods


This program evaluation used a pretest–posttest control group design. We were interested in assessing and comparing the knowledge and diagnostic skills of the experimental and control groups before and after the course.


Participants


Sixty volunteers were included in this study. They were family physicians working in the family health teams (FHT) of the State of Ceará, Brazil. Ceará is one of the poorest states of Brazil: its Human Development Index (HDI) is 0.723, 22nd among the 27 Brazilian states,[18] and its infant mortality rate of 16.2 deaths/1,000 live births[19] is one of the highest in Brazil. Most participants (n = 42; 70%) came from the FHT of the state capital. The participants were evenly divided into experimental and control groups. They were not allocated randomly because of practical circumstances and financial constraints. To motivate participation in the control group, we guaranteed that its members could take the course after the pretest and posttest data had been collected. They were asked not to take part in any form of continuing (permanent) education on dementia during the study. We were able to collect complete study data for 50 participants (25 in the experimental group and 25 in the control group); information was missing for 10 participants. Nine participants had dropped out of the course for personal reasons, including overwork, family issues, and simultaneous participation in other courses. One participant from the experimental group was excluded because his test scores were outliers, being much lower than the others' scores.

[Table 1] and [Table 2] present participants' demographic information. The groups did not differ statistically in any of their demographics.
Table 1: Means and standard deviations of the participants' age, weekly work hours and time since graduation in the experimental and control groups
Table 2: Numbers and percentages of the participants by gender and marital status in the experimental and control groups


Instruments

Three instruments for data collection were used: A paper-and-pencil multiple choice question (MCQ) test for the assessment of knowledge on dementia; an Objective Structured Clinical Examination (OSCE) for the assessment of mastery of a diagnostic skill, the application of the Mini-Mental State Examination (MMSE); and a short essay written test consisting of five clinical cases to assess the ability to give an adequate differential diagnosis of dementia.

The knowledge test

The knowledge test consisted of 30 multiple-choice questions (MCQ) covering the knowledge goals of the course. It was developed especially for the purpose of this study by three geriatricians and revised by an educationalist, according to the quality criteria for MCQ design in the area of health sciences.[20] Two versions of the knowledge test were made, one as pretest and a second as post-test, to minimize the bias of memorization. The questions reflect the knowledge of and insight into dementia specified in the learning goals of the course.

For each correctly answered question the participant received one point. Therefore, the range of the scores was 0 to 30. We did a linear transformation of the raw scores to a ten-point scale (0 to 10) in order to adapt them to a grade format. Cronbach's alpha of the post-test was 0.68.
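As an illustration of this scoring scheme (not the authors' actual SPSS procedure), the Python sketch below shows the linear transformation of a 0-30 raw score to a 0-10 grade and the computation of Cronbach's alpha from an item-score matrix; the function names and the simulated response matrix are ours.

import numpy as np

def scale_to_ten(raw_score, max_score=30):
    """Linear transformation of a raw score (0..max_score) to a 0-10 grade."""
    return 10.0 * raw_score / max_score

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_participants x n_items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the total scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical data: 25 participants x 30 dichotomous MCQ items (1 = correct)
rng = np.random.default_rng(0)
responses = rng.integers(0, 2, size=(25, 30))
grades = scale_to_ten(responses.sum(axis=1))        # raw 0-30 scores mapped to 0-10
print(grades.round(1), round(cronbach_alpha(responses), 2))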

The diagnostic skills tests

The mastery of the two diagnostic skills was evaluated by means of an OSCE and a short essay written test based on clinical cases. The OSCE was used for the assessment of the mastery of the application of the MMSE and the second test for the assessment of the ability to give a correct differential diagnosis of dementia.

The OSCE has been used to assess clinical competence since 1975.[21] This method has proven to be reliable and valid for assessing clinical skills.[22],[23],[24] It provides a more valid examination than the traditional approach to clinical examinations.[22] An OSCE has three main characteristics: A simulation of clinical reality using real or standardized patients, direct observation of clinical competence, and the assessment of performance with structured clinical checklists.[25]

The application of the MMSE

The MMSE was proposed for testing cognitive functioning in 1975. It is a short, simplified, standardized and scored form that includes eleven questions and requires only 5-10 minutes to administer. The test has proven to be valid and reliable for separating patients with cognitive disturbance from those without such disturbance.[26] The MMSE is divided into two sections. The first covers orientation, memory, and attention. The second tests the ability to name objects, follow verbal and written commands, write a sentence spontaneously, and copy a complex polygon similar to a Bender-Gestalt figure. The maximum total score is 30.[26]

In this OSCE, we used a simulated patient,[27] an elderly actress trained in advance for the performance. The simulated patient followed a script prepared by the clinical specialist and revised by an educationalist to adapt it to the test purposes. The script describes the case of an elderly woman with probable dementia. The procedure in this OSCE was as follows. In a room simulating a doctor's office, the participant was instructed to apply the MMSE, and the simulated patient answered the questions according to the script. Each encounter lasted approximately 10 minutes and was videotaped for later review by the observers and to facilitate the assessment in case of doubt. Two observers assessed the participants' performance using a Performance Assessment Checklist (PAC). Each observer assessed the performance of 25 participants, drawn from both the experimental and control groups, and was blinded to which group each participant was in. The inter-rater reliability was very high: 0.95 (intraclass correlation coefficient). Cronbach's alpha was 0.95.

In order to assess mastery of the application of the MMSE, we developed the PAC as a checklist of 30 behavioral items rated on a 5-point scale (1 = totally disagree to 5 = totally agree), based on the structure of the MMSE. Examples of items are: 'Explained to the patient the objective and meaning of the test' (explanation), 'Asked the patient to say the day of the week, the date, month, year and approximate time of day' (orientation), 'Established the probable diagnosis correctly' (interpretation) and 'Negotiated with the patient and the caregiver the approach to be adopted' (conclusion). The second dependent variable in this study is therefore the score on this OSCE (range 0-30). As with the knowledge test, a linear transformation of the raw scores to a ten-point scale (0 to 10) was made in order to adapt them to a grade format.
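The inter-rater reliability of the PAC scores was reported as an intraclass correlation coefficient (ICC). As a minimal sketch of one way such a coefficient can be computed, the Python code below assumes a complete subjects-by-raters matrix of PAC total scores and a two-way random-effects, absolute-agreement, single-rater model (ICC(2,1) in the Shrout and Fleiss terminology); the actual analysis was run in SPSS, and the exact ICC model used is not reported here, so the function name and the example data are ours.

import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    `ratings` is an (n_subjects x n_raters) matrix of scores."""
    y = np.asarray(ratings, dtype=float)
    n, k = y.shape
    grand_mean = y.mean()
    ss_rows = k * ((y.mean(axis=1) - grand_mean) ** 2).sum()   # subjects
    ss_cols = n * ((y.mean(axis=0) - grand_mean) ** 2).sum()   # raters
    ss_error = ((y - grand_mean) ** 2).sum() - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))
    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n)

# Hypothetical PAC total scores (0-30) for 10 participants rated by 2 observers
scores = np.array([[25, 24], [18, 19], [27, 27], [22, 21], [15, 16],
                   [29, 28], [20, 22], [24, 24], [17, 18], [26, 25]])
print(round(icc_2_1(scores), 2))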

Test for the ability to make a correct differential diagnosis

The second test was used to assess differential diagnosis skills. It consists of five clinical cases, each related to one of the five most common causes of dementia. A geriatrician specialized in dementia prepared the cases, which were subsequently revised by an educationalist to adapt them to the test purposes.[28] Magnetic resonance imaging (MRI) and computed tomography (CT) images of the skull were added to each clinical case. The procedure in this test was as follows. In a classroom, participants were asked to read each clinical case and answer: i) whether it was a case of dementia and, if so, ii) which would be the most likely cause. The MRI and CT images of the skull were projected on a screen for each case. Participants had around five minutes to answer each case. Cronbach's alpha was 0.80. The third dependent variable in this study is therefore the score on this test (range 0-10; two points for each correctly answered clinical case).

Procedures

The pretest occurred during the first face-to-face meeting at the beginning of the course. Participants were informed of the purpose of the study before answering the tests; we took verbal consent from them and assured them of total anonymity. In this phase, we administered the knowledge test to measure the baseline level of knowledge about dementia and the skills tests to assess the mastery of clinical skills of the participants of the experimental group: The application of the MMSE and the differential diagnosis of possible causes of dementia.

In the last face-to-face meeting, at the end of the course (around four months after the beginning of the course), we administered the post-test. We measured again the level of knowledge about dementia and mastery of the clinical skills of the participants.

In the control group, the same pretest and posttest procedures were used. The interval between the pretest and posttest was four months, equal to the duration of the course. Participants from the control group could take the course only after the collection of data on pretest and posttest.

Data analysis

The data were analysed using the statistics software Statistical Product and Service Solutions (SPSS 17.0, IBM, Chicago, USA). We ran independent samples t-tests on the mean scores of the experimental and control groups in order to verify whether the groups were comparable in terms of their profile. Cronbach's alpha was calculated to measure the internal consistency of the three data collection instruments.

We computed the frequencies of participants' responses on the pretest and posttest instruments and examined the effects of the OPBL course on all three dependent variables: Knowledge, application of the MMSE (OSCE) and ability to give a correct differential diagnosis. Multivariate analysis of variance (MANOVA) and univariate tests were conducted to determine whether the difference between the two groups was significant. Effect sizes were measured with Cohen's d.
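The Python sketch below illustrates, in Python rather than SPSS, how a MANOVA on the three dependent variables, the univariate follow-up tests and Cohen's d could be computed. The data frame and its values are simulated, and the variable names (knowledge, mmse, dd) are placeholders; only the general procedure mirrors the analysis described above.

import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(42)
group = ["experimental"] * 25 + ["control"] * 25

# Hypothetical post-test scores (0-10 scale) for the three dependent variables
df = pd.DataFrame({
    "group": group,
    "knowledge": rng.normal([7.0] * 25 + [5.5] * 25, 1.2),
    "mmse": rng.normal([8.0] * 25 + [5.0] * 25, 1.2),
    "dd": rng.normal([7.5] * 25 + [5.0] * 25, 1.2),
})

# Multivariate test of the group effect on the three outcomes taken together
print(MANOVA.from_formula("knowledge + mmse + dd ~ group", data=df).mv_test())

def cohens_d(a, b):
    """Cohen's d: difference in means divided by the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * np.var(a, ddof=1) + (nb - 1) * np.var(b, ddof=1)) / (na + nb - 2)
    return (np.mean(a) - np.mean(b)) / np.sqrt(pooled_var)

# Univariate follow-up tests and effect sizes, one per dependent variable
for dv in ["knowledge", "mmse", "dd"]:
    exp = df.loc[df["group"] == "experimental", dv]
    ctrl = df.loc[df["group"] == "control", dv]
    f, p = stats.f_oneway(exp, ctrl)
    print(f"{dv}: F = {f:.2f}, p = {p:.4f}, d = {cohens_d(exp, ctrl):.2f}")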

Ethical aspects

The study was approved by the Ethics Research Committee of the School of Public Health of Ceará, and participants signed an informed consent form to participate in the study.


Results


The main results are presented in [Table 3].
Table 3: Means and standard deviations for the pre- and post-tests and effect sizes (d) on knowledge and skills for the experimental and control groups


The results indicate that the participants of the experimental group showed more progress than those of the control group on all three dependent variables: Knowledge, application of the MMSE (OSCE) and ability to give a correct differential diagnosis.

The MANOVA showed that the difference between the two groups was significant (F = 10.38, P < 0.001). The univariate tests also showed significant differences on the knowledge test (F = 17.98, P < 0.001), the application of the MMSE (F = 63.47, P < 0.001) and the differential diagnosis test (F = 43.98, P < 0.001), all in favor of the experimental group. The effect size measured by Cohen's d was moderate for knowledge, very large for application of the MMSE (OSCE) and large for the ability to make a correct differential diagnosis.


Discussion


This study investigated the effects of an OPBL curriculum on participants' knowledge and diagnostic skills following a course on the Clinical Approach for Elderly with Dementia. To our knowledge, it is the first study to evaluate the effectiveness of an OPBL curriculum based on the seven-step model [1] adapted to the web environment in the health field in a low-resource context. The results indicate that participants in the experimental group, who attended the OPBL course, had higher scores on the knowledge and skills tests after the course than control group participants, who received no training on the topic. The effect on knowledge can be considered moderate and the effect on diagnostic skills large to very large.

Our results are partly in line with the outcomes of a few other studies on OPBL effectiveness in other domains, such as computer science. These studies showed that students following an experimental web-based PBL approach achieved better academic performance than students following a content-based control approach.[9],[11] However, another study found no significant effect of an OPBL course on the use of computers in education on content knowledge acquisition scores, although it did find a significant effect on critical thinking skills.[10]

In the health field, other studies have also found positive effects of OPBL on students' achievement. A quasi-experimental, post-test-only study compared a virtual problem-based learning (VPBL) exercise delivered via the Internet with a text-based version of the same PBL exercise, in terms of students' achievement and their perceptions of the learning environment.[13] The authors of that study concluded that VPBL is as effective as the text-based version for improving students' learning and their learning environment in small group discussion. In another study, the authors used a randomized controlled design to investigate the effect of a virtual collaborative online module on clinical reasoning acquisition compared with a traditional PBL tutorial group.[14] They found that virtual collaborative learning was as effective as conventional PBL for acquiring clinical reasoning skills, although it was less well accepted by the participants (fourth-year medical students) than traditional PBL sessions.

An interesting finding of the present study was the robust gain in diagnostic skills by the experimental group relative to the control group. The development of skills at a distance is a great challenge for distance course planners, and few studies have approached this issue. One of these studies examined counseling skill acquisition by rehabilitation counseling students enrolled in a distance education course and found positive results.[29] However, most participants indicated that they would have preferred a traditional approach to learning counseling skills, although they perceived distance education to be an effective use of their time. In our study, we assume that the skills training conducted during a second face-to-face meeting could have contributed to the high achievement of the experimental group on the two skills tests. In this meeting, participants had the opportunity to practice their skills in a simulated-patient-based skills training session. It is important to highlight that the cognitive part of the skills (knowing how) was supposed to be grasped during the virtual tutorial groups.

Our findings are highly relevant for the continuing education of health professionals, since DE programmes can reach a considerable number of people (thousands) living even hundreds of kilometres from educational institutions within a short period of time. In large countries like Brazil, the use of a DE approach is even more relevant. In our context, a significant number of health professionals, including family physicians, must be trained, and a well-designed DE-based programme can be a cost-effective strategy. More specifically, the results of our study indicate the effectiveness of OPBL in developing clinicians' knowledge and diagnostic skills related to dementia. In Brazil, as in other developing countries, the population is aging rapidly, and dementia will become an increasingly prevalent health problem.

Further studies are required with other learner groups and controlling for other variables, such as the guidance provided by tutors, students' individual characteristics and communication styles, and aspects of the course design and implementation. Although the present study revealed interesting findings about the effect of OPBL on the achievement of knowledge and skills, a better and more definitive understanding of the factors that influence OPBL effectiveness is still needed. As Dolmans stated, “we should not only focus our research on the effectiveness of educational interventions, but also on determining why an intervention is effective or not and under which conditions.” (p. 1129).[30]

Besides the strength of its experimental design, this study has several limitations. First, we were not able to allocate the participants randomly to the two groups because of practical circumstances and financial constraints, and the 15% dropout rate might have biased the findings. Secondly, a longer-term follow-up test was not included in the design; therefore, the lasting effects of the program cannot be known.

We conclude that it is possible to conduct an effective OPBL curriculum to train family physicians in Brazil, and this method may well be useful for training other clinician groups.

In general, this OPBL course was found to be effective for knowledge and skills acquisition. These results indicate that innovative pedagogical approaches such as PBL can be effective in an online environment in a low-resource context, with the advantages and benefits of a DE approach.

Acknowledgements

The authors wish to thank Paulo César Almeida for his statistical support.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.

 
References

1. Schmidt HG. Problem-based learning: Rationale and description. Med Educ 1983;17:11-6.
2. Norman GR, Schmidt HG. The psychological basis of problem-based learning: A review of the evidence. Acad Med 1992;67:557-65.
3. Schmidt HG. Foundations of problem-based learning: Some explanatory notes. Med Educ 1993;27:422-32.
4. Schmidt HG, Dauphinee WD, Patel VL. Comparing the effects of problem-based and conventional curricula in an international sample. J Med Educ 1987;62:305-15.
5. Dolmans DH, Schmidt HG. What do we know about cognitive and motivational effects of small group tutorials in problem-based learning? Adv Health Sci Educ Theory Pract 2006;11:321-36.
6. Schmidt HG, Vermeulen L, van der Molen HT. Longterm effects of problem-based learning: A comparison of competencies acquired by graduates of a problem-based and a conventional medical school. Med Educ 2006;40:562-7.
7. Wheeler S. Learner support needs in online problem-based learning. Q Rev Distance Educ 2006;7:175-84.
8. Scripture JD. Recommendations for designing and implementing distributed problem-based learning. Am J Distance Educ 2008;22:207-21.
9. Atan H, Sulaiman F, Idrus RM. The effectiveness of problem-based learning in the web-based environment for the delivery of an undergraduate physics course. Int Educ J 2005;6:430-7.
10. Sendag S, Odabas HF. Effects of an online problem based learning course on content knowledge acquisition and critical thinking skills. Comput Educ 2009;53:132-41.
11. Baturay MH, Bay OF. The effects of problem-based learning on the classroom community perceptions and achievement of web-based education students. Comput Educ 2010;55:43-52.
12. King E. Can PBL-GIS work online? J Geog 2008;107:43-51.
13. Bowdish BE, Chauvin S, Kreisman N, Britt M. Travels towards problem based learning in medical education (VPBL). Instr Sci 2003;31:231-53.
14. Raupach T, Muenscher C, Anders S, Steinbach R, Pukrop T, Hege I, et al. Web-based collaborative training of clinical reasoning: A randomized trial. Med Teach 2009;31:e431-7.
15. ten Cate O. Trust, competence, and the supervisor's role in postgraduate training. BMJ 2006;333:748-51.
16. Harden RM. Developments in outcome-based education. Med Teach 2002;24:117-20.
17. Barrows H. Is it truly possible to have such a thing as dPBL? Distance Educ 2002;23:119-22.
18. CEPAL/PNUD/OIT. Emprego, Desenvolvimento Humano e Trabalho Decente – A Experiência Brasileira Recente; 2008. Available from: http://www.cepal.org/brasil/noticias/noticias/3/34013/EmpregoDesenvHumanoTrabDecente.pdf. [Last accessed on 2013 May 14].
19. Brasil/MS. Indicadores de Mortalidade. C.1 Taxa de Mortalidade Infantil; 2010. Available from: http://www.tabnet.datasus.gov.br/cgi/idb2011/c01b.htm. [Last accessed on 2013 May 14].
20. Case SM, Swanson DB. Constructing Written Test Questions for the Basic and Clinical Sciences. National Board of Medical Examiners (NBME); 2002. Available from: http://www.nbme.org/pdf/itemwriting_2003/2003iwgwhole.pdf. [Last accessed on 2012 Mar 18].
21. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. Br Med J 1975;1:447-51.
22. Harden RM, Gleeson FA. Assessment of clinical competence using an objective structured clinical examination (OSCE). Med Educ 1979;13:41-54.
23. Sullivan PO, Chao S, Russell M, Levine S, Fabiny A. Development and implementation of an objective structured clinical examination to provide formative feedback on communication and interpersonal skills in geriatric training. J Am Geriatr Soc 2008;56:1730-5.
24. Wallace J, Rao R, Haslam R. Simulated patients and objective structured clinical examinations: Review of their use in medical education. Adv Psychiatr Treat 2002;8:342-8.
25. Van der Vleuten C, Swanson DB. Assessment of clinical skills with standardized patients: State of the art. Teach Learn Med 1990;2:58-76.
26. Folstein MF, Folstein SE, McHugh PR. “Mini-mental state”. A practical method for grading the cognitive state of patients for the clinician. J Psychiatr Res 1975;12:189-98.
27. Adamo G. Simulated and standardized patients in OSCEs: Achievements and challenges 1992-2003. Med Teach 2003;25:262-70.
28. Karani R, Leipzig RM, Callahan EH, Thomas DC. An unfolding case with a linked objective structured clinical examination (OSCE): A curriculum in inpatient geriatric medicine. J Am Geriatr Soc 2004;52:1191-8.
29. Degiorgio L. Examining Distance Education in Teaching Clinical Counselling Skills to Rehabilitation Counsellors-in-Training [PhD Thesis]; 2009. Available from: http://www.books.google.com.br/books?id=QMJqAjkszKMC&printsec=frontcover&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false. [Last accessed on 2012 Jun 02].
30. Dolmans D. The effectiveness of PBL: The debate continues. Some concerns about the BEME movement. Med Educ 2003;37:1129-30.

