

REVIEW ARTICLE
Year : 2011  |  Volume : 24  |  Issue : 3  |  Page : 493

The progress test as a diagnostic tool for a new PBL curriculum


1 King Saud Bin Abdulaziz University for Health Sciences, Riyadh, Saudi Arabia
2 Erasmus University, Rotterdam, The Netherlands

Date of Submission24-May-2010
Date of Acceptance14-Oct-2011
Date of Web Publication16-Dec-2011

Correspondence Address:
I Al Alwan
King Saud Bin Abdulaziz University for Health Sciences, Riyadh
Saudi Arabia

Source of Support: None, Conflict of Interest: None


PMID: 22267346

  Abstract 

Context: The College of Medicine at King Saud bin Abdulaziz University for Health Sciences (KSAU-HS) runs a PBL-based curriculum. A progress test was used to evaluate components of the basic medical and clinical sciences curriculum.
Objective: To evaluate the performance of students at different levels of the College of Medicine curriculum through a USMLE-based test focused on basic medical and clinical sciences topics.
Methods: The USMLE-based basic medical and clinical sciences progress test has been conducted since 2007. It covers nine topics: anatomy, physiology, histology, epidemiology, biochemistry, behavioral sciences, pathology, pharmacology and immunology/microbiology. We analyzed the results of three consecutive years for all students in years 1-4.
Findings: There was a good correlation between progress test results and students' GPAs. Progress test results in the clinical topics were better than in the basic medical sciences. Within the basic medical sciences, results in pharmacology, biochemistry, behavioral sciences and histology were lower than in the other disciplines.
Conclusions: Our progress test proved to be a useful indicator for both the basic medical sciences and clinical sciences components of the curriculum. The results are being used to help modify our curriculum.

Keywords: Assessment, basic sciences, problem-based learning, progress test


How to cite this article:
Al Alwan I, Al-Moamary M, Al-Attas N, Al Kushi A, AlBanyan E, Zamakhshary M, Al Kadri H M, Tamim H, Magzoub M, Hajeer A, Schmidt H. The progress test as a diagnostic tool for a new PBL curriculum. Educ Health 2011;24:493


Introduction

Medical schools worldwide aim to graduate competent physicians able to serve the community and advance the field of medicine. Competency is acquired through the development and integration of three main domains: cognitive, psychomotor and affective[1]. A well-designed curriculum should ensure the achievement of these competencies through the provision of effective instruction, adequate resources and proper student assessment. We have observed through curriculum implementation and educational research that medical students perceive the curriculum content they are assessed on to be the real curriculum. In other words, the quality and quantity of student learning is largely influenced by exams[2],[3]. For this reason, proper alignment between the curriculum's instructional format and its assessment is essential to avoid a mismatch between program objectives, graduate performance and community needs and expectations[4].

Assessing students' knowledge gains and their application over the course of the medical school years is a challenge. The purpose of any assessment is either formative, to enhance performance, or summative, for accountability and decision-making purposes. Recently, the 'progress test' has been shown to be an effective longitudinal assessment method for student achievement that fulfills both formative and summative purposes.

The progress test was originally developed in the 1970s by the Universities of Maastricht and Missouri[5],[6]. It was first used in undergraduate problem-based learning curricula[6], was later adopted in undergraduate non-problem-based learning curricula[7], and was then applied to postgraduate medical education[8].

The progress test can be defined as a method of assessing both the acquisition and the retention of knowledge at one or more points in the curriculum relative to curricular goals and objectives[6],[9-12]. It differs from periodic exams, quizzes and mid-term examinations in that it covers the outcome knowledge expected towards the end of some portion of the medical curriculum, whereas periodic exams and quizzes focus on specific modules or semesters. Furthermore, we used the progress test as a formative tool for the students as well as a means of evaluating our curriculum.

All undergraduate medical students of our school sit the same test, which is set at a level that assesses the knowledge expected to be achieved by the end of the undergraduate medical curriculum. Progress tests are a potentially rich source of information for students, teachers and administrators on performance in both the basic and the clinical sciences[5],[6],[10-12]. They help monitor students' gain in knowledge over time by comparing results of tests taken at different time points with the performance of the total student population. Results of repeated tests can be combined to reveal patterns of knowledge growth across different topics[6],[9-11]. Problem-based learning (PBL) curricula focus on student self-directed learning and the self-development of learning objectives. Progress testing supports these concepts by providing a measure of students' gain in and retention of knowledge over time, and by identifying specific curricular areas of concern as well as students whose progress is at risk[13]. Studies of the basic sciences components of PBL curricula have commonly reported deficiencies as students progress through medical education[13-15].

This study reports on the use of an annual progress test as a diagnostic tool to investigate the implementation of our school's undergraduate PBL medical curriculum. It focuses on the content and delivery of the basic medical sciences and highlights areas that require further curricular review and improvement.

Methods

The College of Medicine (COM) at KSAU-HS uses a problem-based, community-oriented curriculum, originally adopted from the medical program at the University of Sydney, Australia. Curricular and instructional challenges with the implementation of PBL were overcome through supportive leadership, faculty enhancement initiatives and perseverance. Over the last five years, an adaptation process saw numerous modifications put forward and implemented by the Curriculum Committee to improve relevance to local and cultural realities, to address priority health problems of the community and country, and to conform to the Saudi Ministry of Higher Education (MoHE) rules and regulations on general curriculum and assessment structure.

It is important for new medical schools such as our own, which adopt innovative curricula from international sources along with new assessment methods, to determine the relevance and efficacy of their strategies. We used USMLE material because it is an internationally recognized tool for assessing medical students following graduation. However, unlike other institutions, we split the USMLE content into two separate examinations testing integrated basic sciences and clinical sciences.

The COM started in September 2004 as a graduate-entry program in which graduates from colleges of science, pharmacy, veterinary medicine or applied medical sciences were accepted into a four-year medical program. In the fall of 2007, a six-year medical program was started for high-school graduates. For ease of identification, the six-year, high-school entry program is referred to as Stream I, while the graduate-entry program is referred to as Stream II.

The progress test was applied only to Stream II students, as Stream I students were in the preparatory phase, which focuses mainly on strengthening English language and basic sciences.

The MoHE established assessment bylaws that mandate summative assessment in the form of mid-module continuous assessment and end-of-module final examinations, on which student promotion is based[16]. Beyond these regulations, the COM has the flexibility to determine how best to structure its assessment program to serve its purposes of informing improvement and providing accountability.

Progress Test at the COM

The progress test at the COM was started in 2007 and is currently an integral component of the assessment program. It serves a formative function by informing students of their level of achievement and providing feedback on areas that need strengthening. It is also used by COM faculty to identify areas in the curriculum that may require revision or reinforcement.

The exam is made up of multiple-choice questions (MCQs) with five response options plus an additional sixth option, 'I don't know,' which is intended to minimize guessing[6]. Questions are chosen exclusively from previous United States Medical Licensing Examination (USMLE) tests, with a new set of items identified for each test administered. Grading uses a norm-referencing approach, which identifies how each student performed relative to their year cohort.
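The exact scoring weights are not spelled out beyond the role of the 'I don't know' option, so the following is a minimal sketch under stated assumptions: a correct answer scores +1 while both 'I don't know' and incorrect answers score 0 (some formula-scoring schemes penalize wrong answers instead), and norm-referenced results are reported as z-scores within a cohort. All item keys and answer sheets below are invented for illustration.

```python
# Hypothetical scoring of a progress test with an "I don't know" option,
# followed by norm-referenced reporting. Weights are assumptions, not
# the study's documented scheme.
from statistics import mean, stdev

def raw_score(responses, key, correct=1.0, wrong=0.0, dont_know=0.0):
    """Score one answer sheet: options 'A'..'E' or 'IDK' per item."""
    total = 0.0
    for given, right in zip(responses, key):
        if given == "IDK":        # explicit "I don't know" discourages guessing
            total += dont_know
        elif given == right:
            total += correct
        else:
            total += wrong
    return total

def norm_referenced(scores):
    """Express each raw score as a z-score within the year cohort."""
    mu, sd = mean(scores), stdev(scores)
    return [round((s - mu) / sd, 2) for s in scores]

key = ["A", "C", "B", "E", "D"]           # toy five-item test
sheets = [["A", "C", "IDK", "E", "B"],    # 3 correct, 1 IDK, 1 wrong
          ["A", "IDK", "B", "E", "D"],    # 4 correct, 1 IDK
          ["B", "C", "B", "A", "D"]]      # 3 correct, 2 wrong
raw = [raw_score(s, key) for s in sheets]
print(raw)                  # [3.0, 4.0, 3.0]
print(norm_referenced(raw)) # each student's standing within the cohort
```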

Two different progress tests were developed and conducted over the past three years. Since 2007, a basic medical sciences exam has been administered annually. This is made up of 180 MCQs with content equally distributed among nine subjects: Anatomy, Physiology, Histology, Epidemiology, Biochemistry, Behavioral Sciences, Pathology, Pharmacology and Immunology/Microbiology. In the academic year 2008–2009 a second progress test was initiated. This was a clinical sciences exam consisting of 200 MCQs and covering nine clinical specialties: Internal Medicine, Surgery, Pediatrics, Obstetrics and Gynecology, Psychiatry, Neurology, Physical Diagnosis, Preventive Medicine and Public Health, Critical Care and Emergency Medicine.

The progress test is mandatory for all cohorts of Stream II, who are presented with the same test as an assessment of their 'progress' toward fulfilling the program's final objectives. Each subsequent administration of the progress test samples the same domains of competence, with items drawn from a large pool of new questions.

Content validity is established and maintained by ensuring that each test administered, and each item included, is in line with the blueprint.
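As a toy illustration of blueprint-driven test assembly (the actual item-banking process is not described in detail in the paper), the sketch below draws a fixed number of previously unused items per subject so that each administration matches the 180-item, nine-subject structure described above. The pool layout and item IDs are invented.

```python
# Illustrative blueprint-driven assembly of a 180-MCQ progress test.
import random

# Nine subjects x 20 items each = the 180-MCQ blueprint described above.
BLUEPRINT = {subject: 20 for subject in [
    "Anatomy", "Physiology", "Histology", "Epidemiology", "Biochemistry",
    "Behavioral Sciences", "Pathology", "Pharmacology",
    "Immunology/Microbiology",
]}

def assemble_test(pool, used_ids):
    """Sample unused items per subject so the test matches the blueprint."""
    test = []
    for subject, n in BLUEPRINT.items():
        fresh = [item for item in pool[subject] if item not in used_ids]
        test.extend(random.sample(fresh, n))  # raises ValueError if pool runs dry
    return test

# Invented pool of 40 item IDs per subject.
pool = {s: [f"{s[:4]}-{i:03d}" for i in range(40)] for s in BLUEPRINT}
exam = assemble_test(pool, used_ids=set())
print(len(exam))  # 180
```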

Statistical Analysis

We entered the results of both progress tests for all cohorts (2007 to 2009) into a Microsoft Office Excel sheet, then transferred the data to the Statistical Package for the Social Sciences (SPSS, version 18) for data management and analysis. Descriptive statistics were computed as numbers and percentages for categorical variables and as means and standard deviations for continuous variables. We constructed a linear regression model to obtain the beta coefficient and its 95% confidence interval (CI) for the effect of the cumulative grade point average (CGPA) on progress test results. We also calculated the Pearson correlation coefficient and its p-value. A p-value ≤ 0.05 was considered statistically significant.
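As a concrete illustration of this pipeline, the sketch below reproduces the same steps in Python rather than Excel/SPSS: a simple linear regression of progress test score on CGPA with a 95% CI for the beta coefficient, plus a Pearson correlation with its p-value. The variable names and data are invented, not the study's.

```python
# Sketch of the reported analysis using invented data.
import numpy as np
from scipy import stats

cgpa  = np.array([3.1, 3.5, 2.9, 4.2, 3.8, 4.5, 3.3, 4.0])  # hypothetical CGPAs
score = np.array([48., 55., 44., 71., 60., 75., 52., 66.])  # hypothetical test %

# Linear regression of progress test score on CGPA.
res = stats.linregress(cgpa, score)
# 95% confidence interval for the beta (slope) coefficient.
t_crit = stats.t.ppf(0.975, df=len(cgpa) - 2)
ci = (res.slope - t_crit * res.stderr, res.slope + t_crit * res.stderr)

# Pearson correlation coefficient and its p-value (alpha = 0.05).
r, p = stats.pearsonr(cgpa, score)

print(f"beta = {res.slope:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
print(f"r = {r:.2f}, p = {p:.4f}, significant = {p <= 0.05}")
```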

Results

All students were male Saudi nationals aged between 23 and 31 years. The average age was 26.8, 27.7 and 26.3 years for the first, second and third cohorts, respectively.

Table 1 shows the correlation between students' GPAs and progress test (2007) scores in cohorts 1, 2 and 3 of Stream II. The results show significant correlations between GPAs and progress test scores, with correlation coefficients ranging from 0.38 to 0.77.

Table 1: Correlation (R²) between progress test results and GPA for basic medical sciences for cohorts 1-3 in Stream II (2007)

Figure 1a shows the results of the progress test (2008) for all four cohorts, comparing the basic medical sciences and clinical sciences exams. In both exams the mean score increased from cohort to cohort, with cohort 4 (new students) scoring the lowest and cohort 1 (year 4 students) scoring the highest, indicating growth and maintenance of basic medical and clinical knowledge over the years of schooling. Another interesting finding was a tendency towards higher performance in the clinical sciences exam than in the basic medical sciences exam, which was statistically significant for cohort 2 (p < 0.001) and cohort 3 (p = 0.047) of Stream II but not for cohort 1 (p = 0.14) or cohort 4 (p = 0.11).
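The paper does not name the test behind these p-values; one plausible reading is a paired comparison of each student's basic versus clinical score within a cohort, sketched below with invented data.

```python
# Hedged sketch of a within-cohort paired t-test comparing each
# student's basic vs. clinical exam score; data are invented.
import numpy as np
from scipy import stats

basic    = np.array([55., 61., 48., 70., 63., 58.])  # hypothetical mean % scores
clinical = np.array([60., 66., 50., 78., 70., 61.])

t_stat, p_value = stats.ttest_rel(basic, clinical)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```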

Figure 1b shows the results for 2009, comparing the clinical exam with the basic sciences exam. The findings were similar to those of 2008, with higher scores on the clinical exam than on the basic sciences exam.

Figure 1a. Comparison of the basic medical sciences and clinical sciences examinations, 2008, Stream II.

Figure 1b. Comparison of the basic medical sciences and clinical sciences examinations, 2009, Stream II.

Figure 2 breaks down the results of the basic sciences progress test by subject: Pharmacology, Biochemistry, Behavioral Sciences and Cell Biology scored lower than Anatomy, Physiology, Immunology and Pathology. This was more evident for cohorts 1 and 2, which both had better results than cohorts 3 and 4.

Figure 2. Comparison of mean scores for basic medical sciences by specialty.

Discussion

Although the concept of the progress test is similar across medical schools, implementations may differ in content, the number of tests administered per year and the grading strategy used. The main common feature is a comprehensive exam that assesses performance over several successive administrations rather than just one[17].

Progress tests are recommended for PBL schools as a way to monitor both student and curriculum progress. Medical school assessment in Saudi Arabia is mandated by the regulations of the Ministry of Higher Education to include continuous and final assessment for each subject or module delivered by the school, although medical schools have room to determine the specific characteristics of their assessment tools. Such subject-by-subject summative assessment is not well suited to problem-based learning[18].

The COM at KSAU-HS holds a unique position among Saudi medical schools in applying a graduate-entry PBL curriculum adopted from an international school (the University of Sydney), and it therefore requires innovation and flexibility in assessing its effectiveness and evaluating its success. Given these differences, we opted to evaluate our curriculum content and the effectiveness of our instructional methods using the results of our progress tests.

Progress examinations can provide information on many aspects of student and curriculum evaluation, including peer comparison among schools, comparison among students to identify those who could benefit from remediation, formative assessment as part of an overall assessment plan, high-stakes assessment to determine progression in the curriculum, low-stakes assessment, and an adjunct to program evaluation[19].

We used progress tests to evaluate the basic medical sciences component of our curriculum. Three years' experience showed consistent deficiencies in either the content or the teaching method of four basic medical sciences subjects: pharmacology, biochemistry, behavioral sciences and cell biology.

Our progress testing experience showed a correlation between students' GPAs and their progress test results, with higher correlations in senior students than in junior students, for both the clinical and the basic medical sciences. There was no apparent loss of gained knowledge in either the basic or the clinical sciences, but scores were clearly low, consistently over three years in a row, in certain basic medical sciences subjects, including pharmacology, biochemistry and behavioral sciences.

By comparison, the experience of Maastricht University showed a similar pattern of knowledge gain in the clinical sciences with lower scores in the basic sciences but, in contrast to our findings, showed loss of gained basic sciences knowledge over the years of medical school[6].

The University of São Paulo showed the same trend, with higher gains in the clinical sciences than in the basic sciences, and, similar to our own results, basic sciences knowledge continued to be gained until the end of studies. This might be explained by the repetition of basic sciences teaching during the clinical years of medical school[7].

Both our clinical and basic sciences items were obtained from previous USMLE exams. We chose these items because they are internationally recognized assessment items held as a standard for licensing to practice medicine in the United States of America. We did not use local or in-house, faculty-submitted items, in order to avoid bias that might affect our interpretation of the results.

Our study has a number of limitations, including a relatively small sample size. Also, the USMLE Step 1 questions used in the basic medical sciences exam focus on content geared towards the first two years of the medical school curriculum rather than graduate-level knowledge of basic science objectives[20]. In addition, international assessment items may not completely match the original objectives of our curriculum. Although not a focus of our study, norm referencing was used to make comparisons within groups rather than between groups, and it therefore does not allow exact knowledge gain to be differentiated from year to year. Finally, although mandatory for all students, the progress exam remained formative, and a lack of student enthusiasm to attend or exert effort may have affected performance.

Conclusion

Based on the results of our school's progress tests, focused short workshops were held for cohort 1 students in their final year on the subjects showing deficiencies, i.e. pharmacology, microbiology, behavioral medicine and cell biology. A taskforce was created to review all sessions related to these subjects and to make recommendations for improving either the content or the instruction, as part of quality assurance. This was followed up with evaluation of student feedback and further progress tests to confirm that these deficiencies had improved.

Our experience with progress testing as a tool for curriculum evaluation proved useful, and it opens the door to re-evaluation over the next few years to study the effects of the modifications made to our curriculum on the basis of these results.

References

1. Bloom BS. Taxonomy of Educational Objectives, Handbook I: The Cognitive Domain. New York: David McKay Co Inc; 1956.

2. Newble DI, Jaeger K. The effect of assessments and examinations on the learning of medical students. Medical Education. 1983; 17(3):165-171.

3. Frederiksen N. The real test bias: Influences of testing on teaching and learning. American Psychologist. 1984; 39:193-202.

4. Verhoeven BH, van der Steeg AF, Scherpbier AJ, Muijtjens AM, Verwijnen GM, van der Vleuten CP. Reliability and credibility of an Angoff standard setting procedure in progress testing using recent graduates as judges. Medical Education. 1999; 33(11):832-837.

5. Arnold L, Willoughby TL. The quarterly profile examination. Academic Medicine. 1990; 65(8):515-516.

6. Van der Vleuten CPM, Verwijnen GM, Wijnen WHFW. Fifteen years of experience with progress testing in a problem-based learning curriculum. Medical Teacher. 1996; 18(2):103-109.

7. Tomic ER, Martins MA, Lotufo PA, Benseñor IM. Progress testing: evaluation of four years of application in the school of medicine, University of Sao Paulo. Clinics (Sao Paulo). 2005; 60(5):389-396.

8. Dijksterhuis MG, Scheele F, Schuwirth LW, Essed GG, Nijhuis JG, Braat DD. Progress testing in postgraduate medical education. Medical Teacher. 2009; 31(10):e464-468.

9. Arnold L, Willoughby TL. The quarterly profile examination. Academic Medicine. 1990; 65(8):515-516.

10. Mahadev GK, O'Neill PA, Owen AC, McCardle P, Benbow E, Byrne GJ. Seven years' experience of progress testing in Manchester, UK. In: Maldonado MB (Ed.), 11th International Ottawa Conference on Medical Education. Barcelona; 2004. p. 192-193.

11. Föller T, Brauns K, Fuhrmann S, Hanfler S, Hoffmann J, Kölbel S, et al. Five years of progress testing at Charité Universitätsmedizin Berlin, Germany. In: Maldonado MB (Ed.), 11th International Ottawa Conference on Medical Education. Barcelona; 2004. p. 192.

12. Blake JM, Norman GR, Keane DR, Mueller CB, Cunnington J, Didyk N. Introducing progress testing in McMaster University's problem-based medical curriculum: Psychometric properties and effect on learning. Academic Medicine. 1996; 71(9):1002-1007.

13. Calvert MJ, Ross NM, Freemantle N, Xu Y, Zvauya R, Parle JV. Examination performance of graduate entry medical students compared with mainstream students. Journal of the Royal Society of Medicine. 2009; 102(10):425-430.

14. Ferrier BM, Woodward CA. Does premedical academic background influence medical graduates' perceptions of their medical school or their subsequent career paths and decisions? Medical Education. 1983; 17(2):72-78.

15. Prince KJ, Van De Wiel M, Scherpbier AJ, Van Der Vleuten CP, Boshuizen HP. A qualitative analysis of the transition from theory to practice in undergraduate training in a PBL medical school. Advances in Health Sciences Education. 2000; 5:105-116.

16. Ministry of Higher Education. Rules and Regulations of the Council of Higher Education and Universities. 2nd ed. Riyadh: Ministry of Higher Education; 2006.

17. Portanova R, Adelman M, Jollick JD, Schuler S, Modrzakowski M, Soper E, Ross-Lee B. Student assessment in the Ohio University College of Osteopathic Medicine CORE system: progress testing and objective structured clinical examinations. The Journal of the American Osteopathic Association. 2000; 100(11):707-712.

18. Van der Vleuten CPM, Verwijnen GM, Wijnen WHFW. Fifteen years of experience with progress testing in a problem-based learning curriculum. Medical Teacher. 1996; 18(2):103-109.

19. Plaza CM. Progress examinations in pharmacy education. American Journal of Pharmaceutical Education. 2007; 71(4):66.

20. Federation of State Medical Boards, Dallas, TX, and the National Board of Medical Examiners, Philadelphia, PA. United States Medical Licensing Examination. Available from: http://www.usmle.org.
