METHODS: The Jefferson Scale of Physician Lifelong Learning for Medical Students (JeffSPLL-MS) questionnaire was used. Factor analysis was performed, and Cronbach's alpha and effect sizes were calculated. The types of learning activities that contribute to lifelong learning (LL) skills were identified.
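The internal-consistency statistic named above can be sketched numerically. This is a minimal illustration of Cronbach's alpha, not the study's analysis; the `scores` matrix is hypothetical Likert-scale data, not JeffSPLL-MS responses.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses: 4 respondents x 3 items
scores = np.array([
    [5, 4, 5],
    [3, 3, 4],
    [4, 4, 4],
    [2, 2, 3],
])
print(round(cronbach_alpha(scores), 3))
```

Alpha rises toward 1 when the items covary strongly relative to their individual variances, which is why it is read as a measure of scale reliability.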
RESULTS: A three-factor structure emerged from the factor analysis; the factors were identified as learning beliefs and motivation, skills in seeking information, and attention to learning opportunities. A significant increase (p
Methods: Through curriculum mapping, we assessed the links between assessments and the expected learning outcomes of the dental physiology curriculum for three batches of students (2012-14) at Melaka-Manipal Medical College (MMMC), Manipal. The questions asked under each assessment method were mapped to the respective expected learning outcomes, and students' scores in the different physiology assessments were gathered. Students' (n = 220) and teachers' (n = 15) perspectives were collected through focus group discussion sessions and questionnaire surveys.
Results: More than 75% of students were successful (scores ≥ 50%) in the majority of assessments. There were moderate (r = 0.4-0.6) to strong (r = 0.7-0.9) positive correlations between the majority of assessments. However, students' viva voce scores had only a weak positive correlation with practical examination scores (r = 0.230). Scores in the problem-based learning assessments had either weak (r = 0.1-0.3) or no correlation with the other assessment scores.
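The correlation bands used above can be illustrated with a short sketch. The paired scores below are hypothetical, not the study's data; the banding thresholds mirror the ranges quoted in the Results.

```python
import numpy as np

# Hypothetical paired scores (viva voce vs. practical) for eight students
viva      = np.array([62, 55, 71, 48, 66, 59, 74, 52])
practical = np.array([58, 64, 60, 55, 70, 49, 63, 61])

# Pearson correlation coefficient between the two assessments
r = np.corrcoef(viva, practical)[0, 1]

# Band labels matching the ranges reported in the Results section
if abs(r) >= 0.7:
    band = "strong"
elif abs(r) >= 0.4:
    band = "moderate"
elif abs(r) >= 0.1:
    band = "weak"
else:
    band = "negligible"
print(f"r = {r:.3f} ({band})")
```

A weak r between two assessments, as observed here for viva voce versus the practical examination, suggests the two methods are tapping partly different outcomes.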
Conclusions: Through curriculum mapping, we were able to establish links between assessments and expected learning outcomes. We observed that, in the assessment system followed at MMMC, not all expected learning outcomes were given equal weightage in the examinations. Moreover, there was no direct assessment of self-directed learning skills. Our study also showed that assessment supported students in achieving the expected learning outcomes, as evidenced by the qualitative and quantitative data.
MATERIALS AND METHODS: A survey of the tutors who had used the instrument was conducted to determine whether the assessment instrument or form was user-friendly. The 4 competencies assessed, using a 5-point rating scale, were (1) participation and communication skills, (2) cooperation or team-building skills, (3) comprehension or reasoning skills, and (4) knowledge or information-gathering skills. Tutors were given a set of criteria guidelines for scoring the students' performance in these 4 competencies. Tutors were not attached to a particular PBL group, but took turns to facilitate different groups on different case or problem discussions. Assessment scores for one cohort of undergraduate medical students in their respective PBL groups in Year I (2003/2004) and Year II (2004/2005) were analysed. The consistency of scores was analysed using the intraclass correlation coefficient.
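The consistency analysis named above can be sketched with a one-way random-effects intraclass correlation, ICC(1,1), which fits a design where each student may be rated by different tutors. This is an illustrative computation only; the `ratings` matrix is hypothetical, not the cohort's data.

```python
import numpy as np

def icc_oneway(ratings: np.ndarray) -> float:
    """One-way random-effects ICC(1,1) for an (n_targets, k_raters) matrix."""
    n, k = ratings.shape
    grand_mean = ratings.mean()
    target_means = ratings.mean(axis=1)
    # Between-target and within-target mean squares from one-way ANOVA
    ms_between = k * ((target_means - grand_mean) ** 2).sum() / (n - 1)
    ms_within = ((ratings - target_means[:, None]) ** 2).sum() / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical 5-point competency ratings: 4 students x 3 tutors
ratings = np.array([
    [4, 4, 5],
    [3, 3, 3],
    [5, 4, 5],
    [2, 3, 2],
])
print(round(icc_oneway(ratings), 3))
```

Values near 1 indicate that differences between students dominate disagreement between tutors, i.e. the raters are relatively consistent.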
RESULTS: The majority of the tutors surveyed expressed no difficulty in using the instrument and agreed that it helped them assess the students fairly. Analysis of the scores obtained for the above cohort indicated that the different raters were relatively consistent in their assessment of student performance, despite a small number consistently showing either "strict" or "indiscriminate" rating practice.
CONCLUSION: The instrument designed for the assessment of student performance in the PBL tutorial classroom setting is user-friendly and is reliable when used judiciously with the criteria guidelines provided.