Displaying publications 61 - 80 of 140 in total

  1. Kam CA
    Singapore Med J, 1978 Jun;19(2):106-8.
    PMID: 751183
    A system of tutorials preparing students for the Primary F.F.A.R.A.C.S. examination is described. It is suggested that this system would be suitable for teaching students in a peripheral training hospital.
    Matched MeSH terms: Educational Measurement
  2. Lim AS, Lee SWH, Karunaratne N, Caliph S
    Am J Pharm Educ, 2020 Nov;84(11):7920.
    PMID: 34283749 DOI: 10.5688/ajpe7920
    Objective. To examine pharmacy students' performance on and perceptions of an interactive online tool for practicing objective structured clinical examinations (OSCEs).

    Methods. The Monash OSCE Virtual Experience (MOVE), an online module consisting of 20 pharmacy case scenarios with virtual patients, was piloted with final-year pharmacy students at Monash University campuses in Australia and Malaysia. A mixed-methods approach that included reviewing user attempts and comparing grades, collecting student-administered questionnaires, and holding focus groups was used to examine students' perceptions and performance.

    Results. More than 99% of all students attempted at least one online case scenario in preparation for their final in-person OSCE, and 81% attempted all 20 scenarios two or more times. Ninety percent of students at the Malaysia campus and 70% of students at the Australia campus reported that MOVE was a helpful study tool for their OSCE preparation. However, a raw comparison of user attempts and OSCE grades did not find a direct correlation between online module attempts and assessment grades. Self-administered questionnaire and focus group results indicated that MOVE prepared students for targeted and time-restricted history-taking and problem-solving skills. Overall, students perceived MOVE to be a useful learning tool and a less overwhelming learning experience than face-to-face sessions. Nevertheless, students still preferred face-to-face OSCE practice with simulated patients over online practice with virtual patients.

    Conclusion. The Monash OSCE Virtual Experience was perceived by our students as a flexible and useful online learning aid in preparing for their final-year OSCE. However, there was no direct correlation between online practice attempts and students' exam grades.
    Matched MeSH terms: Educational Measurement
  3. Lim A, Krishnan SS, Blebil AQ, Malone D
    Int J Pharm Pract, 2023 Dec 19;31(6):646-649.
    PMID: 37410964 DOI: 10.1093/ijpp/riad048
    OBJECTIVES: To describe the implementation and assess whether an objective structured clinical examination (OSCE) is a viable assessment tool for testing Antimicrobial Stewardship (AMS) principles.

    METHODS: A three-station OSCE set in a hospital and community pharmacy was designed and mapped to the World Health Organisation's AMS intervention practical guide. This OSCE comprised 39 unique cases and was implemented across two campuses (Malaysia and Australia) at one institute. Stations were 8 min long and consisted of problem-solving and applying AMS principles to drug therapy management (Station 1), counselling on key antimicrobials (Station 2) or managing infectious diseases in primary care (Station 3). Primary outcome measure to assess viability was the proportion of students who were able to pass each case.

    KEY FINDINGS: Other than three cases with pass rates of 50%, 52.8% and 66.7%, all cases had pass rates of 75% or more. Students were most confident with referral to medical practitioner cases and switching from intravenous to oral or empirical to directed therapy.

    CONCLUSIONS: An AMS-based OSCE is a viable assessment tool in pharmacy education. Further research should explore whether similar assessments can help improve students' confidence at recognising opportunities for AMS intervention in the workplace.

    Matched MeSH terms: Educational Measurement
  4. Singh H, Mohammed AH, Stokes E, Malone D, Turner J, Hassan BAR, et al.
    Curr Pharm Teach Learn, 2024 Jan;16(1):69-76.
    PMID: 38158327 DOI: 10.1016/j.cptl.2023.12.007
    BACKGROUND AND PURPOSE: This study aimed to evaluate an accelerated dispensing course enabling graduate entry (GE) pharmacy students with prior science-related degrees to join undergraduate (UG) students in year three of the Monash Pharmacy degree.

    EDUCATIONAL ACTIVITY AND SETTING: A one day accelerated dispensing course using MyDispense software was delivered to 59 GE students. The accelerated dispensing course was identical to the standard three-week dispensing course delivered to UG students. The same assessment of dispensing skills was conducted after course completion for both UG and GE students and included dispensing four prescriptions of varying difficulty. The assessment scores of the UG and GE students were compared. Perception data from the accelerated course were also collected.

    FINDINGS: The accelerated dispensing curriculum was well received by students. They found the simulation relevant to practice, easy to navigate, and helpful for preparing them for assessment. Overall, 5.1% of GE students failed the assessment, which was lower than the 32.6% failure rate in the UG cohort. Comparison of assessment grades between UG and GE students showed no notable disadvantage to attainment of learning outcomes with the accelerated curriculum. However, UG students were more likely to provide unsafe instructions compared to GE students in their labeling for three out of four prescriptions.

    SUMMARY: An accelerated dispensing curriculum can be effectively delivered to mature learners with a prior science-related degree as no notable deficiencies were identified when comparing the assessment results of GE students against UG students when both student cohorts undertook the same dispensing assessment.

    Matched MeSH terms: Educational Measurement
  5. Lim AS, Ling YL, Wilby KJ, Mak V
    Curr Pharm Teach Learn, 2024 Mar;16(3):212-220.
    PMID: 38171979 DOI: 10.1016/j.cptl.2023.12.028
    BACKGROUND: Objective structured clinical examinations (OSCEs) remain an integral part of pharmacy education. This study aimed to characterize key researchers, areas, and themes in pharmacy education OSCEs using a bibliometric review with content analysis.

    METHODS: A bibliometric review was conducted on literature spanning over 23 years, from January 2000 to May 2023. Articles focusing on any type of OSCE research in pharmacy education in both the undergraduate and postgraduate sectors were included. Articles were excluded if they were not original articles or not published in English. A summative content analysis was also conducted to identify key topics.

    RESULTS: A total of 192 articles were included in the analysis. There were 242 institutions that contributed to the OSCE literature in pharmacy education, with the leading country being Canada. Most OSCE research came from developed countries and consisted of descriptive studies based on single-institution data. The top themes emerging from content analysis were student perceptions of OSCE station styles (n = 98), staff perceptions (n = 19), grade assessment of OSCEs (n = 145), interprofessional education (n = 11), standardized patients (n = 12), and rubric development and standard setting (n = 8).

    IMPLICATIONS: There has been a growth in virtual OSCEs, interprofessional OSCEs, and artificial intelligence OSCEs. Communication rubrics and minimizing assessor variability are still trending research areas. There is scope to conduct more research on evaluating specific types of OSCEs, when best to hold an OSCE, and comparing OSCEs to other assessments.

    Matched MeSH terms: Educational Measurement
  6. Solarsh G, Lindley J, Whyte G, Fahey M, Walker A
    Acad Med, 2012 Jun;87(6):807-14.
    PMID: 22643380 DOI: 10.1097/ACM.0b013e318253226a
    The learning objectives, curriculum content, and assessment standards for distributed medical education programs must be aligned across the health care systems and community contexts in which their students train. In this article, the authors describe their experiences at Monash University implementing a distributed medical education program at metropolitan, regional, and rural Australian sites and an offshore Malaysian site, using four different implementation models. Standardizing learning objectives, curriculum content, and assessment standards across all sites while allowing for site-specific implementation models created challenges for educational alignment. At the same time, this diversity created opportunities to customize the curriculum to fit a variety of settings and for innovations that have enriched the educational system as a whole. Developing these distributed medical education programs required a detailed review of Monash's learning objectives and curriculum content and their relevance to the four different sites. It also required a review of assessment methods to ensure an identical and equitable system of assessment for students at all sites. It additionally demanded changes to the systems of governance and the management of the educational program away from a centrally constructed and mandated curriculum to more collaborative approaches to curriculum design and implementation involving discipline leaders at multiple sites. Distributed medical education programs, like that at Monash, in which cohorts of students undertake the same curriculum in different contexts, provide potentially powerful research platforms to compare different pedagogical approaches to medical education and the impact of context on learning outcomes.
    Matched MeSH terms: Educational Measurement/methods; Educational Measurement/standards
  7. Loh KY, Kwa SK
    Med Educ, 2009 Nov;43(11):1101-2.
    PMID: 19874515 DOI: 10.1111/j.1365-2923.2009.03501.x
    Matched MeSH terms: Educational Measurement/methods*; Educational Measurement/standards
  8. Loh KY, Nalliah S
    Med Educ, 2008 Nov;42(11):1127-8.
    PMID: 18991988 DOI: 10.1111/j.1365-2923.2008.03217.x
    Matched MeSH terms: Educational Measurement/methods*; Educational Measurement/standards
  9. Awaisu A, Mohamed MH, Al-Efan QA
    Am J Pharm Educ, 2007 Dec 15;71(6):118.
    PMID: 19503702
    OBJECTIVES: To assess bachelor of pharmacy students' overall perception and acceptance of an objective structured clinical examination (OSCE), a new method of clinical competence assessment in the pharmacy undergraduate curriculum at our Faculty, and to explore its strengths and weaknesses through feedback.

    METHODS: A cross-sectional survey was conducted via a validated 49-item questionnaire, administered immediately after all students completed the examination. The questionnaire comprised questions to evaluate the content and structure of the examination, perceptions of the OSCE's validity and reliability, and ratings of the OSCE in relation to other assessment methods. Open-ended follow-up questions were included to generate qualitative data.

    RESULTS: Over 80% of the students found the OSCE to be helpful in highlighting areas of weaknesses in their clinical competencies. Seventy-eight percent agreed that it was comprehensive and 66% believed it was fair. About 46% felt that the 15 minutes allocated per station was inadequate. Most importantly, about half of the students raised concerns that personality, ethnicity, and/or gender, as well as interpatient and inter-assessor variability were potential sources of bias that could affect their scores. However, an overwhelming proportion of the students (90%) agreed that the OSCE provided a useful and practical learning experience.

    CONCLUSIONS: Students' perceptions and acceptance of the new method of assessment were positive. The survey further highlighted for future refinement the strengths and weaknesses associated with the development and implementation of an OSCE in the International Islamic University Malaysia's pharmacy curriculum.

    Matched MeSH terms: Educational Measurement/methods*; Educational Measurement/standards
  10. Nagandla K, Gupta ED, Motilal T, Teng CL, Gangadaran S
    Natl Med J India, 2019 7 4;31(5):293-295.
    PMID: 31267998 DOI: 10.4103/0970-258X.261197
    Background: Assessment drives students' learning. It measures the level of students' understanding. We aimed to determine whether performance in continuous assessment can predict failure in the final professional examination results.

    Methods: We retrieved the in-course continuous assessment (ICA) and final professional examination results of 3 cohorts of medical students (n = 245) from the examination unit of the International Medical University, Seremban, Malaysia. The ICA comprised 3 sets of composite marks derived from course work, which included a summative theory paper with short-answer and one-best-answer questions. The clinical examination included an end-of-posting practical examination. These examinations are conducted every 6 months in semesters 6, 7 and 8, and are graded as pass/fail for each student. The final professional examination, comprising modified essay questions (MEQs), an 18-question objective structured practical examination (OSPE) and a 16-station objective structured clinical examination (OSCE), was graded as pass/fail. Whether failure in continuous assessment predicts failure in each component of the final professional examination was tested using the chi-square test and presented as odds ratios (OR) with 95% confidence intervals (CI).

    Results: Failure in ICA in semesters 6-8 strongly predicts failure in the MEQs, OSPE and OSCE of the final professional examination, with ORs of 3.8-14.3 (all analyses p < 0.001) and ORs of 2.4-6.9 (p < 0.05). However, the correlation was stronger with the MEQs and OSPE than with the OSCE.

    Conclusion: ICA with theory and clinical examination had a direct relationship with students' performance in the final examination and is a useful assessment tool.

    Matched MeSH terms: Educational Measurement/methods*; Educational Measurement/statistics & numerical data
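The odds-ratio analysis described in the abstract above (chi-square association tested and reported as an OR with a 95% CI) can be sketched from first principles. This is a minimal illustration only: the function name and the 2x2 cell counts below are hypothetical, since the study's raw pass/fail counts are not reported here.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Wald-type 95% CI for a 2x2 table.

    a = failed ICA & failed final, b = failed ICA & passed final,
    c = passed ICA & failed final, d = passed ICA & passed final.
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) is sqrt of summed reciprocal cell counts
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical counts, for illustration only
or_, lo, hi = odds_ratio_ci(12, 18, 10, 205)
print(f"OR = {or_:.1f}, 95% CI {lo:.1f}-{hi:.1f}")
```

An OR well above 1 with a CI excluding 1, as in the study's reported range of 3.8-14.3, indicates that failing the continuous assessment is associated with markedly higher odds of failing the final examination.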
  11. Liew SC, Dutta S, Sidhu JK, De-Alwis R, Chen N, Sow CF, et al.
    Med Teach, 2014 Jul;36(7):626-31.
    PMID: 24787534 DOI: 10.3109/0142159X.2014.899689
    The complexity of modern medicine creates more challenges for teaching and assessment of communication skills in undergraduate medical programme. This research was conducted to study the level of communication skills among undergraduate medical students and to determine the difference between simulated patients and clinical instructors' assessment of communication skills.
    Matched MeSH terms: Educational Measurement/methods*; Educational Measurement/statistics & numerical data
  12. Lee Chin K, Ling Yap Y, Leng Lee W, Chang Soh Y
    Am J Pharm Educ, 2014 Oct 15;78(8):153.
    PMID: 25386018 DOI: 10.5688/ajpe788153
    To determine whether human patient simulation (HPS) is superior to case-based learning (CBL) in teaching diabetic ketoacidosis (DKA) and thyroid storm (TS) to pharmacy students.
    Matched MeSH terms: Educational Measurement/methods; Educational Measurement/standards*
  13. Chaudhuri JD
    J Indian Med Assoc, 2010 Mar;108(3):168-9.
    PMID: 21043355
    The system of medical education has not changed much over the years. This article discusses the present method of teaching medical students and suggests changes to these methods in order to produce better doctors.
    Matched MeSH terms: Educational Measurement
  14. Sivarajan S, Soh EX, Zakaria NN, Kamarudin Y, Lau MN, Bahar AD, et al.
    BMC Med Educ, 2021 Jun 07;21(1):326.
    PMID: 34098931 DOI: 10.1186/s12909-021-02717-5
    BACKGROUND: Wire-bending skills are commonly taught through live demonstrations (LD), though the flipped classroom (FC) method has gained popularity. Continuous formative assessment promotes personalised learning via closely monitored progress, with the identification of students' strengths and weaknesses. This study aims to evaluate the effects of the LD and FC teaching methods, supplemented with continuous formative assessment, on dental students' learning of wire-bending skills for six types of removable orthodontic appliance components. A deeper understanding of the relative effectiveness of the LD and FC teaching methods can help identify the most appropriate method to achieve student learning objectives, which is especially important given the current Covid-19 pandemic.

    METHODS: Forty third-year undergraduate dental students were randomly assigned to the FC (n = 20) or LD (n = 20) cohort. Each student attended six teaching sessions, each teaching competency in fabricating one type of wire component, covering six wire components in total. Either LD or FC teaching methods were used. After each session, wire assignments had to be submitted. The wire assignments were then evaluated using a blinded wire-bending assessment protocol. As part of the formative assessment, the results were distributed to students, lecturers, and technicians before the next session. After the first session (T0) and at the end of all six sessions (T1), students completed a self-reported questionnaire.

    RESULTS: The mean wire-bending scores for FC were significantly higher than LD for two of the six assignments, namely the Adams clasp (p 

    Matched MeSH terms: Educational Measurement
  15. Ramamurthy S, Er HM, Devi Nadarajah V, Radhakrishnan AK
    Med Teach, 2021 Jul;43(sup1):S6-S11.
    PMID: 31408404 DOI: 10.1080/0142159X.2019.1646894
    BACKGROUND: Lifelong learning (LL) is an important outcome of medical training. The objective of this study is to measure the orientation of medical students toward LL and to determine the types of self-directed learning (SDL) activities that contribute toward LL skills.

    METHODS: The Jefferson Scale of Physician Lifelong Learning for medical students (JeffSPLL-MS) questionnaire was used. Factor analysis was performed, and Cronbach's alpha and effect size were calculated. The types of learning activities that contribute to LL skills were identified.

    RESULTS: A three-factor structure emerged from the factor analysis; the factors were identified as learning beliefs and motivation, skills in seeking information, and attention to learning opportunities. A significant increase (p

    Matched MeSH terms: Educational Measurement
  16. Carr JE
    Med J Malaysia, 1975 Sep;30(1):3-9.
    PMID: 1207528
    Matched MeSH terms: Educational Measurement
  17. Vashe A, Devi V, Rao KR, Abraham RR
    Natl Med J India, 2021 8 17;34(1):40-45.
    PMID: 34397005 DOI: 10.4103/0970-258X.323445
    Background: The relevance of curriculum mapping to determine the links between expected learning outcomes and assessment is well stated in the literature. Nevertheless, studies confirming the usage of such maps are minimal.

    Methods: We assessed links, through curriculum mapping, between assessments and expected learning outcomes of the dental physiology curriculum of three batches of students (2012-14) at Melaka-Manipal Medical College (MMMC), Manipal. The questions asked under each assessment method were mapped to the respective expected learning outcomes, and students' scores in different assessments in physiology were gathered. Students' (n = 220) and teachers' (n = 15) perspectives were collected through focus group discussion sessions and questionnaire surveys.

    Results: More than 75% of students were successful (≥50% scores) in the majority of the assessments. There was moderate (r = 0.4-0.6) to strong positive correlation (r = 0.7-0.9) between the majority of the assessments. However, students' scores in the viva voce had a weak positive correlation with the practical examination score (r = 0.230). The scores in the assessments of problem-based learning had either weak (r = 0.1-0.3) or no correlation with other assessment scores.

    Conclusions: Through curriculum mapping, we were able to establish links between assessments and expected learning outcomes. We observed that, in the assessment system followed at MMMC, not all expected learning outcomes were given equal weightage in the examinations. Moreover, there was no direct assessment of self-directed learning skills. Our study also showed that assessment supported students in achieving the expected learning outcomes, as evidenced by the qualitative and quantitative data.

    Matched MeSH terms: Educational Measurement
  18. Nwameme A, Dako-Gyeke P, Asampong E, Allotey P, Reidpath DD, Certain E, et al.
    PLoS Negl Trop Dis, 2023 Mar;17(3):e0011139.
    PMID: 36961830 DOI: 10.1371/journal.pntd.0011139
    The Special Programme for Research and Training in Tropical Diseases developed a massive open online course (MOOC) on implementation research with a focus on infectious diseases of poverty (IDPs) to reinforce the explanation of implementation research concepts through real case studies. The target MOOC participant group included public health officers, researchers and students. By reshaping institutions and building resilience in communities and systems, implementation research will allow progress towards universal health coverage and sustainable development goals. This study evaluates learners' knowledge in implementation research after completing the MOOC, using anonymous exit survey responses. Of the almost 4000 enrolled in the two sessions of the MOOC in 2018, about 30% completed all five modules and the assessments, and were awarded certificates. The majority of the participants were early to mid-career professionals, under the age of 40, and from low- and middle-income countries. They were slightly more likely to be men (56%) with a Bachelor's or a Master's degree. Participants were public health researchers (45%), public health officers (11%) or students (11%). On completion of the course, an exit survey revealed that 80.9% of respondents reported a significant improvement, to strong or very strong implementation research knowledge. This evaluation clearly shows the usefulness of the MOOC on implementation research for reaching out to field researchers and public health practitioners who are facing problems in the implementation of control programmes in low- and middle-income countries.
    Matched MeSH terms: Educational Measurement
  19. Yusoff MS, Hadie SN, Abdul Rahim AF
    Med Educ, 2014 Feb;48(2):108-10.
    PMID: 24528391 DOI: 10.1111/medu.12403
    Matched MeSH terms: Educational Measurement/methods*
  20. Lai NM, Teng CL, Nalliah S
    Educ Health (Abingdon), 2012 Jul;25(1):33-9.
    PMID: 23787382
    CONTEXT: The Fresno test and the Berlin Questionnaire are two validated instruments for objectively assessing competence in evidence-based medicine (EBM). Although both instruments purport to assess a comprehensive range of EBM knowledge, they differ in their formats. We undertook a preliminary study using adapted versions of the two instruments to assess their correlation when administered to medical students. The adaptations were made mainly to simplify the presentation for our undergraduate students while preserving the content assessed.
    METHODS: We recruited final-year students from a Malaysian medical school from September 2006 to August 2007. The students received a structured EBM training program within their curriculum. They took the two instruments concurrently, midway through their final six months of training. We determined the correlations using either the Pearson's or Spearman's correlation depending on the data distribution.
    RESULTS: Of the 120 students invited, 72 (60.0%) participated in the study. The adapted Fresno test and the Berlin Questionnaire had a Cronbach's alpha of 0.66 and 0.70, respectively. The inter-rater correlation (r) of the adapted Fresno test was 0.9. The students scored 45.4% on average [standard deviation (SD) 10.1] on the Fresno test and 44.7% (SD 14.9) on the Berlin Questionnaire (P = 0.7). The overall correlation between the two instruments was poor (r = 0.2, 95% confidence interval: -0.07 to 0.42, P = 0.08), and correlations remained poor between items assessing the same EBM domains (r = 0.01-0.2, P = 0.07-0.9).
    DISCUSSION: The adapted versions of the Fresno test and the Berlin Questionnaire correlated poorly when administered to medical students. The two instruments may not be used interchangeably to assess undergraduate competence in EBM.
    Matched MeSH terms: Educational Measurement/methods*
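The reliability and correlation statistics reported in the study above (Cronbach's alpha for internal consistency, Pearson's r for instrument correlation) can be computed from first principles. A minimal sketch with made-up score data; the function names are illustrative, not taken from the paper:

```python
import math
import statistics as st

def cronbach_alpha(items):
    """Cronbach's alpha from item-score columns
    (each inner list holds one item's scores across respondents)."""
    k = len(items)
    # Sum of the individual item variances
    item_vars = sum(st.pvariance(col) for col in items)
    # Variance of each respondent's total score
    total_var = st.pvariance([sum(vals) for vals in zip(*items)])
    return k / (k - 1) * (1 - item_vars / total_var)

def pearson_r(x, y):
    """Pearson product-moment correlation between two score lists."""
    mx, my = st.mean(x), st.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

# Made-up scores: three items across five respondents
items = [[3, 4, 5, 2, 4], [2, 4, 5, 3, 4], [3, 5, 4, 2, 5]]
print(round(cronbach_alpha(items), 2))
print(round(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]), 2))  # perfectly correlated -> 1.0
```

An alpha around 0.7, as reported for the Berlin Questionnaire, is conventionally taken as acceptable internal consistency, while an r of 0.2 between total scores, as found here, indicates the two instruments rank students quite differently.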