Validation and perception of a key feature problem examination in neurology
Authors:
Meike Grumer aff001; Peter Brüstle aff002; Johann Lambeck aff001; Silke Biller aff002; Jochen Brich aff001
Authors' affiliations:
aff001: Department of Neurology and Neuroscience, Medical Center, University of Freiburg, Freiburg, Germany
aff002: Center of Competence for the Evaluation of Teaching in Medicine Baden-Württemberg, Albert-Ludwigs-University Freiburg, Freiburg, Germany
Published in:
PLoS ONE 14(10)
Category:
Research Article
DOI:
https://doi.org/10.1371/journal.pone.0224131
Abstract
Objective
To validate a newly-developed Key Feature Problem Examination (KFPE) in neurology, and to examine how it is perceived by students.
Methods
We developed a formative KFPE containing 12 key feature problems with 44 key feature items. The key feature problems covered four typical clinical situations, and the items were presented in short-menu and long-menu question formats. Third- and fourth-year medical students attending the Neurology Course at our department participated in this study. The students' perception of the KFPE was assessed via a questionnaire. Students also had to pass a summative multiple-choice question examination (MCQE) containing 39 Type-A questions. All key feature and multiple-choice questions were classified using a modified Bloom's taxonomy.
Results
The results from 81 KFPE participants were analyzed. The average score was 6.7/12 points. Cronbach's alpha for the 12 key feature problems was 0.53. Item difficulty levels ranged from 0.39 to 0.77, and item-total correlations from 0.05 to 0.36. Thirty-two key feature items of the KFPE were categorized as testing comprehension, application and problem-solving, and 12 items as testing knowledge (MCQE: 15 comprehension and 24 knowledge, respectively). Overall correlations between the KFPE and the MCQE were intermediate. The KFPE was well received by the students.
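The reported statistics (Cronbach's alpha, item difficulty, corrected item-total correlation) follow standard psychometric definitions. As an illustrative sketch only (not the authors' analysis code, and using invented toy data rather than the study's scores), they can be computed from an examinees-by-items score matrix like this:

```python
# Hypothetical sketch: standard formulas for the exam statistics reported
# in the Results section, applied to a small invented score matrix.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: rows = examinees, columns = items (points per item)."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)          # per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)      # variance of total score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def item_difficulty(scores: np.ndarray, max_points: float = 1.0) -> np.ndarray:
    """Proportion of achievable points earned per item (0 = hard, 1 = easy)."""
    return scores.mean(axis=0) / max_points

def item_total_correlation(scores: np.ndarray) -> np.ndarray:
    """Corrected item-total correlation: each item vs. the sum of the rest."""
    total = scores.sum(axis=1)
    return np.array([
        np.corrcoef(scores[:, j], total - scores[:, j])[0, 1]
        for j in range(scores.shape[1])
    ])

# Toy data: 5 examinees, 4 dichotomously scored problems (NOT the study data).
scores = np.array([
    [1, 0, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 1, 0],
    [0, 1, 0, 1],
    [1, 1, 1, 1],
], dtype=float)

alpha = cronbach_alpha(scores)
difficulty = item_difficulty(scores)
itc = item_total_correlation(scores)
```

With dichotomous items, the difficulty is simply the fraction of examinees answering correctly; for multi-point key feature problems, `max_points` rescales it to the same 0–1 range used in the Results.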
Conclusions
Adherence to previously-established principles enables the creation of a valid KFPE in the field of Neurology.
Keywords:
Human learning – Lectures – Reasoning – Statistical distributions – Taxonomy – Teachers – Universities – Reasoning skills