Automated capture and assessment of medical student clinical experience.

Abstract

Currently, many medical educators track trainee clinical experience using student-created manual logs. Using a web-based portfolio system that captures all notes written by trainees in the electronic medical record, we examined a graduating medical student's clinical notes to determine whether we could automatically assess exposure to 10 institution-defined core clinical topics. We located all biomedical concepts in the student's clinical notes, divided by note section, using the KnowledgeMap concept identifier. Notes were ranked according to the concepts matching each core topic's concept list. Clinician educators then reviewed each note to determine its relevance to the core topic. The student covered all core topics, with between 2 and 41 notes containing highly relevant discussions. The algorithm's ranking effectively predicted relevance (p<0.001). This method is a promising first step toward automated competency assessment.
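
The abstract describes ranking notes by how well their extracted concepts match a core topic's concept list. Below is a minimal sketch of that kind of overlap-based ranking, assuming concept identification (e.g., via the KnowledgeMap concept identifier) has already produced a set of concept identifiers per note; the function name, note IDs, and example UMLS-style CUIs are illustrative, and the paper's actual scoring (for instance, any weighting by note section) may differ.

```python
from typing import Dict, List, Set, Tuple


def rank_notes_for_topic(
    note_concepts: Dict[str, Set[str]],   # note_id -> concepts found in that note
    topic_concepts: Set[str],             # concept list defining one core topic
) -> List[Tuple[str, int]]:
    """Return note IDs sorted by how many topic concepts each note contains."""
    scores = {
        note_id: len(concepts & topic_concepts)
        for note_id, concepts in note_concepts.items()
    }
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)


if __name__ == "__main__":
    # Illustrative data only: concept sets stand in for KnowledgeMap output.
    notes = {
        "note_001": {"C0011849", "C0020538", "C0027051"},
        "note_002": {"C0011849", "C0011860"},
        "note_003": {"C0004096"},
    }
    diabetes_topic = {"C0011849", "C0011860", "C0020456"}
    for note_id, score in rank_notes_for_topic(notes, diabetes_topic):
        print(note_id, score)
```

In this sketch, the highest-ranked notes for a topic would be the ones clinician educators review first when judging relevance, mirroring the review step described above.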