Lise McCoy MTESL, Clinical Affairs Unit, School of Osteopathic Medicine in Arizona
Frederic N. Schwartz, DO, Department of Family and Community Medicine, School of Osteopathic Medicine in Arizona
Overview: Improving the quality of training for primary care physicians is an important goal as the United States faces a growing shortage of family physicians. The School of Osteopathic Medicine in Arizona (SOMA) trains all of its undergraduate medical students during years MS2 through MS4 at community campuses that provide medical home services to vulnerable, underserved populations. During these experiences, medical students see a large volume and variety of patients. Students log patient encounters daily through SOMA’s online database, E*Value. Student clinical logs include patient demographics, diagnoses, procedures, and clinical presentations. In the first 18 months of the program, more than 400,000 clinical case logs were submitted electronically. This large body of case logs provides the college with valuable information about the variety of cases students see during rotations. Another key aspect of the clinical training program is professionalism. The college strives to increase student professionalism with respect to clinical case logging. We teach students that accurate case-log documentation is a critically important clinical skill, as it provides basic training in keeping accurate patient records. A familiar adage in the medical profession holds: if it wasn’t documented, it didn’t happen.
The Study: SOMA tracks both clinical case logs and clinical performance evaluations within E*Value, collecting data from MS3 and MS4 students at eleven remote sub-campuses. This allows investigations into the relationship between professionalism and undergraduate case log volume by region. In this study, SOMA investigates the relationship between clinical case log volume and student professionalism ratings as evaluated by clinical preceptors during the third and fourth years. This study also highlights successful ways that technology supports the collection and review of student assessment data in a distributed education model.
Methodology: After each rotation, MS3 and MS4 students are rated by preceptors using the SOMA Clinical Performance Evaluation, and these data are tracked within E*Value. The evaluation contains items relating specifically to professionalism. SOMA proposes to rank 200 medical students by their professionalism scores on the Clinical Performance Evaluation. During their third and fourth years, medical students are required to log every patient encounter in E*Value. SOMA will tally each student’s case log volume for diagnoses and procedures logged during MS3 and MS4 and rank the students by this measure as well. SOMA will then compare the two rankings and report any correlation between case log volume and professionalism scores. The study will also report notable variations in clinical case logging and/or professionalism among SOMA’s eleven sub-campuses.
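The rank comparison described above can be carried out with a Spearman rank correlation, which measures how closely two rankings agree. The sketch below is purely illustrative and is not SOMA's actual analysis; the student totals and professionalism scores shown are hypothetical, and the implementation uses average ranks to handle ties.

```python
# Illustrative sketch (assumed analysis, not SOMA's published method):
# Spearman rank correlation between per-student case-log volume and
# mean professionalism score. All data below are hypothetical.

def ranks(values):
    """Assign 1-based ranks, averaging the ranks of tied values."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend j over any run of tied values.
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the tied rank positions
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rho = Pearson correlation of the two rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-student totals: case logs and mean professionalism score.
case_logs = [1420, 980, 1760, 1100, 1530]
prof_scores = [4.2, 3.8, 4.6, 4.4, 4.0]
print(round(spearman(case_logs, prof_scores), 3))
```

A rho near +1 would indicate that students who log more encounters also tend to receive higher professionalism ratings; a value near 0 would suggest no monotonic relationship between the two rankings.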
Implications: This presentation will review the educational implications of studying clinical performance evaluation data in relation to case log data. It will also summarize points related to using technology to track clinical performance evaluation data and case log data across a distributed, multi-campus model. This line of inquiry may yield insights about case log data collection and its role in medical education, and about whether electronic case logging can be used more widely for collaborative interactions, assessments, and projects.