PURPOSE: E-learning provides a flexible and effective approach to learning and is increasingly used in otorhinolaryngology (ORL). We developed a national theoretical e-learning course for ORL trainees and describe our experiences with implementing and piloting it.
METHODS: E-learning course content was developed as structured multiple-choice quizzes based on the European core curriculum textbook. An open-source learning management system was adapted for a self-paced e-learning course. We piloted the e-learning course as a non-mandatory option for the 15 residents participating in the four-day Danish national training course in rhinology in February 2019. A post-course survey was sent to the participants and used in the evaluation along with activity data from the learning management system.
RESULTS: Fourteen of the 15 trainees participated in the e-learning course. Nine participants completed >95% of the course. The activity data demonstrated that participants with the highest completion rates typically began well in advance of the course (>2 months). Overall, the e-learning course was rated positively in relation to learning and as preparation for the national training course. Participants responded that the level of the e-learning course was higher than that of the national curriculum and at times slightly incongruent with its content. Participants wanted protected study time for e-learning activities in their residency program. All participants responded that they would use e-learning in relation to future national training courses.
CONCLUSIONS: Developing a national e-learning course is feasible, and the course was well received by trainees as well as other educational stakeholders.
A variety of structured assessment tools for use in surgical training have been reported, but existing assessment tools often employ paper-based rating forms. Digital assessment forms for evaluating surgical skills could offer advantages over paper-based forms, especially in complex assessment situations. In this paper, we report on the development of cross-platform digital assessment forms for use with multiple raters, designed to facilitate the automatic processing of surgical skills assessments that include structured ratings. The FileMaker 13 platform was used to create a database containing the digital assessment forms, because this software offers cross-platform functionality on both desktop computers and handheld devices. The database is hosted online, so the rating forms can also be accessed through most modern web browsers. Cross-platform digital assessment forms were developed for the rating of surgical skills. The database platform used in this study was reasonably priced, intuitive for the user, and flexible. The forms have been provided online as free downloads that may serve as the basis for further development or as inspiration for future efforts. In conclusion, digital assessment forms can be used for the structured rating of surgical skills and have the potential to be especially useful in complex assessment situations involving multiple raters, repeated assessments at various times and locations, or substantial subsequent data processing and complex score calculations.