The volume of medical knowledge has grown exponentially in the 21st century. Historically, medical education focused on developing clinical skills and medical knowledge; today, such a narrow focus is inadequate. A physician is now expected to be a medical expert who is also a scholar dedicated to lifelong learning, integrating new and rapidly progressing knowledge. He or she is expected to be an excellent communicator, a patient advocate, a leader within the health care system, and a collaborator working with a multitude of other professions. In 1996, the Royal College of Physicians and Surgeons of Canada recognized the need for this broad skill set with the publication of the CanMEDS physician competency framework (Frank, 2015). The CanMEDS framework provided a new set of goals for medical education, and with these goals come new challenges. Older assessment models struggle to evaluate many of the skills demanded of a competent physician: studies have shown that traditional means of evaluation cannot easily assess performance in many of the physician roles defined in the CanMEDS framework (Chou, 2008; Sherbino, 2013).
A new means of evaluating the progress of a learner within medical education is therefore required. This need has led to competency-based medical education (CBE) being adopted as the optimal model for evaluating and advancing the medical learner. A CBE curriculum operates on the premise of defined educational objectives: the trainee must demonstrate proficiency in these defined objectives before progressing to more advanced tasks. The notion of CBE is not new; its theoretical behaviorist underpinnings were first defined by experimental psychologists such as Thorndike and Skinner (Morcke, 2013).
The Royal College has responded by defining the physician roles through specific competencies, now called milestones, which are collectively to be evaluated (Frank, 2014). A new challenge is therefore on the horizon for the medical educator: how can proficiency in these defined milestones be evaluated? How can we determine whether a medical trainee is competent for his or her expected level of training in a given role? Without valid methods of evaluation, an effective CBE curriculum is impossible. I believe digitally assisted instruction will play an essential role in empowering both the medical trainee and the educator in learning, teaching, and evaluating CBE milestones. A growing body of work has demonstrated that technology-enhanced simulation is superior for teaching and demonstrating mastery of many procedural skills required of healthcare professionals (Cook, 2013; Willis, 2015). Cognitive and problem-solving tasks can also be taught and evaluated effectively and efficiently through digital technologies. The digital medium is superbly poised to address these needs by applying connectivist theory; it also facilitates discovery learning and learner-centered design principles. Online assessment of graduate medical trainees has likewise been found to improve learner performance (Karakus, 2014; Nacca, 2014; Pusponegoro, 2015; Wickens, 2015). Collectively, these developments suggest that digital tools are useful adjuncts in instruction and assessment. Such technology is particularly advantageous when applied to curricula built on CBE milestones, and given the Royal College's mandate toward this instructional design model, it is needed more than ever within medical education.
Digital tools make instructional objectives that are not commonly encountered at the bedside, nor easily assessed within the everyday clinical environment, teachable and assessable within a CBE curriculum. The medical educator must therefore be versed in educational theory and practice as well as in the application of digital technology. Through the successful combination of clinical bedside teaching and mindful instructional design that is inclusive of digital technology, the best possible learning opportunities can be provided to the physician in training.
Chou, S., Cole, G., McLaughlin, K., & Lockyer, J. (2008). CanMEDS evaluation in Canadian postgraduate training programmes: Tools used and programme director satisfaction. Medical Education, 42(9), 879-886.
Frank, J., Snell, L., & Sherbino, J. (2015). The Draft CanMEDS 2015 Physician Competency Framework. CanMEDS 2015.
Karakus, A., & Şenyer, N. (2014). The preparedness level of final year medical students for an adequate medical approach to emergency cases: Computer-based medical education in emergency medicine. International Journal of Emergency Medicine, 7(3), 1-6.
Morcke, A., Dornan, T., & Eika, B. (2013). Outcome (competency) based education: An exploration of its origins, theoretical basis, and empirical evidence. Advances in Health Science Education, 18(4), 851-863.
Nacca, N., Holliday, J., & Ko, P. (2014). Randomized trial of a novel ACLS teaching tool: Does it improve student performance? Western Journal of Emergency Medicine, 15(7), 913-918.
Pusponegoro, H., Soebadi, A., & Surya, R. (2015). Web-based versus conventional training for medical students on infant gross motor screening. Telemedicine and E-Health, 21(12), 1-6.
Sherbino, J., Kulasegaram, K., Worster, A., & Norman, G. (2013). The reliability of encounter cards to assess the CanMEDS roles. Advances in Health Science Education, 18(5), 987-996.
Wickens, B., Lewis, J., Morris, D., Husein, M., Ladak, H., & Agrawal, S. (2015). Face and content validity of a novel, web-based otoscopy simulator for medical education. Journal of Otolaryngology, 54(11), 1-8.