The Ophthalmologist / Issues / 2026 / February / Competence Over Time
Health Economics & Policy | Insights | Opinion

Competence Over Time

Should time really surpass competence when it comes to ophthalmology training?

By Ann Sofia Skou Thomsen and Lars Konge | February 23, 2026

In a discipline where micrometers matter and errors can permanently alter a patient’s life, ophthalmology training still relies heavily on one of the bluntest instruments available: time. Case numbers, procedures logged, or time used on a virtual-reality simulator all continue to function as proxies for real-life competence.


Rethinking what it means to be competent

Competence in ophthalmology is not an abstract concept. It is the demonstrable ability to perform complex technical, cognitive, and decision-making tasks safely, consistently, and independently. And yet these time-based systems assume that competence emerges naturally through exposure. Spend long enough in the operating room, the logic goes, and proficiency will eventually follow.

However, decades of educational research tell a different story. Our own research (1-3) has repeatedly demonstrated that learners acquire skills at vastly different rates. Some trainees reach proficiency early, while others require more extended periods of training. So when progression is dictated by time rather than competence, two things can happen: capable trainees are held back when they are already competent, and underprepared ones are pushed forward when they still require more training.

Studies on cataract surgery training using simulation-based mastery learning also revealed wide variability in how long trainees needed to reach predefined proficiency benchmarks. Crucially, once those benchmarks were met, performance in the operating room improved – independent of training duration (4).

When legacy becomes liability

The apprenticeship model – “see one, do one, teach one” – has deep roots in surgical culture. Observation, gradual participation, and increasing responsibility remain central elements of ophthalmology training. These elements are not inherently flawed – but they are insufficient on their own.

Modern ophthalmic surgery takes place in a context of limited operating room availability, increasing subspecialization, and high expectations for patient safety. Teaching opportunities vary between supervisors, feedback is often informal, and assessment frequently relies on global impressions rather than structured criteria.

Our research has shown that expert judgment alone is insufficient to reliably assess surgical competence (1,5). Even experienced surgeons often disagree on what constitutes “good enough,” and their evaluations can be influenced by bias, familiarity, or context. And so, without shared standards and objective measures, the apprenticeship model risks producing variability rather than reliability. What once worked in a different era may now expose patients and trainees to unnecessary risk.

Competency-based education

Competency-Based Medical Education (CBME) is not simply a curricular tweak. It is a fundamental shift in focus – from time spent training to outcomes achieved.

At its core, CBME asks a radical question: which skills must a trainee reliably demonstrate before being allowed to operate on a patient’s eye?

This reframing has profound implications. Progression is no longer automatic with seniority. Instead, it depends on demonstrated competence across clearly defined domains – expectations are explicit, assessment is systematic, and feedback is structured.

Work in simulation-based training provides concrete examples of how this can be operationalized: define performance standards, measure skills objectively, and require mastery before advancement (6). This approach replaces vague expectations with transparent, measurable benchmarks.

Measurement is the unpleasant but necessary backbone of CBME. Technical skills can be measured reliably using assessment tools with evidence of validity – particularly in simulation settings. Metrics such as error rates, instrument handling, tissue damage, and procedural flow all offer objective insight into a trainee’s performance. Without such measurement, educators are left to guess.

This type of objective assessment protects patients, supports trainees, and provides educators with actionable information. It enables early identification of skill gaps and targeted remediation before patients are exposed to risk.

Barriers to implementation

So if the evidence supporting CBME is strong, why has implementation been so slow?

CBME challenges deeply ingrained hierarchies. It exposes variability in teaching and assessment practices, and it disrupts the notion of seniority-based privilege. It requires faculty development, shared assessment frameworks, and institutional commitment. It also introduces a level of transparency that can feel uncomfortable to some.

Resistance can also stem from concerns about workload or loss of autonomy, and some worry that competence cannot be fully captured by metrics, despite evidence to the contrary. Often, these concerns reflect legitimate pressures within clinical practice rather than any opposition to patient safety.

Therefore, understanding these barriers is essential – not to assign blame, but to address them thoughtfully.

The road ahead

The future of ophthalmology training must be intentional, measurable, and patient-centered.

This statement is neither radical nor unrealistic. Simulation-based mastery learning should be a standard prerequisite – not an adjunct – to operating room experience. Objective assessment tools should be embedded throughout training, not reserved for final evaluations. Faculty should be supported in developing assessment expertise alongside their surgical and clinical skills.

Crucially, progression should be flexible. Faster learners should progress without unnecessary delay, while those needing more training should receive it without stigma.

Competency as the gold standard

Ophthalmology demands precision, responsibility, and trust – and our training systems should evolve so that they continue to embody those principles.

Time-based training offers simplicity, but competence-based training offers safety. The growing evidence in ophthalmic surgical training has made one thing clear: excellence cannot be assumed – it must be demonstrated.

The tools for CBME exist. The only remaining question is whether we have the courage to let evidence – not habit – define how future generations of ophthalmologists are trained. Competence must be measured, achieved, and sustained. Anything less is a risk we should no longer be willing to take.

References

  1. Thomsen et al., “Update on simulation-based surgical training and assessment in ophthalmology: a systematic review,” Ophthalmology, 122, 1111 (2015). PMID: 25864793.
  2. Petersen et al., “Pretraining of basic skills on a virtual reality vitreoretinal simulator,” Acta Ophthalmol., [Online ahead of print] (2022). PMID: 34609052.
  3. Thomsen et al., “Is there inter-procedural transfer of skills in intraocular surgery? A randomized controlled trial,” Acta Ophthalmol., [Online ahead of print] (2017). PMID: 28371367.
  4. Thomsen et al., “Operating Room Performance Improves after Proficiency-Based Virtual Reality Cataract Surgery Training,” Ophthalmology, 124, 524 (2017). PMID: 28017423.
  5. Borgersen et al., “Gathering Validity Evidence for Surgical Simulation: A Systematic Review,” Annals of Surgery, 267, 1063 (2018). PMID: 29303808.
  6. Bjerrum et al., “Surgical simulation: Current practices and future perspectives for technical skills training,” Med Teach., 40, 668 (2018). PMID: 29911477.

About the Author(s)

Ann Sofia Skou Thomsen

Ann Sofia Skou Thomsen is a board-certified ophthalmologist and clinical research associate professor in ophthalmology at the University of Copenhagen, Denmark, with a PhD in technology-enhanced training and assessment of surgical skills. She is affiliated with the Department of Ophthalmology, Rigshospitalet, and the Copenhagen Academy for Medical Education and Simulation (CAMES). Her research spans virtual reality–based simulation training, transfer of skills, objective skills assessment, and robot-assisted surgery, with multiple peer-reviewed publications.

Lars Konge

Lars Konge is a certified cardiothoracic surgeon, a consultant at the Copenhagen Academy for Medical Education and Simulation (CAMES), Rigshospitalet, a full professor of medical education at the University of Copenhagen, and a part-time professor at the University of Southern Denmark. He holds a PhD in assessment, and his main research areas are simulation-based training and certification of technical skills. Professor Konge is one of the world’s most active researchers in medical education and has published more than 400 papers in international peer-reviewed journals. He has supervised 34 PhD dissertations and currently supervises 12 PhD students working on medical education research.
