As the rest of my portfolio demonstrates, whilst direct teaching forms a proportion of my role, a much larger part of my work is directed toward the effective leadership, support, development and coordination of this very significant teaching effort. In evidence and recognition of the enormous scope of this coordination, management and support role, in 2011 I was selected to chair the Clinical Learning and Assessment Committee (CLAC). For the past eight years, I have successfully led, influenced and guided CLAC, which comprises senior clinicians from all our clinical sites. This role requires me, amongst other things, to report to senior Faculty Committees and to respond to the accreditation requirements of the Australian Medical Council (AMC), the accrediting body for all medical courses across Australia. During my tenure, I have led significant changes to the assessment of clinical examinations, delivered AMC accreditation reports, and developed and strengthened the relationship between the Faculty and our clinical sites, in particular with the staff who work with and teach our students there. In this, I have been aided by (and had the pleasure of mentoring) two Teaching Fellows, one from mid-2017 to early 2018 and another since the end of 2018.
In direct reference to the work of my extensive team and the curriculum redesign and innovations that I have put in place, the most recent AMC Reaccreditation Assessment noted that “The Faculty has an excellent skills program ...”. Whilst this is clearly a team effort, I can claim considerable responsibility for how well this part of our program runs and for its quality-assured status nationally. In particular, my leadership of CLAC, its flow-on effects, and my coordination and supervisory activities across all clinical sites were instrumental in two specific commendations (areas praised) in the AMC’s latest reaccreditation report:
- Commendation (AMC Standard 1.6): “The strong relationship between the Faculty and health administrators at each clinical site visited.”
- Commendation (AMC Standard 4.1): “The level of involvement of teaching staff, including clinicians with a conjoint appointment, and the Faculty’s organised processes that involve and encourage staff in the program’s learning and teaching methods.”
My further work in CLAC has had a singular purpose: to improve the process and reliability of high-stakes clinical examinations. One important step, begun before my last promotion, was to enable hundreds of clinical examiners to use the iPad-based OAapp for grading and feedback provision in clinical examinations. Through my subsequent leadership of CLAC, I have ensured this process occurs across the original five metropolitan Clinical Schools and now also our five rural Clinical Schools (Coffs Harbour, Port Macquarie, Wagga Wagga, Albury, Griffith). Across these sites, ‘mission-critical’ clinical examinations are conducted for >750 students each year. In establishing OAapp use as standard practice for OSCEs, I simplified not only the marking process for examiners but also the administrative tasks around accurate recording and transmission of grades, since these are now transmitted electronically back to Faculty. Grades and examiner comments can now be published to students immediately, reducing post-exam stress and providing maximal educational value.
I have also led the implementation of a new, evidence-based approach to addressing borderline performance in clinical examinations, resulting in changes to grading practice and the introduction of a ‘Borderline’ grade for these examinations. This allows us to accurately identify student performance at the borderline and thus remediate the skills of lower-performing students. In addition, pre-prepared comments that I wrote, easily selected by examiners on the OAapp, streamline the provision of richer and more specific feedback from examiners to every student examined. Previously, with handwritten assessment sheets, only unsatisfactory students received detailed feedback, and I found it unacceptable that valuable examiner feedback (grades and free-text comments) was not available to many students. In developing the pre-prepared comments, I worked hard to provide rich and meaningful yet succinct feedback, satisfying both the students’ diverse learning requirements and the practical requirements of examiners using the app.
Evidence of Impact:
- My implementation of AIB and iPad exams enabled standardisation of clinical examinations across clinical campuses, whilst the introduction of the Borderline grade brought greater reliability to CS assessment, informed by relevant literature and accepted professional practice. As a result, ‘mission-critical’ clinical examinations are more easily delivered, processed and overseen, quicker to grade, and more reliable, whilst students benefit by receiving detailed, personalised feedback.
- Clinical examination outcomes – Whilst the Borderline grade was not implemented explicitly to make the clinical examinations more rigorous, its initial effect was a rise in the fail rate. However, in recent years, with the impact of the SPP and programs such as OSPIA, which are primarily directed toward communication skills development, Year 2 clinical examination results have shown an encouraging trend: in 2017, 21 of 253 students failed (8.3%); only 5 failed on communication skills, with >75% of failing students doing so on physical examination. In 2018, only 12 of 260 examinees failed (4.6%), and 10 of these (>80%) did so on physical examination. (In response, I have implemented new learning activities to further develop students’ skills in physical examination.)