Sunday, December 10, 2023

Ongoing project draft: Optimizing clinical complexity through real-patient OSCEs in user-driven online learning portfolios and learning fora

Title:

Optimizing clinical complexity through real-patient OSCEs in user-driven online learning portfolios and learning fora



Introduction: 

Most learning is a process of objectively structuring subjective, complex, multidimensional real-life data (Bloom's levels 3-5) into a two-dimensional space (Bloom's level 1, aka knowledge) that can be stored, forwarded asynchronously, and modeled conceptually to gain understanding (level 2) through further analysis (level 4) and evaluation (level 5), and then relooped into the learning ecosystem as creative communication/publication (level 6). This learning is cyclic, and one can keep moving in and out of these levels at any entry or exit point regardless of level numbers.

More about Bloom levels in a past lecture here:



Problem statement :

Objective structured clinical examinations (OSCEs) have been regarded as a medical education assessment tool for many decades, and at one point they threatened the long case with extinction. (1-2)

However, educators may have gradually realized that traditional OSCEs are simplified simulations that disregard real-world complexities and create an illusion of certainty for learners and assessors alike. (1-2)

Solution: a modification of the traditional format that preserves realism and clinical complexity, developed through the candidate's capacity for Socratic questioning. This drives further analysis and evaluation of the long case, producing Bloom level 6 learning insights on the case, not as a one-time summative assessment over a few hours but over the days of the patient's hospital stay and the subsequent follow-up at home. Throughout, the student's evaluation continues through direct 360-degree observation by multiple evaluators, whose feedback inputs around the patient serve as valuable assessment indicators during thematic analysis. (2)


Methods :

We have developed a modification of the traditional format that preserves realism and clinical complexity through the candidate's capacity for Socratic questioning, leading to further analysis and evaluation of the long case and to Bloom level 6 learning insights on the case, as demonstrated and published here: https://vaishnavimaguluri.blogspot.com/2023/07/general-medicine-case-report-osce.html

And here :



Similar (although not identical) approaches by others to preserve realism and clinical complexity in OSCE assessments are in the link below (also detailed in the references):



Results :

2019 batch OSCEs as pre-final undergraduates (UGs) in December 2023:


2018 batch OSCEs as an internship-completion assessment, done as a patient-teacher encounter along with direct-observation techniques currently prevalent in medical education:



Quantitative results are in the above links for further analysis as of December 2023, while a sample of the qualitative thematic analysis (also in the above links), done using pre-designed prompts to an AI LLM (ChatGPT-4), is quoted below:



"The reflective writing assignment submitted by the medical PG intern, Navyatha Pothula, can be evaluated along Bloom's Taxonomy and the specific rubric provided:

1. **Bloom's Taxonomy Evaluation**:

   - **Knowledge**: The intern demonstrates knowledge by discussing cases, such as dermatomyositis, erythroderma secondary to psoriasis, dengue, thrombocytopenia, and hyperglycemia.

   - **Comprehension**: The understanding is shown in the ability to discuss the medical cases and their complexities, as well as the lessons learned from the internship experience.

   - **Application**: The application of knowledge is evident in managing clinical responsibilities and learning to prioritize and allocate time efficiently.

   - **Analysis**: There is an analysis of cases and an interpretation of fever charts and treatment plans.

   - **Synthesis**: The intern synthesizes information from various cases and experiences to understand the importance of balancing empathy with professional detachment and other essential skills.

   - **Evaluation**: The intern evaluates the emotional toll of patient care and the importance of human connection, courage, teamwork, balancing, and mentorship in the medical profession.

   Points missing at each level could include more in-depth analysis of patient data, synthesis of broader medical knowledge, and critical evaluation of medical practices or personal performance."

LLM-driven analysis of our more than 1,000 online learning portfolios in our dashboard here: https://medicinedepartment.blogspot.com/2022/02/?m=0 is an ongoing medical cognition project, showcased at (4) here: https://medicinedepartment.blogspot.com/2023/10/ongoing-medical-cognition-projects-in.html?m=0
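For readers interested in how such a prompt-driven portfolio analysis might be batched, a minimal sketch is below. The function names, the rubric wording, and the batching logic are illustrative assumptions on our part; the actual pre-designed prompts used with ChatGPT-4 are in the linked posts, and a real pipeline would send each assembled prompt to an LLM API rather than merely collecting it.

```python
# Illustrative sketch only: assembles a Bloom's-taxonomy rubric prompt
# for each reflective-writing portfolio entry. The rubric levels mirror
# those used in the thematic analysis quoted above; everything else
# (names, wording) is a hypothetical stand-in for the real prompts.

BLOOM_LEVELS = [
    "Knowledge", "Comprehension", "Application",
    "Analysis", "Synthesis", "Evaluation",
]

def build_thematic_prompt(portfolio_text: str) -> str:
    """Assemble a rubric-driven prompt asking an LLM to rate one
    portfolio entry at each Bloom level, with supporting quotes."""
    rubric = "\n".join(f"- {level}" for level in BLOOM_LEVELS)
    return (
        "Evaluate the following reflective writing along Bloom's "
        "taxonomy. For each level, quote supporting evidence and "
        "note what is missing:\n"
        f"{rubric}\n\n"
        f"Portfolio entry:\n{portfolio_text}"
    )

def analyze_portfolios(entries: list[str]) -> list[str]:
    """Batch the prompts for a dashboard of portfolio entries.
    A real pipeline would submit each prompt to the LLM API and
    collect the graded responses for thematic analysis."""
    return [build_thematic_prompt(text) for text in entries]

# Example: one toy entry produces one ready-to-send prompt.
prompts = analyze_portfolios(
    ["Today I managed a case of dengue with thrombocytopenia."]
)
```

The same loop scales to the full dashboard of portfolios, with the LLM's per-level responses then coded thematically by human evaluators.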


Discussion :

The current project plan is to thematically analyze all the available data for the 2019 and 2018 batches of 200 health professionals each, and to qualitatively evaluate an early-phase trial of real-patient OSCEs evolving from the near-extinct and largely unstructured long case. In this trial the OSCE becomes a strong driver of student learning outcomes, as demonstrated in our linked results, and those student learning outcomes in turn can be shown to drive patient health outcomes, which is our holy grail for integrating medical education and practice.

References (and further reading) :

1) Reid H, Gormley GJ, Dornan T, Johnston JL. Harnessing insights from an activity system - OSCEs past and present expanding future assessments. Med Teach. 2021;43:44-9.

"Preparing for OSCE success diverts students away from the complexity of authentic clinical environments. Students will not qualify and will, therefore, be of no use to patients without getting marks providing evidence of competence. Performing in a formulaic and often non patient-centred way is the price to pay for a qualification. Acknowledging the stultifying effect of standardising human behaviour for OSCEs opens up possibilities to release latent energy for change in medical education. In this imagined future, the overall object of education is refocused on patient care."



2) a) Hatala R, Marr S, Cuncic C, et al. Modification of an OSCE format to enhance patient continuity in a high-stakes assessment of clinical performance. BMC Med Educ. 2011;11:23. https://doi.org/10.1186/1472-6920-11-23

"Fragmentation of complex clinical cases into brief OSCE stations may result in some loss of validity as the authenticity of comprehensively examining a single patient is lost. Examiners may be constrained from assessing the trainee's understanding of a complex patient [2]. The lack of a full patient assessment may send an inaccurate message to the trainees as to the characteristics of clinical competence valued by the examination board.

One potential solution has used work-sampling strategies to assess clinical competence across multiple patients in real clinical practice, as in the mini-clinical evaluation exercise (mini-CEX) [6]. A second solution has been to improve the long case by using direct observation of performance, multiple examiners and standardized rating forms."

b) On DOTs (directly observed training) in medical education:

Pereira AG, Woods M, Olson APJ, van den Hoogenhof S, Duffy BL, Englander R. Criterion-Based Assessment in a Norm-Based World: How Can We Move Past Grades? Acad Med. 2018;93(4):560-564.

Abstract: 
In the United States, the medical education community has begun a shift from the Flexnerian time-based model to a competency-based medical education model. The graduate medical education (GME) community is substantially farther along in this transition than is the undergraduate medical education (UME) community. GME has largely adopted the use of competencies and their attendant milestones and increasingly is employing the framework of entrustable professional activities (EPAs) to assess trainee competence. The UME community faces several challenges to successfully navigating a similar transition. First is the reliance on norm-based reference standards in the UME-GME transition, comparing students' performance versus their peers' with grades, United States Medical Licensing Examination Step 1 and Step 2 score interpretation, and the structured Medical School Performance Evaluation, or dean's letter. Second is the reliance on proxy assessments rather than direct observation of learners. Third is the emphasis on summative rather than formative assessments. Educators have overcome a major barrier to change by establishing UME outcomes assessment criteria with the advent and general acceptance of the physician competency reference set and the Core EPAs for Entering Residency in UME. Now is the time for the hard work of developing assessments steeped in direct observation that can be accepted by learners and faculty across the educational continuum and can be shown to predict clinical performance in a much more meaningful way than the current measures of grades and examinations. The acceptance of such assessments will facilitate the UME transition toward competency-based medical education.


