Other Publications

Education Columns

Understanding the Perceptions of Geriatric Medical Educators Around Teaching Clinical Reasoning to Medical Learners

By Julie N. Thai, MD, MPH, Section of Geriatric Medicine, Division of Primary Care and Population Health, Department of Medicine, Stanford University School of Medicine

Background

Older adult patients often present diagnostic challenges for several reasons, including communication barriers between clinicians and patients, complex comorbidities, and clinician biases that may affect quality of care.¹,² This population is particularly vulnerable to medical errors, especially those involving medication use.³

In addition, current literature indicates that more than 10% of cases of chronic obstructive pulmonary disease, heart failure, dementia, Parkinson disease, cerebrovascular accident/transient ischemic attack, and acute myocardial infarction are misdiagnosed (either overdiagnosed or underdiagnosed) among patients aged 65 years or older.⁴ The proportion of older adults in the United States is projected to increase to approximately 21.6% by 2030.⁵ Despite this demographic shift, there remains limited expertise on how best to address and overcome diagnostic challenges in this population.

Clinical reasoning—the process of gathering and interpreting information, formulating differential diagnoses, and determining diagnostic probabilities—is a complex skill taught through both theoretical and experiential learning.⁶ However, it remains unclear how medical learners are taught clinical reasoning in ways that promote diagnostic excellence when caring for older adults.⁷

Methods

A national cross-sectional survey was conducted to assess the perceptions of geriatric medical educators regarding the teaching of clinical reasoning skills to trainees. Eligible participants were geriatric medical educators in the United States who interacted with medical learners in any capacity.

Participants were recruited from 2 sources: (1) the directory of geriatric medicine fellowship program directors (PDs) obtained through the American Medical Association FREIDA database (N = 76) and (2) the American Geriatrics Society (AGS) member discussion forum (AGS membership >6000; the number of active forum users eligible for inclusion is unknown). PDs were included because they are known geriatric medical educators.

Recruitment emails were sent to PDs, and study information was posted on the AGS forum. Both communications contained a link to the survey, which was administered via Qualtrics. Reminder emails and posts were distributed 7 days after the initial outreach. Descriptive statistics and frequency analyses were performed.

This study was approved by the Stanford University Institutional Review Board. Funding was provided through the Age-Friendly Fellowship of the Society to Improve Diagnosis in Medicine, in collaboration with the Gordon and Betty Moore Foundation and the John A. Hartford Foundation.

Results

A total of 60 respondents participated, all of whom self-identified as geriatric medical educators. Most were physicians board certified in geriatric medicine and practicing in academic medical centers.

  • Perceptions of clinical reasoning training: 65% agreed that explicit training in clinical reasoning can reduce medical errors and improve diagnostic accuracy.
  • Training experience: 37% reported having received formal instruction on teaching clinical reasoning; 41% reported not knowing where to obtain such training.
  • Training interest: 78% expressed interest in receiving additional training.
  • Reported barriers: Lack of teaching time and lack of formal training in clinical reasoning concepts were identified as primary barriers.

The main limitation of this study was the small sample size. However, the aim was exploratory—to gather preliminary insights into the needs and perspectives of geriatric medical educators regarding clinical reasoning instruction.

Discussion

Findings from this study align with prior research demonstrating that medical educators have called for structured, formalized training in teaching clinical reasoning.⁸,⁹ A recent national needs assessment across multiple disciplines similarly found that while clerkship directors recognized the importance of clinical reasoning instruction, they were constrained by limited curricular time.¹⁰

Despite the small sample, the results of this study have important implications for medical education reform. Providing formal training to geriatric medical educators on how to teach clinical reasoning—particularly in the context of diagnosing older adults—may contribute to improved diagnostic accuracy and patient outcomes. Future directions include developing workshops and other educational modalities to “teach the teachers.”

References

  1. Burgener AM. Enhancing communication to improve patient safety and to increase patient satisfaction. Health Care Manag (Frederick). 2020;39(3):128–132.
  2. Cassel C, Fulmer T. Achieving diagnostic excellence for older patients. JAMA. 2022;327(10):919–920. doi:10.1001/jama.2022.1813
  3. Santell JP, Hicks RW. Medication errors involving geriatric patients. Jt Comm J Qual Patient Saf. 2005;31(4):233–238. doi:10.1016/S1553-7250(05)31030-0
  4. Skinner TR, Scott IA, Martin JH. Diagnostic errors in older patients: a systematic review of incidence and potential causes in seven prevalent diseases. Int J Gen Med. 2016;9:137–146. doi:10.2147/IJGM.S96741
  5. US Census Bureau. 2020 Census Data. Accessed [Month Day, Year]. https://www.census.gov
  6. Durning SJ, Jung E, Kim DH, Lee YM. Teaching clinical reasoning: principles from the literature to help improve instruction from the classroom to the bedside. Korean J Med Educ. 2024;36(2):145–155. doi:10.3946/kjme.2024.292
  7. Centers for Disease Control and Prevention. Core Elements of Hospital Diagnostic Excellence (DxEx). Accessed [Month Day, Year]. https://www.cdc.gov/patient-safety/hcp/hospital-dx-excellence/index.html
  8. Mohd Tambeh SN, Yaman MN. Clinical reasoning training sessions for health educators: a scoping review. J Taibah Univ Med Sci. 2023;18(6):1480–1492. doi:10.1016/j.jtumed.2023.06.002
  9. Wagner F, Sudacka M, Kononowicz A, et al. Current status and ongoing needs for the teaching and assessment of clinical reasoning: an international mixed-methods study from the students’ and teachers’ perspective. BMC Med Educ. 2024;24:622. doi:10.1186/s12909-024-05518-8
  10. Gold JG, Knight CL, Christner JG, Mooney CE, Manthey DE, Lang VJ. Clinical reasoning education in the clerkship years: a cross-disciplinary national needs assessment. PLoS One. 2022;17(8):e0273250. doi:10.1371/journal.pone.0273250

Contact Us

11400 Tomahawk Creek Parkway

Leawood, KS 66211

(800) 274-7928

stfmoffice@stfm.org

Tips for Using STFM's AI Assistant

STFM's AI Assistant is designed to help you find information and answers about Family Medicine education. While it's a powerful tool, getting the best results depends on how you phrase your questions. Here's how to make the most of your interactions:

1. Avoid Ambiguous Language

Be Clear and Specific: Use precise terms and avoid vague words like "it" or "that" without clear references.

Example:

Instead of: "Can you help me with that?"
Try: "Can you help me update our Family Medicine clerkship curriculum?"
Why this is important: Ambiguous language can confuse the AI, leading to irrelevant or unclear responses. Clear references help the chatbot understand exactly what you're asking.

2. Use Specific Terms

Identify the Subject Clearly: Clearly state the subject or area you need information about.

Example:

Instead of: "What resources does STFM provide?"
Try: "I'm a new program coordinator for a Family Medicine clerkship. What STFM resources are available to help me design or update clerkship curricula?"
Why this is better: Providing details about your role ("program coordinator") and your goal ("design or update clerkship curricula") gives the chatbot enough context to offer more targeted information.

3. Don't Assume the AI Knows Everything

Provide Necessary Details: The STFM AI Assistant has been trained on STFM's business and resources; it can only use the information you provide or the material it has been trained on.

Example:

Instead of: "How can I improve my program?"
Try: "As a program coordinator for a Family Medicine clerkship, what resources does STFM provide to help me improve student engagement and learning outcomes?"
Why this is important: Including relevant details helps the AI understand your specific situation, leading to more accurate and useful responses.

4. Reset if You Change Topics

Clear Chat History When Switching Topics: If you move to a completely new topic and the chatbot doesn't recognize the change, click the Clear Chat History button and restate your question.
Note: Clearing your chat history removes all previous context from the chatbot's memory.
Why this is important: Resetting ensures the AI does not carry over irrelevant information, which could lead to confusion or inaccurate answers.

5. Provide Enough Context

Include Background Information: The more context you provide, the better the chatbot can understand and respond to your question.

Example:

Instead of: "What are the best practices?"
Try: "In the context of Family Medicine education, what are the best practices for integrating clinical simulations into the curriculum?"
Why this is important: Specific goals, constraints, or preferences allow the AI to tailor its responses to your unique needs.

6. Ask One Question at a Time

Break Down Complex Queries: If you have multiple questions, ask them separately.

Example:

Instead of: "What are the requirements for faculty development, how do I register for conferences, and what grants are available?"
Try: Start with "What are the faculty development requirements for Family Medicine educators?" Then follow up with your other questions after receiving the response.
Why this is important: This approach ensures each question gets full attention and a complete answer.

Examples of Good vs. Bad Prompts

Bad Prompt

"What type of membership is best for me?"

Why it's bad: The AI Chat Assistant has no information about your background or needs.

Good Prompt

"I'm the chair of the Department of Family Medicine at a major university, and I plan to retire next year. I'd like to stay involved with Family Medicine education. What type of membership is best for me?"

Why it's good: The AI Chat Assistant knows your role, your future plans, and your interest in staying involved, enabling it to provide more relevant advice.

Double Check Important Information

While the AI Chat Assistant is a helpful tool, it can still produce inaccurate or incomplete responses. Always verify critical information with reliable sources or colleagues before taking action.

Technical Limitations

The Chat Assistant:

  • Cannot access external websites or open links
  • Cannot process or view images
  • Cannot make changes to STFM systems or process transactions
  • Cannot access real-time information (like your STFM Member Profile information)
