Education Columns

Training Medical Students to Communicate Effectively With Interprofessionals Using a Simulated Electronic Health Record

By Zaiba Jetpuri, DO; Thomas Dalton, MD; Dan Sepdham, MD; Kate Bridges; Tamara McGregor, MD

Introduction
Communication between health care professionals within the electronic health record (EHR) is common and increasingly necessary for efficient and effective patient care. Medical professionals communicate frequently with providers across professional lines through the EHR, but limited curricula exist to train medical students in this critical skill.1 In a 2009 pilot study conducted at the University of Texas Southwestern Medical Center (UTSW), researchers found that medical students need further guidance and instruction to display appropriate EHR skills, as they do not otherwise typically get the opportunity to practice or demonstrate these skills.2

The purpose of this educational intervention was to introduce a standardized, simulated experience in a family medicine (FM) clerkship, with the goal of teaching students about the principles of interprofessional communication in a simulated EHR environment.

Methods
In this exploratory study, medical students at the beginning of a 4-week FM clerkship were asked to evaluate their understanding of electronic communication with health professionals and patients using eight Likert-scale questions. The questionnaire exhibited strong reliability (α=.86). Clerkship students were also asked to complete a seven-item, multiple-choice test to assess their knowledge of EHR communication (Appendix 1). Students participated in an online didactic module reviewing the importance of professional communication and interprofessional roles, and then simulated collaborative health care practices to complete weekly EHR tasks over the 4 weeks of the clerkship. At the end of the FM clerkship, participants completed the same confidence and knowledge evaluation as a posttest. Multiple-choice answers were coded for correctness (1=correct, 0=incorrect), and a paired t test was used to compare pretest and posttest results.
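
For readers interested in reproducing this kind of pre/post comparison, the brief Python sketch below shows one way the paired t test could be computed. The file name, column names, and scoring layout are hypothetical placeholders for illustration, not the study's actual dataset or analysis code.

    # Minimal sketch (hypothetical) of the pre/post knowledge comparison described above.
    # Assumes a CSV with one row per student and columns "pretest_sum" and "posttest_sum"
    # holding each student's knowledge sum score (number of correct answers, 0-7);
    # the file and column names are illustrative only.
    import pandas as pd
    from scipy import stats

    scores = pd.read_csv("ehr_knowledge_scores.csv")

    # Keep only students who completed both the pretest and the posttest.
    paired = scores.dropna(subset=["pretest_sum", "posttest_sum"])

    # Paired t test comparing each student's posttest score with their pretest score.
    t_stat, p_value = stats.ttest_rel(paired["posttest_sum"], paired["pretest_sum"])

    print(f"Pretest mean:  {paired['pretest_sum'].mean():.2f}")
    print(f"Posttest mean: {paired['posttest_sum'].mean():.2f}")
    print(f"Paired t = {t_stat:.2f}, P = {p_value:.3f}")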

Results
Students (N=423) completed the pretest from May 2016 through January 2018, with 378 (89.4%) participants completing both the pretest and posttest measures. Of the 378 participants, 65% had not previously engaged in communication with other professionals using an EHR system.

Medical student confidence significantly increased on all eight questions (see Table 1), indicating the intervention had a positive effect on participants’ comfort using an EHR for interprofessional communication. The knowledge sum scores increased significantly (P<.001) over the course of the clerkship, with the mean rising almost a full point (pretest M=3.37, SD=.60; posttest M=4.31, SD=.48). Finally, students who scored higher on the knowledge pretest were also more likely to score higher on the posttest.

 

Table 1: Pretest to Posttest Change in Student Confidence

Question | Mean Difference | SD
I have strong written communication skills. | .49* | .83
I understand the elements of written communication in a professional environment. | .62* | .84
I understand the elements of written communication in an electronic health record. | .95* | .99
I understand the most effective methods for written communication between health professionals. | 1.21* | .95
I understand the most effective methods for written communication from health professionals to patients. | 1.16* | .94
I can identify the roles and responsibilities of other health care team members. | .63* | .78
I understand the mechanics of how to communicate with other professionals within an electronic health record. | 1.29* | 1.05
I possess the skills to communicate as part of a team in the electronic health record. | 1.00* | 1.04

*Statistically significant increase from pretest to posttest.

Discussion
Medical students have the potential to gain confidence and knowledge about interprofessional EHR communication through skills training and practice. Though many students exhibited growth from this process, this study had several limitations. Primarily, the educational tool was limited in its ability to automatically message students about the weekly assignments, which made the process time-consuming and required manual input from staff. Also, because students were simulating communication with mock health professionals, they were less motivated to complete the assigned tasks: there was no immediate application of their skill, and students were not evaluated on their use of these skills. Students also were expected to use the educational EHR only four times over the clerkship; with such limited use, it is difficult to adequately assess whether a student’s skill improved.

Future directions for improvement would be to implement a longitudinal simulation spanning the entire clerkship phase and to expand the communication scenarios so they are more robust. Additionally, incorporating a method for evaluating students over their practice sessions would allow for real-time formative feedback and add further value to the experience. Another consideration would be to integrate this project with the allied health professions schools so medical students could practice interprofessional communication directly with their allied health student contemporaries.

Conclusion
By introducing an EHR simulation to students, medical programs have the potential to train students in a meaningful skill to enhance their future medical practice and communication with interprofessional colleagues.

References

  1. Hammoud MM, Dalrymple JL, Christner JG, et al. Medical student documentation in electronic health records: a collaborative statement from the Alliance for Clinical Education. Teach Learn Med. 2012;24(3):257-266. https://doi.org/10.1080/10401334.2012.692284
  2. Morrow JB, Dobbie AE, Jenkins C, Long R, Mihalic A, Wagner J. First-year medical students can demonstrate EHR-specific communication skills: a control-group study. Fam Med. 2009;41(1):28-33.