Other Publications

Education Columns

Training Postgraduate Learners in Laceration Repair Using Video Conference Technology

By Stephen K Stacey, DO, Erin F Morcomb, MD, Karen C Cowan, MD, Mark D McEleney, MD, Christopher J Tookey, MD, La Crosse-Mayo Family Medicine Residency, La Crosse, WI

Background

Instructional design for remote procedural skills training faces several obstacles.1 These obstacles came to the forefront in the early days of COVID-19 social distancing directives. How do we teach learners procedural skills if we cannot be side by side, or even in the same room? How do we simultaneously teach learners at home and across several different residencies? Is this teaching method effective, and does it appeal to all levels of learners? Our family medicine residency program developed a remote laceration repair training curriculum aimed at answering these questions.

Intervention

We held a 2-hour virtual workshop on laceration repair techniques for 35 family medicine residents across four programs. The training covered injection techniques, instrument ties, and the following suture types: simple interrupted, vertical mattress, horizontal mattress, subcuticular, and corner stitch. Learners prepared by completing required readings2,3 and a pretest that assessed cognitive understanding as well as subjective comfort with the procedures. Suture training was performed on chicken thighs with the skin intact, chosen for their resemblance to human skin and ready availability.4 Learners used a tablet, laptop computer, or cell phone and were provided with standard suture supplies (Figure 1). Learners practiced the sutures while instruction and demonstration were delivered via a synchronous, faculty-led video teleconference, allowing real-time interaction with the remote learners. The faculty lecturer broadcast a top-down view of the demonstration using ambient lighting and a standard tablet positioned horizontally on a platform with the camera pointed downward. After the live session, learners completed a posttest similar in content to the pretest to assess acquisition of knowledge and confidence in each technique. They also submitted a video recording and photograph of each suture technique, which were graded using a faculty-developed rubric based on similar previously published rubrics.5,6

Figure 1: Supplies used for the procedure workshop include a chicken thigh with skin, absorbent pad, fenestrated drape, iris scissors, tissue forceps, needle drivers, scalpel, absorbable and non-absorbable sutures, and a syringe with injection needle. Supplies not pictured include disposable exam gloves, a sharps container, and a tablet or cell phone capable of making video calls.


Results

The learner survey contained two open-ended questions: (1) “How would you rate the training overall?” and (2) “How would you compare this training with prior training events?” From the 35 learners, we received 67 of a possible 70 responses to the two questions (96% completion rate). Of these responses, 38 (57%) were positive, 9 (13%) were negative, and 20 (30%) were neutral or a mixture of positive and negative. Residents were asked to rate their confidence in performing the procedures on a 10-point scale both before and after the training. The difference between pretest and posttest scores was compared, and P values were calculated using a two-sample t-test assuming unequal variance (Table 1).

Table 1: Increase in Procedural Confidence of Learners by Postgraduate Year

Post-graduate year (PGY)    Increase in confidence (10-point scale)    P-value
PGY-1                       2.29                                       < 0.001
PGY-2                       1.64                                       0.045
PGY-3                       1.43                                       0.011
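
For programs that want to reproduce this type of analysis on their own pre/post survey data, the short sketch below shows one way to run a two-sample t-test assuming unequal variance (Welch's t-test) in Python with SciPy. This is an illustrative sketch, not the analysis code used for this study, and the confidence ratings in it are hypothetical placeholders.

  # Minimal sketch: comparing pretest and posttest confidence ratings with a
  # two-sample t-test assuming unequal variance (Welch's t-test).
  # The ratings below are hypothetical placeholders, not data from this study.
  from scipy import stats

  pretest = [4, 5, 3, 6, 5, 4, 6, 5]    # hypothetical 10-point confidence ratings
  posttest = [7, 7, 6, 8, 7, 6, 8, 8]   # hypothetical 10-point confidence ratings

  mean_increase = sum(posttest) / len(posttest) - sum(pretest) / len(pretest)
  t_stat, p_value = stats.ttest_ind(posttest, pretest, equal_var=False)

  print(f"Mean increase in confidence: {mean_increase:.2f}")
  print(f"P-value (Welch's t-test): {p_value:.3f}")

Grouping the ratings by postgraduate year before running the test would reproduce the structure of Table 1.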

Conclusions

Procedural skills training that combines preparatory reading with synchronous video demonstration provides a learning experience that is both effective and rated highly by participants at all stages of residency training. Although prior experience was not specifically assessed, we expect that all residents had some previous exposure to suturing. We hypothesized that residents with more prior exposure would report lower average improvement; taking postgraduate year as a proxy for experience, this appears to be the case. This remote training also showed that learners from multiple residencies can participate simultaneously from dozens of locations.

Further investigations could expand real-time feedback by having additional faculty members cycle through learners' video feeds during the session, or could compare this remote format with similar in-person training. The protocol is low-cost and easy to implement with commonly available supplies, and it holds potential for situations in which large-group assembly is precluded, resources are scarce, or travel is a limiting factor.

References

  1. Christensen MD, Oestergaard D, Dieckmann P et al. Learners' Perceptions During Simulation-Based Training: An Interview Study Comparing Remote Versus Locally Facilitated Simulation-Based Training. Simul Healthc. 2018 Oct;13(5):306-315.
  2. Forsch RT, Little SH, Williams C. Laceration Repair: A Practical Approach. Am Fam Physician. 2017 May 15;95(10):628-636.
  3. Worster B, Zawora MQ, Hsieh C. Common questions about wound care. Am Fam Physician. 2015 Jan 15;91(2):86-92.
  4. Denadai R, Saad-Hossne R, Martinhão Souto LR. Simulation-based cutaneous surgical-skill training on a chicken-skin bench model in a medical undergraduate program. Indian J Dermatol. 2013 May;58(3):200-7.
  5. Williamson JA, Farrell R, Skowron C et al. Evaluation of a method to assess digitally recorded surgical skills of novice veterinary students. Vet Surg. 2018 Apr;47(3):378-384.
  6. Routt E, Mansouri Y, de Moll EH et al. Teaching the Simple Suture to Medical Students for Long-term Retention of Skill. JAMA Dermatol. 2015 Jul;151(7):761-5.