Other Publications

Education Columns

Board Review Sessions: Clicker Use to Improve Knowledge Through Gamification

By Jumana Antoun, MD, MSc, American University of Beirut

Background

The ACGME-accredited family medicine program at the American University of Beirut uses the American Board of Family Medicine (ABFM) In-Training Examination (ITE) as a standardized summative knowledge assessment for postgraduate year (PGY) 2 and 3 residents. About 10 years ago, we first noted that our program's scores were consistently lower than the US national average, particularly for our PGY-2 residents. In 2009, for example, our program's average score for PGY-2 residents was 25 points lower than the national average for the same year of training. Starting that year, we therefore introduced a new mandatory teaching session for PGY-2 and PGY-3 residents focused on board review questions. Because the three other mandatory teaching activities (core content, journal club, and morning report) occurred early in the morning, the new session was scheduled at noon, the time residents like least because it cuts into their lunch break and clinical duties. There was thus a particular need to deliver the session in an interactive, fun, and engaging way.

Intervention

The faculty member who delivered these weekly 1-hour sessions integrated principles of gamification1 by using audience response systems, or clickers. Each resident was assigned a clicker that he or she used throughout the PGY-2 and PGY-3 years of residency. This past year, we moved to using mobile phones as clickers via TurningPoint software. Each session included 15 board-style questions projected using PowerPoint. Each question was followed by a discussion of the rationale behind the correct answer, along with tips on using the stem information and understanding the logical reasoning behind the options. After each discussion, a summary slide with a brief explanation of the medical information relevant to the question was projected. At the end of the session, a leaderboard slide revealed the top three residents and the highest-scoring team (PGY-2 or PGY-3). At the end of the year, prizes were given to the top resident and to all members of the winning team. During the 2017 academic year, an anonymous, voluntary electronic survey was delivered to residents to assess their perceptions of the usefulness of this educational session.

Results

Attendance at the board review sessions was consistently higher than at the other three core activities. The residents were engaged, as evidenced by two observations: (1) no resident fell asleep, and all tried to answer the questions; and (2) on some occasions, residents began discussing the answer among themselves before the poll even closed. Furthermore, last year, when we implemented the mobile phone system as clickers, some residents were able to join while on vacation or on rotation outside of the hospital. At times, additional questions arose from the discussion, and one of the residents would read about the topic and email what he or she learned to the group. More importantly, the program's ITE scores have markedly improved over the years: in 2017, our PGY-2 average exceeded the US national average by 52 points. A 2017 survey found that the board review sessions were the educational activity residents appreciated most. For instance, 80% rated these sessions as very useful, compared with 50% for the core content activity and only 2% each for journal club and morning report. When asked to choose the single most useful activity, 90% of the residents selected board review.

Conclusions

Gamification of board review sessions using clickers guided residents in test-taking strategies. The anonymity of responses, combined with immediate feedback, enhanced learning. Residents participated actively and were able to recognize their weaknesses relative to the group as a whole. The use of clickers also allowed faculty to identify and remediate gaps in the residents' knowledge. Residents were motivated to attend because they liked to compete and win while learning.

References

  1. Edmonds S. Gamification of learning. Training and Development in Australia. 2011;38(6):20.

Tips for Using STFM's AI Assistant

STFM's AI Assistant is designed to help you find information and answers about Family Medicine education. While it's a powerful tool, getting the best results depends on how you phrase your questions. Here's how to make the most of your interactions:

1. Avoid Ambiguous Language

Be Clear and Specific: Use precise terms and avoid vague words like "it" or "that" without clear references.

Example:
Instead of: "Can you help me with that?"
Try: "Can you help me update our Family Medicine clerkship curriculum?"
Why this is important: Ambiguous language can confuse the AI, leading to irrelevant or unclear responses. Clear references help the chatbot understand exactly what you're asking.

2. Use Specific Terms

Identify the Subject Clearly: Clearly state the subject or area you need information about.

Example:
Instead of: "What resources does STFM provide?"
Try: "I'm a new program coordinator for a Family Medicine clerkship. What STFM resources are available to help me design or update clerkship curricula?"
Why this is better: Providing details about your role ("program coordinator") and your goal ("design or update clerkship curricula") gives the chatbot enough context to offer more targeted information.

3. Don't Assume the AI Knows Everything

Provide Necessary Details: The STFM AI Assistant has been trained on STFM's business and resources. It can only use the information you provide or the information it has been trained on.

Example:
Instead of: "How can I improve my program?"
Try: "As a program coordinator for a Family Medicine clerkship, what resources does STFM provide to help me improve student engagement and learning outcomes?"
Why this is important: Including relevant details helps the AI understand your specific situation, leading to more accurate and useful responses.

4. Reset if You Change Topics

Clear Chat History When Switching Topics:

If you move to a completely new topic and the chatbot doesn't recognize the change, click the Clear Chat History button and restate your question.
Note: Clearing your chat history removes all previous context from the chatbot's memory.
Why this is important: Resetting ensures the AI does not carry irrelevant information over from the previous topic, which could lead to confusion or inaccurate answers.

5. Provide Enough Context

Include Background Information: The more context you provide, the better the chatbot can understand and respond to your question.

Example:
Instead of: "What are the best practices?"
Try: "In the context of Family Medicine education, what are the best practices for integrating clinical simulations into the curriculum?"
Why this is important: Specific goals, constraints, or preferences allow the AI to tailor its responses to your unique needs.

6. Ask One Question at a Time

Break Down Complex Queries: If you have multiple questions, ask them separately.

Example:
Instead of: "What are the requirements for faculty development, how do I register for conferences, and what grants are available?"
Try: Start with "What are the faculty development requirements for Family Medicine educators?" Then follow up with your other questions after receiving the response.
Why this is important: This approach ensures each question gets full attention and a complete answer.

Examples of Good vs. Bad Prompts

Bad Prompt

"What type of membership is best for me?"

Why it's bad: The AI Chat Assistant has no information about your background or needs.

Good Prompt

"I'm the chair of the Department of Family Medicine at a major university, and I plan to retire next year. I'd like to stay involved with Family Medicine education. What type of membership is best for me?"

Why it's good: The AI Chat Assistant knows your role, your future plans, and your interest in staying involved, enabling it to provide more relevant advice.

Double Check Important Information

While the AI Chat Assistant is a helpful tool, it can still produce inaccurate or incomplete responses. Always verify critical information with reliable sources or colleagues before taking action.

Technical Limitations

The Chat Assistant:

  • Cannot access external websites or open links
  • Cannot process or view images
  • Cannot make changes to STFM systems or process transactions
  • Cannot access real-time information (like your STFM Member Profile information)
