Research

CAFM Educational Research Alliance (CERA)

About CERA

CERA, the CAFM Educational Research Alliance, is a framework to focus and support medical education research. CERA conducts approximately five surveys per year of:

  • Family medicine residency directors (surveyed twice per year)
  • Clerkship directors
  • Department chairs
  • General membership, including subsets of members as selected by applicants
  • Family medicine residents
  • Medical students

CERA Vision

Excellent family medicine educational research

CERA Mission

Provide a centralized infrastructure to:

  • Produce rigorous and generalizable medical education research
  • Facilitate collaboration among medical education researchers
  • Provide training and mentorship in educational research methods
  • Ensure that the work of CERA reflects and supports efforts to address equity, diversity, and antiracism

How CERA Works

  • Investigators respond to calls for proposals to submit questions for surveys
  • Each CERA survey includes questions submitted by investigators, as well as a set of recurring demographic and organizational questions to provide data for historical comparisons
  • Once proposals have been approved, experienced researchers/mentors join each project team to help refine questions, facilitate analysis, and prepare and submit manuscripts.
  • Researchers receive their individual survey results along with responses to the recurring questions and have 3 months to analyze the data before it is released to the general membership. Investigators are expected to write and submit a manuscript within those 3 months.
  • Members of STFM, NAPCRG, AFMRD, and ADFM use CERA data for secondary analysis (a brief illustration follows this list).
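
For members planning a secondary analysis of the recurring questions (for example, tracking mean and median program director tenure across survey years, as one of the testimonials below describes), a minimal sketch of that kind of calculation in Python with pandas might look like the following. The file name and column names are illustrative assumptions, not CERA's actual data export format.

    import pandas as pd

    # Hypothetical export: one row per program director response, with the
    # survey year and the director's self-reported tenure in years.
    # (File and column names are placeholders, not CERA's real format.)
    df = pd.read_csv("cera_pd_tenure_2011_2017.csv")

    # Mean and median tenure per survey year: the kind of longitudinal
    # comparison a secondary analysis might report.
    summary = (
        df.groupby("survey_year")["tenure_years"]
          .agg(["mean", "median", "count"])
          .round(2)
    )
    print(summary)

Any real analysis would, of course, follow the variable names and file format of the dataset CERA actually releases to investigators and, later, to the general membership.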

Survey Schedule

2025 Survey Dates

Program Directors
Call for Proposals: 12/9/24–1/7/25
Survey Dates: 4/22/25–5/23/25

Clerkship Directors
Call for Proposals: 1/27/25–2/25/25
Survey Dates: 6/10/25–7/11/25

Department Chairs 
Call for Proposals: 3/24/25–4/22/25
Survey Dates: 8/5/25–9/5/25

General Membership 
Call for Proposals: 5/19/25–6/17/25
Survey Dates: 9/30/25–10/31/25

Program Directors
Call for Proposals: 6/23/25–7/22/25
Survey Dates: 11/4/25–12/5/25


Published Papers

CERA studies have resulted in 207 published manuscripts. See the full list.


Completed Presentations

CERA studies have resulted in 238 scholarly presentations. See the full list.

"Wow! What a job to all. Thank you so very much for letting me be a part of this. I have learned so much from the process and I am just amazed at the strengths of all involved from the amazing writing, thoughts, statistical resources, and video conferences. I am really excited at our findings and hope it generates lots of thoughtful consideration. Thank you again so much. Maybe I will get to meet you all someday :)."

"We used the tenure question from all 11 program director surveys to look at the mean and median for 2011–2017. We are so grateful for this data and the CERA survey and the sharing. This is a priority topic for AFMRD and the data was so helpful."

Steven Brown, MD

University of Arizona College of Medicine-Phoenix Family Medicine Residency

Questions?

If you have questions about CERA, contact Sam Grammer.

Contact Us


11400 Tomahawk Creek Parkway

Leawood, KS 66211

(800) 274-7928

stfmoffice@stfm.org 


Tips for Using STFM's AI Assistant

STFM's AI Assistant is designed to help you find information and answers about Family Medicine education. While it's a powerful tool, getting the best results depends on how you phrase your questions. Here's how to make the most of your interactions:

1. Avoid Ambiguous Language

Be Clear and Specific: Use precise terms and avoid vague words like "it" or "that" without clear references.

Example:
Instead of: "Can you help me with that?"
Try: "Can you help me update our Family Medicine clerkship curriculum?"
Why this is important: Ambiguous language can confuse the AI, leading to irrelevant or unclear responses. Clear references help the chatbot understand exactly what you're asking.

2. Use Specific Terms

Identify the Subject Clearly: Clearly state the subject or area you need information about.

Example:
Instead of: "What resources does STFM provide?"
Try: "I'm a new program coordinator for a Family Medicine clerkship. What STFM resources are available to help me design or update clerkship curricula?"
Why this is better: Providing details about your role ("program coordinator") and your goal ("design or update clerkship curricula") gives the chatbot enough context to offer more targeted information.

3. Don't Assume the AI Knows Everything

Provide Necessary Details: The STFM AI Assistant has been trained on STFM's business and resources. It can only use the information you provide or the information it has been trained on.

Example:
Instead of: "How can I improve my program?"
Try: "As a program coordinator for a Family Medicine clerkship, what resources does STFM provide to help me improve student engagement and learning outcomes?"
Why this is important: Including relevant details helps the AI understand your specific situation, leading to more accurate and useful responses.

4. Reset if You Change Topics

Clear Chat History When Switching Topics:

If you move to a completely new topic and the chatbot doesn't recognize the change, click the Clear Chat History button and restate your question.
Note: Clearing your chat history removes all previous context from the chatbot's memory.
Why this is important: Resetting ensures the AI does not carry over irrelevant information, which could lead to confusion or inaccurate answers.

5. Provide Enough Context

Include Background Information: The more context you provide, the better the chatbot can understand and respond to your question.

Example:
Instead of: "What are the best practices?"
Try: "In the context of Family Medicine education, what are the best practices for integrating clinical simulations into the curriculum?"
Why this is important: Specific goals, constraints, or preferences allow the AI to tailor its responses to your unique needs.

6. Ask One Question at a Time

Break Down Complex Queries: If you have multiple questions, ask them separately.

Example:
Instead of: "What are the requirements for faculty development, how do I register for conferences, and what grants are available?"
Try: Start with "What are the faculty development requirements for Family Medicine educators?" Then follow up with your other questions after receiving the response.
Why this is important: This approach ensures each question gets full attention and a complete answer.

Examples of Good vs. Bad Prompts

Bad Prompt

"What type of membership is best for me?"

Why it's bad: The AI Chat Assistant has no information about your background or needs.

Good Prompt

"I'm the chair of the Department of Family Medicine at a major university, and I plan to retire next year. I'd like to stay involved with Family Medicine education. What type of membership is best for me?"

Why it's good: The AI Chat Assistant knows your role, your future plans, and your interest in staying involved, enabling it to provide more relevant advice.

Double Check Important Information

While the AI Chat Assistant is a helpful tool, it can still produce inaccurate or incomplete responses. Always verify critical information with reliable sources or colleagues before taking action.

Technical Limitations

The Chat Assistant:

  • Cannot access external websites or open links
  • Cannot process or view images
  • Cannot make changes to STFM systems or process transactions
  • Cannot access real-time information (like your STFM Member Profile information)
