Teaching Population Health and Performance Improvement in Family Medicine Residency

By Louanne Friend, PhD, RN, Associate Professor, Department of Family, Internal & Rural Medicine, University of Alabama College of Community Health Sciences, and Associate Director of Research and Quality Improvement, Tuscaloosa Family Medicine Residency, Tuscaloosa, AL

BACKGROUND

Family medicine residencies are expected to teach population health and performance improvement (PI),¹ yet residents often experience PI as a checkbox exercise rather than authentic practice change. At our program, we aimed to help residents view themselves as panel managers accountable for outcomes in defined patient populations. We leveraged 2 underused assets, a population health platform linked to the electronic health record and a simple, tool-agnostic PI framework, to create a postgraduate year 2 (PGY-2) curriculum that makes population health tangible, builds practical PI habits, and aligns with American Board of Family Medicine PI requirements.⁵

This habit-focused, team-reliable curriculum uses basic Plan-Do-Study-Act (PDSA) methodology and addresses variability in residency population health practice as well as the need for protected time and mentorship.²,⁴

METHODS

All PGY-2 residents complete a required PI project focused on a small personal panel of 20 patients (10 with hypertension and 10 with diabetes) identified via the population health platform. The small panel size keeps data review manageable and maintains visibility of individual patient narratives.

Residents select 1 or 2 specific care gaps (eg, overdue hemoglobin A1c testing, missing blood pressure treatment plan, lack of foot examination documentation), develop a brief SMART (Specific, Measurable, Achievable, Relevant, Time-bound) aim, and design a ≤30-second micro-workflow embedded in routine clinic operations (eg, a rooming checklist item, previsit huddle prompt, or standardized outreach script).

Each project tracks (1) a reliability measure (frequency with which the micro-workflow occurs), (2) a proximal outcome (eg, documentation of a blood pressure treatment plan or completion of a foot examination), and (3) lessons learned and intended practice changes. Residents report all data via an online survey instrument.

Program evaluation was structured using Kirkpatrick’s 4-level model.⁶ Reaction (Level 1) was assessed through brief resident evaluations; learning (Level 2) through review of aims, data tables, and written reflections; behavior (Level 3) through reliability measures and staff feedback; and early results (Level 4) through small panel–level improvements in documentation and blood pressure control.

RESULTS

To date, 68 residents have completed projects. Several educational shifts have emerged. Residents increasingly describe efforts to “fix the system” for their panels rather than “fix charts,” and they report greater confidence using data to drive change and identify missed opportunities (eg, tobacco use, statin prescribing, no-show patterns) within their own panels.

Micro-workflows strengthened collaboration with nursing and front-desk staff, in part because residents co-designed these processes with team members. Some workflows, such as huddle prompts for high-risk patients, extended beyond individual projects and were adopted as standard clinic processes.

Kirkpatrick evaluation data support these observations (Table 1).

Table 1: Kirkpatrick Evaluation Findings

Level 1 (Reaction): Residents rated the curriculum as relevant and feasible, particularly because projects were anchored in their own continuity panels.
Level 2 (Learning): Artifacts demonstrated improved ability to develop specific aims and select appropriate measures.
Level 3 (Behavior): Early findings showed more consistent use of panel reports and clinic huddles to identify care gaps.
Level 4 (Results): Small but meaningful improvements in documentation and process reliability were observed in several panels.

DISCUSSION

This curriculum appears to support a shift in residents’ professional identity, from viewing population health and PI as ancillary tasks to recognizing them as core elements of family medicine practice.

Implementation challenges included incomplete project components during busy rotations; the need for coaching to narrow overly ambitious 4- to 8-week aims into feasible, high-yield targets; and diminished data entry and follow-up when protected time, reminders, and streamlined templates were lacking.

Despite limitations related to time constraints, variable engagement, and single-site implementation, several practical lessons emerged. Panel-based PI does not require sophisticated software. Small panels, clearly defined aims, team-reliable micro-workflows, and simple online survey tracking were sufficient to generate measurable change. Framing evaluation using Kirkpatrick’s model provided a feasible structure to document reaction, learning, behavior change, and early clinical outcomes, supporting sustainable integration of population health education.

Practically, the PI framework provided residents with a repeatable approach applicable in continuity clinic: identify a small panel, select a single care gap, develop a focused SMART aim, and implement a brief micro-workflow integrated into routine rooming, huddles, or outreach. This approach reframes PI from a checkbox requirement to a transferable clinical habit centered on data-informed panel management, team-reliable processes, and iterative refinement that residents can carry forward into independent practice.

REFERENCES

  1. Accreditation Council for Graduate Medical Education. ACGME Program Requirements for Graduate Medical Education in Family Medicine. Published 2023. Accessed December 10, 2025. https://www.acgme.org
  2. Carek S, Brown C, Neutze D, et al. Understanding population health management practices among family medicine residency programs. Fam Med. 2025;57(6):1-7.
  3. Sell J, Riley TD, Miller EL. Integrating quality improvement and community engagement education: curricular evaluation of resident population health training. Fam Med. 2022;54(8):634-639.
  4. Griesbach S, Theobald M, Kolman K, et al. Joint guidelines for protected nonclinical time for faculty in family medicine residency programs. Fam Med. 2021;53(6):443-452.
  5. American Board of Family Medicine. Performance Improvement (PI) Activities. Accessed December 10, 2025. https://www.theabfm.org/continue-certification/performance-improvement
  6. Kirkpatrick DL, Kirkpatrick JD. Evaluating Training Programs: The Four Levels. 3rd ed. Berrett-Koehler Publishers; 2006.