Teaching Population Health and Performance Improvement in Family Medicine Residency
BACKGROUND
Family medicine residencies are expected to teach population health and performance improvement (PI),¹ yet residents often experience PI as a checkbox exercise rather than authentic practice change. At our program, we aimed to help residents view themselves as panel managers accountable for outcomes in defined patient populations. We leveraged 2 underused assets, a population health platform linked to the electronic health record and a simple, tool-agnostic PI framework, to create a postgraduate year 2 (PGY-2) curriculum that makes population health tangible, builds practical PI habits, and aligns with American Board of Family Medicine PI requirements.⁵
This habit-focused curriculum uses basic Plan-Do-Study-Act (PDSA) methodology and team-reliable workflows, and it addresses variability in residency population health practice as well as the need for protected time and mentorship.²⁻⁴
METHODS
All PGY-2 residents complete a required PI project focused on a small personal panel of 20 patients (10 with hypertension and 10 with diabetes) identified via the population health platform. The small panel size keeps data review manageable and maintains visibility of individual patient narratives.
Residents select 1 or 2 specific care gaps (eg, overdue hemoglobin A1c testing, missing blood pressure treatment plan, lack of foot examination documentation), develop a brief SMART (Specific, Measurable, Achievable, Relevant, Time-bound) aim, and design a ≤30-second micro-workflow embedded in routine clinic operations (eg, a rooming checklist item, previsit huddle prompt, or standardized outreach script).
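The care-gap screen that starts each project can be sketched in code. The following is an illustrative example only, not the curriculum's actual tooling; the record fields, the 6-month A1c interval, and the sample patients are all assumptions:

```python
from datetime import date

# Hypothetical panel records; in practice these come from the population
# health platform's report. Field names and dates are illustrative.
panel = [
    {"name": "Patient A", "condition": "diabetes", "last_a1c": date(2024, 1, 15)},
    {"name": "Patient B", "condition": "diabetes", "last_a1c": None},
    {"name": "Patient C", "condition": "hypertension", "last_a1c": None},
]

def overdue_a1c(patient, today, max_age_days=180):
    """Flag diabetic patients with no A1c result in roughly 6 months."""
    if patient["condition"] != "diabetes":
        return False
    last = patient["last_a1c"]
    return last is None or (today - last).days > max_age_days

today = date(2024, 9, 1)
gaps = [p["name"] for p in panel if overdue_a1c(p, today)]
print(gaps)  # Patients A and B are overdue; Patient C is not diabetic
```

A list this short is the point: with a 20-patient panel, a resident can review every flagged name individually rather than managing an anonymous registry.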
Each project tracks (1) a reliability measure (frequency with which the micro-workflow occurs), (2) a proximal outcome (eg, documentation of a blood pressure treatment plan or completion of a foot examination), and (3) lessons learned and intended practice changes. Residents report all data via an online survey instrument.
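The first two tracked measures reduce to simple proportions. A minimal sketch, with all counts invented for illustration (they are not project data):

```python
# Reliability: how often the micro-workflow occurred when it should have.
eligible_visits = 18      # visits where the rooming prompt should have fired
workflow_completed = 14   # visits where it actually did

# Proximal outcome: how many panel patients have the target item documented.
panel_size = 20
outcome_documented = 13   # e.g., patients with a BP treatment plan on file

reliability = workflow_completed / eligible_visits
proximal_outcome = outcome_documented / panel_size

print(f"Micro-workflow reliability: {reliability:.0%}")  # 78%
print(f"Outcome documented: {proximal_outcome:.0%}")     # 65%
```

Keeping the arithmetic this plain lets residents compute and report both measures from a tally sheet or survey export without statistical software.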
Program evaluation was structured using Kirkpatrick’s 4-level model.⁶ Reaction (Level 1) was assessed through brief resident evaluations; learning (Level 2) through review of aims, data tables, and written reflections; behavior (Level 3) through reliability measures and staff feedback; and early results (Level 4) through small panel–level improvements in documentation and blood pressure control.
RESULTS
To date, 68 residents have completed projects. Several educational shifts have emerged. Residents increasingly describe efforts to “fix the system” for their panels rather than “fix charts,” and they report greater confidence using data to drive change and identify missed opportunities (eg, tobacco use, statin prescribing, no-show patterns) within their own panels.
Micro-workflows strengthened collaboration with nursing and front-desk staff, in part because residents co-designed these processes with team members. Some workflows, such as huddle prompts for high-risk patients, extended beyond individual projects and were adopted as standard clinic processes.
Kirkpatrick evaluation data support these observations (Table 1).
Table 1: Kirkpatrick Evaluation Findings

| Level | Domain | Findings |
|---|---|---|
| 1 | Reaction | Residents rated the curriculum as relevant and feasible, particularly because projects were anchored in their own continuity panels. |
| 2 | Learning | Project artifacts demonstrated improved ability to develop specific aims and select appropriate measures. |
| 3 | Behavior | Early findings showed more consistent use of panel reports and clinic huddles to identify care gaps. |
| 4 | Results | Small but meaningful improvements in documentation and process reliability were observed in several panels. |
DISCUSSION
This curriculum appears to support a shift in residents’ professional identity, from viewing population health and PI as ancillary tasks to recognizing them as core elements of family medicine practice.
Implementation challenges included incomplete project components during busy rotations; the need for coaching to narrow overly ambitious 4- to 8-week aims into feasible, high-yield targets; and reduced data entry and follow-up when protected time, reminders, and streamlined templates were lacking.
Despite limitations related to time constraints, variable engagement, and single-site implementation, several practical lessons emerged. Panel-based PI does not require sophisticated software. Small panels, clearly defined aims, team-reliable micro-workflows, and simple online survey tracking were sufficient to generate measurable change. Framing evaluation using Kirkpatrick’s model provided a feasible structure to document reaction, learning, behavior change, and early clinical outcomes, supporting sustainable integration of population health education.
Practically, the PI framework provided residents with a repeatable approach applicable in continuity clinic: identify a small panel, select a single care gap, develop a focused SMART aim, and implement a brief micro-workflow integrated into routine rooming, huddles, or outreach. This approach reframes PI from a checkbox requirement to a transferable clinical habit centered on data-informed panel management, team-reliable processes, and iterative refinement that residents can carry forward into independent practice.
REFERENCES
1. Accreditation Council for Graduate Medical Education. ACGME Program Requirements for Graduate Medical Education in Family Medicine. Published 2023. Accessed December 10, 2025. https://www.acgme.org
2. Carek S, Brown C, Neutze D, et al. Understanding population health management practices among family medicine residency programs. Fam Med. 2025;57(6):1-7.
3. Sell J, Riley TD, Miller EL. Integrating quality improvement and community engagement education: curricular evaluation of resident population health training. Fam Med. 2022;54(8):634-639.
4. Griesbach S, Theobald M, Kolman K, et al. Joint guidelines for protected nonclinical time for faculty in family medicine residency programs. Fam Med. 2021;53(6):443-452.
5. American Board of Family Medicine. Performance Improvement (PI) Activities. Accessed December 10, 2025. https://www.theabfm.org/continue-certification/performance-improvement
6. Kirkpatrick DL, Kirkpatrick JD. Evaluating Training Programs: The Four Levels. 3rd ed. Berrett-Koehler Publishers; 2006.
