
Across the grant evaluations we’ve conducted, spanning capacity-building, workforce initiatives, and K-12 education projects, we’ve observed a common set of practices among successful projects. How many of these practices do you use with your grant? Take this quick quiz to find out.


How to score:

For each item, give your project:

2 = Yes, consistently

1 = Sometimes / partly in place

0 = Not yet


The 10-practice grant success quiz


1) We have a simple logic model or theory of change that staff can explain in plain language.

This is the “map” that keeps a project from becoming a long list of disconnected tasks. The best versions are short, visual, and actively used—not just filed away with the grant application. Logic models are widely recommended as tools for planning, communicating, and evaluating how activities connect to intended outcomes.


2) We’ve narrowed our performance measures to a small set that we actually use.

Many grants struggle under the weight of too many indicators. Strong projects identify a manageable handful that directly reflect the outcomes in the logic model. This approach to performance measurement also improves reporting clarity and supports real improvement during implementation.


3) Our roles and decision-making are clear.

Successful grants rarely depend on heroic effort from one person. They rely on clear lines of responsibility: who participates in decision-making, who owns each grant deliverable, and who is accountable when timelines slip.


4) We run the project with a steady cadence (not just bursts before reports).

A predictable rhythm—monthly check-ins, short action logs, and visible next steps—keeps momentum and reduces last-minute chaos. This aligns with continuous improvement approaches that emphasize routine documentation and follow-through.


5) We track risks early and revisit them.

Staff turnover, procurement delays, partner shifts, and seasonal constraints can derail even well-designed projects. Proactive risk management is a recognized grant management practice to protect timelines, budgets, and compliance.


6) We know our “core components” and protect them.

Strong programs are consistent where it matters most. That means identifying the few essential elements that must be delivered with fidelity, even if other parts of the program vary by site or participants.


7) We allow smart local adaptation—and document it.

The strongest projects don’t confuse flexibility with drift. They distinguish between acceptable adaptations and changes that would weaken the model. This “fidelity and fit” balance is especially emphasized in education grant contexts.


8) Our data process is realistic for our staffing and context.

Good data systems are simple, routine, and built to survive busy seasons, staff changes, and competing priorities. The best ones focus on a small set of measures (see #2), assign clear ownership, and follow a predictable schedule so data is ready well before reporting deadlines.


9) Partners are integrated into implementation—not just listed in the proposal.

Effective partnerships show up in the work, not just the application narrative. Roles are clear, timelines are shared, and partners have regular touchpoints where they help solve problems and shape adjustments. When partners are truly embedded, they expand capacity and increase the odds that key activities will continue after funding ends.


10) We began sustainability planning early.

This is one of the most consistent predictors of long-term impact. Sustainability should be planned from the start of a grant, not in the final year.


Your score (0–20)

17–20: Strong shape. Your project is operating like a mature, high-functioning grant. Your main opportunity is refining the few weak spots into repeatable systems.

13–16: Good foundation with a few stress points. Your project is likely to deliver most core outcomes, but it may be vulnerable to turnover or scope creep.

9–12: Uneven implementation. Your project demonstrates strong intention and effort, but the “operating system” needs tightening.

0–8: High risk, high opportunity. The good news: small changes can make a big difference quickly.


If your score suggests a few of these practices need tightening, that’s good news. Most grants don’t need a redesign—they need a sharper operating system.


Shaffer Evaluation Group can help you build it. We support grantees with practical, right-sized evaluation and implementation tools, such as logic models that get used, lean measurement plans, and early sustainability roadmaps. Whether you need a quick mid-course tune-up or a full external evaluation, we’ll help you turn a good idea into a project that runs smoothly, proves its value, and lasts.




Engaging Stakeholders Early


One of the most valuable steps in any evaluation is collaborating with the stakeholders who are directly involved in program implementation or data collection. While project directors often oversee planning, they may not be the ones executing the work or gathering data. By involving implementers and data collectors early, project directors and evaluators gain insight into what’s feasible and what may need adjustment.


These leaders, embedded in the daily operations of their schools or departments, can identify who has the capacity to support implementation and data efforts—insights that are often missed without their input. Early engagement fosters a sense of ownership and accountability among stakeholders, which can enhance the evaluation process.


Being Strategic with Data Collection


More data isn’t always better. One of the most important lessons in evaluation is the value of being intentional about what data to collect—and why. Project directors should focus on gathering data that directly supports understanding of implementation progress. It’s essential to be mindful of the burden placed on both data collectors and respondents, such as survey participants.


A helpful starting point is to review what data are already being collected. Leveraging existing sources can reduce duplication, streamline efforts, and ensure that new data collection is purposeful and manageable. This strategic approach not only saves time but also enhances the quality of the evaluation.


Prioritizing Ongoing Communication


Consistent communication is essential for a successful evaluation. Project directors should regularly connect with those responsible for implementation to understand how things are progressing. This helps identify emerging needs and ensures staff have the time and resources to carry out their roles effectively.


These check-ins—whether through recurring meetings or informal in-person visits—build trust and foster a collaborative environment. Equally important is maintaining open lines of communication between the evaluator and the project director. Regular meetings not only provide valuable context about implementation but also create space for reflection, adaptation, and strategic decision-making.


Building Relationships for Effective Evaluation


Evaluation is as much about relationships and strategy as it is about data. Across diverse projects and grant programs, three lessons consistently stand out: the importance of stakeholder collaboration, the need for intentional and manageable data collection, and the value of ongoing communication.


When these elements are prioritized, evaluations become more responsive, grounded, and impactful—ultimately supporting programs in achieving their goals more effectively. A strong relationship between evaluators and stakeholders can lead to richer insights and a more nuanced understanding of program dynamics.


Conclusion: The Path Forward


As we continue to navigate the complexities of evaluation, these lessons remain at the forefront of our practice. By engaging stakeholders early, being strategic with data collection, and prioritizing ongoing communication, we can enhance the effectiveness of our evaluations.


This approach not only benefits the evaluation process but also strengthens the programs we aim to support. Together, we can make a bigger difference for underserved communities both in the US and globally.


Interested in working with Shaffer Evaluation Group? Contact us today for a free 30-minute consultation: seg@shafferevaluation.com.



Sustainability planning for federal grants is a disciplined process that ensures a project’s goals, principles, and activities endure after the award period ends. Federal funders increasingly prioritize long-term impact, so proposals and performance reports should demonstrate a credible plan for lasting success. In this article, I share guidance on sustainability planning to support your federal grant.


Start Early


Treat sustainability as a design constraint from day one. Don’t wait until the final quarter to address it. Build time, roles, and budget for sustainability work into your management plan. Revisit this plan at every reporting cycle to ensure you stay on track.


Use a Sustainability Assessment


At Shaffer Evaluation Group, we guide clients using the Program Sustainability Assessment Tool (PSAT). The PSAT helps teams score and prioritize capacity across eight domains: Environmental Support, Funding Stability, Partnerships, Organizational Capacity, Program Evaluation, Program Adaptation, Communications, and Strategic Planning. Conducting a quick PSAT baseline early in Year 1, a midpoint check, and a final assessment creates a clear trajectory of capacity-building for your organization. This also provides evidence of progress for funders.


Build a Sustainability Plan


Your sustainability plan should be developed early in your grant term. Here are the core components a sustainability plan for a federally funded project should include, distilled from cross-agency best practices:


  • Purpose & Outcomes: A concise statement of your project's value proposition, intended outcomes, and beneficiaries, aligned with a logic model you will continue to monitor and adjust post-award.

  • Core Program Model & Operating Protocols: A description of your core program model and what must be maintained (including workflows) so the project can persist with quality after grant closeout.

  • Governance & Ownership: Named post-award owner(s) and ongoing “champions” to anchor the work.

  • Funding Strategy (Braided Financing): A targeted mix of reallocated (internal) funding and external sources, with milestones, responsible leads, and cultivation plans (e.g., philanthropy, fee-for-service).

  • Partnerships & MOUs: Roles, cost-shares, and data-sharing agreements with agencies, nonprofits, and community partners that are essential to sustain activities.

  • Workforce Continuity & Knowledge Transfer: Staffing map, cross-training to avoid single points of failure, onboarding materials, and a schedule for refresher trainings.

  • Communications & Stakeholder Engagement: A plan to keep sponsors, end-users, and other stakeholders informed and invested.

  • Data, Evaluation & Learning: Post-grant metrics, feedback loops, and routines for using evidence in decisions.

  • Policy & Institutionalization: The specific policies, procedures, and standards that will embed the work into normal operations.

  • Risk & Scenario Planning: Anticipated risks (funding gaps, turnover, vendor change), trigger points, contingencies, and “minimum viable operations” to protect continuity.

  • Technology & Infrastructure: Post-award requirements for platforms, equipment, licensing, accessibility/privacy, and support so services can run at a steady state.


Operationalize It


Translate the plan into a 12–24 month roadmap. Include owners, milestones, and a lightweight dashboard. Budget explicitly for sustainability tasks, such as grant writing time, partnership stewardship, and training refreshers.


How We Help


Shaffer Evaluation Group uses the PSAT to run a rapid diagnostic. We co-facilitate strategy sessions with leadership and partners. Together, we co-develop a sustainability roadmap that balances quick wins (policy changes, SOPs, cross-training) with longer-horizon plays (braided funding, institutionalization). A thoughtful sustainability plan signals stewardship, strengthens proposals, and, most importantly, keeps effective work alive for the communities it was designed to serve.


For more information, contact us today at seg@shafferevaluation.com.

