The Department of Defense Education Activity (DoDEA) recently changed how grant goals are written. At the DoDEA Annual Community of Practice Meeting in November 2024, SEG's Dr. Stacy Ashworth delivered an insightful presentation on selecting goal outcome measures to ensure grantee success. This month's blog post recaps Dr. Ashworth's guidance, offering strategies to help DoDEA grantees achieve their objectives.
A Portrait of a Successful Project
We will begin by looking at two projects with STEM goals. The project outcomes are listed as bullets beneath the project name.
Project A
• Decline in math state test scores from SY2019 to SY2022
• Decline in average score of military-connected (MC) students on state test score category from SY2019 to SY2022

Project B
• 1,400 students (potentially duplicated) participating in extracurriculars throughout the grant period
• 70% of teachers “agreed” or “strongly agreed” that participating in STEM activities/using STEM items had improved students’ STEM skills
• 175 computer science and STEM lessons developed
• 1,100 students (potentially duplicated) participating in STEM Camp
From these outcomes, Project B appears far more successful than Project A. Yet Project A and Project B are the exact same project.
DoDEA recently refined its approach to grant goal formulation, transitioning from narrowly defined goals with a single quantitative indicator to a more comprehensive framework that incorporates multiple indicators. This shift is exemplified by Project A and Project B. In Project A, the evaluation relied on limited indicators, leading to a perception of limited success by Year 4. However, under the revised goal-setting approach in Project B, additional indicators were assessed in Year 5, revealing the project's broader achievements and successful implementation. This case underscores the importance of utilizing diverse and comprehensive outcome measures to accurately capture the full impact of educational initiatives.
Ensuring Your Evaluation Plan Highlights Your Success
As DoDEA grant evaluators, we frequently encounter projects that appear less successful due to two primary challenges:
Data Collection Challenges: Often, districts plan to collect specific data but find it unavailable because the process is burdensome or lacks willing personnel at the school level.
Misalignment of Outcome Measures: There's a tendency to over-rely on state test data or other measures that, while valid, don't align with the project's objectives. For instance, if a project's strategy involves providing professional development to enhance math instruction, an aligned outcome would assess teacher math efficacy through surveys or focus groups. Although improvements in student standardized test scores are desirable, numerous factors influence these results, and they may not accurately reflect the project's success.
To address these issues, it's essential to:
Select a Balanced Mix of Quantitative and Qualitative Outcome Measures: Ensure that most are closely aligned with the project's goals.
Include Broader Outcomes Cautiously: While incorporating less-aligned outcomes, such as standardized test data, can provide additional insights, they should not be the sole indicators of success.
By adopting this comprehensive approach, we create multiple avenues to demonstrate a DoDEA project's success, leading to more accurate evaluations and meaningful educational improvements.
Sample Measures
When selecting outcome measures for your DoDEA grant evaluation, it's essential to consider existing data collection efforts and identify who will be responsible for gathering any new data. Additionally, exploring diverse methods of data analysis and presentation can provide a more comprehensive story of your project's impact. Sample lists of quantitative and qualitative sources are provided below.
Quantitative Sources and Measures
• Assessments
• Surveys
• Feedback forms
• Grades
  o Percent of students receiving A/B or D/F
  o Percent of students with D/F that improve to A/B by end of year
  o Improvement from prior year
• Course enrollment
  o Enrollment in advanced/optional courses
  o Number of seats filled in [area] courses
  o Number of unique students enrolled in [area] courses
• Service recipients
  o Number of students receiving [type of service]
  o Number of students receiving [type of service] that no longer need the service by end of year
• Number of [type] offered
  o Number of opportunities to learn about [program]
  o Number of [type of program] offered
• Number of participants
Qualitative Sources and Measures
• Open-ended survey responses/Focus group responses
  o Academic improvement/achievement
    * Parent perceptions of student improvement in [topic]
    * Teacher perceptions of student improvement or achievement in [topic]
    * Student perceptions of [topic] achievement
  o Professional growth
    * Teacher perceptions of their ability to implement [topic]
  o Program impact
    * Stakeholder perceptions of the impact of program innovations
Planning to write an application for a DoDEA grant and looking for an evaluation partner? SEG will assist you with formulating goals and provide you with an evaluation matrix at the grant application stage at no cost to your institution in exchange for being listed in your application as the external evaluator. Contact us today for a free 30-minute consultation: seg@shafferevaluation.com.