2016 Federal Index


Repurpose for Results

In FY16, did the agency shift funds away from any practice, policy, or program that consistently failed to achieve desired outcomes? (Note: Meeting this criterion requires both Agency and Congressional action.)

Score
8
Administration for Children and Families (HHS)
  • The Head Start Designation Renewal System requires Head Start ($8.6 billion in FY17) grantees to compete for grants moving forward if they fail to meet criteria related to service quality, licensing and operations, and fiscal and internal controls. The 2007 Head Start Reauthorization Act made all Head Start grants renewable five-year grants. At the end of each five-year term, grantees that are running high-quality programs have their grants renewed, but grantees that fall short of standards are now required to compete to renew their grants. Grantees whose ratings on any of the three domains of the Classroom Assessment Scoring System, an assessment of adult-child interactions linked to improved outcomes, fall below a certain threshold, or in the lowest 10 percent of grantees, must also compete.
  • ACF, in collaboration with the HHS Health Resources and Services Administration, has established criteria for evidence of effectiveness of home visiting models, and oversees the Home Visiting Evidence of Effectiveness Review (HomVEE), which determines whether models have evidence of effectiveness. To date, HomVEE has reviewed evidence on 45 home visiting models and determined 20 of these to have evidence of effectiveness. Grantees must use at least 75% of their federal home visiting funds to implement one or more of these models.
  • ACF’s FY18 budget request proposes to eliminate the Community Services Block Grant and the Social Services Block Grant (see pp. 144 and 348).
Score
4
Corporation for National and Community Service
  • Over the past 8 years, AmeriCorps has reduced the amount of a grant award if member enrollment/retention benchmarks are not met. For example, in FY17, a grantee that had not met the performance targets requested 380 part-time members (the equivalent of $1 million) and was given zero.
  • Over the past 3 years, money has been redirected away from grantees with poor past performance metrics and awarded to grantees with positive past performance metrics so that they will have “forward funding.” (CNCS programs typically award one-year grants with an option to receive continuation awards). For example, in FY17, AmeriCorps “forward funded” 12 grants, with 9% of its competitive funding based on the reallocation of dollars from poor performing grantees to high performing grantees.
  • According to CNCS policy, AmeriCorps State Commission staff will assess their recompeting subgrantees’ past performance and submit those assessments to CNCS, and CNCS will assess the past performance of its recompeting direct grantees. This assessment is in addition to the evaluation of the applicant’s eligibility for funding and the quality of its application on the basis of the Selection Criteria. Results from this assessment will inform funding decisions. In evaluating programmatic performance, CNCS will consider the following for applicants that are current formula and competitive grantees and are submitting applications for the same program model:
    • Grant progress reports – attainment of Performance Measures
    • Enrollment and retention
    • Compliance with 30-day enrollment and exit requirements in the AmeriCorps portal
    • Site visit or other monitoring findings (if applicable)
    • Significant opportunities and/or risks of the grantee related to national service
    • Commission Rank
Score
8
Millennium Challenge Corporation
  • MCC has an established Policy on Suspension and Termination that lays out the reasons for which MCC may suspend or terminate assistance to partner countries. Assistance may be suspended or terminated, in whole or in part, if a country (1) engages in activities contrary to the national security interests of the US, (2) engages in a pattern of actions inconsistent with MCC’s eligibility criteria, or (3) fails to adhere to its responsibilities under a compact, threshold grant, or related agreement. MCC has terminated a compact partnership, in part or in full, seven times out of 33 compacts approved to date, and has suspended partner country eligibility (both compact and threshold) four times, most recently with the suspension of Tanzania in March 2016 due to democratic rights concerns. MCC’s Policy on Suspension and Termination also allows MCC to reinstate eligibility when countries demonstrate a clear policy reversal, a remediation of MCC’s concerns, and an obvious commitment to MCC’s eligibility indicators. For example, in early 2012, MCC suspended Malawi’s Compact prior to Entry into Force after determining that the Government of Malawi had engaged in a pattern of actions inconsistent with MCC’s eligibility criteria, specifically around democratic governance. Thereafter, the new Government of Malawi took a number of decisive steps to improve the democratic rights environment and reverse the negative economic policy trends of concern to MCC, which led to a reinstatement of eligibility for assistance in mid-2012.
  • MCC also consistently monitors the progress of Compact programs and their evaluations, using the learning from this evidence to make changes to MCC’s portfolio. For example, MCC recently undertook a review of its portfolio investments in roads in order to better design, implement, and evaluate road investments. Through evidence collected across 16 compacts with road projects, MCC uncovered seven key lessons, including the need to prioritize and select projects based on a road network analysis, to standardize the content and quality of road data collection across road projects, and to consider cost and the potential for learning in determining how road projects are evaluated. This body of evidence and analysis will be published in September 2017 as MCC’s next Principles into Practice paper. Critically, the lessons from this analysis are already being applied to road projects in compacts in development in Cote d’Ivoire and Nepal.
Score
1
Substance Abuse and Mental Health Services Administration
  • The SAMHSA budget provides performance information along with budget information, which Congress can use to determine funding levels. Each year, when continuation funding decisions are made, the program Centers review grantees within each program, project, or activity on performance and financial management. Each Center determines the factors that go into continued-funding decisions based on guidance from the Office of Financial Management, Division of Grants Management. To the extent that costs are reduced for continuation funding, those funds can be repurposed to fund new grantees or to provide additional contract support for existing grantees. In FY 2017, SAMHSA underwent a stringent review process for all funding requests utilizing both program and fiscal performance. During this process, SAMHSA utilized $51M in unspent funding from existing grantees to fund new programs and activities.
  • CBHSQ staff conducted an evaluation inventory in the summer of 2016, requesting that program staff from the Centers provide information on how their evaluation findings inform the next iteration of their programs and/or new evaluation activities. For the most part, program staff indicated that evaluation findings were used to improve the next round of funding opportunity announcements and thus grantee implementation of programs.
Score
7
U.S. Agency for International Development
  • USAID’s updated operational policy for planning and implementing country programs has incorporated a set of tools and practices called Collaborating, Learning, and Adapting (CLA) that includes designing adaptable activities that build in feedback loops; using flexible implementing mechanisms; and adopting a management approach that includes consulting with partners about how implementation is evolving and what changes need to be made. Through the Program Cycle, USAID encourages managing projects and activities adaptively, responding to rigorous data and evidence and shifting design and/or implementation accordingly.
  • USAID uses rigorous evaluations to maximize its investments. A recent independent study found that 71 percent of USAID evaluations have been used to modify and/or design USAID projects. Below are a few recent examples where USAID has shifted funds and/or programming decisions based on performance:
    • Tunisia: A USAID enterprise reform project in Tunisia used adaptive management to build feedback loops and flexibility into its project design, learning from monitoring data and adapting programming based on the evidence. The project’s components regularly collaborate in order to follow leads and identify opportunities. As a result, the project more than doubled its initial targets and created approximately 10% of the total net new jobs in the Tunisian economy.
    • Cambodia: Findings from a mid-term evaluation of USAID activities under a public health project are informing the design of a new project to more efficiently integrate activities and enable donors and implementing partners to collaborate more easily. Findings are also contributing to the phasing out of some poor-performing project components.
    • Peru: In response to a 2016 evaluation of a conflict mitigation activity, which found that the intervention did not address root causes of conflict, USAID incorporated key changes into the design and award process of future programming, requiring new activities to address root causes of conflict in their particular context.
    • Vietnam: Based on the findings of a 2016 mid-term evaluation and subsequent stakeholder consultations, USAID’s Vietnam Forest and Delta Program reduced its scope and focused more on local needs, resulting in more efficient use of resources and positive feedback from government officials and implementing partners.
    • Ethiopia: A mid-term impact evaluation of the Feed the Future (FtF) portfolio in Ethiopia found that more than half of the communities in the project’s zone of influence had not been sufficiently reached by programming. Based on these findings, USAID is already in the process of reducing the number of interventions and their geographic coverage for the final years of the project in Ethiopia. USAID plans to assess and learn from the re-focusing effort to inform programming in other countries.
  • USAID’s Securing Water for Food: A Grand Challenge for Development (SWFF) selected the highest potential water-for-food innovations and is providing grant funds and ongoing assistance to support business development. SWFF starts as a competition, but the winners must continually show results to receive a new tranche of funding. To move forward, grantees must achieve technical and financial milestones, such as increased crop yields and total product sales. Of the first 15 awardees, nine received Year 2 funding; six did not, because they did not meet the target number of end-users/customers in a cost-effective way and because their model was not deemed sustainable without USAID funding. By using milestone-based funding, SWFF has helped over one million farmers and other customers grow more than 3,000 tons of food and save almost 2 billion liters of water. In addition, SWFF innovators have formed more than 125 partnerships and secured more than $10 million in leveraged funding.
  • DIV, another example of USAID directing funds based on evidence of effectiveness, is mentioned earlier in the Innovation section.
Score
7
U.S. Department of Education
  • ED seeks to shift program funds to support more effective practices by prioritizing evidence as a condition of entry into its grant competitions. For ED’s grant competitions where there is evaluative data about current or past grantees, or where new evidence has emerged independent of grantee activities, ED typically reviews such data to shape the design of future grant competitions.
  • Additionally, ED uses evidence in competitive programs to encourage the field to shift away from less effective practices and toward more effective practices. For example, ESEA’s Education Innovation and Research (EIR) program – the successor to i3 – supports the creation, development, implementation, replication, and scaling up of evidence-based, field-initiated innovations designed to improve student achievement and attainment for high-need students.
  • The President’s 2018 Budget request eliminates or reduces funding for more than 30 discretionary programs that do not address national needs, duplicate other programs, are ineffective, or are more appropriately supported with State, local, or private funds. Major eliminations and reductions in the 2018 Budget include:
    • Supporting Effective Instruction State grants (Title II-A), a savings of $2.3 billion. The program is proposed for elimination because evidence shows that the program is poorly structured to support activities that have a measurable impact on improving student outcomes. It also duplicates other ESEA program funds that may be used for professional development (pp. C-16 to C-20).
    • 21st Century Community Learning Centers program, a savings of $1.2 billion. The program lacks strong evidence of meeting its objectives, such as improving student achievement. Based on program performance data from the 2014-2015 school year, more than half of program participants had no improvement in their math and English grades and nearly 60 percent of participants attended centers for fewer than 30 days (pp. C-23-C-24).
  • It is also worth noting that one of the themes of the FY18 budget for ED was “building evidence around educational innovation.” Consistent with this, the Department sustains funding for all IES-authorized activities and for continued support of State and local-based research, evaluation and statistics that help educators, policymakers and other stakeholders improve outcomes for all students. As another example, the budget requested $42 million for Supporting Effective Educator Development (SEED) to provide evidence-based professional development activities and prepare teachers and principals from nontraditional preparation and certification routes to serve in high-need LEAs.
  • In the previous administration, ED worked with Congress to eliminate 50 programs, saving more than $1.2 billion, including programs like Even Start (see pp. A-72 to A-73) (-$66.5 million in FY11) and Mentoring Grants (see p. G-31) (-$47.3 million in FY10), which the Department recommended eliminating based on evidence of their ineffectiveness.
Score
7
U.S. Dept. of Housing & Urban Development
  • HUD’s FY17 budget request included a new formula for funding Housing Choice Voucher Administrative Fees that shifts funding away from inappropriately compensated public housing agencies and increases overall funding according to evidence about actual costs of maintaining a high-performing voucher program.
  • HUD’s FY17 budget request sought an $11 billion shift (pp. 8–9) of resources toward housing vouchers for homeless families, based on the rigorous experimental analysis of four service options in the Family Options study.
  • HUD’s FY18 budget request sought to eliminate funding for Community Development Block Grants. A 2005 PD&R evaluation had shown that targeting of CDBG resources toward communities with greater needs would be greatly enhanced by any of four alternatives to the 1978 statutory formula, but such improvements have not been authorized. An earlier 1995 evaluation found that although CDBG had made a contribution to community development, the neighborhood interventions generally were ad hoc rather than well-coordinated and strategic.
Score
6
U.S. Department of Labor
  • DOL’s evidence-based strategy focuses on program performance improvement and on expanding strategies and programs for which rigorous evaluations show evidence of positive impact. The department takes all possible action to improve performance before considering funding reductions or program termination. However, DOL does use program performance measures and results from evaluations to make decisions about future funding. For example:
    • In 2016, DOL established a methodology for assessing the performance of Job Corps centers and selecting centers for closure. That year, one Job Corps center was closed because of its chronic low performance. Closing this center allows DOL to shift limited program dollars to centers that will better serve students by providing the training and credentials they need to achieve positive employment and educational outcomes. In a Federal Register notice published in July 2016, DOL announced the closure and the methodology used for selecting centers for closure.
  • All discretionary grant performance is closely monitored and has been used to take corrective action and make decisions about continued funding. For example, YouthBuild grant funding is based heavily on past performance. Organizations that have previously received and completed a YouthBuild grant award can receive up to 28 points (almost 30% of their score) based on demonstrated past performance. This effectively weeds out low-performing grantees from winning future awards. (For more information, see the Grant Funding Announcement.) Additionally, DOL uses evidence in competitive programs to encourage the field to shift away from less effective practices and toward more effective ones. For example, recent grant programs such as TechHire and America’s Promise support the creation, development, implementation, replication, and scaling up of evidence-based practices designed to improve outcomes.
  • DOL’s FY18 budget request prioritizes programs with demonstrated evidence (e.g., by allocating $90 million to expand apprenticeships, an evidence-based approach that combines on-the-job training with classroom instruction) and proposes reductions to unproven or duplicative activities (e.g., it proposes a reduction of $238 million by closing additional Job Corps centers that do not meet performance standards, and proposes eliminating the Senior Community Service Employment Program).
Back to the Standard

Visit Results4America.org