2017 Federal Index


Did the agency have a senior staff member(s) with the authority, staff, and budget to evaluate its major programs and inform policy decisions affecting them in FY17? (Example: Chief Evaluation Officer)

Administration for Children and Families (HHS)

Did the agency have a senior leader with the budget and staff to serve as the agency’s Evaluation Officer (or equivalent)? (Example: Evidence Act 313)
Identify the staff position (title), budget, and staff for the agency’s Evaluation Officer (or equivalent).
The Deputy Assistant Secretary for Planning, Research, and Evaluation serves in a role equivalent to the Chief Evaluation Officer for the Administration for Children and Families (ACF). A Senior Executive Service career official, the Deputy Assistant Secretary oversees ACF’s Office of Planning, Research, and Evaluation (OPRE) and supports evaluation and other learning activities across the agency. The Deputy Assistant Secretary oversees a research and evaluation budget of approximately $200 million in FY19. OPRE has 64 federal staff positions; OPRE staff are experts in research and evaluation methods and data analysis as well as ACF programs and policies and the populations they serve.

Did the agency have a senior leader with the budget and staff to serve as the agency’s Chief Data Officer (or equivalent)? (Example: Evidence Act 202(e))
Identify the staff position (title), budget, and staff for the agency’s Chief Data Officer (or equivalent).
In 2016, ACF established a new Division of Data and Improvement (DDI) providing federal leadership and resources to improve the quality, use, and sharing of ACF data. The Director of DDI reports to the Deputy Assistant Secretary for Planning, Research, and Evaluation and oversees work to improve the quality, usefulness, interoperability, and availability of data and to address issues related to privacy and data security and data sharing. DDI has 9 federal staff positions and an FY19 budget of approximately $4.4M (not including salaries).

Did the agency have a governance structure to coordinate the activities of its evaluation officer, chief data officer, statistical officer, and other related officials in order to inform policy decisions and evaluate the agency’s major programs?
Describe in 2-3 sentences the agency’s governance structure to coordinate the activities of its evaluation officer, chief data officer, statistical officer, and other related officials in order to inform policy decisions and evaluate the agency’s major programs.
With the 2016 reorganization that created the Division of Data and Improvement (DDI), ACF nested the following functions within the Office of Planning, Research, and Evaluation: strategic planning; performance measurement and management; research and evaluation; statistical policy and program analysis; synthesis and dissemination of research and evaluation findings; data quality, usefulness, and sharing; and application of emerging technologies to improve the effectiveness of programs and service delivery. The purpose of this reorganization was to consolidate evaluation, data, statistical, and related functions and to give the Deputy Assistant Secretary for Planning, Research, and Evaluation oversight of them.

Corporation for National and Community Service
  • CNCS’s Office of Research and Evaluation (R&E) Director oversees the development of social science research designed to measure the impact of CNCS programs and shape policy decisions; encourage a culture of performance and accountability in national and community service programs; provide information on volunteering, civic engagement, and volunteer management in nonprofit organizations; and assist in the development and assessment of new initiatives and demonstration projects. The R&E Director, who oversees R&E’s $4 million budget and a staff of 9 in FY17, reports directly to the CNCS Chief of Staff and is a member of CNCS’s Leadership Team and Policy Council. The R&E Director also meets regularly with CNCS Program Directors to identify areas where evidence can be generated and used for various decisions.
  • The R&E Director meets annually with all CNCS program offices to identify priorities and negotiate which pools of funds are needed to support the year’s priorities. The FY17 plan was developed through a series of formal and informal conversations.
Millennium Challenge Corporation
  • MCC’s Monitoring and Evaluation (M&E) Division, which falls within the Department of Policy and Evaluation (DPE), has a staff of 23 and an estimated FY17 budget of $21.2 million in due diligence (DD) funds. These resources are used to directly measure high-level outcomes and impacts in order to assess the attributable effects of MCC’s programs and activities. Departments throughout the agency have a total of $71.9 million in DD funds in FY17. The M&E Managing Director and the Vice President for the Department of Policy and Evaluation have the authority to execute M&E’s budget and inform policy decisions affecting independent evaluations. The M&E Managing Director participates in technical reviews of proposed investments as well as in regular monitoring meetings in order to inform policy and investment decisions. The Vice President sits on the agency’s Investment Management Committee, which examines the evidence base for each investment before it is approved by the MCC Board and exercises regular oversight of the compact (i.e., grant program) development process. MCC also recently appointed a new Chief Economist in DPE to oversee and strengthen the economic evidence base used for program development, including economic growth diagnostics, beneficiary analyses, and cost-benefit analyses.
Substance Abuse and Mental Health Services Administration
  • The director of SAMHSA’s Center for Behavioral Health Statistics and Quality (CBHSQ) Division of Evaluation, Analysis and Quality (DEAQ) serves as the agency’s evaluation lead with key evaluation staff housed in this division. In addition, the agency’s chief medical officer (CMO), as described in the 21st Century Cures Act, plays a key role in addressing evaluation approaches and the utilization of evidence-based programs and practices among grantees; at this time, a collaborative approach between CBHSQ and the Office of the CMO is being established to ensure broad agency evaluation oversight by senior staff. The Office of the CMO is housed within the agency’s emerging Mental Health Policy Lab (currently the Office of Policy, Planning and Innovation) and will influence evaluation policy decisions across the agency in a more systematic manner as the new Policy Lab is stood up in January 2018.
  • SAMHSA’s Office of Policy, Planning and Innovation (OPPI) provides policy perspectives and guidance to raise awareness around SAMHSA’s research and behavioral health agenda. OPPI also facilitates the adoption of data-driven practices among other federal agencies and partners such as the National Institutes of Health, the Centers for Disease Control and Prevention, and the Centers for Medicare and Medicaid Services.
  • At this time, evaluation authority, staff, and resources are decentralized and found throughout the agency. SAMHSA is composed of four Centers: the Center for Mental Health Services (CMHS), the Center for Substance Abuse Treatment (CSAT), the Center for Substance Abuse Prevention (CSAP), and the Center for Behavioral Health Statistics and Quality (CBHSQ). CMHS, CSAT, and CSAP oversee grantee portfolios and evaluations of those portfolios. Evaluation decisions within SAMHSA are made within each Center according to its program priorities and resources, and each of the three program Centers uses its program funds to conduct evaluations of varying types. CBHSQ, SAMHSA’s research arm, provides varying levels of oversight and guidance to the Centers for evaluation activities, along with technical assistance related to data collection and analysis to support the development of evaluation tools and clearance packages. Within CBHSQ’s DEAQ, the Quality, Evaluation, Performance Branch (QEPB) builds internal capacity to conduct more rigorous evaluations, both internally and externally, that assess the impact of SAMHSA’s behavioral health programs and treatment measures, while the Analysis and Services Research Branch (ASRB) focuses on effective delivery and financing of health care and services.
  • SAMHSA evaluations are funded from program funds that are used for service grants, technical assistance, and evaluation activities; evaluations have also been funded from recycled funds from grants or other contract activities. Given the broad landscape of evaluation authority and funding, a variety of evaluation models have been implemented. These include recent evaluations funded and managed by the program Centers (e.g., First Episode Psychosis, FEP); evaluations funded by the Centers but directed outside of SAMHSA (e.g., Assisted Outpatient Treatment, AOT); and those that CBHSQ directly funds and executes (e.g., Primary and Behavioral Health Care Integration, PBHCI, and the Cures-funded Opioid State Targeted Response funding). Evaluations require different degrees of independence to ensure objectivity, and these models afford SAMHSA the latitude to enhance evaluation rigor and independence on a customized basis.
  • In 2016, CBHSQ conducted a summer review of evaluation activities with the program Centers and presented its findings to the SAMHSA Executive Leadership Team (ELT). As a result, SAMHSA revised and finalized a new Evaluation Policy and Procedure (P&P), grounded in an earlier evaluation P&P, and is currently developing a Learning Agenda to prioritize activities that address gaps in data collection, data analysis, and the identification of evidence-based practices in high-profile areas (e.g., SMI, SED, opioids, marijuana, suicide, and health financing, among others). The new Evaluation P&P requires Centers to identify research questions and match the type of evaluation to the maturity of the program. A new workgroup, the Cross-Center Evaluation Review Board (CCERB), composed of evaluation experts from the four Centers (CSAP, CMHS, CSAT, and CBHSQ), will now review significant evaluations at critical milestones in the planning and implementation process and provide specific recommendations to the Center Director leading each evaluation. CCERB advises, conducts, collaborates, and coordinates on evaluation and data collection activities across SAMHSA, and its staff support both program-specific and administration-wide evaluations. SAMHSA’s CMO will also play a key role in reviewing evaluation proposals and clearing final reports.
U.S. Agency for International Development
U.S. Department of Education
  • ED’s Institute of Education Sciences (IES), with a budget of $605.3 million in FY17, has primary responsibility for education research, evaluation, and statistics. The IES Director is appointed by the President, confirmed by the U.S. Senate, and advises the U.S. Secretary of Education on research, evaluation, and statistics activities. Four Commissioners support the IES Director, including the Commissioner for the National Center for Education Evaluation and Regional Assistance (NCEE), who is responsible for planning and overseeing ED’s major evaluations. IES employed approximately 180 full-time staff in FY17, including approximately 25 in NCEE.
  • The Office of Planning, Evaluation, and Policy Development’s (OPEPD) Policy and Program Studies Service (PPSS) has a staff of 20 and serves as the Department’s internal analytics office. PPSS conducts short-term evaluations to support continuous improvement of program implementation and works closely with program offices and senior leadership to inform policy decisions with evidence. While some evaluation funding, such as that for Special Education Studies and Evaluations, is appropriated to IES ($10.8 million in FY17), most evaluations are supported by funds appropriated to ED programs. NCEE and PPSS staff work closely with program offices to design program evaluations that reflect program priorities and questions, and IES and PPSS provide regular briefings on results to help ensure the information can be used by program offices for program improvement.
  • IES and PPSS staff collaborate closely through ED’s Evidence Planning Group (EPG) with other senior staff from OPEPD, including Budget Service, and from the Office of Innovation and Improvement (OII). EPG supports programs and advises Department leadership on how evidence can be used to improve Department programs. For example, EPG has coordinated the development of revised evidence definitions and related selection criteria for competitive grant programs that align with the Elementary and Secondary Education Act, as amended by the Every Student Succeeds Act (P.L. 114-95) (ESSA). EPG has also facilitated cross-office alignment of evidence investments in technical assistance and the pooling of program funds for evaluations.
  • Senior officials from IES, OPEPD, and OII are part of ED’s leadership structure, and officials from OPEPD and OII weigh in on major policy decisions. OPEPD leadership plays a leading role in formulating the Department’s annual budget requests, recommending grant competition priorities (including evidence), and providing technical assistance to Congress to ensure that evidence informs policy design.
U.S. Dept. of Housing & Urban Development
  • HUD’s Office of Policy Development & Research (PD&R) informs HUD’s policy development and implementation by conducting, supporting, and sharing research, surveys, demonstrations, program evaluations, and best practices. PD&R achieves this mission through three interrelated core functions: (1) collecting and analyzing national housing market data (including with the Census Bureau); (2) conducting research, program evaluations, and demonstrations; and (3) providing policy advice and analytic support to the HUD Secretary and program offices. PD&R is led by an Assistant Secretary who oversees six offices, about 149 staff (including a team of field economists who work in HUD’s 10 regional offices across the country), and a budget of $113 million in FY17. The Assistant Secretary ensures that evidence informs policy development through frequent personal engagement with other principal staff, the Secretary, and external policy officials; HUDstat performance review meetings (see Question #4 below for a description); speeches to policy audiences; sponsorship of public research briefings; and policy implications memoranda. The Assistant Secretary also regularly engages with each HUD program office to ensure that metrics, evaluations, and evidence inform program design, budgeting, and implementation.
  • Periodic PD&R meetings with program offices enable PD&R to share knowledge about evaluation progress and program offices to share knowledge about emerging needs for research, evaluation, and demonstrations to advance program policy.
U.S. Department of Labor
  • DOL’s Chief Evaluation Officer is a senior official with responsibility for all activities of the Chief Evaluation Office (CEO), and coordination of evaluations Department-wide. In 2016, DOL’s Chief Evaluation Officer was converted to a career position, a change which more fully cements the principle of independence and reflects the Department’s commitment to institutionalizing an evidence-based culture at DOL. Evaluation results and products are approved and released by the Chief Evaluation Officer (as per the CEO Evaluation Policy), and disseminated in various formats appropriate to practitioners, policymakers, and evaluators.
  • The CEO includes 15 full-time staff plus a small number of contractors and 1-2 detailees at any given time. This staff level is augmented by staff from research and evaluation units in other DOL agencies; for example, the Employment and Training Administration has 9 FTEs dedicated to research and evaluation activities with whom CEO coordinates extensively. CEO staff have expertise in research and evaluation methods as well as in DOL programs and policies and the populations they serve. CEO also convenes technical working groups, whose members have deep technical and subject-matter expertise, on the majority of evaluation projects. Further, CEO staff engage and collaborate with program office staff and leadership to interpret research and evaluation findings and to identify their implications for programmatic and policy decisions.
  • In FY17, the CEO will directly oversee an estimated $40 million in evaluation funding, which includes the direct appropriation, the set-aside amount, and other funds from programmatic accounts where evaluations are co-funded. The $40 million includes the appropriated budget for Departmental Program Evaluation (over $8 million in FY17) and the Department’s evaluation set-aside funds (up to 0.75% of select departmental accounts), which will be approximately $24 million in FY17. CEO also collaborates with DOL and other federal agencies on additional evaluations carried out by other offices and/or supported by funds appropriated to DOL programs, such as Employment and Training Administration (ETA) pilots, demonstrations, and research, and evaluations of large grant programs including the Performance Partnership Pilots (P3), the American Apprenticeship Initiative (AAI), the Trade Adjustment Assistance Community College and Career Training (TAACCCT) Grant Program, and Reentry Programs for Ex-Offenders.
  • The CEO also participates actively in the performance review process, during which each operating agency meets with Department leadership to review progress on the performance goals established for the year, as required under the Government Performance and Results Act (GPRA).
  • The CEO’s role is to incorporate evidence and evaluation findings as appropriate, to identify knowledge gaps that might be filled by evaluations, and to convey evidence that can inform policy, program, and performance decisions. DOL’s Chief Evaluation Officer and senior staff are part of DOL’s leadership structure, weigh in on major program and policy decisions, and play a role in the formation of DOL agencies’ annual budget requests, recommendations for including evidence in grant competitions, and technical assistance to Department leadership to ensure that evidence informs policy design. Several mechanisms facilitate this: CEO participates in quarterly performance meetings with DOL leadership and the Performance Management Center (PMC); CEO reviews agency operating plans and works with agencies and the PMC to coordinate performance targets, measures, and evaluation findings; quarterly meetings are held with agency leadership and staff as part of the Learning Agenda process; and meetings are held as needed to strategize around new priorities or legislative requirements.