The Department continually assesses its RD&T programs to ensure their effectiveness and performance. This section of the RD&T Annual Funding Report discusses DOT's RD&T evaluation strategy, which is depicted in Table 2.
As shown, this broad strategy comprises three principal mechanisms: (1) systematic application of the Administration's R&D Investment Criteria and Program Assessment Rating Tool (PART); (2) annual internal review of operating administration RD&T programs; and (3) external RD&T coordination and review.
To guide the planning and management of research across the government, the Administration has established three investment criteria for RD&T: relevance, quality, and performance. The criteria incorporate established best practices for research evaluation as identified by the National Academy of Sciences, Government Accountability Office, and others. Each criterion has both prospective and retrospective elements:
Relevance. Programs must have complete plans, with clear goals and priorities; must articulate their potential public benefits; and must be relevant to national and customer needs.
Quality. Programs must use clearly stated, defensible methods for awarding funding; those allocating funds through means other than a competitive, merit-based process must document how quality is maintained.
Performance. Programs must maintain long-term objectives, with annual measures and targets, and define appropriate outputs, outcomes, schedules, and decision points.
The PART contains questions that assess how well agencies are implementing the investment criteria. On the basis of the PART, programs are rated Effective, Moderately Effective, Adequate, Ineffective, or Results Not Demonstrated. To date, RD&T programs in FAA, FHWA, FRA, FTA, NHTSA, PHMSA, and RITA have been assessed through the PART.
Within the Department, there are two primary mechanisms for ensuring implementation of the R&D Investment Criteria and PART: (1) evaluation of RD&T best practices, and (2) research investment planning.
In an August 2006 report on RD&T coordination in the Department, the U.S. Government Accountability Office (GAO) recommended that DOT develop "a strategy to ensure that the results of all of DOT's RD&T activities are evaluated according to established best practices."1 In response to this recommendation, the Department's RD&T Program Review Working Group, representing each of the research-performing DOT administrations and OST, identified a set of evaluation best practices to be the focus of periodic RD&T program reviews. Shown in Table 3, the evaluation best practices reflect guidelines for research evaluation recommended by the GAO;2 the National Academies Committee on Science, Engineering, and Public Policy;3 and OMB.4
In FY 2007, the RD&T Program Review Working Group assessed operating administrations' adherence to these practices using a set of formal evaluation guidelines and agreed-upon implementation criteria. The results of this review process are presented in Table 4.
As shown in the table, the reviews demonstrated the following with regard to implementation of the RD&T evaluation best practices:
External Stakeholder Involvement. Most DOT operating administrations have a transparent and consistent process for involving stakeholders in their RD&T programs. However, some do not have a clear, documented process for responding to stakeholder recommendations, making stakeholder involvement an area for follow-up in future RD&T program reviews.
Merit Review of Competitive Proposals. Across the Department, competitive RD&T grants and contracts are awarded based upon merit review. These procedures are transparent and clearly documented.
Independent Expert Review. Operating administrations are in compliance with OMB guidelines for peer review of "Highly Influential" and "Influential" research. However, some administrations do not use independent expert review to evaluate significant RD&T programs or apply its results to guide program decisions, making independent expert review an area for follow-up in future RD&T program reviews.
RD&T Performance Measures. Most operating administrations have documented long-term objectives for significant RD&T programs and all have annual RD&T milestones. In addition to these measures, common performance measures can be developed for RD&T for inclusion in the next Transportation RD&T Strategic Plan.
RD&T Coordination. Operating administrations have demonstrated support for the RD&T strategies and for cooperative research both within and outside DOT.
In concert with the RD&T Planning Council and RD&T Planning Team, RITA is designing a new research investment planning process to fulfill its statutory responsibility for coordinating, facilitating, and reviewing the Department's research and development programs and activities. The research investment planning process will help DOT institute a collaborative approach to RD&T investments across the Department.
In conjunction with the implementation of the new research investment planning process, RITA is creating an online searchable database to inventory and track all RD&T activities, from budget planning through execution. The goal is to achieve greater budget transparency and bring into one database all of the RD&T data that are currently scattered among the different operating administrations. When completed, the database will allow policy makers, researchers, and other users to search for RD&T information by research topic, funding level, grant description, contractor, State, Congressional district, and more. As such, the database will be a critical tool for coordinating DOT's research investments.
A critical element of the RD&T evaluation best practices and research investment planning process is systematic consultation and engagement with external stakeholders and experts. Such activities avoid duplication, uphold the technical quality of research, and ensure that RD&T programs are wise public investments that address critical needs.
Within the operating administrations, external coordination and review are essential for establishing RD&T priorities, programmatic activities, and performance metrics. Table 5 summarizes operating administrations' external RD&T evaluation processes, the results of their most recent evaluations, and a schedule for planned reviews in FY 2008.
1 Opportunities for Improving the Oversight of DOT's Research Programs and User Satisfaction with Transportation Statistics. GAO-06-917. August 2006. p. 13.
2 Highway Research: Systematic Selection and Evaluation Processes Needed for Research Program. GAO-02-573. May 2002.
3 Evaluating Federal Research Programs: Research and the Government Performance and Results Act. Washington, D.C.: National Academy Press, 1999.
4 OMB, Guidance for Completing the Program Assessment Rating Tool (PART). March 2005.