Performance Measurement

Performance measurement is a research technique and tool used to help BJA meet the requirements of the Government Performance and Results Act Modernization Act (GPRA Modernization Act; Public Law 111-352), which was signed into law in 2011. The act requires federal agencies to report performance data more frequently and to link strategic planning to performance goals. BJA’s commitment to achieving the goals of the GPRA Modernization Act is outlined in the FY 2013-2018 Strategic Plan, which was unveiled in November 2012.

BJA believes that effective program management starts with meaningful performance measures designed to easily collect individual grantee performance data, along with a system to analyze these data to inform program management decisions. In FY 2012, BJA continued to invest in the Performance Measurement Tool (PMT), which its grantees use to report results. BJA uses a comprehensive analysis process to produce performance reports on grantee data. With the help of TTA partners, BJA analyzes data and targets strategies that address issues identified from the data. These analyses are compiled in a series of performance reports published by BJA, fostering transparency about the use of taxpayer funds.

BJA uses performance measures both to assess grantee progress toward program goals and objectives and as an accountability tool. Program performance measures are indicators, statistics, or metrics used to gauge program performance, conveying the extent to which each program’s purpose or goals are being met. BJA uses performance measures to set program priorities; allocate resources; adopt new program approaches or change processes; share results with appropriate stakeholders; and set expectations for grantees and mentor them.

FY 2012 Accomplishments

In FY 2012, BJA added new performance measures in the PMT for the Tribal Criminal and Civil Legal Assistance Program. BJA worked with stakeholders, practitioners, and GAO to refine performance measures for certain programs for which data were already being collected, to ensure relevancy, consistency, and accuracy. Performance measures were revised for the following programs in FY 2012:

Based on recommendations from GAO’s Adult Drug Court Grant Program review (GAO-12-53), BJA worked toward improving its performance management process by applying the recommendations in GAO’s Managing for Results: Enhancing Agency Use of Performance Information for Management Decision Making (GAO-05-927, September 9, 2005).

BJA also worked with analysts in FY 2012 to interpret grantee data that were used in written reports, data requests and data memos, research and evaluation, and GrantStat. The GrantStat process allows BJA to take a closer look not only at grantees but also at overall program success. Using these data, BJA is also better able to direct TTA.

Performance Measure Development

Literature Review and Logic Model Development

In FY 2012, BJA implemented the performance measure development process. This process ensures that performance measures reflect current research while still allowing for substantial stakeholder input. Developing new performance measures starts with a review of the program solicitation, as it outlines purpose areas and allowable activities within the program and serves as a guide for identifying outcomes. It is then important to develop a broader understanding of the program and the scientific literature on which it is based. The scientific literature helps to identify key program design features, processes, and outcomes. All of this information is then laid out in a logic model that ultimately informs the “problem statement,” goals of the program, inputs and outputs, activities, and program outcomes. The use of a logic model helps to identify the steps that must be taken (and measured) to achieve certain outcomes.

For example, in FY 2012, BJA started developing a comprehensive set of measures for the Tribal Court Assistance Program and the Indian Alcohol and Substance Abuse Prevention Program. BJA recognizes how important it is to develop measures that are meaningful and sensitive to tribal issues. Before the measures were drafted and while the logic model was being created, analysts reviewed the pertinent scientific literature. They consulted a wide range of papers, including Fahey, King, and Kane’s Crime and Justice in Indian Country: A Summary of Talking Circle Findings and the Tribal Law and Order Act of 2010, a 2011 report from the Crime and Justice Institute at Community Resources for Justice in Boston. The literature review helped identify issues and matters pertinent to tribal courts, such as the complexities of establishing a tribal court in Indian Country and addressing related jurisdictional issues.

In this process, BJA realized it was also essential to engage the TTA providers to help inform the process from a local perspective. These TTA providers generally have intimate knowledge of and experience with local tribal grantees and their programs. Feedback from TTA providers informed the development of the logic model and draft performance measures, ensuring these new measures are meaningful and responsive to tribal issues.

Pilot Period Review and Revisions

From time to time, BJA revises performance measures to better reflect grant activity, reflect new knowledge and research, and respond to the needs of legislators, the academic community, and other stakeholders. This was the case in FY 2012 for PDMP and the SCA Program.

The performance measures for the PDMP were updated in FY 2012 to reflect changes in the way researchers use the performance measurement data reported by grantees. Researchers told BJA that the timeframes for some of the measures BJA collects were not consistent with other research in the field, and BJA responded by adjusting the reporting guidelines. This included providing training and technical support to grantees during the transition and informing researchers of the changes. Grantees began reporting on the new measures in January 2013.

The revision of performance measures for the SCA Program was a multistep process that started with identifying problematic measures and developing revised measures with guidance from relevant stakeholders, such as TTA providers. Once a revised set of draft measures was developed, BJA staff and other stakeholders vetted the new instrument so that analysts could receive their feedback and suggested changes. The final step was a vetting process with grantees to gather more feedback and explain the data collection process to ensure accurate data collection. Feedback from these vetting sessions was incorporated into the revised measures, which were returned to BJA staff for final approval. The vetting process included a webinar attended by 196 grantees, representing 33 percent of all SCA organizations. In addition, 39 grantees attended in-person vetting workshops at the SCA Conference in May 2012.

Once the revised measures were made available in the PMT, grantees entered a pilot data entry period covering two FY 2012 quarters. The pilot period data were used to ensure that data entered by grantees were valid and reliable, and to give grantees a chance to become familiar with the reporting requirements, system, and process. In the final step, analysts conducted data validation and reliability checks to verify that the measures were accurate. This process included reviewing the new data to validate and, if necessary, edit the questions or better define what each measure was asking, and establishing additional validation checks and rules in the system.

Oversight and Accountability

In FY 2012, GAO conducted performance measurement reviews of the ADCDGP (GAO-12-53) and the ARRA JAG Program (GAO-11-87). For the drug court program, GAO made one recommendation for executive action: that BJA “document key methods used to guide future revisions of its adult drug-court program performance measures.” This recommendation included a plan for assessing measures after grantees report on them and a documented rationale for when measures are refined. BJA took action to respond to these findings and successfully closed out the GAO review.

For the ARRA JAG program, GAO made two recommendations for executive action. First, GAO recommended that BJA “consider, as appropriate, key attributes of successful performance measurement systems, such as clarity, reliability, linkage, objectivity, and measurable targets.” In response, BJA received feedback from more than 100 state and local grantees to assist with developing relevant performance measures. The process included telephone interviews with grantees, a focus group, and meetings with stakeholders. Analysts and staff conferred with experts in performance measurement and evaluation of formula grant programs, made presentations to constituent groups, and vetted the measures with grantees.

For ARRA JAG, GAO also recommended that BJA develop a mechanism to validate the integrity of performance data. In response, BJA implemented system-level validation rules in the PMT that help verify and identify inconsistent responses at the data entry level.
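To illustrate the kind of rule involved (the PMT's actual schema and rule set are not spelled out here, so the field names and checks below are hypothetical), a system-level validation at data entry can flag responses that are logically inconsistent, such as a report showing more graduates than enrollees:

```python
# Hypothetical sketch of system-level validation rules of the kind GAO
# recommended; field names and checks are illustrative, not the PMT's
# actual schema.

def validate_report(report):
    """Return a list of validation errors for one grantee report."""
    errors = []
    # Counts can never be negative.
    for field in ("enrolled", "graduated", "tests_administered"):
        if report.get(field, 0) < 0:
            errors.append(f"{field} cannot be negative")
    # A subtotal cannot exceed its total.
    if report.get("graduated", 0) > report.get("enrolled", 0):
        errors.append("graduated exceeds enrolled")
    return errors

print(validate_report({"enrolled": 40, "graduated": 55}))
# → ['graduated exceeds enrolled']
```

Rules like these catch data-entry mistakes at the moment of reporting, before inconsistent figures reach analysts.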

Selected Programs and PMT Data Profiles

Adult Drug Court Discretionary Grant Program

The ADCDGP is intended to build and/or expand drug court capacity at the state, local, and tribal levels to reduce crime and substance abuse among high-risk, high-need offenders. Some of the key components that serve as guidelines for drug court operations include early intervention and intensive treatment, close judicial supervision, mandatory and random drug and alcohol testing, community supervision, appropriate incentives and sanctions, and recovery support services.

As of the July–September 2012 reporting period, 1,148 drug courts were operating and reporting data into the PMT. The ADCDGP had an overall graduation rate of 47 percent in FY 2012, coming within 1 percentage point of its goal of 48 percent (Figure 1); in the last two quarters of FY 2012, the target graduation rate was achieved. In total, 5,316 drug court participants graduated from their programs. BJA is striving to ensure that drug courts institutionalize the use of validated risk and needs assessment instruments. In FY 2012, 14,106 assessments of participants were completed, revealing that more than half (52 percent) were high-risk, high-need offenders. Randomized drug and alcohol testing is an important component of the drug court program. In FY 2012, more than 66,000 drug and alcohol tests were administered, and only approximately 17 percent revealed illegal substance use.
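The percentages above follow directly from the reported counts. As a minimal sketch of the arithmetic (the 7,335 high-risk count below is back-calculated for illustration, not a figure reported in the PMT):

```python
# Illustrative percentage computation for PMT-style counts; the
# specific high-risk count used here is an assumption for the example.

def pct(part, whole):
    """Share of a whole, rounded to the nearest whole percent."""
    return round(100 * part / whole)

# If, say, 7,335 of the 14,106 assessed participants were high-risk,
# high-need, that is 52 percent:
print(pct(7335, 14106))  # → 52
```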

The drug court program has also noted a number of successes represented in the qualitative data collection portion of the PMT. As seen in Case Study 1, these results show evidence of grantee efforts and funding impact.

Second Chance Act Adult Offender Reentry Demonstration Program

As part of BJA’s evidence-based initiative, the SCA Adult Offender Reentry Demonstration Program collects performance measures on a number of key areas, such as program capacity; new admissions; percentage of high-, moderate-, and low-risk offenders; employment outcomes; and program completion. Through measuring program performance in this way, BJA can identify program success, offer evidence to inform the program, and prepare information and findings to share with the criminal justice field.


The reentry program had a number of successes in FY 2012. The overall completion rates were 83 percent for pre-release and 47 percent for post-release programs. The reentry program has also been successful in serving its target populations. The program was intended to focus on moderate- to high-risk offenders convicted as adults. Figure 2 indicates that over the last three reporting periods, an average of 89 percent of enrollees assessed using a valid risk and needs assessment instrument were found to be either moderate or high risk, indicating that the program is reaching its intended population.

The reentry program has also had a number of successes reported by the grantees in the qualitative data collection portion of the PMT. As seen in Case Study 2, these results show the specific effects of the programs at the street level and offer evidence of grantee efforts.

Edward Byrne Memorial Justice Assistance Grant Program

The JAG Program is the leading source of federal justice funding to state and local jurisdictions. The program provides states, tribes, and local governments with critical funding necessary to support a range of program areas.

BJA’s review of performance data submitted by JAG grantees revealed a need to revise the measures, both to reduce the burden placed on grantees to collect large amounts of data and to ensure that all grant recipients clearly understand what is being asked of them. The ultimate goal of the revision process was to produce measures that accurately convey the value of JAG grants and improve the overall quality of programs.

Over the past 2 years, BJA has been revising the performance measures for the JAG Program in response to the GAO report Recovery Act: Department of Justice Could Better Assess Justice Assistance Grant Program Impact (GAO-11-87). To address the report findings, BJA developed a process to understand the concerns of grantees and to better respond to stakeholders. This required a new set of measures to meet BJA’s reporting needs to internal and external stakeholders. The measures were also developed to give grantees the opportunity to better describe how they used JAG funding and what their programs accomplished during the reporting period.

Under the new measures, grantees report their data according to the seven purpose areas listed below. Their reporting gives detailed data on the amount of funding spent on each purpose area and the activities conducted in each (Figure 3). JAG grantees began to collect data on the new JAG performance measures in April 2012 and submitted their first data report in June 2012. The following is a list of the seven program areas and the types of data collected with the new measures:

In addition to the new questions, the PMT questionnaires now include a narrative section for grantees to provide details of their accomplishments during the reporting period (Case Study 3). Accomplishments include any additional information or program accomplishments grantees want to share with BJA, including benefits or changes observed as a result of their JAG-funded activities.

Research and Data Analysis

Program Performance Reports

In FY 2012, BJA expanded reporting of grantee performance information by adopting a new format for reports. Responding to feedback from BJA staff, grantees, and the general public, BJA unveiled new Program Performance Reports. Designed to be short and flexible, these reports include information on the status of up to six key performance variables that give an overall performance picture across all grantees. These reports are used by BJA staff to identify issues in grantee performance or reporting, are distributed to grantees to allow for peer-to-peer performance comparisons, and are posted on the BJA web site. Sample Program Performance Reports can be found on BJA's SCA Program web page.


Annual Reports

Annual reports provide an overall picture of grantee and program performance in a fiscal year. This gives BJA and the public a strategic view of grant activity and allows for adjustments in grantee strategy and overall program management. Selected annual reports are posted on BJA’s web site. In FY 2012, BJA produced a special report on JAG Drug Task Force performance measures that detailed the activity and findings of a task force convened to gather information from grantees and stakeholders on those measures. The task force’s goals were to improve the findings drawn from the data that grantees report and to reduce the level of effort required of grantees to report this information. This report can be found at

Closeout Reports

As noted previously, BJA revises performance measures to better reflect grant activity or respond to the needs of BJA staff, legislators, the academic community, and other stakeholders. When measures are revised, BJA often produces a closeout report of the existing measures in preparation for rollout of the new measures. Whenever possible, these reports are posted on BJA’s web site. For example, the closeout report for the previous JAG measures can be found at; the closeout report for the measures used for the Justice and Mental Health Collaboration Program through FY 2011 is available at

Validity and Reliability Report

When new measures are developed, BJA produces a Validity and Reliability Report, which details the measurement and analysis properties of the new measures. The results of the analysis are used to revise the measures before they are finalized and used in future reporting periods. The Drug Courts Quality Assurance Plan, for example, can be accessed at

Special Reports

BJA also produces a variety of special reports to meet specific needs of the criminal justice field, including three created in FY 2012:


During FY 2012, BJA replicated and expanded its GrantStat process, which is used to continuously analyze and monitor grant and program performance. GrantStat is based on the anticrime strategy CompStat, which law enforcement agencies across the country use to help reduce crime through systematic data collection, crime analysis, and heightened accountability. GrantStat helps BJA staff assess program performance to determine the health of a cohort of grantees, address the needs of individual grantees, and identify promising practices that can be studied further and shared with others.

In FY 2012, BJA revisited the programs reviewed in FY 2011 (the Adult Drug Court Program, the Correctional Facilities on Tribal Land program, and the SCA Reentry Demonstration program) and expanded the process to include SAVIN and SPI. During the GrantStat review process, BJA identified opportunities to improve grantee performance by targeting technical assistance to specific grantees. The process facilitated information exchange between BJA staff and TTA providers, resulting in the identification of program-wide issues and trends that had not previously been visible. In addition to program management changes, BJA has changed some internal processes as a result of GrantStat. These include adding information to solicitations to help applicants design and propose projects that use evidence-based techniques, and revising program performance measures to address information needs identified during GrantStat.

Training and Technical Assistance

In FY 2012, the PMT Help Desk fielded and responded to 6,402 technical assistance requests from BJA grantees and conducted 48 formal and informal (webinar) training events. A total of 3,091 BJA grantees and federal staff attended at least one training event. Figure 4 presents a breakdown of technical assistance requests by grant program and type for each quarter in FY 2012.