U.S. Department of Housing and Urban Development, Office of Policy Development and
Research. A Guide to Evaluating Crime Control Programs in Public Housing. Washington,
DC: Prepared for the U.S. Department of Housing and Urban Development by KRA Corporation.
Reporting Your Findings
A program evaluation report is an important document. It integrates what you have
learned from the evaluation of your initiative. There are different ways of reporting this
information, depending on how you want to use the report and on your audience. A program
evaluation report can do the following:
- Guide management decisions by identifying areas in which changes may be needed for
program improvement.
- Tell the story of implementing your initiative and show the impact on residents.
- Advocate your initiative to potential funders or to other agencies in the community.
- Improve violence prevention efforts in public housing.
These uses suggest that there are various audiences for an evaluation report. These
audiences can include staff and Public Housing Agency (PHA) administrators, current and
potential funding sources, other PHAs, and local and national advocacy organizations.
Whatever the type of report you plan to develop, it is critical to include
statistically nonsignificant analysis results as well as statistically significant ones,
because there is as much to learn from program approaches or models that do not work, and
why you surmise that they don't, as there is from those approaches that do appear to work.
Nonsignificant results should not be thought of as failures. Efforts to change
knowledge, attitudes, and behaviors through programmatic interventions are not always
going to work. Currently, so little is known about what does and does not work that any
information on violence prevention in public housing will greatly increase knowledge in
the field.
Preparing an evaluation report for program funders
The report to program funders will probably be the most comprehensive one you prepare.
Often funders will use your report to demonstrate the effectiveness of their grant
initiatives and to support allocation of additional funds for similar program-related
efforts. It is as important to your funding sources to show that your program is effective
and worthwhile (and a good use of their money) as it is for you to demonstrate that your
program works. A report that is useful for this purpose will include detailed information
about the program, the evaluation design and methods, and the types of data analyses
performed.
A sample outline of an evaluation report for program funders is shown on the following
pages. The outline is developed as a final report and assumes all the information
collected on your program has been analyzed. However, this outline may also be used for
interim reports, with different sections completed at various times during the evaluation
and feedback provided to program personnel on the ongoing status of the evaluation.
Final Evaluation Report
I. Introduction: General Description of the Initiative
(approximately one page long)
- Description of initiative components, including services or training delivered
and target population for each component.
- Description of collaborative efforts (if relevant), including the agencies
participating in the collaboration and their various roles and responsibilities in the
initiative.
- Description of strategies for recruiting residents (if relevant).
- Description of special issues relevant to serving the residents and plans to address
them, including:
1. Agency and staffing issues.
2. Residents' cultural backgrounds, socioeconomic status, literacy
levels, and so forth.
II. Evaluation of Implementation Objectives
A. Description of implementation objectives (in measurable terms).
- What you planned to do (planned services/interventions/training/education; duration and
intensity of each service/intervention/training period).
- Who you planned to have do it (planned staffing arrangements and
qualifications/characteristics of staff).
- Target population (intended characteristics and number of members of the target
population to be reached by each service/intervention/training/education effort and how
you plan to recruit residents).
- Description of the objectives for collaborating with community agencies.
Services/interventions/training provided by collaborating agencies.
B. Statement of evaluation questions (Were program
implementation objectives attained? If not, why not?
What were the barriers to and facilitators of attaining implementation objectives?).
- How successful was the program in attaining its objective of implementing an after
school program for resident youths? What were the policies, practices, and procedures
used to attain this objective? What were the barriers and facilitators to attaining this
objective?
- How successful was the program in recruiting the intended target population and serving
the expected number of participants? What were the policies, practices, and procedures
used to recruit and maintain participants in the program? What were the barriers and
facilitators to attaining this objective?
- How successful was the program in attaining its objective with respect to establishing
collaborative relationships with other agencies in the community? What were the policies,
practices, and procedures used to attain this objective? What were the barriers and
facilitators to attaining this objective?
C. Description of data collection methods and data collected
for each evaluation question.
1. Description of data collected.
2. Description of methodology of data collection.
3. Description of data sources (such as documents, staff, residents,
and collaborating agency staff).
D. Description of data analysis procedures.
E. Description of results of analysis.
1. Statement of findings with respect to each evaluation question.
- The program's success in attaining the objective.
- The effectiveness of particular policies, practices, and procedures in attaining the
objective.
- The barriers and facilitators to attaining the objective.
2. Statement of issues that may have affected the evaluation's results.
- The need to make changes in the evaluation because of changes in program implementation
or characteristics of the residents served.
- Staff turnover during the research, resulting in inconsistent data collection
procedures.
- Changes in evaluation staff.
III. Evaluation of Outcome Objectives
A. Description of outcome objectives (in measurable terms).
- What changes were residents expected to exhibit as a result of their participation in
each service/intervention/training module provided by the program?
- What changes were residents expected to exhibit as a result of participation in the
program in general?
- What changes were expected to occur in the community?
B. Statement of evaluation questions, evaluation design, and
method for assessing change for each question.
- How effective was the program in attaining its expected outcome of decreasing youth gang
involvement? How was this measured? What design was used to establish that a change
occurred and to relate change to the program's interventions (such as pre- and
post-intervention, control groups, and comparison groups)? Why was this design selected?
- How effective was the program in attaining its expected outcome of increasing youths'
self-esteem? How was this measured? What design was used to establish that a change
occurred and to relate the change to the interventions? Why was this design selected?
- How effective was the program in increasing the knowledge and skills of participants?
How was this measured? What design was used to establish that a change occurred and to
relate the change to the interventions? Why was this design selected?
C. Discussion of data collection methodologies for each evaluation question.
1. Discussion of data collected.
2. Discussion of methodology of data collection.
- Case record reviews.
- Self-report questionnaires or inventories. [If you developed an instrument for this
evaluation, attach a copy to the final report.]
3. Data sources for each evaluation question, and sampling plans.
D. Discussion of issues that affected the outcome evaluation
and how they were addressed.
1. Program-related issues.
- Staff turnover.
- Changes in target population characteristics.
- Changes in services/interventions during the course of the program.
- Changes in staffing plans.
- Changes in collaborative arrangements.
- Characteristics of participants.
2. Evaluation-related issues.
- Problems encountered in obtaining participant consent.
- Change in numbers of participants served, requiring change in analysis plans.
- Questionable cultural relevance of evaluation data collection instruments and/or
procedures.
E. Data analysis procedures.
- Summary of procedures.
- Distributions of program characteristics and participant (resident) characteristics.
- Descriptive measures (for example, averages or most commonly occurring responses).
- Cross tabulations (for example, outcome measures by various participant groups).
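To make these analysis procedures concrete, the following sketch computes a descriptive measure and a cross tabulation using only the Python standard library. The participant records, field names, and values are hypothetical, not drawn from any actual program data.

```python
from collections import Counter
from statistics import mean

# Hypothetical participant records: (age group, completed program?, sessions attended).
# All values are illustrative; a real analysis would use actual program data.
records = [
    ("10-13", "yes", 12),
    ("14-17", "no", 5),
    ("10-13", "yes", 10),
    ("14-17", "yes", 14),
    ("10-13", "no", 3),
    ("14-17", "no", 4),
]

# Descriptive measures: average sessions attended and the most
# commonly occurring completion response.
avg_sessions = mean(r[2] for r in records)
most_common_response = Counter(r[1] for r in records).most_common(1)[0][0]

# Cross tabulation: completion outcome by participant age group.
crosstab = Counter((age, done) for age, done, _ in records)

print(float(avg_sessions))         # 8.0
print(crosstab[("10-13", "yes")])  # 2
```

The same pattern extends to tabulating any outcome measure by any participant grouping listed above.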
F. Results of data analysis.
- Present statistically significant and nonsignificant analysis results (including
statement of established level of significance) for each outcome evaluation question.
- Discuss any issues or problems relevant to the analysis.
- Issues relevant to data collection procedures, particularly consistency in methods and
consistency among data collectors.
- Issues relevant to the number of participants served by the program and those included
in the analysis.
- Missing data or differences in sizes of samples for the analysis.
G. Discussion of results.
- Provide an interpretation of results for each evaluation question, including any
explanatory information from the process evaluation.
- The effectiveness of the program in attaining a specific outcome objective.
- Variables associated with attaining specific outcomes, such as characteristics of the
population, characteristics of the service provider or trainer, duration and/or intensity
of services or training, and characteristics of the service or training.
- Discuss any issues relevant to the interpretation of results.
IV. Integration of Process and Outcome Evaluation Information
- Summary of process evaluation results.
- Summary of outcome evaluation results.
- Discussion of potential relationships between implementation and outcome evaluation
information:
- Did particular policies, practices, or procedures used to attain program implementation
objectives have differing impacts on participant outcomes?
- How did practices and procedures used to recruit and maintain participants in services
affect participant outcomes?
- What collaboration practices and procedures were found to be related to attaining
expected community outcomes?
- Were particular training modules more effective than others in attaining expected
outcomes for participants? If so, what were the features of these modules that may have
contributed to their effectiveness (such as characteristics of the trainers,
characteristics of the curriculum, and the duration and intensity of the services)?
V. Recommendations to Program Administrators for Future Program and Evaluation Efforts
Preparing an evaluation report for staff and PHA personnel
An evaluation report for staff and Public Housing Agency (PHA) personnel may be used to
support management decisions about ongoing or future violence prevention efforts. This
type of report may not need to include as much detail on the evaluation methodology but
might focus instead on findings. The report could include the information noted in the
sample outline of the final evaluation report, including information in sections II. E.
(description of results of analysis of implementation information), III. D. (discussion of
issues that affected the outcome evaluation and how they were addressed), III. F. (results
of data analysis on outcome information), III. G. (discussion of results), and IV. C.
(discussion of potential relationships between implementation and outcome evaluation
information).
Final reports should be accompanied by an executive summary of one to five pages that
summarizes the key evaluation methods and results, so that readers will not have to review
all of the details of the report if they do not have the time.
Disseminating the results of your evaluation
In addition to producing formal evaluation reports, you may want to take advantage of
other opportunities to share what you have learned with people in your community or with
the field in general. You might want to consider drafting letters to community
organizations that may be interested in the activities and results of your work. Other
ways to let people know what you have done include the following:
- Producing press releases and articles for local professional publications such as
newsletters and journals.
- Making presentations on the results of your program at meetings at the local police
department, university, public library, or other settings.
- Listing your evaluation report or other evaluation-related publications in relevant
databases, on electronic bulletin boards, and with clearinghouses.
- Making phone calls and scheduling meetings with similar programs to share your
experience and results.
Many of the materials listed in the resources section of this manual contain ideas and
guidelines for producing different types of informational products related to evaluations.