U.S. Department of Housing and Urban Development, Office of Policy Development and Research. A Guide to Evaluating Crime Control Programs in Public Housing. Washington, DC: Prepared for the U.S. Department of Housing and Urban Development by KRA Corporation; 1997. pp. 8.1-8.13.

Chapter 8
    Reporting Your Findings

A program evaluation report is an important document. It integrates what you have learned from the evaluation of your initiative. There are different ways of reporting this information, depending on how you want to use the report and on your audience, and a program evaluation report can serve several different purposes.

These uses suggest that there are various audiences for an evaluation report. These audiences can include staff and Public Housing Agency (PHA) administrators, current and potential funding sources, other PHAs, and local and national advocacy organizations.

Whatever type of report you plan to develop, it is critical to include statistically nonsignificant analysis results as well as statistically significant ones. There is as much to learn from program approaches or models that do not work, and why you surmise they do not work, as there is from approaches that do appear to work.

Nonsignificant results should not be thought of as failures. Efforts to change knowledge, attitudes, and behaviors through programmatic interventions are not always going to work. Currently, so little is known about what does and does not work that any information on violence prevention in public housing will greatly increase knowledge in the field.

Preparing an evaluation report for program funders

The report to program funders will probably be the most comprehensive one you prepare. Funders often use such reports to demonstrate the effectiveness of their grant initiatives and to support allocation of additional funds for similar program-related efforts. It is as important to your funding sources to show that your program is worthwhile (and a good use of their money) as it is for you to demonstrate that your program works. A report that is useful for this purpose will include detailed information about the program, the evaluation design and methods, and the types of data analyses conducted.

A sample outline of an evaluation report for program funders is shown on the following pages. The outline is developed as a final report and assumes all the information collected on your program has been analyzed. However, this outline may also be used for interim reports, with different sections completed at various times during the evaluation and feedback provided to program personnel on the ongoing status of the evaluation.

Sample Outline
Final Evaluation Report

I. Introduction: General Description of the Initiative (approximately one page long)

  1. Description of initiative components, including services or training delivered and the target population for each component.
  2. Description of collaborative efforts (if relevant), including the agencies participating in the collaboration and their various roles and responsibilities in the initiative.
  3. Description of strategies for recruiting residents (if relevant).
  4. Description of special issues relevant to serving the residents and plans to address them.

            a. Agency and staffing issues.

            b. Residents' cultural backgrounds, socioeconomic status, literacy levels, and so forth.

II. Evaluation of Implementation Objectives

    A. Description of implementation objectives (measurable objectives).

  1. What you planned to do (planned services/interventions/training/education; duration and intensity of each service/intervention/training period).
  2. Who you planned to have do it (planned staffing arrangements and qualifications/characteristics of staff).
  3. Target population (intended characteristics and number of members of the target population to be reached by each service/intervention/training/education effort and how you planned to recruit residents).
  4. Description of the objectives for collaborating with community agencies.

            a. Planned collaborative arrangements.

            b. Services/interventions/training provided by collaborating agencies.

    B. Statement of evaluation questions (Were program implementation objectives attained? If not, why not? What were the barriers to and facilitators of attaining implementation objectives?).

    Examples:

  1. How successful was the program in attaining its objective of implementing an after-school program for resident youths? What were the policies, practices, and procedures used to attain this objective? What were the barriers and facilitators to attaining this objective?
  2. How successful was the program in recruiting the intended target population and serving the expected number of participants? What were the policies, practices, and procedures used to recruit and maintain participants in the program? What were the barriers and facilitators to attaining this objective?
  3. How successful was the program in attaining its objective with respect to establishing collaborative relationships with other agencies in the community? What were the policies, practices, and procedures used to attain this objective? What were the barriers and facilitators to attaining this objective?

    C. Description of data collection methods and data collected for each evaluation question.

    1. Description of data collected.

    2. Description of methodology of data collection.

    3. Description of data sources (such as documents, staff, residents, and collaborating agency staff).

   D. Description of data analysis procedures.

   E. Description of results of analysis.

    1. Statement of findings with respect to each evaluation question.

    Examples:

  1. The program's success in attaining the objective.
  2. The effectiveness of particular policies, practices, and procedures in attaining the objective.
  3. The barriers and facilitators to attaining the objective.

    2. Statement of issues that may have affected the evaluation's findings.

    Examples:

  1. The need to make changes in the evaluation because of changes in program implementation or characteristics of the residents served.
  2. Staff turnover during the research, resulting in inconsistent data collection procedures.
  3. Changes in evaluation staff.

III. Evaluation of Outcome Objectives

    A. Description of outcome objectives (in measurable terms).

  1. What changes were residents expected to exhibit as a result of their participation in each service/intervention/training module provided by the program?
  2. What changes were residents expected to exhibit as a result of participation in the program in general?
  3. What changes were expected to occur in the community?

    B. Statement of evaluation questions, evaluation design, and method for assessing change for each question (a sketch of one such analysis follows the examples below).

    Examples:

  1. How effective was the program in attaining its expected outcome of decreasing youth gang involvement? How was this measured? What design was used to establish that a change occurred and to relate the change to the program's interventions (such as pre- and post-intervention, control groups, and comparison groups)? Why was this design selected?
  2. How effective was the program in attaining its expected outcome of increasing youths' self-esteem? How was this measured? What design was used to establish that a change occurred and to relate the change to the interventions? Why was this design selected?
  3. How effective was the program in increasing the knowledge and skills of participants? How was this measured? What design was used to establish that a change occurred and to relate the change to the interventions? Why was this design selected?
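
    The design questions above can be made concrete with a small worked example. The sketch below is a hypothetical illustration only: it assumes Python with the pandas and scipy libraries, and the group labels and scores are invented rather than drawn from this guide. It shows one way a pre- and post-intervention design with a comparison group might be analyzed, by computing a change score for each resident and comparing the average change in the program group with that of the comparison group.

        # A hypothetical sketch only: the data and library choices (pandas, scipy)
        # are assumptions for illustration, not part of this guide.
        import pandas as pd
        from scipy import stats

        # Assumed layout: one row per resident, with pre- and post-program scores
        # on the outcome measure (for example, a self-esteem inventory).
        data = pd.DataFrame({
            "group": ["program"] * 4 + ["comparison"] * 4,
            "pre":   [10, 12, 9, 11, 10, 11, 12, 9],
            "post":  [14, 15, 12, 13, 10, 12, 11, 10],
        })

        # Change score for each resident.
        data["change"] = data["post"] - data["pre"]

        # Average change in each group.
        print(data.groupby("group")["change"].mean())

        # Independent-samples t-test on the change scores: did program participants
        # change more, on average, than the comparison group?
        program_change = data.loc[data["group"] == "program", "change"]
        comparison_change = data.loc[data["group"] == "comparison", "change"]
        t_statistic, p_value = stats.ttest_ind(program_change, comparison_change)
        print(f"t = {t_statistic:.2f}, p = {p_value:.3f}")

    The same structure applies whether the outcome measure is a self-esteem score, a knowledge test, or a count of reported incidents; only the columns and the choice of statistical test would change.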

    C. Discussion of data collection methodologies for each evaluation question.

    1. Discussion of data collected.

    2. Discussion of methodology of data collection.

    Examples:

  1. Case record reviews.
  2. Interviews.
  3. Self-report questionnaires or inventories. [If you developed an instrument for this evaluation, attach a copy to the final report.]
  4. Observations.

    3. Data sources for each evaluation question, and sampling plans when relevant.

    D. Discussion of issues that affected the outcome evaluation and how they were addressed.

    1. Program-related issues.

  1. Staff turnover.
  2. Changes in target population characteristics.
  3. Changes in services/interventions during the course of the program.
  4. Changes in staffing plans.
  5. Changes in collaborative arrangements.
  6. Characteristics of participants.

    2. Evaluation-related issues.

  1. Problems encountered in obtaining participant consent.
  2. Change in numbers of participants served, requiring change in analysis plans.
  3. Questionable cultural relevance of evaluation data collection instruments and/or procedures.

    E. Data analysis procedures.

  1. Summary of procedures.
  2. Distributions of program characteristics and participant (resident) characteristics.
  3. Descriptive measures (for example, averages or most commonly occurring responses).
  4. Cross tabulations (for example, outcome measures by various participant groups; see the sketch following this list).
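
    The sketch below illustrates the distributions, descriptive measures, and cross tabulations listed above. It assumes Python with the pandas library, and every column name and value is invented for illustration rather than taken from this guide.

        import pandas as pd

        # Hypothetical participant records; the columns and values are illustrative only.
        participants = pd.DataFrame({
            "age_group": ["12-14", "15-17", "12-14", "15-17", "15-17", "12-14"],
            "sessions":  [8, 3, 10, 6, 2, 9],
            "completed": ["yes", "no", "yes", "yes", "no", "yes"],
        })

        # Distribution of a participant characteristic.
        print(participants["age_group"].value_counts())

        # Descriptive measures: an average and the most commonly occurring response.
        print("Average sessions attended:", participants["sessions"].mean())
        print("Most common completion status:", participants["completed"].mode()[0])

        # Cross tabulation: an outcome measure by participant group.
        print(pd.crosstab(participants["age_group"], participants["completed"]))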

   F. Results of data analysis.

  1. Present statistically significant and nonsignificant analysis results (including a statement of the established level of significance) for each outcome evaluation question (see the sketch following the examples below).
  2. Discuss any issues or problems relevant to the analysis.

    Examples:

  1. Issues relevant to data collection procedures, particularly consistency in methods and consistency among data collectors.
  2. Issues relevant to the number of participants served by the program and those included in the analysis.
  3. Missing data or differences in sizes of samples for the analysis.
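
    As one illustration of reporting a result against a stated significance level (item F.1 above), the sketch below assumes Python with the pandas and scipy libraries, a hypothetical cross tabulation of an outcome by participant group, and a conventional 0.05 threshold. None of the figures come from this guide, and the result is reported whether or not it is statistically significant.

        import pandas as pd
        from scipy import stats

        ALPHA = 0.05  # assumed, stated level of significance for the report

        # Hypothetical cross tabulation of an outcome by participant group.
        table = pd.DataFrame(
            {"completed_program": [18, 9], "did_not_complete": [7, 16]},
            index=["attended_after_school", "did_not_attend"],
        )

        # Chi-square test of independence on the cross tabulation.
        chi2, p_value, dof, expected = stats.chi2_contingency(table)

        # Report the result either way, with the p-value and the stated threshold.
        verdict = "statistically significant" if p_value < ALPHA else "not statistically significant"
        print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.3f} "
              f"({verdict} at alpha = {ALPHA})")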

    G. Discussion of results.

  1. Provide an interpretation of results for each evaluation question, including any explanatory information from the process evaluation.
    a. The effectiveness of the program in attaining a specific outcome objective.
    b. Variables associated with attaining specific outcomes, such as characteristics of the population, characteristics of the service provider or trainer, duration and/or intensity of services or training, and characteristics of the service or training.
  2. Discuss any issues relevant to the interpretation of results.

IV. Integration of Process and Outcome Evaluation Information

    A. Summary of process evaluation results.
    B. Summary of outcome evaluation results.
    C. Discussion of potential relationships between implementation and outcome evaluation results.

    Examples:

  1. Did particular policies, practices, or procedures used to attain program implementation objectives have differing impacts on participant outcomes?
  2. How did practices and procedures used to recruit and maintain participants in services affect participant outcomes?
  3. What collaboration practices and procedures were found to be related to attaining expected community outcomes?
  4. Were particular training modules more effective than others in attaining expected outcomes for participants? If so, what were the features of these modules that may have contributed to their effectiveness (such as characteristics of the trainers, characteristics of the curriculum, and the duration and intensity of the services)?

V. Recommendations to Program Administrators for Future Program and Evaluation Efforts

Preparing an evaluation report for staff and PHA personnel

An evaluation report for staff and Public Housing Agency (PHA) personnel may be used to support management decisions about ongoing or future violence prevention efforts. This type of report may not need to include as much detail on the evaluation methodology but might focus instead on findings. The report could include the information noted in the sample outline of the final evaluation report, including information in sections II. E. (description of results of analysis of implementation information), III. D. (discussion of issues that affected the outcome evaluation and how they were addressed), III. F. (results of data analysis on outcome information), III. G. (discussion of results), and IV. C. (discussion of potential relationships between implementation and outcome evaluation results).

Final reports should be accompanied by an executive summary of one to five pages that summarizes the key evaluation methods and results, so that readers will not have to review all of the details of the report if they do not have the time.

Disseminating the results of your evaluation

In addition to producing formal evaluation reports, you may want to take advantage of other opportunities to share what you have learned with people in your community or with the field in general. You might want to consider drafting letters to community organizations that may be interested in the activities and results of your work, and there are many other ways to let people know what you have done.

Many of the materials listed in the resources section of this manual contain ideas and guidelines for producing different types of informational products related to evaluations.