Create a Strategy

Have you narrowed down the evaluation questions? Have you discussed how evaluation responsibilities might be divided? If so, you are ready to begin organizing the information and writing up an evaluation strategy. Keep in mind:

  • If you are working with an outside evaluator, it is important that you and your staff understand why the evaluator has chosen a specific design or methodology.
  • It is also important to have an expert review the evaluation strategy to confirm that the selected methodology can answer your evaluation questions.

The common ingredients of an evaluation strategy include:

  1. The purpose of the evaluation
  2. The list of evaluation questions
  3. The program logic model or a list of program objectives agreed upon by the evaluation team
  4. A list of data sources, data collection methods, and a strategy for data analysis
  5. Procedures for managing the evaluation, including the division of responsibilities and evaluation timelines

Below is a more detailed outline of an evaluation strategy, developed by the Administration for Children and Families. It is not necessary for your program to include all of the information in this outline. Given the specific purpose of your evaluation, the evaluation strategy that you develop may emphasize only certain components found here.

Sample Outline for Evaluation Plan

  • Evaluation framework
    • What you are going to evaluate
      • Program model (assumptions about target population, interventions, immediate outcomes, intermediate outcomes, and final outcomes)
      • Program implementation objectives (stated in general and then measurable terms)
        • What you plan to do and how
        • Who will do it
        • Participant population and recruitment strategies
      • Participant outcome objectives (stated in general and then measurable terms)
      • Context for the evaluation
    • Questions to be addressed in the evaluation
      • Are implementation objectives being attained? If not, why not (i.e., what barriers or problems have been encountered)? What kinds of things facilitated implementation?
      • Are participant outcome objectives being attained? If not, why not (i.e., what barriers or problems have been encountered)? What kinds of things facilitated attainment of participant outcomes?
        • Do participant outcomes vary as a function of program features (i.e., which aspects of the program are most predictive of expected outcomes)?
        • Do participant outcomes vary as a function of characteristics of the participants or staff?
    • Timeframe for the evaluation
      • When data collection will begin and end
      • How and why timeframe was selected
  • Evaluating implementation objectives – procedures and methods (Question 1: Are implementation objectives being attained, and if not, why not?)
    • Objective 1 (state objective in measurable terms)
      • Type of information needed to determine if objective 1 is being attained and to assess barriers and facilitators
      • Sources of information (i.e., where you plan to get the information including staff, participants, program documents); be sure to include your plans for maintaining confidentiality of the information obtained during the evaluation
      • How sources of information were selected
      • Timeframe for collecting information
      • Methods for collecting the information (such as interviews, paper and pencil instruments, observations, records reviews)
      • Methods for analyzing the information to determine whether the objective was attained (i.e., tabulation of frequencies, assessment of relationships between or among variables)
    • Repeat this information for each implementation objective being assessed in the evaluation.
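To make the analysis step concrete, the "tabulation of frequencies" mentioned above can be as simple as counting how often each barrier or facilitator is named across interviews. The sketch below uses hypothetical interview responses (the barrier labels are invented for illustration, not drawn from any real program):

```python
from collections import Counter

# Hypothetical coded responses from staff interviews about
# barriers to attaining an implementation objective
responses = [
    "staff turnover", "funding delays", "staff turnover",
    "recruitment", "funding delays", "staff turnover",
]

# Tabulate how often each barrier was mentioned
frequencies = Counter(responses)

# Report barriers from most to least frequently mentioned
for barrier, count in frequencies.most_common():
    print(f"{barrier}: {count}")
```

In practice the responses would come from coded interview transcripts or records reviews; the tabulation itself is the same regardless of source.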
  • Evaluating participant outcome objectives – procedures and methods (Question 2: Are participant outcome objectives being attained, and if not, why not?)
    • Evaluation design
    • Objective 1 (state outcome objective in measurable terms)
      • Types of information needed to determine if objective 1 is being attained (i.e., what evidence will you use to demonstrate the change?)
      • Methods of collecting that information (e.g., questionnaires, observations, surveys, interviews) and plans for pilot-testing information collection methods
      • Sources of information (such as program staff, participants, agency staff, program managers, etc.) and sampling plan, if relevant
      • Timeframe for collecting information
      • Methods for analyzing the information to determine whether the objective was attained (i.e., tabulation of frequencies, assessment of relationships between or among variables using statistical tests)
    • Repeat this information for each participant outcome objective being assessed in the evaluation
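As one illustration of "assessment of relationships between or among variables using statistical tests," a common approach for pre/post outcome data is a paired t statistic: the average change across participants divided by its standard error. The scores below are invented for illustration; a real evaluation would use the measures named in the outcome objective:

```python
import math
import statistics

# Hypothetical pre- and post-program scores for the same six participants
pre = [52, 48, 60, 55, 47, 58]
post = [58, 55, 63, 60, 50, 66]

# Change score for each participant
diffs = [after - before for before, after in zip(pre, post)]

mean_change = statistics.mean(diffs)

# Paired t statistic: mean change divided by its standard error
t_stat = mean_change / (statistics.stdev(diffs) / math.sqrt(len(diffs)))

print(f"mean change: {mean_change:.2f}, t = {t_stat:.2f}")
```

The t statistic would then be compared against a t distribution (here with 5 degrees of freedom) to judge whether the observed change is larger than chance would explain; most programs would hand this step to statistical software or a consulting evaluator.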
  • Procedures for managing and monitoring the evaluation
    • Procedures for training staff to collect evaluation-related information
    • Procedures for conducting quality control checks of the information collection process
    • Timelines for collecting, analyzing, and reporting information, including procedures for providing evaluation-related feedback to program managers and staff