Public Health Program Evaluation Instructions
Students will design a Public Health Program Evaluation document. The document will be based on a working public health program in your local community. Be sure to develop an evaluation plan to ensure your program evaluations are carried out efficiently in the future, and document the plan so you can carry out your evaluation activities regularly and efficiently.
Plans must include the following sections:
I. Title Page (name of the organization being evaluated, or whose product/service/program is being evaluated; date)
II. Table of Contents
III. Executive Summary (one page: concise program background, type of evaluation conducted, what decisions the evaluation findings will inform, and who is making those decisions)
IV. Engagement of Stakeholders
· All groups identified - those involved in program operations; those served or affected by the program; and primary intended users of the evaluation.
· Address rights of human subjects, human interactions, conflict of interest
· Cultural competency addressed
V. Description of the Program
· Statement of need - describes the problem, goal, or opportunity that the program addresses: the nature of the problem or goal, who is affected, how big it is, and whether (and how) it is changing; the problem/opportunity to which the program is responding; the program's specific objectives
· Expectations - the program's intended results, i.e., what the program must accomplish to be considered successful; background about the organization and program being evaluated; organization description/history.
· Activities - everything the program does to bring about changes; describe program components, elements, strategies, and actions; principal content of the program; delivery model
· Resources - include the time, talent, equipment, information, money, and other assets available to conduct program activities; include program costs and the cost-benefit ratio as part of the evaluation; staffing (the number of personnel and their roles in the organization relevant to developing and delivering the program)
· Program's stage of development - reflects program maturity; address three phases of development: planning, implementation, and effects or outcomes; program documentation.
· Program's context - the environment in which the program operates; the area's history, geography, politics, and social and economic conditions, and what other organizations have done.
· Logic model - the sequence of events linking inputs, activities, outputs, and short- and long-term outcomes; a flow chart, map, or table to portray the sequence of steps leading to program results; clear description of program inputs and activities/processes; clear description of outcomes and impact.
VI. Evaluation Design
· Purpose - general intent of the evaluation. (gain insight, improve how things get done, determine what the effects of the program are, affect those who participate)
· Users - specific individuals who will receive evaluation findings.
· Uses - what will be done with what is learned from the evaluation.
· Answer specific questions - clarity and appropriateness of research questions, including which stakeholders will use the answers
· Methods - experimental, quasi-experimental, and observational or case study designs
· Agreements- summarize the evaluation procedures and clarify everyone's roles and responsibilities; describe how the evaluation activities will be implemented.
· Addresses evaluation impact, practical procedures, political viability, cost effectiveness, service orientation, complete and fair assessment, and fiscal responsibility.
VII. Gathering Evidence
· Indicators - translate general concepts about the program and its expected effects into specific, measurable parts; description of independent and dependent variables and how they will be measured and analyzed.
· Sources of evidence - people, documents, or observations; criteria used to select sources clearly stated; who participated.
· Quality -the appropriateness and integrity of information gathered
· Quantity-the amount of evidence gathered
· Logistics - the methods, timing, and physical infrastructure for gathering and handling evidence; time periods sampled; data collection methods and tools; any limitations these methods and tools impose
· Address information scope and selection; information sources; valid and reliable information
VIII. Justification of Conclusions
· Standards -the values held by stakeholders about the program.
· Analysis and synthesis - methods to discover and summarize an evaluation's findings; detect patterns in evidence through analysis, synthesis, or mixed methods
· Interpretation - figure out what the findings mean; interpretations and conclusions
· Judgments - statements about the merit, worth, or significance of the program, compare against one or more selected standards.
· Recommendations - actions to consider as a result of the evaluation; recommendations regarding the decisions that must be made about the service/program.
· Addresses identification of values, analysis of quantitative and qualitative information
IX. Use and Dissemination of Lessons Learned
· Design -how the evaluation's questions, methods, and overall processes are constructed.
· Preparation - steps taken to get ready for the future uses of the evaluation findings.
· Feedback - communication that occurs among everyone involved in the evaluation.
· Follow-up - the support that users need during the evaluation and after they receive evaluation findings.
· Dissemination - the process of communicating the procedures or the lessons learned from an evaluation to relevant audiences in a timely, unbiased, and consistent fashion.
X. Reflection on Standards for "Good" Evaluations
Utility Standards
1. Stakeholder Identification: People who are involved in (or will be affected by) the evaluation should be identified, so that their needs can be addressed.
2. Evaluator Credibility: The people conducting the evaluation should be both trustworthy and competent, so that the evaluation will be generally accepted as credible or believable.
3. Information Scope and Selection: Information collected should address pertinent questions about the program, and it should be responsive to the needs and interests of clients and other specified stakeholders.
4. Values Identification: The perspectives, procedures, and rationale used to interpret the findings should be carefully described, so that the bases for judgments about merit and value are clear.
5. Report Clarity: Evaluation reports should clearly describe the program being evaluated, including its context, and the purposes, procedures, and findings of the evaluation. This will help ensure that essential information is provided and easily understood.
6. Report Timeliness and Dissemination: Significant midcourse findings and evaluation reports should be shared with intended users so that they can be used in a timely fashion.
7. Evaluation Impact: Evaluations should be planned, conducted, and reported in ways that encourage follow-through by stakeholders, so that the evaluation will be used.
Feasibility Standards
1. Practical Procedures: The evaluation procedures should be practical; to keep disruption of everyday activities to a minimum while needed information is obtained.
2. Political Viability: The evaluation should be planned and conducted with anticipation of the different positions or interests of various groups. This should help in obtaining their cooperation so that possible attempts by these groups to curtail evaluation operations or to misuse the results can be avoided or counteracted.
3. Cost Effectiveness: The evaluation should be efficient and produce enough valuable information that the resources used can be justified.
Propriety Standards
1. Service Orientation: Evaluations should be designed to help organizations effectively serve the needs of all of the targeted participants.
2. Formal Agreements: The responsibilities in an evaluation (what is to be done, how, by whom, when) should be agreed to in writing, so that those involved are obligated to follow all conditions of the agreement, or to formally renegotiate it.
3. Rights of Human Subjects: Evaluation should be designed and conducted to respect and protect the rights and welfare of human subjects, that is, all participants in the study.
4. Human Interactions: Evaluators should respect basic human dignity and worth when working with other people in an evaluation, so that participants do not feel threatened or harmed.
5. Complete and Fair Assessment: The evaluation should be complete and fair in its examination, recording both strengths and weaknesses of the program being evaluated. This allows strengths to be built upon and problem areas addressed.
6. Disclosure of Findings: The people working on the evaluation should ensure that all of the evaluation findings, along with the limitations of the evaluation, are accessible to everyone affected by the evaluation, and any others with expressed legal rights to receive the results.
7. Conflict of Interest: Conflict of interest should be dealt with openly and honestly, so that it does not compromise the evaluation processes and results.
8. Fiscal Responsibility: The evaluator's use of resources should reflect sound accountability procedures and otherwise be prudent and ethically responsible, so that expenditures are accounted for and appropriate.
Accuracy Standards
1. Program Documentation: The program should be described and documented clearly and accurately, so that what is being evaluated is clearly identified.
2. Context Analysis: The context in which the program exists should be thoroughly examined so that likely influences on the program can be identified.
3. Described Purposes and Procedures: The purposes and procedures of the evaluation should be monitored and described in enough detail that they can be identified and assessed.
4. Defensible Information Sources: The sources of information used in a program evaluation should be described in enough detail that the adequacy of the information could be assessed.
5. Valid Information: The information gathering procedures should be chosen or developed and then implemented in such a way that they will assure that the interpretation arrived at is valid.
6. Reliable Information: The information gathering procedures should be chosen or developed and then implemented so that they will assure that the information obtained is sufficiently reliable.
7. Systematic Information: The information from an evaluation should be systematically reviewed, and any errors found should be corrected.
8. Analysis of Quantitative Information: Quantitative information - data from observations or surveys - in an evaluation should be appropriately and systematically analyzed so that evaluation questions are effectively answered.
9. Analysis of Qualitative Information: Qualitative information - descriptive information from interviews and other sources - in an evaluation should be appropriately and systematically analyzed so that evaluation questions are effectively answered.
10. Justified Conclusions: The conclusions reached in an evaluation should be explicitly justified, so that stakeholders can understand their worth.
11. Impartial Reporting: Reporting procedures should guard against the distortion caused by personal feelings and biases of people involved in the evaluation, so that evaluation reports fairly reflect the evaluation findings.
12. Metaevaluation: The evaluation itself should be evaluated against these and other pertinent standards, so that it is appropriately guided and, on completion, stakeholders can closely examine its strengths and weaknesses.