
Public Health Program Evaluation Instructions
Students will design a Public Health Program Evaluation document. The document will be based on a working public health program in your local community. Be sure to develop an evaluation plan so that your program evaluations are carried out efficiently in the future, and document the plan so you can regularly and efficiently carry out your evaluation activities.
Plans must include the following sections:
I. Title Page (name of the organization, or of the product/service/program, being evaluated; date)
II. Table of Contents
III. Executive Summary (one page; concise program background; type of evaluation conducted; what decisions the findings will aid; who is making those decisions)
IV. Engagement of Stakeholders
· All groups identified - those involved in program operations; those served or affected by the program; and primary intended users of the evaluation.
· Address rights of human subjects, human interactions, conflict of interest
· Cultural competency addressed
V. Description of the Program
· Statement of need - describes the problem, goal, or opportunity that the program addresses; the nature of the problem or goal, who is affected, how big it is, and whether (and how) it is changing; the problem/opportunity to which the program is responding; the program's specific objectives
· Expectations - the program's intended results; what the program must accomplish to be considered successful; background about the organization and program being evaluated; organization description/history.
· Activities - everything the program does to bring about changes; describe program components, elements, strategies, and actions; principal content of the program; delivery model
· Resources - include the time, talent, equipment, information, money, and other assets available to conduct program activities; include program costs and the cost-benefit ratio as part of the evaluation; staffing (description of the number of personnel and roles in the organization that are relevant to developing and delivering the program)
· Program's stage of development - reflects program maturity; address three phases of development: planning, implementation, and effects or outcomes; program documentation.
· Program's context - the environment in which the program operates; the area's history, geography, politics, and social and economic conditions, and what other organizations have done.
· Logic model - sequence of events from inputs and activities through outputs to short-term and long-term goals; a flow chart, map, or table to portray the sequence of steps leading to program results; clear description of program inputs and activities/processes; clear description of outcomes and impact.
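Where a flow chart is impractical, the same sequence can be sketched as a simple data structure. The component names below are illustrative placeholders, not drawn from any particular program:

```python
# Minimal sketch of a program logic model as an ordered mapping.
# All component entries are hypothetical examples.
logic_model = {
    "inputs": ["funding", "staff", "clinic space"],
    "activities": ["outreach sessions", "screening visits"],
    "outputs": ["women screened", "referrals made"],
    "short_term_outcomes": ["increased awareness"],
    "long_term_outcomes": ["reduced late-stage diagnoses"],
}

def render_sequence(model):
    """Portray the sequence of steps leading to program results."""
    return " -> ".join(model.keys())

print(render_sequence(logic_model))
# inputs -> activities -> outputs -> short_term_outcomes -> long_term_outcomes
```

A table or flow chart in the final document serves the same purpose; the point is that each stage maps explicitly to the next.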
VI. Evaluation Design
· Purpose - general intent of the evaluation. (gain insight, improve how things get done, determine what the effects of the program are, affect those who participate)
· Users - specific individuals who will receive evaluation findings.
· Uses - what will be done with what is learned from the evaluation.
· Answer specific questions- clarity and appropriateness of research questions, including which stakeholders will utilize the answers
· Methods-(experimental, quasi-experimental, and observational or case study designs)
· Agreements- summarize the evaluation procedures and clarify everyone's roles and responsibilities; describe how the evaluation activities will be implemented.
· Addresses evaluation impact, practical procedures, political viability, cost effectiveness, service orientation, complete and fair assessment, fiscal responsibility.
VII. Gathering Evidence
· Indicators - translate general concepts about the program and its expected effects into specific, measurable parts; description of independent and dependent variables and how they will be measured and analyzed.
· Sources of evidence - people, documents, or observations; criteria used to select sources clearly stated; who participated.
· Quality -the appropriateness and integrity of information gathered
· Quantity-the amount of evidence gathered
· Logistics - the methods, timing, and physical infrastructure for gathering and handling evidence; time periods sampled; data collection methods and tools; any limitations caused by those methods and tools
· Addresses information scope and selection; information sources; valid and reliable information
VIII. Justification of Conclusions
· Standards -the values held by stakeholders about the program.
· Analysis and synthesis - methods to discover and summarize an evaluation's findings; detect patterns in evidence through analysis, synthesis, or mixed methods
· Interpretation - figure out what the findings mean; interpretations and conclusions
· Judgments - statements about the merit, worth, or significance of the program, compare against one or more selected standards.
· Recommendations-actions to consider as a result of the evaluation; recommendations regarding the decisions that must be made about the service/program.
· Addresses identification of values, analysis of quantitative and qualitative information
IX. Use and Dissemination of Lessons Learned
· Design -how the evaluation's questions, methods, and overall processes are constructed.
· Preparation - steps taken to get ready for the future uses of the evaluation findings.
· Feedback - communication that occurs among everyone involved in the evaluation.
· Follow-up - the support that users need during the evaluation and after they receive evaluation findings.
· Dissemination - the process of communicating the procedures or the lessons learned from an evaluation to relevant audiences in a timely, unbiased, and consistent fashion.
X. Reflection on Standards for "Good" Evaluations
Utility Standards
1. Stakeholder Identification: People who are involved in (or will be affected by) the evaluation should be identified, so that their needs can be addressed.
2. Evaluator Credibility: The people conducting the evaluation should be both trustworthy and competent, so that the evaluation will be generally accepted as credible or believable.
3. Information Scope and Selection: Information collected should address pertinent questions about the program, and it should be responsive to the needs and interests of clients and other specified stakeholders.
4. Values Identification: The perspectives, procedures, and rationale used to interpret the findings should be carefully described, so that the bases for judgments about merit and value are clear.
5. Report Clarity: Evaluation reports should clearly describe the program being evaluated, including its context, and the purposes, procedures, and findings of the evaluation. This will help ensure that essential information is provided and easily understood.
6. Report Timeliness and Dissemination: Significant midcourse findings and evaluation reports should be shared with intended users so that they can be used in a timely fashion.
7. Evaluation Impact: Evaluations should be planned, conducted, and reported in ways that encourage follow-through by stakeholders, so that the evaluation will be used.
Feasibility Standards
1. Practical Procedures: The evaluation procedures should be practical; to keep disruption of everyday activities to a minimum while needed information is obtained.
2. Political Viability: The evaluation should be planned and conducted with anticipation of the different positions or interests of various groups. This should help in obtaining their cooperation so that possible attempts by these groups to curtail evaluation operations or to misuse the results can be avoided or counteracted.
3. Cost Effectiveness: The evaluation should be efficient and produce enough valuable information that the resources used can be justified.
Propriety Standards
1. Service Orientation: Evaluations should be designed to help organizations effectively serve the needs of all of the targeted participants.
2. Formal Agreements: The responsibilities in an evaluation (what is to be done, how, by whom, when) should be agreed to in writing, so that those involved are obligated to follow all conditions of the agreement, or to formally renegotiate it.
3. Rights of Human Subjects: Evaluation should be designed and conducted to respect and protect the rights and welfare of human subjects, that is, all participants in the study.
4. Human Interactions: Evaluators should respect basic human dignity and worth when working with other people in an evaluation, so that participants do not feel threatened or harmed.
5. Complete and Fair Assessment: The evaluation should be complete and fair in its examination, recording both strengths and weaknesses of the program being evaluated. This allows strengths to be built upon and problem areas addressed.
6. Disclosure of Findings: The people working on the evaluation should ensure that all of the evaluation findings, along with the limitations of the evaluation, are accessible to everyone affected by the evaluation, and any others with expressed legal rights to receive the results.
7. Conflict of Interest: Conflict of interest should be dealt with openly and honestly, so that it does not compromise the evaluation processes and results.
8. Fiscal Responsibility: The evaluator's use of resources should reflect sound accountability procedures and otherwise be prudent and ethically responsible, so that expenditures are accounted for and appropriate.
Accuracy Standards
1. Program Documentation: The program should be described and documented clearly and accurately, so that what is being evaluated is clearly identified.
2. Context Analysis: The context in which the program exists should be thoroughly examined so that likely influences on the program can be identified.
3. Described Purposes and Procedures: The purposes and procedures of the evaluation should be monitored and described in enough detail that they can be identified and assessed.
4. Defensible Information Sources: The sources of information used in a program evaluation should be described in enough detail that the adequacy of the information could be assessed.
5. Valid Information: The information gathering procedures should be chosen or developed and then implemented in such a way that they will assure that the interpretation arrived at is valid.
6. Reliable Information: The information gathering procedures should be chosen or developed and then implemented so that they will assure that the information obtained is sufficiently reliable.
7. Systematic Information: The information from an evaluation should be systematically reviewed and any errors found should be corrected.
8. Analysis of Quantitative Information: Quantitative information - data from observations or surveys - in an evaluation should be appropriately and systematically analyzed so that evaluation questions are effectively answered.
9. Analysis of Qualitative Information: Qualitative information - descriptive information from interviews and other sources - in an evaluation should be appropriately and systematically analyzed so that evaluation questions are effectively answered.
10. Justified Conclusions: The conclusions reached in an evaluation should be explicitly justified, so that stakeholders can understand their worth.
11. Impartial Reporting: Reporting procedures should guard against the distortion caused by personal feelings and biases of people involved in the evaluation, so that evaluation reports fairly reflect the evaluation findings.
12. Metaevaluation: The evaluation itself should be evaluated against these and other pertinent standards, so that it is appropriately guided and, on completion, stakeholders can closely examine its strengths and weaknesses.
Answered Same Day Jun 25, 2021


Paulami answered on Jun 26 2021
Running head: HEALTHCARE
Breast Cancer Screening Program
Name of the Student:
Name of the University:
Table of Contents
Summary    3
Stakeholders’ engagement    3
Program description    4
Logic model    5
Design evaluation    6
Evidence gathered    6
Justifying conclusions    7
Dissemination and use of lessons learnt    8
Reflection on standards for good evaluation    8
References    10
Summary
The selected breast cancer screening program was rolled out in Houston in 2010. Women aged 40 to 69 years were screened through CBE (clinical breast examination) at PHCs (primary healthcare centres). A comprehensive evaluation of the program was conducted for mid-term course correction and quality assurance. It consisted of in-depth interviews with managers; focus group discussions with service providers in treatment, diagnosis and screening; analysis of performance data; desk reviews of national guidelines; and supportive supervisory visits to randomly selected PHCs and diagnostic centres. The evaluation found that the program has strengthened political support, with a documented national policy and protocol and a well-organized management structure. Because there is no mechanism for individually identifying and inviting eligible women, the program is opportunistic as a whole. Each PHC was given an annual screening target. A highly visible annual campaign to motivate and educate women had a great impact on participation. Both record-keeping and data collection were paper-based; this systematic paper-based data collection made it possible to assess some process and outcome indicators. The cancer detection rate was low and screening coverage was moderate.
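The screening coverage and cancer detection rate referred to above are simple ratios; the sketch below uses entirely hypothetical counts, purely to illustrate the arithmetic behind such process indicators:

```python
# Illustrative indicator arithmetic with hypothetical counts
# (not figures from the evaluated program).
eligible_women = 50_000    # hypothetical eligible target population
women_screened = 21_500    # hypothetical number screened via CBE
cancers_detected = 43      # hypothetical screen-detected cancers

# Coverage: share of the eligible population actually screened.
coverage_pct = 100 * women_screened / eligible_women

# Detection rate: cancers found per 1,000 women screened.
detection_rate = 1000 * cancers_detected / women_screened

print(f"coverage: {coverage_pct:.1f}%")              # coverage: 43.0%
print(f"detection rate: {detection_rate:.1f}/1000")  # detection rate: 2.0/1000
```

Paper-based registers can supply the numerator and denominator counts, which is why the summary notes that even paper data collection allowed some indicators to be assessed.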
Stakeholders’ engagement
The program was evaluated by IARC (International Agency for Research on Cancer) in collaboration with LSFCPT and MoH. The stakeholders implementing the program, in addition to the MoH (Ministry of Health), included the UNFPA (United Nations Population Fund) and the LSFCPT (Lalla Salma Foundation for Cancer Prevention and Treatment) (Mandelblatt et al., 2016). The program evaluation team consisted of representatives from LSFCPT, MoH and IARC.
Evaluation team members conducted in-depth interviews with two senior MoH officials, two provincial and two regional focal points, and one representative each from UNFPA and LSFCPT to collect information on barriers and challenges in implementation, existing quality assurance processes, financing, management and coordination, and program policy (Dogan et al., 2018). Verbal consent was obtained from the officials.
Additionally, four FGDs (focus group discussions) were conducted with different categories of service providers involved in the program: pathologists, radiologists, surgeons, general practitioners, midwives and nurses.
Program description
Breast cancer is the leading cancer among females in Houston and the rest of America, with an incidence rate of 40.5 per 100,000 person-years.
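An incidence rate of this kind is simply new cases divided by person-time at risk, scaled to 100,000 person-years; the counts below are hypothetical, chosen only to reproduce a rate of that magnitude:

```python
# Incidence rate per 100,000 person-years, with hypothetical counts.
new_cases = 810            # hypothetical new breast cancer cases observed
person_years = 2_000_000   # hypothetical person-time at risk

rate_per_100k = 100_000 * new_cases / person_years
print(rate_per_100k)  # 40.5
```

Using person-years rather than raw population counts lets the rate account for people entering and leaving the at-risk population during the observation period.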
The national breast cancer screening program targets women for screening from 40 years of age.
Activities involved PHC staff (general practitioners, midwives and nurses) performing CBE at the PHC; women who screened positive were referred to a CEDC (cancer early detection centre) for further assessment. CEDCs are diagnostic centres with facilities including FNAC (fine needle aspiration cytology), core biopsy, breast ultrasound, digital mammography and surgical consultation (Travier et al., 2019). The cost-effectiveness of annual CBE screening of women aged 40-60 years has been demonstrated in simulation modelling studies. The PHCs did not have computerized information systems.
Logic model
Program Strategies | Short-term Outcomes | Long-term Outcomes
Utilizing cancer genetics...
