Evaluating Your Community Based Program Part I
Copyright © 2006
This publication was supported by the American Academy of Pediatrics (AAP) Healthy
Tomorrows Partnership for Children Program/Cooperative Agreement # U50MC00109 from
the Maternal and Child Health Bureau (MCHB). Its contents do not necessarily represent the
official views of the MCHB or AAP.
4-69/rev2013
Table of Contents
Introduction
What Is Evaluation?
• Definition
• Why Evaluate?
• Guidelines for Program Evaluation
• Who Should Be Involved?
• Technology and Program Evaluation
• When Should Evaluation Begin?
Conclusion
Glossary
Introduction
This publication is the first of a 2-part guide to program evaluation developed by the American
Academy of Pediatrics (AAP) for Healthy Tomorrows Partnership for Children Program grantees and
applicants. It is also intended to be useful to pediatricians and others implementing community-based
programs. The purpose of the guide is to provide quick overviews of major issues in program evalua-
tion and to point you toward the broad array of resources for high-quality program evaluation that are
available. After reading Evaluating Your Community-Based Program—Part I: Designing Your Evaluation,
you will:
1 Understand the roles evaluation plays in program design and improvement.
2 Understand the importance of stakeholder input and involvement in your evaluation design.
3 Be able to define the outcome(s) your program plans to accomplish.
4 Complete a logic model for your program.
5 Know where to go for additional information on these topics.
Part I of this guide focuses on understanding and planning a good evaluation. Part II will emphasize
effective documentation to evaluate your program. It will also help you decide how to measure your
objectives and collect, analyze, and present the resulting data meaningfully and efficiently.
The guide is structured in a workbook format, so there is space to apply each concept to your project
as you go along. Each section also includes a case study example to demonstrate how evaluation ideas
within a single program will develop over time.
We have included a glossary and appendix of additional resources at the end of this installment. Terms
that appear in bold italics throughout this guide are defined in the glossary. We’ve also included
Jargon Alerts in relevant sections to help you understand unfamiliar terms.
What Is Evaluation?
Definition
Evaluation involves the development and implementation of a plan to assess your program in a sys-
tematic way, through quantitative and qualitative measures. Evaluation is a form of ACTION research
that seeks to provide information that is USEFUL for program development and improvement, pro-
gram replication, resource allocation, and policy decisions. Evaluation is an ongoing process: you
decide what you need to do, collect data, and review and learn from that data. Then you adjust your
program and your data collection, you collect more data, and so on. It is a cycle (Figure 1).
Figure 1. The evaluation cycle: 1 Planning → 2 Implementing → 3 Reviewing → 4 Adjusting
As you can see in Figure 1, evaluation is integral to good program management. In Part I, we will
focus largely on Step 1—Planning Your Program and, by extension, your evaluation. Steps 2 through
4 will be covered in Part II. For now, we will lay the groundwork for a successful evaluation to carry
you through steps 2, 3, and 4.
Why Evaluate?
A good evaluation plan, especially one that is developed early in the life of the program, will help your
program run more smoothly and substantially increase the likelihood that you will be able to demon-
strate its impact. Implementing a high-quality evaluation:
1 Articulates what the program intends to accomplish with clearly stated goal(s) and outcomes
2 Ensures that the program’s resources and activities are targeted and focused on achieving
those outcomes
3 Sets up a systematic plan for data collection
4 Allows you to use data to improve the program’s processes and outcomes
While each program will have some unique requirements, almost all programs must engage in
the following activities:
• Check Your Process: Are you doing what you said you would do?
• Determine Your Impact: Are you having the desired effect in the target population?
• Build Your Base of Support: Can you generate information and evidence to share with funders
and other stakeholders?
• Justify Replication: Is there evidence to support the expansion or replication of this program?
In addition, virtually all programs must have adequate documentation to meet funder or other
requirements. Without documentation, you will not be able to fully describe what you did or the
effect it had. A well-designed evaluation should include developing a documentation system to ensure
you get all the information needed to answer the types of questions posed above. You owe it to your-
self and your program to make sure you get credit for all that you do. And if you didn’t document it,
it’s as if you didn’t do it!
Check Your Process
A process evaluation asks whether you are doing what you said you would do. Without answering such questions, it will be hard to know what type and magnitude of impact you should expect as a result of your project. Gathering this information regularly also gives you the opportunity to better adjust your services and target your resources to achieve the desired impact.

JARGON ALERT
Process Evaluation
Process evaluation examines the implementation of your program. Often this means measuring the method, order, frequency, and duration with which services are provided or tasks are completed, as well as who receives the services. Information needs for a process evaluation are likely to include things like number served, patient characteristics, number of contacts with a program, number of trainings, or number of referrals. Process evaluation also may look at differences in service provision among populations (children vs adults, males vs females) or differences in service provision over time. (Did you provide more, less, or the same amount of services this April as last?)
Determine Your Impact
Impact is often assessed by measuring specific outcomes. An outcome is a result produced by a pro-
gram or project beyond the point of service, such as changes in health status, incidence, or prevalence
within a given community or targeted population.
For example, one project could seek to reduce risky health behaviors (unprotected sex, smoking, etc)
among teenagers. Another will expect to reduce the number of children and families in its community
who have not received dental care in the last 5 years. A good outcome evaluation plan will define
outcomes to assess your success in accomplishing your specific goals, asking questions like: Did you
increase awareness among care providers of the dangers of
exposure to lead for children? Did you reduce dependence
among recent immigrants on emergency medical services?
Any of these impacts you hope to make in your target community could also be described as the outcomes of your program.

JARGON ALERT
Outcome Evaluation
Outcome evaluation tries to answer the question, "Is my project making a difference?" Outcomes try to describe the impact of a program on a community beyond the point of service. Outcome evaluation will help you to answer questions about your project's impact over time.
Outcome evaluation examines changes in health status, behavior, attitude, or knowledge among program participants or the target population. The same evaluation can examine both short-term (eg, "attended school") and long-term (eg, "graduated from high school") outcomes.

Build Your Base of Support
A good evaluation will generate information that will help promote your program to stakeholders and funders. Most funders, from the federal government down to the smallest family foundation, are looking for evidence of well thought out and documented programs, exactly what you would acquire through good process evaluation. Almost all are particularly interested in funding programs that can demonstrate a positive impact on the lives of individuals or the community through careful outcome evaluation. Both process and outcome evaluation will also provide information you can use to publicize your program to community members, board members, and potential clients.
Justify Replication
Once they demonstrate success in one location or with one population, programs are often interested
in replicating their process in multiple locations or with different populations. Collecting sound evalu-
ation data is the only way to determine whether and how your program should be replicated or
expanded. What degree of impact can you demonstrate that might persuade others to support the
replication or expansion of your program? What information can you offer to support an argument
that this program would be successful across different communities? What process information is
available to help implement the program in another setting?
Careful evaluation—both process and outcome evaluation—is critical to ensure that both you and
others know not only that your program is successful, but how it is successful.
Guidelines for Program Evaluation
Meet basic scientific standards for data collection. Basic scientific standards require that data are
consistently and objectively collected and recorded in a format that will be useful for analysis across
the life of the research project. Your evaluation will only be as good as the measures you plan for and
the data you collect.
Comply with standards for protection of client confidentiality. The Health Insurance Portability
and Accountability Act (HIPAA) and the accompanying privacy regulations set national standards for
protecting personal health information. The act allows health care providers to use patient data for
program evaluation and quality improvement activities, which are considered health care operations.
It also sets out the specific conditions under which data can be disclosed for research purposes and
creates a “safe harbor” when data is de-identified by removing 18 specific identifiers: (1) names;
(2) all geographic subdivisions smaller than a state, except for the initial 3 digits of the ZIP code
if the geographic unit formed by combining all ZIP codes with the same 3 initial digits contains
more than 20,000 people; (3) all elements of dates except year, and all ages over 89 or elements
indicative of such age; (4) telephone numbers; (5) fax numbers; (6) e-mail addresses; (7) social security
numbers; (8) medical record numbers; (9) health plan beneficiary numbers; (10) account numbers;
(11) certificate or license numbers; (12) vehicle identifiers and license plate numbers; (13) device
identifiers and serial numbers; (14) URLs; (15) IP addresses; (16) biometric identifiers; (17) full-face
photographs and any comparable images; (18) any other unique, identifying characteristic or code,
except as permitted for reidentification in the Privacy Rule. A common approach used within
research and program evaluation is to make sure that any other personal identifiers beyond the
HIPAA “safe harbor” are not linked in any way to the health information that is distributed outside
the health care organization.
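As a hypothetical illustration of the "safe harbor" approach described above, the sketch below strips or generalizes identifiers in a client record before the data leave the organization. The field names, record layout, and helper function are all invented for illustration; real records will differ, and any de-identification plan should be reviewed with your organization's privacy officer.

```python
# Hypothetical sketch of HIPAA "safe harbor" de-identification.
# Field names below are invented; they only stand in for the kinds of
# identifiers listed in the Privacy Rule.

# Direct identifiers to drop entirely (a subset of the 18 safe-harbor items)
DIRECT_IDENTIFIERS = {
    "name", "street_address", "phone", "fax", "email", "ssn",
    "medical_record_number", "health_plan_id", "account_number",
    "license_number", "vehicle_id", "device_serial", "url", "ip_address",
}

def deidentify(record):
    """Return a copy of `record` with direct identifiers removed and
    dates, ZIP codes, and extreme ages generalized."""
    clean = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

    # Keep only the year from a date of birth
    if "birth_date" in clean:                # e.g. "1998-04-17"
        clean["birth_year"] = clean.pop("birth_date")[:4]

    # Keep only the initial 3 digits of the ZIP code (this assumes the
    # combined geographic unit contains more than 20,000 people)
    if "zip" in clean:
        clean["zip3"] = clean.pop("zip")[:3]

    # Ages over 89 must be collapsed into a single category
    if isinstance(clean.get("age"), int) and clean["age"] > 89:
        clean["age"] = "90+"

    return clean

record = {
    "name": "J. Doe", "zip": "60614", "birth_date": "1998-04-17",
    "age": 27, "ssn": "000-00-0000", "well_child_visits": 2,
}
print(deidentify(record))  # keeps only age, well_child_visits, birth_year, zip3
```

Note that this is only a sketch of the mechanics; the Privacy Rule's full list of 18 identifiers (including biometric identifiers, photographs, and any other unique code) still applies.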
In addition, states often have their own laws governing privacy and confidentiality. All health care
organizations that are covered by HIPAA are required to have a privacy officer. You should consult
with your health care organization’s privacy officer on your program evaluation plan. For specific
guidelines on HIPAA and its use related to program evaluation and research, you can contact the
US Department of Health and Human Services Office of Civil Rights.1
Comply with your organization’s institutional review board guidelines, if applicable. Universities,
research institutions, and some major health services organizations have an institutional review board
(IRB) in place to review all research and evaluation connected to that institution to ensure it complies
with guidelines and standards. The federal government provides standards that are interpreted and
applied by the local IRB in a manner consistent with institutional standards. In general, if you are
reviewed by an IRB, the members will look for:
• Risk/benefit analysis: What are the risks to participants, and is the benefit generated from the
evaluation greater than this risk?
• Selection of subjects: Are appropriate groups of participants targeted to answer the question proposed
by the evaluation?
• Informed consent: Are participants adequately informed about their rights and the risks of
participation as part of the ongoing consent process?
• Privacy and confidentiality: Are there adequate safeguards to protect participants’ privacy in all
data collection, analysis, and presentation?
• Additional safeguards: Are potential subjects protected from coercion or undue influence to
participate, and are procedures in place to address potential harm to participants should it occur?
The IRB will also monitor the evaluation project as it proceeds, with particular attention to the
collection, storage, and use of individual-level data. Some types of evaluation and research are
considered exempt under the federal regulations, but this determination needs to be made by the
IRB rather than by those conducting the evaluation or research project.
There are also some standards that apply to evaluation that do not necessarily apply to other
endeavors using the scientific method.
1 The Web site address is http://www.hhs.gov/ocr/privacy/hipaa/understanding/special/research/index.html.
Limit your evaluation to information that is useful. There are many interesting questions that may
occur to you or other people around you as you develop your data collection tools and strategies.
However, evaluation is intended to produce information that will be used for decision-making or
other program-related purposes. Trying to address too many “interesting” questions may burden par-
ticipants and staff, bog down your data collection, and distract you from its real purpose.
Know what is feasible for your organization and community. Unlike much basic research, evalua-
tion is constrained by the realities of the program, its participants, and the community. Plan evalua-
tion activities that can be fit into the structure of your program, and use data collection strategies that
are realistic for your target population.
Keep your evaluation simple and focused. No matter the scope of the need you are targeting, you
should limit your evaluation to the simplest possible measures of outcome. The data you plan to col-
lect for an evaluation should also be related both to the services you will provide and the need you
intend to address. Once you get the hang of planning your evaluation, it is tempting to measure every
impact your project might have; however, you need to keep your efforts focused on the outcome or
changes that matter. There are endless things you could measure that might be related to your pro-
gram. But your program is designed to address a specific need in a specific population, and your eval-
uation should assess your progress specifically.
You do not need to reinvent the wheel. Use available assistance as necessary to plan and execute
your evaluation. Whether through the experience of other programs, Web sites, or publications, using
the knowledge of the field is invaluable and cost effective. It is also a good way to help conceptualize
an evaluation plan and identify the measures that are best for your program. The experience of other
programs is also widely documented and available, so you need not start from scratch in planning
your program or your evaluation. Check the Appendix for a list of database resources that may help
you identify programs already in existence.
Who Should Be Involved?
It is critical that those individuals who will collect the information (usually staff) participate in determining what data are collected and how. They need to understand the questions the data are intended to answer in order to gather them consistently over time.
External evaluators. An outside evaluator can be helpful throughout the process of evaluation plan-
ning and implementation. Some projects discover that they need assistance with particular evaluation
tasks, such as data collection and analysis. Regardless of whether you have this resource, the evaluation
plan should always belong to and be driven by your project. The project has questions about process
and impact that need to be answered, not the evaluator. The project has goals that need to be met, and
those must take precedence over any goals the evaluator might bring to the table.
Technology and Program Evaluation
1 Involve technology as early as possible in the planning process. If you have technology staff,
make sure they are involved. If not, try to find a volunteer or local college student who can help
you develop simple technology tools, such as spreadsheets or a basic database. Allow them to
contribute their ideas and expertise as you determine how to measure impact, collect data, or
even connect computers across different agencies and locations. The technologically inclined
have a lot of resources at their disposal, but they can only use them to support your evaluation if
they are at the table. Taking a completed evaluation plan and trying to insert technological solu-
tions is, generally, not as successful.
2 Use technology, rather than getting used up by it. While technology has an unfortunately
deserved reputation for being expensive, complicated, and ever changing, there are 4 simple rules
for ensuring that hardware and software effectively support your evaluation over time. Be sure you:
• Do not use more, or more expensive, technology than you need.
• Use technology that will last at least 5 years and that is compatible with your current
system(s).
• Have adequate technical support for the level of technology you choose. For example, if you
don’t have anyone on staff who knows how to use a given technology, stick with spreadsheets
for tracking your data because they are easier to learn.
• Use the technology properly. There is nothing worse than preparing to run analysis on a
year’s worth of data and discovering that your data were not properly entered, so no analysis
is possible!
When Should Evaluation Begin?
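The "basic database" suggested above can start very small. As a sketch only (the table and field names are invented, not a prescribed design), Python's built-in sqlite3 module can hold a single service-contact table and answer typical process-evaluation questions such as number served or contacts by service type:

```python
import sqlite3

# Minimal sketch of a service-tracking table for process evaluation.
# Adapt the table and column names to your own program's services.
conn = sqlite3.connect(":memory:")   # use a filename to keep data on disk
conn.execute("""
    CREATE TABLE service_contacts (
        client_id   TEXT NOT NULL,
        service     TEXT NOT NULL,   -- e.g. 'referral', 'training'
        provided_on TEXT NOT NULL    -- ISO date, e.g. '2013-04-02'
    )
""")

# Record contacts as they happen
conn.executemany(
    "INSERT INTO service_contacts VALUES (?, ?, ?)",
    [("c001", "referral", "2013-04-02"),
     ("c002", "training", "2013-04-05"),
     ("c001", "referral", "2013-05-01")],
)

# Typical process-evaluation questions: number served, contacts by service
unique_clients = conn.execute(
    "SELECT COUNT(DISTINCT client_id) FROM service_contacts").fetchone()[0]
by_service = conn.execute(
    "SELECT service, COUNT(*) FROM service_contacts"
    " GROUP BY service ORDER BY service").fetchall()
print(unique_clients)   # 2
print(by_service)       # [('referral', 2), ('training', 1)]
```

A spreadsheet with the same three columns would answer the same questions; the point is simply to decide the columns before services begin, so every contact is recorded the same way.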
While it can occur after implementation has begun, the process of planning an evaluation is a
tremendous tool for fully developing your program’s implementation plan, especially the data tracking
system(s). Your evaluation plan will encourage you to think about implementation questions, such as:
• What services should you provide to address the targeted need?
• What will you measure to determine if those services are being adequately provided (process) and if
they have the impact you anticipated (outcome)?
The sooner you know what you want to measure, the better able you are to incorporate it into your
daily documentation and program operations. But if your program has already started, never fear!
Evaluation can be incorporated at any point; however, if you have planned your evaluation prior to
project implementation, you will be ready to start collecting the information you need to answer your
evaluation questions from day 1. This is usually much easier than trying to determine whether you
have been collecting the right information after 6 months of services. If you do not believe me, believe
the poor folks from the following example.
WHAT NOT TO DO
All early childhood educators in a set of community centers administered the Child Assessment Profile to every student every year. A community collaborative eventually collected 10 years' worth of data on students' growth from ages 3 to 5 years. Unfortunately, some of the data were stored in an old software program no one knew how to use. Others were still in paper format, with no funds available for database development or data entry. Still others were not properly labeled with the child's ID or a date. Because no one planned the evaluation as the program was planned and tweaked each year, 10 years of potentially powerful data still sit unused in an old closet.
Don't let this happen to you!
Case Study Introduction: Meet Sarah and the Prevention First Program
We will use the following example throughout this guide to illustrate each step in the evaluation cycle
as we review it.
CASE STUDY
Sarah is the program director of the Prevention First Program, located at a large multiagency collaborative in an immigrant community. The community experiences high mobility, is very low income, and many residents speak limited or no English. Residents have come to this country from a variety of cultures. Sarah's program intends to bring together the variety of resources and expertise present in the collaborative to try to facilitate the use of preventive health care by this community and to increase public awareness of the many free, non-emergency health and dental services available in the community.
In the following pages, Sarah and her program will go through the same process you will go
through to specify your program and develop your evaluation plan (Figure 2).2
Figure 2. The evaluation cycle: 1 Planning → 2 Implementing → 3 Reviewing → 4 Adjusting
2 This case study represents a hybrid of common experiences among many projects, but it is fictional. Any similarity to an actual project is purely coincidental.
1 What is the problem, concern, or gap that inspired someone to write a grant?
Describe it as specifically as possible. What issue do you want to address? How historic or widespread
is it in your community? What do you know about the issue from data? What would other stakehold-
ers say? Staff? Clients?
CASE STUDY
Sarah revealed some information about what she wanted to change, improve, or impact in her community: recent immigrants needed to learn about the health resources available to them. A recent survey by a local family foundation indicated that fewer than 15% of those eligible were using no-cost health care options. A coalition has come together to meet these needs in multiple immigrant populations. Clients at one agency have reported that once they learned about the resources and became comfortable accessing them, they were able to convince others to do the same.
Sarah identified a few facts about the population she wanted to target with the Prevention First Program.
She has added some more details below.
• A low-income community
• Recent immigrants, including South American Spanish-speaking, African French- and Portuguese-
speaking, and Polish
• Living primarily within the same 4 square mile area, the First Stop neighborhood
• Mostly employed in temporary and/or part-time labor positions
• Mostly from cultures without a concept of preventive medicine
• Mostly from countries without health care support systems
WORKSPACE
What people, communities, institutions, or other audience does your program target?
You have an incredible amount of knowledge about the population for whom you want to make
change and the problem you want to address in that population or community. Even as you plan your
program services, you can begin to use this vision to develop an evaluation plan for your program.
You know what you want to achieve, but how will you get there? The next steps in developing your evaluation plan will lead you to clear statements of what you are hoping to achieve.

JARGON ALERT
Goal
A goal is a high-level, broad statement that articulates what a program would like to accomplish. A good goal statement should be:
• General enough to be a single sentence
• Specific enough to indicate the problem and population targeted

What do you think are the goals of your program? What are the big things that you would most like to accomplish? What achievements would mean you definitely have succeeded in addressing the problem that you identified previously?
Programs can have one goal or a few overarching goals; however,
you should choose your goals carefully! Trying to accomplish
many goals, rather than targeting services and resources, threatens to reduce both the impact and
sustainability of your program.
These initial goal statements are likely to closely represent your vision for the program. But what
about all the other people who will implement your program? The best goals, those that are the rich-
est representation of a common vision for the program, are developed in collaboration with staff,
other stakeholders (such as collaborating organizations), and current and former clients. This may take
a few meetings and some negotiation about what various folks expect that your project will accom-
plish. But the discussion and process of goal development will build a common vision about what the
project strives to achieve.
CASE STUDY
Sarah has developed a coalition of preventive health care providers and health educators in the community she wishes to target, as well as a few matriarchs of the more seasoned and stable immigrant families. When she presented the goals she identified for the program, the group agreed with the spirit of both goals. Some of the health providers wanted to develop the second goal a little further. They wanted the goal of the project to be not just to educate, but actually to impact the use of preventive health services. Some of the clients questioned why the project only wanted to increase the use of no-cost health services. Many of their friends and neighbors were undocumented immigrants and would not qualify for (or risk!) using a government program for health care. Should the goal, they asked, be to increase use of any health care options, including those provided by employers, churches, or nonprofit health networks? After a lengthy discussion with this advisory group, Sarah discussed the newly proposed goals with her board.
By working with others in her community, Sarah was able to refine her goals and be more specific about
the program’s target population.
1. Immigrant families will understand the importance of prevention.
2. Immigrant families will use preventive health services.
Even if you complete a comprehensive process such as this, your goals need not be carved in stone, set
up on a shelf to collect dust for the life of your program. As you’ll recall from Figure 1, they will likely
evolve as you learn more about the problem you want to address or the population you want to sup-
port. They may evolve and change as the stakeholders, staff, or knowledge within the field changes.
They may also change as the problem you want to address is impacted by your efforts.
To achieve each goal, you will need to be able to articulate objectives, or the steps you will take in your efforts to achieve that goal. Often a single goal will have several associated objectives, but each of the objectives should be SMART (specific, measurable, achievable, realistic [for the program], and time related).

JARGON ALERT
Objective
An objective is a measurable step toward the achievement of a goal. Good objectives specify WHO will do WHAT by WHEN.
CASE STUDY
For Sarah, we know that one major goal is to have her target population use preventive health services. Sarah and her team were able to state specific objectives related to this goal.
1. Within the first 6 months of the project, we will conduct a focus group with immigrant parents to explore possible barriers to the use of prevention services.
2. By the end of year 1, we will have made presentations to staff of at least 4 agencies serving immigrant families to promote preventive health services and encourage referrals.
3. By the end of year 1, participating immigrant families will schedule and complete an increased number of well-child visits over baseline.
Notice that Sarah identified both process objectives (objectives 1 and 2) and an outcome objective
(objective 3). Process objectives are concerned with the implementation of the program, and they are
usually assessed as a part of process evaluation. Outcome objectives are concerned with what happens
to program participants, and they are usually assessed through outcome evaluation. It is important to
keep in mind that developing process and outcome objectives is not an exact science. Some outcomes,
like objective 3 in the case study example above, could be considered a process objective in another
project, although in this case it is clearly a direct outcome for the goal of increasing use of preventive
health services. Which category your objectives fall under is not as important as making sure they
make sense for your program.
Often in establishing objectives, a program will set targets in numbers or percentages for what it hopes
to achieve. If you set a target, it should be based on the best information you have available so that it
is a meaningful and realistic number. Factors to consider include the findings of your community
needs assessment, the capacity of your organization, the history of the problem in your community,
and use patterns of the target population.
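To make the arithmetic of a numeric target concrete, here is a hypothetical sketch of checking an outcome objective such as "increase completed well-child visits 20% over baseline." The figures and the helper function are invented for illustration, not drawn from the case study:

```python
# Hypothetical sketch: comparing an observed count against a numeric
# target expressed as a proportional increase over baseline.
# All figures below are invented.

def met_target(baseline, observed, required_increase=0.20):
    """True if `observed` exceeds `baseline` by at least the
    required proportional increase (default 20%)."""
    return observed >= baseline * (1 + required_increase)

baseline_visits = 150  # completed well-child visits the year before the program
year1_visits = 188     # completed visits in program year 1

increase = (year1_visits - baseline_visits) / baseline_visits
print(f"Change over baseline: {increase:.0%}")    # 25%
print(met_target(baseline_visits, year1_visits))  # True: 188 >= 180
```

Whatever form the calculation takes, the baseline itself has to come from real data (your needs assessment or prior service records), or the target will be arbitrary.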
Can you specify 2 or 3 objectives for 1 of the goals you articulated earlier?

WORKSPACE
Goal: ________________________________________________________________________
Objective 1: __________________________________________________________________
Objective 2: __________________________________________________________________
Objective 3: __________________________________________________________________
Refine your objectives if necessary to be sure that they are SMART. Repeat this process for each goal
you have articulated, and you will have laid the groundwork for both implementing your program
and conducting a quality evaluation!
Think back to the goals you brainstormed—the issues that concerned someone enough to sit down
and write a grant about them. How would you know you had successfully affected or improved these
issues? What would be different? How could you notice or measure that difference in your target
population or community?
CASE STUDY
For Sarah we know that one major goal is to increase the use of preventive health services in her target population. What would she expect to be an observable difference in a population of recent immigrants who had increased their understanding of the importance of prevention? Would she hope that:
• More people in the target communities would report visiting health or dental clinics regularly?
• Clinics would report an increase in wellness visits and/or a decrease in illness-motivated visits?
• Community health workers would report an increased demand for preventive health resources in community clinics?
• School absence due to illness would decrease?
• Incidence of communicable diseases would decrease?
• Prenatal care use would increase?
• Adult workdays missed due to illness would decrease?
As you can see, the same broad goal could have a wide range of outcomes in a given community.
Without outcomes to further define how the program’s successes will be measured, staff, board members, and clients may not share an understanding of what the program truly intends to accomplish. In the absence of specific outcomes that reflect programmatic expectations, it will be difficult for stakeholders with different visions of the same goal to work together effectively. However, if stakeholders have participated in specifying objectives and have included outcome objectives, you may have already done much of the work of identifying outcomes. At least some outcomes, though not necessarily all, should follow naturally from outcome objectives. Remember that outcome objectives (increase in well-child visits) are most strongly related to the services provided by your program, and outcomes (increase in immunization rates in the target population) more broadly reflect changes that occur in your target population.

JARGON ALERT
Outcomes
Outcomes are measurable changes that occur in your target community, population, or coalition beyond the point of service or intervention. They demonstrate that you have achieved a goal or part of a goal, and define the logical and desired result of the services your program provides. Any impact you hope to observe and measure as a result of your project should be stated as an outcome.
Limit the number of outcomes. While you could have more than one outcome for each of your goals
(or outcome objectives), it is perfectly acceptable to have only one way in which you will measure success in achieving that goal. Although it is always tempting to try, you cannot measure everything. Based on
experience with the Healthy Tomorrows Partnership for Children Program and other community-based
projects, we recommend that you plan to be accountable for a maximum of 3 outcomes. While there
may be many more changes you hope to observe as a result of your program, limiting your outcomes
serves 2 important purposes.
1. Focus. It ensures that all stakeholders are following the same road map, and that all resources are focused on achieving the changes that matter most for your program.
2. Data. It is difficult to effectively collect data on many outcomes, especially in situations where your work occurs in multiple sites or across more than one organization.
Use standard outcome measures. Luckily you don’t have to define outcomes on your own. There are
many lists of common goals and appropriate outcomes available on a variety of health-related topics.
For example, programs that promote access to health care for children might use standards such as
immunization rates or number of emergency department visits as outcomes for their interventions.
Programs that work with pregnant and parenting teenagers usually use avoiding a repeat pregnancy as
one of the outcomes for their program participants.
In many cases, however, your program will be unique, and it won’t be necessary or even possible to
replicate an existing outcome. Then you will need to work closely with your team to define outcomes
that are appropriate and realistic for your intervention and your target population. In those cases, you
will want to be sure to talk to others in your community and those with relevant expertise. You may
even want to convene a discussion or focus group with representatives of your target population, to
work together to identify desired outcomes. This is also an area in which you might consider working
with an experienced evaluator to develop the best possible outcomes to define success for your project.
CASE STUDY
Sarah had trouble narrowing down the many outcomes we listed previously that could demonstrate that Prevention First has had an impact on the use of preventive health care services in her target population. While she thought all were important, she realized that, with limited resources, she couldn’t provide enough services to hope for all those outcomes or collect enough data to document them all! She decided to consult with her evaluator to see if there were some best practice, tried-and-true outcomes related to preventive health care use she might replicate. After identifying some possible outcomes, and sharing those with various staff and stakeholder groups, the Prevention First outcomes looked like:
Goal: Immigrant families will use preventive health services.
Outcomes:
1. Immunization rates will increase among children in the target population.
2. The number of workdays or school days missed by community members due to illness will decrease.
After consulting with her evaluator and with staff, Sarah determined that she could best document the
impact of Prevention First by using 2 outcomes to determine if use of preventive health care services by
members of the target population had increased. She wanted to use a measure of community health status (immunization rates) and a measure of social and economic participation by her target population (absence from work or school) to get a full picture.
Program Evolution. This is a good time to mention the process of program evolution. It is OK for
your goals and objectives to evolve. Naming the measurable changes (outcomes) you expect from
your program will help you assess whether you have identified the right goals and objectives, given
your resource set and your target population. Discussing ideas about outcomes with your stakehold-
ers and staff should naturally lead to a close examination of realistic and appropriate goals and objec-
tives for your project. Close examination of goals, outcomes, and activities by as many stakeholders
as possible early in the life of your program will position your program to better achieve its overall
goals in the long run.
The Logic Model: Putting the Pieces of Your Evaluation Plan Together
Uses. Your logic model provides a snapshot of your program, illustrating the sequence of steps that
connect the resources available to you with your intended results. A logic model helps you and other
stakeholders keep your focus on what is most important for the program. Done well, it serves as a
single-page summary of your program that is easily shared with new staff, boards, and funders. The
logic model is a tool that is useful throughout the life of your program, because it helps facilitate pro-
gram planning, implementation, and evaluation.
The development of a logic model is in itself often a valuable process. It provides programs with an
important opportunity to build consensus by planning together. Because you lay out what you have to
work with, what you plan to do, and what you are trying to accomplish, putting together your logic
model forces you to think through the logical progression of your program and to plan for its imple-
mentation. It also forces you to recognize gaps or problems in your planning and make adjustments.
Format. Though there are many formats available, a standard logic model contains 5 general pieces of
information: who you target, what you input as resources into the project, what services you provide
(your activities), what outputs you expect, and what outcomes you hope for. They are arranged in the
sequence of logic—not necessarily in the order you develop them!
[Logic model diagram: Target Population → Inputs → Activities → Outputs → Outcomes]
Logic Model Example. Our logic model headers appear below. The arrows indicate that there is a logical
sequence to the columns. The target population and need(s) drive the model that follows. Resources
or inputs are used to provide services or activities that address that population’s need(s). Just based on
providing those activities or services, you expect to generate certain outputs. In time, you hope that your
services will impact the target population’s need(s) as measured by your proposed outcomes.
Target Population: the characteristics of people or communities you work with and the needs they present. Examples: age, gender, socioeconomic status, ethnicity, language, geographical location, low health care use, high cancer incidence, lack of mental health information, etc.
Inputs: the resources required for this program to operate. Examples: money, staff, volunteers, facilities, etc.
Activities: strategies you use or services you provide to try to achieve your goal. Examples: provide training, counseling, education, screenings, referrals; develop materials, etc.
Outputs: basic data on program participation. Examples: number of participants attending a training, number of counseling sessions, etc.
Outcomes: desired changes in the target population as a result of program activities. Examples: changes in knowledge, attitude, behavior, health status, health care use, incidence, prevalence, etc.
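If you keep your logic model in electronic form, the 5 components above can be sketched as a simple data structure. This is only an illustration of the sequence, not a prescribed format; the class and field names (and Sarah-style sample entries) are our own:

```python
from dataclasses import dataclass

# A minimal sketch of the 5 standard logic model components.
# Field names are illustrative, not a required convention.
@dataclass
class LogicModel:
    target_population: list  # who you serve and the needs they present
    inputs: list             # resources required for the program to operate
    activities: list         # services/strategies you provide
    outputs: list            # basic data on program participation
    outcomes: list           # desired changes in the target population

    def summary(self) -> str:
        # Present the components in their logical sequence.
        order = ["target_population", "inputs", "activities",
                 "outputs", "outcomes"]
        return " -> ".join(order)

# Hypothetical entries loosely modeled on the Prevention First case study.
model = LogicModel(
    target_population=["immigrant families with low preventive care use"],
    inputs=["coalition members", "2 staff interns", "translation capability"],
    activities=["enrollment fairs every 6 months", "prevention education sessions"],
    outputs=["number of families enrolled", "number of sessions held"],
    outcomes=["immunization rates increase", "missed work/school days decrease"],
)
print(model.summary())
```

Listing the components this way makes the left-to-right logic explicit and gives stakeholders a single shared record they can revise as the program evolves.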
Sometimes people find it a little confusing that the words “goal” and “objective” do not appear in the
logic model. Don’t be alarmed! The framework provided by your goals and objectives is still there. The
logic model can be seen as an expression of your goals and objectives, usually with more detail about
how you will accomplish them. You can think of your program’s overarching goal or goals as the
framework for the entire logic model. Each step in the model is related to reaching this overall goal.
Your program’s objectives are usually expressed in the logic model in the last 3 steps: activities, out-
puts, and outcomes. In general, process objectives tend to fall into activities and outputs (things you
count), while outcome objectives often fall in the outcomes section. However, as mentioned previously, what matters most is not the placement of your objectives but that the logic model makes logical sense for your program.
Inputs may constrain what you are able to do, and they are important at this stage because they set up
your program to be realistic about what activities, outputs, and outcomes are achievable. Though it
seems we all aspire to accomplish more than time and funding will allow, you will not help your pro-
gram succeed by setting up a model that doesn’t have adequate inputs to rely on. Rather, you will be
setting it up to fail. Some examples of specific types of inputs are detailed in the logic model for our
case study below.
CASE STUDY
Sarah was able to identify the resources available to the Prevention First Program.
• People: coalition members, director, 2 staff interns, volunteer health educators
• Funding from 3 sources: a private foundation, a local funder, and the state
• Computers and software already in place
• A prevention education curriculum
• Prevention media
• Verbal and written translation capability
The process you follow to try to achieve a goal with your target population is made up of a series of
activities, services, or interventions. These could be the services and supports you provide to individu-
als or selected subgroups of your target population, such as medical examinations, well-child classes,
or access to health care coverage programs. They could also be efforts you target to a broader commu-
nity, such as education campaigns or coalition building to raise awareness and advocate for the needs
of your target population. Your activities, services, and interventions should be provided in some logi-
cal order that will help you achieve your goals and objectives.
It is also important to develop your program’s activities in collaboration with those who will provide
the services and/or have personal knowledge of the population you wish to target. For example, con-
sider again Sarah’s goals: Immigrant families will understand the importance of prevention and they
will use preventive health services.
Sarah consulted the same group she worked with to develop her program’s goals and objectives. They
began by brainstorming an ideal list of what services, education, collaboration, and access they would like
to provide or engage in to help achieve their overarching goal. Then they identified those they thought
would be most effective and feasible.
CASE STUDY
After revising the strategies and sequence of activities for Prevention First with her planning group, and making sure there were sufficient inputs to support those strategies, Sarah shared the following activities:
• Community interventions will be used to sign clients up for health care services. Coalition members will (1) hold fairs at schools, churches, community centers, and block parties and (2) provide resources and
education to community leaders to promote the sign-up of those who remain unenrolled. Fairs will be
conducted every 6 months in an effort to reach those who lapse and need to reenroll, as well as those
entering the community.
• Prevention education sessions will be provided at schools, churches, and community centers, and will be
facilitated by coalition members and community leaders. They will provide written reminder materials
and schedules and will be language-appropriate. The coalition will also undertake a prevention educa-
tion media campaign in public transportation, community businesses, and religious bulletins providing a
different language-appropriate reminder each month.
• Work with the coalition to provide preventive health services regularly through nontraditional outlets, so that targeted clients do not have to go to a clinic: They can have their blood pressure taken at the gas station on Saturdays or have their glucose levels checked at the panadería.
What are some activities that will help your project achieve its goals?
WORKSPACE
Consider this: 88% of pregnant teens in 6 area high schools attended at least 2 well-baby education
sessions. We know who attended and how often, but we have no information on how well they
learned or applied the information. Assuming that the goal of the well-baby education program was to
affect the parenting behavior of the teens, program attendance is output information. It tells us that
they “got” the program, but not what difference it might be making.
Here’s an example of the difference between outputs and outcomes for a health care access program:
If your goal is for residents of a low-income community to use free health care rather than emergency
medical services, signing them up for health coverage is an output. It is an important piece of informa-
tion that documents your program is doing what you said it would do; you have achieved the objective
of signing them up for benefits. Whether the outcome is achieved—to change their health care use
patterns—remains to be seen. Just because they signed up doesn’t mean they will use the resource.
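The output/outcome distinction in this example can be made concrete with a short sketch. The figures below are hypothetical, chosen only to show that the output (a count of people signed up) and the outcome (a later change in use patterns) are measured separately:

```python
# Hypothetical figures for a health care access program.
enrolled = 340            # OUTPUT: families signed up for health coverage

# OUTCOME: change in health care use patterns, observed later.
# e.g., emergency department visits per 1,000 residents, before and after
ed_visits_before = 52
ed_visits_after = 41

output_report = f"{enrolled} families enrolled"
outcome_change = ed_visits_before - ed_visits_after  # drop of 11 per 1,000

# The output alone cannot tell you whether the outcome occurred:
# enrollment could be high while ED use stays flat.
print(output_report, outcome_change)
```

Tracking both numbers side by side keeps stakeholders from mistaking strong participation counts for evidence of changed behavior.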
CASE STUDY
After carefully reviewing the program’s activities, Sarah was able to describe the outputs she would anticipate as a result of the services.
What are some outputs you expect to occur from your project? What information (numbers,
percentages, other information) would tell you your program was doing what you said it
would do?
WORKSPACE
Outcomes are entered in the last column of the logic model. Once they are there, you can see the
logic of your program from beginning to end. This is a good opportunity to check that the sequence
makes sense. Do your outcomes follow naturally from the inputs, activities, and outputs you specified?
Sometimes this step feels a little confusing to people because they have multiple goals with different
outcomes, and they are tempted to build a separate logic model for each goal. While this is possible,
we would encourage you to develop a single logic model so that you and your potential audiences are
able to understand the logic of your entire program with one picture. Presumably, if you have multiple
goals, they are all related, and they may even share outcomes.
Another point of confusion may be that programs often identify short- and long-term outcomes. This
is not a particular problem for the logic model. It can be helpful to cluster your outcomes into short
and long term, especially if you expect to observe them at different points in the life of the program.
Remember, the logic model is a tool to help you and others keep track of what you are doing. The
better it represents your program, the more useful it will be.
[Logic model diagram: Target Population → Inputs → Activities → Outputs → Outcomes]
Target Population: characteristics of people or communities you work with and the needs they present
Inputs: resources available to support program operation
Activities: strategies you use or services you provide to try and achieve your goal
Outputs: basic data on program participation
Outcomes: desired changes in the target population as a result of program activities
In this section we have presented the logic model as the organizing tool for otherwise disconnected
bits of information that you create in developing your program and your program’s evaluation. While
we worked from left to right across the columns, some people find it easiest to work from right to
left—beginning with the end in mind. Here is the best thing about a logic model: It can grow and
change with your program, and with your understanding of your program.
CASE STUDY
Sarah proudly took her completed logic model to present to her board the week after completing the first draft with staff and stakeholder input.
The board members asked her to determine which ideas have shown promising results in the past for similar programs, so they are sure Prevention First is engaged in the right activities to meet its goals.
While Sarah has outlined a pretty specific process for her program, like traveling, there are many ways to reach the same destination. How do you decide which way is best for your program? One way is to consult others who have made the journey. You do not have to develop your program from scratch. While some of
Sarah’s very specific ideas about how to implement Prevention First came from critical community input, Sarah
would also benefit from reviewing the process and results of other prevention and health access programs in
her community and beyond. Good programs investigate other models, whether through professional journals,
professional contacts, or other resources. For example, the AAP has developed a searchable project database
(www.aap.org/commpeds/grantsdatabase) that houses descriptions of past and current projects funded
by Healthy Tomorrows and other funding programs. (See additional examples in the Appendix.) It is always
beneficial to invest some time and investigate what program examples may be relevant to your project so
that you have the opportunity to learn from—and avoid—the mistakes of others.
Conclusion
Glossary
Activities: Day-to-day ways in which people and material resources are used to achieve your goals
(may also be called services, tasks, or strategies).
Goal: A high-level, broad statement that articulates what a program would like to accomplish.
Incidence: The number of cases of disease having their onset during a prescribed period. It is often
expressed as a rate (eg, the incidence of measles per 1,000 children 5 to 15 years of age during a speci-
fied year). Incidence is a measure of morbidity or other events that occur within a specified period.
Input: The resources (human, financial, and other) that you have to put into your program to be able
to provide the services that will allow you to reach your desired goal.
Logic Model: A visual representation of your program, illustrating the relationships among the
resources available to you, what you plan to do with them, and your intended results.
Outcome: Measurable, intended results (short or long term) of your activities, strategies, and/or
processes. May also be called impact, result, effect, or change resulting from your project.
Outcome Evaluation: A plan to measure what difference your project is making for the target
population.
Output: Basic information on participation or completion resulting from activities or services your
project provides; used to measure or track the implementation process.
Prevalence: The number of cases of a disease, infected persons, or persons with some other attribute
present during a particular time. It is often expressed as a rate (eg, the prevalence of diabetes per 1,000
persons during a year).
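The rate calculations in the incidence and prevalence definitions above follow the same arithmetic. A brief sketch with hypothetical counts:

```python
def rate_per_1000(cases: int, population: int) -> float:
    """Express a count of cases as a rate per 1,000 population."""
    return 1000 * cases / population

# Incidence: new cases with onset during a prescribed period.
# Hypothetical: 45 new measles cases among 9,000 children aged 5-15 in a year.
incidence = rate_per_1000(45, 9000)    # 5.0 per 1,000 per year

# Prevalence: all existing cases present during a particular time.
# Hypothetical: 120 persons with diabetes in a community of 8,000.
prevalence = rate_per_1000(120, 8000)  # 15.0 per 1,000
print(incidence, prevalence)
```

The difference lies entirely in which cases you count (new onsets vs all existing), not in how the rate itself is computed.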
Process Evaluation: A plan to measure whether your project is being implemented as you intended,
including who is participating and what services are being delivered.
Please note: Listing of resources does not imply an endorsement by the American Academy of Pediatrics
(AAP). The AAP is not responsible for the content of the resources mentioned in this publication. Web site
addresses are as current as possible, but may change at any time.
How to Evaluate
1. Basic Guide to Outcomes-Based Evaluation for Nonprofit Organizations with Very Limited
Resources (www.mapnp.org/library/evaluatn/outcomes.htm)
Use this short, simple, step-by-step guide if you need a good tool for introducing the concept to
others. It does not, however, address process evaluation.
2. Thinking About How to Evaluate Your Program? These Strategies Will Get You Started
(http://pareonline.net/getvn.asp?v=9&n=8)
This longer, but still simple, guide includes a discussion of process evaluation. It includes links to
tools, checklists, measures, and other goodies.
3. The Community Toolbox (http://ctb.ku.edu/index.jsp)
This site offers a wealth of tools and guides, including samples of different logic model formats
and checklists for a high-quality evaluation. You can learn an evaluation skill, plan your evalua-
tion, or network with resources.
4. Evaluating Community-based Child Health Promotion Programs: A Snapshot of Strategies
and Methods (http://www.nashp.org/node/731)
This report was developed by the National Academy for State Health Policy in partnership
with Nemours Health and Prevention Services. The report provides the practical experience
and knowledge of program administrators, evaluators, and researchers regarding what works
and what doesn’t when evaluating community-based initiatives that focus on children’s health
promotion and disease prevention. It provides a snapshot of 7 projects nationwide that have a
community-based component and the lessons learned from their evaluation activities. The
report includes a discussion of evaluation design, process and partnerships, outcomes, and dis-
semination.
Logic Models
5. University of Florida Extension Service: Using Logic Models for Program Development
(http://edis.ifas.ufl.edu/WC041)
This is a short narrative explaining the concept of a logic model without getting into the details of
creating one. This is a great handout for staff or board members who need more background
information before participating in logic model development, but don’t need this full guidebook.
Evaluation Training
14. American Evaluation Association (AEA): Training and Professional Development
(http://www.eval.org/)
The AEA offers a variety of training annually, provided throughout the country. Training is avail-
able for beginners and more advanced professionals conducting evaluation.
For additional evaluation resources and for updated Web site links, please visit the Community Pediatrics
Evaluation Resources and Tools Web page at http://www.aap.org/commpeds/grantsdatabase/.