The Components of an Evaluation Plan

Each evaluation is different. You have different stakeholders, different topics, different timelines, and different end goals. Some evaluation plans will be simple, and others more complex. When developing your evaluation plan, you can take a mix-and-match approach to its composition.

Here are some common components of evaluation plans. Your plan might need just a few, or it may need all of them.

Background

What is the nature of the initiative being evaluated? When did it or will it begin? Who is involved? Where is it happening? Why is it significant? What other evaluation work has already been done? This section doesn’t need to be extensive, because your evaluation plan is not the primary source of information about the initiative. But it does need to provide essential information that will inform the evaluation plan. We suggest no more than one page of your evaluation plan should be devoted to background.


Logic Model or Theory of Change

The initiative may already have a logic model or theory of change created. Great! Copy and paste. That logic model will be important in demonstrating how the initiative expects to achieve its intended outcomes. If you need to create one, keep some design principles in mind. If you ever have questions about whether an evaluation question is relevant, look back at that logic model. It’s a great source of ideas for which outcomes to include in the evaluation plan. But remember this rule about logic models: just because something is included in the logic model doesn’t mean you have to measure it. You probably don’t have the time or resources to include every output and outcome in your evaluation plan.


Situation and Goal Statements

These statements often accompany a logic model. They define the current situation and what the ideal situation will be. Situation statements, sometimes called problem statements, identify the nature of the need and who is experiencing it. Goal statements describe how the future will be better for those people when that need is met. Note that these are statements and not paragraphs – be as concise as you can and limit the text to one to three sentences.


Acknowledgment of Contributors

Some people worked hard on this plan – give them some credit! If your evaluation plan spans a long timeframe, or if there are staff changes, it can be helpful for future readers of the plan to know who to track down when they have questions.


Assumptions

You know you have them, so be clear about them. Articulate the things you assume to be true about the project. For example, you might be assuming that funding amounts will remain stable for the lifetime of the initiative. You might be assuming that there will be sufficient buy-in from leadership to make the evaluation a success. Discussing and documenting those assumptions can help to avoid confusion later and may even lead you to some important evaluation questions.


External Factors

You can’t control everything about the evaluation project, or the initiative you’re evaluating. Writing down those factors that are beyond your control can help to set realistic expectations for the evaluation. External factors may exert their influence in the middle of your evaluation project, and articulating them can help you to better prepare. Some common external factors include changes in the political context – an election call may mean that you have to pause your work for several weeks, or a swing to red or blue may signal impending changes to funding or project direction. Weather or emergency situations are beyond the humble evaluator’s control, but may have a huge impact on your project. Staffing changes may mean that the people you are relying on to help collect data or broker introductions are no longer available.


Evaluation Purpose

This evaluation is intended to do something – what is it? Hopefully it will drive decision-making and change. This section can define what kind of decisions the evaluation will support. The purpose may include elements of accountability – perhaps there are organizational or funder mandates to evaluate certain components.


Evaluation Scope

You can’t evaluate everything – sorry. We know you might want to. But a successful project needs some boundaries. What time period are you examining? Which sites? Which program(s)? Defining scope in your evaluation plan will help keep you on track. Scope creep is real, and it can lead to conflicts between evaluation sponsors and practitioners. This section gives you a reference point for the “but can we also...” and “oh, but what about...” discussions.


Evaluation Questions

Developing evaluation questions is a skill unto itself. These questions will be a foundational component of your evaluation plan. They provide direction for the data elements you will need to collect and the methods you will use to collect them. Many evaluation plans will include a table describing evaluation questions, indicators, methods and other information to guide the project.


Data Collection Plan

The data collection plan is usually best compiled in a table, although each element may warrant a separate text summary, too.

  • Data Sources: Who or what will have your data? Program participants? An administrative database? Government statistics websites?

  • Methods: How will you collect data? Through interviews? Surveys? Observation? A data request?

  • Timing: When is each element collected or delivered to you?

  • Responsibility: Who will do the work? Defining responsibility is key in ensuring that the work actually happens.

  • Cautions: What should you be wary of? Will recruitment be challenging? Is there a risk of small sample sizes? Is a site particularly difficult to access?


Stakeholder Matrix

A stakeholder matrix is particularly helpful for projects involving multiple stakeholder groups. For example, cross-agency projects with multiple funders may have several different stakeholders who each have their own needs for the evaluation. Being clear about who stakeholders are and what they need will help make sure that the information you collect and report will meet their needs.

  • Types of stakeholders: Can your stakeholders be grouped? For example, program sponsors, program funders, or program beneficiaries?

  • Stakeholder names: It may be best to name your stakeholders, either by their role (e.g. CEO, Program Manager) or by organization name.

  • Purpose/use: Stakeholders will have different uses for the evaluation. These may include accountability, evidence-informed planning, budget allocation, policy development or advocacy.

  • Nature of involvement: Some of your stakeholders may only be passive recipients of findings and recommendations, but others may be closely involved in development of the evaluation plan or participant recruitment.


Data Sources/Evaluation Question Matrix

Some projects will have just a few data sources. But others may involve many sources, each answering multiple evaluation questions. A matrix showing that alignment will help in project management. For example, if you find that a site is no longer able to provide interviewers with access to participants, you will know which evaluation questions will be impacted. This matrix also helps to give an easy view of how each data source is maximized and how each evaluation question is supported by multiple sources of evidence.


Ethical Considerations

Regardless of whether your project needs to be reviewed by a research ethics board (many or most don’t!), you should still plan your evaluation with consideration for project ethics. Describing ethical risks and how you will mitigate them prompts you to think through those risks and to design your entire evaluation plan to maximize benefits and minimize harm.


Reporting Products

Sure, you’ll likely have a final report. But in what other ways are you reporting along the way? Will you be providing a presentation with slides as a deliverable? One-page summaries? Technical reports? Interim reports? Clarifying reporting expectations helps both evaluation sponsors and evaluators. Itemizing the number of report drafts you will provide before arriving at a final version will prevent your report from being titled “Updated Final v2” and save frustration. Hint: defining reporting products is important for your contract, too!


Communication Plan

Reports aren’t your only mode of communication. Particularly for longer or more complex projects, defining an evaluation communication plan clarifies expectations, helps with project management and puts you on track to share the right information with the right people at the right time. You can include things like dates you’ll provide text for the CEO to send out survey invitations, or frequency of status updates, or the expected date for a press release.


Timelines

Your data collection plan likely includes some dates, but a visual or table depicting the timelines for the project as a whole gives an easy overview. Here, you can outline when you will recruit focus group participants, when data collection ends, when data is due from sites, and when draft and final reports are due.


Like these components, but don’t want to build the plan yourself? Download our fully editable template.

