Articles
Our latest articles are listed below.
What’s the difference between a goal and an objective? The most confusing evaluation jargon
The evaluation world is full of jargon! In this article, we list our back-pocket definitions for some of the most confusing evaluation language.
3 Easy Ways to Quantify Your Qualitative Data
You’ve completed your qualitative data collection and you’re writing up your report. You step back and look at All. The. Text. If only you had some quantitative data to include in a chart, or some numbers to report! In this article, we talk about 3 ways you can quantify your qualitative data.
Criteria-Based Ranking in Developmental Evaluation
Criteria-based ranking is a tool evaluators can use to bring critical thinking and a degree of precision to decision making. In this article, we explain what criteria-based ranking is and how we used it in a developmental evaluation.
The “mixing” in mixed methods
Data integration is a way of merging data from different sources through mixed methods. In this article, we discuss how qualitative and quantitative data can be integrated at the study design, methods, or analysis level.
Finding the Right Sample Size (the Hard Way)
For those interested in calculating sample sizes by hand, or getting a better understanding of the math behind many sample size calculators, we outline the formulae used to calculate sample sizes.
Finding the Right Sample Size (the Easy Way)
In this article, we briefly define sample sizes, their importance, and how to calculate them (or how to use a tool to calculate them).
How to use Calendly to schedule interviews like a pro
In the evaluation world, scheduling can be a nightmare. This article describes how to use Calendly to schedule interviews in three simple steps.
The importance of articulating assumptions
In this article, we describe what assumptions are in evaluation, explain why you should document them, and show how to reflect on your assumptions when collecting and analyzing evaluation results. We also provide some practical examples of how to include assumptions in your own evaluations!
Evaluation Facilitation Series: Facilitation Activity #1 (Making Metaphors)
This evaluation facilitation series highlights facilitation activities and how I have applied them over the years. This article focuses on the “Making Metaphors” activity, along with step-by-step instructions you can use to try it in your own evaluations.
Evaluation Question Examples by Type of Evaluation
A look at how different evaluation strategies and frameworks can help you craft the perfect evaluation questions.
Book Review: Developmental Evaluation by Michael Quinn Patton
A book review of Michael Quinn Patton’s Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use (2011).
Grab the cake, it's time for a data party! The benefits, and how to run your own
So you've successfully gathered the data you need to evaluate your program. But how do you engage stakeholders and partners to ensure a thorough understanding of the results? A data party could be part of the answer!
A Beginner’s Guide to PivotTables
If you work with data in Excel, whether frequently or infrequently, learning the basics of PivotTables will improve your ability to quickly explore and analyze raw data.
The 10 Metrics Your Evaluation Consultancy Should Track
Part of the job of an evaluator is to identify and define metrics for our clients. But what about you? Are you as disciplined when it comes to defining and tracking metrics for your own business?
How can we incorporate diversity, equity, and inclusion in evaluation?
Recognizing that equitable evaluation is an emerging area of work, this article aims to add to the growing discussion. While it does not include an exhaustive list of issues and strategies, it will help you introduce some changes to your evaluation practice.
The Data Cleaning Toolbox
The end goal of collecting data is to eventually draw meaningful insights from it. However, the transition from raw data to meaningful insights is not always a linear path. Data are prone to human error, and this guide will help you correct those errors, as well as provide tips on how to minimize them in the future.
From data to actionable insights
As evaluators, we are rarely organizational decision-makers; it is our job to provide those decision-makers with actionable insights. In this article we highlight how you can translate data into meaningful findings, or insights, so you can support decision-makers to drive action within their organizations.
How (and whether) to write recommendations
What is the scope of an evaluator’s role: should an evaluator make recommendations? Do we tell programs how to improve? Or do we simply share the data with them and let them draw their own conclusions?
Evaluation Report Inspiration: Excerpts From A Breast Cancer Clinic Evaluation
A few years ago, we completed an evaluation for a breast cancer clinic. In honour of Breast Cancer Awareness Month, we thought we would highlight some excerpts from that report to help inspire your next evaluation report!
Scope Creep: When to Indulge It, and When to Avoid It
Ideally, our evaluation projects would proceed as planned. But as all project managers know, sometimes things change. Actually, most of the time, things change! In some situations, our evaluation approach can be modified to adapt to the changing context, but in others, we have to say no to scope creep.