Articles
Our latest articles are below.
How we built capacity for evaluation processes
Much of an evaluator’s work involves conducting evaluations for clients; this article describes our experience of working with an organization to help them become their own evaluation experts. We cover the key evaluation concepts we shared with our client and highlight what we learned along the way.
Podcast Review: Indigenous Insights: Episodes 1 and 2
This article reviews the podcast "Indigenous Insights: An Evaluation Podcast," hosted by Gladys Rowe, which explores Indigenous evaluation perspectives. Rowe, a member of Fox Lake Cree Nation, shares experiences through storytelling that invites reflection and learning. The podcast highlights Indigenous community-centered evaluation, featuring discussions with experts about their experiences.
Tools to Write in Plain Language
This article explains plain language, an approach that aims to make writing easy to understand. It emphasizes the importance of clear communication and its relevance to knowledge translation, and highlights reasons to write in plain language, such as addressing literacy challenges and making content accessible to a diverse audience. The article introduces tools that assist in creating plain language content, including AI writing tools like ChatGPT and readability tools.
New Infographic: Research and Evaluation
This infographic is for anyone who wants to understand the differences between research and evaluation, especially when the two are used together within the same program. It outlines the main steps of each, highlights the key ways they differ, and offers practical advice for managing programs that use both approaches.
A picture is worth a thousand words: Photovoice
Traditional data collection methods like surveys, interviews, focus groups, and document reviews are well known in evaluation, but there are also innovative approaches, such as participatory and arts-based methods. One such method is Photovoice. Embracing new techniques can feel daunting, but methods like Photovoice are worth recognizing for how they expand our toolkit and enhance evaluation practice.
New infographic: Data viz decision tree
Choosing the right type of chart to display your data can improve clarity for your readers and lessen the likelihood that your findings will be misinterpreted. With so many different types of charts and graphs, it can be tough to know where to start. This infographic is for anyone who wants to display their quantitative data clearly and meaningfully but isn’t sure how to pick the best chart for the job.
Creating a Qualitative Codebook
A codebook for qualitative research is a stand-alone document listing the themes, codes, and definitions used in your qualitative analysis. This article outlines the structure of a qualitative codebook and the steps to follow to create your own.
New infographic: 10 tips for running a focus group
By leveraging group dynamics and interactions, focus groups provide a platform for participants to build on ideas, challenge assumptions, and generate new insights. In this article, we present our top tips for running successful focus groups that are efficient and gather quality data.
New Template: Style Guide Template!
Whether you’re new to evaluation or it’s your main role, this Style Guide Template is for anyone looking to create consistency across project documents.
New Template: Logic Model template using Canva Whiteboard
This Logic Model template is for anyone who wants to create professional, visually interesting Logic Models. Built as a Canva Whiteboard design, it can be customized to suit your evaluation needs, making it easy to produce a polished Logic Model to present to clients, share on a website, or submit in a report.
How to combine data from multiple sources for cleaning and analysis
This article walks you step-by-step through the process we use when merging datasets from multiple sources.
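As a minimal, hypothetical sketch only (not the actual process the article describes), here is one way two exports sharing a participant ID might be combined with pandas before cleaning; the toy data and column names are assumptions for illustration:

```python
import pandas as pd

# Hypothetical toy data standing in for two separate exports.
surveys = pd.DataFrame({"participant_id": ["001", "002", "003"],
                        "satisfaction": [4, 5, 3]})
admin = pd.DataFrame({"participant_id": [" 002", "003", "004"],
                      "site": ["North", "South", "North"]})

# Standardize the join key so the merge doesn't silently drop rows.
for df in (surveys, admin):
    df["participant_id"] = df["participant_id"].astype(str).str.strip()

# An outer merge keeps unmatched records so they can be reviewed rather than lost.
combined = surveys.merge(admin, on="participant_id", how="outer", indicator=True)

# Rows found in only one source are flagged for follow-up during cleaning.
unmatched = combined[combined["_merge"] != "both"]
print(f"{len(unmatched)} record(s) need review before analysis")
```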
How to spot common budgeting pitfalls
Evaluations of any size typically need to adhere to a budget, whether the constraints are financial or related to human resources. Certain pitfalls can quickly derail that budget. This article walks you through some of the most common ones to help you plan and stay on budget throughout your evaluation.
5 tips for ensuring interviewer safety
In this article, we highlight the importance of interviewer safety in making interviews both effective for collecting data and a positive experience for everyone involved!
New Infographic: 10 tips for designing quality reports!
Many of us have come across reports that feel cramped, unengaging, and, let’s face it, a little bit boring! Dedicating some time to design before you start writing can significantly improve a report’s overall quality, helping you create something that is not only refined and professional, but also visually appealing. Here are my 10 tips for designing quality reports!
Questions to get you thinking about your data
Data are only useful when used! They do no good buried in reports, sitting on shelves (or shared drives) hidden away. Data, particularly data from an evaluation, are begging to be discussed, contemplated, and put into action! This article discusses some ways to make sure your data are used.
Putting an ethics lens on your evaluation planning
Ethical practice is a cornerstone of our evaluation work at Three Hive Consulting. Not only does it ensure we are doing right by everyone involved, it also adds to the professionalization of our work. This article defines ethical practice and offers recommendations for applying an ethical lens to your evaluation.
Your information will be kept confidential: Confidentiality and Anonymity in Evaluation
“Confidential” and “anonymous” are words we use quite a bit in the evaluation world. But do you know what they actually mean? This article explores some of these concepts and provides some tips on how to maintain confidentiality.
New Checklist: Information request checklist
Eval Academy has just released a new checklist for anyone about to start an evaluation project. Use it to make sure you’re gathering the information and context you need to support your evaluation from the outset.
So you want to be a CE: How to become a Credentialed Evaluator
In the first article of this series, we covered what the CE designation is (and is not) and touched on why you might pursue it. In this article, we explore what you need to do to earn your CE designation.
What’s the Difference: Bias versus Confounding?
In every research and evaluation project, it is important to identify and address sources of error that may affect the accuracy of your findings and the relevance of your recommendations. Here, we look at what bias and confounding are (and are not), how they differ, and key considerations for preventing and addressing both in your next evaluation project.