How We Used The Delphi Process In Our Evaluation
February 2025
We tried using the Delphi Process in one of our recent evaluations and thought others would be interested in what we did! This article is NOT a step-by-step guide. It is a case study describing how we used this tool, how it was received by the client, and why we strongly encourage other evaluators to try it in their own evaluations.
What is the Delphi Process?
“The Delphi method is a group-based process for eliciting and aggregating opinion on a topic with a goal of exploring the existence of consensus among a diverse group of handpicked experts.”
Khodyakov (2023)
The Delphi Process is also known as the “Delphi Method” or “Delphi Technique.” This research tool dates back to the 1950s, when it was developed as a way to obtain reliable expert consensus (Barrett & Heale, 2020; Khodyakov, 2023). It can be used in any discipline but is most often found in health research settings (Barrett & Heale, 2020; Chuenjitwongsa, 2017; Khodyakov, 2023).
But what is it, exactly? The simplest way to describe the Delphi Process is that it is a way to gather opinions from an expert panel for the purpose of reaching a consensus on an issue. For example, imagine you wanted to know the best way to train for the NBA. You could gather a group of NBA players and ask for their expert opinions on the topic. However, each player probably has their own answer to your question. Using the Delphi Process in this situation can help this group of experts come to a consensus and present you with (hopefully) a single answer on the best way to train for the NBA.
The steps for the Delphi Process are usually a variation of the following:
1. Recruit a panel of experts on a specific topic.
2. Develop a questionnaire on the issue you are investigating.
3. Distribute your questionnaire to the panel of experts.
4. Review the interim results.
5. If consensus has NOT been reached, adjust the questionnaire based on feedback.
6. Repeat steps 3-5 until consensus/near-consensus is reached OR a maximum number of rounds is completed.
The illustration from Chuenjitwongsa (2017) below shows the iterative nature of the Delphi Process.
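To make the iterative structure concrete, here is a minimal, self-contained sketch of the loop in Python. The agreement rates are simulated with random numbers, and all of the names (simulate_round, run_delphi, the sample items) are our own illustrative assumptions; in a real Delphi study, steps 3 and 4 are a questionnaire and a human review, not a function call.

```python
# A minimal, self-contained sketch of the Delphi loop (steps 3-6).
# Panel responses are simulated with random agreement rates; in a
# real study these come from a questionnaire and a manual review.
import random

random.seed(0)  # make the simulation reproducible

def simulate_round(definitions):
    """Stand-in for steps 3-4: pretend each definition gets an agreement rate."""
    return {d: random.uniform(0.5, 1.0) for d in definitions}

def run_delphi(definitions, threshold=0.8, max_rounds=3):
    unresolved = list(definitions)
    for round_number in range(1, max_rounds + 1):
        results = simulate_round(unresolved)              # steps 3-4
        unresolved = [d for d, rate in results.items()    # step 5: keep only the
                      if rate < threshold]                # items below the threshold
        print(f"Round {round_number}: {len(unresolved)} definition(s) unresolved")
        if not unresolved:
            break  # consensus reached on everything
    return unresolved  # step 6: whatever remains after the final round

run_delphi(["volunteer", "staff", "client contact"])
```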
The Delphi Process can be used for a variety of purposes. Some people use it to answer a research question with expert opinion; others have used it to explore the impacts of policy options or to develop guidelines and curricula. The flexibility of this method makes it a useful tool to have in your evaluation toolkit.
How did using the Delphi Process enhance our evaluation?
In my view, the Delphi Process enhanced our evaluation in two main ways:
It allowed us to involve interested parties in the evaluation in a meaningful way.
The Delphi Process is a participatory tool because it relies on the expertise of others and incorporates their feedback in a meaningful way. People are sometimes asked to participate in processes where their input doesn’t seem to matter in the end. Because each round of the Delphi Process builds on the feedback from the previous one, it shows your experts the immediate results of their participation, which can make it a very engaging and rewarding experience.
We experienced this feedback firsthand! Throughout the Delphi Process, we received comments from our experts about how much they appreciated and enjoyed participating.
It helped us to utilize expert advice to produce quality products.
As evaluation consultants, we go into our evaluation projects knowing we will never have complete knowledge of a program’s context. That is why we rely on those who are familiar with a program to provide us with that information.
Using the Delphi Process was a structured way for us to utilize expert advice for this evaluation project and develop a set of shared definitions that were meaningful and relevant to the people who will be using them. How do we know that the definitions will be meaningful and relevant? We know because they were co-developed with some of the people who will be using them!
How did we use it?
We used the Delphi Process as a participatory tool for developing a shared set of definitions for an evaluation project.
About the Project
We were hired to update a shared measurement and reporting system that would be used by many different programs located across the province of Alberta. Part of this project involved developing and updating a set of shared definitions that all the programs would use to guide their annual reporting.
The Problem
The definitions needed to be relevant to ALL the programs and promote a SHARED UNDERSTANDING of the reporting concepts they were defining. For example, a practicum student who is not getting paid to work may be defined as a “volunteer” in one program and as “staff” in another. Without a shared understanding of the definition for “volunteer,” these two programs would end up measuring and reporting their activities differently. The result would be a lot of confusion and inconsistency in the annual evaluation data.
The Solution
We used the Delphi Process as a participatory method to develop definitions that would make sense to all programs. Using this method, we recruited a panel of experts representing the different program perspectives across the province and asked them to give their opinions on a set of definitions.
Here's our version of the steps we used for the Delphi Process:
1. Recruit a panel of experts on a specific topic.
We recruited an existing committee of representatives for the programs involved. These were people who were respected by their peers and trusted to represent all the programs. Their input helped to make the definitions more relevant, useful, and reliable for their peers.
2. Develop a questionnaire on the issue you are investigating.
We created an anonymous questionnaire with both qualitative and quantitative components for distribution to our expert panel. The beginning of the questionnaire included an explanation of the Delphi Process, the expected involvement of our experts, and who they could contact for more information.
Participants were asked to rate their level of agreement with each definition and provide any additional comments.
3. Distribute your questionnaire to the panel of experts.
An initial email was sent to the panel to inform them of the questionnaire, its purpose, and to emphasize the importance of their participation. This initial email was sent by a member of the panel who was also part of the evaluation project’s Working Group as a way to foster buy-in for participating in the Delphi Process.
After the initial email was sent, the first round of the questionnaire was distributed by our team at Three Hive Consulting. Our panel of experts was given about one week to provide input on the definitions.
4. Review the interim results.
For each round, we recorded each definition and the feedback it received in an Excel spreadsheet. We kept track of all the quantitative and qualitative feedback, as well as any changes made to each definition as a result.
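As an illustration of what we tracked (the field names and sample values below are hypothetical, not our actual data), the spreadsheet amounted to one record per definition per round:

```python
# A sketch of the round-by-round tracking structure; field names
# and sample values are illustrative, not our actual data.
tracking = [
    {
        "round": 1,
        "definition": "Volunteer",
        "agreement_rate": 0.75,  # quantitative feedback for the round
        "comments": ["Should this exclude practicum students?"],
        "revision": "Reworded to address practicum students.",
    },
    # ...one record per definition per round
]
```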
5. If consensus has NOT been reached, adjust the questionnaire based on feedback.
Prior to distributing the questionnaire, we had agreed on a threshold for “consensus.” Initially, we set “consensus” at 80% agreement (slightly agree + agree + strongly agree).
However, we tightened our consensus thresholds at each round to refine the definitions further. Our “consensus” thresholds for each round were:
Round 1 threshold: 80% agreement (slightly agree + agree + strongly agree)
Round 2 threshold: 80% agreement (agree + strongly agree)
Round 3 threshold: 90% agreement (agree + strongly agree)
Each definition that reached consensus was taken out of the questionnaire for the next round. Any changes to definitions were shown in red text. This was done to provide feedback to the expert panel that we were incorporating their suggestions into the definitions.
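For readers who want the round-by-round rule spelled out, here is a minimal Python sketch of the consensus check, assuming a Likert-style agreement scale; the function name and sample ratings are made up for illustration.

```python
# A sketch of the round-specific consensus check; the scale labels
# match the thresholds above, but the sample ratings are invented.
AGREE_LEVELS = {
    1: {"slightly agree", "agree", "strongly agree"},  # round 1
    2: {"agree", "strongly agree"},                    # round 2
    3: {"agree", "strongly agree"},                    # round 3
}
THRESHOLDS = {1: 0.80, 2: 0.80, 3: 0.90}

def reached_consensus(ratings, round_number):
    """True if the share of agreeing panelists meets the round's threshold."""
    agreeing = sum(r in AGREE_LEVELS[round_number] for r in ratings)
    return agreeing / len(ratings) >= THRESHOLDS[round_number]

# Example: 9 of 11 panelists agree (82%), which clears the 80%
# thresholds of rounds 1 and 2 but not the 90% threshold of round 3.
ratings = ["agree"] * 6 + ["strongly agree"] * 3 + ["disagree"] * 2
print(reached_consensus(ratings, 2))  # True
print(reached_consensus(ratings, 3))  # False
```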
6. Repeat steps 3-5 until consensus/near-consensus is reached OR a maximum number of rounds is completed.
We had initially informed our panel of experts that there would be three rounds of the Delphi Process, with an optional final round if consensus could not be reached. We chose three rounds partly to manage the burden of participation on our panel: they were also being asked to participate in other parts of the evaluation, so we wanted just enough rounds to refine the definitions while keeping them engaged in the rest of the project.
We ended up reaching our threshold for consensus in round 3 and did not need the final optional round.
What were the challenges we encountered?
Defining “consensus” was not easy.
We had conducted a brief scan of the literature on the Delphi Process prior to designing and launching our questionnaires. Ironically, we found that there was no consensus on how to define “consensus” for the Delphi Process. A systematic review of Delphi studies by Diamond et al. (2014) found a percentage threshold to be the most common way of defining consensus. However, there was no agreement on what percentage was a good measure of consensus.
As a result, we decided on an arbitrary percentage threshold (80% agreement) for the first round of our Delphi Process. Because the initial threshold had been chosen arbitrarily, we adjusted it in subsequent rounds based on the results we were receiving from the questionnaires. This may not have been the most rigorous application of the Delphi Process, but it worked well for our purposes.
Keeping the expert panel engaged was tougher than expected.
Overall, we had good response rates from our expert panel across all three rounds of the Delphi Process:
Round 1 response rate: 93%
Round 2 response rate: 80%
Round 3 response rate: 73%
However, we did experience a decline in participation over time. Some of this was because we distributed the questionnaires during the summer months, when some of the experts were on vacation.
We sent a reminder email for each round of the Delphi Process, but we limited additional reminders because our experts were also being asked to participate in other parts of the evaluation, and we didn’t want to overwhelm them.
What did we learn?
We learned many things from the Delphi Process, but I’m just going to highlight a couple of them here:
Having feedback loops for our panel of experts was important for engagement.
Our panel of experts appreciated seeing their input incorporated into the definitions; that immediate feedback showed them that their contributions were being taken seriously. A few of them explicitly commented on how meaningful participating in the Delphi Process had been for them. I believe that our consistently high response rates were partly due to our experts being engaged meaningfully in the Delphi Process.
There were a couple of ways that we created feedback loops for our expert panel:
Changes to definitions from each round were shown in red text in the subsequent questionnaire, so the panel could see directly what was being revised.
We also used the landing page of each questionnaire as a place to address any comments from the previous round that were beyond the scope of the Delphi Process.
These were usually comments related to the overall evaluation project. Although we couldn’t address them in the development of the definitions, we wanted to show our experts that we had received their notes and that they were being taken seriously.
You can’t make everyone happy.
Although we were able to reach the consensus thresholds we had defined for each round, we did not reach true consensus for all our definitions. Some experts had conflicting opinions on how a concept should be defined, and some of those issues were never resolved in the Delphi Process. Ultimately, we had to edit the definitions and present them in a way that sided with the majority opinion.
The Delphi Process can be a great tool for participatory engagement.
Using the Delphi Process was a positive experience for us on this project. It was well received by our client and our expert panel. It made our work of developing shared definitions for this evaluation project a little bit easier. We would definitely use it again and we recommend it to other evaluators to try as well!
References
Barrett, D., & Heale, R. (2020). What are Delphi studies? Evidence-Based Nursing, 23(3), 68-69. https://ebn.bmj.com/content/ebnurs/23/3/68.full.pdf
Chuenjitwongsa, S. (2017). How to: Conduct a Delphi study. Cardiff University. https://www.cardiff.ac.uk/__data/assets/pdf_file/0010/1164961/how_to_conduct_a_delphistudy.pdf
Diamond, I. R., Grant, R. C., Feldman, B. M., Pencharz, P. B., Ling, S. C., Moore, A. M., & Wales, P. W. (2014). Defining consensus: A systematic review recommends methodologic criteria for reporting of Delphi studies. Journal of Clinical Epidemiology, 67(4), 401-409. https://bpb-us-w2.wpmucdn.com/sites.umassd.edu/dist/f/1316/files/2023/01/Defining-consensus-A-systematic-review-recommends-methodologic-criteria-for-reporting-of-Delphi-studies.pdf
Khodyakov, D. (2023, October 17). Generating Evidence Using the Delphi Method. RAND. https://www.rand.org/pubs/commentary/2023/10/generating-evidence-using-the-delphi-method.html