Date: June 7, 2021

The times, they have a-changed and, as a result, so too have evaluation practices. Remote evaluations are here to stay. Thrust into the limelight last year, they are now firmly part of the evaluator’s toolkit. But how do you conduct a completely remote evaluation? In this blog, Khulisa shares insights from the front line of remote evaluations. We pick the brain of our Deputy Director of the New Initiatives division and expert evaluator, Margie Roper, to hear about her experiences conducting remote evaluations.


Margaret Roper

In the last year, we conducted a number of remote evaluations. Three different evaluations are reflected upon in this blog:

  • A completely remote evaluation: A faith-based post-school education and training (PSET) programme with primarily youth participants. It was evaluated entirely remotely and was designed to be online, with online discussions, presentations and responses.
  • A hybrid between remote and in-person evaluation: A training programme for education district officials and teachers. It was adapted to a hybrid model of implementation, and the formative evaluation had to follow suit. The evaluation involved gathering data remotely as well as direct in-person engagement with respondents. Technology was used when in-person methods were not possible.
  • A rapid pivot from in-person to remote evaluation: An international conference that had to rapidly shift to online implementation and remote evaluation, all within a three-week period. Because of this rapid shift to remote, we used a developmental and formative approach.

How did you collect the data for your remote evaluations?

“We used a number of different technologies, as we had to rely on many different ways to engage with participants. This ensured that participants were provided with sufficient opportunities to respond in a way that they were comfortable with. Despite doing it remotely, you’ve still got to build a relationship and develop a connection with your respondents. Because, without that, you can’t get the depth and honesty of information that you need for the evaluation.”

  1. Use SurveyMonkey for remote data collection

“We used it for all three of the evaluations. SurveyMonkey worked very well for international and local evaluations, because we were reaching beneficiaries who had access to computers and already used email. The education officials were given a directive to complete the online training as well as the post-training survey. This was helpful in ensuring a good survey response rate, but may have influenced the responses.

However, the PSET evaluation went to youth. Initially, we had a good response from those youth who were more familiar with email, particularly those that were used to accessing it on their phones. After that initial good response, we hit a bit of a lull. As lockdown restrictions eased, some of the centres opened up so that youth who didn’t have access to the survey could come and complete the survey on a computer at the centre.
That worked to increase the response rate, although the quality of the data decreased. We found several similar responses and we’re not sure if it was individuals filling it in themselves or if someone was typing it in for the youth and giving us a ‘summary’ of their responses.”


TIP: When using an online survey, make sure to phrase the questions so that it is easy for people to respond. Consider your language, your tone, the logic and skip patterns, and limit open-ended questions so that it’s easy for people to tick and select. Make sure it is not too long or time-consuming to complete.

  2. Use Mentimeter at both in-person and virtual events

“Mentimeter is a way that you can pose a question to people at a venue and they can respond immediately and anonymously using their phone, tablet or laptop. It is also a nice way of gathering immediate feedback, there and then.

Using Mentimeter meant that I could [remotely receive] real-time responses and feedback. I set it up and the facilitator posed the questions. The results were shown to everyone there, and they also immediately came through to me. Actual real-time responses and feedback. That definitely works. People also like it because they can see their answers come up immediately.”


TIP: Once again, your questions need to be clear. You also need a working Wi-Fi network for people to access Mentimeter and to receive and show the immediate responses. It is important to clarify that the responses are anonymous before one starts, as this allows people to give a negative, or honest, answer.

  3. Use a number of different ways when conducting key informant interviews

“Some interviews were conducted using WhatsApp calls, some through Zoom, some on FaceTime, and some were done the ‘traditional’ way with a landline call. This was so that we could get the one-on-one interactions. Young people were a lot more open to doing the interview on the phone, and were comfortable not having to see the face of the interviewer.

Some people struggled to share their insights and experiences, and we had to prompt and engage them. We had to encourage them to not just give a ‘yes/no’ but to get to the depth that we were looking for. Keep this in mind when framing your questions for a remote evaluation. And be aware of the context and the culture of how you frame the questions, too. That’s where qualitative interviews and semi-structured guides are very important.”

TIP: Build the relationship before jumping into the interview, especially when it is remote. We have to be careful, where we take it all online, that we don’t lose the qualitative understanding. That we still find out the story behind the explanation or result.

What did the respondents think of participating in a remote evaluation?

“I think it was new for everyone. We were doing the evaluations during the hard lockdown – in the midst of the crisis. There was so much uncertainty and, at that stage, there was still the expectation that in a couple of months we’d be back to normal.”

  • For young people, responses were mixed but mostly positive.

“The PSET evaluation was primarily with younger people. They were in a learning environment and valued technology. Some of them were quite frustrated because they couldn’t respond to the survey easily. This was due to not owning a device or because of insufficient data and a lack of access to the internet.”

TIP: Trust people’s understanding of the importance of technology. There is recognition among participants that technology is a part of the work environment and we all need to embrace it. Completing surveys online is one way to get familiar with a technological platform.


  • The district officials and teachers in our evaluations were much more hesitant

“Officials and teachers were less exposed to technology and so were more hesitant to use it. We need to help teachers and district officials make the leap into being more familiar with technology, whether WhatsApp, Mentimeter or online surveys.”

TIP: Find ways to leapfrog people into using these technologies and these approaches so that they are comfortable to respond. Because, if we’re not doing that, then we’ll get the response “I’m not going to engage, it’s too difficult.”



How did your remote evaluations change as they progressed?

“There is a constant need to adapt the administration of the methods and tools. This is to keep up with both the dynamics of using online platforms and the adaptive management approach to implementing interventions that COVID-19 demanded.”

  • We evaluated an international conference that, three weeks before, was due to happen in person in Europe, and it had to move entirely online

“We planned a partly in-person, partly integrated evaluation (leading into having everyone in the same room). It changed into a post-event evaluation only. It was a complete shift in the design of the evaluation, with a rapid turnaround, and we had to incorporate new evaluation questions. The scope also changed so that we could get feedback on the online interactive conference methodology that the client was piloting.”

TIP: Learn from your remote evaluations. It is new for everyone. We learned a lot from this evaluation, and this one informed our other remote evaluations.


  • The PSET evaluation was designed as an entirely online evaluation, but other factors came into play

“By the time we were getting to the recommendations phase of the evaluation, many lockdown restrictions had been eased. The client wanted us to go and see the training at the sites, because it had resumed. But that wasn’t budgeted for; it would have expanded the evaluation scope and required additional funding. There would have been value in it, but we had to balance the quality, the time, and the cost of the evaluation. We just couldn’t fit it into the budget.”

TIP: Be flexible, but don’t go over budget or out of scope trying to adapt your remote evaluation to continuously changing times.


  • The evaluation with the districts used a developmental AND a formative approach

“This evaluation required a rapid turnaround from being asked to do it, getting the results and providing the feedback on it. Not only did the evaluation have to adapt, but the actual implementation plan had to adapt, and was constantly changing.
It was quite difficult, because the implementers were constantly having to respond to what was happening in the field, districts and schools, in terms of COVID-19 restrictions. Then the evaluation implementation had to change.”

TIP: When conducting a remote evaluation in times of crisis and change, be flexible and encourage implementers to take a formative evaluation approach.



What questions should you consider while collecting the data for a remote evaluation?

  1. What data do you need, and how can you collect it remotely?
  2. Are your proposed remote instruments fit for the purpose of the task AND appropriate for the respondents?
  3. How can potentially marginalized respondents participate, and what can you do to make sure they can participate (such as access to internet, data, devices etc.)?
  4. How are your evaluation results potentially going to be accepted by the users of the evaluation (will they believe that the data is representative)?

What questions should you consider during a remote evaluation?

  1. Are you getting the depth and the honesty expected using the remote platform?
  2. Are you creating opportunities where respondents can share the negatives and challenges without feeling pressurized to respond in a specific way?
  3. How biased is your sample by doing this online?
  4. How can you make sure that, while the remote evaluation is underway, you are still creating opportunities for everyone to respond?


What questions should you consider after a remote evaluation?

  1. What limitations, quality issues and gaps were there – and to what extent did the remote evaluation methods contribute to these limitations?
  2. What worked, what didn’t work, and why?
  3. How could you have improved the remote evaluation and what would you do differently next time?

Do you have any tips for someone conducting a remote evaluation?

  • Be flexible, be responsive and adaptive but keep in mind the integrity of the evaluation
  • Limit the number of questions that you ask, and have very clear questions
  • Limit the number of open-ended questions
  • Have very clear rating scales that are easy to analyse
  • Think about how you get people to respond to keep your response rate high and to keep the quality of the data collection
  • Consider how to keep participants’ data confidential
