Fiona Remnant February 9, 2017

Our field teams don’t let ‘access bias’ dictate where they go to interview respondents!


Participants have a pretty good idea of what is working and what isn’t in the projects that affect them. There are few better ways to answer the ‘attribution problem’ (which changes resulted from our project intervention?) than collecting good qualitative data from participants themselves. So why don’t all organisations use qualitative impact assessments to evaluate programmes? Because it’s hard: people find it difficult to trust self-reported attribution, and the results are often too long and unwieldy to use. The QuIP (Qualitative Impact Protocol) addresses all of these issues, bringing rigour and standardisation to a market plagued by an ‘adverse selection’ problem.

Bath Social & Development Research is developing and promoting the QuIP as a way to give intended beneficiaries the opportunity to describe their experience. By placing a high value on beneficiaries’ personal perceptions and priorities, we can reinvigorate trust in qualitative approaches. This, we believe, will bring intended beneficiaries’ voices to the fore of impact assessments and take us closer to closing that all-important feedback loop. Three features of the QuIP make this possible:

  1. Blindfold data collection: OK, so we don’t literally blindfold our field teams – but they know nothing about the commissioner or the project being assessed. This effectively blinds both them and the respondents to information that could bias responses. Instead, we train field teams to use open-ended, exploratory interview schedules, which seek to understand and capture significant change in the specific areas of people’s lives the project aimed to affect. Together, the blinding and the open-ended approach ensure honest feedback covering all the drivers of change in that community – not just project-focused feedback.
  2. Engage local research expertise: Seeing a regular flow of talented alumni coming through our university’s doors and returning to their own countries, we know there is a well of untapped ‘local’ talent which is too often ignored in favour of flying trusted consultants out to undertake fieldwork. We advocate training highly qualified field teams all over the world to conduct in-depth qualitative interviews in local dialects. This means fewer flights, fewer parachuted-in management teams and better-quality data from relaxed, informal interviews.
  3. Standardize data analysis: So, what do you do with that huge pile of really interesting interview transcripts? We’ve come up with a neat, accessible coding system. It takes some training and practice, but we think there’s a qualitative data analyst waiting to be discovered in many people! Separating the roles of data collection and analysis means an analyst already knows what to look for. The data can then be coded and tagged: Does it document positive or negative change? What was the driver of change? What was the outcome? How far was that change attributable to project-related activities? Applying the same system to all data takes the ‘anecdotal’ out of reports and helps to identify where the strongest relationships exist between drivers and outcomes – whether that’s specific to the project or not!
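To make the coding step in point 3 concrete, it can be pictured as tagging each interview statement with a driver, an outcome, the direction of change and an attribution level, then tallying driver–outcome pairs to surface the strongest relationships. The sketch below is purely illustrative – QuIP specifies a coding approach, not software, and the field names and example data here are hypothetical:

```python
from collections import Counter

# Hypothetical coded statements, one per causal claim in a transcript.
# Field names and values are illustrative, not part of the QuIP itself.
coded_statements = [
    {"driver": "new irrigation", "outcome": "higher crop yield",
     "change": "positive", "attribution": "explicit"},
    {"driver": "new irrigation", "outcome": "higher crop yield",
     "change": "positive", "attribution": "implicit"},
    {"driver": "drought", "outcome": "lower income",
     "change": "negative", "attribution": "none"},
]

def strongest_links(statements):
    """Count how often each (driver, outcome) pair is cited, so the
    most frequently mentioned causal links surface first."""
    counts = Counter((s["driver"], s["outcome"]) for s in statements)
    return counts.most_common()

print(strongest_links(coded_statements))
```

Because every statement carries the same tags, the same tally can be filtered by attribution level to separate project-related change from everything else going on in respondents’ lives.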

With these three things, the QuIP offers beneficiaries the opportunity to share their experiences, and offers commissioners honest, detailed feedback. Commissioners of QuIP studies tend to be open-minded organisations keen to learn from, and share, findings. And in an effort to really close the feedback loop, we facilitate ‘unblindfolding’ workshops at the end of each study, bringing researchers, project staff and, where possible, respondents together to discuss the findings and decide what they mean for future projects. Full QuIP guidelines and more resources are available on our website.


Fiona Remnant

Fiona Remnant is co-developer of the QuIP methodological approach, alongside Prof James Copestake from the Centre for Development Studies at the University of Bath, UK. Fiona is now a director of Bath SDR, a research organisation set up to continue developing and promoting better standards of qualitative and mixed methods impact evaluation of public and private investments with explicit social and development goals.
