Blog entry originally posted in September 2013 on the Humanitarian Innovation Fund website.
Over the past month, the team in Rome has been busy aligning the project with WFP’s IT requirements and further testing Verboice’s interactive voice response (IVR) system. In preparation for the IVR test calls, the Somalia VAM team recorded audio files of the questions in Somali. The next step is to load these audio files into Verboice. The DRC and Somalia VAM teams have also been weighing the options for the live calls: whether they will be done in-house or outsourced to a call centre. Final decision still to be made!
Together we have also discussed the methodology for the testing phase. Below are some further thoughts on this:
Objectives of mVAM and how the testing phase fits in
The primary objective of mVAM is to compare the efficiency, cost and timeliness of voice technologies against our ‘light’ face-to-face surveys in the DRC and Somalia. Throughout the life of the project, we will focus on figuring out ‘what works’ rather than precisely documenting comparisons between live voice and IVR calls. Comparing those two modalities is not an explicit goal of the mVAM pilot; rather, we will place test calls in both countries to inform our choice between live voice and IVR calls. The test calls will help us correct difficult or awkward wording and choose the right combination of questions. They will also identify interviewing tactics that help us obtain a high level of responses and completed surveys. The most important outcome will be a short questionnaire that is respondent-friendly.
What questions are we going to ask? We will ask respondents about their household’s food consumption over the past week and what coping strategies the household used if it did not have enough food or money to buy food. Questions will be adapted to each country context.
How will we assess the performance of the pre-test?
We will use both quantitative and qualitative information to assess the test calls. Response rates will be measured to provide a benchmark for subsequent rounds of data collection. Completion rates and average interview times will be the key indicators, revealing the extent to which we are able to collect data over the phone.
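To make these indicators concrete, here is a minimal sketch of how they could be computed from a log of test calls. The field names (`answered`, `completed`, `duration_min`) and the sample data are illustrative assumptions, not the project's actual schema:

```python
def call_indicators(calls):
    """Compute response rate, completion rate and average interview time.

    `calls` is a list of dicts, each with:
      - 'answered': True if the respondent picked up
      - 'completed': True if the full questionnaire was finished
      - 'duration_min': interview length in minutes
    """
    attempted = len(calls)
    answered = [c for c in calls if c["answered"]]
    completed = [c for c in answered if c["completed"]]

    # Response rate: share of attempted calls that were answered.
    response_rate = len(answered) / attempted if attempted else 0.0
    # Completion rate: share of answered calls with a finished survey.
    completion_rate = len(completed) / len(answered) if answered else 0.0
    # Average interview time, over completed interviews only.
    avg_duration = (
        sum(c["duration_min"] for c in completed) / len(completed)
        if completed else 0.0
    )
    return response_rate, completion_rate, avg_duration


# Illustrative log: 4 attempted calls, 3 answered, 2 completed.
calls = [
    {"answered": True, "completed": True, "duration_min": 6.0},
    {"answered": True, "completed": False, "duration_min": 2.5},
    {"answered": False, "completed": False, "duration_min": 0.0},
    {"answered": True, "completed": True, "duration_min": 8.0},
]
print(call_indicators(calls))
```

In practice the definitions would need to be pinned down (e.g. whether completion rate is measured against answered or attempted calls), which is part of what the pre-test is meant to settle.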
We also need to understand the experience from the user’s perspective. Respondents could be asked about their experience using the IVR: did they find it easy to interact with? In the DRC, we thought we could organise focus group sessions with the IDPs who receive test IVR calls. In Somalia, we could call back respondents who received an IVR call and ask for their feedback on interacting with the system.
The manager of the Listening to Dar voice project suggested that we be flexible in introducing IVR into mVAM. Rather than expecting respondents to be proficient with IVR from the start, they could begin with live calls in the first months and move to IVR at a later stage, once they have become comfortable with phone surveys. So we could consider running IVR tests both before and during project implementation.
Preparations for the face-to-face assessment in Somalia
Over the past few weeks, the VAM team in the Somalia office has also been preparing for the face-to-face assessment that will take place in September. Data from this assessment will serve as a baseline for understanding the characteristics and food security status of households prior to the start of the phone surveys. One of the questions asked during the face-to-face survey is whether the respondent owns a mobile phone and would like to take part in the monthly phone survey – we are hoping to get a big enough sample, fingers crossed!