New places, new tools: what’s up next for mVAM?


We’ve just got back from Rwanda, where we held a workshop on using mVAM to expand real-time food security and nutrition monitoring with Internally Displaced Persons (IDPs) and refugee populations. The project, made possible by the support of the Korea International Cooperation Agency (KOICA), will be implemented in ten countries in sub-Saharan Africa where WFP works.

What’s the project?

The KOICA project has two aims. First, it aims to empower information exchange with marginalized populations, specifically IDPs and refugees. Second, it supports the collection of food security and nutrition data using the latest mobile and satellite technologies. This will happen in ten countries in sub-Saharan Africa: the Central African Republic (CAR), the Democratic Republic of Congo (DRC), Kenya, Malawi, Niger, Nigeria, Rwanda, Somalia, South Sudan and Uganda.

How are we going to do this?

As you know, two-way communication systems are an important part of our work. As well as getting information that we can use to inform WFP programmes, we want to ensure that the line is open so that people in the communities we serve can contact us and access information that is useful to them. We’ve already been using Interactive Voice Response and live calls to share information with affected populations, and are now expanding our toolbox to include new technologies: Free Basics and a chatbot.

Remote data collection isn’t just done by mobile phones: VAM already uses other sources, such as satellite imagery analysis, to understand the food security situation on the ground. Under this project, we’ll also help countries incorporate similar analysis, which will complement two-way communication systems to provide a fuller picture of the food security situation.

Finally, we’re going to harness our knowledge of Call Detail Records analysis: de-identified metadata collected via cell phone towers about the number of calls or messages people are sending and which towers they are using. We have already used this technique in Haiti to track displacement after Hurricane Matthew, and we’re really excited to transfer these ideas to another context to ensure we get up-to-date information on where affected communities are so we can better target food assistance in the right locations.
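For the technically curious, here is a rough sketch of the kind of aggregation this analysis starts from: counting how many de-identified subscribers are seen at each cell tower each day. The file name and column names are illustrative assumptions, since CDR schemas vary by operator.

```python
# Sketch: turn de-identified Call Detail Records into daily counts of unique
# subscribers per cell tower, a first step towards spotting displacement.
# The file name and columns (hashed_subscriber_id, tower_id, timestamp) are
# illustrative assumptions, not an actual operator schema.
import pandas as pd

cdr = pd.read_csv("cdr_deidentified.csv", parse_dates=["timestamp"])

daily_presence = (
    cdr.assign(date=cdr["timestamp"].dt.date)
       .groupby(["date", "tower_id"])["hashed_subscriber_id"]
       .nunique()
       .rename("unique_subscribers")
       .reset_index()
)

# A sustained drop at towers in one area and a rise elsewhere can flag population
# movements worth cross-checking against other displacement data.
print(daily_presence.head())
```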

What happened at the workshop?

Representatives from all ten country offices, three regional bureaus and staff from HQ came together to discuss the three main project components. During the workshop, the country offices had the chance to learn from members of the mVAM team about the specific tools they can harness and how to ensure the data they collect are high quality, standardised and communicated effectively. However, the best part about bringing everyone together was that country teams could share their experiences of how they are already using mVAM tools. We heard from the Malawi country office about their Free Basics pilot, and Niger and Nigeria explained how they’re implementing IVR so affected communities can easily contact WFP, even after work hours. Sharing their different experiences and learning about how different tools have worked in each context not only gave everyone an overview of what mVAM is doing so far, it also helped everyone understand the implementation challenges and how to overcome them.

What’s next for the KOICA project?

We’re really excited for the next stage of the project. Each country office has now planned what tools they’re going to use to increase their communications with affected communities and how they will improve their existing data collection systems. It’s going to be great to see the impact these tools will have not only on WFP’s response, but also how they will empower the communities we’re serving. 

Hearing from those who are #FacingFamine

Photo: WFP/Amadou Baraze


In early March, Stephen O’Brien, the United Nations’ Emergency Relief Coordinator, reported that 20 million people across four countries face starvation and famine.  The famines looming in Yemen, South Sudan, Somalia and Nigeria represent the largest humanitarian crisis since the UN’s creation. “Without collective and coordinated global efforts,” O’Brien said, “People will simply starve to death, and many more will suffer and die from disease.”

One of the components that complicates these particular emergencies is access to the areas in crisis. Without safe and unimpeded access for humanitarian aid workers, it’s difficult to get a picture of what’s going on in the affected areas, which adds another dimension to an already challenging response. In Northeast Nigeria, the threat of violence made it difficult for WFP’s food security analysts to visit vendors in local markets or speak with people in their homes – all part of their usual food security monitoring routine.

In order to continue gathering the information needed to understand the situation in the affected areas, WFP used remote mobile data collection to get a picture of what was happening in the communities it could no longer speak to in person. With an overwhelming number of responses, we turned to Tableau, which had already helped us create data visualizations for other countries that use mVAM, to help us visualize the results in a way that could be easily understood by everyone.

mVAM hears directly from people in affected communities in the northeast of Nigeria


Our latest interactive data visualization of the food security situation in Northeast Nigeria is now online, and the story of how it came to be can be found on Tableau’s blog. Make sure to check out the free response section, where you can hear from 5,500 households on what should be done to improve the food security in their community.

 

Mind the Mode

Settling the (Food Consumption) Score in South Sudan


POC 3. Photo: UNMISS/Nektarios Markogiannis

For the second installment of our ‘Mind the Mode’ series, we’re taking you to Juba, South Sudan, where we previously conducted a mode experiment. What we wanted to see was how food security indicators compare when data is collected face-to-face and through operators over the phone.

South Sudan is a complex setting for mobile surveys to begin with. The country has low cell phone penetration, estimated at only 20%. Network quality is a problem: calls often don’t go through, or the audio is poor. Last but not least, the country has been extremely unstable. While we have been using key informant phone interviews to date, we are investigating the feasibility of conducting phone surveys to collect household food security indicators. Given these complexities, starting with a test to evaluate biases related to survey mode seemed prudent.

Methodology

The mode experiment took place in “POC 3”, a Protection of Civilians (POC) camp in Juba near the main UN compound. POC 3 is the largest of three camps at the UN House site in Juba, with an estimated population of 20,000 people, according to the International Organization for Migration. People in the POC are there in search of protection against the violence and conflict that South Sudan has been experiencing. We’re hoping to use mobile phones to monitor food security indicators in POC communities. POC 3 happens to have good cell phone coverage – a 2014 survey estimated that some 70% of households in the camp had access to a phone.  

 

Photo: WFP/Silvia Passeri


We evaluated how survey mode affects the Food Consumption Score (FCS), which measures how frequently a household consumed different food groups during the 7 days before the survey. A higher score means better household food security. The FCS is a commonly used proxy for household food security.
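If you haven’t come across the FCS before, the calculation itself is simple. Here is a minimal sketch: the food-group weights and the 21/35 thresholds are the standard WFP ones, while the function names and the example household are ours, for illustration only.

```python
# Standard WFP Food Consumption Score: weighted sum of the number of days (0-7)
# each food group was eaten in the past week. Weights and thresholds are the
# standard published ones; variable names and the example are illustrative.
FCS_WEIGHTS = {
    "staples": 2.0,     # cereals, grains, roots and tubers
    "pulses": 3.0,
    "vegetables": 1.0,
    "fruit": 1.0,
    "meat_fish": 4.0,   # meat, fish and eggs
    "milk": 4.0,
    "sugar": 0.5,
    "oil": 0.5,
}

def food_consumption_score(days_eaten: dict) -> float:
    return sum(FCS_WEIGHTS[g] * min(days, 7) for g, days in days_eaten.items())

def fcs_group(score: float) -> str:
    # 21/35 cut-offs; 28/42 are used where oil and sugar are eaten almost daily
    if score <= 21:
        return "poor"
    return "borderline" if score <= 35 else "acceptable"

example = {"staples": 7, "pulses": 2, "vegetables": 4, "fruit": 0,
           "meat_fish": 0, "milk": 0, "sugar": 3, "oil": 5}
score = food_consumption_score(example)
print(score, fcs_group(score))   # 28.0 borderline
```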

We carried out two rounds of data collection: round 1 in March and round 2 in May 2016. In round 1, half of the respondents received a voice call survey and the other half participated in an identical interview face-to-face. The ‘treatment’ (voice call) was assigned at random. In round 2, some of the respondents who had received a voice call took the exact same survey face-to-face, and vice versa.
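In case it helps anyone planning a similar test, this is roughly what the assignment step looks like in code. The household IDs and the sample-frame size below are placeholders, not our actual list.

```python
# Sketch of a randomized crossover assignment for a mode experiment: half the
# households get the voice-call survey in round 1, the rest get the identical
# face-to-face interview, and the modes are swapped in round 2.
# Household IDs and frame size are illustrative placeholders.
import random

random.seed(2016)                                    # reproducible assignment
households = [f"HH{i:03d}" for i in range(1, 401)]   # hypothetical sample frame

voice_first = set(random.sample(households, k=len(households) // 2))
round1 = {hh: ("voice" if hh in voice_first else "face_to_face") for hh in households}
round2 = {hh: ("face_to_face" if m == "voice" else "voice") for hh, m in round1.items()}

print(round1["HH001"], "->", round2["HH001"])
```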

There were challenges relating to security in the POC, and some of the respondents from March could not be found in the camp when we conducted the second round in May. As a result, we had 132 voice and 333 face-to-face interviews in round 1, but 138 voice and only 117 face-to-face interviews in round 2. This sample size is smaller than we would have liked, but we think it’s indicative enough to tell us how responding to a phone survey differs from responding face-to-face.

Calls were placed by operators who were ‘converted’ enumerators: field monitors who usually carry out WFP’s post-distribution monitoring but were new to phone-based surveys. This meant that they were already familiar with the food security indicators and the camp community, but needed training on the protocol for phone-based surveys.

Results

We observed substantial mode effects in round 1. We obtained a mean FCS of 34 via face-to-face surveys, but a much higher score of 45 through voice calls. Our regression analysis shows that mode alone accounted for 7 points of the difference in a household’s response (p<0.01), with other factors accounting for the remainder. This means that a voice survey would inflate the FCS by 20%, leading to a gross underestimation of the severity of food insecurity in the population of interest. During round 1, the voice FCS questions behaved almost like binary variables: we would get 1s and 7s, but very few 2s, 3s, 4s or 5s. In other words, many people said they ate a given food item one day or every day, but few answers in between were recorded.
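For readers who want to see what that kind of model looks like in practice, here is a minimal sketch using statsmodels. The file name and covariates are placeholders rather than our exact specification.

```python
# Sketch of a mode-effect regression: FCS on a voice-survey dummy plus household
# covariates, so the coefficient on `voice` is the estimated mode effect holding
# those characteristics constant. File and column names are illustrative.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("round1_interviews.csv")            # one row per interviewed household
df["voice"] = (df["mode"] == "voice").astype(int)    # 1 = phone survey, 0 = face-to-face

model = smf.ols("fcs ~ voice + hh_size + female_head + displacement_year", data=df).fit()
print(model.summary())
```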

FCS results, round 1


In round 2, the difference between voice calls and face-to-face surveys diminished substantially and was no longer statistically significant. In fact, the slight remaining difference between the two groups was due to the socio-economic profile of respondent households, not to the mode we used to collect data.

 


FCS results, round 2

Lessons learned

For the Food Consumption Score, the differences between voice and face-to-face surveys due to the mode effect were large in round 1 but vanished in round 2. This is a positive finding for us, as we are seeking to rigorously test and validate the data collected through mobile surveys and to report on the results with some degree of confidence. We want to highlight a few lessons here that could help guide others in the right direction.

Lesson 1: Practice makes perfect. We suspect that the poor quality of the data collected in round 1 is due to our call center being brand new and experiencing ‘teething’ problems. When an in-house call center is first set up, it tends to be small scale, comprising one or two operators. Resources permitting (and provided there are growing information needs), the call center may then be expanded with additional operators who receive regular training and coaching. Our analysts have been saying anecdotally that data quality improves as time goes by and the system becomes more established. We have a good illustration of that phenomenon here in South Sudan.

Lesson 2: Close supervision is required! Although our operators were familiar with data collection, it took time to train them to conduct high-quality surveys by phone. This again shows that operator selection, training and supervision are key to obtaining good quality data.

Lesson 3: Work with professional call centers. Overall, this encourages us to continue working with professional call centers when possible, and avoid the temptation to do things in-house in a hurry – something that can be all too tempting in an emergency setting.

We also think the method used in South Sudan could be applied elsewhere to help evaluate mode effects. We will post the survey design on the mVAM Resource Center for others to use.

Prince Charming: A Triplex Tale


Welcome to “Sorland”! (Photo: WFP/Jennifer Browning)

The mVAM team sent a team member, Jen, to Triplex, the largest humanitarian emergency simulation in the world. mVAM was thrilled to join over 400 military, UN, government and NGO participants who travelled to Lista, Norway, for training in how to respond to a humanitarian emergency. In the pre-exercise stage, we presented our work on mVAM, and we hope that our participation will help to increase our engagement with such a diverse group of partners. There were also interesting presentations on shelter, supply chain, data analysis, and new tools. 

Our favorite session was on smart assessments. Lars Peter Nissen, Director of ACAPS, offered important wisdom that we should always strive to follow with mVAM. He warned against getting trapped in your own small study and losing what he termed “situational awareness,” or the bigger picture.

His three rules for humanitarian analysts to live by:

  1. “Know what you need to know.”
  2. “Make sense, not data.”
  3. “Don’t be precisely wrong, be approximately right.”

In thinking about how we can apply these three gems to our work on remote data collection, we need to make a constant effort to collect data that will really help improve humanitarian responses. Like all data nerds, we can sometimes get bogged down in calculating exact bias estimates or making sample size calculations, and risk losing sight of the bigger picture while down in the weeds of a small mVAM survey in one country. But we need to remember to look at the wider situation to ensure we are collecting useful information.


Presenting mVAM (Photo: WFP/Lucy Styles)

Then we need to make sense of our data by triangulating with what others are doing and what we already know. In our mVAM bulletins, we need to communicate clearly in a way that makes data quickly understandable to decision-makers. We need to pay attention to what the trends from our mVAM data are telling us, while not forgetting the limitations of the remote mobile data collection methodology.

After a couple of days of introspection (or, as we would find out later, the calm before the storm), the two-day pre-exercise ended and we embarked on the natural disaster simulation phase. We boarded buses or “flights” and travelled to Base Camp in “Sorland”, a fictional developing country that had just been hit by a hurricane and where the simulation would take place. For the next 72 hours we would do our best to respond, learning along the way.

The organizers made a herculean effort to have the 72 hours be as realistic as possible. We were sleeping in (admittedly high tech) tents and crossing a road jammed with huge supply trucks and lines of land rovers. The scale was impressive. Prince Harry even flew a helicopter in to observe the exercise and play the role of a Minister from the Sorland government. The organizers couldn’t have planned it, but at one point, the winds became dangerously high, almost making it necessary to really evacuate us.


The Minister of “Sorland” played by Prince Harry (Photo: WFP/Jennifer Browning)

In these conditions, as in any real-life emergency, it was inevitable that we would run into problems. We had planned to deploy mVAM quickly. The organizers had provided us with a list of phone numbers of IDPs in “Sorland,” actually students from the United Nations University in Bonn who did a great job role-playing throughout the simulation. We wanted to contact them via SMS using Pollit, the SMS survey tool developed by InSTEDD that we run in-house. We have used Pollit successfully in Goma to collect food prices, but for Pollit to work, you need a WiFi connection. (For more on Pollit, see our blog entries Pollit Customized and Ready to Go and Working with DRC Youth to Text Back Market Prices.) At Triplex, WiFi was supposed to be up and running the first evening, but conditions on the ground made it difficult to establish a connection. We didn’t get WiFi until the last night of the exercise, which was too late for us to use Pollit.

So instead, we participated in OCHA-led face-to-face surveys and in focus group discussions. Sometimes we get so caught up in remote data collection that these other data collection exercises can fall off our radar screen, but there is so much we learn from talking to local communities face-to-face and from coordinating with other partner agencies as they plan their own data collection. So perhaps because WiFi was such a problem, Triplex turned into a great experience to keep our coordination and face-to-face data collection skills sharp.


The Logistics Cluster explains access constraints (Photo: WFP/Ricardo Gonzalez)

In addition to collaborating with different organizations, working within a diverse team of WFP colleagues from different units pushed us to consult closely and understand what information they needed most. At WFP headquarters, we don’t generally have the same opportunity to work this closely on a daily basis with colleagues from other branches like logistics, procurement, and cash-based transfers. As WFP considered a potential cash-based transfer response for the fictional Sorland, it became clear that operationally, information on market functioning and food availability was very important. This meant that  while we were not able to use existing mVAM tools per se, we recognized clear demand within WFP to address this critical information gap. For next time, we will keep these information needs, i.e. “knowing what we need to know,” clearly in mind. And we’ll also make sure to prepare for all types of scenarios, think about the limitations of our technology, and do our best to have a Plan B.

Even without WiFi and Pollit, the Triplex simulation ended up being very relevant and provided a great brainstorming session for what came later. During the 72 hour simulation, colleagues from Haiti and Cuba were receiving increasingly grim alerts about the approach of Hurricane Matthew. Through Triplex, we’d already identified some of the information that could be most relevant in responding to a hurricane. So our practice in Sorland turned out to be very useful in quickly deciding what questions to ask in Haiti where we are rolling out a remote market assessment. Stay tuned for more!

 

Mobile Tech for Mobile IDPs in DRC

WFP food distribution for IDPs in Mugunga camp. Photo: WHO/Christopher Black

We’ve been writing a lot about how mobile technologies give us new opportunities to track food security. As WFP, we provide food assistance to many refugee and IDP camps. But right now, our knowledge often stops at the camp border.  What happens to refugees or IDPs when they leave the camp? And importantly for WFP, what happens to their food security situation? Mobile surveys could provide a key to this mystery.

Mobile Surveys and IDP Flows


Jean-Baptiste Pasquier

Jean-Baptiste, our brilliant young colleague, wrote his master’s thesis on precisely these questions. He looked at almost two years of data collected by mVAM since December 2013 in Mugunga III camp, 10 km from Goma, DRC. Approximately 4,664 people live in Mugunga III, and since many people didn’t have phones, we distributed phones to 340 randomly selected households so they could participate in a phone survey.

Every month, our WFP operators, Mireille and Jean-Marie, have been diligently calling these same households. They’ve been asking households about their food consumption and any coping strategies they’ve had to resort to when short of food. These questions let us calculate two key food security indicators: a household’s Food Consumption Score (FCS) and reduced Coping Strategies Index (rCSI). It’s also given Jean-Baptiste some pretty good data to play with. (For more on our work in DRC, see our blogs on our DRC launch, our market monitoring, and our two-way communication system with camp residents.)
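As with the FCS, the rCSI calculation is simple once you have the answers; a minimal sketch is below. The five strategies and their severity weights are the standard ones, while the shorthand labels are ours.

```python
# Reduced Coping Strategies Index: days (0-7) each of five standard strategies was
# used in the past week, multiplied by the standard severity weights and summed.
# Higher values mean more severe coping. Labels are illustrative shorthand.
RCSI_WEIGHTS = {
    "less_preferred_food": 1,   # relied on less preferred / less expensive foods
    "borrowed_food": 2,         # borrowed food or relied on help from friends/relatives
    "limited_portions": 1,      # limited portion sizes at mealtimes
    "adults_ate_less": 3,       # restricted adult consumption so children could eat
    "reduced_meals": 1,         # reduced the number of meals eaten per day
}

def reduced_csi(days_used: dict) -> int:
    return sum(RCSI_WEIGHTS[s] * min(d, 7) for s, d in days_used.items())

print(reduced_csi({"less_preferred_food": 5, "borrowed_food": 1,
                   "limited_portions": 3, "adults_ate_less": 0, "reduced_meals": 2}))
# 5*1 + 1*2 + 3*1 + 0*3 + 2*1 = 12
```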


Mireille and Jean-Marie review a call script. Photo: WFP/Marie Enlund

In Mugunga III, like most IDP camps, the population is always in flux and hard to track. People come and go without officially notifying the camp administration. In IDP speak, a “returnee” is someone who has left the camp (and in theory “returned” home though in practice the person might just have gone to live somewhere else). In March 2015, we started asking people about whether they were “returnees” and if so, where they had gone.

By using our mVAM data, Jean-Baptiste was able to pick up changes in camp population that were not captured by official figures and to track where people went. Most returnees reported staying in areas near the camp. Few were returning home to Masisi, where over half of the IDPs in Mugunga III were from but where there was still conflict.

IDP Flows and Food Security

We don’t just want to know where returnees go; we want to know how they are doing. Usually, once IDPs leave the camp, they fall off our radar screen and we have no more information. But with mVAM surveys, returnees continued to respond to our calls asking about their household food security situation. Jean-Baptiste decided to see whether there was any difference in the food security situation between returnees and IDPs who remained in the camp.

Sure enough, there was a difference. Returnees had better food consumption on average than IDPs who were in the camp.

But you might be wondering whether returnees were doing better even before they left the camp. Jean-Baptiste found that yes: on average, returnee food consumption climbed in the months before departure. It also improved more over time than that of IDPs who stayed in the camp; maybe their situation was improving so much that it allowed them to leave the camp.


Then, Jean-Baptiste went a step further. Maybe returnee households were simply different from IDPs who stayed in the camp. But he found that even after controlling for those differences (for stat geeks: using a fixed effects model), leaving the camp was associated with an estimated increase of 7.64 points in the Food Consumption Score, equivalent to a 27% increase over the average score of an IDP currently in the camp.
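For the stat geeks again: here is a minimal sketch of what a household fixed-effects specification can look like, with hypothetical file and column names. Jean-Baptiste’s actual model is laid out in his thesis.

```python
# Sketch of a household fixed-effects regression on the monthly panel: FCS on an
# indicator for having left the camp, with household and month fixed effects
# absorbing time-invariant household differences and common monthly shocks.
# File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("mugunga_monthly_panel.csv")   # household_id, month, fcs, left_camp

fe = smf.ols("fcs ~ left_camp + C(household_id) + C(month)", data=panel).fit()
print(fe.params["left_camp"])   # within-household change in FCS associated with leaving
```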

Needless to say, all these findings could have a lot of implications for our programmes. Our office in DRC is looking into it.

Also, if we’ve piqued your interest, read Jean-Baptiste’s excellent thesis here.

Our 5 hacks for mobile surveys for 2015


An mVAM respondent in Mugunga III camp, DRC.

  1. Gender matters. Design and run your survey in a way that promotes women’s participation. With mobile surveys, it’s hard to get as many women to respond as men. Make sure you’re calling at the right time and that you provide incentives. We also recommend having women operators. For more of our thinking on gender in mobile surveys, check out our blog entry on gender issues in West Africa.
  2. Validate mobile data against face-to-face data. Your mobile survey results may differ significantly. In many contexts, cell phone penetration has not reached the most vulnerable groups. In DRC, we had to provide phones to Internally Displaced Persons (IDPs) and access to electricity; to learn more, check out our video and our blog entry. But it’s not always possible to distribute phones, so it’s important to check your results against other data sources. Also, people get tired of answering their phones all the time, so attrition and low response rates will affect your results.
  3. Mind the mode! Your results will differ according to whether the survey is done through SMS, IVR, or live calls by an operator. Live calls have the highest response rates, but you have to be ready to pay. For simpler data, we have found that SMS is effective and cheap. Just remember: the context matters. SMS is working well for nationwide surveys, even in countries where literacy rates are not that high; check out our recent results in Malawi. However, SMS can be a problem in communities where literacy rates are very low or familiarity with technology is limited, as we found in DRC IDP camps. For Interactive Voice Response (IVR) surveys that use voice-recorded questions, the jury is still out on their usefulness as a survey tool. IVR didn’t work as well as SMS in Liberia, Sierra Leone, and Guinea during the Ebola crisis (HPN June 2015). But IVR has potential as a communication tool to push out information to people. Check out our entry on our two-way communication system where we use IVR to send distribution and market price information to IDPs in DRC.
  4. Keep the survey user friendly and brief. Always keep your survey short and simple. Stay below 10 minutes for voice calls, or people will hang up. If you are texting people, we don’t recommend much more than 10 questions. Go back to the drawing board if respondents have trouble with some of your questions. With mobile surveys, you don’t have the luxury of explaining everything as with in-person interviews. It might take a few rounds to get it right. When we want food prices, we’ve found we need to tweak food items and units of measurement in Kenya and DRC to best capture what people buy in local markets. Again, short and sweet should be the mobile survey mantra.
  5. Upgrade your information management systems. There is nothing as frustrating as collecting a lot of great data without being able to manage it all! Standardize, standardize, standardize! Standardize questions, answer choices, variable names, and encoding throughout questionnaires. Automate data processing wherever possible. Also, you’ll be collecting phone numbers. This is sensitive information, so make sure you have the correct confidentiality measures in place. Check out our Do’s and Don’ts of Phone Number Collection and Storage and our script for anonymizing phone numbers (an illustrative sketch of the idea follows this list). Finally, share your data so others can use it! We’re posting our data in an online databank.
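As promised above, here is an illustrative sketch of the phone-number anonymization idea: a keyed hash, so the same number always maps to the same pseudonym but the raw number never appears in shared datasets. This is not our actual script, just the principle; key handling and your organisation’s data-protection rules still apply.

```python
# Illustrative phone-number pseudonymization with a keyed hash (HMAC-SHA256):
# the same number always yields the same identifier, but it cannot be reversed
# without the secret key. Not the mVAM script, only a sketch of the principle.
import hashlib
import hmac
import os

SECRET_KEY = os.environ["PHONE_HASH_KEY"].encode()   # keep the key outside the dataset

def pseudonymize(phone_number: str) -> str:
    digits = "".join(ch for ch in phone_number if ch.isdigit())
    return hmac.new(SECRET_KEY, digits.encode(), hashlib.sha256).hexdigest()[:16]

print(pseudonymize("+243 99 123 4567"))   # stable pseudonym, no raw number stored
```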

 

 

Will IVR work for food security surveys in a Somalia IDP camp?

As the mVAM pilot project enters its final quarter, the team is focusing on finalizing all planned activities, while documenting learning that will allow us to scale up with a strong evidence base. This month’s highlights include some hands-on work with the team in Somalia, and the launch of a comprehensive review of our activities.

The Somalia IVR coming along
A key question we have is whether interactive voice response (IVR) surveys are user friendly enough to be used in Somalia with the vulnerable groups that WFP works with. The major issue to resolve was ensuring that the IVR system, Verboice, in our Galkayo field office was fully operational. Although we had been able to place some IVR calls, the system required dedicated attention. In mid-January, Marie and Lucia headed to Galkayo to meet with the team for a troubleshooting mission.
Thanks to late night remote support from Gustavo at INSTEDD, bugs were ironed out, and we were soon able to get our first complete IVR surveys using a Somali language questionnaire. The team in Galkayo was trained on how to place the calls and will be following a plan to scale up IVR calls in February. Meanwhile, we will continue collecting food security data through calls placed by our operators, a modality that has worked well to date.


Making the IVR operational: training underway

During the visit, a key discussion took place regarding appropriate incentive rates. In both DR Congo and Somalia, respondents receive a token of appreciation from WFP in order to promote participation in surveys. The amount we provide, USD 0.50 per call, is equivalent to 5 minutes of airtime. While our respondents in DR Congo seem thrilled to receive this amount of airtime, the question of increasing the incentive has come up in Somalia, where it is perceived as too small.

There seem to be three schools of thought in the team. Some believe the incentive should increase in Somalia. Others think that increasing call attempts and better sensitizing respondents should be sufficient to ensure good response to our surveys. Others still question the principle of providing an incentive to people who might already receive food assistance from WFP.

In coming months, we will be making sure respondents are called more often and that the messages they receive tell them about the importance of their participation. We would then consider working with a larger incentive in the future should response rates not improve.

Launching the mVAM review

A critical milestone of the project is capturing and sharing learning. In order to proceed with scale-up strategically and responsibly, the review of the mVAM pilot in Somalia and DRC is now ongoing. Professor Nathan Morrow, who teaches at Tulane University’s Payson Center, is leading the review. Nathan has written extensively about technology in the humanitarian world, including a review of Ushahidi’s contribution to the 2010 earthquake response in Haiti.

In January, Nathan traveled to Goma, DRC, to meet with WFP staff, key stakeholders, and beneficiaries residing in the Mugunga 3 IDP camp to hear from them how the pilot was going and to document their questions and concerns. He will also be chatting with staff in Somalia and the three EVD-affected countries to learn how they view the project.

The review will document the demonstrated potential of mVAM at a larger scale; note areas where our technology can be improved; and explore how mVAM’s approach fits within the broader humanitarian sector’s work. Results will be available in the spring.

Can we use SMS for food security surveys in a Congolese IDP camp?

Blog entry originally posted in December 2014 on the Humanitarian Innovation Fund website.



Response rates to voice calls, DR Congo. Source: WFP

Almost one year into data collection, we are now fairly confident that live voice calls, placed by operators, are a good way to stay in touch with people in the extremely vulnerable communities we work with. Since January 2014, we have been able to conduct monthly rounds of phone surveys, typically reaching between half and two-thirds of selected respondents, while collecting data of good quality. However, it’s not yet clear whether either IVR or SMS offers the same advantages in our pilot contexts.

SMS: cool tool, wrong setting?

This month, we attempted to understand whether SMS surveys would work in an IDP camp. Using SMS is attractive because it is low-cost and easy to automate using free software. While we have had good results with SMS when running simple national or province-level food security surveys, we have yet to evaluate the tool’s suitability in a high-vulnerability camp setting. In November, two Rome-based mVAM team members, Marie and Lucia, travelled to Goma to attempt to do just that. They helped the team in Goma organize a simple food security survey involving face-to-face interviews, live voice calls and SMS. The data collected from this exercise will allow us to understand the strengths and weaknesses of these different survey tools.


Residents in a Congolese refugee camp responding to text messages

In order to run the SMS survey, we used Pollit, a free, open-source tool. It is easy to access and requires only minimal hardware to function: a computer, a mobile phone and an internet connection. The tool was developed by InSTEDD, the same company that developed Verboice, the programme we are using for IVR calls. In the future, Pollit may allow us to periodically run short SMS surveys in-house, bringing a lot of flexibility to our field teams. During the Goma test, Pollit proved to be a simple and flexible tool. It was easy to set up and worked smoothly during the six days of data collection.

However, response rates to SMS surveys turned out to be low, particularly compared to voice calls and face-to-face surveys. Our enumerators reported that people in the camp are not used to using the SMS function on their phones. They typically communicate using voice calls, due to low literacy and habit. In some cases, the phones people owned were broken or had dirty screens, making it difficult to read and reply to the messages we were sending.  These issues, however, do not prevent us from using voice calls, which seem to be the preferred modality amongst respondents in DR Congo. This seems to suggest that we should stick to live calls for Mugunga 3 camp, and use SMS questionnaires in other settings. We are now analyzing the data we collected in Goma in order to answer other questions we have, which includes comparing data quality for the different survey modes. We’ll be sure to share those insights later.

Listening in to Central Somalia: tracking food security and livelihood indicators for IDPs

Blog entry originally posted in August 2014 on the Humanitarian Innovation Fund website.


After three survey rounds in Somalia, the time has come to take a look at results.  As in the Democratic Republic of Congo, our operators in Somalia have been conducting live interviews by phone from a call center established in a WFP field office.  In Somalia, we ask displaced people living in camps in Central Somalia about their food consumption and the coping strategies they use.

Food consumption score: degradation as the lean season progresses

The data collected suggest that food consumption deteriorated slightly during the period. In May, 12.6% of respondents had a poor or borderline food consumption score (3.6% poor and 9% borderline), increasing to 17.4% in June (6.3% poor and 11.1% borderline) and 17.6% in July (6.8% poor and 10.8% borderline). The change is statistically significant (p=0.00) and matches the expected seasonal pattern of declining food consumption as the lean season advances.

The period covered by the phone surveys spans the end of the Gu rainy season (April to June) and the beginning of the Hagaa dry season (July to September). Levels of food insecurity are usually higher for the community during the lean season, when income opportunities are low and food prices are high. This is reflected in the results of our survey, which register a decrease in the food consumption score over the months of June and July.

Nonetheless, our phone rounds do show that fewer households were consuming a ‘poor’ or ‘borderline’ diet in May-June-July than at baseline in September 2013. This could be due to an increase in assistance, coupled with improved economic access to food. Indeed, the prices of imported rice and wheat flour, the preferred cereals consumed in Central Somalia, declined from 24,000 and 18,000 shillings per kilo respectively in September 2013 to 16,000 shillings per kilo in July 2014.


Figure 1: Food Consumption Score: increase of ‘poor’ and ‘borderline’ scores in May, June and July. Source: WFP phone surveys, Somalia, 2014

Coping strategies: decline of emergency strategies in July

In order to understand how people’s livelihood strategies are changing, we classified the coping strategies we collected into three categories: stress, crisis and emergency. The ‘stress’ category includes purchasing or borrowing food on credit, spending savings, engaging in casual labor and withdrawing children from school. ‘Crisis’ refers to selling non-productive assets such as radios or furniture. Finally, ‘emergency’ strategies capture begging and the sale of productive assets.
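Concretely, the classification is just a lookup from each reported strategy to its category. The sketch below tags each household by the most severe category it reported, which is one common convention; the strategy labels are shorthand for the survey response options.

```python
# Sketch of the stress/crisis/emergency classification described above. Households
# are tagged here by the most severe category they reported, one common convention.
# Strategy labels are illustrative shorthand for the survey response options.
STRATEGY_CATEGORY = {
    "bought_food_on_credit": "stress",
    "spent_savings": "stress",
    "casual_labour": "stress",
    "withdrew_children_from_school": "stress",
    "sold_non_productive_assets": "crisis",       # e.g. radio, furniture
    "begged": "emergency",
    "sold_productive_assets": "emergency",
}
SEVERITY = {"none": 0, "stress": 1, "crisis": 2, "emergency": 3}

def household_category(strategies_reported: list) -> str:
    categories = [STRATEGY_CATEGORY[s] for s in strategies_reported] or ["none"]
    return max(categories, key=SEVERITY.get)

print(household_category(["spent_savings", "sold_non_productive_assets"]))   # crisis
```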

As shown in Figure 2, the percentage of people implementing ‘stress’ and ‘emergency’ strategies increased between May and June, from 13% to 17% and from 53% to 58% respectively. These results show that people were resorting to more coping strategies to get through the hunger period, which corroborates the food consumption score findings for the same period.

Nevertheless, recourse to ‘emergency’ strategies in July, while still high (41%), was lower than in both May and June. This could be due to increased social support to deprived people (including IDPs) in the run-up to, during and at the end of the month of Ramadan.


Figure 2: Percentage of people who engaged in stress, crisis and emergency strategies. Source: WFP phone surveys Somalia, 2014

To date, response rates to our calls in Somalia have stood between 57% and 63%. This level of response is reasonable considering we never distributed phones in Somalia, a somewhat labor-intensive exercise we have had to go through in the eastern Democratic Republic of Congo. We do see an erosion in response rates, a trend we have seen in other countries, and we are discussing ways to maintain them, such as recruiting new respondents or being more flexible in the way we contact our respondents and keep them engaged.


Figure 3: Response rates, Somalia Source: WFP phone surveys Somalia, 2014

For now, our operators are successfully completing at least 250 calls a month. Interestingly, women-headed households represented 90% of our sample in Somalia during the first three rounds. We hope to share more insights on gender and food security in a later post.