We have had success using crowdsourcing to collect food prices by SMS in the refugee camps of Kenya. That experience made us curious about trying out other methods that could help us deliver data quickly and efficiently from the remote and hard-to-reach geographies where WFP works.
We found out about a startup that specializes in crowdsourced data collection. The sales pitch: anonymous ‘contributors’ would carry out simple data collection tasks through a dedicated smartphone app. Intrigued, we decided to pilot the system to monitor food prices in a drought-affected area of Southern Africa, hoping to use the data to complement what traditional information systems produce. What did we learn?
The anonymous ‘citizen reporter’ is a myth. The company we worked with had to go through local organizations, such as NGOs, to find people able to collect the data for us. This is a far cry from the vision of sourcing data from an anonymous crowd. There is more to finding contributors than putting out some ads on social media and magically reaching the masses. Our contributors were not really anonymous and were easily identified by traders. In the end, the activity looked a lot like traditional tablet-based data collection. The World Bank found the same thing when they contracted a private company for crowdsourcing; you can find more on their experience here.
Getting started is labor-intensive. It’s a learning process for both your organization and the company, and that means investing significant staff time. On our side, since we were unfamiliar with the methodology, there were many iterations as we attempted to specify commodity types and data types. This is perhaps surprising, because we at WFP have been collecting food prices for a long time. It turns out we needed to revisit our commodity lists and specify unit measures — a process that required patience. On the company’s side, limited experience in the geographies of the pilot led to an overestimation of what was possible and how quickly.
Expect long ramp-up times. Reaching the data volumes we wanted took months, because that time was needed to set up the system and recruit local contributors. Our roll-out was planned this way, but do not expect an army of anonymous contributors to materialize out of thin air.
It’s still hard to reach remote places. The crowdsourcing model is no silver bullet when it comes to reaching the remote places we were interested in monitoring. It proved hard, or even impossible, to source enough data from the more remote markets through the crowdsourcing service. This is perhaps due to low smartphone penetration in remote locations, the high cost of sending a contributor to such places, or poor connectivity. By contrast, collecting data from larger urban areas was much easier.
High costs are a barrier to handover in resource-poor environments. It became clear that the activity cost more than lower-tech alternatives. WFP works to enable the handover of information systems to national authorities and other local partners, and for the moment, app-based crowdsourcing is likely out of the financial reach of those partners.
After some trial and error, we were able to obtain good-quality crowdsourced data that was helpful to our field offices. Ultimately, however, we returned to our mVAM strategy – calling traders each week to collect food prices. Although this approach can’t cover as many commodities as the company’s crowdsourcing did, it has its own advantages. It’s lower tech – there is no fancy app to download. There is no faraway company to deal with. Above all, it’s an approach we can hand over to our local partners.