MCC’s First Open Data Challenge - Learning beyond the Independent Evaluations
September 16, 2014
MCC has just announced its first Open Data Challenge – a call to action to master's and PhD students working in economics, public policy, international development, or other related fields who are interested in exploring how to use publicly available MCC-financed primary data for policy-relevant analysis.
The release of this data is intended to facilitate broader use, above and beyond the scope of the independent evaluations that produced it. Since the challenge was announced at the end of August, one question for MCC has been: what type of additional learning is the agency interested in?
During the release of MCC’s first five impact evaluations in farmer training, there was a lot of learning and soul-searching going on within the agency. Sure, some of the evaluations pointed to positive, expected results, like increases in farm incomes in the El Salvador dairy, Ghana northern farmers’, and Nicaragua farmer training programs, but there were many unexpected results as well. Why didn’t we see increases in farm income in Armenia? What were the unique characteristics of farmers selected for training in Honduras? What led to the differential impacts in Ghana? And, the big question: why weren’t the increases in farm income leading to observable increases in household income?
With this in mind, the MCC agriculture team took some time to ask themselves what additional learning they would have liked to see beyond what was analyzed in the independent evaluations. There were three broad categories of additional potential learning:
- Understand better what led to observable, expected impacts. For example, in Ghana, the team was left asking:
- Why were impacts positive in the North but negative overall? Was this related to the differing agro-climatic context? Did impacts differ by crop type? Was this a measurement problem, or a problem with the timing of measurement?
- Understand better what led to observable, counter-intuitive impacts. In some of the evaluations, the evaluators found counter-intuitive impacts. For example, in Ghana:
- In the North, crop incomes were up by 78 percent and land under cultivation was up by 32 percent, yet there was no significant increase in yields. How is that possible? Did treatment farmers plant a different mix of (higher-value) crops than control farmers?
- Understand better the impacts on project implementation, secondary outcomes, and positive/negative externalities. In many cases, analysis was limited to the primary evaluation questions and outcomes agreed upon for the purpose of the evaluation. However, the initial analysis produced from the evaluations raised many questions that could potentially be answered by further analysis of the same data. For example, in Honduras:
- The evaluator makes a strong case that the farmers selected for the program were fundamentally different from the ‘average’ farmer who would have been selected through a random selection process. Additional analysis of who these treatment farmers were and how they differ from the average farmer, and especially from farmers living in the comparison areas where farmer training was not made available, would be useful for understanding and interpreting the results of the evaluation.
In all of these evaluations, MCC also recognized the need to explore:
- Gender-disaggregated impacts. Many of these evaluations were designed prior to MCC’s policy requiring gender and other relevant disaggregation of data and impacts. Can the available data be used to produce gender-disaggregated impacts in Armenia, El Salvador, Ghana, or Nicaragua?
- Assets and investments. While overall household incomes did not increase, can the available data show whether households increased their investments in assets or made other investments during the evaluation period?
While the learning from these evaluations should not be undervalued, MCC is eager to fully explore the potential for further learning from the existing data, in order to answer outstanding questions about how to design more effective agricultural investments and how to improve the evaluation of those investments. We hope the Open Data Challenge is one way to motivate external researchers to use available resources to start answering these questions.