Summary of Findings
Measuring Results of the El Salvador Productive Development Project
June 2, 2014
In Context
The MCC compact with El Salvador was a five-year investment (2007-2012) of $460.9 million in three projects: connectivity, human development and productive development. The Compact’s goal was to advance economic growth and poverty reduction in the Northern Zone of El Salvador. The $68 million Productive Development Project included three activities implemented concurrently in the Northern Zone: (i) production and business services, (ii) investment support and (iii) financial services. The subjects of the evaluations summarized here are the $56 million Production and Business Services Activity and the $8 million Investment Support Activity. Together, these two activities represent 94% of the overall Productive Development Project investment and 14% of the total compact. The Financial Services Activity final evaluation is forthcoming.
These figures are based on MCC obligations as of September 2012.
Program Logic

The Productive Development Project was designed to transition producers to higher-profit activities, generate new investment, expand markets and sales, and create new jobs in ways that stimulate sustainable economic growth and poverty reduction. The Production and Business Services Activity included ongoing technical assistance and training, in-kind donations (starter kits), demonstration plots, and technical and financial support for enterprises created and supported by the project in targeted value chains. The Investment Support Activity provided investment capital to competitively selected applicants who, due to insufficient collateral and a lack of liquid assets, were unable to finance investments in business activities located in, and benefiting poor inhabitants of, the Northern Zone. The Financial Services Activity provided guarantees to support increased lending by banks and non-bank financial institutions in the Northern Zone.
It was envisioned that the three activities would work together – a portion of Production and Business Services participants would have access to business planning services, investment capital or guaranteed loans through the Investment Support Activity or the Financial Services Activity. The capital and loans would help producers transition to high-value crops and finance new production technologies such as greenhouses and irrigation systems.
There were several key assumptions underlying the program logic:
- Content and duration of training are sufficient to trigger behavior change.
- Starter kits/in-kind donations are sufficient to trigger sustained behavior change.
- Producers have necessary access to credit through existing structures supported by the Investment Support Activity or Financial Services Activity.
- The primary barriers to adoption of improved techniques are a lack of knowledge and/or funds for investment.
- Adoption of improved techniques leads to an increase in productivity.
- Increases in productivity lead to increases in productive income which, in turn, lead to an increase in overall household income.
It is important to note that the design of the Production and Business Services (PBS) Activity was modified over the course of the compact. The first phase (Phase I) of assistance, implemented in late 2009 and 2010, focused on technical assistance with productive activities—particularly milk production in the dairy chain, vegetable production in the horticulture chain, and wood- and clay-based handicraft production in the handicraft chain. In late 2010, PBS was modified in response to lessons learned during Phase I—namely, that increased and more diversified production was not sufficient to guarantee higher sales and income among participating producers. As such, the second phase (Phase II) of assistance featured more explicit marketing and business development components, including the establishment of two new producer-owned enterprises in the horticulture and dairy chains and the strengthening of three pre-existing producer-owned enterprises in the handicrafts and dairy chains. The impact evaluation design for PBS—developed and initiated by stakeholders in 2009—did not anticipate these modifications to PBS assistance in late 2010. Partly for this reason, Mathematica conducted a final performance evaluation that documented and assessed Phase II assistance to farmers and producer-owned enterprises (see Final Performance Evaluation section below).
Measuring Results
MCC uses multiple sources to measure results, including monitoring data collected during Compact implementation and independent evaluations, which in many cases continue post-compact. Monitoring data is typically generated by the implementers and specifically covers the ‘treatment’ group of farmers who received training under the Compact.
The table below includes the monitoring indicators that were tracked during implementation of the two activities. The Financial Services Activity indicators will be added when that evaluation is presented.
Indicators | Level | Actual Achieved | Target | Percent Complete |
---|---|---|---|---|
Production and Business Services Activity | | | | |
Farmers who have applied improved techniques | Outcome | 11,520 | 7,000 | 165% |
Enterprises that have applied improved techniques | Outcome | 164 | 114 | 144% |
Enterprises assisted | Output | 272 | 292 | 93% |
Farmers trained | Output | 15,363 | 10,465 | 147% |
Participants of technical assistance and training – non-agriculture | Output | 2,104 | 3,035 | 69% |
Hectares under production with support from the Productive Development Project | Output | 25,399 | 15,000 | 169% |
Investment Support Activity | | | | |
Loan Borrowers | Output | 29 | N/A | N/A |
Loan Borrowers (female) | Output | 5 | N/A | N/A |
Amount of Investment Support Fund approved (US$) | Output | 7,505,299 | 8,500,000 | 88.3% |
Number of loans executed by the Investment Support Fund | Output | 30 | N/A | N/A |
Number of loans approved by the Investment Support Fund | Output | 44 | 35 | 125.7% |
The average completion rate across the output and outcome indicators with targets is 125 percent, and for 5 of the 8 indicators with targets, the targets were met or exceeded. It should be noted that these numbers are not always the same as the evaluation results: in addition to not taking the “without project” scenario into account, as described below, the monitoring data come from different data sources, data collection instruments, and samples of respondents.
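For readers who want to reproduce the summary figures above, the short Python sketch below recomputes the average completion rate from the eight indicators that have targets in the monitoring table; it is purely illustrative and uses only the values already shown.

```python
# Illustrative check of the summary statistics quoted above, using the
# actual/target values from the monitoring table (indicators without
# targets are excluded).
targets = {
    "Farmers applying improved techniques": (11_520, 7_000),
    "Enterprises applying improved techniques": (164, 114),
    "Enterprises assisted": (272, 292),
    "Farmers trained": (15_363, 10_465),
    "Non-agriculture TA/training participants": (2_104, 3_035),
    "Hectares under production": (25_399, 15_000),
    "Investment Support Fund approved (US$)": (7_505_299, 8_500_000),
    "Loans approved by the Investment Support Fund": (44, 35),
}

completion = {name: actual / target for name, (actual, target) in targets.items()}
average = sum(completion.values()) / len(completion)
met_or_exceeded = sum(rate >= 1.0 for rate in completion.values())

print(f"Average completion rate: {average:.0%}")                            # ~125%
print(f"Targets met or exceeded: {met_or_exceeded} of {len(completion)}")   # 5 of 8
```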
Monitoring data is limited in that it cannot tell us what these farmers would have done in the absence of the MCC-funded training, credit, or technical assistance. For example, when implementers report that farmers have exceeded targets for adoption of new techniques, we do not know whether these farmers adopted because of the training or would have adopted without it. This is a key reason why MCC invests in independent impact evaluations, which estimate a counterfactual – what would have happened in the absence of the investment. For some activities, impact evaluations are not feasible or cost-effective, and in those cases MCC invests in independent performance evaluations. The evaluations for the Productive Development Project combine impact evaluations and performance evaluations.
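As a stylized illustration of this distinction, the sketch below uses hypothetical adoption rates (not actual evaluation data): monitoring alone reports the change among trained farmers, while a counterfactual-based impact estimate nets out the change observed in a comparable untrained group.

```python
# Hypothetical adoption rates (share of farmers using improved techniques);
# these numbers are illustrative only and are not taken from the evaluation.
treatment_before, treatment_after = 0.20, 0.60   # farmers who received training
control_before,   control_after   = 0.20, 0.35   # comparable farmers without training

# What monitoring data alone can show: the change among trained farmers.
monitoring_change = treatment_after - treatment_before                                      # +0.40

# What an impact evaluation estimates: the change relative to the counterfactual,
# here a simple difference-in-differences against the comparison group.
impact_estimate = (treatment_after - treatment_before) - (control_after - control_before)   # +0.25

print(f"Change among trained farmers: {monitoring_change:+.0%}")
print(f"Estimated impact of training: {impact_estimate:+.0%}")
```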
Component | Evaluation Type | Methodology |
---|---|---|
Production and Business Services Activity | Impact (Interim); Performance (Final) | Interim Impact: Randomized Roll-out; Final Performance: Pre-Post |
Production and Business Services Activity – Handicrafts | Impact (Final) – Forthcoming | Randomized control trial |
Investment Support Activity | Performance (Interim); Performance (Final) – Forthcoming | Ex-Post |
Financial Services Activity | Performance (Final) – Forthcoming | Ex-Post |
Evaluation Questions
The evaluations of the Productive Development Project were customized for each activity and were designed to answer the following questions:
Component | Evaluation Questions |
---|---|
Production and Business Services Activity | Interim Impact: … Final Performance: … |
Investment Support Activity | Interim Performance: … |
Evaluation Results
Productive Development Project Overall
The Productive Development Project evaluations were not designed to quantitatively examine the overall effects of the combined project. However, the performance evaluations provide some insights into how the two activities highlighted here interacted. By July 2011, at least 15 PBS participants were approved for loans out of a total of 44 approved loans (34 percent). This was not the level of interaction originally envisioned between the two activities. Stakeholders generally cited the minimum loan amount of $50,000 under the Investment Support Activity as a primary reason for the lack of integration between Production and Business Services assistance (which generally served small, poor producers) and the Investment Support Activity (which generally served small- and medium-scale business owners).
Production and Business Services Activity—Interim Impact Evaluation
Although most output and outcome targets for the Production and Business Services Activity were met or exceeded, the independent evaluation found varied results across the three value chains. In dairy, the evaluation estimates impacts on adoption and increases in farm income. In horticulture, the evaluation estimates impacts on adoption, but no impacts on farm income. In handicrafts, the evaluation estimates impacts on employment for program participants, but no impacts were detected on productive income. It should be noted that the horticulture evaluation was underpowered because only about 30 percent of the treatment group enrolled in the training program. This limits the ability to draw conclusions about ultimate impact, though the evaluation still provides ample opportunities for learning. In handicrafts, additional follow-up data will provide more information on whether the increase in employment led to an increase in productive or household income. The interim evaluation results below capture the phase of training that occurred from 2010 to 2011.
Evaluator | Mathematica Policy Research |
---|---|
Methodology | Randomized roll-out |
Evaluation Period | 12 months |
Adoption and employment | For the 2010-2011 phase of dairy implementation: … For the 2010-2011 phase of horticulture implementation: … For the 2010-2011 phase of handicrafts implementation: … |
Farm Income | For the 2010-2011 phase of dairy implementation: … For the 2010-2011 phase of horticulture/handicrafts implementation: … |
Household Income | For the 2010-2011 phase of dairy implementation: … For the 2010-2011 phase of horticulture/handicrafts implementation: … |
Production and Business Services Activity – Final Performance Evaluation
Due to the changes in project design in the middle of implementation and the low participation of the treatment group in the horticulture evaluation, MCC cancelled the final data collection rounds for the PBS impact evaluation (for horticulture and dairy; the handicrafts analysis is ongoing) and decided to conduct a final performance evaluation instead. The final performance evaluation cannot provide quantitative estimates of outcomes achieved by the activity; however, it provides insights into implementation facilitators and barriers, as well as the potential sustainability of the enterprises supported under the project.
Evaluator | Mathematica Policy Research |
---|---|
Methodology | Pre-Post |
Evaluation Period | 2007 to 2012 |
Implementation Facilitators and Barriers | … |
Production, Employment, Sales | … |
Sustainability of Enterprises | … |
Investment Support Activity—Interim Performance Evaluation
The Investment Support Activity fell short of its original lending targets; however, interviewed credit recipients appear to have experienced improved outcomes compared to non-recipients. These results, however, are anecdotal because the evaluation does not have a valid comparison group for loan recipients or a sufficiently large sample size to attribute differences in outcomes to the credit. In addition, as most borrowers are still completing the investments in their business plans, more detailed information on sales, income, and employment will be collected in a future survey round.
Evaluator | Mathematica Policy Research |
---|---|
Methodology | Ex-Post |
Evaluation Period | 2007 to 2011 |
Implementation | … |
Results | … |
Lessons Learned
MCC released impact evaluations of farmer training activities in five countries in October 2012. Looking across these five, and informed by lessons about impact evaluations in agriculture more broadly, MCC has identified a set of common lessons. Four of these lessons, as illustrated by the El Salvador case, are described below, along with additional lessons identified from the performance evaluations.
- Always return to the program logic. If the program logic and implementation plan include a variety of value chains, the evaluation must ensure sufficient statistical power to track early and realistic impacts on income in each value chain. In El Salvador, the evaluation was not originally designed to report by value chain but across all three sectors together; when unbundled, the design was “underpowered” to report on individual value chains.
- Linking to household income is difficult. In El Salvador dairy, the evaluators find that dairy farmers’ farm incomes are roughly double those of the control group; however, they do not find an impact on household income or consumption. This is likely because the number of groups of dairy farmers that were randomized was small, and the evaluation was underpowered to detect changes in household income by value chain (see the illustrative power calculation after this list). This needs to be taken into consideration in future evaluation designs.
- Test traditional assumptions. In El Salvador, some of the evaluation findings suggest that tailored trainings and donations may produce better results in the short term. However, the project and evaluation were not designed to test the effects of variation in training content or duration, so this cannot be confirmed. MCC and MCAs will look for future opportunities to use impact evaluations to test assumptions about the appropriate content and duration of training to maximize impact.
- The randomized roll-out evaluation approach has risks. In a randomized roll-out approach, a first round of treatment farmers is compared to a control group of farmers that receive training at a later date. The key to this approach is that there be enough time between the two phases to see behavior change and accrual of benefits for the first farmers before the second round of farmers is trained. In the case of the handicrafts project, more will be learned with the follow-up data and impact analysis on intermediate and final outcomes. For the other value chains, however, the control groups have been trained as per the agreed roll-out methodology and additional learning using these evaluations is limited.
- When important for unbundling program results, require the reporting of detailed cost information. Over $10 million was available for donations to beneficiaries under the Production and Business Services Activity. However, MCC did not require the MCA and its implementer to report in detail the donations provided to individual farmers or enterprises. The implementer kept detailed records, but only high-level aggregated numbers were reported back to MCC. As a result, the evaluation cannot analyze who benefited most from donations or whether receiving a large amount of donations was correlated with improved outcomes. To the extent that MCC wants to analyze this type of information in future projects, detailed reporting on costs should be required in implementer contracts and potentially from accountable entities as well.
- The activity’s objective, target population, type of intervention, definitions, selection methodology, and expected results should be defined prior to investment. While these were stated in some form in the Compact for the Investment Support Activity, the definitions were not clear up front or shared by all of the stakeholders. As a result, the interpretation of this language was debated throughout implementation, affecting the size range of the investments, the interest rate, and the collateral requirements. In the future, MCC should be very clear when drafting investment-related language in Compacts, setting out the purpose, activities, and expected results of the intervention, accompanied by a detailed term sheet to guide implementation preparation and investment.
- Implementer capacity matters. This is clear, yet MCC and FOMILENIO could have addressed it more proactively and earlier in the Compact. The relationship between BMI and FOMILENIO was governed through the trust agreement and an Implementing Entity Agreement (IEA), but compliance with and enforcement of those agreements was a struggle throughout implementation. In hindsight, there should have been better management of the IEA, more performance-based incentives, and potentially some technical assistance to ensure that MCC funds were used optimally. At the close of the program, the Investment Support Activity ERR was lower than anticipated, partly because the administrative cost of approving $7.5 million in loans was roughly the same as it would have been to execute the program at the larger, originally planned amount. This does not diminish the value of the overall program, but it highlights a missed opportunity for BMI, FOMILENIO, and MCC to invest in more SMEs in the Northern Zone.
- Linkages between activities will not happen on their own. The design of the Productive Development Project included three Activities. In particular, it was envisioned that the Production and Business Services Activity (PBS) and the Investment Support Activity would work together. Producers receiving technical assistance and training under PBS were to receive help developing business plans to access credit under the Investment Support Activity. By July 2011, at least 15 PBS participants were approved for loans out of a total of 44 approved loans (34 percent). This was not the level of interaction originally envisioned between the two activities. Incentives or requirements could have been included in implementer contracts to ensure that the two activities worked together. In addition, the targeted beneficiaries of each activity could have been aligned so that there was more overlap. The minimum loan amount of $50,000 under the Investment Support Activity may have been the primary reason for the lack of integration between PBS (which generally served small, poor producers) and the Investment Support Activity (which generally served small- and medium-scale business owners).
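To make the statistical power concern referenced above concrete, the sketch below (Python with statsmodels) shows how quickly the required sample size grows as the detectable effect shrinks. The effect sizes are illustrative assumptions, not figures from the El Salvador evaluation; the point is simply that a design sized to detect large changes in farm income can still be underpowered for smaller changes in household income, particularly once results are unbundled by value chain.

```python
# Illustrative power calculation; the effect sizes below are assumptions for
# demonstration and do not come from the El Salvador evaluation.
from statsmodels.stats.power import TTestIndPower

power_analysis = TTestIndPower()

for effect_size in (0.5, 0.3, 0.1):   # standardized mean differences (Cohen's d)
    n_per_arm = power_analysis.solve_power(
        effect_size=effect_size, alpha=0.05, power=0.8, ratio=1.0
    )
    print(f"Effect size {effect_size}: ~{n_per_arm:.0f} respondents per arm")

# A design sized to detect d = 0.5 (~64 per arm) is far too small for
# d = 0.1 (~1,571 per arm); cluster randomization of producer groups and
# splitting the sample by value chain raise the requirement further.
```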
As a result of these lessons learned in El Salvador, in combination with lessons learned in four other farmer training evaluations, MCC project operational practices have changed in the following ways:
- Develop program logics early and revise as necessary. MCC now requires the formulation and revision of program logics from the concept note stage and throughout implementation. The program logic approach has been applied in the most recent cohort of compacts in development (Benin, Niger and Sierra Leone). In addition, the agenda of MCC’s Ag College in September 2012 included a day devoted to reviewing the program logic of every active agriculture project in the portfolio, by MCC and MCA counterparts together. This was followed by a series of peer review discussions of each program logic to confirm its links to on-going evaluations.
- Assess training and technical assistance programs critically. Mixed results on adoption have led MCC’s Agriculture Practice group to re-examine the focus on farmer training as a main part of the solution to low productivity in the agriculture sector and have resulted in more concerted efforts to identify interventions across the value chain. If farmer training is considered, the duration, intensity and content of the training are more carefully examined, and the benefits and challenges of reaching large numbers of beneficiaries are fully assessed. Equally important, the use of grants and starter kits has led to a review of practices across all Compacts and to the development of new guidance.
- More carefully align interventions and beneficiaries. The lessons on better aligning the beneficiaries of a project’s several activities, and on discussing the targeted beneficiaries and potential selection criteria early in the process, are being applied to the new cohort of Compacts.
In addition, as a result of these lessons learned, MCC evaluation practices have changed in the following ways:
- Formal review process for evaluations. The Monitoring and Evaluation unit is pilot testing a formal review process that defines critical milestones in the evaluation cycle requiring substantive review and clearance by key internal stakeholders. The process also requires local stakeholders to review key evaluation documents, in consultation with the evaluator, prior to submission to MCC, in order to provide feedback on the feasibility of the proposed evaluation as well as the technical and factual accuracy of evaluation documents. The formal review process is intended to ensure that evaluations are designed with stakeholder buy-in, are based on the program logic, use methodologies appropriate to the timeframe of the expected results, and are flexible enough to adjust to changes in implementation.
- Evaluation risk assessment. An Evaluation Risk Assessment Checklist has been developed and institutionalized by the Monitoring and Evaluation unit. The checklist is reviewed by the M&E lead with M&E management. The risk assessment is intended to inform decision making and identify necessary course corrections, enabling a more timely response to identified risks.
- Development and use of standardized evaluation templates. The Monitoring and Evaluation unit has developed standardized templates in order to provide guidance internally and to independent evaluators on expectations related to evaluation activities and products. These templates are intended to clarify and raise standards for evaluations by influencing the daily work of M&E staff and evaluators.
Next Steps
MCC has additional evaluations and analysis underway that will provide more results and learning about progress in El Salvador:
- Final impact evaluation for the handicrafts value chain (2013)
- Final performance evaluation for Investment Support Activity and Financial Services Activity (2014)