Poverty Reduction Blog Tag: Transparency
Posted on October 14, 2014 by Catherine Marschner, MCC Data Program Manager
What is data?
Data is raw information. When you collect all kinds of data on all kinds of different things, you can put it together to provide reliable information. This structured data can help partner country governments plan the best use of their resources, and it can help the people hold their governments accountable.
Collecting data in a standardized way makes the data even more reliable. In the case of foreign assistance, it leads to transparency, which is a priority for MCC.
So what have we done about it?
Our data team at MCC has worked hard over the last year to improve the quality of the data we share with the International Aid Transparency Initiative (IATI), whose international registry collects stakeholder-published data on foreign aid. We have also made our efforts to produce data more efficient and sustainable. So while we are certainly proud to have been ranked among the top three donors in the 2014 Aid Transparency Index, we are even more proud of—and committed to—the substantive improvements we have made and continue to make.
The quality of our data is better because we have added new data and functionality to our programmatic management information system, and we have added detail for a number of IATI data fields. In our XML file, users will now find information on:
- planned disbursements by year for our compact programs,
- descriptions of our programs and their associated activities, and
- the results of our work.
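To make the shape of these new fields concrete, here is a short, hypothetical sketch of an activity record loosely modeled on the IATI activity standard. The element names are simplified and the values are invented; this is not MCC's actual schema or data.

```python
# Hypothetical sketch of the kinds of fields described above, loosely
# modeled on the IATI activity standard. Names and values are illustrative.
import xml.etree.ElementTree as ET

activity = ET.Element("iati-activity")
ET.SubElement(activity, "title").text = "Example Compact Program"
ET.SubElement(activity, "description").text = (
    "Description of the program and its associated activities."
)

# Planned disbursements by year for a compact program
for year, amount in [(2014, 50_000_000), (2015, 75_000_000)]:
    pd = ET.SubElement(activity, "planned-disbursement")
    ET.SubElement(pd, "period-start", {"iso-date": f"{year}-01-01"})
    ET.SubElement(pd, "value", {"currency": "USD"}).text = str(amount)

# A results entry with an indicator and a link to evaluation materials
result = ET.SubElement(activity, "result")
ET.SubElement(result, "indicator").text = "Households with improved access"
ET.SubElement(result, "document-link", {"url": "https://data.mcc.gov/evaluations/"})

xml_string = ET.tostring(activity, encoding="unicode")
```

The resulting `xml_string` is one activity's worth of machine-readable data: disbursement plans, program descriptions, and results all travel together in a single record.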
In fact, MCC scored higher on performance data than any other donor ranked in the ATI—in part because we provide results descriptions, performance indicators and links to materials from our independent evaluations.
MCC has also built a more streamlined process for producing our data. We have an integrated team with expertise on the policy, data analysis, finance and technical sides, and we can pull data from different systems to build an integrated XML data set that meets the reporting requirements of both the Foreign Assistance Dashboard and IATI.
As we continue to build out our internal data systems, we are paying careful attention to how we link different pieces together. For example, our IATI file this year includes links back to our Evaluation Catalog, where MCC makes all the metadata and microdata from our independent evaluations freely available to the public.
All these strides forward have netted a dataset with a lot of richness – and some very interesting and high quality data! Yet at MCC, we also realized that it was not enough to just put this out there in XML: a format that is far from “human-readable.” Our team knew that for our efforts to become sustainable, we also needed to create an internal demand for this kind of data from our own staff. So we built a tool that visualizes the data so that MCC staff can use it to help them in their everyday work. The response has been enthusiastic so far, and we look forward to building additional analytical components, learning about staff demand, and reporting back on what we are learning!
We would love for all of these efforts to become more demand driven – so we welcome your thoughts and feedback on what we ought to prioritize as we strive to continually improve the quality and quantity of information MCC makes available to the public.
Posted on October 10, 2014 by Tom Kelly, Acting Vice President for Policy and Evaluation
At this time each year, with the announcement of the results of the Aid Transparency Index (ATI), major aid donors all over the world await their relative rankings from Publish What You Fund’s careful exercise to evaluate the quality of information provided to the International Aid Transparency Initiative (IATI). In this regard MCC is no different. Having ranked #1 in the world in the 2013 Index, we were in the enviable position this year of having no place to go but down!
Because MCC values transparency, we spent the last year making careful improvements to our IATI data. We also worked closely with the State Department’s Foreign Assistance Dashboard to develop a USG XML format based on the IATI standard, allowing MCC’s higher-quality data to be published to the IATI Registry. We worked alongside other donors to try to figure out how IATI data can be linked with country budget information. We published our own IATI implementation schedule to inform our data users about our data definitions and future publication plans. And we put lots of time and thought into how to best represent our results within the IATI data standard. Because of all these efforts—and in spite of fierce competition—MCC was able to score above 85% and remain among the top three donors worldwide. It hasn’t been easy, and we are proud of this.
Yet as the ATI enters its fourth year in 2014, we are surveying the broader landscape and are concerned about the performance of the donor community as a whole. We are concerned that so many donors will fall far short of their Busan commitments, and concerned that data quality will therefore not improve to a point where country partners will find it useful. Among 68 donors ranked, only 15 score in the “good” and “very good” categories. The average score in the Index this year is only 39%. Clearly, major progress will have to be made by the end of 2015 to deliver on the promises of Busan.
In this context, people ask us all the time: What is the way forward? Setting aside some peculiarities that make it easier for MCC to do this (as a young, small agency with transparency in our DNA), here are a few of the things we have found most useful:
- Demonstrate political will and leadership on transparency at all levels – so staff are incentivized to work hard at solving the multitude of problems that will inevitably come up;
- Give a strong mandate to a small team that includes policy, data analyst, technical and finance staff – so that together they can resolve most of the issues and tee up the important points effectively for senior staff decisions;
- Don’t try to build a single system to meet IATI reporting requirements – instead develop a strategy for continual progress. Think through how you can pull data for each of the required fields from existing systems, and use your tech people to link them up into your XML output. Start with the fields where you already collect information and work steadily on improving quality of this data. Then make plans to do what’s required to collect and report on additional information over time;
- Keep talking to stakeholders and data users to better understand – and to stimulate – demand; and finally…
- Open up your data to your own staff – leverage IATI efforts to build more robust internal systems to share and use data. As staff see the benefits to their own work, support for the work of data teams will grow, and internal demand will make the system sustainable.
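The "pull from existing systems" strategy above can be sketched in a few lines. The system names, field names and activity IDs below are hypothetical stand-ins; each function represents a query against a real finance, program or results system.

```python
# Illustrative sketch of pulling required fields from separate existing
# systems and merging them into one record per activity. All names here
# are invented; each function stands in for a real system query.
def fields_from_finance_system():
    return {"act-001": {"budget": 1_000_000, "disbursed": 250_000}}

def fields_from_program_system():
    return {"act-001": {"title": "Roads Activity", "sector": "Transport"}}

def merge_by_activity(*sources):
    """Combine partial records from each system into one record per activity."""
    merged = {}
    for source in sources:
        for activity_id, fields in source.items():
            merged.setdefault(activity_id, {}).update(fields)
    return merged

records = merge_by_activity(fields_from_finance_system(),
                            fields_from_program_system())
# Each record now carries fields drawn from both systems; serializing the
# merged records to the XML output is a separate, mechanical step.
```

Starting this way lets an agency report the fields it already collects on day one and bolt on additional sources over time, rather than waiting for a single monolithic system.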
MCC will soon be launching a Principles into Practice paper detailing these and other lessons from our work on transparency and accountability. We hope many of you will join us in upcoming conversations so that we can learn together how to move this field of practice forward. We believe that it is possible for donors worldwide to jump forward in 2015, and we look forward to doing our part to help to drive the broader agenda for transparency.
Posted on July 10, 2014 by Tom Haslett, Program Officer
Earlier this year, the Principal Secretary of Malawi’s Ministry of Energy convened the first semi-annual review of the country’s five-year, $350.7 million MCC compact. These reviews will be held every six months and provide key stakeholders with an opportunity to assess progress against the agenda for reform in Malawi’s power sector. The compact establishes an ambitious program to revitalize the country’s power sector through investments in critical infrastructure, hydropower plant efficiency and sector institutions.
The centerpiece of the review is a set of indicators focused on the performance of ESCOM, the country’s electricity utility, in areas like asset maintenance, bill collection and efficient provision of electricity. ESCOM’s financial plan establishes targets for these indicators, which are then compared against actual performance at the semi-annual review. This gives stakeholders in attendance and the Malawian public in general a window into what’s going on at ESCOM and with broader power sector reforms aimed at attracting new investment in electricity generation.
Why is this important? ESCOM is the main electricity provider for Malawian households and companies, and its operations are a matter of intense public interest. However, ESCOM’s recent performance has not been strong, and many Malawians lack confidence in the utility’s ability to improve. This was displayed recently as stakeholders spoke out against an increase in tariffs proposed by ESCOM; people asked why they should pay more for unreliable service and encouraged the company to increase efficiency, not raise rates. And in April 2014, the energy regulator approved a tariff increase below the level ESCOM identified as necessary to cover costs while highlighting the need for performance improvements.
ESCOM’s efforts to improve its service have given rise to a chicken-and-egg problem: The company believes a higher, cost-reflective tariff is necessary to improve service. But the public will have trouble accepting significantly higher tariffs until ESCOM’s operations improve.
This is where the semi-annual review can establish a path forward by providing a forum where objective data helps paint a picture of ESCOM’s performance and define corrective actions.
At the same time, emerging issues in the power sector can be jointly reviewed by power sector institutions and representatives of the private sector and civil society.
The review is also a perfect tool for the MCC model. The sustainability of our work in Malawi is based on strengthening ESCOM’s ability to recover costs, invest in service provision and be a viable partner for investors. We’re supporting these goals by introducing a modern management information system and helping build capacity in areas like financial management, procurement and billing efficiency.
The compact also targets policy reforms that can incentivize private investment in new power generation. The semi-annual reviews will allow us to understand if the compact is meeting its goals and provide learning opportunities. In addition, by bringing together stakeholders from across Malawian society, these forums will ensure the public consultations that helped develop the compact continue to inform its implementation.
This first semi-annual review gave concerned stakeholders a chance to better understand current power sector reform goals, progress to date against those goals and how the compact is supporting their achievement. A report that includes data on all key performance indicators discussed in the review will be publicly available soon. And three sub-committees with members drawn from the semi-annual review participants will meet on July 17 to review priority corrective actions to advance the reform agenda and to approve implementation procedures and timelines to address these issues. As our work continues over the coming months, we’ll look forward to the next review—another opportunity to shed light on how MCC is helping to reform Malawi’s power sector.
Posted on April 28, 2014 by Nathaniel Heller, Executive Director, Global Integrity, and Alicia Phillips Mandaville, Managing Director, Development Policy
Last week, roughly 40 colleagues gathered at the OpenGov Hub in Washington, D.C., to brainstorm and debate around possibilities for a Governance Data Alliance, an idea focused on improving coordination in the production of governance data while simultaneously establishing and strengthening feedback loops between producers and actual users of those data.
The gathering was co-organized by Global Integrity and the Millennium Challenge Corporation and facilitated by the terrific Allen Gunn of Aspiration. The results of a pre-event scoping survey were visualized and mapped by the craftsmen over at Vizzuality. You can check out all of that data at dataalliance.globalintegrity.org, see pictures of the meeting on Flickr or read comments from participants on Twitter using #governancedata.
As we talked about publicly before the meeting (on multiple blogs, including here and here), our intent with this quick get-together was to explore whether there was sufficient interest in this diverse and ad hoc group to take an exploratory process forward—and, if so, to identify the big questions we’d need to collectively answer to determine the initial contours of a potential alliance. We did not intend to answer any of these big questions during the two days of the meeting or design any solutions or outcomes. Instead, we were focused entirely on sussing out the major, “Gee, we really need to figure that out” issues.
The good news is that there was a strong consensus to take an exploratory process forward. We also managed to identify a number of core, meaty items that need further unpacking in the coming months if a governance data alliance is to add value—a process we’ll be taking forward through a number of ad hoc working groups. Those working groups are open to anyone interested in being part of the conversation, regardless of whether you attended last week’s meeting. Here’s what we’ll be focused on:
Making our assumptions explicit about how better governance data can lead to improved outcomes (or as Toby Mendel from the Centre for Law and Democracy pointed out, we need a clear and compelling theory of change). We all think that better data on governance can—when the data is used—help improve governance and service delivery outcomes. But we have a variety of views about the ways in which better governance data can lead to improved outcomes. Maybe it’s about policy makers being able to make better-informed decisions; maybe it’s about citizens’ groups being able to hold decision makers accountable; maybe it’s about donors being able to incentivize governance reforms. An essential starting point in working out how a Governance Data Alliance can help is to make explicit the ways in which we think better data can lead to better outcomes. This should enable us to focus more clearly on addressing the challenges and obstacles that sometimes prevent better data leading to better outcomes.
Refining and settling on an initial problem statement(s). During the course of our meeting, we identified a range of problems that a potential Governance Data Alliance could help address. Poor communication between governance data producers (which manifests itself in redundant country coverage and coverage gaps) is one; similarly poor communication between producers and users leading to wasted effort in the production of information no one actually uses is another. While users struggle to gather standardized, machine-readable data, unused zombie governance data and methodology repositories continue to propagate (examples are here, here, here, and here). But which of these (and many other challenges) should we collectively seek to address first (or second)?
Membership. Who might participate in a future alliance from all three cohorts (users, producers and enablers)? How would participation be extended, candidate organizations vetted and cats herded so as to keep the collective a manageable yet very large tent? As Rita Ramalho of the IFC reminded us, each of these cohorts is a robust community by itself! We owe thanks to John Samuel of Development Studies for asking out loud: How can we avoid letting this process and an eventual alliance be dominated by NGOs and actors from Northern countries?
Governance (how meta!). How would an eventual alliance be governed? Who sets the rules of the game, and where does power and decision making reside? If a staff is needed to take the process forward, who should they be and where should they sit? Vincent Lazatin of the Transparency and Accountability Network in the Philippines put in some heroic work in teeing up these issues.
Just what is “governance data?” While we intentionally parked any debate around the definition of “[good] governance” for a later date, we know we need to resolve this to some degree of satisfaction moving forward. Is “governance data” third-party NGO ratings of government performance? Internal public sector administrative data like accurate counts of birth certificates? Household surveys asking about satisfaction with government service delivery? All of those, or something completely different? Ernst & Young’s Kelly Terrill led a breakout session that made clear there is unmet demand for all different aspects of governance data from non-traditional corners as well.
Producer coordination. There are many ways in which governance data producers can better coordinate and improve efficiencies. But should that start with simple communication and awareness-raising around anticipated coverage patterns or extend more aggressively to shared in-country research teams or streamlined methodologies and question sets? Global Integrity’s Hazel Feigenblatt is already helping to coordinate an initial team across several data producers to begin tackling this.
Tackling the feedback loop problem. We all agreed there was a huge need to establish and nurture better feedback loops between governance data producers and users. Vanessa Tucker of Freedom House spoke about the value of face-to-face meetings with governments interested in “unpacking” the data. But in the vast majority of cases, producers have very little understanding of who actually uses their data and whether their data has any impact in terms of behavioral change. Users typically have little access to producers to share concerns or thoughts for improving methodologies and data samples. Shyaka Anastase from the Rwanda Governance Board highlighted how this disconnect can lead to mistrust. Tackling the feedback loop problem could take a number of forms, from a simple “switchboard” service that connects users with producers (and vice-versa) to a more ambitious model where producers and users are permanently and regularly talking to one another. Where’s the right place to start?
Opportunities to leverage improved governance data. Can we identify key development and political issues and agendas where improved governance data (and its uptake and usage) can impact development and policy outcomes? Jamie Roberto Diaz Palacios from the Guatemalan National Program for Competitiveness pointed out the links between governance data and investor interest as a country specific opportunity. But at a global level, how deeply should a potential alliance dive into discussions around the post-2015 development agenda or the “data revolution?”
Funding. Most of the anticipated activities under an eventual alliance would not be cost-free, even the lowest-hanging fruit. How would we source financial support to operationalize the vision? Is there a healthy role for high-intensity users of governance data to recognize more publicly that governance data doesn’t grow on trees, but rather requires continued and non-trivial investment? The philanthropic funders in the room—Elizabeth Eagen, Mark de la Iglesia, and Subarna Mathes from Open Society Foundations; Libby Haight from Hewlett Foundation; and Laura Bacon from Omidyar Network—were incredibly gracious in engaging in these discussions without awkwardness.
While the above issues will be tackled in the working groups moving forward (coordinated by a coordination “super” group that keeps all of those trains running on time), many others will also be addressed and wrestled with in the coming months. And we need your help and interest.
Very soon, we’ll be putting in place a public mechanism for inviting additional friends and colleagues into this process on completely equal terms. While you’ll have as much influence over the outcomes as anyone else, there’s a catch: You’ll need to put in some real effort and sweat equity, possibly several hours each week. Keep an eye on this blog for updates on that front.
In the interim, if you have interest in plugging into things sooner, just give us a shout at nathaniel [dot] heller [at] globalintegrity [dot] org and mandavilleap [at] mcc [dot] gov. Stella Dawson from Thomson Reuters Foundation, who added a wonderful media practitioner’s perspective to the meeting, has also published a summary piece on the event here. We’ll also be publishing more extensive notes and transcripts from the meeting, so keep an eye out for those as well.
Posted on January 31, 2014 by Leonard Rolfes Jr., senior property rights advisor, MCC, and Alfousseyni Niono, land issues and financial services coordinator, MCA-Mali
(This post is part of an ongoing series on food security and is adapted from the Winter/Spring 2012-13 issue of Knowledge and Innovation Network Journal, a technical publication featuring lessons, innovations, ideas, and thinking behind MCC’s poverty reduction investments around the world.)
How can newly irrigated land be allocated to farmers in a way that is fair and transparent and leads to efficient agricultural production while also providing an opportunity for the poor and vulnerable to climb out of poverty? This was one of the big questions that the Alatona Irrigation Project in central Mali set out to answer.
The project—part of MCC’s five-year, $435 million compact with Mali—converted more than 12,000 acres of dry scrub land into rich, productive irrigated land suitable for growing rice and vegetables. Once the irrigation infrastructure was built, the land needed to be allocated to people who would farm it.
MCA-Mali, the local organization implementing the compact, first allocated 12-acre units of the land to the families who were displaced by the project and who could no longer use the land for grazing and other livelihood activities. For the remaining units, it was necessary that the people who received land had the knowledge and resources to make productive use of it—while trying to correct the deep-rooted inequalities in the region by encouraging the participation of women, the landless and other disadvantaged groups. Every proposed solution risked antagonizing some part of the population who believed they deserved more of the land than they were being allocated.
In the end, a two-step process was used to allocate the remaining land. First, each applicant was evaluated based on their current access to land (the landless received extra points), farming and irrigation experience (more experience equaled more points), proof of having paid water fees in the past (the land had to be purchased and water fees paid), membership in an association or cooperative, access to farming tools and adequate resources, and gender and age (women and youth received extra points). Each applicant was given a point score, and those who passed a minimum point threshold entered the second stage: a lottery.
The lottery was conducted publicly and transparently to ensure that the outcome was fair and accepted by all parties. To maximize women’s access to land, joint-titling was encouraged, allowing land owners to name their spouse as a co-owner of the land, which will prevent women from losing land access in the event of a husband’s death.
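The two-step process above can be sketched as a simple score-then-draw algorithm. The point values, threshold and applicant fields below are invented for illustration; the actual criteria and weights were set by MCA-Mali.

```python
# Hedged sketch of the two-step allocation: score applicants against
# criteria, keep those above a threshold, then run a lottery. All point
# values and field names here are hypothetical.
import random

def score(applicant):
    points = 0
    if applicant["landless"]:
        points += 2                                  # extra points for the landless
    points += min(applicant["years_farming"], 5)     # experience, capped
    if applicant["paid_water_fees"]:
        points += 1
    if applicant["in_cooperative"]:
        points += 1
    if applicant["woman_or_youth"]:
        points += 2                                  # extra points for women and youth
    return points

def allocate(applicants, units, threshold=4, seed=None):
    """Step 1: keep applicants at or above the threshold. Step 2: lottery."""
    eligible = [a for a in applicants if score(a) >= threshold]
    rng = random.Random(seed)     # a fixed, published seed makes a public draw auditable
    return rng.sample(eligible, min(units, len(eligible)))
```

The design choice worth noting is the separation of the two steps: the scoring stage encodes policy (who should benefit), while the lottery stage removes discretion from the final selection, which is what makes the outcome easy to defend publicly.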
The effort required substantial community outreach to make sure residents fully understood the process and criteria for applying for irrigated land. The hope is that this successful model for land allocation and joint titling will be replicated throughout Mali and other countries in West Africa whenever land needs to be allocated.
Tell us what you think! Have you had experiences with land allocation or determining who gets access to land in other development projects? How were criteria determined, and how accepting was the community?
Click here to read the full article.
Posted on January 15, 2014 by Alicia Phillips Mandaville, Managing Director, Development Policy
The start of a new year seems to prompt an awful lot of writing about how the data revolution will change everything—especially in the developing world. It will be bigger than the industrial revolution. It is already disruptive. And the applications and devices that humans can design to use this data are projected to reduce poverty, liberate people, halt the spread of disease, and alter the state-centric nature of the international system. The more disruptive the better! Vive la Révolution!
It’s easy to get caught up in this, as (full disclosure) I am. The availability of machine-readable, comparable information is already changing people’s lives in very practical ways. Data has even become less nerdy and more exciting to talk about: We can refer to “a disruptive future,” and plenty of people think that future kind of looks like an iPhone. Using technical terms in everyday professional conversations is becoming the norm. But underneath the comfortable arm waving about this bright new future, there are some quiet places that have not seen this change.
At a time when people are waxing eloquent about the power of big data to make consumer goods and services ever more tailored and ever more rapid, the world still lacks reliable, comparable country statistics on basic economic, governance and human development outcomes across much of the developing world. UNICEF estimates that one in three children has not been registered and therefore simply does not exist in statistical terms. Education outcomes are often estimated by models based on five-to-10-year-old data. As a proxy for accountable governance, budget transparency data covers only about half of the more than 190 countries in the world.
And the closer you look, the more you find that even the data we have considered reliable has internal flaws that can make it hard to trust (see Morten Jerven's controversial book Poor Numbers). Unlike “big data”—where the law of large numbers more or less evens out the errors of any individual data point—cross-country data comparisons are typically small enough that even a handful of inaccurate data points can alter the outcome.
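A toy calculation makes the small-sample point concrete (the numbers are invented): one badly measured observation barely moves a "big data" average, but it can swing an average over a sample the size of a typical cross-country comparison.

```python
# One bad data point in a large sample vs. in a ~20-country comparison.
def mean(xs):
    return sum(xs) / len(xs)

true_value = 1.0
outlier = 100.0                                   # a single badly measured observation

big_sample = [true_value] * 9999 + [outlier]      # "big data" setting
small_sample = [true_value] * 19 + [outlier]      # ~20 countries compared

big_error = mean(big_sample) - true_value         # about 0.01
small_error = mean(small_sample) - true_value     # about 4.95
```

The same single error distorts the small-sample average by several hundred times as much, which is why a handful of inaccurate points can flip the outcome of a cross-country ranking.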
The first challenge here is obvious. If we want to realize the potential of the data revolution in the world’s poorest countries, we need more and better data. Period. And people are already both demanding it and trying to create it.
But there is a second, less-visible challenge: ensuring that data is used responsibly. Foreign aid and foreign assistance are fields where much of the data we want to use is just beginning to be collected or is fraught with challenges. But while development professionals grapple with how to work appropriately with some serious data gaps, we are surrounded by popular examples from other fields of how reliable big data can be: Nate Silver's 2012 election predictions, Target's marketing algorithms that can tell you're pregnant before you tell your friends and even a Brad Pitt movie about data—seriously! It can be tempting to think our world is the same—but it isn’t yet.
So if we are using development data, how do we know we are using it responsibly for policy making and aid allocation? That's not an often-asked question, but I think it should be. Are there cross checking metrics? What would that even look like?! Is transparency the answer? When someone corrects a data error, how should decision makers react (à la the Reinhart and Rogoff data controversy)?
Over this year, focusing on the responsible use of data is a theme I'll come back to again and again: things worth watching and learning from, characteristics of the responsible (and irresponsible!) use of development data and efforts to fill data gaps to enhance aid effectiveness. I hope others will too.
Posted on October 25, 2013 by Sheila Herrling, Vice President for Policy and Evaluation
Yesterday I was part of a panel discussion to launch the 2013 Aid Transparency Index. The Index, published each year by Publish What You Fund, is the only independent assessment that rates aid organizations on how transparently they do business. And this year, the rankings show great progress across the U.S. Government in terms of aid transparency, with five of the six U.S. organizations evaluated improving their rankings.
The quantity and the quality of information being made available by U.S. foreign aid agencies increase every single quarter of reporting. This year’s Index shows the United States making considerable progress in balancing the need for coherence across government agencies, as well as progress with the timeliness and accuracy of data.
This year, MCC is being recognized as the top-ranked organization among the 67 assessed. We are all very honored by the ranking and continue in our commitment to making transparency a core business practice. And, truth be told, we are also humbled to see agencies and organizations that have inspired us in this space, and will continue to do so, now ranked lower despite their truly transformational efforts.
There is so much to learn from one another as we all seek to advance transparency and open data in order to find greater efficiencies in our business models, enhance citizen accountability over aid investments and maximize development impact. Just a few examples are here and here.
I thought it might be useful to share some of my reflections on the journey that got us to the top this year:
- Commit unequivocally and be persistent. Forging internal consensus is a critical first step. On the path to securing that consensus, be prepared to work through a “psychology of fear” that is perfectly understandable but must be overcome. It means believing firmly that the risks of more information in the public domain are worth taking in the pursuit of greater business efficiency and greater impact on the ground. And it means taking a leap of faith that your stakeholders will appreciate the risk and join you in a spirit of partnership.
- If you thought step one was hard, wait ’til you see what comes next. It is extremely important to make a strong business case for opening data to clearly show how the investment is going to bring a return to your organization, as well as to have the patience required to reach proof of concept on that business case. Tremendous hard work is required to deliver quality data. Be prepared to invest a lot of time and energy—largely manually—to organize disparate data and get it to a place where you can have a single authoritative source with multiple end-uses. The process requires a heavy lift on the front end—but as the data production becomes increasingly automated over time, costs will decrease dramatically while the benefits steadily rise.
- Put together a crack team that partners policy and technology. Part of doing it well requires a task-oriented team with a mix of policy-minded and technology-minded people. The technology-minded types need to learn not to roll their eyes at perceived bureaucratic hurdles and process/structure issues thrown up by the policy types, and the policy types need to acknowledge that there is room to loosen some controls and crowd-source the effort.
- Stay ambitious. Complacency in this space should not be tolerated. Continue to examine the demand side of the equation to make sure you are producing the right data in the right format for your various stakeholders. Continue to stay in touch with other organizations that are also driving forward in the field to learn and share and leapfrog each other’s efforts.
And to give folks a preview of what’s on the horizon at MCC as we seek to maintain that top spot:
- Revamp of data.mcc.gov: A revamp of our open data hosted at data.mcc.gov will include building a high-quality API to allow a whole new world of stakeholders to access our data. We will continue to publish data in a range of formats, and the new interface of data.mcc.gov aims to make our data more easily discoverable and accessible.
- Release of 10-20 evaluation survey data sets: MCC has committed to publishing, by June 2014, 10-20 of the survey data sets collected through our independent evaluations. We are preparing the data for release and presenting it for clearance to our internal Disclosure Review Board, which was formed to ensure that MCC upholds high legal and ethical standards throughout the release process. Going forward, we expect a steady stream of data sets to become available, because we are also reengineering our evaluation process with data release in mind. This should speed up the process considerably.
- A new disclosure policy: We are putting the finishing touches on our new disclosure policy, which will guide staff in implementing transparent practices around the release of information collected in the course of MCC business. The policy aims to empower staff to release more information, consistent with the presumption of disclosure.
- Elevate our Open Government Plan: While the disclosure policy will serve as internal guidance to our staff, MCC is also planning to revise our Open Government Plan by June of next year. This plan will serve as the public-facing MCC document on access to information. In the process of revising this plan, MCC will seek active participation of stakeholders throughout the policy making process.
- Enhance and evolve the Dashboard: MCC continues to work with the Foreign Assistance Dashboard to improve our own data on the Dashboard and to begin submitting data in XML format. We will make our XML code open so any agency that wants to publish to XML can use what we’ve already produced.
- Pilot IATI XML generators in some MCAs: MCC will begin to explore how we can support our Millennium Challenge Accounts—the implementing organizations in partner countries—in reporting to IATI. As we build out new business systems for MCAs to use for financial, procurement and reporting functions, we will explore how to build IATI file generators into these systems to facilitate the process of including this information in the IATI Registry.
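To make the XML pieces above concrete, here is a minimal, hedged sketch of what an IATI file generator might emit, using only Python's standard library. The activity identifier, title, and description values are invented placeholders for illustration, not actual MCC data, and a real file would carry many more fields required by the IATI activity schema (dates, sectors, transactions, and so on).

```python
import xml.etree.ElementTree as ET

def build_iati_file(activities):
    """Build a minimal IATI-style XML document from a list of dicts.

    Each item is assumed to carry 'id', 'title', and 'description'
    keys -- placeholder fields, not MCC's real reporting schema.
    """
    # Top-level container element defined by the IATI activity standard.
    root = ET.Element("iati-activities", version="1.03")
    for act in activities:
        node = ET.SubElement(root, "iati-activity")
        ET.SubElement(node, "iati-identifier").text = act["id"]
        ET.SubElement(node, "title").text = act["title"]
        ET.SubElement(node, "description").text = act["description"]
    return ET.tostring(root, encoding="unicode")

# Hypothetical example record, for illustration only.
xml_out = build_iati_file([
    {"id": "US-EXAMPLE-001",
     "title": "Example compact activity",
     "description": "Placeholder description of the activity."},
])
print(xml_out)
```

Because the same structured record feeds both the IATI Registry and the Foreign Assistance Dashboard, generating the file from a single authoritative data set (rather than hand-editing XML) is what keeps the two reports consistent.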
Trust that MCC will always seek to push the boundaries on transparency and open data because we believe so firmly that it leads to better programs, better understanding of what we do and better results. We take our No. 1 spot in the Aid Transparency Index with great pride, humility and a sense of sincere responsibility to keep evolving our efforts in this space for ourselves and others.
Posted on October 5, 2012 by Sheila Herrling, Vice President for Policy and Evaluation
On Monday, I joined U.S. Government colleagues—Gayle Smith of the White House, Don Steinberg of USAID and Rob Goldberg of the State Department—at the launch of the 2012 Aid Transparency Index. The Transparency Index, published annually by Publish What You Fund, rates aid organizations on how transparently they do business. This year, MCC was named the ninth-most transparent organization out of 72 globally, and the most transparent U.S. Government agency.
Of course, we are proud of our individual ranking. But we are more proud of being part of an administration that is so firmly committed to transparency. We are proud to be part of an interagency team collectively striving to bring more sunlight to our foreign assistance. The Foreign Assistance Dashboard is a huge step forward for the U.S. Government—just a few years ago it was next to impossible to know what the United States was spending, where and on what. We are proud to be the first agency to publish obligation and expenditure data to the Dashboard. The release last week of the OMB Bulletin is another big step for the U.S. Government, as it brings other agencies into the Dashboard.
But we want to do so much more. MCC has set its sights on two ways to push our transparency efforts beyond “show me the money” to “show me the evidence.” Both are central to MCC’s evidence-based approach: putting more data in the public domain and bringing transparency to what we are learning.
First, more data. This week, we launched our Open Data Catalog, data.mcc.gov, which will over time become our one-stop shop for financial, performance and evaluation data. This is about exposing the data and evidence that MCC uses to make decisions and measure results and putting it in the hands of smart people to use it in new ways. Our first step was to put out MCC’s Fiscal Year 2012 selection data in XML format. Check back soon for FY13 selection data, financial data, program performance data, and a goldmine of household survey data that underlies our independent evaluation work.
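One appeal of publishing data in XML is that "smart people" can consume it programmatically. The sketch below shows how a consumer might summarize a selection-data file with Python's standard library; the element and attribute names here are invented for illustration, since the actual data.mcc.gov schema is not reproduced in this post.

```python
import xml.etree.ElementTree as ET

# Hypothetical snippet in the spirit of MCC's FY 2012 selection data;
# element names are placeholders, not the real data.mcc.gov schema.
SAMPLE = """
<selection fiscal-year="2012">
  <country name="Exampleland" income-category="LIC">
    <indicator name="Control of Corruption" passed="true"/>
    <indicator name="Fiscal Policy" passed="false"/>
    <indicator name="Immunization Rates" passed="true"/>
  </country>
</selection>
"""

def summarize(xml_text):
    """Count passing indicators per country in a selection-data file."""
    root = ET.fromstring(xml_text)
    summary = {}
    for country in root.iter("country"):
        indicators = list(country.iter("indicator"))
        summary[country.get("name")] = sum(
            1 for i in indicators if i.get("passed") == "true")
    return summary

print(summarize(SAMPLE))
```

A few lines like these are enough to turn a published XML file into a scorecard table, a chart, or an input to someone else's analysis, which is the point of putting the raw data in the public domain.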
Second, more transparent learning. MCC is pushing transparency beyond money and data to learning about what we are actually achieving. MCC has made a big commitment to independent evaluations to help us test assumptions about traditional approaches, and build better evidence for what works—and what doesn’t—in development. Our first impact evaluations will be final later this month, and we’ll make all the findings public. The learning distilled from these rigorous independent evaluations is enormous, for us and for others.
We hope you will stand with MCC as we take transparency to the next level, even when the evidence shows things not going as we expected. That is often the greatest motivator for change. It is central to accountability and open government and is at the heart of MCC’s evidence-based approach.