
The Citizen


A Summer at Bridge

Measuring the impact of a long-term systemic change project
Isabel Opice

Isabel Opice is a second-year MPA/ID student at HKS. Prior to HKS, she worked as an Office Advisor in the Secretariat of the state of São Paulo, Brazil.

In everyday life in Brazil, I have seen stark social injustice: young homeless children on the streets, large-scale illiteracy and the consequences of a poor health system. A wish to make sense of and help transform this situation was one of the key reasons I decided to study economic development.

I spent this summer interning with Bridge International Academies’ Policy and Partnership (P&P) team in Washington, DC. Bridge is a social enterprise that runs and supports nearly 1,000 primary schools serving half a million children in low-income communities – families living on less than $2 a day – across Africa and Asia. Back in Brazil, I had worked on innovative educational policies, the lack of which is a major roadblock to the country’s development, and on public-private partnerships in government. I hoped that over the course of the summer I would have the unique opportunity to draw on my previous work experience at an NGO in the education space and apply it to an international context, while contrasting what I had learned in the classroom with actual development practice.

I was curious to understand how educational social enterprises operate in Africa compared with my own experience in South America. Additionally, my program at HKS, the MPA/ID, has a rigorous quantitative curriculum, and I was interested in the monitoring and evaluation side of Bridge’s work.

Bridge began as an affordable school network in Kenya, but since 2015 it has also worked directly with governments to strengthen public education systems. The P&P team pursues opportunities to partner with governments and helps structure the partnership projects once they are established, with a focus on learning outcomes. During my internship I worked on a project in Nigeria, which enabled me to interact with a range of functions, from the academic team to the country director. It gave me a systemic view of how Bridge works, even while I was based in DC.

One of Bridge’s main innovations is the Teacher Guide. On my first day at work, I was shown videos of Bridge’s classrooms in Nigeria. One could tell the difference between a new teacher, who had just joined Bridge and was slowly adapting to the Teacher Guides, and a teacher who had worked with Bridge for a longer period. While the former was still learning how to use a tablet and interact with pupils at the same time, the latter taught in a remarkably natural way, all enabled by the Guide. The Guide provided support and structure, empowering the teacher to engage with their students. As with all pedagogical approaches, practice pays off!

How do we measure long-term impact?

Those videos made me think about how Bridge’s impact, often measured by snapshots of learning achievement in a particular year, fails to tell the full and perhaps more important story. Bridge is proposing a structural change in the very basics of the education system. It enables and supports teachers through technology, providing carefully designed lessons in places where teaching materials, support and training are extremely scarce. Given the complexity of the challenge, the change is a long, iterative process in which everyone is constantly learning. Teacher Guides are improved daily through classroom feedback sessions and, of course, teachers sharpen their pedagogical skills over time.

This reminded me of one of my classes, where we discussed the role of ‘Big Ideas’ in development. Big problems, such as the learning crisis, are not resolved with one-off silver bullets, but rather with systemic change and reform.

Having studied impact evaluation and cost-benefit analysis intensively during my first year of graduate school, I read many studies assessing Bridge’s impact, by both internal and external researchers. One particular study of the Partnership Schools for Liberia (PSL), now the Liberian Educational Advancement Programme (LEAP), caught my attention. One of its conclusions was that the program was less cost-effective than an intervention evaluated in a similar study in India, where cameras were installed in classrooms to monitor teachers. The Indian study found that the cameras greatly reduced teacher absenteeism and, consequently, improved learning achievement. I thought this was an odd parallel to draw, one that failed to appreciate the difference between short-term behavioural shifts and long-term systems change.

Short-term interventions and a long-term systems change approach are different, after all!

Cost-benefit analyses and impact evaluations are extremely important tools for determining whether programs are effective. But they can be misleading over a short time horizon: the benefits of an intervention that generates structural change only become apparent in the long run. Can a simplistic impact evaluation capture that at all?

I returned to campus mulling over this question: how do we measure systems change? The summer taught me so much and reinforced the reason I chose to study and work in development: it shows me how change is possible.