23 September 2025

Monitoring, evaluation and learning (MEL) are often presented as indispensable tools for accountability, improvement, and long-term impact. Funders want assurance that their resources are making a difference. Delivery organisations want to understand their effect on the communities they serve. Policymakers want to see evidence that investments deliver outcomes. On paper, MEL promises all of this: neat data, measurable outcomes, and clear evidence of change. 

Yet those working on the frontline, especially with young people, know that reality looks very different. At the 2025 London Funders’ Festival of Learning, together with one of our members, Mary’s Youth Club, we hosted a session on The Hidden Cost of Monitoring & Evaluation. The conversation revealed not only the financial costs that are often overlooked, but also the relational, organisational, and opportunity costs that come with current approaches to MEL. 

This blog builds on that discussion, exploring the gap between expectation and reality, and what it would take to make MEL genuinely useful for everyone involved. 

Expectation vs. reality: collecting data with young people 

At the design stage, MEL often assumes a straightforward process: young people will complete surveys or feedback tools honestly and independently, producing tidy data that reflects the impact of interventions. In practice, data collection requires far more time, relationship-building, and trust. 

Youth workers frequently need to guide young people through forms, ensuring questions are understood without inadvertently influencing their answers. Even then, the data collected often tells us what happened but struggles to reveal why it happened. 

Validated tools, often favoured by funders and policymakers, promise robust and comparable data. Yet many are not suited to the realities of youth work. They may be: 

  • Too long or complex for young people to engage with. 
  • Written in jargon that doesn’t resonate with their lived experiences. 
  • Administered in ways that feel extractive or insensitive, or even lead to safeguarding issues, particularly if they are not trauma-informed. 

The risk is that tools designed to generate credibility aren’t always fit for gathering information from young people in an ever-changing context, alienating the very voices evaluation is meant to capture. In addition, for young people who use open-access services, there may not even be a clear “before” and “after” point against which to measure progress. 

The hidden cost here is not just the staff time spent administering surveys, but the risk that some processes may limit the reliability and depth of young people’s feedback, and therefore our understanding of impact. 

Engagement costs: building capacity for meaningful data 

If data is to be both robust and meaningful, allowing us to understand both what the impact was and how it was achieved, organisations need more than survey responses. Qualitative, participatory and creative methods are crucial: reflective activities, peer evaluation, and visual storytelling are a few examples that can illuminate the why as well as the what. They are also more engaging, allowing different forms of expression that paint a fuller picture of the provision and its impact. Importantly, these approaches make evaluation less extractive, offering something back to young people and practitioners. 

Participatory and creative tools are particularly powerful because they work well across diverse audiences. These methods create multiple entry points for young people with different communication styles, cultural backgrounds, and levels of confidence. Instead of relying solely on written surveys, which can exclude or flatten voices, these approaches make space for a wider range of perspectives to be heard. They also help ensure that evaluation is not only more inclusive, but also more engaging and accurate, producing richer insights that reflect the complexity of young people’s experiences. 

But they require investment in staff training and organisational capacity. Youth workers need to understand why MEL matters, not just feel confident in carrying it out. Organisations need systems for storing, analysing, and acting on the data. 

This is rarely fully funded. Too often, grants cover the delivery of activities but not the time needed to gather and interpret learning. 

Frameworks: when theory doesn’t match reality 

Evaluation frameworks, such as theories of change or logic models, are meant to bring clarity. They show how activities lead to outputs, outcomes, and long-term impact. In practice, youth work is rarely so linear. 

Young people’s journeys are complex, shaped by personal circumstances, relationships, and community contexts. Outcomes are not always predictable, nor do they follow neat sequences. Despite this, many frameworks are rigid, tied to pre-agreed indicators that reflect funder priorities rather than young people’s realities. 

This creates several hidden costs: 

  • Multiple agendas: youth workers juggle the needs of different funders, often with conflicting requirements. 
  • Reduced flexibility: organisations may feel pressure to stick to measurable activities rather than respond creatively to emerging needs. 
  • Missed insights: unintended outcomes, which are often the most meaningful, go unrecorded. 

The session reminded us that frameworks should not be cages. Flexibility and trust are essential if MEL is to capture the richness of youth work. 

The real financial cost of MEL 

Perhaps the most striking reality check from the session was about funding. A common rule of thumb is that around 10% of a project’s budget should go to MEL. But research, and our own experience, suggest the real figure is closer to 20%. 

That 20% covers not just surveys and databases, but also: 

  • The time it takes staff to collect and input data. 
  • The analysis required to turn data into insight. 
  • The reflection and learning needed to adapt programmes. 
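
To put this in concrete terms, consider an illustrative example (the split is ours, not a prescription): on a £100,000 programme, the familiar 10% rule of thumb allows £10,000 for MEL, while 20% sets aside £20,000. It is that larger figure which makes room for, say, £8,000 of staff time to collect and input data, £7,000 for analysis, and £5,000 for the reflection and learning needed to adapt programmes.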

When MEL is underfunded, organisations face survey fatigue, limited capacity for analysis, and missed opportunities for learning. The burden falls most heavily on frontline staff, who must choose between meeting reporting requirements and spending time with young people. 

Underfunding MEL also undermines its value. If data cannot be properly analysed or reflected on, then its potential to shape better services is lost. This represents a double cost: wasted effort for staff and wasted opportunity for funders. 

Learning gaps: closing the feedback loop 

Another hidden cost lies in what happens after data is collected. Too often, organisations never see the full analysis or recommendations. Reports are sent upwards to funders, but not back to the practitioners who contributed to them. 

This breaks the feedback loop that is essential to genuine learning. Delivery staff miss out on insights that could inform their practice. Young people rarely hear how their feedback shaped decisions. Programmes may end before findings are embedded, while staff turnover and shifting contexts erode institutional memory. 

The result is evaluation as a one-way process — accountability without learning. To change this, funders need to share back findings, reflect with grantees, and co-own decisions about how learning is applied. Without that, the promise of MEL remains unfulfilled. 

The opportunity costs of MEL 

Beyond financial and relational pressures, MEL carries another set of hidden costs: the opportunities forgone when evaluation is poorly designed or under-resourced. 

  • Lost innovation: rigid frameworks discourage experimentation, as organisations stick to what is measurable rather than what may be transformative. 
  • Lost voice: young people’s perspectives can be sidelined if tools are inaccessible or irrelevant. 
  • Lost trust: when practitioners feel MEL is imposed rather than co-owned, their engagement becomes compliance-driven rather than curiosity-driven. 

These opportunity costs are harder to quantify but deeply felt across the sector. They represent the gap between MEL as a tool for accountability and MEL as a process for collective learning and improvement. 

Towards better practice: key takeaways 

So, what would it take to reduce these hidden costs and unlock the real value of MEL? Our session suggested several principles for both funders and practitioners: 

  1. Respect practitioners as experts
    Youth workers hold unique knowledge of what works with young people. Involve them, and young people themselves, in shaping MEL tools and frameworks. 
  2. Be flexible
    Adapt frameworks to reflect real contexts. Allow grantees to use their own tools where they already exist, mapping these to funder outcomes where possible. 
  3. Invest in capacity
    Fund MEL properly, not just for data collection but for analysis, reflection, and learning. Treat the often-quoted 10% as a minimum, not a ceiling. 
  4. Close the loop
    Ensure findings are shared back with delivery organisations and young people. Create opportunities for joint reflection and co-ownership of next steps. 
  5. Think ‘donor+’
    Support MEL not just as an accountability mechanism, but as an opportunity for mutual learning. Funders can add value by building grantees’ capacity to evaluate in ways that are meaningful to them, helping them adapt and improve impact for all. 

Conclusion: from cost to value 

Ultimately, the hidden cost of MEL is more than financial. It lies in strained relationships, diverted time, and lost opportunities to learn and adapt. But MEL also holds immense potential. Done well, it can deepen trust, strengthen practice, and generate the insights needed to transform services for young people. 

For that to happen, MEL must shift from being a transactional requirement to a shared learning journey. Funders, practitioners, and young people all have a stake in making evaluation meaningful. Recognising and resourcing the real costs is the first step towards making sure the benefits outweigh them. 

By lifting the lid on the hidden costs, we can start to reimagine MEL not as a burden but as a bridge: connecting accountability with learning, and evidence with action. 

 

This blog was co-written by Maya Reggev, London Youth’s Learning & Impact Lead, and Sally Baxter, Mary’s Youth Club’s CEO and Youth Development Manager, as a contribution to the London Funders Festival of Learning 2025.