At Learning and Change we work with different methodologies to assess Value for Money (VfM) and we know how to select the best method for your organisation or programme. The choice of methodology ultimately depends on the nature of your work, what value means for your programme and whose value counts, i.e. the perspective you will take.

Did you know that there are different methodologies to assess Value for Money (VfM)? These include Social Return on Investment (SROI), cost-benefit analysis (CBA), cost-effectiveness analysis (CEA) and Basic Efficiency Resource (BER) analysis. At Learning and Change we have tested all of these and concluded that no single methodology, on its own, answers the key questions that a VfM analysis should address:


  • Which investments are enabling us to achieve our outcomes?
  • Are we investing in areas that are generating change?
  • Are we investing resources in areas that are not generating value, i.e. not making a difference?
  • Does it make sense to continue investing in these areas?
  • What can we learn?
  • How can we do things differently in the future?
  • What are the key aspects that can illustrate to our stakeholders the extent to which the programme delivers VfM?


The challenge is that we tend to work on programmes operating in complex environments, where not everything that counts can be counted. Applying these methodologies therefore means relying on significant assumptions or omitting some of the outcomes and changes achieved. For this reason, we usually mix a variety of approaches, tailored to the characteristics of each programme we are assessing and to the data we have access to.


For instance, when we worked with the Aus4Equality Gender Responsive Equitable Agriculture and Tourism (GREAT) programme, we delivered a VfM assessment with two purposes: (i) to analyse the VfM that the programme had delivered in Phase 1; and (ii) to test the VfM Framework and Toolkit that we had previously developed.


The assessment was undertaken remotely, mostly using existing data sets and reports. The GREAT team was responsible for data collection using GREAT’s VfM Toolkit, while Learning and Change was responsible for the analysis and for identifying the key findings. These were then reviewed and discussed by the VfM Task Force, a working group of GREAT staff members, which met to identify the key recommendations for Phase 2.


The assessment focused on answering the following questions:

  1. What is the estimated VfM of GREAT Phase 1?
  2. What mechanisms, processes and practices have been applied that enhance or reduce VfM?


To do so, we used the VfM Framework that we had previously developed and that you can see here. In it, we identified the components that the programme aims to address in order to deliver VfM, the associated criteria and standards, and a scoring approach to assess VfM performance.


The following process was used for the assessment:


  • Desk-based review

First, Learning and Change undertook a thorough document review, capturing key VfM-related elements and entering them in a VfM Data Collection database.


  • Data collection

The GREAT team was responsible for collecting the information on VfM using the VfM Toolkit. The information was entered directly into the VfM Database and used for the analysis.


  • Analysis and identification of preliminary findings

Learning and Change then consolidated the information captured in the VfM Database through the desk-based review and the data collection, and identified the key findings. Qualitative coding techniques were used to consolidate the data, which was then entered in the VfM Data Consolidation Template (one of the tools of the VfM Toolkit) and organised by VfM Component of the VfM Framework.
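
To give a sense of what this consolidation step can look like in practice, here is a minimal, purely illustrative Python sketch of grouping coded qualitative evidence by VfM Component. The component names, codes and records are hypothetical and do not reflect GREAT’s actual data or the structure of the VfM Toolkit.

    from collections import defaultdict

    # Hypothetical coded evidence records (illustration only).
    records = [
        {"component": "Component A", "code": "cost drivers", "evidence": "excerpt 1"},
        {"component": "Component B", "code": "partner contributions", "evidence": "excerpt 2"},
        {"component": "Component A", "code": "unit costs", "evidence": "excerpt 3"},
    ]

    # Group the coded evidence by VfM Component, mirroring how consolidated
    # data can be organised before it is summarised in a template.
    by_component = defaultdict(list)
    for record in records:
        by_component[record["component"]].append(record)

    for component, items in by_component.items():
        print(component, "->", [item["code"] for item in items])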

  • Interviews with GREAT team

To fill some of the data gaps and validate some of the preliminary findings, Learning and Change undertook a series of interviews with GREAT staff.


  • Scoring of the VfM Components

Using the consolidated data, the preliminary findings and the interviews, Learning and Change scored each VfM component using the guidance provided in the VfM Framework and Toolkit. Each component was scored from 1 to 5, as described below:

  1. Beginning – the Program did not address the component
  2. Developing – the Program has started to address the component
  3. Strengthening – the Program has addressed the component but some elements require further strengthening
  4. Satisfactory – the Program has fully addressed the component
  5. Significant – the Program has addressed the component beyond expectations

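As a purely illustrative sketch, rather than GREAT’s actual data or tooling, the short Python snippet below shows how such 1-to-5 component ratings might be recorded and summarised; the component names and scores are hypothetical.

    # Rubric labels taken from the scoring scale described above.
    RUBRIC = {
        1: "Beginning",
        2: "Developing",
        3: "Strengthening",
        4: "Satisfactory",
        5: "Significant",
    }

    # Hypothetical component scores, for illustration only.
    component_scores = {
        "Component A": 3,
        "Component B": 4,
        "Component C": 2,
    }

    for component, score in component_scores.items():
        print(f"{component}: {score} ({RUBRIC[score]})")

    # A simple average gives one possible headline summary across components.
    average = sum(component_scores.values()) / len(component_scores)
    print(f"Average score across components: {average:.1f}")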

  • VfM Task Force meeting

The preliminary findings were shared with the members of the VfM Task Force, a team set up as part of the VfM Framework with the aim of reviewing the findings and identifying key actionable recommendations to be taken forward in Phase 2.


  • Draft of the assessment report

Learning and Change completed a first draft of the assessment report, which was then reviewed by the GREAT team; the final version incorporated their key feedback and comments.


Learn more about our work on VfM here.


Get in touch to improve your impact and make a difference in the lives of the people you work with.