Overcoming data inconsistencies through evaluation

The Client: Commonwealth Government Department

Program evaluation design enables a Commonwealth funding initiative to overcome data inconsistencies and evaluate its achievements, outcomes and impacts

The situation

In 2013, a Commonwealth funding initiative to improve the delivery of services across Australia through the use of digital technologies concluded. The initiative had provided funding to a range of projects across the education, health and emergency management sectors, particularly targeting regional, rural and remote communities.

Grosvenor was engaged by the Department to conduct a final summative evaluation of the initiative to assess its achievements, outcomes and impacts, and to identify lessons learned for future policy development.

The challenge

As the initiative had funded a range of projects across multiple sectors and regions, the evaluation relied on the analysis of data from multiple sources. A common approach to data collection and reporting had not been implemented for each project, resulting in gaps and inconsistencies in the available data. This prevented the direct comparison of projects and the analysis of outcomes needed to evaluate the initiative's overall achievements, outcomes and impacts.

The project

To conduct meaningful analysis and comparisons, we had to design an evaluation approach that promoted consistency and completeness of information and data.

We first collected and reviewed all existing program documentation to inform the development of the evaluation criteria, which included key evaluation questions and data requirements. All available data and information were mapped against the evaluation questions to reveal the following (a simplified sketch of this mapping step appears after the list):

  • inconsistencies in the period of data available for each project
  • incomplete quantitative data relating to the outcomes of each project
  • inconsistencies in the definitions and use of key terms by the Department and the project participants.
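
To make the mapping exercise concrete, the sketch below shows one way such a gap analysis could be expressed in code. It is a minimal illustration only, with hypothetical project, question and field names; the actual mapping was performed against the Department's program documentation, not with this code.

```python
# Purely illustrative sketch of the mapping step. All names, questions and
# fields below are hypothetical, not the Department's actual data.

# Each key evaluation question lists the data fields needed to answer it.
EVALUATION_QUESTIONS = {
    "Were delivery targets met?": ["target", "actual"],
    "Was uptake sustained after funding ended?": ["uptake_by_period"],
}

# Data actually available for each funded project (field -> period covered).
available_data = {
    "Project A": {"target": "2010-2013", "actual": "2010-2013"},
    "Project B": {"target": "2011-2013"},
}

def find_gaps(available, questions):
    """For each project, list the fields missing for each question."""
    gaps = {}
    for project, fields in available.items():
        missing = {
            question: [f for f in needed if f not in fields]
            for question, needed in questions.items()
        }
        gaps[project] = {q: m for q, m in missing.items() if m}
    return gaps

for project, missing in find_gaps(available_data, EVALUATION_QUESTIONS).items():
    for question, fields in missing.items():
        print(f"{project}: cannot yet answer {question!r} - missing {fields}")
```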

The available data also indicated that some projects had not met their original targets at the conclusion of the initiative, suggesting that the desired outcomes may not have been achieved.

A range of data collection activities was developed and conducted to address the gaps and inconsistencies in the data. These activities specifically sought to:

  • clearly define key terms to ensure they were consistently used and applied by all stakeholders
  • collect quantitative data about each project's outcomes and achievements over clearly defined and documented time periods
  • request additional data from funded organisations where existing data did not cover the full project timeframe.

These activities yielded complete and consistent data that could be used to compare the different projects and evaluate the overall initiative. Analysis of the newly collected data confirmed that all projects had met, or exceeded, their original targets by the time of the evaluation. Further, by collecting data for the period between the conclusion of the initiative and the evaluation, we were able to identify the sustained, ongoing use and impact of some projects.
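
As a companion to the earlier sketch, the fragment below illustrates, again with hypothetical figures, how the consolidated data could then be checked against each project's original target. It is not the Department's actual data or analysis.

```python
# Hypothetical figures for illustration; not the initiative's actual results.
consolidated = [
    {"project": "Project A", "target": 500, "actual": 640},
    {"project": "Project B", "target": 1200, "actual": 1200},
]

for row in consolidated:
    if row["actual"] > row["target"]:
        status = "exceeded target"
    elif row["actual"] == row["target"]:
        status = "met target"
    else:
        status = "did not meet target"
    print(f"{row['project']}: target {row['target']}, actual {row['actual']} ({status})")
```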

The results

By ensuring that consistent and complete data were collected, we were able to accurately assess the uptake and overall impact of each funded project, showing that all targets had been met and that some ongoing outcomes had been achieved.

At the conclusion of the project we provided the client with a detailed report which addressed the initiative's achievements, outcomes and impacts. We also identified lessons learned for future policy development.

The client was highly satisfied with the report and subsequently engaged us to create an evaluation framework for a separate funding initiative that was about to commence. This framework will be used to inform the collection of performance data from each funding recipient from the outset of the initiative, enabling monitoring and evaluation over the next four years.