Today we published a news story on the site about what we learned from a review of 200 Heritage Grants evaluation reports.
Over the coming weeks and months I'm going to pull out some key points from the review which I think OC users will find most useful, starting with this list of dos and don'ts.
Observations from the strengths of the excellent reports:
- All of these reports embedded robust data collection methods at the outset of the project.
- Many of them used evaluation throughout the project lifespan to continually test and refine activities and to monitor their impact. Some were able to demonstrate how self-evaluation findings had already influenced the ongoing success of the project.
- All demonstrated expert understanding of research methodologies and of impact evaluation.
- Many provided commentary on the robustness of the data and guidance for how to interpret the findings, taking into account sample sizes and confidence intervals.
- There was often a good balance between the detail of qualitative case studies and what projects meant to people individually, combined with the broader picture provided by quantitative surveys.
- Some reports benchmarked their evaluation metrics and project successes against external references, thereby setting their findings within a wider context, e.g. other surveys produced by visitor/volunteer organisations, or the Generic Learning Outcomes framework.
- In some cases the more succinct excellent reports were supported by appendices, which included detailed supporting evidence such as a list of who was consulted, evaluation plans, activity plans and separate research reports.
- Many were explicit about how the learning from this project and the evaluation report would be used to inform best practice in future. Some explained exactly whom the report would be shared with, how they would use it and what impact this was expected to then have.
- All the reports clearly and explicitly focused on outcomes and legacy.
Observations from the weaknesses of the poor reports:
- It was clear that evaluation had not been a priority for these projects.
- Many reports were short summaries and lacked sufficient detail about their project. Some were incomplete despite being titled ‘evaluation report’: they reported on only one aspect of the project, or focused on the achievement of project-management milestones and processes rather than outcomes. Some reports contained only activity summaries; some even consisted of conservation plans and marketing plans.
- Some reports lacked a clear structure, or had no introduction or aims and objectives section through which to understand the context of the project.
- In many cases there was no evidence of any evaluation data. Reports therefore often relied on the authors’ own perspective, or used selective anecdotal data to support the findings. It was often unclear how the report’s judgements had been made.
- Where some data was provided, the reports did not demonstrate robust analysis of it. Some reports included information in appendices but neither referred to it in the main report nor attempted to analyse it.
- Many focused on universally positive anecdotal quotes and commentary, asserting that all objectives had been met despite a lack of robust evidence to support this. Where some objectives or targets had clearly not been met, there was often no explanation for this.
- Many lacked reflection and insight into the strengths and weaknesses of the project.
- Many did not attempt to consider objectively whether any lessons had been learned.
- Many did not consider outcomes or refer to the project’s wider impact outside of the fact that it had taken place and been delivered on time and to budget.
- A few reports had clearly not been proofread and had missing sections.
I'd be keen to hear your thoughts on what does and doesn't make a good evaluation report, or ways in which you've approached your own evaluation which others might find useful.
hi Amy, we are just about to start pulling our evaluation report together so this is very helpful, thank you!
You're most welcome!
Do you have any examples of good evaluation reports which you could share? It would be interesting to see how report writers have delivered these strengths.
Let me see what I can find.
As requested, here are examples of two excellent evaluation reports.
The first one comes from the Hands On Our Heritage project, which focused on the as-yet-untold industrial, agricultural and social history of Sheffield Manor Lodge, and its decline from 'great house' status.
The second one comes from Bletchley Park's Phase 1 restoration project, which helped preserve the heritage of the site and improve layout of, and accessibility to, the museum and education centre.
I'm just starting the Evaluation Report for our £50k cemetery restoration and interpretation project. All the sample/existing Evaluation Reports I have found appear to be for significantly larger projects - and are therefore very lengthy and detailed. Do you have some guidance for what might be appropriate for a project of this size?
Hi Jemma - I can't immediately find an example of a report for a project in the region of £50k. However, what might be useful is this basic model of suggested evaluation budget and method for projects under £100k:
- Spend on evaluation: completed internally, up to 3% of the project budget (so up to £1,500 for a £50k project)
- Type of evaluation: summative
- Acceptable methodologies: visitor books, TripAdvisor feedback, etc.
- Outcomes: focus on participation and engagement
If I do come across a suitable report example, I will be sure to post it here for you.
Amy, hi, a huge help, thank you
We have just updated our guidance, which provides background information on carrying out an evaluation, along with advice and ideas on producing your report. It also provides useful insight into how to measure and report on HLF outcomes.
The guidance will be relevant to anyone who is preparing an application, or carrying out a project, under any of our grant programmes.