
Creating an Annual Summary Report

1.   Introduction


Annual Summary Reports are documents that can be auto-generated from evaluations in the Culture Counts platform. They were created to help band 2 and 3 National Portfolio Organisations (NPOs) meet their funding requirements during the 2019-23 Arts Council England (ACE) funding period. Because of this, they only compare dimension and demographic questions which were mandatory for band 2 and 3 NPOs at that time.

Although Annual Summary Reports are still available to Impact & Insight Toolkit (Toolkit) users, we don’t recommend using them during the 2023-26 Investment Programme, as the questions they compare are limited. Annual Summary Reports have now been replaced by new Ambition Progress Reports, created for you by Counting What Counts (CWC). Read more about our new report types in the insights and learning section of our Evaluation Guide.


2.   How to create an Annual Summary Report


  1. Click on the ‘Reports’ button.

Screenshot of 'Report' button


  2. Click the ‘Summary Reports’ button.

Screenshot of 'Summary report' button


  3. Click on the ‘Prepare Summary Report’ button for the relevant reporting year.

Screenshot of 'Prepare summary report' button


  4. Select evaluations.

You will now be able to select which evaluations you want to include in the report. The ‘Evaluations | Submitted’ section will list evaluations that have had their data submitted to ACE. Use the tick boxes to select or deselect an evaluation; you can choose up to four evaluations.

Screenshot of 'Select evaluations' section


Unsubmitted evaluations are listed in the ‘Evaluations | Unsubmitted’ section. To select an unsubmitted evaluation for the report, you must first submit its data to ACE by clicking the ‘Submit data’ button on the evaluation.

Screenshot of 'Evaluations | Unsubmitted' section


Once you have selected your evaluations, click the ‘Generate Summary Report’ button.

Screenshot of the 'Generate Summary Report' button


  5. Review and annotate.

Screenshot of 'Review and annotate' section


A report of your selected evaluations will now be automatically generated. In the ‘Set-up report’ section, input your cover title, cover image and organisation name.


  6. Preview report.

The report data is pulled automatically from your evaluations. The report will compare responses to dimension and demographic questions which were mandatory for band 2 and 3 NPOs during the 2019-23 funding period:

  • Captivation: It was absorbing and held my attention
  • Challenge: It was thought provoking
  • Concept: It was an interesting [idea/programme]
  • Distinctiveness: It was different from things I’ve experienced before
  • Relevance: It has something to say about today’s world
  • Rigour: It was well thought through and put together
  • Excellence: It is one of the best examples of its type that I have seen
  • Originality: It was ground-breaking
  • Risk: The [artists/curators/organisers] were not afraid to try new things
  • Age: What is your age?
  • Gender: How would you describe your gender?
  • Postcode: What is your postcode?

The report will not include custom questions or data from other dimension questions. If a dimension listed above was not used within an evaluation, or no data was collected for it, no chart will be generated for that evaluation in the relevant dimension section.


  7. Explore and interpret your charts.

An important stage in compiling the report is to understand what the data can tell you about how people experienced different events. The charts in this report, combined with any commentary you collected from peer and public respondents, can help you to understand:

  • What kind of experience it was for people – which dimensions scored most highly?
  • Whether peers and public experienced the events in the same way.
  • How well aligned peer and public responses were with your original intentions for the events.


Each chart has a text box where you can add optional context. The word limit is 40 words.

Screenshot of 'Add optional context' text box


  8. Add an introduction.

This is an opportunity to provide information about why you chose these evaluations. The word limit is 400 words; you can copy and paste into the text box if you prefer to begin writing elsewhere. Click the ‘Show’ button in the ‘Suggestions’ section for prompts about what to include.

Screenshot of 'Introduction' section


  9. Add a conclusion.

This text box allows you to share the insights you gained through the evaluation. The word limit is 400 words. Click the ‘Show’ button in the ‘Suggestions’ section for prompts about what to include.

Screenshot of 'Conclusion' section


  10. Click the ‘Save to PDF’ button.

Screenshot of 'Save to pdf' button


Your report will be saved as a PDF to your device. This report is yours, and you may share it with whomever you like, whether that be Board members, staff or external stakeholders.

If you revise your report, you can update the saved PDF by clicking ‘Save to PDF’ again.


3.   Appendix: Interpreting charts


Average difference by dimension

These charts show, for each ‘core’ dimension, the average difference between public responses and self-expectations. Shorter bars indicate a smaller difference from self-expectation.

Example of difference by dimension charts


Average difference overall

These charts show the overall average difference for each evaluation, taking into account all ‘core’ dimension outcomes. Smaller numbers and circles indicate a smaller difference from self-expectation.

Example of difference overall charts

Note: These charts do not show whether public scores were above or below expectations; they focus on highlighting difference and alignment.
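To make the ‘difference’ measures above concrete, here is a short illustrative sketch. This is an assumption, not Culture Counts’ actual calculation: the dimension names and scores are invented, and the platform may scale or weight differences differently. It simply shows one plausible reading of ‘average difference’ as the absolute gap between the public’s average score and the self-assessor’s expectation, which (matching the note above) ignores whether scores were above or below expectations.

```python
# Illustrative only: a plausible 'average difference' calculation,
# NOT the Culture Counts platform's documented method.
# Scores are hypothetical values on a 0-1 scale.

self_expectation = {"Captivation": 0.80, "Rigour": 0.75}
public_scores = {
    "Captivation": [0.90, 0.85, 0.70],
    "Rigour": [0.60, 0.65, 0.70],
}

def average(values):
    return sum(values) / len(values)

# Per-dimension difference: |public average - self expectation|.
# abs() means direction (above/below expectation) is not shown,
# as the note above describes.
diff_by_dimension = {
    dim: abs(average(scores) - self_expectation[dim])
    for dim, scores in public_scores.items()
}

# Overall difference: mean of the per-dimension differences.
overall_difference = average(list(diff_by_dimension.values()))
```

A shorter bar (smaller per-dimension difference) or a smaller overall number would indicate closer alignment between public outcomes and self-expectations.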


Dimension outcomes

These charts show how participants (self, peer and public) responded to the ‘core’ dimensions and enable comparison of expectations with outcomes. Evaluations are ordered by the self-assessors’ average expectation, from lowest to highest.

Example of outcomes charts


4.   Troubleshooting


If your report doesn’t look correct or appears to be missing data, please ensure your evaluation is configured properly and that you have distributed your surveys to the correct respondents.

Possible causes:

  • Evaluation properties for artform, location or attendance have not been added
  • No peer-assessor responses
  • No self-assessor responses
  • Standard demographic questions not used
  • ‘Core’ dimension questions not used

The team at Counting What Counts is ready to support you. Please do not hesitate to get in touch with any questions.