Creating an Annual Summary Report
Long read | Last modified: 11 Sep 2023

1. Introduction

Annual Summary Reports are documents that can be auto-generated from evaluations in the Culture Counts platform. They were created to help band 2 and 3 National Portfolio Organisations (NPOs) meet their funding requirements during the 2019-23 Arts Council England (ACE) funding period. Because of this, they compare only the dimension and demographic questions that were mandatory for band 2 and 3 NPOs at that time.

Although Annual Summary Reports are still available to Impact & Insight Toolkit (Toolkit) users, we don't recommend using them during the 2023-26 Investment Programme, as the questions they compare are limited. Annual Summary Reports have now been replaced by the new Ambition Progress Reports, created for you by Counting What Counts (CWC). Read more about our new report types in the insights and learning section of our Evaluation Guide.

2. How to create an Annual Summary Report

Click the 'Reports' button.

Click the 'Summary Reports' button.

Click the 'Prepare Summary Report' button for the relevant reporting year.

Select evaluations. You will now be able to select which evaluations to include in the report. The 'Evaluations | Submitted' section lists evaluations whose data has been submitted to ACE. Use the tick boxes to select or deselect evaluations; you can choose up to four. Unsubmitted evaluations are listed in the 'Evaluations | Unsubmitted' section. To include an unsubmitted evaluation in the report, you must first submit its data to ACE by clicking the 'Submit data' button on the evaluation.

Once you have selected your evaluations, click the 'Generate Summary Report' button.

Review and annotate. A report of your selected evaluations will now be generated automatically. In the 'Set-up report' section, input your cover title, cover image and organisation name.

Preview the report.
The report data is pulled automatically from your evaluations. The report compares responses to the dimension and demographic questions that were mandatory for band 2 and 3 NPOs during the 2019-23 funding period:

Captivation: It was absorbing and held my attention
Challenge: It was thought-provoking
Concept: It was an interesting [idea/programme]
Distinctiveness: It was different from things I've experienced before
Relevance: It has something to say about today's world
Rigour: It was well thought through and put together
Excellence: It is one of the best examples of its type that I have seen
Originality: It was ground-breaking
Risk: The [artists/curators/organisers] were not afraid to try new things
Age: What is your age?
Gender: How would you describe your gender?
Postcode: What is your postcode?

The report will not include data from custom questions or from other dimension questions. If no data was collected for a dimension listed above, or the dimension was not used within an evaluation, no charts will be generated alongside the name of that evaluation in the relevant dimension section.

Explore and interpret your charts. An important stage in compiling the report is understanding what the data can tell you about how people experienced different events. The charts in this report, combined with any commentary you collected from peer and public respondents, can help you to understand:
- What kind of experience it was for people: which dimensions scored most highly?
- Whether peers and the public experienced the events in the same way.
- How well aligned peer and public responses were with your original intentions for the events.
Each chart has a text box for optional context. The word limit is 40 words.

Add an introduction. This is an opportunity to explain why you chose these evaluations. The word limit is 400 words; you can copy and paste into the text box if you prefer to begin writing elsewhere.
Click the 'Show' button in the 'Suggestions' section for prompts about what to include.

Add a conclusion. This text box lets you share the insights you gained through the evaluation. The word limit is 400 words. Click the 'Show' button in the 'Suggestions' section for prompts about what to include.

Click the 'Save to PDF' button. Your report will be saved as a PDF to your device. The report is yours, and you may share it with whomever you like, whether that be Board members, staff or external stakeholders. If you revise your report, you can update the PDF by clicking 'Save to PDF' again.

3. Appendix: Interpreting charts

Average difference by dimension
These charts show the average difference between public responses and self-expectation for each 'core' dimension. Shorter bars indicate less difference from self-expectation.

Average difference overall
These charts show the overall average difference for each evaluation, taking all 'core' dimension outcomes into account. Smaller numbers/circles indicate less difference from self-expectation.

Note: These charts do not show whether public scores were above or below expectations; they focus on highlighting difference and alignment.

Dimension outcomes
These charts show how participants (self, peer and public) responded to the 'core' dimensions and enable comparison of expectations with outcomes. Evaluations are ordered by the self-assessors' average expectation, from lowest to highest.

4. Troubleshooting

If your report doesn't look correct or appears to be missing data, please ensure your evaluation is configured properly and that you have distributed your surveys to the correct respondents. Possible causes:
- Evaluation properties for artform, location or attendance have not been added
- No peer assessor responses
- No self assessor responses
- Standard demographic questions not used
- 'Core' dimension questions not used

The team at Counting What Counts is ready to support you.
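To make the appendix's "average difference" charts easier to interpret, here is a minimal sketch of how such metrics could be computed. This is an illustration only, not the platform's documented calculation: the score scale (0-1 here), the use of absolute difference, and the function names are all assumptions.

```python
# Hypothetical sketch of the appendix's difference metrics.
# Assumes dimension scores on a 0-1 scale; the platform's exact formula may differ.

def average_difference_by_dimension(self_expectation, public_scores):
    """Per-dimension absolute difference between the self-assessor's
    expectation and the average public score (shorter bar = smaller value)."""
    return {
        dim: abs(self_expectation[dim] - sum(scores) / len(scores))
        for dim, scores in public_scores.items()
    }

def average_difference_overall(per_dimension):
    """Overall average difference across all 'core' dimensions
    (smaller number/circle = closer alignment with self-expectation)."""
    return sum(per_dimension.values()) / len(per_dimension)

# Example with two 'core' dimensions from one evaluation (illustrative data)
expectation = {"Captivation": 0.80, "Rigour": 0.70}
public = {"Captivation": [0.9, 0.7, 0.8], "Rigour": [0.6, 0.6]}

by_dim = average_difference_by_dimension(expectation, public)
overall = average_difference_overall(by_dim)
```

Note that, as the appendix says, these metrics measure only the size of the gap, not its direction: a public average above expectation and one equally far below it produce the same bar length.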
Please do not hesitate to get in touch with any questions.