An Annual Summary Report (hereafter referred to as the Report) is a document that can be auto-generated from evaluations in the Culture Counts platform. Once you have finished collecting data for an evaluation year (April to March), you are ready to create your Report for submission to Arts Council England (ACE). Your funding agreement requires you to create the Report and share it with ACE as part of your July payment condition. The Report contains graphs and insights from the four mandatory evaluations conducted over the previous evaluation year. It is an opportunity to reflect on your evaluated works, and it should be used to inform conversations about the impact your deliveries as a Band 2 or 3 NPO have on those who experience your work.
STEP 1: To create a Report, click on the Reports button at the top of the Culture Counts dashboard.
STEP 2: Click the Summary Reports button in the sidebar on the left-hand side.
STEP 3: Click on the orange Prepare Summary Report button next to the relevant reporting year.
STEP 4: Select evaluations.
You will now be able to select which evaluations you want to include in the Report. Under the ‘Evaluations | Submitted’ section you will see a list of evaluations that have had their data submitted to ACE. Use the tick box next to the name of an evaluation to select or deselect it; you can choose up to four evaluations. The ‘Type’ column shows whether an evaluation meets the requirements for an ACE ‘Core’ or ‘Flexible’ evaluation. If an evaluation does not meet the requirements, a dash symbol is shown instead. Please see our funding requirements page for more information.
Unsubmitted evaluations are listed in the ‘Evaluations | Unsubmitted’ section. To select an evaluation for the Report you must submit the data to ACE. Please see our ‘Creating an Insights Report’ guidance for more information.
Once you have selected your evaluations click the black Generate Summary Report button.
STEP 5: Set up the Report.
You will now be able to see a summary page for the Report. Scroll down past the ‘Select Evaluations’ section to the ‘Review and annotate’ section.
Input your cover title, cover image and organisation name.
STEP 6: Preview the Report.
Scroll down to preview the Report. You will now be able to see an auto-generated report of the selected evaluations. The data is pulled automatically from your evaluations, and is shown for responses to dimensions and demographic questions. If no data was collected for a particular dimension, or the dimension was not used within an evaluation, no chart will appear next to that evaluation's name in the relevant dimension section.
STEP 7: Explore and interpret your charts.
An important stage in compiling the Report is to understand what the data can tell you about how people experienced different events. The charts in this report, combined with any commentary you collected from peer and public respondents, can help you to understand how audiences and peers experienced your work.
Below each chart is a text box to add optional context. The word limit is 40 words.
STEP 8: Add an introduction.
This is an opportunity to provide information about why you chose these evaluations. The word limit is 400 words; you can copy and paste into the text box if you prefer to begin writing elsewhere. Click the Show button in the ‘Suggestions’ section below the text box to see prompts about what to include.
STEP 9: Add a conclusion.
This text box allows you to share the insights you gained through the evaluation. The word limit is 400 words. Click the Show button in the ‘Suggestions’ section below the text box to see prompts about what to include.
STEP 10: Click the Save to PDF button on the bottom left of the page.
Your report will be saved as a PDF to your device. To fulfil your funding requirements, send a PDF of the Report to your Relationship Manager and upload a copy to Grantium. Note that this report is yours, and you may share it with whomever you like, whether that be Board members, staff or external stakeholders.
If you revise your Report, click Save to PDF again to download an updated copy.
Average difference by dimension
These charts show the average difference between how the public responded to each cultural experience dimension and the self-assessors' expectations. Shorter bars indicate a smaller difference from self-assessor expectations.
Average difference overall
These charts show the overall average difference for each evaluation, taking all cultural experience dimension outcomes into account. Smaller numbers and circles indicate a smaller difference from self-assessor expectations.
Note: These charts do not show whether public scores were above or below expectations; they focus on highlighting difference and alignment.
These charts show how participants (self, peer and public) responded to the core cultural experience dimensions, enabling you to compare expectations with outcomes. Evaluations are ordered by the self-assessors' average expectation, from lowest to highest.
If your report doesn’t look correct or appears to be missing data, please ensure your evaluation is configured properly and that you have distributed your surveys to the correct respondents.
If you’re still having issues with your report, please contact firstname.lastname@example.org