Creating An Insights Report

1. What is an Insights Report?

An Insights Report is a document that can be auto-generated from an evaluation in the Culture Counts platform. Once you have finished collecting data for an evaluation, you can create an Insights Report. Depending on your funding requirements, you may need to submit a PDF of the report to your Arts Council England Relationship Manager; please see our funding requirements section for more details.
 

2. How to create an Insights Report

 
2.1. To create an Insights Report, select the evaluation you want to generate it from in the sidebar on the left-hand side of the dashboard.

 

2.2. Click the Prepare Insights Report button.

 

2.3. Confirm your data.

You will now be able to see an auto-generated report of the selected evaluation.

The data is pulled automatically from your evaluation. Data will be shown for responses to dimensions, demographics and custom questions. If no data was collected for a particular question, it will not be listed in the sidebar on the left-hand side.

If you completed the Evaluation Properties when setting up your evaluation, then Duration, Event type, Event Artform(s), Location and Attendance will be included in the heading of your report. The artform(s) that appear will be the top-tier artform(s) only. If you entered an attendance size in the Evaluation Properties section, your report will also include the margin of error for each average public dimension score.

If you did not enter any information into the Evaluation Properties section, the message ‘This report has missing evaluation properties’ will appear. Click the Set-up evaluation properties button to add your evaluation properties.

 

2.4. Select what dimensions, demographics and custom questions you want to include in the report.

Click on a question in the sidebar on the left-hand side to toggle between selecting and deselecting it: deselected questions appear greyed out with a line through the text; selected questions appear in a white box without a strikethrough.

 

2.5. Explore and interpret your charts.

An important stage in compiling the report is to understand what the data can tell you about how people experienced the event. The charts in this report, combined with any commentary you collected from peer and public respondents, can help you to understand:

  • What kind of experience it was for people – which dimensions scored most highly?
  • Whether peers and the public experienced the event in the same way.
  • How well aligned peer and public responses were with your original intentions for the event.
  • How much variation there was in public scores – were people in close agreement, or was there a spread of opinion?

Click the + Annotation button under each chart to add optional annotations. This allows you to add your own commentary and context to each individual chart.

 

2.6. Add your Creative Intentions.

This is an opportunity to provide information about your event and what it set out to achieve. The word limit is 250 words; you can copy and paste into the text box if you prefer to begin writing elsewhere. Click the Show button in the ‘Suggestions’ section below the text box to see prompts about what to include.

 

2.7. Add your Insights.

Similar to the Creative Intentions section, this text box allows you to share the insights you gained through the evaluation. The word limit is 250 words. Click the Show button in the ‘Suggestions’ section below the text box to see prompts about what to include.

Note: When you add text to your report, it will be saved automatically, giving you the option of coming back to the report to complete it at a later stage.

 

2.8. Click the Save to PDF Button on the top right of the page.

Your report will be saved as a PDF to your device. Note that this report is yours, and you may share it with whomever you like, whether that be Board members, staff or external stakeholders.

If revisions are made to your evaluation data, you can update your Insights Report and again click Save to PDF to save a copy of the updated version.

3. Submitting data to Arts Council England

 

3.1. Select the evaluation you want to submit data for from the sidebar on the left-hand side.

 

3.2. Click the Submit data button.

 

3.3. Confirm your data.

You will now be able to see a pop-up summary of your evaluation data. The data is pulled automatically from your evaluation. Summary data will be shown for the number and type of responses. Dimension and demographic questions used will also be listed. Please note that all dimension and demographic questions used in the selected evaluation will be listed, regardless of whether any response data has been recorded.

The pop-up will also tell you whether your evaluation meets ACE requirements for BAND 2 & 3 Core or Flexible evaluations. This information may not apply to you; please see our funding requirements section for more details.

 

3.4. Click the blue Submit button at the bottom of the pop-up.

Once you click Submit your evaluation data will be shared with ACE. Submitted data will include dimensions and demographics. It will not include custom questions or personal data such as names or contact details.

Appendix: Interpreting charts

 

Dimension averages (mean)

These charts provide a quick snapshot of dimension averages and highlight where the public, peer and self align or diverge in their level of agreement. Note: only dimensions shared across all respondent groups are shown in these charts.

01 Dimensions are plotted in the form of a radar chart with two rings (inner and outer). The shape and number of axes reflect how many dimensions have been selected. In the example shown, six dimensions have been selected.

02 Each axis ranges from 0 to 100, where 0 is strongly disagree, 50 is neutral and 100 is strongly agree.

03 The average (mean) score for each dimension is plotted.
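
To make the calculation concrete, here is a minimal sketch in Python, using hypothetical scores rather than real Culture Counts data, of how the plotted mean is derived for one dimension:

    # Illustrative only: hypothetical 0-100 agreement scores for one
    # dimension, grouped by respondent type (not real platform data).
    scores = {
        "public": [72, 85, 64, 90, 78],
        "peer": [68, 74, 71],
        "self": [80],
    }

    # The radar chart plots one point per dimension and respondent group:
    # the mean of that group's scores on the 0 (strongly disagree) to
    # 100 (strongly agree) scale.
    for group, values in scores.items():
        mean_score = sum(values) / len(values)
        print(f"{group}: {mean_score:.1f}")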

Intention vs outcome

These charts show self prior, peer and public average (mean) scores. Dimensions are ordered from highest to lowest self prior score. If no line is shown for a respondent group, that dimension was not asked of that group.

For example, an interpretation of this chart could be that, on average, peers consistently scored the event more highly than self assessors and the public.
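
As an illustration of the ordering rule, this sketch uses hypothetical dimension names and mean scores; a dimension missing from a respondent group's data corresponds to a chart with no line for that group:

    # Hypothetical average (mean) scores per dimension (illustrative only).
    self_prior = {"Captivation": 92, "Rigour": 84, "Relevance": 75}
    peer = {"Captivation": 88, "Rigour": 79, "Relevance": 80}
    public = {"Captivation": 85, "Relevance": 78}  # 'Rigour' not asked

    # The chart orders dimensions from highest to lowest self prior score.
    for dim in sorted(self_prior, key=self_prior.get, reverse=True):
        print(dim, self_prior[dim], peer.get(dim), public.get(dim, "not asked"))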

Variation in public scores

This chart is a box and whisker plot. The vertical line shows the median public score. The green bar shows the interquartile range: the middle 50% of public responses lie within this range. See the Glossary below for definitions of quartiles and the interquartile range.

For example, an interpretation of this chart could be that although the dimension ‘Belonging’ has, on average, the highest public score, it is also the dimension for which the public gave the greatest range of response values.

Stacked level of agreement

This is a stacked bar chart. It shows the percentage of public responses that fall within a given range. Each range is represented by a different coloured bar, with labels below.

For example, an interpretation of this chart could be that across the four dimensions all public respondents either agreed or strongly agreed with the dimension statements.
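
The report does not state the exact band boundaries, so the cutoffs in this sketch are illustrative assumptions only; it shows how 0-100 scores might be binned into the coloured bands:

    # Hypothetical public scores for one dimension (illustrative only).
    scores = [55, 62, 71, 88, 93, 76, 81, 97, 66, 84]

    # Assumed band boundaries for illustration; the report's actual
    # cutoffs may differ.
    bands = [
        ("strongly disagree", 0, 20),
        ("disagree", 20, 40),
        ("neutral", 40, 60),
        ("agree", 60, 80),
        ("strongly agree", 80, 101),  # upper bound includes 100
    ]

    total = len(scores)
    for label, lo, hi in bands:
        count = sum(1 for s in scores if lo <= s < hi)
        print(f"{label}: {100 * count / total:.0f}%")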

Response distribution

These distributions offer a more in-depth look at how the public varied in their opinions. The taller the line at any given point, the larger the proportion of public respondents who gave that response. A single large peak indicates that most people agreed; multiple peaks indicate varied opinions amongst public respondents.

For example, an interpretation of this chart could be that although the previous chart showed 71% of responses to the dimension ‘Authenticity’ fell within ‘agree’, most of those responses sit at the lower end of that band.
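
One common way to produce a smooth curve like this is a kernel density estimate. The sketch below uses hypothetical scores and scipy; it is not necessarily how the platform draws its charts:

    import numpy as np
    from scipy.stats import gaussian_kde

    # Hypothetical public scores for one dimension (illustrative only).
    scores = np.array([62, 65, 68, 70, 71, 73, 74, 75, 88, 90, 92])

    # The higher the estimated density at a score, the larger the share
    # of respondents who answered near that score.
    kde = gaussian_kde(scores)
    grid = np.linspace(0, 100, 101)
    density = kde(grid)

    # Two clusters (around 70 and 90) would show as two peaks,
    # indicating varied opinion rather than consensus.
    print(grid[np.argmax(density)])  # location of the tallest peak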

Glossary

Margin of Error

The margin of error represents how confident we are in our reporting of each dimension, based on the number of responses received. It shows how close we think the average public score is to the ‘true’ average score of all audience members or visitors who experienced the work. The lower the margin of error, the more likely it is that the result is reflective of the total audience. A margin of error of 2% means that if you were to survey your entire audience, the resulting score would likely be within 2% above or below the reported score.
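
This guide does not give the platform's exact formula. A standard survey-statistics estimate of the margin of error for a mean score, assuming a 95% confidence level and a finite population correction (which is where the attendance size from Evaluation Properties comes in), looks like this sketch:

    import math

    def margin_of_error(scores, population_size, z=1.96):
        # A standard survey-statistics estimate; not necessarily the
        # exact calculation the Culture Counts platform performs.
        n = len(scores)
        mean = sum(scores) / n
        # Sample standard deviation.
        sd = math.sqrt(sum((s - mean) ** 2 for s in scores) / (n - 1))
        # Standard error of the mean.
        se = sd / math.sqrt(n)
        # Finite population correction: surveying a larger share of the
        # total audience (attendance) shrinks the margin of error.
        fpc = math.sqrt((population_size - n) / (population_size - 1))
        return z * se * fpc

    # Hypothetical example: 50 responses from an audience of 400.
    scores = [70 + (i % 20) for i in range(50)]
    print(f"margin of error: {margin_of_error(scores, 400):.1f} points")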

Quartiles

Quartiles tell us about the spread of a data set by breaking the data set into quarters. The top 25% of responses lie in the upper quartile and the bottom 25% of responses lie in the lower quartile.

Interquartile Range

Interquartile range is a measure of spread. It is defined as the difference between the upper and lower quartiles, or the range within which the middle 50% of responses lie.
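
As a concrete illustration with hypothetical scores, both quartiles and the interquartile range can be computed with numpy:

    import numpy as np

    # Hypothetical public scores for one dimension (illustrative only).
    scores = np.array([40, 55, 60, 65, 70, 72, 75, 80, 85, 95])

    # numpy's percentile function gives the quartiles directly.
    q1, median, q3 = np.percentile(scores, [25, 50, 75])

    # The interquartile range is the width of the green bar in the box
    # and whisker chart: the middle 50% of responses fall inside it.
    iqr = q3 - q1
    print(f"lower={q1}, median={median}, upper={q3}, IQR={iqr}")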

Troubleshooting

If your report doesn’t look correct or appears to be missing data, please ensure your evaluation is configured properly and that you have distributed your surveys to the correct respondents.

Possible causes:

  • Evaluation properties for artform, location or attendance have not been added
  • No peer assessor responses
  • No self assessor responses
  • Standard demographic questions not used
  • Core dimension question not used

Further assistance

For more information on interpreting your Insights Report please watch the Interpreting Toolkit Reports webinar.

If you’re still having issues with your report, please contact support@countingwhatcounts.co.uk.
