Creating Your Insights Report


Once you have finished collecting data for an evaluation, you are ready to create your final report for submission to the Arts Council. Your funding agreement requires you to finalise and submit this report to your Relationship Manager within one month of the event’s end date. From your dashboard, select the evaluation from the left sidebar, then click Go to Report to be taken to the Reporting section of your dashboard, which you should already be familiar with (see Fig. 1). In the top right-hand corner you will see two buttons: Prepare Evaluation Report and Export Evaluation. Click Prepare Evaluation Report. Please note that this button will only be visible once some data has been collected.

Reporting dashboard in Culture Counts

Fig. 1 Where to find the Insights Report template

Upon clicking this button, you will be taken to an auto-generated report. Here you can confirm the details in the report and add contextual information before sending the report to your Relationship Manager. As you complete each stage of the process, the points along the left sidebar will display a tick mark confirming completion.

Fig. 2 Page 1 of an example Insights Report

What you need to do next:

  1. Confirm that you are happy with the data as it is presented
  2. Explore and interpret the charts
  3. Add your Creative Intentions (strongly recommended)
  4. Add commentary on insights gained (strongly recommended)
  5. Submit report through the platform
  6. Save it as a PDF and email it to your Relationship Manager

 

1. Confirming Your Data

The data is pulled automatically from your evaluation, so everything should appear correctly without you needing to make any adjustments. Data will only be shown for responses to the Core Dimensions used and not for custom questions. If no data was collected for a particular dimension, you will see N/A in place of data. Similarly, if your surveys did not include demographics questions, you will see a blank space under those headings.

If you completed the Evaluation Properties when setting up your evaluation, then Type, Artform(s) and Location will be included in the heading of your report. The artform(s) that appear will be the top-tier artform(s) only. If you entered the attendance size in the Evaluation Properties, your report will include the margin of error for each average public dimension score.

 

2. Exploring and Interpreting the Charts

The most important stage in compiling the report is to understand what the evaluation data can tell you about how people experienced the event. The charts in this report, combined with any commentary you collected from peer and public respondents, can help you to understand:

  • what kind of experience it was for people – which dimensions scored most highly?
  • whether peers and public experienced the event in the same way
  • how well aligned peer and public responses were with your original intentions for the event
  • how much variation there was in public scores – were people in close agreement, or was there a spread of opinion?

Please see the appendix for more information on how to interpret the different charts in the report. Once you have a sense of what the evaluation data are telling you, you are ready to add your Creative Intentions and Insights.

 

3. Adding Your Creative Intentions

This is an opportunity to provide information about your event and what it set out to achieve. This context will help your Relationship Manager to make sense of the evaluation results. You’ll be shown the following prompts to get you thinking about what to include, but it’s up to you how you use this space. There is a limit of 250 words, and you may copy and paste into this box if you prefer to begin writing elsewhere.

  • What were the aims of the event?
  • Was the event aimed at a specific audience (e.g. children and young people)?
  • Which dimensions were most important for this event and why?
  • How were you hoping peers and audience members would respond?

 

4. Adding Insights

Similar to the Creative Intentions section, this box allows you to share the insights you gained through the evaluation with your Relationship Manager. The word limit is 250 words. You will be prompted with the following questions to guide you:

  • Did the findings make sense to you given your original aims for the event?
  • Which aspects of the findings are you most proud of? Are you surprised by any of the findings?
  • What did you learn from any additional public or peer commentary that you collected?
  • How can you use the findings to inform your programming or practice in the future?

Note: When you add text to your report, it will be saved automatically, so you can come back to complete the report at a later stage.

 

5. Submission and Saving as a PDF

Once you feel the report is complete, click on the Submit button on the left-hand side of the screen. You will be shown the following message:

Once you submit this report, the data that drives it will be shared with ACE as part of your funding requirements. This will not include personal data such as names or contact details. You should then save your report as a PDF and share it with your Relationship Manager via email.

Upon clicking OK, a window will open to allow you to save the report as a PDF to your device.

Please save your report as a PDF and share it with your Relationship Manager via email. Note that this report is yours, and you may share it with whomever you like, whether that be Board members, staff or external stakeholders.

If revisions are made to your evaluation data after submission, for whatever reason, you do not need to resubmit this form. However, you may want to save a new PDF and email the updated version to your Relationship Manager.

 

Appendix: Interpreting Charts

 

Dimension Averages (Mean)

These charts provide a quick snapshot of dimension averages and highlight where the public, peer and self align or diverge in their level of agreement.

radar chart example

  1. Dimensions are plotted to form a hexagon with six axes and two rings (inner and outer).
  2. Each axis ranges from 0 to 100, where 0 is strongly disagree, 50 is neutral and 100 is strongly agree.
  3. The average (mean) score for each dimension is plotted.
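The Toolkit calculates these averages for you, but if it helps to see the arithmetic behind the chart, here is a minimal sketch using hypothetical response values (not data from a real evaluation):

```python
# Hypothetical public responses for one dimension, on the 0-100
# agreement scale (0 = strongly disagree, 50 = neutral, 100 = strongly agree).
relevance_scores = [70, 85, 90, 60, 95, 80]

# The plotted value is simply the arithmetic mean of all responses.
mean_score = sum(relevance_scores) / len(relevance_scores)
print(mean_score)  # 80.0
```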

 

 

Intention vs. Outcome

 

These charts show the mean self (prior), peer and public scores. They indicate how an organisation’s creative intentions for a project (prior to the event) compared with the actual experiences of peers and the public (post event).

Example Intention vs. Outcome chart for the Relevance dimension

Variation in Public Scores

These charts show the mean level of agreement for each of the six public dimensions, as well as an indication of how varied audience responses were around the mean. If the attendance number is known, the margin of error will also be shown for each score.

variance chart

 

The number in the green circle is the mean public score. The vertical green line shows the median public score. The light green horizontal bar represents the interquartile range: the middle 50% of public responses lie within this range. The wider the bar, the more varied the public agreement level was for that dimension.

If the mean score is positioned to the left or right of the median, there are more extreme values outside the interquartile range pulling the average in that direction.

In the example above, the mean score for Relevance is 83, with a lower quartile (Q1) score of 70 and an upper quartile (Q3) score of 90. The spread of scores is narrow, indicating that the public tended to have similar opinions on this dimension. The mean is to the right of the median, meaning there were more high scores outside the interquartile range than low scores.
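To see how a lopsided spread pulls the mean away from the median, here is a small illustration with hypothetical scores (not data from a real evaluation):

```python
import statistics

# Hypothetical public scores for one dimension (0-100 agreement scale),
# with a cluster of very high scores at the top end.
scores = [70, 72, 75, 78, 80, 82, 85, 88, 90, 95, 98, 98]

print(statistics.median(scores))  # 83.5  - the vertical green line
print(statistics.mean(scores))    # 84.25 - pulled to the right of the
                                  #         median by the extreme high scores
```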

 

Troubleshooting

 

If your report doesn’t look correct or appears to be missing data, please ensure your evaluation is configured properly and that you have distributed your surveys to the correct respondents.

Possible causes:

  • Evaluation properties for artform, location or attendance have not been added
  • No peer assessor responses
  • No self assessor responses
  • Standard demographic questions not used
  • Core dimension question not used

 

Further Assistance

If you’re still having issues with your report, please contact support@countingwhatcounts.co.uk

  

Glossary

 

Margin of Error

The margin of error represents how confident we are in our reporting of each dimension, based on the number of responses received. It shows how close we think the average public score is to the ‘true’ average score of all audience members or visitors who experienced the work. The lower the margin of error, the more likely it is that the result is reflective of the total audience. A margin of error of 2% means that if you were to survey your entire audience, the resulting score would likely be within 2% above or below the reported score.
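The Toolkit calculates the margin of error for you, and its exact formula is not documented on this page. Purely for illustration, a common approach for a sample mean when the total audience size is known looks like the following sketch (a hypothetical helper, not the Toolkit’s published method):

```python
import math

def margin_of_error(std_dev, sample_size, audience_size, z=1.96):
    """Approximate 95% margin of error for a sample mean, with a finite
    population correction. Hypothetical sketch, not the Toolkit's formula."""
    standard_error = std_dev / math.sqrt(sample_size)
    correction = math.sqrt((audience_size - sample_size) / (audience_size - 1))
    return z * standard_error * correction

# e.g. 150 survey responses from a total audience of 1,000,
# with a standard deviation of 20 points on the 0-100 scale
print(round(margin_of_error(20, 150, 1000), 1))  # ~3.0 points
```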

 

Quartiles

Quartiles tell us about the spread of a data set by breaking the data set into quarters. The top 25% of responses lie in the upper quartile and the bottom 25% of responses lie in the lower quartile.

 

Interquartile Range

Interquartile range is a measure of spread. It is defined as the difference between the upper and lower quartiles, or, the range in which the middle 50% of responses lies.
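Using the same hypothetical scores as in the example above, quartiles and the interquartile range can be computed like this (a sketch for illustration only; quartile conventions vary slightly between tools):

```python
import statistics

scores = [70, 72, 75, 78, 80, 82, 85, 88, 90, 95, 98, 98]

# statistics.quantiles with n=4 returns the three quartile cut points
q1, _, q3 = statistics.quantiles(scores, n=4)
print(q1, q3, q3 - q1)  # 75.75  93.75  IQR = 18.0
```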

 

N/A

Not available (N/A) indicates missing data or missing evaluation properties.

 


 


 

The information on this page was last updated on 15 October 2019.
