Developing the ‘Evaluation for Online Works Template’ to suit your Organisation and Works

As a user of the Impact & Insight Toolkit, you will find a new evaluation template in your dashboard entitled 'Evaluation for Online Works Template'.  This template has been designed in collaboration with arts and cultural organisations as well as funding bodies, in order to establish a survey that captures insight into the ever-increasing use of digital media to experience arts and cultural works.  More information on its development can be found here.  This document explains how a Toolkit user might choose to develop and alter the existing template to suit their individual needs.

  1. Dimension Usage
  2. Survey Length
  3. Suggested Question Alterations (Removals and/or Additions)
  4. Survey Delivery

Dimension Usage

The template includes four of the core cultural experience dimensions.  These have been selected to provide continuity across both NPO and CPP evaluations, regardless of the financial year and funding requirements.  This will provide you with comparable data points, enabling you to understand the difference between work that is experienced online and work that is experienced in person.  The four featured in the evaluation template can be swapped for any of the other core cultural dimensions as you see fit.  As a reminder, the core cultural experience outcomes for NPOs are below, with those included in the template italicised:

  • Concept
  • Captivation
  • Distinctiveness
  • Challenge
  • Rigour
  • Relevance

Two of the selected core dimensions, Rigour and Relevance, are also part of the core set for CPPs.

Whichever core cultural experience dimensions you choose, we suggest that they speak to the aims of the work and to your organisation's mission.

 

Survey Length

As always, we want you to obtain maximum insight without overwhelming respondents with too many questions or placing too high a demand on their time.  When designing a survey for members of the public, we usually suggest aiming for a completion time of 3 minutes, with some flexibility.  This makes allowances for those who may take longer to complete a survey than others.  There will inevitably be variation in the amount of time someone will dedicate to completing a survey, but a key factor is whether it is an outward-facing, customer-opinion-style survey or one being completed for professional reasons.  It is because of this that a peer survey is generally longer and may take a few more minutes to complete than the public survey.  There is evidence to suggest that the drop-off rate increases after 8 minutes, so we would suggest aiming for a completion time of 6 minutes for peer reviews, with some flexibility.

The importance of keeping a survey concise increases when surveying work experienced online.  The reason for this is that the attendee's commitment to experiencing the work is likely to be lower than when attending a work in person.  To explain: when attending in person, audiences typically need to set aside time in their calendar, perhaps buy their ticket in advance, and arrange transport.  When they are at home, there is less commitment involved.

This carries through to their commitment to opening and completing a survey: the amount of effort someone is likely to make correlates positively with their investment in the work.

 

Suggested Question Alterations

Removals

The new template is designed to be all-encompassing: something that can be used if you have no other data, or if you are uncertain how to gather further data.  As a result, it might feel too lengthy, especially if you are confident in gathering and using additional data (e.g. YouTube analytics).

With that in mind, here are a few questions you might choose to remove from the survey as you tailor it to your organisation and work:

  • Statement: I felt that this digital experience met my need for cultural activity
  • Short-text: Please write three words to describe your experience of this work
  • Dropdown: How would you rate your experience overall?
  • Multiple choice: On what sort of device did you experience this work?
  • Dropdown: How long did you engage with this work for?
  • Short-text: Why did you not stay for the full duration of the work?
  • Yes/no: Either during or afterwards, did you share anything about this work on social media?

Having copied the survey into your own evaluation, you can remove any question by clicking on its corresponding dustbin icon on the 'Design' page.

Additions

Depending on your organisation's current position and the specifics of the work you are evaluating, there are many other questions you may wish to consider.

If you are running work online via a means which allows for live commentary (YouTube Premiere, IGTV, Facebook Live, etc.), it would be appropriate to use a custom slider or dropdown (see here for guidance on how to create these) to ask for the level of agreement with the following statement:

The ‘live’ comments made me feel part of an audience community experiencing this work.

If it is anticipated that the reach of the work will be reasonably small, say an online workshop with a maximum of 30 participants, there would be a case for incorporating more free-text questions, as these allow for richer insight.  This is not recommended when there is a larger reach, as conducting analysis on qualitative feedback can be resource-heavy.  An example of a non-directive free-text question can be seen below:

Is there anything else you’d like to share about your experience?
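
If you do gather free-text responses, even a very lightweight first pass can reduce the analysis burden before any deeper qualitative work.  The sketch below is illustrative only and is not part of the Toolkit; the example responses, word pattern and tiny stop-word list are all assumptions, shown simply to tally the most frequent words across answers.

```python
# A minimal sketch (not Toolkit functionality) of a lightweight first pass
# over free-text answers: tally the most frequent words to surface recurring
# themes before any deeper qualitative analysis. The responses and the tiny
# stop-word list are hypothetical.
from collections import Counter
import re

responses = [
    "Warm, joyful and surprisingly intimate",
    "Joyful but the stream kept buffering",
    "Intimate, moving, joyful",
]

stop_words = {"and", "but", "the"}
words = Counter(
    word
    for text in responses
    for word in re.findall(r"[a-z']+", text.lower())
    if word not in stop_words
)
print(words.most_common(3))  # e.g. [('joyful', 3), ('intimate', 2), ('warm', 1)]
```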

If a piece of online work is targeting those who regularly engage with arts and culture, a question about the respondent's expectations could be asked.  Presented as a dropdown, a question such as the one below provides a useful point of reference for learning how the public perceives the concept of online work, and how that perception influences their responses to the quality-focused dimension questions.

Did the work meet your expectations? 

You might also be interested to learn how much effort has gone into the attendee's experience of the work, in order to establish what effect, if any, this has.  For instance, if you are surveying a piece of theatre which was pre-recorded and is now being shown digitally, you might want to know the extent to which a similar atmosphere was recreated at home: did the attendee close the curtains, turn off their phone and turn up the volume?  If so, what impact does this have on their experience?  It might even be that they organised with friends or family to watch simultaneously, and then had a video call afterwards to discuss it.  A multiple-choice question, such as the one below, could provide much insight.

How did you recreate the atmosphere of the theatre at home?

The questions shown above are, of course, only a sample of additional questions you may wish to ask of your respondents.  There are many other topics you may wish to consider, for instance:

  • Providing the opportunity to join a mailing list
  • Gauging appetite to engage with similar works in the near future
  • Asking a Net Promoter Score question, enabling you to gain some understanding of the further reach of your work (a minimal calculation sketch follows this list)
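
If you do add a Net Promoter Score question, the score itself is straightforward to calculate once responses are in.  The sketch below is illustrative rather than part of the Toolkit: it assumes the conventional 0-10 'How likely are you to recommend this work?' scale and made-up example ratings.

```python
# A minimal sketch (not Toolkit functionality) of how a Net Promoter Score
# could be calculated from answers to the standard 0-10 question
# "How likely are you to recommend this work to a friend or colleague?"
# Scores of 9-10 count as promoters and 0-6 as detractors;
# NPS = % promoters minus % detractors, giving a value from -100 to +100.

def net_promoter_score(scores):
    """Return the NPS for a list of 0-10 ratings."""
    if not scores:
        raise ValueError("No responses to score")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Hypothetical ratings gathered after an online screening
print(net_promoter_score([10, 9, 8, 7, 6, 10, 3, 9]))  # 25.0
```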

Through a combination of the questions included within the template and carefully selected additional questions, the insight you could gain from this survey could be invaluable as you develop your future programme.

 

Survey Delivery

When setting up your survey, it is important to consider how it will be delivered.  When a work is delivered online, there is no opportunity for follow-up, face-to-face interviews.  Similarly, it may be that you do not have access to the email addresses of those who have experienced the work.  Depending on how your work is delivered, there may be opportunities to:

  • Encourage those who are experiencing the work to complete the survey via live commentary (consider IGTV, YouTube Live, etc.)
  • Use social media to publicise the work and the subsequent survey
  • If it is a live stream, encourage those who are delivering the work to publicise the survey
  • If the work is pre-recorded, include the survey's QR code and URL at various points, encouraging attendees to respond throughout the work (a minimal generation sketch follows this list)
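
Generating a QR code for your survey link is quick to do.  The sketch below is a minimal example rather than Toolkit functionality: it assumes the third-party Python 'qrcode' library and a hypothetical survey URL, and simply saves a PNG you could overlay on a pre-recorded video or share alongside a stream.

```python
# A minimal sketch assuming the third-party 'qrcode' library
# (pip install "qrcode[pil]") and a hypothetical survey URL.
import qrcode

SURVEY_URL = "https://example.com/your-survey"  # placeholder, not a real survey link

img = qrcode.make(SURVEY_URL)  # build the QR code as an image
img.save("survey_qr.png")      # PNG ready to overlay on a video or slide
print("Saved survey_qr.png pointing at", SURVEY_URL)
```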

It is important to acknowledge that response rates to surveys for online works are likely to be lower than for in-person works.  This is partly due to the lower commitment required to experience the work, as mentioned in the Survey Length section of this guide.  However, this is not to say that you won't achieve great insight.  All responses to your survey will contribute to your understanding of how your online works are perceived by those who experience them, and of the impact your organisation is having as a whole.

 

The team at Counting What Counts is ready to support you in using the evaluation template and tailoring it to your needs.  We can advise on question choice, delivery methods and which respondent categories you may wish to use.

 

Please do not hesitate to get in touch with any questions at all:

support@countingwhatcounts.co.uk

 

Page updated: 25/06/2020

 
