User Guide

This section will get you thinking carefully and strategically about how to shape your surveys so that you make the most of the Impact & Insight Toolkit. Click on the titles below to explore the different subsections.

1. Reflect on your objectives

2. Make your objectives measurable

3. Survey design

4. Engaging self assessors and peer reviewers

5. Data collection

1. Reflect on Your Objectives

Before you begin designing your surveys, think about what your event is trying to achieve.

  • What are the aims of the event you have chosen to evaluate?
  • What is the mission behind your organisation?

Take a look at the excerpts from several NPOs’ mission statements below. These highlight some of the fundamental objectives that these arts and cultural organisations have set out to achieve.

Our mission is to inspire people through innovative art and culture – contributing to our region’s well-being, learning and economy.   – Firstsite

To place the best contemporary classical music at the heart of today’s culture; engaging and challenging the public through inspiring performances of the highest standard, and taking risks to develop new work and talent. – London Sinfonietta

To stimulate curiosity and wonder, promoting opportunities for people of all ages, abilities and backgrounds to participate in and enjoy exhibitions, educational programmes, activities and events. – Horniman Museum

There’s a diverse mix of goals represented here, with keywords like inspire, innovative, challenging and curiosity. Think about where your organisation’s ambitions lie. Also consider whether the event you are evaluating represents something different, risky or challenging for your organisation. Are you pushing the boundaries of your artform, or interpreting existing work in new ways? Are you trying to reach a new audience, or offer something different for your core audience?

The next section will outline how you can use the Impact & Insight Toolkit to evaluate your impact as you pursue your creative ambitions.


2. Make Your Objectives Measurable

Anecdotally and intuitively, most organisations have a sense of the extent to which they are achieving their goals, but these objectives are often not measured in a structured way. The Impact & Insight Toolkit enables you to:

  • Build evaluations around key objectives to find out how audiences or participants were impacted by a given piece of work
  • Capture public data about impact to make a stronger case for the outcomes your organisation generates

To get the most out of the Impact & Insight Toolkit, it is important to select dimensions that are relevant to your organisation’s objectives, taking into account those dimensions that are required by the Arts Council for ‘mandatory evaluations’.

Arts Council Mandatory Evaluations

Arts Council England Core Cultural Experience Dimensions

The Arts Council has identified the following core dimensions that it would like NPOs to use for any public-facing mandatory evaluation (for self, peer and public review):

Self, Peer and Public

  • Concept: It was an interesting idea
  • Distinctiveness: It was different from things I’ve experienced before
  • Challenge: It was thought-provoking
  • Captivation: It was absorbing and held my attention
  • Relevance: It has something to say about the world in which we live
  • Rigour: It was well thought through and put together

There are a number of advantages to using pre-set dimensions across all your evaluations, not just the mandatory evaluations you share with the Arts Council. Pre-set dimensions can save you time when you’re designing your evaluations – you have access to tried-and-tested survey questions without having to create them from scratch. If you use the core dimensions consistently across your evaluations, you will be able to compare the impact of different events you undertake between now and 2022. You will be able to track whether your programme as a whole is heading in the right direction, identify where you are having the greatest impact, and spot areas that may need more energy and resources.

As the Arts Council’s core dimensions will be used by many NPOs, you will be able to compare your results with similar organisations and contribute your data to a shared, aggregate, anonymous dataset. This will help build evidence of the impact of the sector as a whole, not just your organisation.

In addition to core dimensions, you may want to include further dimensions that reflect your creative intentions for the event, or custom questions to capture information about your audience.

 

Dimension Choice in Practice

Consider the following example:

Festival 1 surveys their attendees, asking a combination of demographic questions and marketing questions (e.g. ‘How did you hear about this event?’), and seeks to gauge overall satisfaction with the festival and the likelihood that people will recommend it to others.

Festival 2 designs their surveys to assess their impact by employing a range of questions that seek to understand whether the event helped visitors feel connected to people in the community, whether it inspired them to be more creative, and whether it challenged them to think in new ways.

If a visitor to Festival 1 expresses dissatisfaction with the event, the survey does not capture the crucial information that would explain why. Even if the majority of survey takers respond positively, it may be difficult to replicate the success of the festival in the future without understanding what it was about the event that made it enjoyable or memorable. By only choosing these kinds of questions, Festival 1 has limited the usefulness of the data collected.

In comparison, Festival 2 is able to understand how particular groups of interest experienced the event in different ways. For example, suppose Festival 2’s results show that, although they are successfully attracting diverse audiences, women under 30 were highly challenged and captivated while men and older visitors were less engaged. It is this type of insight that makes an evaluation meaningful and allows for evidence-based actions.
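
To make that kind of breakdown concrete, here is a minimal sketch (not a feature of the Toolkit itself) of how an export of public responses could be grouped by demographic. The file name, column names and 0–1 score scale are illustrative assumptions rather than the platform’s actual export format.

    import pandas as pd

    # Hypothetical export: one row per respondent, with demographic columns
    # and dimension scores assumed to be on a 0-1 slider scale.
    responses = pd.read_csv("festival2_responses.csv")

    # Mean Challenge and Captivation scores for each gender and age band,
    # showing which groups were more or less engaged.
    breakdown = (
        responses.groupby(["gender", "age_band"])[["challenge", "captivation"]]
        .mean()
        .round(2)
    )
    print(breakdown)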

Thoughtful evaluation will ensure that the evidence collected through the Impact & Insight Toolkit is useful to all parts of your organisation as well as the sector as a whole.

 

3. Survey Design

There are three different types of survey you can create to collect feedback from your respondent groups:

  • A standard survey for members of the public to rate their experience of the event, based on your chosen set of dimensions and any custom questions.
  • A self prior survey that asks self assessors to describe their objectives for the event. Having a self prior survey is essential if you want to compare your objectives for the event with how it was received by audiences and peers (a simple comparison is sketched below).
  • A post survey that asks self assessors and peer reviewers to rate their experience of the event.

An advanced option is to create a prior survey for peers to gauge their expectations of the event, which can then be compared with their actual experience. Please contact our support team at support@countingwhatcounts.co.uk if you would like to experiment with this.
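
As a purely illustrative sketch, not Toolkit functionality, a prior-versus-post comparison might look like the following; the dimension names match the core set, but all scores are invented values on an assumed 0–1 scale.

    # Compare the self prior survey's expectations with average post-event
    # ratings from the public; all values here are invented examples.
    self_prior = {"Concept": 0.90, "Challenge": 0.80, "Captivation": 0.85}
    public_post = {"Concept": 0.82, "Challenge": 0.60, "Captivation": 0.88}

    for dimension, expected in self_prior.items():
        received = public_post[dimension]
        gap = received - expected
        print(f"{dimension:12} expected {expected:.2f}  received {received:.2f}  gap {gap:+.2f}")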

Below are some suggestions to help you maximise the efficacy of your surveys.

1. Keep surveys as short as possible

Audiences are asked to complete surveys all the time, so be mindful of the demands on their attention. We encourage organisations to keep surveys to a maximum of three minutes, after which there is a notable drop-off in response rates. State clearly that you’re only asking an audience member for a small window of their time.

Some of your activities, such as participatory events where you work with smaller groups over a longer period, may allow you to survey a group more frequently, or use longer question schedules. Please contact our support team at support@countingwhatcounts.co.uk for further advice and guidance.

2. Don’t ask questions for the sake of it

To keep surveys short, you need to be selective about the questions you ask. Sometimes questions are included simply because they’ve been asked in previous years, even though they’re no longer relevant. Continuity can be beneficial, but only if the data collected is actually being used to track changes over time. Remember to choose questions that will help you understand your progress in relation to your objectives and identify where improvements might be made. Think about what you’ll learn from the data and resist the urge to include questions that don’t generate actionable insights. We recommend that surveys contain a combination of core dimensions and custom questions – approximately 12 in total – to keep them short and interesting for respondents.

3. Present surveys as an opportunity for your key audiences to give feedback

Evaluation should be seen as a two-way street. Surveys generate valuable information that helps your organisation gain insight, but they also give audiences the opportunity to provide feedback that can improve future experiences. Many people appreciate the chance to have their say, and the data you generate is a powerful starting point for a richer conversation with your audiences and partners.

4. Collect anecdotes or free text answers to sit alongside the numbers

The inclusion of open text questions helps you to capture rich descriptions and interesting details about visitors’ experiences. Open text questions also give people the chance to communicate with you freely and discuss issues or outcomes that you may not have anticipated. Although an individual anecdote is not evidence of broad impact on its own, anecdotes can provide additional context and colour to sit alongside the numerical data.

5. Evaluate as quickly as possible

This may seem obvious, but evaluations are frequently carried out weeks or even months after an event or programme. Sometimes surveys are sent to all attendees at the end of a lengthy exhibition or performance season, which means that people who attended early may have forgotten the small but important details of their experience. Many ticketing systems and CRMs make it easy to schedule a mailout to correspond with the date of attendance, while intercept interviews or fixed tablet surveys can collect responses during or immediately after the event. This ensures that the experience is still fresh in people’s minds: respondents are more willing to complete the survey and can offer a more accurate picture of the impact it had on them.

 

4. Engaging Self Assessors and Peer Reviewers

A key element of the evaluation process is identifying and engaging peer reviewers and self assessors.

Peers

This is an opportunity to gather feedback from professionals in your field whose opinions you value and respect. Take time to reflect upon which individuals you feel would offer valuable insight as well as a fair, objective critique of your work. You should choose individuals who are not emotionally invested in the specific work you are evaluating, and you may want to look beyond the ‘usual suspects’ to access perspectives on your work that you might not normally hear. Finally, think about the bigger picture and the peer network that is created – can you enable more diverse voices to be heard through your selection of peers? Can you create valuable professional development opportunities for practitioners at different stages of their careers?

Don’t forget that you will need to contact your chosen peers, ask them if they are willing to take part and arrange a time for them to attend your event. Read more about peer review here. 

Self Assessors

Also consider who you would like to nominate to respond to the self-assessment survey. This can be just one person in your organisation or, better still, a range of key individuals involved in the event’s production. Feel free to think broadly about the people who have contributed to the making of your event: self assessors might include artists, curators, or marketing or education staff within your organisation.

 

5. Data Collection

Once you have decided on your metrics and custom questions, designed your surveys and identified self assessors and peer reviewers, you are ready to collect data.

You can distribute a survey to your self assessors and peer reviewers directly from the Culture Counts platform (this is explained in detail in the Platform Guide).

There are three methods for distributing surveys to audiences and visitors. You may choose to use one or a combination of methods, depending on the type of event being evaluated, the size and capacity of your organisation and what you think will be most appealing or appropriate for your audience. Culture Counts will create a different survey URL for each method that you choose.

1. Online

The Online URL can be shared over email, social media and other digital platforms. If your organisation has a box office or CRM and corresponds with audience members and visitors largely via email, this is a simple option for sharing your survey. This option allows you to disseminate your survey to a large number of people quickly. Online surveys are non-resettable, so they can only be taken once by respondents.

2. Interview

Intercept interviews involve staff and/or volunteers from your organisation surveying the public at your event with tablet computers. The Interview URL is resettable so that multiple responses can be recorded on a single device.

Whether the project you are evaluating is an exhibition or a performance, this option involves personal contact and dialogue with respondents, allowing you to collect more nuanced feedback and develop a rapport with your audience. It also allows organisations to spot-sample and target groups that might not otherwise complete a survey.

If you’d like to use this option, we can offer a training video and guide for interviewers on how to use the Culture Counts platform as well as various interviewing and sampling techniques.

3. Display

Your public surveys can also be displayed on a device at a set location, such as a fixed podium or a computer in a public area. This option allows you to target visitors onsite, even if you do not have staff available to conduct intercept interviews. Similar to the Interview option, the Display URL is resettable so that multiple responses can be recorded.

 

Sample Size

How many public survey responses do you need? There is no definitive answer to this question: larger samples are generally more powerful because they yield more accurate results, but data collection and analysis will be proportionately more time-consuming and expensive.

Sample size is the number of completed responses your public survey receives. It’s called a sample because it represents part of the total audience for your event (it would normally be impractical to survey everyone who attended).

Your aim is to collect just enough responses to feel confident that the views of your sample are representative of the views of your audience as a whole. Counting What Counts will be producing more detailed guidance on appropriate sample sizes for different types of event, including participatory events which typically involve smaller groups of people. In the meantime, as a rule of thumb we advise aiming for 70-100 public responses per event.
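
If you want a rough sense of what that rule of thumb means for your own event, the standard sample-size formula for a proportion, with a finite-population correction, gives a ballpark figure. The sketch below is an illustration under assumed values (95% confidence, ±10% margin of error), not official guidance.

    import math

    def required_sample_size(population, margin_of_error=0.10, z=1.96, p=0.5):
        """Sample size for estimating a proportion, with finite-population correction."""
        n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)  # infinite-population estimate
        return math.ceil(n0 / (1 + (n0 - 1) / population))    # adjust for the actual audience size

    # An event with 1,500 attendees needs roughly 91 responses for a +/-10%
    # margin of error at 95% confidence - broadly in line with the 70-100
    # rule of thumb above.
    print(required_sample_size(1500))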

More information about how to carry out reliable surveys is available from The Audience Agency: https://theaudienceagency.org/insight/guide/representative-and-reliable-surveys and https://theaudienceagency.org/insight/good-practice-guide-to-sampling

 

Please contact support@countingwhatcounts.co.uk if you would like to discuss sampling in more detail.

Image Credit: Craig Whitehead, “Tate” via Unsplash

The information on this page was last updated on 28 March, 2019.

 
