Creating An Evaluation

1. What is an evaluation?

Evaluations are folders that contain all the information for a specific event, project, or piece of work whose impact you want to learn more about. The work you evaluate can vary from an online exhibition to an in-person stage performance to an outdoor workshop; it is completely up to you!

Please note that there is no limit on the number of evaluations you can do.

Depending on your funding status with Arts Council England, there will be different requirements for your evaluations. For more detail about these requirements, please see the mandatory guidance.

An evaluation will generally contain multiple surveys, each with a different purpose. There may be a survey specific to:

  • Self assessors – the team/person creating, curating, or developing the work you are evaluating.
  • Peer reviewers – someone whose professional opinion you value; for example, this can be an academic from the area of work you are evaluating or an associate working in a similar discipline. The main thing to remember is that this person needs to be distanced from the work i.e., not involved in creating, producing, or developing the work.
  • Public – the audience or members of the public who will experience the work out of their own interest.

The surveys may also have different ‘types’:

  • Prior survey – a survey taken before the work has started, to record your expectations before the event commences.
  • Standard survey – a survey dedicated to capturing one-off reactions; generally used when surveying the public.
  • Post survey – for self and peer respondents to record their responses, which can then be compared with the prior survey.

The exact configuration will depend on your funding status. However, the evaluation is likely to present a combination of the above.

2. How to create an evaluation

You can also watch the free video tutorials that cover the following steps here.

2.1. Having received your login details from the Counting What Counts team, you will be able to access the Impact & Insight Toolkit at this address:

2.2. Once here, click ‘Sign In’ to access the Culture Counts platform. Once you have signed in you will see the following:

2.3. On the left hand side, you will see the Evaluations panel which will display any previous Toolkit evaluations.

2.4. To create a new evaluation, click on ‘New Evaluation’ (the orange button in the Evaluations panel).

2.5. In the ‘Create an Evaluation’ pop-up, enter the name of your evaluation in the ‘Evaluation name’ field (we have called ours ‘ACE Test’ in the example below but normally it would be the name of the work you are evaluating).


3. Create from Template, Evaluation or Blank

In the ‘Create from’ field there are three options for creating your evaluation.

  • Template
  • Evaluation
  • Blank

Template: If you are new to the Toolkit or are creating an evaluation to meet your Arts Council funding requirements, we strongly advise you to use the ‘Template’ option. When Template is selected, you will see a dropdown list of evaluation templates, i.e., evaluations that have already been created for you. You can of course add more questions to these templates at the Design stage. Using the Template option is a quick way to set up evaluations that meet your specific funding requirements.

Evaluation: This option lets you create an evaluation from a previous evaluation you have made, allowing you to essentially create a duplicate of a pre-existing evaluation – just remember to give it a different name!

Blank: Creating from blank gives you the freedom to create your own evaluation from scratch.


3.1. Creating From Template (recommended)

3.1.1 Select The Template

When ‘Template’ is selected, you will see a dropdown list of evaluation templates i.e., evaluations that have already been created for you. You should select the template that is most relevant to your cultural offering. When you click on a template a short description of the template will appear underneath.


3.1.2 Preview the Template (optional)

Click on ‘Preview’ to view the template. This is useful if you’re unsure what the evaluation will look like before creating it. A green pop-up will appear at the top, highlighting that you are in preview mode: ‘This evaluation is a preview only and cannot be edited’.


3.1.3. Dimension set-up: Core or Flexible?

Depending on the template you select, you may be prompted to choose an evaluation type, i.e., ‘Core’ or ‘Flexible’. If you are using a template but do not wish to create a Core or Flexible evaluation, please click ‘No, thank you’.

If you are an NPO, you will be expected to complete a specific number of Core and/or Flexible evaluations each financial year. If you are unsure of the type and the number you need to complete, please read the mandatory requirements for your specific funding status. For more information about Core and Flexible evaluations, please click here.

Please note: the dimensions that you select at this stage ensure that your evaluation meets either the Core or Flexible requirements. You can add more dimensions from other dimension categories (e.g., Placemaking dimensions) in the ‘Design’ section.


3.1.4. Core

If you select ‘Core’, you will then be prompted to click on ‘Confirm dimensions’.

  • A pop-up will appear displaying the ACE Core dimensions, i.e., 6 dimensions to be completed by self, peers and public (Captivation, Concept, Rigour, Relevance, Distinctiveness and Challenge) and an additional 3 dimensions to be completed by self and peers only (Excellence, Originality and Risk).
  • Scroll to the bottom of the pop-up box and click ‘Create Evaluation’.
  • The system will now insert an evaluation into your account, populated with surveys that adhere to the structure and questions chosen.


3.1.5. Flexible

If you select ‘Flexible’, you will then be prompted to Select dimensions.

  • The ‘Select dimensions’ pop-up will display the ‘Cultural Experience’ and ‘ACE Participatory’ dimensions. You can view these categories separately or together by using the fields along the top of the pop-up in ‘From categories:’.
  • You will notice that the system may have randomly highlighted 4 dimensions in green, as 4 is the minimum requirement for a flexible evaluation. You can deselect these by clicking on them, then select the 4 or more dimensions you wish to include in your evaluation.
  • Once you have selected the dimensions scroll to the bottom of the pop-up box and click ‘Create Evaluation’.
  • The system will now insert an evaluation into your account, populated with surveys that adhere to the structure and questions chosen.


3.1.6. Generating surveys

Once you have clicked on ‘Create Evaluation’, the system will generate and display the surveys within that evaluation. These are the surveys for the different respondents i.e. Public, Peer, Self.


3.2. Creating from Evaluation

If you opt to click on ‘Evaluation’ in the ‘Create an Evaluation’ box, you can create an evaluation from a previous evaluation you made, allowing you to essentially create a duplicate of a pre-existing evaluation, just with a different name.

  • A dropdown box will appear which will list the evaluations that you have previously created. In this example, we have ‘Art and Craft Festival (Flexible)’ and ‘Gracie’s Evaluation’ as evaluations that have been created previously on this account.
  • Click on the evaluation you wish to reuse.
  • Click on ‘Preview’ to view the evaluation before creating it. A green pop-up will appear at the top, highlighting that you are in preview mode: ‘This evaluation is a preview only and cannot be edited’.
  • Click on ‘Create Evaluation’.
  • The system will now insert an evaluation into your account, populated with surveys that adhere to the same structure and questions as the evaluation you chose to duplicate.


3.3. Creating from Blank

Creating from blank gives you the freedom to create your own evaluation from scratch. After giving your evaluation a name and selecting ‘Blank’ you will be able to create your own surveys by selecting ‘Create Survey’ on the top right-hand corner:


4. Adding properties

When you set up an evaluation, we strongly recommend you also add metadata tags; these are descriptors of your event which will enable you to make more detailed comparisons between your evaluations over time, or against aggregate data sets. If you intend on submitting this evaluation to Arts Council, these details will also give your Relationship Manager important context when reviewing your evaluation report.

4.1. Click on ‘Properties’ on the front page of your evaluation. Here you can add details such as artform(s), event location and attendance numbers.

4.2. Once you select the tick box ‘Event’, you will be asked to choose the type of event you are offering. Then click ‘Save’ (you should click ‘Save’ on all the answers you provide).

4.3. Add a location and whether your work is touring.

4.4. Add the overall attendance (this can be a rough number of people that will experience the work).

4.5. Lastly, add the start and end date of your work (optional).

5. Configuring your survey

You must now configure your surveys. This is where you can set up the following options for your survey:

  • Name (as seen by your respondents)
  • Start and Close Dates (optional)
  • Survey introduction
  • Custom branding (optional)
  • Survey type (optional)
  • Delivery types for a public survey e.g. smart phone, tablet, interviewer

Click on Configure which you will find in the navigation panel along the top.


5.1. Name

Give the survey a name that you are happy for the respondents to see when they receive the survey. In this case it has been named ‘Public Survey – We want your feedback!’ but we would recommend that you customise the survey name to make it unique to the work you’re evaluating.


5.2. Start and close dates (optional)

The start and close dates will determine when the survey begins and stops taking entries. We advise that you enter the dates nearer to the time of delivery. It is fine to leave the start and close dates blank. If you leave the start date blank, the survey will be available to record responses immediately and, if you leave the close date blank, the survey will remain active indefinitely. When inputting the start and close dates, we advise that you include an additional one or two days either side of the official start and close dates to ensure that you capture all available survey responses.
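The buffer described above can be sketched with a short example (the event dates are hypothetical, used only to illustrate the arithmetic):

```python
from datetime import date, timedelta

# Hypothetical official event dates
event_start = date(2024, 6, 10)
event_end = date(2024, 6, 14)

# Add a one-day buffer either side, as the guidance above recommends,
# so that no early or late responses are missed.
buffer = timedelta(days=1)
survey_start = event_start - buffer
survey_close = event_end + buffer

print(survey_start, survey_close)  # 2024-06-09 2024-06-15
```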


5.3. Survey introduction

Here you can write a short optional introduction to your survey that will appear on the front page of your survey. This introductory front page will only appear on the Online and Display survey delivery methods, but not for the Interviewer method. This is because interviewers delivering the survey by using the Toolkit should be able to provide this information verbally to respondents. The introduction is a good place to provide a brief explanation to respondents, such as:

‘We’d love to hear what you thought. Please share your views on this event via our short survey. Your feedback will help us to understand and measure the impact of our events.’


5.4. Custom branding (optional)

Here you can upload your own logo, which will appear on the front page with the survey introduction text.


5.5. Survey type (optional)

You do not need to take any action here if you have chosen the Template or Evaluation option. However, if you selected ‘Blank’ you will need to select the right option for your type of survey.

There are three survey types: Standard, Prior and Post.

  • Standard: The standard survey is for a one-off survey, generally to be completed by public audiences or visitors. Audiences are typically asked to complete a standard survey after experiencing an event or visiting a place, to measure the perceived impact of their experience.
  • Prior: Prior surveys allow you to record expectations of an event, to later compare with how it was actually perceived. This is essentially the survey you take before the event has occurred. We generally recommend that prior surveys are completed by self assessors to record their objectives and creative intentions for the work.
  • Post: Post surveys are used to aid comparison between expectations and experience. Typically, self and peer reviewers complete the post survey. Post work data can then be compared with the public data from the standard survey.

When creating prior and post surveys for the same respondent group within an evaluation from blank, you should ‘Survey Link’ them. Please see below:

This survey is a prior survey, so by selecting the ‘Post-Event Survey for Self and Peer Respondents’ from the dropdown list, the two surveys will be synced so that details from the prior survey are copied over to the post survey. To complete your post survey, click on the ‘Design’ tab and review the question schedule.

You only need to link surveys if you have chosen to create an evaluation from blank.


5.6. Delivery types for a public survey, e.g. online (smartphone/tablet), facilitated by an interviewer, etc.

Please note that all of the methods require a stable internet connection or mobile data in order to collect survey responses.

This section allows you to choose how the survey will be delivered to the public respondents. If the survey you are working on is for self and/or peer respondents, you do not need to select a delivery type. Culture Counts supports the following delivery types:

  • Online survey: to be taken by respondents online via email, or on their own smartphones or devices. An online survey will only accept one response per device.
  • Interviewer: to be facilitated by an interviewer with a tablet computer. Interview surveys are resettable so that multiple responses can be recorded on one device.
  • Display: to be displayed on a device at a set location, such as a fixed podium or a library computer. Display surveys are resettable so that multiple responses can be recorded on one device.

Select all the methods that you wish to use to deliver your survey; you can choose as many as you wish. A unique survey link will be created for each method and will be displayed clearly on the Summary page at the end of the survey builder. For more information on the various delivery methods, please see the Getting Started With Thoughtful Evaluation guide.

You will be presented with advanced options at the bottom of the Configure page, related to the delivery type(s) you have chosen, including:

  • A finish URL, which will redirect respondents to a specific website (e.g., your organisation’s homepage) on completion of the survey.
  • A timeout, which will reset the survey if the respondent is inactive for a period of time.
  • Access to survey tools, which will enable interviewers to access tools to support the interview process.



6. Designing/adding survey content

Designing a survey is the process of adding questions and content to your survey. On the Design page you can add, modify or remove questions.

Depending on whether you selected Template or Evaluation (and Core or Flexible) at the start of the process (see step 3 above), you might find some dimensions are already populated on your Design screen. You can add or remove dimension questions, but if you are planning to use your evaluation to contribute to meeting funding requirements, you should ensure you don’t remove anything required.

In order to add content, simply click the type of question or content and it will appear in the survey section. Your questions should be chosen based on the objectives of your evaluation (the outcomes you want to measure that reflect your creative intentions for the work). If you want to learn more about best practice for making surveys, visit the Getting Started With Thoughtful Evaluation guide for more information. The types of survey content available in Culture Counts are outlined below:


6.1. Dimension

A dimension is a standardised statement that respondents can agree or disagree with, rated using a slider. The slider represents a one-hundred-and-one-point scale on a continuum, where the far left is 0, the middle is 0.5 and the far right is 1. This means that every position on the slider has its own value.
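As a sketch of how a slider position maps to a value (assuming 101 discrete positions from 0 to 100, consistent with the scale described in the Slider section of this guide):

```python
def slider_value(position: int, steps: int = 100) -> float:
    """Convert a slider position (0..steps) to a score between 0 and 1."""
    if not 0 <= position <= steps:
        raise ValueError("position out of range")
    return position / steps

print(slider_value(0))    # 0.0 (far left)
print(slider_value(50))   # 0.5 (middle)
print(slider_value(100))  # 1.0 (far right)
```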

Culture Counts’ dimensions are based on extensive collaboration with the arts and cultural sector in order to identify the key elements that make up the quality of a cultural event or project. These elements have been translated into single statements to be included in Culture Counts surveys.


6.1.1. Select ‘Dimension’ from toolbar (left hand side)

6.1.2. Click in the Dimension Type box and you will be presented with a pop-up box called ‘What do you want to measure?’ which contains a dropdown menu, ‘Dimension Category’. You can find a full list of the dimension categories and the dimensions available in the Culture Counts platform here.

6.1.3. To add a dimension question, simply click the category you wish to use, select the dimension you want to add, then click ‘Use selected dimension’. After the dimension has been added to the survey, you can make appropriate amendments.

6.1.4. After the dimension has been added to the survey, click the dropdown next to ‘Question text’ and select variants of that dimension statement that best suit your survey e.g. this is where you can change the tense of the statement as required.

For more support on using Culture Counts dimensions in your survey watch the support film here.

6.2. Slider

You can use a slider if you have a statement that you would like to measure reactions to. Responses are recorded along a slider that ranges between three values. By default, these are “Strongly disagree”, “Neutral”, and “Strongly agree”, but can be customised when the question is added. Similarly to the Dimension question format, the slider represents a one-hundred-and-one-point scale on a continuum, where the far left is 0, the centre point is 0.5 and the far right is 1. This means that every position on the slider has its own value. 

6.3 Demographic

This is where the three inbuilt demographic questions are stored: age, gender and postcode. By default, these three questions will be added to any survey you create. 

6.4. Experience

There are two inbuilt experience questions: Net Promoter Score (NPS) and Overall Experience. NPS is a globally recognised metric for measuring the likelihood of customers recommending an event, organisation or product. Overall Experience is a popular question which can aid your understanding of an audience’s overarching satisfaction level.
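NPS itself follows a standard formula: respondents rate their likelihood to recommend on a 0–10 scale, those scoring 9–10 are counted as promoters and 0–6 as detractors, and the score is the percentage of promoters minus the percentage of detractors. A minimal sketch (the sample ratings are invented):

```python
def net_promoter_score(ratings):
    """Compute NPS from a list of 0-10 'likelihood to recommend' ratings."""
    promoters = sum(1 for r in ratings if r >= 9)   # scores of 9 or 10
    detractors = sum(1 for r in ratings if r <= 6)  # scores of 0 to 6
    return 100 * (promoters - detractors) / len(ratings)

# Hypothetical sample of ten responses: 5 promoters, 2 detractors
print(net_promoter_score([10, 9, 9, 8, 7, 7, 6, 5, 9, 10]))  # 30.0
```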

6.5. Date

Provides a date input where respondents can select a date from a calendar. 

6.6. Yes/No

Gives the respondent two options to choose from, which default to “Yes” and “No”. 

6.7. Number

A basic number input, where numbers can be input by either typing, or changed using up and down arrows; minimum and maximum values can be specified. 

6.8. Dropdown

Provides a dropdown list, where respondents can answer by choosing one of the options. 

6.9. Multiple Choice

Provides a selection of answers, where respondents can answer by selecting one or more of the options. 

6.10. Short Text

A short text input is designed for smaller text responses, such as one-word answers. For this question format, the Culture Counts platform will automatically graph the most commonly used words. 

6.11. Free Text

A simple text input that allows respondents to type an answer or comment with no word limit. These responses are available via the CSV file download. 

6.12. Message

This displays a message within the survey but does not require respondents to provide an answer. It is typically used to provide respondents with additional instructions or context. 

6.13. Email

This option provides a section to ask for respondents’ email contact information. It is a simple text input designed to recognise whether the text entered is an email address. When the responses are downloaded, the respondent’s email address will automatically be separated from the rest of their data, so that their responses to other questions remain anonymous. We therefore recommend that you use the email question only once per survey, and for a single purpose. Organisations using this option are ultimately responsible for complying with all of the attendant GDPR obligations around the collection and use of personal data.

The email addresses gathered via the survey can be downloaded from the survey’s Summary page.  The email addresses will be presented in a random order, ensuring that connections cannot be drawn between the email addresses and other survey responses. 
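The separation described above can be illustrated with a short sketch. This is only a model of the principle, not the platform’s implementation, and the column names and data are hypothetical:

```python
import random

# Hypothetical rows as they might appear in a survey export
rows = [
    {"email": "a@example.org", "rating": "0.8"},
    {"email": "b@example.org", "rating": "0.4"},
    {"email": "c@example.org", "rating": "0.9"},
]

# Split the email addresses off from the response data...
emails = [row.pop("email") for row in rows]
# ...and shuffle them so they can no longer be matched to a row.
random.shuffle(emails)

print(emails)  # email list, in random order
print(rows)    # anonymised responses, with no email column
```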

6.14. Editing or removing content

In order to edit content, click on the question text you wish to edit and type your desired text. Some content types, like dimensions, cannot be edited as they are standardised. To remove a question, click on the rubbish bin icon in the top right of the question box. The icon with the four squares allows you to change the ordering of questions within your survey by dragging to the correct position.

Please note that standard demographic questions – age, gender and postcode – are included automatically at the end of all Culture Counts surveys. You do not need to add these questions to the survey yourself but are able to delete them if you see fit. 

6.15. Survey logic

Survey Logic allows you to hide questions based on their relevance to particular respondents. Asking targeted questions helps keep surveys short and achieves increased response rates.

Our Introduction to Survey Logic video provides users with a step-by-step guide to adding logic to survey questions in the Culture Counts dashboard, and provides insight into when this feature may be useful: 


7. Inviting peer or self assessor to the survey

One of the core functions of Culture Counts is the ability to invite and manage peer (7.1) or self assessors (7.6). This can be done easily through our Invite page.


7.1. Inviting peer reviewers

To find a suitable peer to review an event, the first place we would encourage you to look is within your own professional networks. Your networks may contain people from local arts organisations, touring companies, independent artists, schools and academic institutions, amongst others. By engaging with your current contacts, you are acknowledging the value you place on their professional opinions of your work.

You can also look within the Culture Counts platform, where the Peer Matching Resource allows you to search for peers. You may browse and filter a list of creative professionals who have registered their interest in participating as a peer reviewer. They may be working with an Arts Council funded organisation that is registered to use the Impact & Insight Toolkit, or they may have signed up as an external individual interested in being part of the project.

When selecting your peer reviewers, we would recommend that you engage with peers from both your own network and the new resource available to you, in order for you to achieve a well-rounded range of professional perspectives on your work.

Below we have provided guidance on finding, inviting, and managing peer reviewers.


7.2. Finding peers in the Peer Matching Resource

The Peer Matching Resource provides Toolkit users with a list of creative professionals who have registered their interest in completing peer reviews for users of the Impact & Insight Toolkit. Upon registering, all peer reviewers will create a personal profile which includes information about their job role and the company for which they work, as well as their background and expertise, and the regions to which they are willing to travel.

7.2.1. To access the Peer Matching Resource and the list of peer reviewers, you will first need to have a Culture Counts account, and second, to have set up an evaluation which includes a peer and self post-event survey. This is the survey that you will send to the peer reviewer(s) once they’ve attended your event/exhibition.


7.2.2. Click on this survey; then click on the Invite tab at the top of the page.

7.2.3. If you scroll down, you will notice a database of peers, which you can browse by page or by using the filters on the right-hand side.

Within each entry, the peer’s job title appears in bold, followed by the name of the organisation where they work. You may click on Biography to view the individual’s full profile card, showing their name, specialisms, interests and experience. Names are omitted from the initial list view to encourage you to browse peer reviewers based on their job role and artform, rather than automatically go for people whose names you already know. That said, if you are looking for someone specific, you are able to search their name in the search bar and their profile will appear, if they are in the database.

The search functionalities within your dashboard allow you to browse all registered peer reviewers, filtering them by artform and location. It is worth noting that location refers to the regions to which they have said they are willing to travel; therefore, it is helpful to select the location of your event in the search.

Tip: If you decide to choose multiple peer reviewers it may be a good idea to think about choosing a group with a variety of expertise, experience and specialisms. Having this variety of peers reviewing your event will offer you a well-rounded view of your event and could provide you with further insight. It is also worth considering peers’ locations when choosing; if they are far away, you will need to consider the cost of reimbursing them for their travel costs and time.

7.2.4. Select the peers you would like to invite by clicking on the checkbox to the left of each peer, then click on ‘Invite’ at the top left of the page.

7.2.5. Now, you can personalise your invitation by adding your own subject and message. You should provide details about the event, including dates and times when they are welcome to attend. Every box must be filled out and you are welcome to invite more than one peer to review each event.


7.2.6. Dates and times: The dates you enter must be in the future, even if, say, the exhibition you are inviting the peer to review has already begun. You have the option to add multiple ‘events’ to the invitation; therefore, you could offer multiple date options, or you could simply explain the breadth of availability in your message. The times you enter could either be your venue’s opening times or the times of a particular performance.

If the work you’re evaluating is taking place over multiple dates (an exhibition or a performance run, for example), we suggest that you enter the final date of the run in the ‘start date’ box. This is because the system won’t allow a peer to ‘accept’ an invitation after the ‘start date’. You can include the actual dates of the run in the invitation text. Please note that an alteration to this design is being considered for a future version of the Peer Matching Resource.

7.2.7. Location: This is the location where your event takes place.

7.2.8. Organisation details: Include your contact details so that the peer is able to get in touch if they need to.

7.2.9. Once you have finished the invitation, you can preview it on the right of your screen. When you are happy with it, click on the Send button in the top left corner. Peer reviewers will receive this invitation by email and will either accept or decline the request by clicking on the links in the invitation.


7.3. How can I see the status of my invitations?

At the bottom of the search column you will see ‘Status’ and a dropdown menu underneath with the options: Invited, Accepted, Declined, Completed, and Cancelled.

Whichever you select, the peers within that category will appear:

  • Invited includes those peers that were sent the invitation but have not replied.
  • Accepted includes those peers that have accepted the invitation.
  • Declined includes those peers that have declined the invitation.
  • Completed includes those peers that have completed the survey.
  • Cancelled includes those peers that had originally accepted, but have since needed to retract their acceptance.

You will notice that there are three circles in the Status column – the first indicates that the peer invitation has been sent; the second, accepted; the third, completed. Once a stage has been completed, the appropriate circle will be filled in. If a peer declines your invitation, the circles will turn red and their contact details will not be visible. Conversely, once someone accepts your invitation, their email address and phone number will appear in the box to the right, indicated by the telephone and envelope icons. You may want to get in touch with them to coordinate and finalise the details of their visit.

As soon as they accept, you will be able to find their survey link on the Summary page of that survey. Send this to them either through the platform, or by copying and pasting their URL into an email directly to them. The survey can be sent at any time, but we suggest you do this shortly after their visit, whilst the experience is fresh in their mind.

7.4. Inviting peers outside the Peer Matching Resource

If you would like to invite a peer reviewer from outside of the database, scroll down the Invite tab where there is a box for you to enter these peers’ email addresses directly. If you choose to do this, you are not registering these individuals to the Peer Matching Resource. You are simply using them as a peer for this one particular evaluation and their details will not be shared with other users. If you and/or the peer would like to register to the database, please read the information above.


7.5 Register a Peer Reviewer

The Invite page is also where you can register to join the Peer Matching Resource.  Click on ‘Register a Peer’ (highlighted in blue) and add your details. For more information on who should be a peer reviewer, please see our guidance.


7.6 Inviting self assessors

Similarly to the above, when you need to register self assessors for a survey, simply type the email address of each self assessor into the ‘Invite Self Assessors’ field, which can be found under the ‘Want to invite someone else from your professional network?’ field, and press enter.


8. Completing and distributing surveys

Once you have been through the Configure, Design and Invite stages, you have created your survey and will be taken to the survey Summary page.

Here you will be shown an overview of the survey, including the number of questions you have included, a list of the self and peer reviewers you have added, and a summary of the survey results so far. You can return to this page at any time to see how your survey is progressing and to learn which of your nominated respondents have completed their surveys.


8.1. Distributing to the public

The Summary page will contain the links (URLs) for your survey.

There will be a separate URL for each of the different delivery types for the public survey (Online, Interview & Display) that you selected at configuration stage. It is important to use the correct URL for each method. To distribute a public survey, simply copy these links and use them in the relevant locations. For example, for a survey to be administered by interviews, enter the Interview link into the browser of your iPads or tablet computers. For a survey to be distributed via email, copy the Online link and paste it into the email you are sending to audience members/visitors.

Before distributing the survey, we recommend you click on ‘Preview’ to the right of the survey link (the eye icon) so that you can make sure the survey looks just as you intended it to. The preview link will open the survey in a new tab for reviewing, but will not record any data.

Please remember that using the Online, Interview or Display link will always record the response as public, regardless of which survey it is associated with.


8.2. Distributing to self and peer respondents

The easiest way to distribute a survey to self and peer respondents is to send email invitations to your nominated assessors via the Culture Counts platform. If you scroll to the bottom of the survey Summary page, you will see that the system automatically generates a unique link to the survey for each of your nominated self and peer reviewers. Each self and peer reviewer must be sent their own unique link, and it’s easy to do this via the platform.

On the Summary page, click the ‘Send invitation’ button next to your first self or peer assessor. A dialogue box will open to allow you to draft your invitation email. The survey’s URL for that particular respondent will automatically be included in the email.

If you are asking self and/or peer reviewers to complete both a prior and post survey, you need to invite each respondent to complete the prior survey first. Within each prior survey, Culture Counts automatically asks respondents when they are planning to attend the event. This date will show up in the survey Summary page, along with notification that the peer or self assessor has completed their survey. You will know to send self and peer reviewers their links to the post survey once they have completed the prior survey and experienced the work.


9. Seeing the results

Once you’ve completed the evaluation and collected the public, peer and self responses, you can view the results of each survey on your Reporting dashboard. Access it by clicking ‘Manage’ on your survey’s Summary page and selecting ‘Go to report’ from the dropdown menu.

Here you will be able to view graphs, updated in real time, which can be viewed online or downloaded in PDF format or as a ZIP file.

You have the option to download the Culture Counts generated charts, either as a .zip file, or as a complete .pdf file. If saved as a .zip file, you can extract individual charts to use as you like, for example in your own presentation documents.

You can also download the raw data in CSV format. This can be opened in Microsoft Excel or Numbers, enabling you to conduct further analysis.

To download any of these, simply click ‘Export Evaluation’ on the top right-hand side of the main Reporting page.
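As a sketch of the kind of further analysis the raw CSV export enables, the snippet below computes the mean score per dimension for each respondent type. The column names (`respondent_type`, `captivation`, `relevance`) are illustrative assumptions; your exported file’s columns will reflect your own survey’s dimensions.

```python
import csv
import io
from collections import defaultdict

# Hypothetical extract of an exported evaluation CSV; the real column
# names depend on your survey's configuration and may differ.
SAMPLE_CSV = """respondent_type,captivation,relevance
public,82,74
public,90,68
peer,71,80
self,65,77
"""

def mean_scores_by_type(csv_text, dimensions):
    """Average each dimension's 0-100 score per respondent type."""
    totals = defaultdict(lambda: defaultdict(float))
    counts = defaultdict(int)
    for row in csv.DictReader(io.StringIO(csv_text)):
        rtype = row["respondent_type"]
        counts[rtype] += 1
        for dim in dimensions:
            totals[rtype][dim] += float(row[dim])
    return {
        rtype: {dim: totals[rtype][dim] / counts[rtype] for dim in dimensions}
        for rtype in counts
    }

averages = mean_scores_by_type(SAMPLE_CSV, ["captivation", "relevance"])
print(averages["public"]["captivation"])  # 86.0
```

The same grouping-and-averaging step can of course be done with a pivot table in Excel; the code simply makes the calculation behind the dashboard’s bar charts explicit.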


9.1. Reporting dashboard functions


9.1.1. Evaluation summary

On this page, you can view the total number of responses, the total number of dimensions tracked, and the total number of surveys contained in the evaluation. The page also provides a summary of mean average scores for each dimension from public respondents. The summary of dimensions is depicted as a bar graph, scored on a scale from 0 to 100: an average score of 0 indicates that respondents strongly disagreed, while a score of 100 indicates that respondents strongly agreed.


9.1.2. Demographics

The demographics page displays, as a pie chart, the proportion of public respondents who identify as male, female or in another way. This allows you to see whether your survey respondents were predominantly male, female or of another gender. The demographics section also depicts the age breakdown of survey respondents as a bar graph.


9.1.3. Respondent comparison

In this section you can observe the total number of survey responses, segmented into three categories: number of public responses, number of peer responses, and number of self-assessor responses. Average scores for each respondent type for each dimension are then presented in the form of a bar graph on a scale from 0-100.

This chart allows you to quickly compare the scores given by the public, peer and self respondent groups after experiencing the work.


9.1.4. Experiences and expectations

In this section, you can see and compare the average prior and post dimension scores recorded for self and peer reviewers. The results are shown in a bar graph and are measured on a scale from 0-100, where 0 indicates strong disagreement and 100 indicates strong agreement. Comparing before and after scores demonstrates where the work met, fell below or exceeded expectations. This chart is particularly useful for organisations to identify the areas in which they achieved their creative intentions or objectives. If prior and post surveys have not been carried out as part of the evaluation, no data will be available in this section.
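The prior/post comparison in this chart reduces to a post-minus-prior difference for each dimension’s mean score. The sketch below illustrates that arithmetic with hypothetical dimension names and averaged 0-100 scores; it is not part of the platform.

```python
def expectation_deltas(prior, post):
    """Post-minus-prior mean score per dimension: positive values show
    where the work exceeded expectations, negative where it fell short."""
    return {dim: post[dim] - prior[dim] for dim in prior}

# Hypothetical dimension names and averaged 0-100 scores.
deltas = expectation_deltas(
    {"captivation": 70.0, "relevance": 80.0},
    {"captivation": 78.0, "relevance": 76.0},
)
print(deltas)  # {'captivation': 8.0, 'relevance': -4.0}
```

In this illustration, the work exceeded expectations on captivation but fell slightly short on relevance.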


9.1.5. Intelligence questions

Intelligence Questions are specific questions that have been standardised to allow for easy benchmarking and comparison. Intelligence questions often reflect common funder requirements and globally recognised measures, such as the Net Promoter Score. Utilising the pre-set Culture Counts intelligence questions can help to provide insights into marketing or outcomes specific to your organisation and will contribute to the wider database of sector insights. You can insert these questions on the Design page of any survey.
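For illustration, the Net Promoter Score mentioned above is conventionally derived from a 0-10 ‘How likely are you to recommend us?’ question: the percentage of promoters (scores 9-10) minus the percentage of detractors (scores 0-6). The sketch below shows that standard calculation; the Toolkit’s own implementation may differ in detail.

```python
def net_promoter_score(ratings):
    """Net Promoter Score from 0-10 recommendation ratings:
    % promoters (9-10) minus % detractors (0-6); passives (7-8) are ignored."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Ten hypothetical ratings: 5 promoters, 2 passives, 3 detractors.
print(net_promoter_score([10, 9, 9, 8, 7, 5, 4, 0, 10, 9]))  # 20.0
```

The result ranges from -100 (all detractors) to +100 (all promoters).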


9.1.6. Response origin

The sixth section of Reporting is titled ‘Response Origin’. Here you can see how the data for your survey was collected. Data can be collected from a combination of online responses, interviewer responses, display responses, peer reviewer responses and self assessor responses. The breakdown of responses is depicted in a pie chart.


9.1.7. Custom questions

When setting up a survey, you are given the option of including custom questions. This comprises any question type outside the standard dimension sets, inbuilt demographics or intelligence questions. This section of reporting presents the results from these questions in bar and pie chart format. Audience comments and other longer custom question responses can be accessed by exporting the raw CSV file. This can be done at any time from the reporting section by clicking Export Evaluation in the top right corner of the screen.


10. Additional functionalities

10.1. How to share evaluations

The Culture Counts platform enables organisations to share their evaluations with others to help the sector benchmark and learn. Many of our members choose to share their evaluations with a member of Counting What Counts staff, asking us to check it over before its public distribution. Sharing evaluations is also useful when undertaking a collaborative event with another member organisation.

To share an evaluation, simply click on the evaluation name in your dashboard evaluation list. Click the ‘Edit’ button at the top right of your screen. Click ‘Sharing Options’ from the pulldown list. In the box that says ‘Share with another’, type in the email address of the person you would like to share with. They must also have a Culture Counts account so they can view your evaluation within their dashboard.

To the right of the email address is a dropdown list of sharing options. Choose ‘Viewer’ if you would like them to be able to look at your evaluation and results but not edit it in any way. If you would like them to be able to do more than just view your evaluation, choose ‘Admin’ from the pulldown list. This will give them the same access to the evaluation as you. This option is best for collaborative events, where both organisations want to share the creation and editing of their event surveys; both organisations would also have equal access to results.

If you would like to unshare with someone, simply open the Sharing options box again via the ‘Edit’ dropdown and click the ‘x’ next to their organisation name. This is where you can also view each organisation’s permission level.


10.2. How to archive an evaluation or survey

If you find yourself wanting to remove an evaluation or survey from your dashboard, you can do so by using the archive function. Archiving a survey or evaluation will remove it from your dashboard. It is similar to deleting something, except that it will not delete any associated data.

10.2.1. Archiving an evaluation

Data archiving is the process of moving a collection of data to a backup repository, separating current data from data that is no longer actively used. Archived files can be retrieved, and this process is often used for long-term data storage.

You can archive an evaluation by clicking into the evaluation you wish to archive and then clicking ‘Edit’ on the top right corner. From the dropdown you can then select ‘Archive’.

You will then be prompted to confirm or cancel this process. Click ‘Yes, archive this evaluation’ if you wish to continue. This will archive the evaluation, including all surveys within it. If you are a Creator or Admin of an evaluation, you also have the option to archive the evaluation for everyone it is shared with. This is useful when a project is no longer relevant, or you simply want to remove it from their dashboards. This will not remove their access to the data, however, and they can restore it themselves if they want to continue using it.

We wouldn’t recommend archiving evaluations in the opening years of the Impact & Insight Toolkit programme. We will be encouraging you to analyse and compare your data, so in the early years of the project there is value in keeping your evaluations in your Dashboard. However, if you carry out a lot of evaluations using the platform, you may of course wish to archive your pilot or test events.


10.2.2. Archiving a survey

This is a similar process to archiving an evaluation. If you wish to archive a specific survey within an evaluation, you can do so by clicking ‘Edit’ next to the survey and selecting ‘Archive’ from the dropdown list. Again, you will be prompted to confirm or cancel this process. Click ‘Yes, archive this survey’ if you wish to continue.


10.2.3. Recovering an archived evaluation or survey

Evaluations and surveys that are archived are not permanently deleted and can be recovered if required.

To recover an archived survey or evaluation you will need to contact our support team. Be sure to mention your organisation name and the evaluation or survey name that you would like recovered in your support request.
