Evaluations are folders that contain all the information for a specific event, project or piece of work whose impact you want to learn more about. The work you evaluate can vary from an online exhibition to an in-person stage performance to an outdoor workshop; it is completely up to you!
Please note that there is no limit on the number of evaluations you can do.
Depending on your funding status with the Arts Council, there will be different requirements for your evaluations. To see more about the requirements, please see the mandatory guidance.
An evaluation will generally contain multiple surveys, each with a different purpose. There may be a survey specific to:
The surveys may also have different ‘types’:
The exact configuration will depend on your funding status. However, the evaluation is likely to present a combination of the above.
Having received your login details from the Counting What Counts team, you will be able to access the Impact & Insight Toolkit at this address: https://impactandinsight.co.uk
Once here, click Sign In to access the Culture Counts platform.
Once you have signed in you will see the following:
On the left-hand side, you will see the Evaluations panel, which will display any previous Toolkit evaluations.
To create a new evaluation, click New Evaluation (the orange button in the Evaluations panel).
In the ‘Create an Evaluation’ pop-up, enter the name of your evaluation in the ‘Evaluation name’ field (we have called ours ‘ACE Test’ in the example below but normally it would be the name of the work you are evaluating).
In the ‘Create from’ field there are three options for creating your evaluation.
If you are new to the Toolkit or are creating an evaluation to meet your Arts Council funding requirements, we strongly advise you to use the ‘Template’ option. Using the Template option is a quick way to set up evaluations that meet your specific funding requirements.
When Template is selected, you will see a dropdown list of evaluation templates, i.e. evaluations that have already been created for you. You should select the template that is most relevant to your cultural offering. When you click on a template, a short description of it will appear underneath.
Click on Preview to view the template. This is useful if you want to see what the evaluation looks like before creating it. A green pop-up will appear at the top, highlighting the fact that you are in preview mode, i.e. ‘This evaluation is a preview only and cannot be edited’.
Depending on the template you select you may be prompted to choose the evaluation type i.e., ‘Core’ or ‘Flexible’*. If you are using a template but do not wish to create a Core or Flexible evaluation type please click No, thank you.
If you select Core, you will then be prompted to click on Confirm dimensions.
If you select Flexible, you will then be prompted to ‘Select dimensions’.
*For more information about Core and Flexible evaluations, please click here. If you are an NPO, you will be expected to complete a specific number of Core and/or Flexible evaluations each financial year. If you are unsure of the type and number you need to complete, please read the mandatory requirements for your specific funding status.
Please note: The dimensions that you select at this stage ensure that your evaluation meets either the Core or Flexible requirements. You can add more dimensions from other dimension categories in the ‘Design’ section, e.g. Placemaking dimensions.
Once you have clicked on Create Evaluation, the system will generate and display the surveys within that evaluation. These are the surveys for the different respondents i.e. Public, Peer, Self.
If you opt to click on Evaluation in the ‘Create an Evaluation’ box, you can create an evaluation from a previous evaluation you have made, allowing you to essentially create a duplicate of a pre-existing evaluation, just with a different name.
Creating from blank gives you the freedom to create your own evaluation from scratch. After giving your evaluation a name and selecting Blank you will be able to create your own surveys by selecting Create Survey on the top right-hand corner:
When you set up an evaluation, we strongly recommend you also add metadata tags; these are descriptors of your event which will enable you to make more detailed comparisons between your evaluations over time, or against aggregate data sets. If you intend to submit this evaluation to the Arts Council, these details will also give your Relationship Manager important context when reviewing your evaluation report.
Click on Properties on the front page of your evaluation. Here you can add details such as artform(s), event location and attendance numbers.
Once you select the tick box Event, you will be asked to choose the type of event you are offering. Then click Save (you should click save on all the answers you provide).
Add a location and whether your work is touring.
The overall attendance (this can be a rough number of people that will experience this work).
Lastly, add the start and end date of your work (optional).
You must now configure your surveys. This is where you can set up the following options for your survey:
Click on Configure which you will find in the navigation panel along the top.
Give the survey a name that you are happy for the respondents to see when they receive the survey. In this case it has been named ‘Public Survey – We want your feedback!’ but we would recommend that you customise the survey name to make it unique to the work you’re evaluating.
The start and close dates will determine when the survey begins and stops taking entries. We advise that you enter the dates nearer to the time of delivery. It is fine to leave the start and close dates blank. If you leave the start date blank, the survey will be available to record responses immediately and, if you leave the close date blank, the survey will remain active indefinitely. When inputting the start and close dates, we advise that you include an additional one or two days either side of the official start and close dates to ensure that you capture all available survey responses.
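The blank-date behaviour described above can be restated as a small sketch. This is illustrative only, not the platform's actual logic; the function name is ours:

```python
from datetime import datetime
from typing import Optional

def is_survey_open(now: datetime,
                   start: Optional[datetime],
                   close: Optional[datetime]) -> bool:
    """Restate the start/close rules: a blank (None) start date means
    the survey accepts responses immediately, and a blank close date
    means it stays open indefinitely."""
    if start is not None and now < start:
        return False  # the survey has not opened yet
    if close is not None and now > close:
        return False  # the survey has stopped taking entries
    return True

# With both dates left blank, the survey is always open:
assert is_survey_open(datetime(2024, 6, 1), None, None)
```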
Here you can write a short optional introduction to your survey that will appear on the front page of your survey. This introductory front page will only appear on the Online and Display survey delivery methods, but not for the Interviewer method. This is because interviewers delivering the survey by using the Toolkit should be able to provide this information verbally to respondents. The introduction is a good place to provide a brief explanation to respondents, such as:
‘We’d love to hear what you thought. Please share your views on this event via our short survey. Your feedback will help us to understand and measure the impact of our events.’
Here you can upload your own logo, which will appear on the front page with the survey introduction text.
You should not need to take any action here if you have chosen the Template or Evaluation option. However, if you selected Blank you will need to select the right option for your type of survey.
There are three survey types: Standard, Prior and Post.
When creating prior and post surveys for the same respondent group within an evaluation, you should ‘Survey Link’ them so that Culture Counts knows which data to compare; please see below:
This survey is a prior survey, so by selecting the Post-Event Survey for Self and Peer Respondents from the dropdown list, the two surveys will be synced so that details from the prior survey are copied over to the post survey. To complete your post survey, click on the Design tab and change the tense of the dimension statements from future to past in the dropdown options provided.
This section allows you to choose how the survey will be delivered to the public respondents. If the survey you are working on is for self and/or peer respondents, you do not need to select a delivery type. Culture Counts supports the following delivery types:
Select all the methods that you wish to use to deliver your survey. You can choose however many you wish. A unique survey link will be created for each method and will be displayed clearly on the Summary page at the end of the survey builder. For more information on the various delivery methods, please see the Getting Started With Thoughtful Evaluation guide.
You will be presented with advanced options at the bottom of the Configure page, related to the delivery type(s) you have chosen, including:
Please note that all of the methods require a stable internet connection or mobile data in order to collect survey responses.
Designing a survey is the process of adding questions and content to your survey. On the Design page you can add, modify or remove questions.
Depending on whether you selected Template or Evaluation, and Core or Flexible, at the commencement of the process (see step 4 above), you might find some dimensions are already populated on your Design screen. You can add or remove dimension questions, but if you are planning to use your evaluation to contribute to meeting funding requirements, you should ensure you don’t remove anything required.
In order to add content, simply click the type of question or content and it will appear in the survey section. Your questions should be chosen based on the objectives of your evaluation (the outcomes you want to measure that reflect your creative intentions for the work). If you want to learn more about best practice for making surveys in Culture Counts, visit the Getting Started With Thoughtful Evaluation Guide for more information. The types of survey content available in Culture Counts are outlined below:
A dimension is a standardised statement that respondents can agree or disagree with, rated using a slider. The slider represents a continuous scale from 0 to 1, where the far left is 0, the middle is 0.5 and the far right is 1. This means that every position on the slider has its own value.
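To make the continuum concrete, here is a minimal sketch of how a 0-1 slider value translates to the 0-100 scale that reporting uses. The function is illustrative, not part of the platform:

```python
def slider_to_report_scale(slider_value: float) -> int:
    """Convert a 0-1 slider reading to the 0-100 scale used in reports.

    The slider is continuous: 0 is the far left (strong disagreement),
    0.5 the midpoint, and 1 the far right (strong agreement).
    """
    if not 0.0 <= slider_value <= 1.0:
        raise ValueError("slider values lie between 0 and 1")
    return round(slider_value * 100)

assert slider_to_report_scale(0.5) == 50  # the midpoint maps to 50
```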
Culture Counts’ dimensions are based on extensive collaboration with the arts and cultural sector in order to identify the key elements that make up the quality of a cultural event or project. These elements have been translated into single statements to be included in Culture Counts surveys.
Arts Council England has identified a set of Core Cultural Experience Dimensions that NPOs will use in their mandatory evaluations. By using these dimensions, organisations will be able to benchmark their results easily against other Impact & Insight Toolkit users.
Select ‘Dimension’ from the toolbar (left-hand side).
Click in the Dimension Type box and you will be presented with a pop-up box called ‘What do you want to measure?’ which contains a dropdown menu, ‘Dimension Category’.
To add a dimension question, simply click the category you wish to use, select the dimension you want to add, then click Use selected dimension. After the dimension has been added to the survey, you can make appropriate amendments.
After the dimension has been added to the survey, click the dropdown next to ‘Question text’ and select variants of that dimension statement that best suit your survey. This is where you can change the tense of the statement as required.
For more support on using Culture Counts dimensions in your survey watch the support film here.
You can use a slider if you have a statement that you would like to measure reactions to. Responses are recorded along a slider with three labels, which default to “Strongly disagree”, “Neutral” and “Strongly agree” but can be customised when the question is added. As with the Dimension question format, the slider represents a continuous scale from 0 to 1, where the far left is 0, the centre point is 0.5 and the far right is 1. This means that every position on the slider has its own value.
This is where the three inbuilt demographic questions are stored: age, gender and postcode. By default, these three questions will be added to any survey you create.
There are two inbuilt experience questions: Net Promoter Score (NPS) and Overall Experience. NPS is a globally recognised metric for measuring the likelihood of customers recommending an event, organisation or product. Overall Experience is a popular question which can aid your understanding of an audience’s overarching satisfaction level.
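Because NPS is a standard metric, its calculation can be sketched: respondents rate how likely they are to recommend from 0 to 10, promoters (9-10) and detractors (0-6) are counted, and the score is the percentage-point difference between them. This is the general NPS formula, not code from the Toolkit:

```python
def net_promoter_score(ratings):
    """Compute NPS from 0-10 'how likely to recommend' ratings.

    Promoters score 9-10 and detractors 0-6; passives (7-8) count in
    the total but in neither group. NPS = %promoters - %detractors,
    giving a value between -100 and +100.
    """
    if not ratings:
        raise ValueError("no ratings supplied")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# Ten invented responses: six promoters, two passives, two detractors
assert net_promoter_score([10, 9, 9, 10, 9, 9, 8, 7, 6, 5]) == 40
```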
Provides a date input where respondents can select a date from a calendar.
Gives the respondent two options to choose from, which default to “Yes” and “No”.
A basic number input, where numbers can be entered by typing or changed using up and down arrows; minimum and maximum values can be specified.
Provides a dropdown list, where respondents can answer by choosing one of the options.
Provides a selection of answers, where respondents can answer by selecting one or more of the options.
A short text input is designed for smaller text responses, such as one-word answers. For this question format, the Culture Counts platform will automatically graph the most commonly used words.
A simple text input that allows respondents to type an answer or comment with no word limit. These responses are available via the CSV file download.
This displays a message within the survey but does not require respondents to provide an answer. It is typically used to provide respondents with additional instructions or context.
This option provides a section to ask for respondents’ email contact information. It is a simple text input designed to recognise whether the text entered is an email address. When the responses are downloaded, the respondent’s email address will automatically be separated from the rest of their data, so that their responses to other questions remain anonymous. We therefore recommend that you use the email question only once per survey, and for a single purpose (e.g. registering for one thing). Organisations using this option are ultimately responsible for complying with all of the attendant GDPR obligations around the collection and use of personal data.
To edit content, click on the question text you wish to edit and type your desired text. Some content types, like dimensions, cannot be edited as they are standardised. To remove a question, click on the rubbish bin icon in the top right of the question box. The icon with the four squares allows you to change the order of questions within your survey by dragging them to the correct position.
Please note that standard demographic questions – age, gender and postcode – are included automatically at the end of all Culture Counts surveys. You do not need to add these questions to the survey yourself, but you can delete them if you see fit.
Survey Logic allows you to hide questions based on their relevance to particular respondents. Asking targeted questions helps keep surveys short and achieves increased response rates.
Our Introduction to Survey Logic video provides users with a step-by-step guide to adding logic to survey questions in the Culture Counts dashboard, and provides insight into when this feature may be useful: https://www.youtube.com/watch?v=4kWkkEidnkU
One of the core functions of Culture Counts is the ability to invite and manage peer or self assessors. This can be done easily through our Invite page.
You can invite a peer reviewer that’s registered on the Peer Matching Resource. For more information on how to use this resource, please see here: https://impactandinsight.co.uk/support-materials/using-the-peer-matching-resource/
Similarly to the above, when you need to register self assessors for a survey, simply type the email address of the self assessor into the ‘Invite Self Assessors’ field, which can be found under the ‘Want to invite someone else from your professional network?’ field, then press Enter.
Once you have been through the Configure, Design and Invite stages, you have created your survey and will be taken to the survey Summary page.
Here you will be shown an overview of the survey, including the number of questions you have included, a list of the self and peer reviewers you have added, and a summary of the survey results so far. You can return to this page at any time to see how your survey is progressing and to learn which of your nominated respondents have completed their surveys.
The Summary page will contain the links (URLs) for your survey.
There will be a separate URL for each of the different delivery types (Online, Interview & Display) that you selected at configuration stage. It is important to use the correct URL for each method. To distribute a public survey, simply copy these links and use them in the relevant locations. For example, for a survey to be administered by interviews, enter the Interview link into the browser of your iPads or tablet computers. For a survey to be distributed via email, copy the Online link and paste it into the email you are sending to audience members/visitors.
Before distributing the survey, we recommend you click on Preview to the right of the survey link (the eye icon) so that you can make sure the survey looks just as you intended it to. The preview link will open the survey in a new tab for reviewing, but will not record any data.
Please remember that using the Online, Interview or Display link will always record the response as public, regardless of which survey it’s associated with.
The easiest way to distribute a survey to self and peer respondents is to send email invitations to your nominated assessors via the Culture Counts platform. If you scroll to the bottom of the survey Summary page, you will see that the system automatically generates a unique link to the survey for each of your nominated self and peer reviewers. Each self and peer reviewer must be sent their own unique link, and it’s easy to do this via the platform.
On the Summary page, click the Send invitation button next to your first self or peer assessor. A dialogue box will open to allow you to draft your invitation email. The survey’s URL for that particular respondent will automatically be included in the email.
If you are asking self and/or peer reviewers to complete both a prior and post survey, you need to invite each respondent to complete the prior survey first. Within each prior survey, Culture Counts automatically asks respondents when they are planning to attend the event. This date will show up in the survey Summary page, along with notification that the peer or self assessor has completed their survey. You will know to send self and peer reviewers their links to the post survey once they have completed the prior survey and experienced the work.
Once you’ve completed the evaluation and collected the public, peer and self responses, you can see the results of a specific survey on your Reporting dashboard. You can access this by clicking on Manage on your survey’s Summary page, then selecting Go to report from the dropdown menu.
Here you will be able to view graphs, updated in real time, which can be viewed online or downloaded in PDF format or as a ZIP file:
You have the option to download the Culture Counts generated charts, either as a .zip file, or as a complete .pdf file. If saved as a .zip file, you can extract individual charts to use as you like, for example in your own presentation documents.
You can also download the raw data in a CSV format. This can be opened in Microsoft Excel or Numbers, enabling you to conduct further analysis:
At the end of this document, you will find links to instruction videos explaining some of the basic analyses you can do using Excel.
To download any of these, simply click Export Evaluation on the top right-hand side of the main Reporting page.
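As a flavour of the further analysis a raw CSV export supports, here is a minimal sketch using only the Python standard library. The column headers and values below are invented for illustration and will not match the Toolkit’s actual export layout:

```python
import csv
import io
from collections import defaultdict
from statistics import mean

# A made-up miniature export; 'Captivation' and 'Challenge' are
# illustrative dimension names, not the real file's headers.
raw = """Respondent,Captivation,Challenge
public,0.82,0.64
public,0.75,0.70
public,0.91,0.58
"""

# Collect each dimension's 0-1 slider scores across respondents
scores = defaultdict(list)
for row in csv.DictReader(io.StringIO(raw)):
    for dimension in ("Captivation", "Challenge"):
        scores[dimension].append(float(row[dimension]))

# Mean slider scores rescaled to the 0-100 reporting scale
for dimension, values in scores.items():
    print(f"{dimension}: {round(mean(values) * 100)}")
```

The same calculation can, of course, be done with an AVERAGE formula in Excel or Numbers once the CSV is opened there.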
On this page, you can view the total number of responses, the total number of dimensions tracked, and total number of surveys contained in the evaluation. The page also provides a summary of mean average scores for each dimension from public respondents. The summary of dimensions is depicted in the form of a bar graph and is evaluated on a scale from 0-100. An average score of 0 demonstrates that respondents strongly disagreed, while a score of 100 indicates that respondents strongly agreed.
The demographics page displays the ratio of public respondents that identify as male, female and in another way in the form of a pie chart. This allows you to see whether your survey respondents were predominantly male, female or of an alternative gender. The demographics section also depicts the age breakdown of survey respondents in the form of a bar graph.
In this section you can observe the total number of survey responses, segmented into three categories: number of public responses, number of peer responses, and number of self-assessor responses. Average scores for each respondent type for each dimension are then presented in the form of a bar graph on a scale from 0-100.
This chart allows you to quickly compare the scores given by the public, peer and self respondent groups after experiencing the work.
In this section, you can see and compare the average prior and post dimension scores recorded for self and peer reviewers. The results are shown in a bar graph and are measured on a scale from 0-100, where 0 indicates strong disagreement and 100 indicates strong agreement. Comparing before and after scores demonstrates where the work met, fell below or exceeded expectations. This chart is particularly useful for organisations to identify the areas in which they achieved their creative intentions or objectives. If prior and post surveys have not been carried out as part of the evaluation, no data will be available in this section.
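The prior/post comparison described above boils down to a difference of mean scores on the 0-100 scale. A tiny illustration with invented numbers:

```python
from statistics import mean

# Hypothetical prior and post slider scores (0-1) from the same three
# peer reviewers for one dimension; the values are invented.
prior = [0.60, 0.55, 0.70]
post = [0.80, 0.75, 0.85]

# A positive change means the work exceeded expectations
delta = (mean(post) - mean(prior)) * 100  # expressed on the 0-100 scale
print(f"Change in mean score: {delta:+.0f}")
```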
Intelligence Questions are specific questions that have been standardised to allow for easy benchmarking and comparison. Intelligence questions often reflect common funder requirements and globally recognised measures, such as the Net Promoter Score. Utilising the pre-set Culture Counts intelligence questions can help to provide insights into marketing or outcomes specific to your organisation and will contribute to the wider database of sector insights. You can insert these questions on the Design page of any survey.
The sixth section of Reporting is titled ‘Response Origin’. Here you can see how the data for your survey was collected. Data can be collected from a combination of online responses, interviewer responses, display responses, peer reviewer responses and self assessor responses. The breakdown of responses is depicted in a pie chart.
When setting up a survey, you are given the option of including custom questions. This comprises any question type outside the standard dimension sets, inbuilt demographics or intelligence questions. This section of reporting presents the results from these questions in bar and pie chart format. Audience comments and other longer custom question responses can be accessed by exporting the raw CSV file. This can be done at any time from the reporting section by clicking Export Evaluation in the top right corner of the screen.
The Culture Counts platform enables organisations to share their evaluations with others to help the sector benchmark and learn. Many of our members choose to share their evaluations with a member of Counting What Counts staff, asking us to check it over before its public distribution. Sharing evaluations is also useful when undertaking a collaborative event with another member organisation.
To share an evaluation, simply click on the evaluation name in your dashboard evaluation list. Click the Edit button at the top right of your screen. Click Sharing Options from the pulldown list. In the box that says ’Share with another’, type in the email address of the person you would like to share with. They must also have a Culture Counts account so they can view your evaluation within their dashboard.
To the right of the email address is a dropdown list of sharing options. Choose ‘Viewer’ if you would like them to be able to look at your evaluation and results but not edit it in any way. If you would like them to be able to do more than just view your evaluation, choose ‘Admin’ from the pulldown list. This will give them the same access to the evaluation as you. This option is best for collaborative events, where both organisations want to share the creation and editing of their event surveys; both organisations would also have equal access to results.
If you would like to unshare with someone, simply open the Sharing options box again via the ‘Edit’ dropdown and click the x next to their organisation name. This is where you can also view each organisation’s permission level.
If you find yourself wanting to remove an evaluation or survey from your dashboard, you can do so by using the archive function. Archiving a survey or evaluation will remove it from your dashboard. It is similar to deleting something, except that it will not delete any associated data.
Data archiving is the process of moving a collection of data to a backup repository in order to separate current data from data that is no longer actively used. The files that are archived can be retrieved and this process is often used for long term data storage.
You can archive an evaluation by clicking into the evaluation you wish to archive and then clicking Edit on the top right corner. From the dropdown you can then select Archive.
You will then be prompted to confirm or abandon this process. Click Yes, archive this evaluation if you wish to continue. This will archive the evaluation, including all surveys within it. If you are a Creator or Admin of an evaluation, you also have the option to archive the evaluation for everyone it is shared with. This is useful when a project is no longer relevant, or you simply want to remove it from their dashboards. This will not remove their access to the data, however, and they can restore it themselves if they want to continue using it.
We wouldn’t recommend that you archive evaluations in the opening years of the Impact & Insight Toolkit programme. We will be encouraging you to analyse and compare your data, so in the early years of the project there is value in keeping your evaluations in your Dashboard. However, if you carry out a lot of evaluations using the platform then you may of course wish to archive your pilot or test events.
This is a similar process to archiving an evaluation. If you wish to archive a specific survey within an evaluation, you can do so by clicking Edit next to the survey and selecting Archive from the dropdown list. Again, you will be prompted to confirm or abandon this process. Click Yes, archive this survey if you wish to continue.
Evaluations and surveys that are archived are not permanently deleted and can be recovered if required.
To recover an archived survey or evaluation you will need to contact our support team. Be sure to mention your organisation name and the evaluation or survey name that you would like recovered in your support request.