1. Introduction
The Impact & Insight Toolkit will assist Arts Council England and the CPP Programmes in the evaluation of the second key CPP Action Research question:
What approaches enable us to deliver on our aspiration for excellence, both in the process of community engagement and the creative and cultural experiences on offer?
The Toolkit will also assist Creative People and Places programmes and Arts Council England in their evaluation of how far CPP activity contributes to the aims of the programme, including but not limited to:
- Empowering communities to lead and shape local cultural provision;
- Enabling excellence and relevance in both the engagement process and the creative and cultural experiences on offer;
- Taking an Action Research approach to community engagement in arts, creativity, and culture; learning what works best and sharing that learning;
- Developing programmes that respond to the demographic in your area and are inclusive of the whole place;
- Responding to public creative and cultural interests and providing a taste of the widest possible range of high-quality experiences (as audiences and participants) to support people to make an informed choice about the kinds of culture they may like;
- Challenging and supporting people to try new and different things, with all partners being introduced to new and broader creative and cultural choices (local, national and international) and different ways of working with communities.
The Toolkit can be used to deepen your understanding of how well your intentions for your activities/events align with the experiences of your peers and your audiences. The Toolkit project uses the online evaluation platform, Culture Counts.
To learn more about the Toolkit, visit the website here – https://impactandinsight.co.uk
2. Mandatory Requirements and Fulfilling Your Payment Condition
It is a mandatory requirement for all National Portfolio Creative People and Places Programmes (funded from 1 April 2026) to undertake a minimum of four Impact & Insight Toolkit evaluations each financial year and share findings with their Relationship Manager via an agreed reporting template.
This mandatory requirement links to the Payment Condition in your Funding Agreement each May:
Data monitoring and quality evaluation monitoring reports for previous financial year in the format Arts Council England requests.
All CPPs must carry out at least four evaluations each financial year on Culture Counts. The four evaluations should be representative of your programme across the year. CPPs that do not have four events per year available to evaluate should discuss suitable usage with their Relationship Manager.
To fulfil your payment condition by May 2027, you must have:
- Carried out four evaluations on Culture Counts;
- Created an Insights Report for each of your four evaluations;
- Uploaded your Insights Reports to Grantium, or emailed them to your Relationship Manager and uploaded their confirmation of receipt to Grantium.
If you do not already have a Culture Counts account, you will need to register for one. Please use the Culture Counts registration link here – https://impactandinsight.co.uk/register/
Please see the guidance on creating an Insights Report here – https://impactandinsight.co.uk/resource/creating-an-insights-report/
If CPPs choose to undertake any evaluations in addition to the mandatory four, they are welcome to do so and to design them however they like.
3. Linguistically Easy Read Surveys
We recognise that many people who engage with the work of CPPs may not have English as their first language, or may have accessibility needs. This is why we have decided to use Linguistically Easy Read questions in surveys, where available. Find out more here – https://impactandinsight.co.uk/app/uploads/2026/01/Accessible-Survey-Questions-Project-1.2.pdf
4. What Should You Evaluate?
We encourage CPPs to evaluate a range of their work. CPPs will find the Toolkit most valuable if they use it to evaluate events where they can learn something interesting about the experiences of their audiences or participants, or where they have a particular hypothesis about programming or marketing that they wish to test. CPPs can discuss their event choices with their Relationship Manager if that would be helpful.
You can use the tool to evaluate any of the following activities:
- Events: for example performances, festivals, exhibitions;
- Creative Workshops and participatory experiences: for example craft workshops, singing and/or music groups;
- Decision Making Activities: for example community commissioning panels;
- Co-created work: for example a large-scale event which has been co-created by the community.
A mandatory evaluation should involve an activity/event that can be evaluated using either the Core CPP Experience Dimensions for audiences and/or participants (see below) OR the Core CPP Experience Dimensions for co-creators and/or community decision makers (see below).
The Core CPP Experience Dimensions for audiences and/or participants
The following metrics are mandatory when evaluating activities such as events (e.g., performances, festivals, exhibitions) and creative workshops and participatory experiences (e.g., craft workshops, singing and/or music groups):
- Distinctiveness (Linguistically Easy Read): I have not seen, watched or heard something like [this] before.
- Appreciation (Linguistically Easy Read): [It] made me think about other people and their culture in a new way.
- Connection (Linguistically Easy Read): [It] helped me to feel closer to the people in my community.
- Belonging (Linguistically Easy Read): I feel like I’m part of the community because of [it].
- Pride in Place (Linguistically Easy Read): [It] made me feel proud of the area where I live.
Important – The subject in square brackets (for example, ‘[It]’) should be edited in the Design tab of your survey to reflect the title of the experience/event/activity you are evaluating. For example, ‘I feel like I’m part of the community because of [it].’ could be edited to ‘I feel like I’m part of the community because of the Painting Workshop.’ This increases the specificity of the wording and ultimately improves the accessibility of the survey.
The Core CPP Experience Dimensions for co-creators and/or community decision makers
The following metrics are mandatory when evaluating activities such as decision making activities (e.g., community commissioning panels) and co-created work (e.g., a large-scale event which has been co-created by the community):
- Opportunity (Linguistically Easy Read): I have new opportunities because of [it].
- Voice (Linguistically Easy Read): People at [it] listened to my ideas.
- Decision Making (Linguistically Easy Read): [It] made me feel I can have a say in my community.
Important – The subject in square brackets (for example, ‘[It]’) should be edited in the Design tab of your survey to reflect the title of the experience/event/activity you are evaluating. For example, ‘I have new opportunities because of [it].’ could be edited to ‘I have new opportunities because of the Painting Workshop.’ This increases the specificity of the wording and ultimately improves the accessibility of the survey.
Evaluation Requirements
As a minimum, each evaluation must consist of:
- a Pre-activity Impact & Insight survey from a relevant member of staff outlining the creative intentions for the activity (we strongly recommend that you encourage more than one member of staff to complete this); AND
- a Post-activity Impact & Insight survey completed by a representative sample of the audience/participants/community decision makers or co-creators.
You may also wish to include:
- one post-event Impact & Insight survey completed by relevant peers;
- one post-event Impact & Insight survey completed by a relevant member of staff.
5. The Pre-activity Impact & Insight Survey (for staff)
The purpose of this survey is to outline the intentions for the activity, which will then be compared with the outcomes and experiences of the audience/participants. This survey should be carried out ahead of the activity, by a relevant staff member – usually the person who organised it. We strongly recommend that you encourage more than one member of staff to complete this. The survey respondent should answer the questions in a way that they would expect the audience/participants to respond, based on their intentions for the activity.
For example:
If the intention is to allow co-creators a strong voice in the creation of a project, the staff respondent should give the metric ‘Voice (Linguistically Easy Read): People at [it] listened to my ideas.’ a high score, reflecting their expectation that co-creators’ ideas will be listened to.
6. The Post-activity Impact & Insight Survey (for Audiences/Participants/Community Decision Makers and Co-creators)
This survey should be carried out after the activity has taken place. You should survey a representative sample of the audience/participants/community decision makers or co-creators.
Counting What Counts has produced guidance on appropriate sample sizes for different types of event, including participatory events, which typically involve smaller groups of people. As a general rule of thumb, CPPs should aim for 70-100 public responses per event where possible, depending on the type of activity you are evaluating.
Please see sample size guidance here – https://impactandinsight.co.uk/resource/sample-size/
7. Support in using the evaluation platform, Culture Counts
On a quarterly basis, Counting What Counts offers live, CPP-specific demonstrations of Culture Counts. Please see any upcoming dates here – https://impactandinsight.co.uk/resources-guidance/training/
If you need further support in using Culture Counts, please contact Counting What Counts – [email protected]
8. Appendix – Respondent groups
- Self prior – An Impact & Insight survey from a relevant member of staff outlining the creative intentions for the work
- Self post – An Impact & Insight survey completed by the same staff members that completed the prior survey
- Peer post – An Impact & Insight survey completed by relevant peers
- Public – An Impact & Insight survey completed by a representative sample of the audience/participants/decision-makers/co-creators