
Introduction

The Impact & Insight Toolkit aims to address long-standing challenges in the arts and cultural sector:

  • Measuring artistic, social and economic outcomes that are meaningful to and shaped by arts and cultural organisations
  • Building a sector-wide dataset on the impact of publicly funded arts and culture
  • Giving audiences a voice in debates about the value and impact of arts and culture
  • Strengthening peer review within the arts and cultural sector
  • Building evaluation capacity and capability
  • Encouraging cross-sector collaboration on the collection, interpretation and use of data

Ultimately, the Impact & Insight Toolkit aims to help arts and cultural organisations understand people’s perceptions of their work and how well those perceptions align with their creative intentions. It informs and enriches self-evaluation and the conversation between the Arts Council and NPOs about the impact of funded work.

 

Who’s who

Arts Council England (ACE): An executive non-departmental public body, sponsored by the Department for Digital, Culture, Media & Sport. ACE champions, develops and invests in artistic and cultural experiences to enrich people’s lives. It supports a range of activities across the arts, museums and libraries, from theatre to digital art, reading to dance, music to literature, and crafts to collections.

Counting What Counts (CWC): A team of consultants and researchers who specialise in impact evaluation and measurement.

Creative People and Places (CPP): ACE-funded organisations whose work centres on place. In accordance with their funding agreements, some CPPs are required to use the Toolkit.

Culture Counts (CC): An international performance evaluation platform that has developed research, methodologies and tools to assist state and local governments, councils, cultural organisations and other third sector agencies to implement outcomes-driven and evidence-based decision-making processes.

National Portfolio Organisations (NPOs): A core group of arts organisations that receive ACE funding.

 

How it works

Evaluation: An evaluation is the folder which contains the collection of surveys (typically Public, Peer and Self-assessor surveys) used to assess one specific piece of work.

Pre-event survey: Surveys issued before a work/performance/exhibition takes place are referred to as pre-event surveys.

Post-event survey: Surveys issued after a work/performance/exhibition takes place are referred to as post-event surveys.

Peer survey: Surveys issued to invited Peer reviewers are called ‘Peer’ surveys. Peer reviewers are usually surveyed after they have experienced a work (a post-event survey), giving them the opportunity to reflect on their experience and then provide feedback.

Peer reviewers: Peer reviewers are individuals who are deemed to offer useful insight as well as a fair, informed critique of a piece of work. A Peer reviewer is anyone who has not been directly involved in the curation or creation of the work being evaluated, but whose professional opinion the receiving NPO values and respects. For more information on becoming a Peer reviewer, please see our guidance.

Public surveys: Surveys issued to members of the public who have experienced the work are referred to as ‘Public’ surveys.

Public: Members of the general public who experienced the work. They might be audience members, visitors or participants.

Self-assessor survey: Surveys issued to self-assessors are referred to as self-assessor surveys. Self-assessors are usually surveyed both before the event (pre-event survey) and after it (post-event survey). The pre-event survey measures the self-assessors’ expectations of the work; the post-event survey measures their reflections on the work.

Self-assessors: Self-assessors are usually staff members within the hosting organisation or, better still, a range of key individuals involved in the event’s curation or creation. Organisations should think broadly about the individuals who have contributed to the making of the event; in this way, self-assessors might include the artists, curators, and marketing or education staff within the organisation.

 

Evaluation content

Dimensions: Metrics used to measure an intrinsic quality of a work. Within the Impact & Insight Toolkit, a dimension is made up of an outcome and a statement that captures that outcome.

For instance:

Pride: It strengthened my cultural pride

Custom questions: These are original questions created by the user and included in a Toolkit survey. Users can add these on the Design page of the survey builder, using a range of content types, including drop-down and multiple choice. Users are encouraged to include some custom questions in each evaluation to ensure that their evaluations speak directly to the unique outcomes of their work and their organisation’s mission.
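
For instance, a hypothetical multiple-choice custom question might ask ‘How did you hear about this event?’, with answer options such as social media, word of mouth or the venue’s website.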

Evaluation properties: Metadata tags, or descriptors of the work, which enable organisations to make more detailed comparisons between evaluations over time, or against aggregate datasets, such as in the Dimensions Interpretation Tool. These tags include artform, sub-artform, location and key words. These details are input by a user and, when submitted to the Arts Council, feed into the Insights Report, giving valuable context.
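
As an illustrative example, an outdoor dance performance might be tagged with the artform ‘Dance’, an appropriate sub-artform, the town or venue where it took place, and key words describing the work’s themes; these tags would then allow its results to be compared with other outdoor dance evaluations.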

Peer Matching Resource: The Peer Matching Resource is a database of over 600 Peer reviewers. Within the Culture Counts platform, the Peer Matching Resource allows the user to search for peers to provide a review of their work. Peer reviewers who have registered to join the database can be browsed and filtered so that the user can find a Peer reviewer who can provide the most valuable and relevant insight. The majority of Peer reviewers in this database work for an ACE-funded organisation, such as an NPO, and are Toolkit users themselves.

 

Reporting and analysis

Insights Reports: An Insights Report is a summary of a single evaluation’s results, comprising automatically generated graphs and statistics. It also offers space for the user to reflect on their creative intentions and the insights achieved.

Annual Summary Reports: The Annual Summary Report compiles the data from four evaluations. The report was created to help band 2 and 3 National Portfolio Organisations (NPOs) meet their funding requirements during the 2019-23 Arts Council England (ACE) funding period. Because of this, it compares only the dimension and demographic questions that were mandatory for band 2 and 3 NPOs at that time.

Benchmarking Dashboard: Formerly known as the Dimensions Interpretation Tool, the Benchmarking Dashboard is an interactive tool developed to help band 2 and 3 National Portfolio Organisations (NPOs) interpret their results for the mandatory dimension questions during the 2019-23 Arts Council England (ACE) funding period. This tool allows users to compare their survey results with those of other Toolkit evaluations, providing artform-specific context for their individual results.

Aggregate data set: Aggregate data refers to information that is collected from multiple sources and/or on multiple measures, variables, or individuals and compiled into data summaries or summary reports. For more information please see https://www.edglossary.org/aggregate-data/

Sample size: Sample size refers to the number of people from an audience who complete a given survey. The more audience members surveyed, the larger the sample size, and the more accurately the data represents the wider audience. For more information, see our sample size guidance.
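
As a rough illustration (assuming responses come from a simple random sample of the audience), the margin of error at a 95% confidence level is approximately 1 ÷ √n, where n is the sample size: a sample of 100 responses gives a margin of around ±10%, while 400 responses narrows this to around ±5%.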

 

Have we missed anything? Is there a term or word that you think we should include here? If so, please contact us at [email protected]