
‘Dimension’ questions are a central component of the Impact & Insight Toolkit.  But what are these ‘dimensions’ and where do they come from?


What are ‘dimensions’?

Dimensions are standardised statements, developed in collaboration with both the cultural sector and funders, that are included in surveys for respondents to rate their level of agreement with.  Each dimension is designed to capture data relating to a specific outcome that arts and cultural organisations may consider important in determining how successful their work was.

For instance, if it were important to my organisation that participants at my workshop felt as though their voices were heard, I might choose the dimension labelled ‘Voice’ and hope that the participants strongly agree with the corresponding statement, ‘I felt my ideas were taken seriously’.

If we were to use this dimension in surveys for all workshops, I and my organisation would be able to determine whether the workshops overall were succeeding in enabling participants’ voices to be heard, and perhaps even which of our workshops were more successful in achieving this than others.


Where do ‘dimensions’ come from?

When speaking with users of Culture Counts and the Impact & Insight Toolkit, we often remark that dimensions have not just been plucked out of the sky; they are the result of extensive collaboration.

This is absolutely accurate, but it doesn’t communicate the amount of work that has been invested in their creation, and we know that some Toolkit users would like to know more about this process.  We think that a better understanding of where the dimensions came from will give people a greater appreciation of their value.

If you feel that you would benefit from a rich understanding of the origins and development of dimensions, please do spend some time reading the thorough ‘Dimensions History’ report, written by Managing Director, John Knell, and Research Director, Marc Dunford.



A bit about the report

The report explains the process of creating the dimensions, which began in 2011. It’s a long report, going into detail on numerous co-creation projects and 10 years of development, including:

  1. Public Value Measurement Framework – Commissioned in 2011 by the Department of Culture and Arts (DCA) in Western Australia, the goal was to create a robust measurement system of intrinsic value that was simple, standardised, and evidence-based. This resulted in the creation of Culture Counts, the online evaluation platform.
  2. Manchester Metrics Pilot – Supported by Arts Council England (ACE) in 2012, 13 organisations in the North West of England engaged in a pilot to determine which key outcomes could best capture the quality and reach of cultural experiences and cultural production.
  3. NESTA Digital R&D Programme – Made possible by the Digital R&D fund in 2013, 19 organisations engaged in a project which focussed on continuing and developing the work that had resulted from the Manchester Metrics Pilot.
  4. Quality Metrics National Test (QMNT) – Funded by ACE in 2015-16, the QMNT involved 150 organisations using a set of dimensions to capture self, peer and public perceptions of their work using the Culture Counts platform.
  5. Quality Metrics National Test, Participatory Metrics Strand – Within the QMNT, there was further work, involving 20 of the organisations, to develop participatory dimensions. These aimed to capture quality of the process rather than the product.
  6. The Impact & Insight Toolkit – In 2018, ACE announced that they had commissioned the Impact & Insight Toolkit, which ran from 2019-23. This project involved approximately 400 ACE-funded arts and cultural organisations collecting data using dimensions across peer, self and public respondent types.  This enabled the testing and refinement of pre-existing dimensions on a much larger scale.
  7. The Impact & Insight Toolkit, Place-Based Research – Under the umbrella of the Toolkit project, a group of organisations were engaged to define place-based working and to create a set of new dimensions which help to capture outcomes of place-specific work.
  8. The Impact & Insight Toolkit, Second Iteration – In the autumn of 2022, it was announced that CWC would continue to work with ACE-funded organisations for a further three years, 2023-26, through which further testing and refining of dimensions could occur.

Each of these projects has followed a similar format:

  1. Bringing together a group of cultural organisations with the intention of creating metrics targeting a specific outcome area
  2. Facilitating workshops where the group brainstorms statements individually – usually with the provocation “What would you like people to say about your work?”
  3. Reviewing and refining the statements until the group reaches consensus

In some cases, third parties have been involved to test the new statements and provide feedback and suggestions for improvements.

Throughout this process, there has been a consistent objective: to create a standardised approach to metric statements and measurement that could open up the possibility of aggregating impacts across similar institutions, art forms, funding programmes, geographies, time periods, or other shared characteristics.

A few principles have been central to how we have gone about achieving this objective:

  • The cultural sector should be fully involved in developing the metrics.
  • The methods of data collection and analysis should have the capacity to produce bigger data sets and results at low cost and effort, which can help build a more developed data culture across the cultural sector.
  • The approach should challenge the perceived difficulty of gathering and harnessing data of this kind, alongside the gaps in data and evaluation expertise across the cultural sector.


Are you interested in knowing which dimensions were created when?  Or maybe you would like to understand the process behind the creation of the dimensions…

For the full detail on each of the metric development projects, we strongly recommend exploring the report.



Photo by Aditya Chinchure on Unsplash
