In January 2021, I wrote about open source principles and how they can enable great things to be achieved. In the world of open source software, people contribute to and use projects that are useful. This creates a sort of virtuous circle, where a useful project attracts contributors who in turn help to make the project even more useful.
Essentially, you might only be a single contributor, but you benefit from the project in its entirety because the outputs of the project are freely available to you.
The Impact & Insight Toolkit project is like this – perhaps you are only collecting a relatively small amount of data for your organisation, but the things being built with the Toolkit benefit from all the evaluation activity taking place, as do you.
As we’re approaching the end of what has been another unusual year for us, the sector and everyone in general(!), we thought it would be a good time to look back at the year we’ve had and reflect on your contributions to the project and the outputs that we have been able to produce.
Contributions – evaluation and data collection activity
For the Toolkit, this year's contributions have centred on evaluation activity, and we have been supporting organisations in evaluating their work.
The charts below give a snapshot of evaluation activity which has taken place throughout the Toolkit project to date, with each evaluation year shown in a different colour to allow us to compare them. The x-axis shows the months April to March (not January to December) as this is the funding year.
This year has seen the highest levels of activity so far, despite the effects of COVID. In November 2021 alone, 15,500 surveys were collected. In 2019 (the only uninterrupted year so far), we saw the highest levels of activity over the festive season and the run-up to April. We do not yet know the extent to which the festive programmes of 2021 will be affected, so we may see momentum build into the New Year.
We are delighted to see this: it means we collectively have a greater opportunity to build fantastic tools and services using the data, and to support the organisations using them in getting an even greater return on their investment.
Toolkit outputs – useful and relevant tools this year
We’ve delivered many things this year including:
- New flexible reporting tools
- Flexible evaluations
- The Insights Dashboard
- Artform-specific dimensions
In my role as Lead Analyst, there are a few things we have delivered in addition to the above that I am particularly proud of, because I believe they support Toolkit users in traversing their data journeys. So I'd like to share some personal thoughts on these.
Anonymised open dataset
Unless you are a data analyst yourself, this might not seem like the most exciting thing. However, from the point of view of advancing data culture, being able to publish an anonymised version of the Toolkit data that is available for anyone to download is a real win, and something I feel strongly about continuing to do.
In many sectors the problem is that there is too much data: it is collected so quickly that people struggle to build systems that can process it. This is not the case for arts and culture. There is a distinct lack of publicly available and relevant data, and the data that is available is hard won. For this reason, being able to publish a large(ish) dataset representing distinct characteristics not easily found elsewhere, such as audience experience, is a huge step in the right direction, and one I am delighted we are taking.
Dimensions Interpretation Tool
With hindsight, perhaps we should have given this tool a better name as every time I talk about it, I feel the need to apologise for the mouthful that it is (sorry!).
Despite the name, I'm very happy that we have a tool that is solely focussed on providing insight to the users of the Toolkit, and which they have the opportunity to shape. We have already had feedback from a few Toolkit users, and that feedback has resulted in improvements to the Tool this year.
The Dimensions Interpretation Tool (sorry) is also powered by the data submitted by everyone using the Toolkit, and is therefore one of the outputs that benefits from everyone's contributions and helps power the virtuous circle: more and richer data makes for a better tool, and a better tool yields greater insight.
Finally, I think the functionality that this tool offers is crucial to help users of the Toolkit better understand their evaluations. The ability to compare and benchmark evaluation results to those from other evaluations provides important context for interpreting the data (although I am always open to hearing otherwise if there are differing opinions – please get in touch!).
Data-driven research
Since the beginning of the project, we've had the goal of analysing the aggregate dataset with specific research questions in mind, aiming to find answers that are insightful and valuable.
This year we conducted a piece of data-driven research into the ways different people experience different sorts of work. It was fantastic to do some hard analysis of the data and discover patterns that make sense – e.g. older people have a higher threshold for considering works to be distinctive – and that could be useful for those using the Toolkit to evaluate their work.
Research such as this isn’t the be-all-and-end-all, but it is an additional string to the Toolkit’s bow and one which demonstrates that the data being collected can be put to good use.
Our collaborative project is seeing the contributions we hoped for from our users, and we are working hard to deliver things which do those contributions justice. The journey is not over yet but, looking back at the year we've had, we are delighted by what's been achieved and excited about what's to come in 2022.