This case study is structured in a ‘question and answer’ format: Counting What Counts asked Rachael from Museum Development London (MDL) questions about their experience, and she answered. To make this case study easier to navigate, we’ve divided it into sections for you. Of course, we would recommend that you read the whole thing, but, if you’re particularly interested in how a specific type or size[i] of museum used the Impact & Insight Toolkit, click the relevant ‘case studies’ link below to jump straight to it.
Contents:
- Introduction
- Insights and Evaluation Tips from MDL
- Abridged Case Studies: Learnings, Changes and Reflections
- Complete Case Studies: Reasons, Challenges, Learnings, Changes, Effects and Reflections
- A few closing words from Counting What Counts
Introduction
Who are you in relation to the organisations mentioned in this study?
My name is Rachael Crofts and I work for Museum Development London (MDL), one of five Museum Development (MD) teams funded by Arts Council England (ACE) to deliver Museum Development Programmes across England.
Why did you decide to encourage the museums to participate in your project and in the Toolkit project?
Between January 2020 and March 2024, MD[ii] collaborated with colleagues in ACE and Counting What Counts (CWC) to design a programme to measure and evaluate the quality of the work our non-national, non-NPO museums produce[iii]. The aim was to demonstrate how these museums can deliver against ACE’s Ambition and Quality Investment Principle[iv].
As part of the programme, the group wanted to showcase:
- The quality of exhibitions and events non-national, non-NPO museums can create
- How museums can effectively use the evaluation findings provided through the Culture Counts platform to improve their overall visitor offer.
Below are our insights and tips from an MDL perspective.
Then follows a series of mini studies, which together make up this larger case study, demonstrating how London’s participating museums used the National Pilot Programme and the Impact & Insight Toolkit to aid their development. The studies were compiled from each museum’s Application Form and the Project Reports collated by MDL throughout the programme.
Due to the high staff turnover in these museums during and since the programme, the names of the museums involved have been redacted. Instead, we will focus on each museum’s funding type and size. With that in mind, please note that the accompanying images are not related to the specific museums and are used for decorative purposes only.
Our aim is for this case study to inspire organisations and demonstrate how it is possible to use and benefit from the Impact & Insight Toolkit as individual smaller organisations and as a collective.
Image credit – Ian Dooley
Insights and Evaluation Tips from MDL
Is there anything that you have learned through this experience, regarding evaluation practice in non-NPO museums?
All 6 museums involved in the London pilot were able to articulate, through the reports, graphs and data generated by the Impact & Insight Toolkit, the quality of the exhibitions and activities they created and produced.
Through their use of the Impact & Insight Toolkit, I have seen that participating museums have learnt to:
- Test the Impact & Insight Toolkit’s functionalities, standardised questions, core set of dimensions and audience data collection methodologies, in order to customise their forms going forward
- Re-emphasise the importance of training and development opportunities for their staff following the pandemic (for instance, undertaking and understanding Peer Review as a useful benchmarking tool to inform developments)
- Provide quantitative and qualitative evidence of the quality of their exhibitions, events and experiences offered, compared with other museums and NPOs all in one place
- Create and share Insights Reports through Culture Counts
- Develop a more embedded, realistic and achievable Evaluation Framework for future exhibitions, based on available: staffing, resources, technology and space
- Effectively build a case internally for their work, by presenting their data to their respective Board of Trustees, Councils, Committees and Senior Leaders
- Successfully use the data as a basis to apply for external funding (50% of museums have used it for this purpose)
Throughout the programme, all the museums and their staff commented on the amount of time and effort it took to understand how to use the dimensions in a consistent manner.
Despite this, all 6 museums wrote about the amount of support and time they received from MDL and CWC in capturing, understanding and using the data collected. As a result, these museums reported an overwhelming feeling that the time and effort had been worthwhile, and they wish to continue using the Impact & Insight Toolkit going forward.
Is there anything that you have learned through this experience from the data that’s been collected?
Throughout the process, MDL has been in a unique position, able to monitor the data that was collected by the individual museums. This was discussed with the participating museums at the beginning and they all agreed to their data being shared in this way.
As a result, MDL has been able to use the Impact & Insight Toolkit to see patterns and comparative data across responses from all 6 museums. This was shared with museums as part of the process. This helped to drive conversations about similarities and differences between how experiences at the various museums were perceived.
When comparing the museums’ results with the Impact & Insight Toolkit’s other datasets[v] – for all museums participating in the Toolkit, as well as for the other artforms that use the platform – I found that:
- All 6 museums scored higher in at least one dimension when comparing against average scores for all Toolkit participating museums.
- Half of the pilot museums received higher ‘Relevance’ scores when compared to the average scores of all Toolkit participating museums.
- None of the 6 museums had higher ‘Presentation’ scores when compared against the average score for all Toolkit participating museums.
- Half of the pilot museums received higher scores in 5 out of the 6 dimensions (‘Captivation,’ ‘Concept,’ ‘Enthusiasm,’ ‘Insight’ and ‘Relevance’) when compared against the average scores for all Toolkit participating museums.
- At least 2 of London’s pilot museums received higher scores in the same 5 out of 6 dimensions when compared to the average scores across all artforms.
- All 6 museums scored higher in at least one dimension when compared to the average respective scores from across all artforms.
- 5 out of 6 of the participating museums received higher ‘Insight’ scores when compared against the average scores for all artforms.
Ultimately, the ability to compare individual museum data against 3 other data sets within the pilot (the collective data amongst participating non-national, non-NPO museums; all museums participating in the Toolkit; and all artform data) has been incredibly useful to help drive individual and group conversations around the dimensions.
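As a rough illustration of what such a benchmark comparison looks like in practice, here is a minimal sketch in Python. All scores are invented for illustration on an assumed 0–1 slider scale; the study does not publish the underlying dimension averages, and this is not the Toolkit’s own tooling.

```python
# Flag the dimensions where a museum's average audience score exceeds a
# benchmark average (e.g. all Toolkit-participating museums).
# All figures are hypothetical.

museum_avgs = {
    "Captivation": 0.84, "Concept": 0.81, "Enthusiasm": 0.88,
    "Insight": 0.79, "Presentation": 0.82, "Relevance": 0.71,
}

benchmark_avgs = {
    "Captivation": 0.80, "Concept": 0.79, "Enthusiasm": 0.85,
    "Insight": 0.77, "Presentation": 0.86, "Relevance": 0.66,
}

# Keep only the dimensions where the museum beats the benchmark,
# with the margin rounded to 2 decimal places.
above_benchmark = {
    dim: round(museum_avgs[dim] - benchmark_avgs[dim], 2)
    for dim in museum_avgs
    if museum_avgs[dim] > benchmark_avgs[dim]
}

print(above_benchmark)
# {'Captivation': 0.04, 'Concept': 0.02, 'Enthusiasm': 0.03,
#  'Insight': 0.02, 'Relevance': 0.05}
```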
In particular, we noted that ‘Relevance’ was one of the lowest-scoring dimensions, both for the individual museums’ audiences and for the pilot’s non-national, non-NPO museums collectively. However, when these scores are compared with those of all Toolkit participating museums and with overall artform scores, London’s 6 museums can say that their audiences found their exhibition content more ‘relevant’ than both benchmark figures.
Such figures and comparative analysis have helped staff within these pilot museums to make positive changes to their exhibitions and the overall visitor experience in all 6 of London’s participating museums. Furthermore, 50% of the pilot museums have successfully used the data as a basis to apply for external funding.
The Impact & Insight Toolkit acts as a great benchmarking tool. Within London’s participating museums, it has enabled staff to really understand what they do well and where to direct resources and energy to make improvements. It also importantly demonstrates the high quality of exhibitions that non-national, non-NPO museums can produce for their audiences.
How will you move forward with your use of the Toolkit; do you have plans to encourage usage amongst other non-NPO museums?
Going forward, MDL plans to repeat the programme for another cohort of accredited non-national, non-NPO museums.
It is hoped that these museums will see as much benefit from using the Impact & Insight Toolkit as the first cohort. However, MDL will continue to monitor the effectiveness of the programme on these museums, to shape how it supports its non-national, non-NPO museums in evaluating the quality of their exhibitions going forward.
MDL further hopes that the Toolkit will continue to enable museums to formulate and articulate what they want to achieve, plan how to achieve it through the platform, and deliver and evaluate their activity from all perspectives.
Overall, what has your experience of the Impact & Insight Toolkit been like?
Over the course of the two years, I feel I’ve learnt a lot about the practicalities of using the Impact & Insight Toolkit as well as how to better support the non-national, non-NPO museums I work with.
As someone who has collected evaluation forms over the course of my career and used a number of platforms to collate and analyse the data gathered, I realise that no evaluation system is ever perfect; every platform has its pros and cons, and the Impact & Insight Toolkit is no different.
Initially, it did feel like a lot of organising of other people. However, as I and these museums became more familiar with the functionalities of the Culture Counts platform, it became easier to create new evaluations and tweak them to fit each museum’s own purposes.
Alongside the flexibility of the surveys, I find the platform’s benchmarking capability invaluable in helping to drive conversations within MDL as well as with the participating museums.
Overall, I’ve found that the Toolkit enables museums to build a better case for support, based on evidence, which can be used to demonstrate the quality of the work they produce to funders, external stakeholders and/or Local Councils.
Following their participation in the Toolkit project, the museums are better able to:
- Showcase the high quality of work they can and do create for their audiences
- Use the feedback to inform future activities
- Provide evidence needed to support successful future funding applications.
Do you have any final words of wisdom for other Toolkit users, especially those working with a consortium/group?
If you:
- Are looking to better understand the quality of the works you are producing against a consistent set of quality metrics and dimensions
- Would like to use a standardised set of survey questions that you could change over time
- Are able, as an organisation, to think through how you want to capture your Audience Surveys and to trial different collection methodologies until you find what works for you
- Are open to receiving feedback from your peers…
…then the Impact & Insight Toolkit is very useful!
Some general tips for all organisations using the Impact & Insight Toolkit:
- Roughly 90-100% of audiences across the 6 museums used the slider to score the first dimension they were presented with. However, as one would expect, this response rate drops with each subsequent dimension asked. Therefore, organisations should think not only about which dimensions they use, but also about their ordering, placing those they most want responses to nearer the top of the list[vi] (see the sketch after this list).
- In the pilot, ‘Enthusiasm’ was the last dimension audiences were asked about. Despite the successive drop-off in audience response rates, it seems to have been well received by museum audiences. So, if organisations decide to use ‘Enthusiasm’, it is well suited to being the last dimension asked in the schedule.
- When collecting audience surveys, use more than one data collection method. This not only increases the overall response rate, but also enables people from different demographic groups to complete the surveys through the method they feel most comfortable with.
- Where possible, get someone from another organisation to undertake a Peer Review of your exhibition/event/activity. Overall, Peer Reviewers gave more specific feedback in the free text questions, which provided the museums with key qualitative data to use.
- Organisations that are open to receiving constructive criticism can use the Toolkit to build a case for support with senior managers, Boards of Trustees, councillors and funders. Some feedback will be what the organisation is expecting, but it often lands better (and is more likely to be addressed!) when it comes from a third party.
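The drop-off pattern described in the first tip above is easy to quantify from exported responses. Below is a minimal sketch, assuming responses arrive as a list of dicts in which unanswered dimensions are recorded as None; the field names, dimension order and data are hypothetical, not the Culture Counts export format.

```python
# Compute the proportion of respondents who answered each dimension,
# in the order the dimensions were asked, to reveal any drop-off.

responses = [  # hypothetical exported responses (None = not answered)
    {"Captivation": 0.9, "Concept": 0.8, "Insight": None, "Enthusiasm": None},
    {"Captivation": 0.7, "Concept": None, "Insight": None, "Enthusiasm": None},
    {"Captivation": 1.0, "Concept": 0.9, "Insight": 0.8, "Enthusiasm": 0.9},
]

dimension_order = ["Captivation", "Concept", "Insight", "Enthusiasm"]

for dim in dimension_order:
    answered = sum(1 for r in responses if r.get(dim) is not None)
    print(f"{dim}: {answered / len(responses):.0%} answered")

# Captivation: 100% answered
# Concept: 67% answered
# Insight: 33% answered
# Enthusiasm: 33% answered
```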
Some final tips on working with a consortium using the Impact & Insight Toolkit:
- Work with the organisations within the consortium to agree on a key set of demographic data you wish to collect (such as age, gender identity, disability status and ethnic identity), as well as a set of questions you all wish to use. This enables you to compare and analyse your own audiences’ thoughts against other responses from those within the consortium.
- Agree on a small core set of dimensions that all organisations within the consortium will always use in their surveys. Then, allow organisations to choose a couple of organisation-specific dimensions too. These additional dimensions should either allow organisations to showcase the quality of a specific exhibition against a specific aim, or enable them to demonstrate how it meets a funder’s criteria. This approach has two benefits: firstly, the shared core enables the consortium to benchmark and help drive change; secondly, the organisation-specific dimensions enable organisations to provide statistical and graphical data that address any specific funder’s or council’s requirements.
- Use the Peer Review element of the Impact & Insight Toolkit as a training and development opportunity for staff within the consortium. The consortium can organise 1-2 reciprocal Peer Reviews of each of its members’ exhibitions. This will ensure that an important voice is heard alongside your audiences’. It also ensures that, whilst it is an external voice, the reviewer understands the overall nature of the work and can give feedback with more confidence.
If you are interested in more information about MDL’s overall Measuring Up Programme, as well as the full Evaluation Report, please visit our website. Finally, if you are an accredited non-national, non-NPO museum in London interested in using the Impact & Insight Toolkit to evaluate the quality of the exhibitions and events you produce, please get in contact with MDL.
Image credit – Rodolfo Cuadros
Abridged Case Studies: Learnings, Changes and Reflections
- Organisation 1: Small Local Authority Museum
- Organisation 2: Small Independent Museum
- Organisation 3: Medium Local Authority Museum
- Organisation 4: Medium Ex-Local Authority, now Independent Museum
- Organisation 5: Medium Independent Museum
- Organisation 6: Large Independent Museum
Organisation 1: Small Local Authority Museum
What did they learn from the data?
The museum felt the Toolkit enabled them to capture deeper feedback from a range of visitors, over and above their usual review of ‘visitor book’ comments. They were surprised by the proportion (55%) of visitors in the 25-44 age group who completed their Audience Survey on their own device, compared with previous styles of evaluation.
Staff were also surprised that 100% of people responding to the travel question came by public transport; their presumption had been that a high proportion of visitors arrive by car. Postcodes showed an expected spread of locals with a scattering of wider London origins.
Visitor positivity about the shop and the great welcome from staff confirmed previous feedback. People responded positively to the ‘3D things to look at’ in addition to framed texts and asked for more hands-on activities within the exhibition spaces, as well as wanting to know more about the museum’s objects and themes.
The museum found that its Peer Reviewers provided a more in-depth assessment of the exhibition than the public. Reviewers scored the dimensions consistently lower than the public, which is to be expected, but their accompanying comments were heart-warmingly positive and/or very useful specific prompts for improving future exhibitions.
Using this evidence, what change(s) did they make to their offer?
The museum purchased two standalone iPads: the first to capture future audience surveys; the second to add an additional interpretation level to the museum space and enable visitors to directly link to additional content about their objects online (such as films and podcasts they had already created).
Staff also decided to ensure any future exhibition included some handling objects/tactile element, along with a family-friendly activity. Finally, they also decided to purchase further cases at different heights so they could display more objects throughout spaces for all their visitors to interact with.
These changes were made in direct response to feedback from both the visitors and the peer reviewers.
Final thoughts from the organisation itself:
We’re keen to continue to use the Impact & Insight Toolkit for as long as it’s available to us. Even though initially it seemed like lots of forms and things to consider, by the end of the programme we have seen how it’s manageable in scale, relatively easy to use, generates analytics and has the imprimatur and clout of Arts Council England, and we have really seen the benefit of using the feedback to drive the planning of our future exhibitions; thank you for your help in understanding how to use the platform effectively in our museum setting.
Image credit – Nick Night
Organisation 2: Small Independent Museum
What did they learn from the data?
The data collected through the Impact & Insight Toolkit, from Audience Surveys and Peer Reviews, gave the museum confidence that it was going in the right direction. The evidence provided through the platform showed the museum was reaching new, younger audiences who are interested in learning more about diverse elements of their collection.
Visitor feedback was extremely positive, with scores higher than the museum’s self-assessed scores. This reinforced that the museum was connecting well with its audiences, that audiences understood the rationale for the new strategic priorities around collections and exhibitions and, overall, that the museum was heading in the right direction.
Through the feedback collected from the Peer Reviews, they identified a need to review the approachability of staff and volunteers, as well as their accessible resources. One of the Peer Reviewers also commented it was important to continue funding this type of work, if the museum wanted to continue to achieve against their overall goals.
Using this evidence, what change(s) did they make to their offer?
The feedback enabled the museum to review the approachability of staff and volunteers during a visit, as well as the accessibility of the interpretation. This has helped the museum develop their Front-of-House training programme going forward.
Overall, staff felt the data provided strong support to continue working in this direction, and, as a result, decided to continue programming more diverse exhibitions in partnership with artists and partners from ethnic majority groups.
Final thoughts from the organisation itself:
The value has been understanding the nuance of what visitors think about [our] exhibitions, as well as understanding how we gather and use the data. [Through the programme] we also understood the value of brevity [in our surveys] as a lot of our first round of Audience Surveys were unfinished.
Image credit – Michal Parzuchowski
Organisation 3: Medium Local Authority Museum
What did they learn from the data?
The exhibition and event, which were informed by and utilised new research into the marginalised voices of the museum’s permanent collection, were well received by visitors. Visitors rated the exhibition highly against all the dimensions; in particular, 100% of respondents either strongly agreed or agreed that the exhibition had given them new insights and knowledge (‘Insight’ dimension).
Through the Audience Surveys, staff found that 63% of respondents said the exhibition was good because it covered important historical stories, with 38% of respondents highlighting the importance of the specific local historical stories the exhibition was sharing.
Furthermore, both their audiences and peers rated the exhibition higher for production (‘Presentation’ dimension) and for its underlying idea (‘Concept’ dimension) than staff had in the self-assessment process.
Whilst the data showcased how much both peers and visitors enjoyed how the museum explored the different histories, both groups expressed, in the free text questions, that they wanted more information on the other stories on display and online, as well as opportunities to ‘hear’ these stories as part of the museum’s very successful adult tours and within spaces in the museum itself.
Using this evidence, what change(s) did they make to their offer?
It was clear from the Peer Reviewers and audience feedback that the stories being told were very important and relevant to visitors. However, there was feedback that visitors wanted to ‘hear’ these stories as well as read them. As a result, the museum has started to create a film series of stories they have been collecting, which has been added to the museum’s website and YouTube channel.
Staff felt they wanted to do more of this and ensure such stories can be ‘heard’ in the gallery spaces. As such, they have included the costs to do this within a larger, and subsequently successful, funding application. This will enable the museum to extend its work on growing a collection of community films that highlight local stories.
Not only will these stories be able to be ‘heard’ in the museum, but the museum intends to establish standalone tablets at Front Desk and in exhibition spaces to capture visitor feedback.
Final thoughts from the organisation itself:
The process was helpful for us all, particularly for those new to evaluation and analysing surveys. It was useful being part of a well-supported process to pilot new ways/technology for recording information. It provided us with good data and information to assist with shaping/adapting our future projects. [The Toolkit has enabled us to generate] a more professional report based on the analysis of our visitor experiences and professional peers’ comments, which can be easily shared with our staff and Local Council.
Image credit – Gillian Lingard
Organisation 4: Medium Ex-Local Authority, now Independent Museum
What did they learn from the data?
Visitors seemed to have really enjoyed their visit but were not always clear in providing the reasons for this. 59% felt the exhibition was interesting and informative (‘Concept’ dimension); 75% of respondents felt they gained new knowledge (‘Insight’ dimension).
Staff found that, although the Peer Reviewers generally scored the exhibition and event lower than museum staff had hoped, there were still a lot of positive comments and helpful suggestions.
Using this evidence, what change(s) did they make to their offer?
The feedback has encouraged staff to continue reviewing other permanent exhibitions. Staff members have used the feedback collated to successfully apply for additional external funding to develop interpretation alongside the introductory films, with the support of community groups and academics.
Staff felt the feedback confirmed a few issues that they were already aware of. However, it did also give them some detailed feedback on the event that they would not have received without using the Impact & Insight Toolkit. This has encouraged staff to put in place more specific review and feedback for their programming.
Final thoughts from the organisation itself:
It [the pilot project] has embedded the use of visitor evaluation throughout our work. The initial use of the platform was interesting because it embraced the whole visit – arrival, signage, welcome etc – which covered more than we initially thought. This feedback proved useful in better understanding how visitors experience our site. The different ways in which we gathered data has also been a useful learning process.
Image credit – Yuya Hata
Organisation 5: Medium Independent Museum
What did they learn from the data?
Overall, staff found visitors rated the exhibition very highly, sometimes higher than they had rated themselves as part of the Self-Assessment. Visitors commented on enjoying the personal family connection of the exhibition and commended the intimacy of the exhibition, with 100% of all visitor respondents agreeing or strongly agreeing the exhibition was absorbing and held their attention (‘Captivation’ dimension).
The museum’s peers rated the exhibition either the same as or higher than the museum had rated itself. However, peers also suggested that the content of the exhibition, and of the whole museum, could be more accessible for its visitors; they suggested the museum consider its text sizes and produce large print materials, multilingual resources and resources for families.
Visitor and peer feedback also noted that, within the museum, visitors would be unaware of any of the associated events the museum was putting on, or of how to donate or become a Member or Patron.
Using this evidence, what change did they make to their offer?
As a result of the feedback, staff developed and printed a short Exhibition Guide in 3 languages for the next exhibition: English, Spanish and Portuguese. Staff also developed a letter writing and drawing activity for visitors to complete and then display outside of the exhibition for visitors of all ages.
Furthermore, staff wished to address the need to provide information to visitors about upcoming events. As such, the museum purchased a digital screen for the entry space to share information on its current exhibition, upcoming exhibitions, associated and general events, weekly tours, membership and patron schemes.
Lastly, the museum decided to trial a new in-house marketing tool, Beaconstac, for a year, to explore a new interpretation method, think about new ways to help visitors discover its collections, and measure visitor traffic and engagement. The tool also enables staff to monitor the locations at which visitors access the various QR codes most often.
Final thoughts from the organisation itself:
We are learning that feedback is an opportunity for growth and should be embedded into future planning to ensure continued success of our work. We have a better understanding of what visitors want to see from us long term and what would make visitors consider visiting us again and have more confidence in exploring this.
Image credit – Emily Webster
Organisation 6: Large Independent Museum
What did they learn from the data?
Audiences scored the exhibition highly across all the dimensions, with 91% of respondents either agreeing or strongly agreeing the exhibition was relevant to modern society (‘Relevance’ dimension).
However, although 84% of responding visitors either agreed or strongly agreed that the updates to the permanent exhibition had been well produced and presented (‘Presentation’ dimension), when looking at the reasons why the visitor had given them their score, 31% indicated they wanted more…
Staff found that many of the comments reiterated findings from other surveys and interviews conducted with their audiences, which staff were already aware of, reinforcing the need for improved wayfinding around the museum.
In a similar way, staff found the Peer Reviewers highlighted things they were already aware of. However, overall, the feedback was really positive, with all the Peer Reviewers stating that they had enjoyed their visit and found it educational. Front of House staff were spoken of very highly, and staff were able to share this across the organisation, improving morale in these key customer service roles.
Using this evidence, what change(s) did they make to their offer?
As a result of the data and feedback, staff looked at commissioning a designer to improve the wayfinding experience of visitors.
Final thoughts from the organisation itself:
The introduction of the Impact & Insight Toolkit has allowed us to explore a programme for collecting responses, which we have not had access to before. The mixture of self, peer and audience surveys all gathered together in one place has allowed us to take into account more feedback than we would have before, as well as allowing us to see what our own expectations are against what our peers and audiences think…. [The Platform] has been good for compiling all of the results into one place and for creating visual diagrams of the results.
Image credit – Meizhi Lang
Complete Case Studies: Reasons, Challenges, Learnings, Changes, Effects and Reflections
- Organisation 1: Small Local Authority Museum
- Organisation 2: Small Independent Museum
- Organisation 3: Medium Local Authority Museum
- Organisation 4: Medium Ex-Local Authority, now Independent Museum
- Organisation 5: Medium Independent Museum
- Organisation 6: Large Independent Museum
Organisation 1: Small Local Authority Museum
Why did they choose to participate in the project?
There were 2 reasons why the museum wished to participate:
- They wanted an opportunity to trial and test a new evaluation framework to see if it could be implemented within a small site like their own
- They wanted to utilise the opportunity of being peer reviewed and being a peer reviewer to encourage staff development and benchmark their work against other museums.
What challenges did they experience and how did they overcome them?
The museum, with a very small staff team and volunteers manning its Front Desk, found it hard work collecting Audience Surveys for both its exhibition and the event. As such, halfway through the exhibition run, the museum started to display the QR code, from the ‘Online’ link, around the exhibition and museum spaces. Staff found that people felt comfortable using their own mobiles to complete evaluation forms privately at their own pace.
Staff also decided to trial and compare the approach of face-to-face interviews versus collecting the surveys via a standalone iPad embedded within a stand at Front Desk, using the ‘Display’ link. Staff felt that by the end of the project they had a much clearer idea of the staff input needed to conduct face-to-face interviews, and for a museum of their size and staffing structure felt that the standalone iPad was a much more viable and less resource-heavy route that worked for them.
What did they learn from the data?
The museum felt the Toolkit enabled them to capture deeper feedback from a range of visitors, over and above their usual review of ‘visitor book’ comments. They were also surprised by the proportion (55%) of visitors in the 25-44 age group who completed their Audience Survey on their own device, compared with previous styles of evaluation.
Staff were also surprised that 100% of people responding to the travel question came by public transport; their presumption had been that a high proportion of visitors arrive by car. Postcodes showed an expected spread of locals with a scattering of wider London origins.
Visitor positivity about the shop and the great welcome from staff confirmed previous feedback. People responded positively to the ‘3D things to look at’ in addition to framed texts and asked for more hands-on activities within the exhibition spaces, as well as wanting to know more about the museum’s objects and themes.
The museum found that its Peer Reviewers provided a more in-depth assessment of the exhibition than the public. Reviewers scored the dimensions consistently lower than the public, which is to be expected, but their accompanying comments were heart-warmingly positive and/or very useful specific prompts for improving future exhibitions.
Using this evidence, what change(s) did they make to their offer?
The museum purchased two standalone iPads: the first to capture future audience surveys; the second to add an additional interpretation level to the museum space and enable visitors to directly link to additional content about their objects online (such as films and podcasts they had already created).
Staff also decided to ensure any future exhibition included some handling objects/tactile element, along with a family-friendly activity. Finally, they also decided to purchase further cases at different heights so they could display more objects throughout spaces for all their visitors to interact with.
These changes were made in direct response to feedback from both the visitors and the peer reviewers.
How was this change received, either by public or by team members or peers?
Feedback from visitors and peers alike has been overwhelmingly positive and visitors have enjoyed interacting with the handling objects on open display as well as the small-scale activity. The museum plans to continue adding these additional layers of interpretation into all future temporary exhibitions.
Finally, Front of House staff were initially nervous about having the standalone iPad at the Front Desk. As such, they have received further training on how to steer visitors towards the standalone iPad and how to encourage them to use their own phones via the QR code at the Front Desk.
The training has helped more staff understand the data being collected and given them confidence to approach visitors and answer any questions they have about the methodology. Together, this has helped the museum gain further responses from visitors.
Final thoughts from the organisation itself:
We’re keen to continue to use the Impact & Insight Toolkit for as long as it’s available to us. Even though initially it seemed like lots of forms and things to consider, by the end of the programme we have seen how it’s manageable in scale, relatively easy to use, generates analytics and has the imprimatur and clout of Arts Council England, and we have really seen the benefit of using the feedback to drive the planning of our future exhibitions; thank you for your help in understanding how to use the platform effectively in our museum setting.
Image credit – Simon Infanger
Organisation 2: Small Independent Museum
Why did they choose to participate in the project?
There were 3 reasons why the museum wished to participate:
- Over the previous 2 years, the museum had tried a new approach to exhibitions and events and wanted to better understand the successes and failures of this approach
- Staff felt the Toolkit and Peer Review element would be valuable in shaping the museum’s future exhibitions and programming
- Staff wanted to gain a better understanding of how visitors engaged with their exhibitions and events.
What challenges did they experience and how did they overcome them?
The museum decided to combine the standardised Audience Survey used by all museums in the pilot with its usual Visitor Survey. It felt this would be a good use of resources and would enable volunteers to focus on encouraging visitors to complete just one survey.
However, looking at the number of visitors who completed the survey in its entirety, it was clear to staff that many visitors struggled to do this and had ‘given up’ by the end. Feedback from volunteers at Front Desk reiterated this, with many noting that visitors didn’t complete the form and that they were unable to encourage them to complete the final questions.
Therefore, staff created an abridged version of the Audience Survey, which included the dimensions alongside the visitor survey questions. This was trialled, and staff found that visitors were more likely to complete the whole survey.
What did they learn from the data?
The data collected through the Impact & Insight Toolkit, from Audience Surveys and Peer Reviews, gave the museum confidence that it was going in the right direction. The evidence provided through the platform showed the museum was reaching new, younger audiences who are interested in learning more about diverse elements of their collection.
Visitor feedback was extremely positive, with scores higher than the museum’s self-assessed scores. This reinforced that the museum was connecting well with its audiences, that audiences understood the rationale for the new strategic priorities around collections and exhibitions and, overall, that the museum was heading in the right direction.
Through the feedback collected from the Peer Reviews, they identified a need to review the approachability of staff and volunteers, as well as their accessible resources. One of the Peer Reviewers also commented it was important to continue funding this type of work, if the museum wanted to continue to achieve against their overall goals.
Using this evidence, what change(s) did they make to their offer?
The feedback enabled the museum to review the approachability of staff and volunteers during a visit, as well as the accessibility of the interpretation. This has helped the museum develop their Front-of-House training programme going forward.
Overall, staff felt the data provided strong support to continue working in this direction, and, as a result, decided to continue programming more diverse exhibitions in partnership with artists and partners from ethnic majority groups.
How was this change received, either by public or by team members or peers?
As a result of continuing to programme more diverse exhibitions and events, the museum has been able to track through the Toolkit that it is diversifying its audience profile.
Extracts from the feedback and data have been shared with volunteers, staff and the Board of Trustees throughout, making the Toolkit an important advocacy tool and a framework for informing future activities based on the feedback.
The abridged Audience Survey created and trialled through the Toolkit can now be used as a template in its own right for future Audience Surveys. This will save staff considerable time going forward and gives them a bespoke framework, embodying all the elements they have found work for them, using the Toolkit to capture and analyse responses.
Final thoughts from the organisation itself:
The value has been understanding the nuance of what visitors think about [our] exhibitions, as well as understanding how we gather and use the data. [Through the programme] we also understood the value of brevity [in our surveys] as a lot of our first round of Audience Surveys were unfinished.
Image credit – Vidar Nordli Mathisen
Organisation 3: Medium Local Authority Museum
Why did they choose to participate in the project?
There were 2 reasons why the museum wished to participate:
- The museum wanted a chance to reconnect with other museums and the work they are doing by undertaking Peer Reviews, and to use these as a benchmarking exercise to inform its own programmes
- They wanted to better understand how their audiences felt about their exhibitions.
What challenges did they experience and how did they overcome them?
As a local authority museum, staff wanted to trial different ways of capturing Audience Surveys. This included seeing how well a standalone iPad worked in encouraging and collating visitor feedback, to judge whether it was worthwhile purchasing one for this purpose. As such, the museum used a tablet loaned by CWC for the duration of the project, testing whether the approach could work without dipping into already tight budgets unnecessarily.
The museum also had a relatively new team who did not have much, if any, previous experience of undertaking evaluation and gathering data. As such, the new team was initially hesitant to get involved and ask visitors to complete surveys.
The pilot was an opportunity to:
- Train new staff on why the museum needed to evaluate its work, and on why they, the staff and volunteers, were important in capturing feedback
- Discuss and trial the different data collection methods the Toolkit allowed.
As a result of this work, there was more buy-in from staff and volunteers to help gather the surveys and encourage visitors to complete them.
What did they learn from the data?
The exhibition and event, which were informed by and utilised new research into the marginalised voices of the museum’s permanent collection, were well received by visitors. Visitors rated the exhibition highly against all the dimensions; in particular, 100% of respondents either strongly agreed or agreed that the exhibition had given them new insights and knowledge (‘Insight’ dimension).
Through the Audience Surveys, staff found that 63% of respondents said the exhibition was good because it covered important historical stories, with 38% of respondents highlighting the importance of the specific local historical stories the exhibition was sharing.
Furthermore, both their audiences and peers rated the exhibition higher for production (‘Presentation’ dimension) and for its underlying idea (‘Concept’ dimension) than staff had in the self-assessment process.
Whilst the data showcased how much both peers and visitors enjoyed how the museum explored the different histories, both groups wanted more information on the other stories on display and online, as well as opportunities to ‘hear’ these stories as part of the museum’s very successful adult tours and within spaces in the museum itself.
Using this evidence, what change(s) did they make to their offer?
It was clear from the Peer Reviewers and audience feedback that the stories being told were very important and relevant to visitors. However, there was feedback that visitors wanted to ‘hear’ these stories as well as read them. As a result, the museum has started to create a film series of stories they have been collecting, which has been added to the museum’s website and YouTube channel.
That said, staff felt they wanted to do more of this and ensure such stories can be ‘heard’ in situ in the gallery spaces. As such, they have included the costs to do this within a larger, and subsequently successful, funding application. This will enable the museum to extend its work on growing a collection of community films that highlight local stories.
Not only will these stories be able to be ‘heard’ in the museum, but the museum intends to establish standalone tablets at Front Desk and in exhibition spaces in order to capture visitor feedback.
How was this change received, either by public or by team members or peers?
The stories produced so far have been well received. Staff feel the data collected has helped them build a strategy for identifying priorities going forward.
As a result, staff felt they could demonstrate a vastly improved narrative of the building, one more inclusive of the marginalised voices within their collections, as well as build a case for including future videos and audio clips within the gallery itself and online.
The reports that the Impact & Insight Toolkit can generate have been incredibly useful as staff worked on the funding application; they have also helped raise awareness of the museum within the council and have been shared with the consultants developing the museum’s new Audience Development Plan.
Final thoughts from the organisation itself:
The process was helpful for us all, particularly for those new to evaluation and analysing surveys. It was useful being part of a well-supported process to pilot new ways/technology for recording information. It provided us with good data and information to assist with shaping/adapting our future projects. [The Toolkit has enabled us to generate] a more professional report based on the analysis of our visitor experiences and professional peers’ comments, which can be easily shared with our staff and Local Council.
Image credit – Neon Wang
Organisation 4: Medium Ex-Local Authority, now Independent Museum
Why did they choose to participate in the project?
There were 3 reasons why the museum wished to participate:
- Staff felt it provided an opportunity to test, with visitors, the newly co-produced, rewritten text panels within its permanent exhibition spaces
- Staff also wanted to gain a greater understanding of visitors’ overall experience of its permanent exhibitions
- Finally, the museum wanted to collect and record evaluation in new ways and have the chance to receive professional advice through the Peer Review process.
What challenges did they experience and how did they overcome them?
Although the museum trained volunteers to conduct 1:1 interviews with visitors, limited volunteer availability meant the museum struggled to gather as many surveys as it had intended.
As a result, the museum decided to email out the survey to ticketholders and ask for feedback through social media. Staff found that they received a lot more feedback by doing this and plan to continue using both collection methods going forward.
When analysing the data, staff found that, although visitors gave very positive feedback, it was less specific and analytical. Visitors seemed to have really enjoyed their visit; however, they were not always clear about what the questions were asking.
Feedback from the volunteers undertaking the 1:1 interviews confirmed that visitors did sometimes struggle to understand the meaning of the different dimension statements.
Following a discussion with MD, the evaluation template used in the second phase of the project provided prompts prior to each dimension. An example of this is shown below:
Prompt: The next question is around Insight. Consider whether you learnt something new or saw the exhibition’s content/subject matter in a new light and why. If not, consider why…
Dimension: It helped me gain new insight or knowledge
As a result, staff have felt that they have needed to answer fewer questions about what each dimension statement means.
What did they learn from the data?
Visitors seemed to have really enjoyed their visit but were not always clear in providing the reasons for this. 59% felt the exhibition was interesting and informative (‘Concept’ dimension), and 75% of respondents felt they gained new knowledge (‘Insight’ dimension).
Staff found that, although the Peer Reviewers generally scored the exhibition and event lower than museum staff had hoped, there were still a lot of positive comments and helpful suggestions.
Using this evidence, what change(s) did they make to their offer?
The feedback has encouraged staff to continue reviewing other permanent exhibitions. Staff members have used the feedback collated to successfully apply for additional external funding to develop interpretation alongside the introductory films, with the support of community groups and academics.
Staff felt the feedback confirmed a few issues that they were already aware of. However, it did also give them some detailed feedback on the event that they would not have received without using the Impact & Insight Toolkit. This has encouraged staff to put in place more specific review and feedback for their programming.
How was this change received, either by public or by team members or peers?
The additional changes the museum has made to other gallery spaces and its introductory video have been positively received by the wider team and the museum’s visitors.
The data gathered as a result of the changes has helped the museum with future funding applications, and with evidencing to its wider staff and volunteer teams how much visitors appreciate the new content, which they wish to keep updating going forward.
This has given the staff the confidence to put in a larger and subsequently successful funding application to help them deliver more of this work.
Final thoughts from the organisation itself:
It has embedded the use of visitor evaluation throughout our work. The initial use of the platform was interesting because it embraced the whole visit – arrival, signage, welcome etc – which covered more than we initially thought. This feedback proved useful in better understanding how visitors experience our site. The different ways in which we gathered data has also been a useful learning process.
Image credit – Jes Rodriguez
Organisation 5: Medium Independent Museum
Why did they choose to participate in the project?
There were 3 reasons why the museum wished to participate:
- It wanted to increase staff confidence in carrying out multiple evaluation methods
- It wanted staff to feel confident in looking at, understanding and using the data produced to inform future programming, planning and development
- It wanted feedback on the museum’s orientation, signage and wayfinding within the museum to make informed and useful improvements to the overall visitor’s journey through the museum to encourage repeat visits
What challenges did they experience and how did they overcome them?
The museum recruited a small team of volunteers to support the collection of visitor surveys. This worked very well at the beginning of the exhibition’s run, when the volunteers were at their most enthusiastic. However, fewer volunteers were available to interview visitors during the final weeks of the run, and responses dwindled.
As such, the museum emailed the survey to all visitors who had bought a ticket. The museum found it interesting to compare the response rate from visitors who completed the survey at the museum with a volunteer and those who completed the survey online.
Staff found that, although many visitors completed the survey at the museum, the response rate was relatively low when compared with the overall visitor figures. By contrast, staff received fewer responses from visitors completing the survey online after their visit, but, relative to the number of visitors emailed, the response rate was very high. This has encouraged the museum to maintain the mixture of the two methods going forward.
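To make the difference between response counts and response rates concrete, here is a small worked sketch; the figures are invented for illustration, as the museum’s actual numbers are not given in this study.

```python
# Compare raw response counts with response *rates* for two collection
# methods, using hypothetical figures.

onsite_responses, total_visitors = 200, 5_000  # hypothetical
email_responses, emails_sent = 150, 600        # hypothetical

print(f"On-site: {onsite_responses} responses, "
      f"{onsite_responses / total_visitors:.0%} response rate")  # 4%
print(f"Email:   {email_responses} responses, "
      f"{email_responses / emails_sent:.0%} response rate")      # 25%
```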
For the next exhibition, the museum plans on recruiting a more general volunteer team of ‘Visitor Engagement Volunteers’ who will speak to visitors in the different rooms, find out if they’ve been to the Museum before and ask them to complete the survey, rather than specific ‘Evaluation Volunteers.’ It hopes that the wider remit of the role will encourage volunteers to remain in the role for the duration of the exhibition run.
What did they learn from the data?
Overall, staff found visitors rated the exhibition very highly, sometimes higher than staff had rated it themselves as part of the Self-Assessment. Visitors commented on enjoying the personal family connection of the exhibition and commended its intimacy, with 100% of visitor respondents agreeing or strongly agreeing the exhibition was absorbing and held their attention (‘Captivation’ dimension).
The museum’s peers rated the exhibition either the same as or higher than the museum had rated itself. However, peers also suggested that the content of the exhibition, and of the whole museum, could be more accessible for its visitors; they suggested the museum consider its text sizes and produce large print materials, multilingual resources and resources for families.
Visitor and peer feedback also noted that, within the museum, visitors would be unaware of any of the associated events the museum was putting on, or of how to donate or become a Member or Patron.
Using this evidence, what change(s) did they make to their offer?
To increase the accessibility of the museum’s content to younger audiences as well as those visitors whose first language was not English, staff developed and printed a short Exhibition Guide in 3 languages: English, Spanish and Portuguese. Staff also developed a letter writing and drawing activity for visitors to complete and then display outside of the exhibition for visitors of all ages.
Furthermore, staff wished to address the need to provide information to visitors about upcoming events. As such, the museum purchased a digital screen for the entry space to share information on its current exhibition, upcoming exhibitions, associated and general events, weekly tours, and membership and patron schemes.
Lastly, the museum decided to trial a new in-house marketing tool, Beaconstac, for a year, to explore a new interpretation method, think about new ways to help visitors discover its collections, and measure visitor traffic and engagement. The tool also enables staff to monitor the locations at which visitors access the various QR codes most often.
How was this change received, either by public or by team members or peers?
Staff decided to incorporate additional questions within the Audience Survey to evaluate how many of its respondents had seen and used the multilingual Exhibition Guides and completed the activity. Feedback collected suggested that they were well received by visitors.
The Exhibition Guide translations have helped to make a variety of visitors feel welcome, as many of them have been posting about it on social media and tagging the museum. As such, the museum hopes to continue to plan multilingual guides for its future exhibitions.
Hundreds of visitors also wrote letters and made drawings, which were displayed as part of the interactive exhibition activity. Many were created by adults, but there was also a significant contribution from students in visiting school groups (approx. ages 12-19).
Due to the popularity of this kind of activity, staff plan to include an interactive element in future exhibitions. They have also been inspired to work on an activity booklet for younger children visiting the museum (ages 5-11), to programme more family-friendly events for visitors, and to create a child-friendly activity sheet.
Integrating a new marketing tool has helped staff identify what visitors are most interested in, which locations work best for on-site marketing, where the best spots are for encouraging visitors to provide feedback, and how they can give visitors opportunities to discover more and dig deeper into topics of particular interest by linking objects to further online content.
Final thoughts from the organisation itself:
We are learning that feedback is an opportunity for growth and should be embedded into future planning to ensure continued success of our work. We have a better understanding of what visitors want to see from us long term and what would make visitors consider visiting us again and have more confidence in exploring this.
Image credit – Hongbin
Organisation 6: Large Independent Museum
Why did they choose to participate in the project?
There were 2 reasons why the museum wished to participate:
- Staff wanted to use the programme to provide career development opportunities for its team and enable them to get more involved in undertaking and understanding evaluation
- Staff were keen to learn from the associated evaluation collected as part of the programme and better understand how to use the data going forward
What challenges did they experience and how did they overcome them?
The museum was keen to try using QR codes to enable visitors to complete an online survey, to help reduce staff time spent conducting surveys. However, staff found that many of their visitors, being of an older demographic, preferred using a paper version. This was exacerbated by the museum’s minimal WiFi coverage: even those who felt comfortable using the QR code did not have a signal with which to complete the survey.
As such, staff reverted to collecting paper surveys and inputting the data into the Impact & Insight Toolkit. Whilst this was more labour-intensive, staff found it a relatively straightforward process, and the online platform was very good for compiling the results and creating visual diagrams to share with other staff and the Board of Trustees.
The Culture Counts platform has since been developed, and the museum can now undertake interviews with its visitors using a tablet without an active internet connection. The surveys will be stored and then uploaded at a later stage when the device is connected to WiFi. The museum plans to explore this option further for subsequent exhibitions.
What did they learn from the data?
Audiences scored the exhibition highly across all the dimensions, with 91% of respondents either agreeing or strongly agreeing the exhibition was relevant to modern society (‘Relevance’ dimension).
However, although 84% of responding visitors either agreed or strongly agreed that the updates to the permanent exhibition had been well produced and presented (‘Presentation’ dimension), when looking at the reasons why the visitor had given them their score, 31% indicated they wanted more…
Staff found that many of the comments reiterated findings from other surveys and interviews conducted with their audiences, which staff were already aware of, reinforcing the need for improved wayfinding around the museum.
In a similar way, staff found the Peer Reviewers highlighted things they were already aware of. However, overall, the feedback was really positive, with all the Peer Reviewers stating that they had enjoyed their visit and found it educational. Front of House staff were spoken of very highly, and staff were able to share this across the organisation, improving morale in these key customer service roles.
Using this evidence, what change(s) did they make to their offer?
As a result of the data and feedback, staff looked at commissioning a designer to improve the wayfinding experience of visitors.
How was this change received, either by public or by team members or peers?
Not only have visitors responded positively to the changes, but staff also feel the changes have made the overall visitor experience much better and more seamless. They have also reduced the time staff need to spend explaining the layout to visitors on arrival, which has been very positively received by those manning the Front Desk.
Final thoughts from the organisation itself:
The introduction of the Impact & Insight Toolkit has allowed us to explore a programme for collecting responses, which we have not had access to before. The mixture of self, peer and audience surveys all gathered together in one place has allowed us to take into account more feedback than we would have before, as well as allowing us to see what our own expectations are against what our peers and audiences think…. [The Platform] has been good for compiling all of the results into one place and for creating visual diagrams of the results.
Image credit – Jingxi Lau
A few closing words from Counting What Counts
We have really enjoyed working with MDL and the consortium of museums, and we look forward to continuing with the next cohort. Working alongside Rachael to develop an evaluation approach that provides the museums with helpful insight and the opportunity to apply for additional funding has been overwhelmingly positive.
We are encouraged that these non-NPO museums have found the experience advantageous, and we hope that this case study has provided guidance to others that are looking to evaluate within a consortium.
Thank you to Rachael and the Museum Development team for their work during the programme and their contribution to this study.
Image credit – Tomas Anton Escobar
[i] What makes a museum ‘small’, ‘medium’ or ‘large’?
These size categories correspond to those used in MD’s Annual Museum Survey, which bases them on the latest data on visit numbers; in recent years, they have been based on pre-pandemic visitor numbers. The categories are: small (between 10,000 and 20,000 visitors per annum), medium (between 20,000 and 50,000 visitors per annum) and large (between 50,000 and 100,000 visitors per annum).
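As a trivial illustration of these bands, here is a minimal sketch; the boundary handling (which band a count of exactly 20,000 or 50,000 falls into) is an assumption, since the source only gives the ranges.

```python
# Map an annual visit count onto the size bands defined above.
# Counts outside 10,000-100,000 are left uncategorised, as the source
# defines only these three bands.

def size_category(annual_visits: int):
    if 10_000 <= annual_visits < 20_000:
        return "small"
    if 20_000 <= annual_visits < 50_000:
        return "medium"
    if 50_000 <= annual_visits <= 100_000:
        return "large"
    return None

print(size_category(35_000))  # medium
```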
[ii] MD focusses its support on museums participating in, or working towards, Accreditation (the UK industry standard) which are not in receipt of funding directly from government (national museums) and are not one of Arts Council England’s National Portfolio Organisations. The museums supported by MD range from small heritage sites run by volunteers, to specialist university museums, local authority services with encyclopaedic collections, historic houses, art galleries and independent museums. MD’s support includes advice, guidance, training, programmes and the administration of small grants for museums. The aim of these interventions is to help the museums it supports to strengthen their governance, collections management, learning and engagement activities, workforce and commercial operations, and thereby increase the quality of their work and their relevance to the communities they serve.
[iii] We developed two standardised templates to be used across the National Pilot Programme – one for exhibitions and one for activities or events. The 8 dimensions consistently used were:
- Concept – It was an interesting idea
- Presentation – It was well produced and presented
- Captivation – It was absorbing and held my attention
- Insight – It helped me gain new insight or knowledge
- Relevance – It had something to say about modern society
- Enthusiasm – I would come to something like this again
- Excellence – It was one of the best examples of its type I/they have experienced
- Skills – I gained new skills
[iv] ACE’s Let’s Create Ambition and Quality Investment Principle is defined as having 2 key components: Ambition – the formulation and articulation of what you want to achieve and how you plan to achieve it; and Quality – the delivery of your activity and the evaluation of it against your ambitions.
[v] The Impact & Insight Toolkit’s comparable data sets can be found here.
[vi] The importance of question order has been researched and the findings are explained in the blogpost, “Why are respondents not completing your survey?”.
Featured image: Photo by Frans Ruiter on Unsplash