With very special thanks to Beth Wells at the Royal Liverpool Philharmonic for sharing her results and insight for this case study.
Why did you sign up?
To test the Quality Metrics, we surveyed three very different events, all of them Royal Liverpool Philharmonic Orchestra concerts.
- Petrenko’s Shostakovich – this was very much a “standard” performance for us; the Orchestra are highly regarded for performing Russian music, especially with our Chief Conductor Vasily Petrenko.
- Santa’s Sleigh on Hope Street is our annual festive Orchestral Family performance aimed predominantly at 4-10 year olds and their families.
- Sixties’ Valentine – the Orchestra performing a selection of love songs from the 60s (such as Tom Jones, Cilla Black and the Beach Boys) with West End Singers.
What was the most challenging element of the process?
In many ways, collecting audience responses was the easiest part: we run a box office system and send a post-performance email for every event as a matter of course, so we simply included a link to the survey in that.
In hindsight, I would think twice about the time of year for the performances, as finding peer assessors for a daytime performance on the last weekend before Christmas, and for the Saturday night closest to Valentine’s Day, was challenging to say the least!
We struggled with some of the wording of the metrics for Petrenko’s Shostakovich. The Orchestra celebrated its 175th anniversary last year, and we consider ourselves quite central to life in Liverpool, so when it came to the question around “it was important that it’s happening here”, both internally and externally there was a challenge in distinguishing between it being important that the Orchestra resides in Liverpool and whether that particular performance was important artistically.
What was your most interesting finding?
I think we got the most use out of the concerts where we had spent a lot of time internally working on presentation. Recently there has been a huge amount of internal discussion about how we can improve the presentation of concerts that we know particularly attract new audiences, such as family performances and ones with a more crossover “pops” appeal. For these we have invested more resources in wrap-around activity, additional lighting and more than just the Orchestra on the stage, so this was a good opportunity to have a 360-degree evaluation of them and include some external input. It generated new conversations, since we were able to marry together audience, peer and internal responses and look at them all at once.
For Sixties’ Valentine there had been ticket offers available, so we were aware that this performance had a higher-than-average number of first-time attenders. This was borne out in the results, where the metric for distinctiveness was higher than we would have anticipated, as it was a new experience for many who were there.
Additionally, we can be quite hard on ourselves internally, always feeling as though we fall short. Whilst it’s good to be ambitious, it was great for our artistic team to have some external peers come along to evaluate what we are doing and give their impressions.
What will you do differently next time?
One of our challenges is that the majority of our performances are one-nighters, so when it comes to inviting peer assessors we are not able to be flexible and offer a selection of dates. On the other side of the coin, though, it was nice to have the opportunity to invite peers to attend, as we do not have press nights or review nights, so those opportunities can otherwise be limited.
We carry out a lot of research as a matter of course, with regular audience surveys, but we would definitely use this approach again for the more special “one-off” type events.