Dynamic evaluation in community engagement

We have recently completed work as evaluators on the ACE-funded community engagement project ‘Asset-based Storytelling in Kingston’. The project was led by Kingston Libraries (Fiona Tarn) and professional storyteller Richard Neville, working in partnership with community organisations in the Borough of Kingston: Mind in Kingston, Hestia, Kingston Churches Action on Homelessness, Voices of Hope, Kingston Mencap, Refugee Action Kingston and Balance (Support). It delivered between 6 and 14 storytelling workshops to each of these organisations. The purpose of the project was to develop storytelling skills and curate community resources, bringing together community assets and inviting participants to reflect on their own assets and to see themselves as assets to their community. The project was designed to test Kingston Libraries’ long-term vision to transform into community hubs (see here).

You can read our full report on the project here.

In its collaborative development, adaptive delivery, flat leadership, dynamic evaluation, and co-designed legacy output in the form of a storytelling toolkit, the project provides a model for creative knowledge exchange, and for working with diverse communities more widely. We’ll touch on a few key points from the process of delivering the project before focussing on our role as evaluators. 

Delivering the project

Delivery (and evaluation) of the project was complicated by the range of community organisations involved, representing members with diverse needs: those with learning disabilities (Mencap, Balance), those experiencing mental health issues (Hestia, Mind), women who had experienced abuse (Voices of Hope), people affected by addiction and homelessness (Kingston Churches Action on Homelessness), and those who had experienced dislocation (Refugee Action Kingston).

The range of partners and participants was motivated by the Libraries’ desire to become a community hub for its local communities, including communities that were being underserved by Library services. To cater to these different groups, the storytelling programmes had to be bespoke and co-created with the participants themselves, responding to their existing needs and narrative practices. Richard Neville, an experienced storyteller, tailored each programme to each organisation and the individuals involved, facilitating activities ranging from oral storytelling and puppetry to literary games and life-writing. He displayed sensitivity and adaptability to the changing needs of participants. For example, during sessions held at the Joel Stabilisation Centre for former rough sleepers, Neville would sometimes abandon planned structured activities and instead accompany residents to their favourite spot for a chat, perched on a wall outside the local church. These sessions proved effective: the discussions that emerged produced rich and impactful storytelling, and participants felt heard on their own terms.

Neville’s acute attunement to the participants’ needs, and his sensitive and responsive approach, contributed to the success of the project and maximised benefits for participants. While we bore witness to what Neville did and the positive impact his working methods had on participants, the process asked questions of us as evaluators. There were no fixed outcomes that could be checked off to signal ‘success’ – and it would not have made sense to apply such crude methodologies when the needs of the different organisations and participants were so varied and changeable. Rather, there was a co-exploration of what was important to each organisation and, often, to individual participants engaging in the storytelling sessions, and careful consideration of how this could be supported through tailored activities and even, where possible, tailored environments (spaces conducive to those activities).

Dynamic evaluation

So, how did we go about evaluating a project designed to be variable, changeable and fluid, with no fixed outcomes and, certainly, no quantitative targets?

Unlike traditional evaluators – at worst, observers parachuted in at the end to pass critical judgment on a project – we were involved from the outset, contributing to the project design. The process of evaluating the project thus became a continuous part of the project delivery. We, as the evaluators, were part of the project team, bringing an extra pair of eyes and ears to the sessions while taking part in storytelling activities like any other participant. Negotiating an ‘insider/outsider’ position within the groups, we acted as ‘critical participants’ – an active version of the critical friend – experiencing the activities while, at the same time, experiencing the participants’ engagement in the activities and bearing witness to their creative and developmental process. As insiders, we became familiar to participants, shared our own stories, and contributed to the group’s development. As outsiders, we represented a wider audience (both individual and institutional, as members of Kingston University) that further validated participants’ voices – particularly the voices of those who felt silenced or dismissed by wider society because of who they were or what they had experienced.

Dispensing with logic models and fixed input/output markers of success, we employed a Theory of Change, an approach to evaluation and project management that takes a dynamic, causal view of success, looking at how immediate (change mechanisms), short-term (outcomes) and long-term (impacts) benefits interact. For example, observations of participants engaging in storytelling and gaining confidence in their contributions (change mechanism) led to a sustained sense of achievement and an understanding of the value of their stories (outcomes), contributing to lasting improvements in participants’ personal confidence, wellbeing, social skills, and creative expression (impact).

Our theory of change was a living document, produced in tandem with the rest of the project team and updated as the project progressed, incorporating feedback from partners and participants. This was supported by the flat leadership structure of the project team, where all members were given an equal voice and encouraged to share thoughts on best practice for the project. 

Employing a dynamic evaluation tool allowed us to assess the project on a rolling basis in an iterative cycle, suggesting tweaks or identifying potential issues. For example, we observed a lack of interaction between the different partner organisations and considered this a missed opportunity in terms of one of the project’s aims (developing networks to support and empower partners to generate future storytelling projects). This was discussed in project meetings and addressed through a number of events that brought the groups together.

Co-designed legacy

A key outcome of the project was the creation of a storytelling toolkit to empower community organisations and library staff to facilitate their own storytelling activities. The toolkit represents the possibilities of this type of collaborative community partnership where evaluators are embedded within the project team and engaged with participants. The toolkit draws on the expertise of the professional storyteller, the insights gained through evaluation of the project, and the input of community partners both indirectly, through observations of sessions, and directly, through workshops and interviews. 

You can access the toolkit here.

Please get in touch if you would like to discuss the project further or have ideas for future storytelling activities. 



Image credit: Tom James, Kingston University