Evaluation in the arts is embracing the opportunities presented by 21st-century technology to collect rich information that is meaningful to participants. Examples range from using smartphone GPS to add value to, and measure, young people's experience of an exhibition, to using the photos and videos participants take as part of their day-to-day experience of making a film as the starting point for interviews with them. The proliferation of smartphones means that taking photos and videos while involved in an activity or attending an exhibition is now commonplace. Because it is an extension of normal interaction with an experience, it allows participants to tell the story of their experience without the intrusive sense of that experience being measured.

So there is evidence to suggest that a smartphone in the hands of the participants has the potential to be a powerful tool, but what about a smartphone in the hands of those who are delivering a programme or intervention? Does having a device that allows you to quickly and unobtrusively collect photos, video and sound create fresh opportunities to capture the work of the programme, which can then form the basis of evaluation? If so, what are the risks and rewards of embracing this opportunity?

I believe that smartphones present a potentially revolutionary opportunity to conduct evaluation that blends into participants' experience of the programme. As a volunteer with the charity Cricket Without Boundaries, I have spent the last few months developing "active" evaluation methods for Sports for Development programmes, integrating evaluation into physical activities. Evaluating Sports for Development programmes is notoriously challenging, and in a context where participants are engaged in physical activity, sitting them down to discuss the programme seems to run counter to the principles of the programme itself.

Smartphones are the ideal tool for beginning to capture this information, including information that was already "out there" but difficult to collect. They're lightweight, easy to use, and ubiquitous. Here are some examples of how this method could work in practice in a Sports for Development context.

First, a cricket session that includes discussion of how to protect yourself from HIV. An adapted focus group before or after the session can evaluate participant knowledge and beliefs, which is particularly useful for process evaluation. Participants are asked, "The bat protects our body from the ball in cricket; what can we use to protect our body from HIV?" and encouraged to write their responses onto a plastic cricket bat. The discussion can be filmed or audio recorded, and the final conclusions of each group captured with photography:

[Photos: CWB bats, showing group discussions and the final output]

Second, a session in a girls' empowerment through sport programme focusing on how to overcome personal challenges. Here the activity is more expressly integrated into the session: girls are invited to create targets representing their "personal challenges" and then knock them down. Photographed quickly and easily by busy coaches, these targets can generate key knowledge for process evaluation, helping to ensure the programme is meeting participants' needs:

[Photo: targets in position for the activity]

Finally, and sadly without illustration (for now!), video can be used for more than just capturing focus groups and discussions. It could be an easy way to capture a snapshot of participant knowledge and experience: answers can be collected quickly, then coded and analysed later, without interrupting the activity with a survey or questionnaire. Here's an example idea:

[Image: instructions for a volunteer coach/deliverer]
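Purely as an illustrative sketch of the "coded and analysed later" step, and assuming the short video answers have already been transcribed and tagged with themes in a simple spreadsheet (the file and column names below are hypothetical, not part of any existing tool), the tallying could be as light as:

```python
from collections import Counter
import csv

# Hypothetical file: one row per participant answer captured on video,
# transcribed and tagged with a theme during later analysis.
# Columns: session_id, answer_text, theme
with open("coded_video_answers.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Tally how often each theme appears across all recorded answers.
theme_counts = Counter(row["theme"] for row in rows)

for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} answers")
```

The point is not the code itself but that, once the footage is coded, turning hundreds of quick video answers into a usable summary is a trivial step.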

So what are the risks? Some immediately spring to mind. Firstly, that deliverers are selective about what they capture with their smartphones, only photographing things that cast the programme in a good light. Secondly, that evaluators begin drowning in data because so much information is gathered: thematic analysis of discussions or responses can take hours, and repeated across hundreds of sessions in a large programme the task quickly becomes impossible. A systematic rule for deciding when to collect data could tackle both risks: shuffle a deck of cards and record data from a session every time you draw a heart, or focus on capturing a single activity in every session over a set period.
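As a minimal sketch of that card-deck rule, and purely by way of illustration (the function name and the 25% rate, matching the odds of drawing a heart, are my own assumptions rather than anything from an existing programme), the same decision could be made by a one-line random check:

```python
import random

def should_record_session(sample_rate: float = 13 / 52) -> bool:
    """Decide whether to collect data from this session.

    Drawing a heart from a shuffled deck happens 13 times in 52,
    i.e. a 25% chance, so a single random draw gives the same rule
    without carrying a deck of cards around.
    """
    return random.random() < sample_rate

# Example: at the start of each session, the coach (or an app) checks once.
if should_record_session():
    print("Record photos/video from this session")
else:
    print("Skip data collection this session")
```

Whether it is cards, a coin, or a line of code, the design point is the same: the decision to record is made by the rule, not by the deliverer, which guards against cherry-picking and keeps the volume of data manageable.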

What, then, are the rewards? Ultimately, the use of photo monitoring as a supplementary data collection method in Sports for Development is not, in itself, groundbreaking, and the ideas above are untested and may have flaws. However, as the method becomes easier to implement, the opportunity arises to develop validated, usable tools for collecting evaluation information. And for anyone who has spent time trying to interview a participant while their friends throw a tennis ball around in the distance, that usability is a genuinely exciting prospect.
