Evaluating widening participation work is a difficult endeavour! There is a careful balance to strike between the practical realities of ‘evaluation form apathy’ and the need to continually produce evidence of impact. There are also additional questions around which forms of evidence count and whether these effectively capture what is meaningful to young people or your organisation.
Drawing on research into evaluating widening participation and outreach, we have identified 10 things you might wish to consider when planning your evaluation strategy:
#1: Research has identified the importance of listening to the voices of learners in designing widening participation and outreach activities (Gazeley et al., 2017; Thomas, 2000). Provide space in your evaluation for learners not only to give feedback on the specific activity but also to offer their own perspective on what they would like to see in other widening participation activities. Value these contributions and, where appropriate, take them forward.
#2: Shift the evaluative gaze and think more broadly about who you are evaluating. The focus is quite often only on the learners themselves, but others, such as teachers, parents and carers, may also have a lot to contribute in terms of what might work for supporting young people. For example, Stone de Guzmán (2017) explored the influential role of parents in shaping young people’s decision making, suggesting value in evaluating ‘what works’ for them too.
#3: In order to evidence the impact of the work you are undertaking, where possible, establish a baseline for the point from which you are starting. While you cannot always attribute change to particular initiatives, a baseline can provide a useful headline indicator of your work’s value. This is most often done through statistics but, as Harrison and Waller (2017b) argue, there is value in looking beyond numerical data for other ways to measure impact.
#4: Identify your theory of change. Moore et al. (2013) identified the need for robust evaluation processes to be built into outreach activities from the beginning that focus on targeting and measuring change. Consider: What do you want to happen as a result of doing this activity? How will you know it has worked for individuals and for your organisation? What measures (short and long term) might you employ as evidence? How will the evaluation evidence you collect inform your future practice? For example, if your work aims to increase young people’s grit and resilience, consider what this might look like, how it emerges for young people and how it might be shaped and changed in relation to others, including (but not limited to) the work you do.
#5: Consider your evaluation scope. While it is important to consider the long-term implications of the work you do, it may be more realistic and effective to focus on smaller, targeted key outcomes. For example, Harrison and Waller (2017a) recommend emphasising the value of ‘small steps’ achievements for each intervention, rather than aiming to prove large-scale transformations over time.
#6: Focus on what it is important to know rather than what might be nice to find out. For example, evaluation form questions about how well a specific session went might measure the venue or the quality of the speakers, but may not tell you much about the value of the initiative for the young person. Scrutinise each evaluation question: what data will it produce, what value does that data have for the work you do, and what changes might be made as a result?
#7: Think creatively about evaluation methods. You don’t have to get students to fill in a form. You can create an online quiz with 2 or 3 questions that students can complete on their mobile phones. Or you could ask them to draw or take photos capturing their perspectives and use these as prompts for discussion. Prioritise meaningful, rather than convenient, data.
#8: Think broadly and consider some of the unintended, as well as the intended, benefits of the work you do. For example, while projects ultimately aim to increase the participation of NCOP learners in higher education, they may also raise learners’ confidence or enable them to effect change in others, e.g. via mentoring schemes.
#9: Explore the role of different aspects of learner identities in shaping learners’ engagement with outreach activities. A young person’s home postcode is only a small part of who they are. Consider other factors such as age, gender, ethnicity, disability and other relevant life experiences, and how these interrelate. Question who is missing from the data, as well as who is included.
#10: Nuance the findings. Your project may appear successful on one reading, but interrogate which elements worked and for whom to gain a deeper understanding of the impact of your work on diverse learners.
Please get in touch with us if you have any other guidance to add.
- Gazeley, L., Hinton-Smith, T. and Shepherd, J. (2017). Overcoming the barriers to working-class students’ participation in higher education. Brighton: Sussex Learning Network.
- Harrison, N. and Waller, R. (2017a). Evaluating outreach activities: overcoming challenges through a realist ‘small steps’ approach, Perspectives: Policy and Practice in Higher Education, 21 (2-3), pp. 81–87.
- Harrison, N. and Waller, R. (2017b). Success and Impact in Widening Participation Policy: What Works and How Do We Know? Higher Education Policy, 30, pp. 141–160.
- Moore, J., Sanders, J. and Higham, L. (2013). Literature review of research into widening participation to higher education. Bristol: HEFCE.
- Stone de Guzmán, V. (2017). Redressing the balance: Research exploring views of university engagement activities and how they could be developed in the future. Brighton: Sussex Learning Network.
- Thomas, L. (2000). “Bums on Seats”; or “Listening to Voices”: Evaluating widening participation initiatives using participatory action research, Studies in Continuing Education, 22 (1), pp. 95–113.
*If you cannot access these articles but would like to, please contact us*