Top tips for a successful evaluation process
Interested in maximising the success of your outreach evaluation? These tips have been divided into categories according to the various stages of conducting an evaluation, though we do recommend reading them all at the start!
- Planning your evaluation
- Collecting your data
- Analysing your data
- Coming up with your conclusions
- Writing the report
- Sharing your findings
Planning your evaluation
- Make sure your evaluation explores what you’re trying to achieve with your outreach activities, and that it tells you (or others) something useful and/or interesting for planning similar future events. A good starting point is to consider what you want to know and who the evaluation is for (e.g. yourself, funders or managers).
- Integrate the evaluation into the activity as much as possible, rather than having it as a separate “add-on”. That way participants feel that it’s a natural thing to be part of, rather than something they have to do after the activity is over and they want to leave.
- If you make the evaluation easy and practical for participants to take part in, you will get a better response.
- If possible, try to use more than one method for collecting data. Using different techniques will give you different perspectives and better insight into what happened.
- Only collect the sort of data that you are able to analyse, interpret or display usefully. It is a good idea to think about how you will analyse the data before collecting it.
- If you are planning to use online evaluation techniques, make sure the available internet connection/wifi is reliable and fast enough.
- Keep it simple! Don’t try to evaluate everything, especially not at first. It is much better to do one thing well – use one method, address a straightforward question – than to try to do too much.
Collecting your data
- Make it fun – for both the participants and you! If people are enjoying doing the evaluation they are more likely to take their time and provide meaningful responses. For this reason we’ve tried to include a number of engaging tools within this Toolkit.
- Keep the instruments short! This is not a school exam and if the evaluation takes too long (or even looks too long), people simply won’t do it!
- Make sure the language used in the tools is appropriate to the reading level of your audience. For instance, for primary school children, sentences need to be kept short and simple. In fact, it’s always better to err on the side of simplicity – adult audiences appreciate this too and it makes the evaluation quicker. You may want to consider using a reading age calculator to test how complicated your text is.
- Pilot your questions and instruments in advance – it’s worth the time it will take. Have three or four people who are similar to your intended audience (e.g. primary school students, members of the general public) try your tool to make sure they understand what is being asked. These people can be family members or friends but should NOT be colleagues or others with similar scientific backgrounds.
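If you don’t have a reading age calculator to hand, a standard readability formula such as the Flesch–Kincaid grade level gives a quick rough check. The sketch below is a minimal, hypothetical implementation: the syllable counter is a simple vowel-group heuristic, so treat the result as an estimate rather than a precise reading age.

```python
import re

def syllable_count(word):
    """Rough syllable estimate: count vowel groups, with a silent-'e' adjustment."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1 and not word.endswith("le"):
        count -= 1  # treat a trailing silent 'e' as non-syllabic
    return max(count, 1)

def flesch_kincaid_grade(text):
    """Flesch-Kincaid grade: 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(syllable_count(w) for w in words)
    return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59

# Short, simple sentences score low (roughly the school grade needed to read them)
grade = flesch_kincaid_grade("We watched the stars. The telescope was easy to use.")
print(f"Estimated grade level: {grade:.1f}")
```

A score of around 3 corresponds roughly to text a primary school child could read; long questionnaire items aimed at the general public often score far higher than their authors expect.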
Analysing your data
- Don’t simply dump unprocessed data into a Word doc and call it a ‘report’. This will make it harder for a reader to comprehend, and may mean that your overall successes get lost. Providing clear indications as to the meaning of your data will ensure that your findings have greater impact.
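Even a very small amount of processing makes raw responses far easier to read. As a hypothetical example (the question wording and ratings below are invented for illustration), a list of 1–5 enjoyment ratings can be turned into a count per rating and an average before it goes anywhere near the report:

```python
from collections import Counter

# Hypothetical responses to "How much did you enjoy the activity?"
# (1 = not at all, 5 = very much)
responses = [5, 4, 4, 5, 3, 5, 4, 2, 5, 4]

counts = Counter(responses)           # how many participants gave each rating
mean = sum(responses) / len(responses)

print(f"n = {len(responses)}, mean rating = {mean:.1f}")
for rating in sorted(counts, reverse=True):
    print(f"{rating}: {'#' * counts[rating]} ({counts[rating]})")
```

A summary like this (“8 of 10 participants rated the activity 4 or 5”) conveys your successes at a glance, where a pasted spreadsheet of raw answers would bury them.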
Coming up with your conclusions
- Don’t overclaim from your data! If people say they intend to do something, that does not mean they definitely will. People are also often inclined to provide positive responses (give you the answer they think you want) so it is advisable to be careful in interpreting some responses, especially if they are about future intentions.
- Be careful about generalising from your responses. You usually have no way of knowing whether your respondents were representative of the people who were in the audience overall. Moreover, if your audience is the ‘general public’ they are unlikely to be representative of the wider population. However, the views they express can still be helpful in understanding their experience! Just be careful to talk about your participants’ responses rather than extending your findings to the wider population.
Writing the report
- Include an executive summary in your report. This will increase the chances that someone (other than you!) will at least have an overview of your evaluation findings. It can also be a good idea to include different executive summaries for different audiences (colleagues, funders, managers, even your participants), especially if you’re publishing your report online.
- Before starting your report, ask yourself who you need to convince, and of what. Then write your report to address that question.
- Use photos and quotes in your report to convey a real sense of participants’ experiences.
Sharing your findings
- Think about how you can maximise publicity of your findings within your organisation and beyond it. In addition to circulating the report itself, can you publicise the evaluation via newsletters, blogs and/or journal articles?