Evaluation Toolkit – Choosing the right tool

Once you have decided exactly what you want to find out and taken into account the available facilities and possible limitations, it is time to choose the most appropriate evaluation tool.

Think before you choose

Until you have answered all the questions in the chapter on planning your evaluation tool, it is all too easy to select a data collection method that does not deliver what you want.

For example, you might go to all the trouble of compiling and distributing a questionnaire, only to find that the results don’t tell you very much, when an alternative such as Tool 6 – Snapshot Interviews might have given you more specific, focused feedback. Worse still, your audience may have become bored or even frustrated by filling out “yet another” questionnaire, detracting from their experience of an otherwise successful event. Once you’ve answered the questions above, you’re in a much better position to decide which methods will be best for your activity or project.

Pathways to choosing the right tool

To help you select the most appropriate tool for your planned outreach activity, we have prepared an overview table at the end of this section
comparing their features. Based on feedback from members of the Europlanet community, we have listed the tools in groups according to when they might best be used (during, beginning/end, or after an event).

Depending on your interests, you may find other pathways more helpful. For example, if you know your audience is ‘Primary School’ students then
you can use that row in the overview table to help narrow down which tools are most appropriate.

Other possible selection criteria that are included
in the overview table are:

  • Suitable audience(s)
  • Appropriate activity type(s) e.g. lecture presentation
    or drop-in workshop
  • Response type e.g. online, written, verbal and
    whether it is short or long
  • Time required to prepare the evaluation methods
  • Time required to implement the evaluation
  • Time required to analyse the data collected
  • Sort of information gained e.g. what works /
    suggested improvements / evidence of changes
  • Most relevant Generic Learning Outcomes

Tailoring and testing your chosen tool

Once you have chosen the tool you think is most appropriate for your purpose, you may need to make some adjustments depending on, for example, your audience or available resources.

Aim to integrate the evaluation into the activity as much as possible, rather than have it as a separate “add-on”. That way, participants feel that it’s a natural thing to be part of, rather than something they have to do after the activity is over and they want to leave.

Make sure the language used in the tools is appropriate to the reading level of your audience. For instance, for primary school children, sentences need to be kept short and simple. In fact, it’s always better to err on the side of simplicity – adult audiences appreciate this too and it makes the evaluation quicker. You may want to consider using a reading age calculator (e.g. http://datayze.com/readability-analyzer.php) to test how complicated your text is.
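If you prefer to check readability offline, one widely used metric is the Flesch Reading Ease score, which combines average sentence length and average syllables per word. The sketch below is an illustration only: the syllable counter is a rough heuristic, and the linked analyzer may use a different method.

```python
import re


def count_syllables(word: str) -> int:
    """Rough syllable count: runs of vowels, minus one trailing silent 'e'."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1  # e.g. "simple" has a silent final 'e'
    return max(count, 1)


def flesch_reading_ease(text: str) -> float:
    """Flesch Reading Ease: higher means easier.

    Roughly, scores above 90 suit young readers; below 50 is college-level.
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))
```

A quick comparison shows the idea: `flesch_reading_ease("The cat sat on the mat.")` scores well above 90, while a jargon-heavy sentence scores far lower, flagging it for simplification.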

If people are enjoying doing the evaluation, they are more likely to take their time and provide meaningful responses, so make it fun, for the
participants and for you! For this reason we’ve tried to include a number of engaging tools within this Toolkit. Similarly, if the evaluation takes too long (or even looks too long), people simply won’t do it, so keep the instruments short – this is not a school exam!

It is also a good idea to pilot your questions and instruments in advance – it’s worth the time it will take. Have three or four people who are similar
to your intended audience (e.g. primary school students, members of the general public) try your tool to make sure they understand what is being asked. These people can be family members or friends but should NOT be colleagues or others with similar scientific backgrounds.

Finally, do consider using multiple tools to gain different perspectives, or even as complementary stages in the process. For example, using Tool 9 – 3 Words at a pilot event with a smaller group might help you identify which questions to include in a pre-post quiz (perhaps using Tool 3 – Mentimeter or similar) for later events with larger audiences.
