Evaluation tool 11 – Post-event surveys

What is this tool?

Post-event surveys are self-completion questionnaires that are used immediately after an event, workshop or programme to help you understand the experience of your audience (what they thought of the experience, what they feel they have gained from it, what they think could be improved etc).

What kind of activities can I evaluate with it?

Post-event surveys are particularly effective for use with school groups and interested adults and for evaluating online activities. They are also suitable for use with the general public in interactive workshops, ongoing series of events (courses, astronomy clubs) and lectures / presentations.

Post-event survey at a glance…

Who: Primary school (particularly suited); secondary school (particularly suited); interested adults (particularly suited); general public
What: Interactive workshop; ongoing series (clubs, courses, etc.); lecture/presentation; online (particularly suited)
Data: Online long response; online short response; writing short response; writing long response; multiple choice; drawing/visual
Time: Preparation – long (week before); respondent completion time (implementation) – long; analysis – long (day/s)
Gain: What works/improvements; immediate reactions; evidence of changes that occurred; misconceptions held
GLOs (Generic Learning Outcomes): Knowledge & understanding; attitude & values (particularly suited); enjoyment, inspiration & creativity; skills; behaviour & progression

When should I use it?

This tool is best suited to use at the end of (or after) an event.

What do I need?

  • Post-event surveys can be delivered on paper or online.
  • If you decide to use paper versions, you’ll need to prepare plenty of copies in advance and have pencils or pens available at your event, as well as flat surfaces to write on (e.g. clipboards or tables).
  • Time set aside at the end of the event when people can fill in the survey!

Let’s get started…

Prior to the event, you need to create the survey. See our Steps to choosing the right tools advice to help you do this effectively.  For example, your survey items (questions) should relate directly to your aims for the activity you are evaluating and should, ideally, provide you with information you might find useful for improving your practice or feeding back to a funder. Because it is important to keep surveys as short as possible, it is best to leave off questions related to, for instance, background knowledge of the topic.

To start, consider what you want to find out from the survey. For instance, you might want to cover participants’ overall experience of an activity (e.g. whether they enjoyed it, found it interesting or dull). You may also want to find out about their experience of specific elements of a broader activity (e.g. whether particular elements were clear or confusing, whether they felt they could contribute to a discussion) or whether they felt they learned about a particular topic or felt more confident in their understanding. You might also want to know about their motivations for participating (i.e. why they had decided to participate in a particular event) or whether their expectations for the event were met.

We’ve outlined the main types of questions you might want to consider using below; as you’ll see in our case study example, most surveys combine multiple different question types in order to capture a broad range of information.

Question types

There are many different question types you can use; some of the most common are listed below (see the pre-post quiz section for other question types):

Yes/No questions:

Have you been to the space expo before? Yes/No

Rating scales:

On a scale of 1 to 5, where 1 is not interested at all and 5 is very interested, how interested are you in going to another talk about supernovas?

1     2     3     4     5

How clear or confusing were various parts of today’s talk? For example:

Introduction:     Very confusing     Confusing     Neither     Clear     Very clear

How much do you agree or disagree with the following statements? For example:

Today’s speaker was an expert:     Strongly disagree     Disagree     Neither     Agree     Strongly agree

Please note – the above statements are broad examples. It is best to tailor your questions to your specific event or elements of an activity, to make it as clear as possible what you’re asking. In addition, it is very important that rating scale response options are balanced – that there are as many negative response options as positive (you’ll see that the choices in the examples above are balanced). In contrast, the options ‘really strongly agree, strongly agree, agree, neither, disagree’ are skewed in a positive direction.

Open-ended questions:

These questions give your respondents a bit more flexibility in how they answer. However, it is also important to keep them as short as possible to ensure they are clear. For instance:

What would be one thing you would change about the show?

What would be one thing you would keep the same?

When creating open-ended questions, it is important to keep them as clear and concise as possible, or people will not respond. Even then, respondents often skip open-ended questions, so if you decide to include them, they should not be critical to what you are trying to find out.

Demographic questions:
You may also want to include demographic questions such as:

What is your gender?

Male     Female     Other     Prefer not to say

Which age bracket do you fall into?

18-24     25-39     40-54     55+

These kinds of questions will give you some insight into who attended the event and their experiences. However, because not everyone will respond to the questionnaire, they may not provide an accurate reflection of participants. In addition, these questions can be perceived as intrusive, so unless you need to ask them, you may want to leave them off. If you do decide to include them, it is advisable to keep them to a minimum.

Types of questions to avoid

Writing good questions – ones that are clear and concise and that provide data that will be useful – is not easy. Here are some example questions that might seem fine but are actually problematic:

Loaded or leading questions (where it is obvious what kind of answer you want):

Was this the most exciting show you’ve ever been to? Yes/No

Are the public well informed about scientific developments or are scientists deliberately keeping them in the dark?

Planetary scientists often use most of their research funding to throw all night parties where they take lots of illegal drugs. Should the government increase research funding for planetary science?

Double-barrelled questions (questions that look like a single question but are actually two):

Should the EU cut spending for planetary science and increase spending for health care?

Do you like watching TV documentaries and attending lectures about planetary science? (Someone might like watching TV documentaries but have no interest in attending lectures.)

Iceberg questions (questions that look simple but are not as clear as they seem):

Do you think space exploration should focus on exoplanets? (Question assumes respondents know what exoplanets are.)

Is it safe for research to be funded by industry? (Which research? Which industrial companies? What do we mean by ‘safe’? For whom?)

Hypothetical questions (questions asking respondents to predict future behaviours – you have no way of knowing whether people will follow through):

Will you buy a telescope to go stargazing after visiting our event?

OK, what do I do with my data now?

You’ll need to start by entering your data into a program for analysis. Excel (or a free alternative such as Google Sheets) is generally fine, or you can use a dedicated statistics package such as SPSS if you are familiar with it. Allow one row per respondent, with each question having its own column. For some kinds of questions, you may want to give each response option its own column indicating simply whether or not that option was selected – this is especially useful for multiple-response questions, where respondents can select several options from a list. (We’ve provided a detailed example in our case study.)

You may find it easier to work with the data if you ‘score’ it. For example, 1 = yes, 0 = no; or 1 = strongly disagree, 5 = strongly agree. You can then compare the percentage responding in particular ways to each question (e.g. the percentage strongly agreeing, agreeing, disagreeing etc. with particular statements, or responding yes/no to questions). Drawing on some of the questions above (and a couple of other multiple-response questions relating to visitor experiences, such as their motivations for visiting), let’s assume the first respondent gave the following answers (scores indicated in brackets where appropriate):

How interested are you in going to another talk about supernovas? 3

How clear or confusing were various parts of today’s talk?

Introduction: Clear (4)

Today’s speaker was an expert: Agree (4)

Visit motivation – family: Not selected (0)

Visit motivation – interest: Selected (1)

The beginning of your spreadsheet would then look like the following: 

Respondent | Interested in another talk | Intro clarity | Presenter expert | Visit motivation – family | Visit motivation – interest
1          | 3                          | 4             | 4                | 0                         | 1
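To make the scoring concrete, here is a minimal Python sketch of how scored responses can be tallied into percentages once entered. The column names are hypothetical; the first row matches the example respondent above, and the other two rows are invented purely for illustration:

```python
# Illustrative only: one dict per respondent, keys mirroring the
# spreadsheet columns above. The first row is the example respondent;
# the other two are invented so the percentages have something to count.
responses = [
    {"interested": 3, "intro_clarity": 4, "expert": 4, "motive_family": 0, "motive_interest": 1},
    {"interested": 5, "intro_clarity": 5, "expert": 4, "motive_family": 1, "motive_interest": 1},
    {"interested": 2, "intro_clarity": 3, "expert": 2, "motive_family": 0, "motive_interest": 1},
]

def percent_selected(rows, column):
    """Percentage of respondents selecting a 1/0 (selected / not selected) option."""
    return 100 * sum(r[column] for r in rows) / len(rows)

def percent_at_least(rows, column, threshold):
    """Percentage scoring at or above a threshold on a 1-5 rating scale
    (e.g. threshold 4 captures 'agree' plus 'strongly agree')."""
    return 100 * sum(1 for r in rows if r[column] >= threshold) / len(rows)

print(percent_selected(responses, "motive_interest"))       # → 100.0
print(round(percent_at_least(responses, "expert", 4), 1))   # → 66.7
```

The same tallies can of course be done directly in a spreadsheet (e.g. with COUNTIF-style formulas); the code simply shows the logic behind them.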

If you have collected some background information (e.g. about gender), you can use this as part of your analyses. For example, you could compare the percentages of males and females agreeing with a particular statement. This could include findings such as: ‘8 of 10 females but only 3 of 12 males agreed or strongly agreed that the introduction was clear’. 
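A group comparison of that kind can be sketched the same way. This is illustrative only: the 'gender' and 'intro_clear' column names and all of the scores below are invented for the example:

```python
# Hypothetical scored rows: gender plus a 1-5 agreement score for
# "the introduction was clear" (4 = agree, 5 = strongly agree).
rows = [
    {"gender": "female", "intro_clear": 5},
    {"gender": "female", "intro_clear": 4},
    {"gender": "male", "intro_clear": 2},
    {"gender": "male", "intro_clear": 4},
    {"gender": "male", "intro_clear": 3},
]

def agree_count(rows, group):
    """Return (number agreeing or strongly agreeing, group size)."""
    subset = [r for r in rows if r["gender"] == group]
    agreed = sum(1 for r in subset if r["intro_clear"] >= 4)
    return agreed, len(subset)

for group in ("female", "male"):
    agreed, total = agree_count(rows, group)
    print(f"{agreed} of {total} {group} respondents agreed the introduction was clear")
```

Reporting counts alongside group sizes ('2 of 2', '1 of 3'), rather than bare percentages, makes it obvious when a group is too small to generalise from.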

Given the recommended length of the survey (short!) and likely sample sizes (quite small), it is unlikely that the data will meet the criteria for statistical testing. However, with well-constructed questions, responses can still be useful. At the same time, it is important not to overclaim from your data. For instance, if people say they intend to do something, that does not mean they definitely will.

In addition, even without leading questions, there is often a positive response bias in the data. People are often inclined to agree with (positive) statements so it is advisable to interpret responses with a grain of salt.  You have no way of knowing (unless all/nearly all of your participants filled in the survey) how representative the responses you collected are of the people who were in the audience overall. Moreover, if your audience is the ‘general public’ (rather than, say, school students) they are generally unlikely to be representative of the wider population (given that people who usually choose to participate in public events often have an interest in the topic to begin with).

Got it! How can I take this further?

Although paper surveys are most common, post-event surveys can also be created online, for completion via a website (e.g. by providing a weblink), in person using tablets (if you have access to some that can be used at your event, or you’re confident participants will bring their own devices) or on mobile phones. Most modern survey platforms include a mobile-friendly version, though you may have to pay extra for the relevant app or processing software. When selecting an online survey platform, keep in mind whether you’ll have access to the internet during data collection, or whether you need an offline option that you can upload once you are back within wifi range. Zapier provides a useful review of free online tools to create forms, and apps to capture survey data.

Download Worksheet

Download the Post-event survey worksheet

Case Study

This Case Study uses Tool 11 – Post-event surveys to evaluate a festival of astronomy at a university.

