How to Design a Course Feedback Form
Knowing what the delegates think of your training course can be extremely valuable as you get to see what works and what needs improving.
In this article, I am going to share a series of important guidelines on designing feedback forms and the reasoning behind them. Whether you are a trainer, work in HR, are in charge of a training department or work for a training agency, you can benefit from the guidelines given here.
This guide is packed with examples, both good and bad, so you can easily design or update your own forms.
12 November 2019
Have you ever received an evaluation form at the end of a course that made you want to leave as fast as possible?
Let’s try to design a form that has the exact opposite effect. Something that makes people enjoy the experience of telling you what they thought of your course and in the process help you improve it.
To find out what works and what doesn’t, I have reviewed hundreds of forms and come up with a comprehensive set of guidelines you can use to design and improve your own feedback forms. There is no such thing as a universal perfect form; it all depends on what you want to achieve and the topic you are teaching.
I am an engineer at heart, and here I am aiming to deconstruct bad feedback forms and formulate how to ‘engineer’ a good form.
By knowing the principles behind good design, you can easily design and update your forms to suit your needs.
Why Feedback Form Design Matters
I usually like teaching by example and here I have included tons of them for you. I have also included a series of action-based guidelines that are easy to follow and the reasoning behind them.
An evaluation form gives you a great opportunity to see what delegates think of your course. Unfortunately, quite often, this gets wasted by putting too much emphasis on asking about the facilities, the refreshments, the lights, etc. rather than focusing on the usefulness of the actual course delivered.
Say we want to engineer a form. Most people just write down the first questions that come to mind, or worse, copy them from another course. You can achieve a lot more by approaching this task methodically.
Form design is an art. By this I don’t just mean the visual aspect; I mean there are many possible elegant solutions, and a lot of bad ones. It is a matter of designing against constraints, namely the space available for questions and the time it takes to fill in the form.
What Is the Point of Giving a Feedback Form?
Ok, let’s quickly review a number of points:
Is It a Good Idea to Have a Feedback Form at All?
A feedback form allows you to see what went well and, crucially, what didn’t. Every time you run a course you get an opportunity to see how you can improve it next time. Do not miss this opportunity.
Is it Really That Bad to Give Them a Simple Form?
The results can mislead you. A bad form can annoy the delegates right at the end of a potentially good course. What is experienced last is remembered first and so the taste of a bad feedback form stays with the delegates. We don’t want that.
When Should You Distribute the Form?
Before you design any feedback forms, you must decide when the form is given to delegates.
This is crucial because you cannot ask just any question irrespective of when the form is given. For example, right after the course you may want to know what they thought of the food, but if you ask a week later, no one will remember what they ate. Try to remember exactly what you ate a week ago and you will see what I mean.
In contrast, let’s consider questions like these:
“What are you doing differently as a result of having gone through the course?”
“Were you unable to put the lessons into practice? If so, why?”
To answer the above questions some time must have passed between the end of the course and the time the forms are given to delegates. They would need time to apply the lessons and only then can they answer such questions.
This is why it is critical to know what you want the form to do for you, before you embark on the design process.
For the purpose of this article, we are going to focus mostly on forms given on the day or immediately after the course delivery. The forms can be paper based or digital. Most of the guidelines here are focused on designing feedback form questions which is the meat of any feedback form design adventure.
What Makes an Evaluation Form Bad
As the old saying goes, there are many more ways to fail than to succeed, and this seems quite applicable to feedback form design.
Four Reasons Why Some Forms Are So Poor
They take a long time to fill in.
- People resist filling in long forms and their frustration can spill into the feedback itself. Sometimes I receive a request saying it will only take 15 minutes to fill in. Fifteen minutes, in this day and age, is an eternity.
They focus on unimportant areas at the expense of more important ones.
- Do you really need to ask three questions about the lighting, the room and the receptionists while ignoring questions on content and delivery?
They have poor visual design.
- Some forms look as if they were designed in the 70s and have not been updated since. Sometimes they force you to select an answer for their own convenience; for example score 9 means not applicable!
They have a confusing layout.
- Delegates don’t quite know where they need to tick or what score they need to give. Any friction leads to frustration which in turn can contaminate the feedback.
In my career, both as a trainer and a learner, I have come across many poor feedback forms. What I don’t want to do in this guide is to go through a process of name and shame; instead, we want to focus on learning what doesn’t work.
The bulk of a feedback form is the questions and so rather than reviewing an entire form, I will walk through a number of poor questions and explain why they are bad.
“Was the trainer prepared?”
Suppose someone scores this as 3 out of 5. What does this mean? In what way was the trainer not prepared enough? Is it about knowing the content? Being organised? Handling questions well?
If you are the trainer and you see this feedback, do you know what to do differently next time to improve your performance? Not really.
In other words, this statement is vague.
When formulating questions, always ask yourself: if someone scores this low, would that tell you something useful that you can act on?
Let’s reformulate this. Ask questions that make your intentions very clear:
How organised was the trainer?
How knowledgeable was the trainer?
“Would you recommend this course to a colleague?”
This sounds more like a marketing question than an evaluation question. Let’s say someone scores this as 4 out of 5. This means the course is not good enough to be recommended, that there is something they didn’t like. But what is that? Do you know what to do about it? Which area needs to be improved? If the score was 2, you would be just as clueless, except that you would know someone really didn’t like your course. You could have just asked the most generic question, “Did you like the course?”, and you would have got the same result.
With this question you have lost an opportunity to learn where you can improve the course.
If you could ask an unlimited number of questions, this question might add some value, but sadly attention spans are limited and you cannot ask hundreds of questions. You must therefore prioritise.
“Was the trainer professional, well-prepared and knowledgeable about the topics at hand?”
If someone scored 3 or 4 out of 5, what do you conclude? They are not happy about something, but was it about being prepared, being knowledgeable or being professional? These are very different, and the score doesn’t tell you which area is the concern.
Your questions must therefore focus on one idea at a time. You cannot save space by squeezing three questions into one.
“Length of the course was sufficient...”
If a delegate scores “strongly disagree”, what does it mean? It could mean the length was too long or too short. You wouldn’t know. As you will see later in this article you can use spectrum questions instead, which allow you to obtain much more useful information.
“The training was not relevant in my role…”
This is far too much text to capture a simple question. More text takes up precious space and also takes longer to read. Multiply this by several questions asked in the same style and you lose a lot of time and space.
It could easily be expressed with a short, direct question instead, such as: “How relevant was the course to your role?”
Now that you have seen some examples of poor feedback questions, let’s see what works.
What Are the Principles of Design for Evaluation Forms?
An effective evaluation form must satisfy three design requirements simultaneously:
Principle 1: Must Be Quick to Fill In
In general, people don’t like long forms that take ages to fill in. The only exception is perhaps personality questionnaires where people actually enjoy sharing personal information about themselves and find it rather therapeutic. A course feedback form is certainly not a personality questionnaire.
At the end of a course, especially a day-long or multi-day one, delegates are tired and ready to leave. You cannot be too demanding at this point or their frustration will contaminate the evaluation itself.
Principle 2: Must Maximise Information Collection
You must get the most useful information per minute spent filling in the form. This requires form engineering and design.
To maximise information collection, include open questions, as you will see shortly.
Principle 3: Must Be Measurable
You should include rated questions with the aim of obtaining a quality score that can be compared with other results. This helps with consistent analysis.
Hence, you should include both qualitative and quantitative questions.
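As a minimal sketch (in Python, with invented question text and scores), the quantitative side lets you reduce each run of a course to numbers you can compare across runs:

```python
# Hypothetical 1-5 ratings collected from one run of a course.
# Question text and scores are made up for illustration.
ratings = {
    "How organised was the trainer?": [5, 4, 4, 5, 3],
    "How knowledgeable was the trainer?": [5, 5, 4, 5, 4],
}

def mean_score(scores):
    """Average a list of 1-5 ratings into one comparable number."""
    return sum(scores) / len(scores)

# One number per question, comparable against past runs of the course.
summary = {q: round(mean_score(s), 2) for q, s in ratings.items()}
```

The open-ended (qualitative) answers cannot be summarised this way, which is exactly why you need both kinds of question.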
What Do We Want to Get Out of a Feedback Form?
Feedback form design starts by deciding what you are aiming to get by distributing the form.
At the end of the day, a feedback form is for you to improve your training. As such you should expect and encourage constructive criticism. Everyone likes to be praised, but if people only say you are good, it is a missed opportunity to see how you can improve.
At the same time, you want the delegates to leave thinking that they have attended a great course. Yes, the training course can be improved here and there, but only just.
A feedback form therefore has to be subtle; it needs to collect valuable information while also reinforcing that the course was great.
Even if you are giving the form a few days after the course, an opportunity to reinforce the idea that the course was good is never a bad thing, so long as you know how to improve it even more.
Let’s see how such a form can be designed.
How to Distribute Feedback Forms
There are several methods you can use to distribute the feedback form to delegates. Each method leads to a specific design.
You should never use the same form design irrespective of how you distribute it to delegates.
Your design is dictated by the way you distribute the form. If you design the form first and only then think about distribution, you will most certainly have to go back and redesign it.
Paper
- Includes any paper-based forms.
Digital
- Includes all tech-based methods such as direct questions in an e-mail, a PDF questionnaire attachment, sharing a link to a web page with forms and so on.
Now let’s look at pros and cons of each distribution method briefly:
Give a paper form to delegates to fill in at the end of a course
- They get to fill in the form there and then. There is no way out of it and you get everyone’s feedback straight away.
- However, it might be a bit slower to write and you cannot validate the results automatically.
Give a paper form to delegates to fill in and hand over to the office later on
- You hope to get anonymous feedback here. The form is designed with this in mind.
- Anonymous responses can encourage honest answers but will certainly add a layer of complexity beyond giving a simple form at the end of a course.
Send a link to their e-mail and ask them to fill in a form online
- Results can be validated easily. You can also use fancier scoring systems, checkboxes, combo boxes, bullet choices and so on.
- The forms can be updated easily over time.
- However, if results are generated automatically, including those from previous forms, new designs can break this analysis. This often leads to resistance to changing the form design, even though updating it is crucial as your course evolves.
There is much more to this topic, but let’s press on as we want to focus more on question design.
Should Course Evaluation Be Anonymous?
You should also decide if feedback is anonymous.
Consider two points:
You want to get honest answers
- All things being equal, it is generally better to get anonymous feedback for more honest views.
You want to know what each delegate thought of the course
- Sometimes, it is useful to see what each delegate thinks of your course. Since you already know how a particular delegate performed during the course, you may want to know what feedback that specific person gives. For example, you might have a good student who knows the subject better than the others. Their feedback, in relation to what you know about them, can be quite useful. When a good student complains about something, we pay more attention and try to address their concern.
You must therefore design your form based on this decision. For example, if it needs to be anonymous you cannot include questions that encourage leaving clues that might reveal the identity of the delegates. Suppose your main aim is to collect honest feedback. Once delegates sense they can be recognised, they may not share sensitive thoughts and you will lose the opportunity to obtain them.
Anonymous Collection Trick
At the end of the course, distribute paper forms and blank envelopes. Ask the delegates to fill in the forms anonymously and seal them in an envelope. You then pass the envelopes to the office, who read and analyse the results.
The benefit of this simple technique is that you get to collect all feedback forms there and then while still allowing delegates to share their opinions anonymously, hoping to get honest results.
What Are the Common Complaints?
Before embarking on formulating questions for an evaluation form, start by studying commonly reported complaints. Consider current courses and even courses you and your colleagues have attended throughout the years. If you hear people moan about one specific aspect all the time, then this would be an important parameter to include in your list.
Here is an example list of complaints:
- Was too difficult
- Training method was poor
- It had too many unnecessary repetitions
- Didn’t seem deeply knowledgeable
- Didn’t seem to care for my learning
- Was irrelevant and outdated
- Was poorly structured
- There was not enough content
- Did not like the food
- Did not like the venue
- Was too cold / too hot
- Was too noisy or distracting
- Couldn’t get to the venue easily
- Didn’t learn anything I didn’t already know
- Was confusing
- Was delivered too quickly
- Was delivered too slowly
- Was all over the place and unstructured
- Was monotone
- Assessment was not representative
- Was just like listening to a presentation
- Didn’t need to learn about some topics
- Couldn’t find parking easily
- Did not feel welcomed
What Would You Need to Find Out From the Delegates?
Having decided on the purpose of the feedback form and what delegates commonly complain about, you can now compile a set of questions that help you better understand how they experienced your course:
First, divide the complaints into 3 or 4 main categories which you can then use for your questions:
Content
- Ask questions that focus on training materials, choice of syllabus, usability of the content and so on.
Delivery
- Ask questions that focus on the instructor and the manner in which the training was conducted.
Course Specific (optional category)
- Ask questions that are specific to the subject matter and evaluate if the method used to teach this specific subject works.
Facilities
- Ask questions about the physical space or the quality of the service provided. These usually relate to the physical needs of delegates.
The course-specific category is optional because such questions can often sit under content or delivery. Only add extra categories if they make the form easier to use. Three or four is ideal.
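To make the grouping concrete, here is a small Python sketch; the complaint-to-category mapping is just an illustrative assumption, and your own lists will differ:

```python
# Hypothetical grouping of common complaints into the three
# core categories. The assignments are illustrative only.
categories = {
    "Content": [
        "Was irrelevant and outdated",
        "There was not enough content",
    ],
    "Delivery": [
        "Was delivered too quickly",
        "Was monotone",
    ],
    "Facilities": [
        "Did not like the venue",
        "Couldn't get to the venue easily",
    ],
}

def category_of(complaint):
    """Return the question category a known complaint belongs to."""
    for category, complaints in categories.items():
        if complaint in complaints:
            return category
    return None  # unclassified: a hint that a category may be missing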
Next, formulate questions for yourself to capture those common complaints. Each question should sound like a goal for you. You will then rephrase these questions in a follow-up step, which you will see shortly, to make them suitable for inclusion in the form:
- Was the content relevant?
- Was the content presented at the right level?
- Did they like the class?
- Was the content easily understood?
- Did you communicate clearly?
- Was it at the right pace?
- Was it enjoyable?
- Did they like you?
- Did they think you were knowledgeable?
- Did they think they were assessed fairly?
- Was the refreshment/lunch suitable?
- Was the training room comfortable and relevant?
- Was it easy to get to?
Apply Best Practice Guidelines When Designing the Feedback Form
By now we have a list of complaints and a list of goals that help us find out about delegates’ experience of attending the course. It is tempting to dive in and formulate the questions.
But let’s hold back and review a series of powerful guidelines that help with the questioning technique. I have provided examples for most of them so you can see exactly how they are applied and why they are so important.
1. Ask for Minimal Personal Information
Most forms start with a section on personal information. And most forms ask way too much!
Let’s analyse this:
Waste of Space
Half the feedback form is often wasted by asking the delegates to spend precious time writing down the name of the trainer, the number of participants or even the location of the venue! This is totally unnecessary. You are delivering the course and you are giving the form to them; you already know all this information. Just print it on the forms for this specific course. Don’t waste their time in the process.
Design custom forms for each course with the details of the course already there. In this day and age this is super easy. If you are sending delegates an online form, send one pre-populated with the details of the course so they don’t have to waste their time.
Remember, even a couple of minutes spent filling in a form feels like a fair amount of time to delegates. Use it wisely to get the most out of your forms.
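As a rough illustration of pre-populating, a tiny Python sketch (the course details are invented) can stamp everything you already know onto the form so delegates never write it themselves:

```python
# Hypothetical course metadata you already hold as the organiser.
course = {
    "title": "Introduction to Negotiation",
    "trainer": "J. Smith",
    "date": "12 Nov 2019",
    "venue": "Room 4, Main Office",
}

def form_header(course):
    """Render a pre-filled header block; delegates only answer questions."""
    return "\n".join(f"{key.title()}: {value}" for key, value in course.items())
```

The same idea applies online: generate the form per course so the header arrives already filled in.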
Unnecessary Personal Details
Now, the form also asks for their details. Do you really need to know their home address? What is the point of wasting those precious minutes and form real estate to get them to write down their address? In what way would it help you improve your training course? Not a bit!
By the time they get to the main questions they will be annoyed already.
The real estate on a form is very limited, so you only want to focus on the data you need and the data you don’t already have. If by taking their name you can look up their e-mail, phone and address, why would you need to get them to write it on the form? You already have this.
2. Design It In One Page
Aim for a one-page form, maximum two pages, but no more.
Delegates will frown if you give them a stapled stack of papers and ask them to fill it in. It should take about 5 minutes to fill in the form. If they have to spend more than 10 minutes, it is way too long.
If you are giving delegates an online form, don’t paginate it. Let them see all the questions in one go. It is a dreadful feeling when you think something is going to take 5 minutes and you end up spending 15 just because the designer hid the questions across 6 pages. A delegate will feel resentful and may not pay much attention to the questions on the last pages, or worse, may abandon the process altogether.
3. Always Ask a Question. Do Not Write Incomplete Statements
For example do not say, “Was the trainer...” and expect them to rate it.
Incomplete statements are usually used in conjunction with quality keywords selected for rating such as “poor”, “satisfied”, “good”, etc. This is poor form design because it takes longer to understand what the question wants.
4. Always Include Rating Questions and Comment Boxes
Use more than one type of question. Mix and match. Some questions are best answered with a rating. Comment boxes help you obtain personal opinions. Some forms only include comment boxes, but this can come across as tedious and doesn’t give you anything measurable.
5. Ask an Open Question
Since we want to include both quantitative and qualitative questions, in addition to rating questions you must also include comment boxes for answers to qualitative questions.
For these questions, you want to elicit as much information as you can. Ask a question that can be answered in two sentences rather than a few words. At the same time, don’t expect an essay.
Instead of asking:
“Did you benefit from this course?”
Ask an open question such as:
“What did you get out of this course?”
6. Make Your Open Question Specific
Open questions tend to become general quite easily.
Consider the following question:
“What did you think of this course?”
You are expecting everything—the good, the bad and the ugly! If you ask this, be prepared to get the classic off-hand answer: “It was ok.”
This answer means nothing.
You must ask specific open questions with an aim to extract information out of delegates.
7. Ask Directly to Discover What They Liked
You want to know what works in your training course. Although your overall aim is to improve your course, knowing what works can help you dig deeper to see why it worked. Armed with this information, you can improve your course or even use it when designing future courses.
8. Ask Them to See What They Disliked
Don’t always ask questions that lead to positive results. In order to extract information, sometimes you need to ask what they disliked. This needs to be done with care though and I will expand on this more later in this article.
9. Get Them to Rate Between 1 and 5
Five-star rating is extremely popular on the net thanks to companies such as Amazon and Google. People are already used to this system when rating products and services. Scoring 1 to 10 isn’t as familiar and, frankly, it is overkill. Rating up to an odd number (5) provides a middle rating (3) so delegates can vote in a neutral way if they wish. An even-numbered rating (such as 1 to 4) forces people to choose between good (3 and 4) and bad (1 and 2) with no neutral option in between. It is best if you give people the choice.
10. Present Rating 1 to 5 from Left to Right, Bad to Good
Some forms are designed so poorly, it is mind boggling! Consider this example:
Where do we start analysing this one? So many problems:
No 1 is missing.
- Why just 4, 3 and 2? Why is there no rating of 1? This is unusual and distracting. Use 1 as the minimum rating.
Ratings are written in decreasing order.
- We are used to increasing order: 1, 2, 3, 4… Write the numbers in increasing order.
Strongly agree is to the left.
- We are used to an axis that increases from left to right. This is a universal mathematical convention. Please stick to conventions for the good of society: show “less agreement” towards the left and “more agreement” towards the right, much like an axis on “agreement”. Don’t place strongly agree first to the left.
More than two labels are not useful.
- The middle rating 3 says “agree”. It is best if you don’t include more than two labels such as “agree” for a high number (say 5) and “disagree” for a low number (say 1). People will know what to do with whatever rating you have in between. For example, your max rating could be “strongly agree” or “high” and your min rating “strongly disagree” or “low” and let the delegates decide what to score in between just based on numbers. Don’t over complicate your form.
The middle rating is not neutral.
- Even if you really want to label that mid rating such as 3 here (in between 4 and 2) or more normally a 3 between 1 and 5, make sure it is labelled in a neutral way. In this example, it says “agree” which is positive and not neutral.
11. Don’t Limit the Scope of a Question
Consider a question along the lines of: “The difficulty level of the course was about right.”
If you answer that you strongly agree, then you are saying that the level was about right. Not quite right, but about right! What if you want to say the difficulty level was perfect? The formulation of the question is limiting.
If you disagree, then it is not clear either. Is that because you found the course too difficult or too easy? Not obvious!
Always make sure that your formulation of the question allows you to capture both extreme sides of an opinion.
12. Use Non-Distracting Labelling
Label the rating choices with general keywords only for the beginning and the end. For example, label 1 to 5 as “Poor” to “Excellent”.
If you use “Strongly Disagree” to “Strongly Agree” you have to formulate your questions in a way that matches such answers. I believe this limits what you can ask in comparison with the simpler and more generic poor/excellent combination. With “Strongly Agree” you need to write a statement rather than a question and then you have to load it with a quality.
For example rather than asking:
“How was the performance of the trainer?”, which can be answered between “Poor” to “Excellent”,
For these labels, you have to say instead:
“The performance of the trainer was good.”, answered between “Strongly Disagree” to “Strongly Agree”.
The first one sounds much better and is more satisfying. Maybe “good” is not a great word, but you are forced to include “good” or a similar quality word with the second style. To avoid any potential problems, just use the simple Poor/Excellent labels.
In addition, don’t include fancy words for in-between ratings such as “somewhat” or “satisfied”. These words can distract and mislead. Just ask for 1 to 5 stars and people know what you want.
13. Group Questions Logically
Don’t jump from area to area when sequencing the questions. Divide them into logical groups such as Delivery, Content and Facilities as you saw earlier.
14. Don’t Mix Up Collecting Feedback With Collecting Marketing Data
The aim of the evaluation form should be focused on information collection that helps improve the course.
Do you really need to ask the delegates about their race, religion or sexual orientation? Collect your marketing data separately.
15. Focus the Evaluation Form on Training Rather Than the Training Agency
The feedback form is about you, not the training agency, if you are using one. Use the opportunity to learn about your delivery, rather than focusing most questions on how the agency handled the booking.
As a trainer you must insist on designing the feedback form yourself unless you are happy with what a third party organisation provides. This is a particularly common problem as forms designed by agencies serve their purpose more than yours and tend to be way too generic as they are used for a variety of courses.
16. Consider Customising the Form Based on What You Are Teaching
Most institutions use a generic feedback form for all courses which means you miss an opportunity to ask course specific questions. As a training specialist and the subject matter expert, you are best qualified to design your own course’s evaluation form.
Your form should help you understand what you can do next time to improve the course. Include minimal questions on facilities and domestic arrangements. Otherwise you will receive too much feedback on areas that are not as important, even if delegates are naturally more vocal about them.
After all, we know that people take food seriously and bad food can really annoy some, but no amount of food feedback and food improvement is going to improve your training delivery!
How to Design Each Question
You now need to design the way you ask each question.
For example, a common way to ask a question and expect a rating is something like:
“How related was the content to your needs?” (1 = Poor to 5 = Excellent)
Here you can use generic labels for 1 and 5 because the same rating works for all sorts of questions. Delegates know what you mean. Avoid using fancy wording for ratings such as “Not at all related” to “Exactly what I was looking for”. These just distract.
Now imagine you used the same Poor/Excellent rating for a question such as:
“How was the pace of the course?”
You are asking an open question and using easy labels, but there is still something not quite right about this. Let’s illustrate with an example:
Suppose a delegate replies 4. What does this mean? Does it mean the pace was too fast or too slow? You cannot really tell, which means you have failed on one of the principles of form design: maximising information collection. You have missed an opportunity to see what you can do about the dissatisfaction. You know the pace is not right, but you don’t know if it is too slow or too fast.
What can we do to improve on this?
Here is an elegant solution. Instead of using the same 1 to 5 rating, we can give them a spectrum:
Too Slow <----------------> Too Fast
Now they can place an X wherever best represents how they felt, and you will know straight away what they thought of the pace.
Now suppose you get different results from a group of delegates you have been training.
If you get four delegates scoring “just right”, one delegate scoring “slow” and one “fast”, then you may assume your pace is fine. However, if you get four “fast” and two “just right”, then you are too fast for some people. Consider cutting content, asking more questions, slowing down on certain thorny topics and increasing repetition to make sure everyone follows.
You may now wonder about the other two who thought the pace was just right. If you slow down, wouldn’t you annoy them? For people who want more content, you can consider optional exercises or extra lessons. For example, with your slower pace you notice that two of the delegates are getting impatient. Put them in a group with a more advanced, research-based exercise while getting the other delegates involved in simpler exercises. The two will be challenged enough to feel that the pace was “just right” for them too.
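One way to act on spectrum results is a simple tally. The sketch below (in Python, with hypothetical answers and an arbitrary 50% threshold) flags the pace only when a clear majority leans one way:

```python
from collections import Counter

# Hypothetical spectrum answers for "How was the pace of the course?",
# bucketed into three zones of the spectrum.
answers = ["just right", "just right", "fast", "fast", "fast", "fast"]

def pace_verdict(answers, threshold=0.5):
    """Flag the pace as a problem only when a majority lean one way."""
    counts = Counter(answers)
    total = len(answers)
    if counts["fast"] / total > threshold:
        return "too fast"
    if counts["slow"] / total > threshold:
        return "too slow"
    return "about right"
```

The threshold is a judgment call; with small classes, even one or two outliers may deserve the individual attention described above rather than a change of overall pace.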
The trick is to know that pace is relative. Yes, Einstein was right, everything is relative. Pace is subjective. As a tutor, you must make each individual feel the pace was just right for them. From your point of view, you have a generic pace and six custom paces for the delegates, when teaching six delegates.
You must always be ready to adapt to what is needed. Remember, not everyone has to do everything.
How to Take Advantage of Psychology to Get What You Want
A form is a series of questions. Answering questions is subject to the psychological mood of each delegate. Much like in real life, people may find it awkward to answer certain questions. You must therefore design the form to make it easy for them to answer.
As always, let’s start with an example. Suppose you ask the following:
Was there anything you did not understand during today’s sessions? Please provide specific examples.
Adults don’t easily admit to not understanding something, especially on paper. By the time they have written it down, they may feel quite bad about the course or themselves.
So you must be a bit more tactful; we do want them to tell us about any issues, but we want to draw them out gently.
To do this you can use the Sandwich Technique. Here is how it works:
Include three open questions:
Ask them to say what they liked.
Ask them to say what they disliked.
Ask them to provide any other comments.
They have to open up with something they like. This is great for your ego! And you also get to see what worked well.
Once they have said something positive, it would be much easier for them to continue sharing any concerns. This is your chance to receive some constructive criticism!
They will then finish with any other comments they want to make. What do you think they will say now?
If they have remarked on something they disliked, at this point they are more likely to share what they liked. This is because psychologically, they feel they need to say something positive at the end.
I am assuming that your course wasn’t that bad and you just want to know what you can improve. If they didn’t like your course, trust me, they will let you know, sandwich technique or not!
Feedback Form Sample Questions
Here is a list of questions to inspire you when designing your feedback form. You can ask the following with a rating of 1 = Poor to 5 = Excellent.
Sample Questions: Content
How related was the content to your needs?
How organised was the course?
How easy was it to understand?
How much did you enjoy the content and the class?
How did you like the exercises?
Were the topics covered useful and educational for the subject of the course?
Was the visual content, such as slides or anything written on boards, clear and useful?
Was the workbook educational and well-designed?
Sample Questions: Delivery
How was the performance of the trainer?
How knowledgeable was the trainer?
Did the trainer communicate clearly?
Did the trainer know the content well?
How confident do you feel about the subject matter after training?
Was the trainer well-prepared?
Was the trainer in control of the classroom?
Did the trainer keep the focus on the main topic?
Was the course enjoyable?
Did the trainer manage to keep you interested?
Did you receive personal attention when needed?
Did the trainer keep you engaged?
Sample Questions: Facilities
How welcome were you made to feel?
How satisfied were you with food & refreshments?
How satisfied were you with the venue?
How easy was it to get to the venue?
How easy was it to park at the venue?
Comment Boxes (Expect answer as a few sentences)
What did you like the most about the course?
What did you like the least about the course?
What worked well in this course?
What was the most important lesson you learned today?
Which topic impressed you the most?
What suggestions do you have on improving this course?
What would you like to be different in this course?
If you had to change one thing to improve the course, what would you change?
How was the pace of the course?
Spectrum (Too Slow <---> Too Fast)
How was the length of the course?
Spectrum (Too Short <---> Too Long)
How difficult was the course?
Spectrum (Too Easy <---> Too Difficult)
How much content was covered in the course?
Spectrum (Too Little <---> Too Much)
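Pulling the question types together, the whole form can be sketched as plain data in Python; the selection below reuses a few of the sample questions above and is only one possible arrangement:

```python
# A sketch of a complete feedback form as data, mixing the three
# question types covered above. The selection is illustrative.
feedback_form = {
    "rating": [  # answered 1 = Poor to 5 = Excellent
        "How related was the content to your needs?",
        "How knowledgeable was the trainer?",
    ],
    "open": [  # sandwich order: liked, disliked, other comments
        "What did you like the most about the course?",
        "What did you like the least about the course?",
        "What suggestions do you have on improving this course?",
    ],
    "spectrum": [  # answered by marking a point between two extremes
        ("How was the pace of the course?", "Too Slow", "Too Fast"),
        ("How difficult was the course?", "Too Easy", "Too Difficult"),
    ],
}

# A quick sanity check that the form stays short enough to fill in fast.
question_count = sum(len(qs) for qs in feedback_form.values())
```

Keeping the form as data like this also makes it easy to render the same questions on paper or online without redesigning from scratch.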
I will be updating this page as I come across more examples of good or bad forms and will reflect them here. Revisit as a reference page when you need to design or update your forms.
If you have any killer feedback forms or proven rules, please let me know. Always eager to know what works for you.