April 8th, 2014 E-Learning

One of the most often overlooked components of training program development is the post-training evaluation. In many cases, the evaluation design is an afterthought, as precedence is given to the actual training content.

This is a huge mistake.

Arguably, the design of your evaluations should begin near the start of the entire project. This includes pre-course, in-course, and post-course evaluations. Otherwise, you will likely end up with paper-based, smiley-face-driven evaluations, which won't really give you any insight into the course material or how to improve it.

When it comes to post-training assessment, there are a few "best practice" tips you can use to help maximize its effectiveness and participation. It doesn't matter if the training is e-learning or live; these guidelines apply to both:

Post-Training Evaluation Tips

Make evaluation part of course completion metrics – By linking the evaluation to a learner’s proof of attendance, your participation rates will skyrocket.

Administer assessments electronically – There are a variety of reasons for this, but mainly because it allows you to generate meaningful reports from the responses received, as well as keep accurate records. Even in a live training session, look to deliver the evaluation electronically.

Use proven evaluation techniques – Don't just come up with random questions; put some time into researching (and using) proven methods for capturing relevant data. There are many evaluation models you can choose from for your training (my personal favorite being the Kirkpatrick model).

Require comments – At least one section of your final evaluation should require written (typed) comments. Making every written-feedback portion of the evaluation required is a bit too much, but having one required section is just fine. If you use the first tip, you won't have to worry about much backlash.
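As a rough illustration of why electronic delivery pays off, here is a minimal sketch of rolling collected responses up into a simple report. The question names, rating scale, and data structure are all invented for the example; a real LMS export will look different.

```python
# Hypothetical sketch: summarizing electronically collected evaluation
# responses into a simple report. Field names and ratings are made up.
from collections import Counter

# Each response: question -> Likert rating (1-5), plus a required comment.
responses = [
    {"content_quality": 5, "pace": 4, "comment": "Great examples."},
    {"content_quality": 4, "pace": 3, "comment": "A bit fast in module 2."},
    {"content_quality": 5, "pace": 5, "comment": "Well structured."},
]

def summarize(responses, question):
    """Return the mean rating and the rating distribution for one question."""
    ratings = [r[question] for r in responses]
    mean = sum(ratings) / len(ratings)
    return mean, Counter(ratings)

for q in ("content_quality", "pace"):
    mean, dist = summarize(responses, q)
    print(f"{q}: mean={mean:.2f}, distribution={dict(sorted(dist.items()))}")
```

With paper forms you would be tallying these by hand; with electronic responses the same data can also feed trend reports across course runs.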


About Justin Ferriman

Justin Ferriman started LearnDash, the WordPress LMS trusted by Fortune 500 companies, major universities, training organizations, and entrepreneurs worldwide for creating (and selling) their online courses. Justin's Homepage | Twitter


9 responses


I could not agree more! In one of the large multinationals I am currently working with, we even make completion of the evaluation form (which is not available until the course content has been completed) a prerequisite to the final course assessment. OK, we cannot ask for feedback on the final assessment itself, but it does raise the importance of the course assessment, and it also tends to be completed immediately after completing the content.

I love your tip about linking evaluation to attendance. Great idea!

Tina Jackson

Thanks Tina, it is a nifty little ninja trick to help bolster response rates!

I agree that evaluations are critical. Thank you for saying so! Too many people see these as a waste of time.

Unfortunately, we have to go beyond the Kirkpatrick model.

As pointed out in this article from a top-tier scientific journal:

Salas, E., Tannenbaum, S. I., Kraiger, K., & Smith-Jentsch, K. A. (2012). The science of training and development in organizations: What matters in practice. Psychological Science in the Public Interest, 13(2), 74–101.

“Historically, organizations and training researchers have relied on Kirkpatrick’s [4-Level] hierarchy as a framework for evaluating training programs… The Kirkpatrick framework has a number of theoretical and practical shortcomings. [It] is antithetical to nearly 40 years of research on human learning, leads to a checklist approach to evaluation (e.g., ‘we are measuring Levels 1 and 2, so we need to measure Level 3’), and, by ignoring the actual purpose for evaluation, risks providing no information of value to stakeholders…” (p. 91)

Their words, not mine.

But it does suggest that we can do better!

How to do better? Well, that’s a long discussion. In short, we need evaluation models that are aligned with the research on learning, that help learners make good decisions on their smile sheets, and that provide data that is more meaningful than numeric averages.

A suggestion: please don’t call this “evaluation.” It’s actually a survey: a survey that collects data from respondents. If you keep that focus, you can learn how to do it effectively. There are many useful resources on collecting data this way.

Evaluation is the result of assessment, and assessment is comparing expected outcomes to actual outcomes. Simply asking people to respond to a survey is not “evaluation.”

My suggestions on the survey? Align the survey with course results. Ask respondents to assess their skill level on key learning objectives in the course. Ask them to assess their change (improvement?). Compare that to your expected improvement. Collect self-reported examples of learning: ask participants to tell you the most valuable thing they learned, with an example of its possible application. Collect these and compile them. THEN, follow up with a small number of course alumni to assess longer-term transfer of skills and knowledge to the workplace. Assess the enabling and hindering factors as best you can.

No, a form at the end of a presentation is not an ‘evaluation.’ This stuff takes a bit of work, doesn’t it?

Dan Topf

I cannot figure out a way to create an evaluation using LearnDash. Is there a way to do this with LearnDash?

I too am interested in creating a post-course survey/evaluation with LearnDash 3.0. Is there capability for this in 3.0?

Tim Cassidy

Course evaluations are a requirement for our healthcare courses for accreditation purposes. We are just struggling to work out how to add this using LearnDash; we have been having to link to SurveyMonkey to collect this information electronically. Is there a way we can use LearnDash to generate a qualitative survey and collect and summarise the results?

I strongly dislike the quiz builder; it’s a nightmare for making an evaluation form/survey, which my business requires for CEs, similar to what Jennifer said.

We have traditionally used Google Forms, where we can make a Multiple Choice Grid with “Strongly Agree, Agree, etc.” in a column and different questions in rows. It’s just good survey design to combine similar questions. Unfortunately, that’s not really possible in LearnDash unless I’m missing something. I can’t even choose which questions appear on the same page.

Josué Cardona

