Stupidly Simple Strategy to Demonstrate Training Effectiveness
One of the most important aspects of any training initiative is gathering user metrics to analyze how effective the training actually was. Sadly, though, this is often the last area thought about before going live. Most of the attention goes to the training content, delivery mechanisms, schedule, and so forth – but I would argue that end-user evaluation should be one of the first items planned out.
In a typical training scenario, a survey is administered at the end of training to capture user feedback – what is often referred to as a Level 1 evaluation. This information does provide value (assuming it is done properly), but it only tells part of the story. Even if you don’t have a ton of time to dedicate to robust user analytics, you can improve on your Level 1 reporting data by simply administering a similar survey before the training is taken.
For example, let’s say you have a two-day workshop in which you are going to train a particular skill. You could require a 30-minute prerequisite elearning course where users are introduced to the topic. Since many elearning programs allow you to create quizzes and surveys, this is an opportune time to gather baseline data on your users. Before the live event, you can review this data to see which areas deserve more time, resulting in more effective training. At the end of your two-day workshop, you administer a similar survey to see how much your users progressed.
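To make the pre/post idea concrete, here’s a minimal sketch of the comparison in Python. All topic names, scores, and helper names below are invented for illustration – in practice the numbers would come from your survey tool’s export.

```python
# Hypothetical self-rated confidence scores (1-5 scale) per topic,
# exported from the pre-workshop and post-workshop surveys.
baseline = {"Reports": 2.1, "Dashboards": 3.4, "Data Entry": 4.2}
post_training = {"Reports": 4.0, "Dashboards": 4.1, "Data Entry": 4.5}

def weakest_topics(pre, threshold=3.0):
    """Topics scoring below the threshold on the baseline survey --
    candidates for extra time during the live workshop."""
    return [topic for topic, score in pre.items() if score < threshold]

def improvement_by_topic(pre, post):
    """Per-topic gain (post minus pre), biggest improvements first."""
    gains = {topic: round(post[topic] - pre[topic], 2) for topic in pre}
    return sorted(gains.items(), key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    print("Focus areas before the workshop:", weakest_topics(baseline))
    print("Gains after the workshop:", improvement_by_topic(baseline, post_training))
```

The same two-pass logic works whether the scores are self-ratings from a survey or quiz results from the elearning course – the point is simply that a baseline turns a single post-training number into a measurable change.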
This simple technique for Level 1 evaluations will go a long way toward validating the metrics you provide. Your data becomes a little more meaningful when you report on the results of the training. Incorporating Level 2 and/or Level 3 evaluations will take your reporting to the next level (but we’ll save that for another post).
Tools for Evaluation
Everyone has their favorite tools for end-user evaluation, but I thought I would share the ones I prefer. The first is a popular one, and for good reason: I have used SurveyMonkey on quite a few consulting engagements and find it not only intuitive but very flexible. Its reporting function is second to none. With the click of a button, you can export graphs and response counts for each question, which really streamlines the reporting process. There is a monthly fee for SurveyMonkey, but it isn’t over-the-top.
If you’re on a tight budget, though, then I suggest you take a look at LimeSurvey – a free, open-source surveying tool similar to SurveyMonkey. It has many of the same features and offers decent reporting metrics as well. It will take you some time to get accustomed to the interface, but it’s nothing too challenging.
There are many opinions and frameworks when it comes to gathering metrics and running evaluations. The important thing is not which one you use, but that you don’t ignore it. Although “boring”, data is what makes training tangible to leadership. If you can effectively show employee performance improvement (or perhaps ROI) with your data, you will go a long way toward validating the importance of the training you create.