More than a smile sheet
#8 in our training evaluation blog post series:
Digging into training evaluation uncovers a lot of debate and discussion around the value of level 1 evaluation data.
In my last evaluation post in this series, A little evaluation can be a dangerous thing, I wrote about the potential dangers of using only level 1 evaluation data to determine the effectiveness of learning back in the workplace. Many articles, blog posts and forums are dedicated to discussing the merits (or lack thereof) of level 1 evaluation. I personally believe that a level 1 smile sheet has value for learners: it allows them to reflect on their learning and provides a vehicle for their thoughts and feelings. But we also need to keep in mind that it is only one small measurement in the overall evaluation process. Much less weight should be placed on the “qualitative” data gathered from a level 1 smile sheet, and much more weight and importance given to level 4 evaluation results: the impact training has on business results.
Whether simple or complex, level 1 end-of-course evaluation forms (a.k.a. “smile sheets”) are used in the majority of training courses organizations offer – in over 91% of organizations, according to a 2009 ASTD Value of Evaluation research study. But does your level 1 smile sheet go beyond the basic questions to capture data that will help your organization measure evaluation levels 2, 3 and 4?
A well-designed level 1 evaluation plan should measure not only learners’ satisfaction but also their level of engagement and the relevance of the training to their jobs. The goal is to incorporate statements or questions that focus learners on the higher levels of evaluation and get them thinking about how the new learning will benefit both them and the organization after the training event is over.
There are some simple changes you can make to your level 1 evaluation form that can provide further value:
- Consider using a 7-, 9-, or 11-point rating scale to provide richer feedback. Label only the endpoints of the scale rather than every number (e.g., 1 = strongly disagree and 7 = strongly agree).
- Make all evaluation statements or questions learner-centred. For example, rather than “The instructor provided debrief activities for students to demonstrate their learning”, use “The debrief activities helped me to effectively practice what I learned”.
- Consider adding statements or questions to the course evaluation form that measure engagement and relevance. This helps to focus the learner on levels 2, 3 and 4. Some examples include:
- I had sufficient opportunities to contribute my ideas. (level 2)
- I estimate that I will apply the following percent of the knowledge/skills learned from this training directly to my job. (Provide a % scale from 0% to 100% in increments of 10.) (level 3)
- This training will improve my job performance. (level 4)
You can see that just a few tweaks to a level 1 evaluation form lead to insightful information that can improve your training process.
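If you collect these responses electronically, a quick summary script can surface that information. Below is a minimal sketch in Python, assuming responses arrive as simple records; the field names and sample values are hypothetical and not tied to any particular survey tool.

```python
# A minimal sketch (not from the original post) of how responses to the
# example items above might be summarized once collected. The field names
# and sample responses below are hypothetical.

from statistics import mean

# Hypothetical responses: 7-point agreement ratings plus the
# "percent of learning I expect to apply" item (0-100 in steps of 10).
responses = [
    {"contribute_ideas": 6, "improve_performance": 5, "percent_apply": 70},
    {"contribute_ideas": 7, "improve_performance": 6, "percent_apply": 90},
    {"contribute_ideas": 4, "improve_performance": 4, "percent_apply": 50},
]

# Average agreement on the 7-point items (endpoints labelled
# 1 = strongly disagree, 7 = strongly agree).
for item in ("contribute_ideas", "improve_performance"):
    scores = [r[item] for r in responses]
    print(f"{item}: mean = {mean(scores):.1f} on a 7-point scale")

# Distribution of expected application, a rough leading indicator for
# level 3 (behaviour) follow-up.
applied = [r["percent_apply"] for r in responses]
print(f"Expected application: mean = {mean(applied):.0f}%, "
      f"range = {min(applied)}-{max(applied)}%")
```

Even a simple roll-up like this gives you something to compare across courses and to follow up on at levels 3 and 4.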
Stay tuned for upcoming blog posts with tips and strategies for the other levels of evaluation, and be sure to check out the other evaluation blog posts in this series: