Post-training evaluation data is nice… but what should we do with it?

A while back I wrote about 8 transferable lessons from my Fitbit that I’ve applied to my L&D practice. As part of that post, I complained that the Fitbit sometimes gave me data, but I couldn’t do anything with it. Specifically, I was talking about my sleep pattern.

A typical night could look like this:

[Image: Fitbit - Sleepless]

FORTY-ONE TIMES RESTLESS! That’s a lot of restlessness. It’s not good. But what am I supposed to do about it? It reminded me of my post-training evaluation scores.

Sometimes learners would give my sessions an average of 4.2. And sometimes those same learners would give a colleague’s presentation an average of 4.1 or 4.3 (even though I knew in my heart of hearts that my presentation was more engaging!!). But what could I do with these post-training evaluation scores? I’ll come back to this point in a minute.

As for my restlessness, my wife suggested an app called Insight Timer, and suddenly my Fitbit sleep tracker looked a lot different. She used it one evening, and I woke up the next morning to find the number of restless periods had decreased by 88%.

[Image: Fitbit - Good Sleep]

If there was hope for me to find value in my Fitbit sleep tracker, perhaps there was value in some of life’s other perplexing metrics… maybe even value in post-training evaluation scores!

Just as my wife offered an idea on how to make my Fitbit sleep metrics more actionable, Will Thalheimer helped me to reframe my attitude on post-training evaluation and identify ways to make those metrics more actionable.

In his book Performance-Focused Smile Sheets: A Radical Rethinking of a Dangerous Art Form, he advocates for scrapping the traditional Likert-scale questions that currently fill up post-training evaluations, and instead asking questions like this:

[Image: Eval Question - Supervisor Support]

In the past, I may have asked a Likert-style question about whether people agreed or disagreed (or strongly agreed or strongly disagreed) with the idea that their supervisor would support them in applying skills after a training session. And I’d probably get some sort of numeric score, like 4.2. But what does that mean?

With this new format, I get responses that break down like this:

[Image: Eval Question - Supervisor Support Responses]

Now I have a much better idea of how people feel they’ll be supported by their supervisor after a training session. This data is much more actionable than finding an average score of 4.2.

One thing I might want to do is follow up with the client and offer suggestions for ways supervisors could better support their staff in implementing the ideas, knowledge and skills from this workshop.

This data also tells me that I may need to adjust my design practices in the future. For example, I could include a checklist of discussion points for learners to take from the session and use in one-on-one check-ins with their supervisors, making sure there is post-training conversation around the ideas, knowledge and skills gained from this workshop.

If we’re going to collect data, it ought to be actionable… otherwise we’re just collecting vanity metrics that may serve to make us feel good on the surface, but don’t really mean anything.

What are some metrics or data from your training programs that you wish could be more actionable?

2 thoughts on “Post-training evaluation data is nice… but what should we do with it?”

  1. I think that it is also imperative to communicate with supervisors PRIOR to sending someone to training. Often, they have not been through the class themselves (or it was so long ago that they forgot, or the class has changed). Send the supervisors a list of the topics or objectives and get their input on how important these are to the employees’ success. This allows them to buy in to the training and may even help L&D professionals to alter the course to meet current job needs. Then you also have an actionable checklist for follow-up. For example: “Dear Joe, before the training class you said that objectives 3, 4, 7 and 9 were important to your direct report’s success. Here are 2 ways that you can follow up with him/her to ensure that their performance soars.”

    • Yes! Absolutely. Another example of an actionable take-away from this particular post-training evaluation… And according to the research, engaging supervisors PRIOR to training can be even more important than having them engage with trainees AFTER the training.
