I’ve spent a lot of time this week curled up by the fireplace, reading a gripping, suspenseful, page-turner of a book entitled Performance-Focused Smile Sheets: A Radical Rethinking of a Dangerous Artform, written by Will Thalheimer.
Be on the lookout for a full book review once I’ve finished it. For today, however, I want to explore one specific area of the book that Will talks about early on: the difference between awareness training and performance training.
Will describes awareness training as training that conveys "information to learners but doesn't provide sufficient support for remembering or on-the-job application."
Performance training “provides remembering and application support – and aims to improve on-the-job performance.”
As I reflect on the past several years, I've certainly been frustrated by how often I've set out to design performance training that demonstrates some measurable return on the training investment, only to watch the end result become awareness training that exposed people to concepts they never ended up applying on the job.
Recently I've begun to ask several questions whenever someone asks me to help design training for their team or for a specific audience. As I've grown busier in my role, I've learned that I cannot help someone who doesn't have good answers to these questions. I'm fortunate to have a supervisor and an organization who will back me up when I walk away from someone who can't provide them.
Here is what a typical conversation now looks like:
Manager: Brian, can you help us overhaul our core training program?
Me: Probably. What exactly are you thinking?
Manager: We’d like to incorporate more adult learning principles into the classroom portion of our initial training program and we know you have a good handle on adult learning.
[Author's Note: In the past, this would have been good enough for me. First, they've buttered me up with the two magic words every instructional designer loves to hear: adult learning. Second, success would have been easy to measure: does the overhauled training program have more lessons and activities that integrate principles of adult learning? Yes or no? But that's a shallow way to measure success. I can do better.]
Me: I'd love to bring adult learning into your program… but why do you think that's going to help? It'll take some time to revamp activities… what are you going to get out of that?
Manager: We’re hoping it helps people to better remember what they were taught.
Me: How do you know people aren’t remembering what you’ve taught them?
Manager: They call me with questions two weeks after their training. Or worse, they miss something – they forget to sign their name in a certain place or they overlook some other important piece of information. Then someone else needs to clean up my team's mistakes, which just adds time to the process.
Me: Do you have actual data on this, or is this just anecdotal?
Manager: Both. If we improve the training by incorporating ways to help people better retain information, we can compare the volume of post-training emails we received before the overhaul with the volume we receive after, which gives us a pretty good estimate of the time savings in our work. If we're not answering emails, we can get other things done. Plus, the manager of the department that has to fix our team's mistakes has very clear metrics on how much longer the cycle time is when they need to correct our errors. Again, we can compare before-and-after data once there's an overhaul.
Me: Ok, let’s do it.
The Bottom Line
Like any good trainer, I'm always desperate to find some way, any way, to measure the impact of my training projects. Counting how many training programs now have adult learning integrated into their design, however, is not a very valuable metric. Nor are Level 1 satisfaction scores (hence Will Thalheimer's book).
Here are a few training needs analysis questions I've found essential to demonstrating the value of training initiatives:
- If someone wants to "show improvement", ask: do you have baseline data? Anecdotal evidence is too squishy to show improvement. If there's no baseline data, work with the client to identify or collect some. Otherwise you may not be training to address the right problem.
- If someone wants to make people aware of something, ask: why? Awareness is nice, but it rarely sticks. If the client answers your "why" with "so that our audience can…", then maybe the training should be about more than awareness. Perhaps the audience needs to be aware of something in order to improve how they do something (see question #1 above). Perhaps they need to be aware of a new policy so they can adjust their behavior – in which case, instead of telling them what they'll need to be aware of, give them an opportunity to try out that new policy in a practice setting.
- If someone brings an ongoing performance problem to you, ask: are you sure this is a training issue? Sometimes it will be (see question #1 above). Sometimes the answer is a job aid. Or coaching. Or mentoring. Or a shift in team structure. Or motivation. Training can't solve many of those problems.
What are some of the questions that help you identify needs and set you up for effective ways to measure your impact?