Summary 1/03/2012 – Learning Measurement

On March 1st we chatted about measuring the impact of learning.  It was quite a lively debate, with consensus reached on some topics and others leaving us still divided – we must be passionate about the topic!  The blog post, including Kelly Meeker’s (@OpenSesame) guest post, urges us to move away from measuring output, such as materials generated, attendance and completion of learning programs, and start measuring outcomes (changes in behavior and performance that impact the business), because that’s what’s really important.  Not that it’s easy by any means…

Q1) Measuring the ROI of learning is difficult. Should we be concerned about it?

I thought it might be beneficial to summarize this section in a pros & cons list, as there were so many comments on both sides, many of which were contradictory.  All excellent points, so I hope this helps you think about whether or not YOU should be concerned with measuring some level of ROI or impact.

Pro:
  • Business asks us to prove the value of a learning intervention or the L&D department (continued employment)
  • ROI = impact on performance, “was it worth it”, not necessarily bottom-line $$/££ (value doesn’t always mean money) – need an “enlightened stakeholder”
  • Measure our own performance & demonstrate success to others (build support/interest, be taken seriously in the business)
  • ROI is the language of business
  • A significant part of learning evaluation
  • We don’t know what’s making a difference if we don’t measure
  • Need to know impact to decide among choices (cost & is it WORTH that cost)
  • Don’t let what you think you can measure easily dictate how you measure improvement and value – they may not be the right measures (butts in seats, number of materials developed and completions don’t show the VALUE)

Con:
  • Impossible to measure ROI (cause & effect)
  • Effort to control other variables isn’t worth it (and isn’t possible; is a control group even ethical?)
  • The term “ROI” fills everyone with dread (negative connotations)
  • Is learning even measurable? (tangible result?)
  • Need quantification in monetary terms to weigh ROI against the cost of providing the learning intervention
  • Expensive to measure – more wasted money if the intervention turns out not to be worth it
  • Hard to create an operational definition that is relevant & credible
  • The pace of change foils attempts to connect interventions to the bottom line
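Since the debate keeps circling back to what “ROI” actually means, here is a minimal sketch of the classic ROI formula – purely illustrative, with made-up figures, because (as the con column notes) the hard part is isolating a credible benefit number in the first place:

```python
# Illustrative only: the classic ROI formula applied to a hypothetical
# learning intervention. All figures are made-up assumptions.

def roi_percent(benefit, cost):
    """ROI (%) = (benefit - cost) / cost * 100."""
    return (benefit - cost) / cost * 100

# Hypothetical: a program costing 20,000 that is estimated (after trying
# to isolate other variables) to have produced 50,000 in benefit.
print(roi_percent(50_000, 20_000))  # -> 150.0
```

The arithmetic is trivial; the contested work is everything feeding the `benefit` argument – attribution, control of other variables, and translating performance change into money.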

Q2) What should we measure to illustrate the impact of what we do – to ourselves and others?

There were a lot of ideas generated here for real measurements of our impact; some were more fleshed-out than others, and some contradicted others, but definitely a lively discussion!

  • Business impact and performance against business goals (need a scale – a Bloom’s taxonomy for business efficiency!)
  • Agility
  • Service
  • Whatever stakeholders want to know (but coach on value of measurement)
  • What we want people to DO – performance objectives vs. baseline
  • State of problem solved vs. baseline
  • Time in which change was implemented
  • Based on goals of intervention
    • adoption
    • efficiency
    • satisfaction
    • survival
  • Compare to what happens if DON’T do intervention
  • Knowledge & skills
  • Performance improvements (vs. expected)
  • Reasons for “failure” of intervention
  • Not ALL learning, just those interventions invested in
  • Informal learning – justify time spent “not working”
  • Costs (money & time)
  • NOT the “busy-ness” metrics
  • NOT “tick the box” compliance

Q3) What data-driven decisions does your department need to make and what data is needed to make them?

The discussion after this question never quite pinned down the data we need to collect to measure impact. However, we did see more good ideas for impactful things to measure, which have been added to the list above, as well as some side conversation about the challenges and strategies around measurement:

  • Find out what really helps people improve & give them that – TALK to people, “get in the trenches”, understand the work (the right thing to help may not be training)
  • It’s OK to fail!! We often learn more from a failure than a success
  • Try to do fewer “courses” and use less resource-intensive interventions
  • Increasing pace of change = difficulty in changing behaviors
  • Measurements should be customized to each initiative – one set of measures will not work for all
  • Placing more importance on measurement and control rather than individual experience of learning means we have lost our way [excellent point by @olliegardener]

Q4) How do your measurements link to business performance?

This question honestly seemed to be largely ignored in favor of continued debate and discussion about what we should be measuring, and a little of what some of us are currently measuring:

  • Make lives/jobs easier
  • Engagement
  • Compliance (completion only – not reduction of accidents/disasters)
  • Only things that are in the business plan/department objectives
  • 360° feedback

Q5) What do learning professionals need to do to measure more effectively?

Some ideas the group generated for us to consider working on as we attempt to measure more effectively:

  • Focus on outcome, not incremental steps (and not just attendance)
  • Understand business results, not just “learning” – engage business in advance & determine what success looks like
  • Get “in the trenches” & find out what matters to the business – more user/job analysis, get in touch w/ the work – LISTEN to engaged workers, facilitate, enable
  • Follow-up is the best measure of performance – send reminders (e.g. 3 mo, 6 mo later) to reinforce learning (should we allow learners their own pace?); the time frame matters less than the fact of following up
  • DRIVE it – set objectives, verify, share what’s not working
  • Learn about measurement techniques from other sectors
  • Educate the “interpreters of the results” (can be really difficult)
  • Be honest about effectiveness
  • Be firm when you know training is not the solution
  • Speak business language to be heard
  • Be confident that L&D adds value – more inclined to look for proof

QWrap) Chatting is great, but reflection & action are better. What is your ‘take-away’ from our chat?

@JudithELS Think I’m going to write a blog post on ROI: the why’s & wherefores!
@elearningguy My take-away is a good reminder of the importance of agreement on measurements: what/how/before/after
@FionaQuigs Lots of different ideas of what to measure need convergence. Conversations with ppl in your org. Agree what success means
@dropthepencil definitely follow-up. So often forget best practices, because I fail to practice them -gah-
@Melissa_Venable No one-size-fits-all method. Know your org: expectations, values, needs. Communication, context, culture all key
@owenferguson My take away: ROI is still a topic that we’re not completely comfortable with. At least the debate is getting sharper.
@KathyJeep As the newbie to my org, learn more about what measurements are already in place. Try to map to est quality measures.
@pattishank We need to be more focused on doing the right kinds of measurement (and we need to learn how)
@lesleywprice That there are still two camps regarding ROI and we need to understand what it means to different people
@klaceyd Use next financial year to shift the way & the what of defining and measuring learning!
@owenferguson I’m delighted that we didn’t have a single mention of Kirkpatrick though … oops …
@AndrewJacobsLD Make sure we’ve no pockets of using the lazy options of measurement just because that’s the way it’s always been.

Have any of you followed up on your take-aways?

Thanks for a great discussion & hope to see you next time as we talk about “social learning – a mentality” on March 15th at 16.00 GMT/11.00 EST.

