The Problem with Millennials

This week’s post is written by Ross Garner (@R0ssGarner). Ross is an Online Instructional Designer at GoodPractice and the most recent addition to the #chat2lrn crew. Unfortunately, he is one of the golden generation: born between 1980 and 2000, he expects to be praised for the slightest effort and loses interest after 140 characters…

Image courtesy Flickr user musyani75.

There’s a problem facing today’s organisations: millennials, those self-satisfied young folk born between the 1980s and early 2000s.

Walk into any office and you’ll see them sitting there, smugly Snapchatting each other, getting distracted by smartphone notifications and popping out to run personal errands.

They’re narcissistic, pop-culture obsessed zombies: the product of participation medals, child locks and E! online. They think that a tweet is an appropriate medium for corporate communication and they get all of their news from Facebook.

They were told from an early age that they could be whatever they want to be, and they don’t half go on about it.

No job is ever good enough. No sooner have you hired one and spent a fortune on training than they’ve handed in their notice because something better has come along.

They don’t mind working evenings and weekends, but want to take time off whenever they please. They say they value feedback, but are crushed by the slightest criticism. They walk right up to the CEO with their great new business idea and expect it to be implemented immediately.

And don’t worry: they won’t be offended by all this. They won’t even have read this far. After 140 characters they’ll have switched to a new tab without even realising what they’re doing.

Or will they?

When I was at the ATD conference in Orlando earlier this year, almost every session I went to mentioned millennials. “Training needs to move online or millennials won’t engage with it”. “You’d better include badges so that millennials feel like they’ve been rewarded”. “You’d better chunk your training because millennials can’t focus for more than two minutes”.

But what’s the truth here? Do millennials have different expectations to their older colleagues? Do they learn differently? Do they even exist?

Join #chat2lrn to share your views and thoughts on “The Problem with Millennials” Thursday 8 October 8.00 PDT/11.00 EDT/16.00 BST.

Do we take our SMEs for granted?

Today’s post comes from Fiona Quigley (@fionaquigs), chat2lrn crew member, Unappreciated SME, and Director of Learning Innovation for Logicearth Learning Services. The chat serves two purposes this week.

First – to introduce you to a brilliant DevLearn 2015 session being run by our very own Andrea May, and her able conspirator, Dawn Mahoney.

The session is entitled ‘From SME Smackdown to Nirvana’, and you can read more about it here.

The session runs on October 1st, and if any of you are going to DevLearn this year, I’d urge you to consider going to listen to these two fabulously knowledgeable ladies. Andrea and Dawn aim to get us out of the ‘kick the SME’ habit and see how we can really get under the hood of what makes them tick.

The second purpose of this blog post is to consider more fully the purpose, role and usefulness of communicating well with the SME.

So that is today’s question. Is the SME a gift that we undervalue?

A SME, in case you aren’t aware, is a Subject Matter Expert: traditionally, the person or persons that eLearning folks rely on to design eLearning content. If you talk to any instructional designer, or indeed any eLearning project manager, it won’t be long before they are sharing their ‘SME war stories’. In fact, put a room full of IDs together and, along with the LMS, SMEs are likely to feature in most of the conversations.

The SME is vital to the success of an eLearning project. It is the SME who sets the tone and depth of the content, as well as (hopefully) helping to provide an insight into the target audience. But it is often east meets west when it comes to ID and SME understanding what each needs from the other.

So for this week’s chat, I’d like to focus on the relationship of the SME to the eLearning project, and also to think a little beyond the traditional purpose that most of us attribute to the SME.

What is it all about?

If you think about what we demand or need from a SME, it is a bit of a tall order. First and foremost, the SME has a day job. Secondly, they aren’t likely to know much about training, never mind eLearning. Some SMEs are trainers, and this can often help, but by and large, SMEs are ordinary folks who just happen to know a lot about a particular subject.

When you think about the average eLearning project, it is often very time pressured and has a fairly narrow scope in terms of the knowledge and skills we wish to impart. That means that, from the get-go, SMEs have to work within a lot of rules when imparting that knowledge. And most of us involved in learning know that once you put someone under pressure with lots of rules and caveats, it can stifle communication.

Often SMEs are thrust into the fray because:

  1. No-one else wants to do it
  2. No-one else has time
  3. They didn’t know enough to say no! (or they had no choice)

Understanding where the SME is coming from is a vital first step in building a good relationship. To this end, working along with Dawn and Andrea, we’ve come up with a magnificent seven SME archetypes:

Magnificent 7 SME Archetypes

What we were aiming to do here is to help people think of the SME as someone that we should at least meet half-way. It is important to spend time understanding the SME’s pressures and how to work with them to make imparting their knowledge and insights as easy as possible. We’d argue that it is up to Instructional Designers, along with Project Managers, to manage the eLearning development process in such a way that the SME is set up for success.

Everyone is a SME in this modern age!

At a quick glance, you might think we’ve been a bit harsh with the names of our SME archetypes. But on closer inspection, it is more subtle than that. Who hasn’t felt clueless, or unfocussed or a little bit control freakish from time to time on a project? Remember that feeling, because it will help you to help your SME.

Moreover, in this networked world, we’d argue that getting ahead in the workplace is just as dependent on the knowledge and relationships in your network as it is on your own knowledge. A couple of weeks ago, I was interviewing two SMEs and it struck me that the types of precise, targeted questions I was asking would actually serve me well in most business conversations. Are the skills I use for helping SMEs to impart their knowledge really just good communication and listening skills, and should I apply and practice them more widely?

So join us for #chat2lrn on Thursday September 24th, 8:00 PDT/11:00 EDT/16:00 BST. Can we leverage good SME communication skills to help all of our business relationships?

Internal vs. External

Today’s post comes to us from #chat2lrn crew members, Meg Bertapelle & Holly MacDonald.

Meg is a Senior Instructional Designer of Clinical and Product Education at Intuitive Surgical, a medical device company which makes the da Vinci Surgical System. You can find her on twitter at @megbertapelle

Holly is the owner and principal consultant of Spark + Co, a boutique training company that provides custom training solutions to organizations for employees or customers. You can find her on twitter at @sparkandco  


Meg and Holly were chatting about the differences between internal and external L+D work, and captured some of their observations in this blog post.

What’s the biggest challenge you face as an internal L+D expert?

Meg: I would have to say that we run so lean sometimes, that our team isn’t able to really do our best work under the timelines & sheer number and scope of projects assigned to us. Always having to compromise on the best solution to get an OK solution out the door eventually gets exhausting.

What’s the biggest challenge you face as an external L+D expert?

Holly: Typically the biggest challenge is communication. Working with such a range of clients, some of whom are brand new to e-learning and others who are familiar with it, means that we are constantly having to check assumptions, confirm things and keep those lines of communication open.

How do you deal with analysis as an internal/external?

Meg: Our fall-back position is always surveys and focus-groups, but sometimes the timeline of a given project just doesn’t allow for those methods, and we have to try to extrapolate information about the need from internal folks that work closely with the true audience. Our company just recently created a data analytics group that will work cross-functionally to gather what data we can directly from our products, and will advise on other ways to incorporate data gathering as learning experiences are designed and revised. I’m very excited about this because we might actually get real (not anecdotal) information about the gaps in our current materials and processes.

Holly: I think it’s easier as an external to do analysis, since you need to get information about the client and the learning need before moving ahead. I think as an external, you get more latitude to do an analysis. That being said, sometimes you find out that the problem is not a training one, and those are not conversations the client always wants to have. But if training won’t fix the problem, then they need to know.

What design challenges do you face as an internal/external?

Meg: Usually time is my biggest challenge here. I would LOVE to be able to design tons of scenario-based practice activities; link directly to resources; provide everything our learners need in an easily-accessible, SINGLE place; and provide just-in-time and performance support for a truly flexible and end-to-end solution to all of our challenges. It just ends up being impossible while also keeping up with the project load on our team.

Another big challenge for us is that in order to meet deadlines, especially for product-related training materials, we have to split up design & development work between team members, and then struggle with the lack of consistency in the end result.

Like Holly, we have to adhere somewhat to the company brand guidelines, but thankfully (!!) more of the general feeling rather than the “letter of the law.”

Holly: One is the constraints of “brand guidelines”, where the client’s marketing team has decided to apply branding rules to elearning. This can really mean that you aren’t able to get as creative as you’d like. I usually try to find out if there’s a way to adapt the brand guidelines to elearning. To be honest, if not, then I’d actually consider walking away. If the branding overshadows the need to learn, it can actually be an indicator of an organization that doesn’t really value learning.

The other common constraint is that the budget is not big enough to get custom design assets, so you head into your digital closet to see what you’ve collected and stockpiled over the years to use on the project. One other aspect that I’ve found challenging is to source great designers who get instructional design and/or elearning. I have found a few who kind of get it, but there is sometimes a tension around which designer knows best.

What implementation challenges exist as an internal/external?

Meg: Managing the different permutations of products released where & when – what system, what software version, where is it cleared, where is it launched, in what language… (did you hear that? it was my head exploding)

Holly: The LMS. That’s the biggest challenge we’ve faced with the implementation. Some clients engage us to work on their launch plan with them, but sometimes we hand it off to the LMS Administrator or IT department and that’s the end of it.

What do you wish you could do that you can’t as internal/external?

Meg: I think I would love to be able to say “no” to a project that I just don’t want to do. LOL :) Honestly, since my biggest constraint is usually time, and I imagine that’s not that different in an external role, I’m not sure what else to wish for! Hopefully some of you in the chat will give me some good ideas that I can try to make happen internally ;)

Holly: I have been an internal before and I think the thing I miss the most is the ability to modify the program once it’s launched, or having a more flexible role to extend the program. As an external, you live and die by your scope and once the program is launched, it’s gone. We’re very lucky to have long term relationships with our clients, so we do get to do some of that with them. But, for some it’s a budget decision.

What do you think you could teach internal/external?

Meg: I have a lot of “tribal knowledge” of our business, so I think I could help an external person come up with a solution that would fit our organization, and make a business case for it. Sometimes the things that matter to the organization are not as visible to someone external.

Holly: After doing this for so long with many different clients, I think the thing I’ve really mastered is how to understand a client’s business quickly. I get to use my “ignorance card” constantly, and coming at things from a perspective where you know little or nothing gives you a unique point of view. I have one client who often says things like: “I love how you make us think about things in ways we haven’t thought before.” When you are internal, it’s much harder to maintain that perspective. You need to find ways to do that consciously, otherwise you just end up making assumptions.

What about you? What have you found to be the benefits and challenges of being either an internal or external learning expert?

Let’s discuss during #chat2lrn on Sep. 10th, 8:00 PDT/11:00 EDT/16:00 BST

Hope to see you there!

Making Social Learning Happen!

This week’s post comes from #chat2lrn crew member, Judith Christian-Carter. Judith is a Director of Effective Learning Solutions, a UK-based learning services company. You can find her on Twitter @JudithELS

What’s happened to social learning?

Maybe it’s just me but, after several years of hearing and seeing the term ‘Social Learning’ on an almost daily basis, it now seems to have faded out of frequent use. Forget Bandura’s social learning theory for one moment, as that’s not what we’re talking about here. Neither is the fact that social learning is an inherent human condition. No, what we’re talking about here is the way that learning and working is happening, or should be happening, in organisations.

It was back in 2011 when Jane Hart really pushed the whole idea of Social Learning into the limelight with the publication of the first edition of her extremely well-received Handbook. Back then, Jane, and others, were talking about and demonstrating how the social media tools of the day could, and should, emancipate people to become workplace learners. It was these tools that shaped social learning in the second decade of the 21st Century.

4 years on …

Not only do all these tools still exist, but they have also been improved, made more user-friendly, grown in number and, even more importantly, are now used by more people than ever before. Just compare how many people are using tools like Twitter, Facebook, Skype, YouTube and Pinterest today with 4 years ago, and what they are using them for. Are people learning through the use of these tools? You bet they are! Are they using them even more for workplace and social learning, and if not, why not? Well, on that one the jury is out, but it’s about to be called back in!

The #chat2lrn jury

“Social Learning is not something you just talk about or read about, it’s something you do!” (Jane Hart, 2011). So, as a member of the jury, is social learning happening in your world or not? If it is, what’s making it happen? If it’s not happening, what will it take to make it happen?

Join the jury and discuss these and other questions on 27th August 2015.



The Learning Trap: Why Satisfied Learners and Knowledge Retention Are Worthless

Ajay is a Chartered Professional Accountant and a Certified Training and Development Professional, but considers himself a Workforce Revolutionary. He is a three-time published author, with John Wiley & Sons recently publishing his third book, “The Trainer’s Balanced Scorecard: A Complete Resource for Linking Learning and Growth to Organizational Strategy”. Training Magazine recognized his company, CentralKnowledge, with its 2008 Project of the Year award for their work with Apple Inc. He is also a multi-award-winning writer, receiving the prestigious 2014 and 2015 Readership and Editors’ Award for Editor’s Choice and the Top 10 most-read articles. Ajay regularly appears on the #1 Montreal Talk Radio morning show discussing workforce performance issues.


Learning practitioners are taught early, or dare I say brainwashed, to believe in the ‘essential’ four levels of evaluation. Many of us refer to these levels as Kirkpatrick’s Evaluation Model, and it has been a cornerstone of every learning event as well as a foundation for many of the evaluation models that followed.

But let’s be honest, the unspoken truth is that the Kirkpatrick model is flawed. Yes, I dare say it out loud and may the learning gods, and some of my peers, strike me down. While you pick your jaw up off the floor, the fact is that the evaluation method has some apparent issues. While the Kirkpatrick organization will not admit to this publicly (naturally, since it is the foundation of their revenue stream), they are attempting to ‘adjust’ it accordingly by repackaging it as the ‘New World Kirkpatrick’. This reminds us of an ‘All in the Family’ episode where Archie and Meathead, looking at a product labelled ‘new and improved’, ask what was wrong with the original one: was it old and lousy?

But I digress. Let’s review the four levels. Level one refers to learning satisfaction. Simply put, this is what learning practitioners refer to as the ‘smile sheet’. This learner feedback process asks everything from whether the learning met your needs to whether the lunch was adequate.

Level two speaks to learning retention or, simply put, do you remember what you are supposed to remember? Often this is assessed through some form of ‘testing’. While this is what many practitioners accept as learning success, the Kirkpatrick model assumes that if the learner remembers the knowledge, they will naturally apply it to their job. I’ll revisit this logic shortly.

Level three is about changing the learner’s behavior or, in layperson terms, skills application. This level is the first ‘holy grail’ for learning practitioners. The logic is that if the learner retains the knowledge from the initial learning process, their behavior will change and they will become more effective in their job. This sounds reasonable and correlates to level four.

Finally, achieving level four is, for learning practitioners, similar to winning the Super Bowl. This level refers to the learning effort having an impact on business and performance objectives. What the Kirkpatrick model implies is that if learning practitioners are able to connect their efforts to this level, they will gain the admiration of their business leaders. Essentially, this is the promise of demonstrating tangible results for your learning budget.

Now, the Kirkpatrick methodology sounds logical and simple enough for learning practitioners to buy into the process, but dig deeper and you will discover issues that undermine learning efforts.

To accept the premise of this post, you must first accept that the role of learning in any organization is that of an internal business unit. Just like every other internal business activity, whether it is accounting, marketing or HR, learning is held accountable to specific performance expectations, both for itself and for how it contributes to organizational results. You don’t have to accept this premise. But if you don’t, then you should also not question why your training budget gets reduced every year.

By accepting the reality that your learning efforts are part of the business, and ultimately affect the business (hopefully positively), you begin to see learning from the perspective of your business leaders and business unit managers.

With that said, for any business, level one and level two are essentially irrelevant. Think about it. Why would leaders care whether their employees like the learning event (level one)? It has no bearing on the business or expected results. Level one smile sheets exist for learning practitioners to prove that they are actually doing something, which helps them avoid getting fired from their job.

Every learning practitioner has done this at least once. They wave their smile sheet results to their leaders hoping that this will validate their efforts, similar to a child seeking the admiration of their parent and trying to get their work put on the family refrigerator.

Don’t believe that level two is any better. Like level one, your leaders couldn’t care less whether employees can actually remember any of the skills they learned. As with the smile sheet, learning practitioners are quick to wave their successful ‘test’ results in their leaders’ faces. The problem with level two ‘learning retention’ is that, more often than not, the results are inaccurate or invalid. Why? Essentially, practitioners ‘game’ results in their favor, the knowledge tested is often irrelevant to changing learner behavior, or worse, the skills tested are not applicable to the job. Whatever the reason, the practitioner’s goal is a futile attempt to prove to leaders that their efforts are close to being effective.

Level two is as irrelevant for the business as level one. What your leaders expect is that employees actually apply the skills on the job. Their logic, which many practitioners ignore, is that if an employee is applying a new skill or knowledge that improves their performance, it will consequently improve the organization’s performance.

Fundamentally, leaders are concerned solely with levels three and four. In reality, this is all you should be concerned about as well. Regretfully for the Kirkpatrick model, there are still concerns that practitioners must be made aware of. Even Kirkpatrick found flaws and hence developed a ‘new world model’, but let’s not get into that now.

At level three, the need to change behavior is not as relevant as the need for leaders to see the actual application of knowledge and skills. As any qualified psychologist will tell you, changing human behavior is something that happens consistently over time, not something any type of training effort can accomplish successfully.

Simply put, your leaders see level three evaluations as the vehicle to meet pre-established performance metrics, not necessarily to change employee behavior. The question we are asked by practitioners is, “How do we connect to level three expectations?” The answer is quite simple. First, don’t create new learning measures to prove your efforts are effective. Your leaders and business unit managers have their performance metrics already set. All you need to do is partner with the business units, learn about their performance expectations, and then proactively work with them to conduct a needs assessment to determine the required skills that will help contribute to achieving their performance metrics.

Finally, level four is what every practitioner strives to achieve. Keep in mind that while level four is what your leaders expect, they don’t expect every training effort to meet it. And for those initiatives that must achieve level four expectations, you are not alone in your effort. Your leaders don’t expect learning to be the sole hero. Recognize that, when attempting to impact business results, you need to take into account the involvement of other internal activities.

Your leaders will never believe that your ‘level 4’ achievement is solely the result of your learning solution. It is a cross-functional effort involving many internal business processes. So take credit when it is due, but also give credit to those who deserve it. This will build your business impact credibility and ensure sustainable leadership support for learning.

Finally, never, ever go to your leaders and refer to the Kirkpatrick four levels. They won’t understand what you are talking about and frankly don’t care about your evaluation methods. Just sayin’.

Join #chat2lrn to share your views and thoughts on “The Learning Trap” Thursday 13 August 8.00 PDT/11.00 EDT/16.00 BST

Total Cost of Ownership – What is the ‘real cost’ of a learning intervention?

This week’s post is written by Lesley Price (@lesleywprice). Lesley is a co-founder of the #chat2lrn crew and now, although supposedly ‘semi-retired’, she works part-time for Learn Appeal and continues to love challenging and being challenged! Lesley is Scottish, and the Scots have a reputation for being ‘canny’ with money…so her challenge to you is: does the Total Cost of Ownership of a learning intervention really matter?

When we buy a car, some folks only look at the purchase price, while others may also consider obvious running costs, e.g. insurance, road tax and petrol consumption. Some may take into account the cost of servicing and replacement parts, but I wonder how many factor in depreciation and the number of years we expect to keep the car before replacing it. I have yet to meet anyone who does all of this, puts the information onto a spreadsheet and then calculates the cost of having the car over a number of years. If we carried out this exercise prior to purchase, would we be able to work out which car would offer us the best value for money and the optimum time to replace it? Logic would say yes, as we would then know the total cost of ownership.

So what has buying a car to do with learning? I would suggest that, as the iceberg picture says, ‘what we see often is only a fractional part of what really is’. So the question I ask is, ‘what is the real cost of a learning intervention?’

All too often we only consider the cost of the course itself or the purchase cost/license fees of either an LMS and/or a content authoring tool, but what about the other ‘hidden’ costs? Do we even know what these are?

When we consider face-to-face training, these are relatively easy to calculate, or are they? If we send somebody on a course that is held elsewhere, there is generally a flat fee, but do we include the cost of the attendee’s time? We are told on a regular basis that time = money, so if we expect colleagues to disseminate what they have learned during the course, how much does that cost both in terms of their time and the time of others who are learning from them?

If face-to-face training is ‘in house’, what is the cost? Should we include the trainer’s delivery time, the time the trainer has spent creating learning materials, the time of all those who attend the course, and the cost of the space used for training on the premises, or might the training involve room hire?

This becomes even more complex when we move into elearning. Yes, we think about the number of licenses we need, but do we consider whether we will need more IT equipment? Most people would say ‘yes, of course we do’, but if the system needs a dedicated server, what is the cost of IT support for both the software and the equipment?

If we are offering an elearning programme that we are going to create ourselves, how do we put a cost on that? We have to consider the time it will take to create…that’s easy….it’s the cost of an Instructional Designer (ID). Mmmm….. but most IDs refer to subject matter experts (SMEs) to ensure the content is fit for purpose, and that takes up the SME’s time too. And how many SMEs do we need to consult?

The other thing we know about elearning is that we cannot assume that, just because we create and build systems and content, people will use them. So we have to generate interest and awareness, otherwise all the time spent creating the elearning content will be wasted. But that also takes time and, to reiterate, time = money!

Let’s not forget that to implement a new system, we will need the support of the senior management team (SMT). How many of us factor into the cost of the intervention the number of meetings we have attended, the on-going conversations and the reports we have written to get SMT buy-in?

Ooohhh, and let’s not forget all the conversations we have ‘out of hours’ with colleagues, and the pondering we do ‘in our heads’ about whether or not the learning intervention we feel passionately about will make a difference.
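To make the arithmetic concrete, here is a minimal sketch of how these cost categories might be pulled together into a single figure. Every category name, hourly rate and quantity below is a hypothetical assumption for illustration only, not a figure from this post.

```python
# A minimal total-cost-of-ownership sketch for a learning intervention.
# All categories, hours and rates are hypothetical illustrations.

HOURLY_RATE = 40   # assumed fully loaded cost of one hour of staff time
LEARNERS = 200     # assumed audience size

costs = {
    "Authoring tool licence (per year)": 3_000,
    "LMS licence (per year)": 8_000,
    "IT support and hosting (per year)": 2_500,
    "Instructional Designer time (120 hrs)": 120 * HOURLY_RATE,
    "SME time (2 SMEs x 20 hrs)": 2 * 20 * HOURLY_RATE,
    "Awareness and launch communications (15 hrs)": 15 * HOURLY_RATE,
    "SMT meetings and reports (10 hrs)": 10 * HOURLY_RATE,
    "Learner time (200 learners x 1 hr)": LEARNERS * 1 * HOURLY_RATE,
}

total = sum(costs.values())

# Print the breakdown, the total, and a per-learner figure that can be
# compared against other interventions (or against doing nothing).
for item, cost in costs.items():
    print(f"{item:48s} {cost:>10,}")
print(f"{'Total cost of ownership':48s} {total:>10,}")
print(f"{'Cost per learner':48s} {total / LEARNERS:>10,.2f}")
```

The point of the exercise is less the precise numbers than making each ‘hidden’ line item visible, and comparable, before deciding whether to proceed.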

I guess the ultimate question is: so what? Does the Total Cost of Ownership of a learning intervention really matter? Do we really need to know the real cost and, if we do, what impact does that have on whether we proceed or not? So many questions and probably even more answers. Join #chat2lrn to share your views and thoughts on Total Cost of Ownership #TCO, Thursday 30 July 8.00 PDT/11.00 EDT/16.00 BST

Is This a Training Problem?

by Patti Shank, PhD

I was very lucky when I was a young training manager and had the opportunity to learn with Geary Rummler. I truly believe that this training greatly helped my performance over the lifetime of my career. It provided a certain way of doing my work. The resource I will share with you provides a brief synopsis of some of the thinking involved, which I hope will intrigue you.

Why Care About This?

Training is an expensive intervention. We only need to provide training for one reason: people need skills they don’t have (or need to upgrade or re-establish their skills), and it makes sense to provide those skills in a formalized way.

When there are problems, such as people being unable to do their jobs because of inadequate tools or not getting enough feedback about whether they are performing as needed (no performance standards), those problems must be fixed, and training won’t solve them.

Example: A manager asks for team training for her staff because they don’t work well together. In reality, she causes problems among them by how she treats them. She favors some over others. She gives more work and overtime to the people she doesn’t like as much. Training might help here, but she is the one who needs it. And before that, she needs coaching about the problems she is causing, so that the training might be valuable to her.

When we get requests (or demands) for training and we don’t determine if training has a good chance of solving the problem (or being part of the solution), we are creating a problem, not solving it.  Why?

  1. We are using resources that could be better put elsewhere.
  2. We are removing people’s time (when they are stuck in training) that they could be using towards better purposes. They could be using that time to get work done.
  3. The problem doesn’t get fixed. (Think of all the resources used to not solve the problem!)
  4. We look foolish and unprofessional, and frankly, this happens too often. Who would hire a carpenter who couldn’t measure or build the right solution?

How Training Doesn’t Work

Example: When someone asks for customer service training, but the team has insufficient tools to answer customer questions or the process requires multiple workarounds, adding customer service training is a misplaced and expensive intervention. The team may need some training (or not), but they DO need better tools and an improved process so customers aren’t angry about being put on indefinite hold or sent to the wrong department.

Carl Binder’s discussion of Gilbert’s Six Boxes is a great introduction to thinking about what we need to do to achieve the type of performance organizations need, and what influences these performance outcomes in the workplace. Read it and think about the part each box plays in your work. If you don’t think it fits in L&D’s world, we’ll have to disagree.

The Six Boxes: