Revolutionize Learning and Development

#Chat2lrn is delighted to have a guest post from Clark Quinn. Clark Quinn, Ph.D., is a recognized leader in learning technology strategy, helping organizations take advantage of information systems to meet learning, knowledge, and performance needs. His approach is learning experience design, combining what we know about how people think and learn with an understanding of technology capabilities to meet real needs. He combines a deep background in cognitive science, a rich understanding of computer and network capabilities reinforced through practical application, considerable management experience, and a track record of strategic vision and successful innovations. He is a well-regarded speaker and author on human performance technology directions. You can follow Clark on Twitter: @Quinnovator. See more of Clark’s views on this subject in his book Revolutionize Learning & Development.

Is Learning & Development achieving what it could and should? The evidence says no. Surveys show that most L&D groups acknowledge they are not helping their organizations achieve their goals. It’s worse than the cobbler’s children: the cobbler at least got others shod, but here we’re not getting ourselves effective enough to help anyone else. Where are we falling short?

My short answer is that we’re trying to use industrial-age methods in an information age. Back then, one person thought for many, and our training was to get people able to do rote tasks. There wasn’t a real need for knowledge work, and we were happy to filter out those who couldn’t succeed under those conditions. Today, knowledge work is most of what contributes to organizational success, and we want to foster it across the organization.

To succeed, there are a few things we need to get into alignment. The simple fact is that much of what is known about how we think, work, and learn isn’t being accounted for in L&D practices. We use courses to put information into the head, but there’s clear evidence that our thinking is distributed across information in the world. It’s also hard to get information into the head. So we should be focusing on putting as much information into the world as we can. We also now know that the way to get the best outcomes is to get people to work together, and that silos and hierarchies interfere. If we want the best outcomes, we want to facilitate people working and playing well together. Finally, we know that learning should involve models to guide performance, be emotionally engaging, and have sufficient, appropriate, and spaced practice. All of this is antithetical to so-called rapid elearning.

Underpinning this is the fact that we’re measuring the wrong things. We’re out of alignment with what the business needs; when we’re measuring how much it costs per seat per hour, we’re worrying about efficiency, and we’re paying no attention to effectiveness. It’s taken as a matter of faith that ‘if we build it, it is good’, and that’s empirically wrong.

Quite simply, we need a revolution: a fundamental shift in what we value and what we do. It’s not redefining what we do completely; courses, for example, are still a viable tool, but they’re just one part of a bigger picture. There are two things organizations need: optimal execution of those things they know they need to be able to do, and continual innovation to adapt to an increasingly complex environment. Courses are only a part of the first, and essentially irrelevant to the second. For one thing, we need to incorporate performance support, and sponsoring innovation is about facilitating communication and collaboration. That comes from using social media (all of it, not just technology) in appropriate ways.

The upside is big. We can, and should, be the key to organizational outcomes. We should be designing and fostering a performance ecosystem where people can work in powerful ways. We should be shaping culture to get a workforce that is motivated and effective. If we do so, we’re as fundamental to organizational success as anything in the business. I suggest that this is an achievable goal and emphasize that it’s a desirable goal.

To get there, you need to ‘think different’. You need to shift from thinking about learning and training, and start thinking about performance. You need to take development to mean facilitation. L&D should be Performance & Development, or even Performance and Innovation. That’s the promise, and the opportunity. Are you ready to join the revolution? Your organization needs it.

Let’s discuss in #chat2lrn this week.  See you on Thursday, May 7th 8:00 am PDT / 11:00 am EDT / 4:00 pm BST.

Design Shortcuts for Surviving in the Real World

When we study design, we learn rigorous methods based upon sound research and elegant theory.  Then we hit the real world and are faced with deadlines, limited resources, and unrealistic demands.  How do we cope?

We generally choose some design shortcuts, heuristics, that give us what we believe to be suitable approximations to what we’d prefer to do in a perfect world.  These heuristics, experience-based solutions that may not be optimal but are often good enough to get the work completed, are often unexamined.

Our major steps in designing learning, whether ADDIE or SAM, still require (or should require) determining meaningful objectives, creating essential practice, providing conceptual guidance and supporting examples, and creating a learning experience.  However, we might not do the full cognitive task analysis, simulation-based practice, model-based concepts, story-based examples, and so on.  Some of our shortcuts are well-founded, yet some might preclude us from exploring more effective alternatives.  Either way, we need to be conscious of the tradeoffs.

For example, rapid e-learning tools make it easy to capture knowledge, present it, and test it.  Yet how often is knowledge the real barrier to success in performance?  Most of the research suggests that, instead, the emphasis should be on the ability to apply knowledge, not just recite it.  Knowledge alone isn’t sufficient for the ability to use the knowledge to meet workplace needs.  Do we find effective ways to even use these tools or are we just putting content on pages?

We need to be conscious of the shortcuts we take and the tradeoffs they entail, and reflect from time to time on how our practice compares with where it could, and should, be.  What are the shortcuts we’re taking, and what are the assumptions they encompass?

This post was written by Clark Quinn, who is directing this week’s #chat2lrn tweetchat. Thank you, Clark, for your contribution!

How Do We Do It? Crafting Decision-Making Practice in eLearning

Today’s post comes to us from #chat2lrn crew member, Meg Bertapelle. Meg is a Senior Instructional Designer for Clinical & Product Education at Intuitive Surgical. You can find her on twitter at @megbertapelle.

I think most of us would be happy to build more scenarios and decision-making practice activities into our elearning projects (time permitting) – if we knew up-front how to plan and execute them. Sometimes the hardest part is knowing where to start.

This week, I’ve asked a couple of our #chat2lrn community members to share their experiences crafting decision-making practice elearning activities. Fiona Quigley (@FionaQuigs), one of our #chat2lrn crew,  is Head of Learning Innovation at Logicearth, an Irish learning services company with a global client base. They specialise in the production of modern multi-device elearning content, learning technologies and training support services. Laura Payette (@ljwp) now works at Nielsen, but is coming off a three-year stint designing and developing elearning and corresponding product/marketing communication for KPA, a dealer services and internet marketing provider for over 5,000 automotive, truck, and equipment dealerships and service companies across the US. Her DOT Hazardous Materials course won the National Excellence in Training Award from the Automotive Training Managers Council in 2013.

We’ll get some great information from them in the form of an interview, then we can all discuss the pros & cons, and crowd-source some suggestions during the chat on Aug. 7th. I hope this will help us all take the initiative to help our audiences start applying their new knowledge & skills right away.


Q1: What kinds of decision-making practice activities have you been able to incorporate in your elearning projects?

Fiona: I’ve designed a lot of content for health and social care professionals, especially in the area of communication skills and policy compliance. I have designed scenarios that present a typical patient interaction and then ask the question – what would you do or say next? That would be what I call a level 1 decision-making scenario.

Higher-level scenarios are more immersive and, instead of leading a learner down a pre-planned path, include random events that differ each time you ‘run’ the scenario. I’ve used these higher-level scenarios with nurses and pharmacists in areas such as medication management (reducing errors) and also for dealing with patient complaints.
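Under the hood, a branching scenario like this can be modeled as a graph of decision points, with random events injected so that each run differs. The sketch below is a hypothetical illustration (the node names, prompts, and events are invented for this example, not taken from any specific authoring tool or course):

```python
import random

# A minimal sketch of a branching scenario with random events.
# Each node offers choices that lead to other nodes; a random
# interruption may be injected at every step, so no two runs
# of the scenario are identical.

SCENARIO = {
    "start": {
        "prompt": "A patient asks about their medication. What do you do next?",
        "choices": {"check the chart": "verify", "answer from memory": "error"},
    },
    "verify": {
        "prompt": "The chart shows a recent dosage change. What now?",
        "choices": {"confirm with pharmacist": "safe_end", "proceed": "error"},
    },
}

RANDOM_EVENTS = [
    "A family member interrupts with a question.",
    "A colleague asks you to cover another ward.",
    None,  # no interruption this time
]

def run_step(node_id, choice):
    """Return the next node id for a choice, plus any random event."""
    node = SCENARIO[node_id]
    event = random.choice(RANDOM_EVENTS)
    return node["choices"][choice], event

next_node, event = run_step("start", "check the chart")
print(next_node)  # prints: verify
```

Real scenario engines add state (score, history, consequences), but even this small structure shows why higher-level scenarios take more planning: every choice and event combination has to lead somewhere sensible.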

Laura: I spent three years building elearning for the automotive industry, particularly in the areas of environmental safety and compliance. There was a lot of regulatory information that had to be included and I was constantly challenged to find ways to make it relevant and interactive. One of the ways I did that was to inject scenarios into the training. Keep in mind that many of them were on the smaller side. In other words, I didn’t build a course around one big scenario with a million branching options (although that would’ve been so cool!). My content simply didn’t lend itself to that. Instead, I used smaller scenarios and sprinkled them in where they had the most impact.

Q2: What kind of planning steps did you take before beginning to write the activities?

Fiona: You must talk to real people who do the jobs. Observing people making the real decisions is the gold standard, but it is often difficult to get the opportunity to do this. You also need to be careful about whom you choose as the Subject Matter Expert. Often SMEs are senior ‘expert’ people who are very far removed from day-to-day practice. To help people practice real decisions, you must talk to the people who make the everyday decisions.  I also like to structure conversations with SMEs into what I call a ‘DIF’ analysis:

  1. Difficult – what, if anything, do you find difficult about this decision?
  2. Important – what is most important about getting this decision right or wrong?
  3. Frequent – what frequently comes up, e.g. common myths/misunderstandings, good practice?

Often competent practitioners won’t be aware of how they make good decisions. They are unconsciously competent, so it is the job of the ID to turn this tacit knowledge into explicit learning. Once that learning has been made explicit, you can more easily share it with others.

There is also a difference between formal and informal practice. There may be formal rules in place about how someone does their job – but many competent practitioners create shortcuts as they gain experience. Being able to identify these ‘tips and tricks’ is very useful learning in itself.

Finally, I would also advise talking to people at various levels of experience. For example, talking to a novice in the area will help you see the challenges first hand, rather than relying on the recall of someone more senior who may gloss over these challenges.

Laura: Research! Obviously, reviewing content from SMEs and talking to SMEs is critical but, like Fiona said, talking to people on the frontlines — or who at least aren’t far removed from the frontlines — really helps build context for understanding the challenges that employees face in doing their jobs. Sometimes that access can be hard to get; it was for me. If that’s the case, use everything you can to tease it out. Think of yourself like an investigative reporter. In my case, I had access to a robust database with thousands upon thousands of real-life examples that had been logged. I also had access to people who could elaborate on those examples to help fill in the gaps. I relied heavily on them and went back many times to ask additional questions.

Q3: How did you determine appropriate activities that would simulate the real-life application of your learning objectives?

Fiona: Again – much like the answer to Q2, observe decisions being made, find out how people actually make the decisions and base the activities on what actually happens in the workplace – not what SHOULD happen. Too often in elearning, we are forced to idealise and formalise the learning process, which then becomes so far removed from reality that it loses credibility with the target audience! You often see this in elearning content where the scenarios are so easy that you don’t actually need to complete the course to be good at them.

For example, when we designed a Medications Management programme, quite a few of our nurses said that one of the most difficult challenges they had was doing the ward round, handing out medication and being interrupted by patients or family members. They said they needed to concentrate and focus on making sure they gave out the correct medication – often a complex range of drugs for patients with very different medical needs. Another source of concern was worrying about patients who found medications hard to swallow and not having enough time to spend with them to help them. Together, we came up with guidelines about how to resolve these challenges and built in a scenario challenge around this.

Laura: The scenarios that I wrote were usually an outgrowth of the content development process. In other words, I didn’t approach a course with a specific scenario in mind. I generated them organically as I pulled the content together. It becomes apparent in talking to SMEs and frontline employees and in reviewing existing content where the gaps in understanding and practice are. Those gaps were usually the places I chose to insert scenarios because they illustrated the performance issue and allowed employees to think through things by answering the questions. In some cases with my content, there were right and wrong answers (remember, a lot of it was compliance based), but there were also usually shades of gray — and it was in those areas that I was able to challenge employees through scenarios to think about their actions and the ramifications of them.

Q4: How did you evaluate the effectiveness of your activities?

Fiona: I normally ‘dry run’ the decision plan with a selection of the target audience in a focus group setting. It is important to have a range of different people with different levels of experience. Role-playing the scenario, trying it on for size, works well to see if it is a realistic enough representation of the actual day-to-day job. I normally use simple post-it notes to visualise the decision and focus on:

  1. Decisions – what is the actual decision to be made?
  2. Knowledge/Skill – what knowledge or skills do you need to make the decision?
  3. Actions – what specific actions do learners take to make the decision?
  4. Consequences – what are the results of each action, for both good and poor decisions?
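The output of a post-it exercise like this can be captured as one simple record per decision. The sketch below is a hypothetical format (the field names mirror Fiona's four questions above; the sample values are invented for illustration):

```python
from dataclasses import dataclass, field

# A hypothetical record for one decision captured during a dry run,
# following the four focus questions: decision, knowledge/skill,
# actions, and consequences.

@dataclass
class DecisionNote:
    decision: str                                      # the actual decision to be made
    knowledge: list = field(default_factory=list)      # knowledge/skills needed
    actions: list = field(default_factory=list)        # actions learners can take
    consequences: dict = field(default_factory=dict)   # action -> workplace result

note = DecisionNote(
    decision="Which medication to dispense during the ward round",
    knowledge=["drug interactions", "patient history"],
    actions=["check the chart", "dispense from memory"],
    consequences={
        "check the chart": "correct dose, slight delay",
        "dispense from memory": "risk of a medication error",
    },
)
print(len(note.actions))  # prints: 2
```

Keeping the consequences tied to each action makes it easy to later turn the record into scenario feedback that shows what actually happens, rather than a generic “correct/incorrect”.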

Laura: I ran the activities by stakeholders and SMEs, as well as a core group of what I’ll call advisors for lack of a better word. (They were internal employees who interfaced directly with the external employees I built training for.) If they responded by effectively saying, “Oh, that really made me think about things differently,” or, “That really caught my attention,” then I knew I had hit the mark. If they didn’t, or if they were confused by what to do or how to respond, then I knew the scenarios needed more work. I know that’s vague, but there’s really no set recipe for scenario building; it’s very context specific. I also evaluated the activities by looking at actual evaluation responses from employees who took the course once it was deployed.

Q5: What made certain activities more effective/impactful than others?

Fiona: The more realistic the decision and scenario – the closer it is to the learner’s actual workplace activities – the better. Not only does the decision need to be realistic, but so does the consequence. We don’t want to use phrases like “Well done, that is correct”; rather, we need to show what actually happens in the workplace.

We have a challenge in elearning in that we usually have to design for a very generic audience. That means we lose the nuance and subtlety that actually drive high performance. If you look at what drives and helps people to perform at a high level, it is mainly about understanding the subtlety of the communication that goes on around you. It is also about reacting to unexpected happenings – like covering for a co-worker or working when you are understaffed.  We need to make sure we build in this nuance and realism.  To do this well, we perhaps need different types of scenarios to suit different types of people in our target audience. As learning designers, we just can’t go on accepting a one-size-fits-all approach to our learners.

Also, a by-product of this analysis is that you need to be open to the fact that not all the challenges you uncover will be solved by training. For example, for our nurses, we identified that adding a simple “Drugs round in progress” notice to the drugs trolley helped to reduce the interruptions staff faced. Identifying these possible environment or process problems is a great benefit of doing good decision-making analysis. If you explain this to your client upfront, it can also be a great motivator for them to really engage with you.

Laura: Fiona makes some very good points here. I think including real consequences to real situations, and writing them in the parlance your target audience speaks, is key. If you fabricate your scenarios they won’t be authentic and people will dismiss them. They also have to be contextually bound. In other words, you may see a great idea for a scenario somewhere and think, “I’ll put that in my course!” But if you don’t mold it for your audience/content and their specific performance needs, it won’t be a great scenario for what you’re building. I think sometimes the scenarios that are most impactful are those that address gray areas — the places where employees are a little uncomfortable or uncertain — and the places where the biggest performance gaps are.

Q6: Please share your top tips/tricks for crafting decision-making practice activities.

Fiona: I think I have covered most of these in answering the questions above, but to summarise:

  1. Talk to real learners of different levels of experience.
  2. Be aware of the formal way of doing something versus the informal way.
  3. Help your SMEs make their decision making practice more explicit by asking good questions.
  4. Have a range of scenarios to suit different types of people in your target audience.
  5. Dry run your scenario plan with representatives of the target audience and adjust accordingly.
  6. Find out why people are making common mistakes, e.g. is it a process or environment problem rather than a training problem?

Laura: Fiona’s tips are great. The only thing I’d add is be sure to craft your scenarios in the language your target audience speaks so they sound authentic.

Thank you to Fiona and Laura for sharing their insights. What about you? If you have some experiences and insights to share, or just want to hear what others may have to say, please join us Thursday, August 7th for #chat2lrn at 8am PDT, 11am EDT, 4pm BST.

Design thinking: What does it mean to L&D?


It doesn’t matter what learning theory or approach you follow, at some point those involved in the L&D world will have to…well…design something. Even if it’s informal or social, there is some kind of design involved, and there’s no denying that design thinking is a current media darling. It may show up as “UX” or usability/user design, which tends to live in the web development (or digital product development) world, or game design, fashion design, graphic design, behavior design…if you think of all the places that “design” can show up, it’s fairly ubiquitous.

What is design thinking?

“Design thinking is a human-centered approach to innovation that draws from the designer’s toolkit to integrate the needs of people, the possibilities of technology, and the requirements for business success.” —Tim Brown, president and CEO of IDEO, a leader in design thinking. IDEO has even produced a whole “Design Thinking for Educators” site: http://designthinkingforeducators.com/, complete with a free toolkit and helpful articles about design thinking. Their perspective is that design thinking is a mindset.

OR

“…is generally considered the ability to combine empathy for the context of a problem, creativity in the generation of insights and solutions, and rationality to analyze and fit solutions to the context. While design thinking has become part of the popular lexicon in contemporary design and engineering practice, as well as business and management, its broader use in describing a particular style of creative thinking-in-action is having an increasing influence on twenty-first century education across disciplines.” (http://en.wikipedia.org/wiki/Design_thinking)

The d-school at Stanford uses this approach, which is summarized in this graphic:

design thinking process

They share their approach in this PDF:

Sam Burrough (@burrough) presented a great webinar for LPI – which you can watch for free here: https://eseminars.adobeconnect.com/_a827192574/p2p64ap7wco/?launcher=false&fcsContent=true&pbMode=normal

You can see more design thinking ideas, resources and suggestions in the Scoop.it page Sam has curated: http://www.scoop.it/t/big-idea

Learning Design thinking

More perspectives on design thinking

Think about design thinking and how it impacts your work. Come to the chat on Sept 19th and share your thoughts.