4 L&D Wishes for 2015

My wish list for corporate learning in 2015: just four items, all of them L&D maturity indicators.

1. “Blended” fades

No, we don’t stop blending. But we stop talking about it because, slowly but surely, everything has become more or less blended. At design time, all options are considered equally, without a (quite artificial) discussion of technology-mediated versus no-tech solutions preceding design. The use, or rather the non-use, of the term “blended” becomes a sign of maturity in L&D organizations.

If your L&D plan document for 2015 contains many instances of the word “blended”, then your department is behind the curve. Those ahead have embraced technology and design learning solutions through judicious use of the right affordances for the right learning activities and outcomes. There is no need to talk about “blended”, just as we didn’t use a special term when printed matter, whiteboards, overhead projectors, radio or video became part of the L&D toolbox – it’s all part of the toolbox. Do you still “blend” today? If so, you must hurry: there’s a lot of ground for you to cover.

2. The LMS evolves

In the US, male workers hold on average 7 jobs in their first 10 years. In Europe, graduates hold an average of 1.6 jobs in their first three years (“Job Mobility in Europe, Japan and the US”). Still, the typical LMS of many large corporations provides a closed, non-portable experience for learners, with data owned by the company and not by the individual. And while we have witnessed the consumerization of IT reshape how organizations think about technology use, many have not moved an inch when it comes to learning data storage and portability.

Fortunately, the LRS and Tin Can (xAPI) specifications make it possible for learning systems to talk to each other, and to do so with a richer vocabulary than was available under SCORM. My wish is that 2015 marks the inflection point where portability and Tin Can support become high-priority items when considering any LMS upgrade or replacement. We have empowered employees in many ways, but not in how they choose and document their learning. Let’s start to make that happen in 2015.
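To make this concrete, here is a minimal sketch of what a Tin Can (xAPI) statement looks like: the actor-verb-object unit of data an LRS stores and exchanges. The learner name, mailbox and activity URI below are illustrative placeholders, not from any real system.

```python
import json

# A minimal Tin Can (xAPI) statement: "actor verb object".
# The learner and activity identifiers are hypothetical examples.
statement = {
    "actor": {
        "name": "Jane Doe",
        "mbox": "mailto:jane.doe@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.com/courses/onboarding-101",
        "definition": {"name": {"en-US": "Onboarding 101"}},
    },
}

# Because statements are plain JSON, any LRS that speaks the
# specification can store and return them - which is what makes
# learning records portable across systems, unlike SCORM data
# locked inside a single LMS.
print(json.dumps(statement, indent=2))
```

Contrast this with a SCORM completion record, which lives and dies inside one LMS: the statement above can follow the learner wherever they go.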

3. The tech dust settles

It’s been a great year for designers and trainers who are not bound by strict rules on the type of technology they can use in the workplace. It has been, however, a year of experimentation more than fully developed applications. The app range supporting learning and training activities is vast, but fragmented, and in some cases not far from basic proof-of-concept builds.

My wish for 2015 is that it will mark the year where new patterns for L&D app development will start to emerge, with more focus on affordances that support specific learning activities and goals, on good integration paths, and less on technology itself. Only the disciplined developers who offer cross-platform solutions coupled with solid support will see their projects move on to 2016 with a substantial customer base.

4. We let MOOCs be

No more MOOC-derived acronyms please. Let’s leave MOOCs where they belong: a connectivist, aggregated, open learning space. Sorry, Coursera and Udemy, you are not it (last time I checked you don’t look like this), and no acronym rehash will ever bring you closer to a true MOOC. The original design is an inspiration, a building block – let’s thank Downes, Siemens, Cormier et al. for their contribution, and let’s use that building block to create new pieces that fit the corporate and commercial contexts. But please let’s stop pretending that a paid-for, canned piece of self-paced individual elearning has anything to do with MOOCs.

In 2015, L&D teams stop feeling the urgency to do “something about MOOCs”. First, because they have truly understood what they are, how they work and what aspects can help learning and development in organizations. Second, because they understand that corporate learning and the MOOC concept are partly incompatible, and there is no point in mixing water with oil, or calling oil by another name so it doesn’t look like you are trying to mix the unmixable. Social learning and personal learning paths you say? Now that sounds like being on the right track. Add an LRS to that please.

What do you think?

Are these part of your L&D wishes for 2015? Do you disagree with some of my “predictions”? What would you add? What would you change?

Hacking course surveys

Course surveys: Is this cheating?

If you had a simple, honest system to sway course result surveys by as much as 20% to your advantage, would you use it? What would this mean to the reliability of course surveys filled out by learners after ILT, webinars and elearning deliveries?

The setting

Imagine your work environment allowed you to deliver the same learning solutions both in person and as distance offerings. Imagine part of the evidence used for measuring results is the “exit survey”, the one learners complete at the end of the offering. Imagine it contains several Kirkpatrick L1 questions on a 5-point Likert scale. Sound familiar?

I have observed that opinions about exit surveys are somewhat polarized. Some businesses believe they are useless and focus on results instead. For others, an opinion that perhaps gravitates closer to line managers, the exit survey is everything.

I’ve had the opportunity to experiment and see what evidence I could bring to the business to reaffirm or discredit exit surveys. So I created several experiments, and this is one of them.

The experiment

Taking advantage of a newly scheduled batch of offerings, I decided to focus on one specific question within the survey: “Will you be able to apply what you learned back in your job?”. Yes, this question carries an assumption, and as such it was probably not the most scientific choice for an experiment. I chose it because, if I was able to influence answers, the result would be more impactful than for others such as “Did you enjoy the course?”.

Next, I planted the experimental piece in half of the scheduled offerings. I’m not talking about a design change, just a small addition: at certain points during the course, mostly after discussions, exercises or section recaps, I would insert a comment such as: “This is something you will be applying during your day to day work” or “…and this is why it’s so relevant to our jobs”, etc. Only one sentence, only after section summaries and brief recaps, always linked to the wording of the survey question.

Then I delivered the two versions of the course several times, both in their face-to-face and over-the-wire flavors. The number of deliveries was, again, too limited for this to count as a proper scientific experiment. However, the results were surprising.

The results

While the survey results were largely consistent across all versions, my experiment question had improved by one full notch relative to the control group. That’s a 20% improvement on a 5-point scale. What’s shocking is that I didn’t change the course content or design, and that the delivery method (online or face to face) was immaterial. All I did was make a small change in the experience of the course, and that change mattered. I believe we deliver experiences rather than content, and this is why I like to conduct small experiments like this one, where content remains largely unchanged.

Do you think these experiments are “cheating”? Have you tried something similar? Are we really hacking course surveys? If you obtained similar results, would that change the way you think about measuring learning effectiveness?

3 reasons to manage elearning terminology

No, by “elearning terminology” I don’t mean the glossary that may appear at the end of some online courses. Elearning terminology is about the language used within learning solutions, and also across them if, like most elearning designers, you manage more than one project for the same customer.

Elearning terminology is the common vocabulary you use to describe things and actions in your elearning solution. Click, create, canvas, adhesive, versioning software. Any “thing” or “action” that is relevant to the learning task you are supporting becomes a term.

Why terminology?

Why is elearning terminology important? Here are three reasons to consider a good terminology strategy.

  1. Learning effectiveness. Your elearning solution is a small piece of a bigger puzzle in which tools, processes and things already have their names. To provide the most effective learning solution, your elearning must use exactly the same words (terminology) that are used in the workplace. The effectiveness of your solution also depends on how seamlessly it integrates with other existing materials such as manuals, workplace signs, compliance standards documentation, manufacturer datasheets, etc.
  2. Usability. Learners must deal with many platforms, software and devices during their day. Make their lives (and their learning) easier by creating usable elearning solutions that use exactly the same terminology they are used to seeing and reading elsewhere. You may think it doesn’t matter much whether we click, push, touch or press a screen button, but in fact the choice of terms has potentially huge consequences for elearning usability. Make sure you choose the right term and then stick to it throughout your solution.
  3. Translation. If you ever plan to translate your elearning solution, a solid terminology base, strictly adhered to, will go a long way toward minimizing translation costs. The inconsistent use of names and verbs in an elearning solution makes it incredibly hard to translate while keeping a minimum level of effectiveness and usability. Well-managed terminology means cheaper translations.

Getting started

So how do you manage terminology? Here are three tips to get you started:

  1. Create a simple glossary that contains the key names and verbs used in your elearning solution. A simple glossary in spreadsheet format will do. What goes in will depend on the subject matter, but also on the industry, the hardware and software platform used, and any existing documentation already in use in the workplace.
  2. For each entry, enter a brief description of when to use it, and also any other terms that you should avoid in favor of the chosen term. For example, if you decided that on-screen buttons on your platform are “clicked”, then also add a note saying that the verb “push” is not acceptable, but has to be replaced by “click” instead.
  3. Before each major milestone involving text (audio scripts going to recording, text going to prototype, etc.) make sure you perform a search (& replace, where appropriate) to wipe out any non-compliant terms.
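The compliance check in step 3 can be sketched in a few lines. The glossary entries below are made-up examples; in practice they would come from your glossary spreadsheet.

```python
import re

# Deprecated term -> preferred term. Illustrative entries only;
# a real project would load these from the glossary spreadsheet.
glossary = {
    "push": "click",
    "screen button": "button",
    "log on": "sign in",
}

def find_noncompliant(text: str) -> list[tuple[str, str]]:
    """Return (found_term, preferred_term) pairs for each violation."""
    hits = []
    for bad, good in glossary.items():
        # Whole-word, case-insensitive match, so "pushed" is not
        # flagged as a use of "push".
        if re.search(rf"\b{re.escape(bad)}\b", text, flags=re.IGNORECASE):
            hits.append((bad, good))
    return hits

draft = "Push the OK screen button to continue."
for bad, good in find_noncompliant(draft):
    print(f'Replace "{bad}" with "{good}"')
```

Run against audio scripts or prototype text before each milestone, this gives you the list of terms to search and replace.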

Of course, there are many advanced tools for managing terminology more efficiently, but for small projects, or where you are not likely to handle more than 100 terms, these simple steps will ensure your elearning terminology management improves learning effectiveness, usability and any potential translation work.

You will also have another strong argument to be involved in the next elearning project for this customer: you already know the terminology well and have the documentation to prove it; it will be easier for you than for competitors to apply it consistently in future learning solutions.

You, the elearning storyteller

Dear Jane,

I am going to be blunt: No, we don’t need a storytelling workshop. No, I don’t think we need to review our solution portfolio to include storytelling.

It’s not that I don’t believe in storytelling. I get it. But frankly, I think we are all well beyond the basics of writing a story. In fact, I believe we are accomplished storytellers.

But we are not that good at reminding ourselves of the things we do well. Or perhaps the hype that surrounds some topics drags us into self-doubt. In hindsight, making so much fuss about that Forbes article a couple of years back was a mistake – it turned us blind to our own storytelling abilities.

So it’s almost insulting, yet vital that I remind you of your own portfolio. No, not the work you do today: I’m talking about the elearning work you were doing back in 2012, before you read that article. Here’s what you will find:

Case studies

Remember those compliance courses where you used case studies with narrative? Rather than explaining how to be compliant, you showed actual cases of good and bad compliance behaviors and results. Then, you let learners explore what makes them compliant. That, Jane, is storytelling. Yes, that was back in 2012.

Personal accounts

In solutions ranging from onboarding to change management, I see good elearning examples where you used videos of senior leaders and executives, capturing values and humbling, inspiring experiences. Candid stories told in the first person. That was pre-2012 storytelling too.

Vicarious learning

Although some people may not see it that way, I believe that learning by watching others learn is also telling a story. Their mistakes, their wins, all captured in a reality show of sorts where personal experiences, as learners discover new skills, become the narrative. More stories, your stories.

Past events

One of your favorites: in your brainstorming and idea generation workshop, you explain how 3M’s Post-It notes were invented. You use a story, the story of Arthur Fry.

Dramatizations

Turning to your Health and Safety portfolio, how could you forget the hassle of finding “actors” for the manufacturing plant accidents, or even better, getting permission to cause the “accidents” in front of a camera? To us, those videos were artefacts in an accident prevention strategy. For the learners… they were stories.

All these stories are the result of planting the right foundations: using personas and scenarios. Personas help us connect with learners as we work through the elearning design, and they make storytelling much easier and more authentic.

I hope it makes sense now when I say you don’t need a storytelling workshop. In fact, you are in a position to deliver one. After all you, Jane, are an accomplished elearning storyteller. So please, please stop reading posts about storytelling. You should be writing one.

Thanks Jane. See you on Monday 🙂
Antonio

Writing Learning Objectives

Good learning objectives capture the essence of a learning solution. From the results of a gap analysis to how the solution will be measured, they are a snapshot of what the ADDIE cycle will look like. They have multiple audiences: business owners, designers and learners must all find the right answers in every learning objective.

In addition to Bloom’s cognitive domain map and some sample verbs, I’m adding two important checkpoints to the process of writing learning objectives. First, ensure that it is clear “what” will be learned. I often find that a great deal of content is in fact reference material or a job aid, not necessarily something that must be learned. It is also important to state “why” learning it is relevant to the learners and to the business.

The second checkpoint ensures that we will be able to measure results by describing specific, observable and measurable goals. Being specific means carefully avoiding verbs such as “consider”, “be familiar” or “see”. They are ambiguous and hard if not impossible to assess.

Should every learning objective follow these rules? I tend to break the rule on occasion, leaving space for humor, participation and learners who want to go beyond the scope of the solution. Still, I keep this handy reference present whenever I’m writing learning objectives.


How to say “no” to unnecessary training

Sooner or later, your L&D department will receive a request to create a training solution that you know is not necessary. You know it won’t address the root cause of the performance problem, and you would be adding an unnecessary training item to your catalog.

So how do you say “no” to unnecessary training in a way that doesn’t hurt your relationship with the business? Can you say “no” and at the same time initiate a productive conversation that points to the source of the problem?

There are many approaches to achieve that, but in this post I am going to share a small downloadable template that has worked quite well for me in some contexts. Borrowing a page from Human Performance Technology principles, it explores in a systematic way the factors that may be affecting performance.

30 Minutes

But wait… Human Performance Technology… that sounds like a long consulting process, right?

Well, no. I learned my lesson and know that many teams are too fast and nimble for a more conventional consulting process. I have always completed this template with the business leader who requested the new training in about 30 minutes, and in most cases it was an eye-opener for them, both in offering a solid, well-justified “No” and in pointing to the right areas to tackle performance.

There is just one rule for filling out the template: all the data must be provided by the business owners. Your role is to facilitate and clarify the use of the template, not to contribute data. In practice, I always keep control of the file by projecting it on a large screen and typing all the information. Business leaders can then focus on reflecting so they provide an accurate account of the system they are trying to improve.

How does it work?

The template guides you through a review of environmental and individual factors that affect performance. The goal is to mentally review each of the areas and assign a score. It doesn’t have to be very accurate.

In fact, if you have more than one business owner at the table, expect some disagreement. In those cases, do not try to drive consensus; instead, simply enter the average. Remember, you are trying to complete this in around 30 minutes so you can use the remaining 30 to discuss what (if not training) can be done to address the problem.

The template has questions grouped under six areas, taken from Performance Improvement literature, derived from original work by Gilbert (2013):

  • Information
  • Resources
  • Incentives
  • Knowledge/Skills
  • Capacity
  • Motives

You will find three questions under each of these areas. Feel free to customize them according to your context. They should represent a substantial portion of one of the six high-level areas above and be potential causes of poor performance in your context.

Briefly explain that you are asking your business owner contact to provide their perception of each of these areas and questions on a 9-point scale. When answering each question, they should keep in mind the problem or opportunity that they believe justifies new training.

Don’t let them dwell on any question for too long. Average out, compromise, put a dot on the scale and move on. By the end of the exercise, you will have scores calculated for the six areas, with those requiring more attention highlighted in red and orange. Is Knowledge/Skills one of them? If not, training is most likely not part of the solution, and one or more other areas will be highlighted in red. Use the remaining 30 minutes to talk about them.
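The scoring logic behind the template can be sketched in a few lines. The area names come from Gilbert’s six factors; the session scores and the red/orange thresholds below are illustrative assumptions, not values from the actual template.

```python
# Average the three question scores (9-point scale) per area and flag
# the weakest ones. Thresholds are illustrative assumptions.
AREAS = ["Information", "Resources", "Incentives",
         "Knowledge/Skills", "Capacity", "Motives"]

def flag_areas(scores: dict[str, list[int]],
               red: float = 4.0, orange: float = 6.0) -> dict[str, str]:
    """Map each area to 'red', 'orange' or 'ok' based on its average."""
    result = {}
    for area in AREAS:
        avg = sum(scores[area]) / len(scores[area])
        if avg < red:
            result[area] = "red"
        elif avg < orange:
            result[area] = "orange"
        else:
            result[area] = "ok"
    return result

# Hypothetical session: Knowledge/Skills is healthy, so training is
# unlikely to be the answer; Incentives needs the conversation instead.
session = {
    "Information": [5, 6, 4],
    "Resources": [7, 8, 6],
    "Incentives": [3, 2, 4],
    "Knowledge/Skills": [8, 7, 8],
    "Capacity": [6, 7, 7],
    "Motives": [5, 5, 6],
}
print(flag_areas(session))
```

This mirrors what the spreadsheet’s conditional formatting does: average per area, then color by threshold.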

And yes, perhaps training is part of the solution. If so, make sure you develop a learning solution that ties in with everything else, particularly what also appeared in red or orange.

Use carefully

Finally, this is not a template I would use on every occasion. But when I am going to meet number-driven, time-poor business owners who are looking for a quick performance analysis and are genuinely open to suggestions to tackle a known, well-defined problem, this approach has worked well for me.

The template has been pre-populated with a fictitious example where I have been asked to provide training to employees who are not completing incident reports on time, and the reports are not descriptive enough to address the incident. The business owner wants me to develop application training so employees use the report filing software efficiently, and also a writing skills workshop. Thanks to this template, we conclude that the team will need to be more explicit about why filing correct reports on time is important to the business, and will need to tweak the incentives that support this activity. No new training is developed.

Download the template

Practical use notes: This is an Excel 2013 file that uses conditional formatting. I use a bullet to place my score, but any symbol or letter will do; just make sure all other cells are empty so they are not computed. Also, if you modify the template by adding more questions, keep in mind that the formulas may require some editing.

Human Competence: Engineering Worthy Performance (2013), T. F. Gilbert, Pfeiffer

Translating elearning into other languages: 6 tips

Learning designers who only start thinking about translation once the original-language version has been released face unnecessary additional costs and, potentially, a lengthy list of minor corrections that lead to an inconsistent, expensive, hard-to-maintain solution. Here are some recommendations for designers who want to minimize cost and hassle when creating elearning that will be translated into other languages:

  1. Would learning goals change? Subjects usually covered by elearning modules such as compliance are likely to change slightly or a lot depending on the country or region. Do all the goals still hold in all the geographies and legal contexts where the solution will be deployed? If not, bring these back to the design board and look at ways to integrate them seamlessly at all stages, from course catalog filtering to navigation to assessment.
  2. Are your elearning platform and design truly global? Typical platform limitations that may give you a headache, or even prevent you from offering a solution in a specific language, are the inability to display right-to-left text, navigation bars and menus (languages such as Arabic and Hebrew are written right to left, which changes the layout of menus, navigation and other interface elements), the inability to handle multiple character sets in a single module, and a lack of flexibility in handling formats such as time, date and currency according to the rules of the target language. Typical design limitations include the inability to allow for longer terms and sentences (for example, by stretching button sizes or text boxes) and assumptions with a language component, such as puns, grammar constructions and metaphors, embedded into visual interfaces or activities.

  3. Have you included personas that fit the target language? Personas describe your audience in rich detail, and this description informs the design process. An elearning design that does not include personas in all target languages may not be a complete solution, as you may be overlooking important factors. You can learn more about personas in this post.

  4. Is the content culturally acceptable? Gestures, flags, pictures, maps, colors, popular sayings and many other things we believe to be harmless can be highly offensive in other cultures. Don’t forget those day-to-day objects that people usually remember by brand, such as Sharpies and Post-Its, and measurement expectations, such as inches and centimeters, gallons and liters, etc.
  5. Are you creating text-dependent videos? If you are shooting real-life scenes that rely heavily on the text that appears on screen, have a clear description of what has to be filmed again in a target language and be aware of the cost. For software, even though capture solutions such as Camtasia make it very easy to translate and rebuild the video in very little time, consider what is being captured too. For example, if translating into French, will you need a French version of the operating system and application you are capturing? Text can also be your friend: subtitling is an acceptable alternative to fully translated video, although in some countries you will find that regulations call for a full translation.
  6. Have you estimated and included translation cost and effort into the overall project plan? Video translations with native actors, voiceovers, text translation, system builds and applications in native languages… have a complete inventory and explore options beyond giving everything to the same provider – sometimes you can get text translated by a vendor that specializes in the terminology of your field, while another one can get better quality media translations.

Rather than thinking about “translating elearning”, it’s best to “design elearning for translation”. You will save time in the long run as your designs remain stable while your company grows to reach new countries and audiences. For more tips, go to elearningindustry, yourlearningworld and learndash.

On the “lock navigation” debate

Opinions about locking navigation in elearning are riddled with assumptions. These assumptions are not always disclosed as part of the conversation. So when I hear yet another “best” course of action regarding lock navigation, it usually feels like a cacophony: while most opinions are certainly sound (pun intended), we haven’t really agreed on the score.

to lock or not to lock…

But untangle the web of assumptions, and you will find good arguments on both sides. Here are some of the assumptions I like to tackle before answering the lock navigation debate:

  • What we mean by “elearning”
  • The nature of the problem elearning is trying to solve
  • What “locking” means
  • The scope of locking

elearning is many things

One learner and one machine. This may be your understanding of “elearning”. Mine isn’t: I tend to think about a more integrated experience, where learners have access to an environment in which they can interact with each other. Call it social learning. An opportunity to involve management, SMEs, and others we would not think of as “learners”, but who contribute to the learning experience.

“Lock” in this environment could mean not allowing learners to post, for example. Or not being able to connect with certain contacts. Would you consider other forms of locking? Would your answer change if this was your understanding of elearning?

What problem are we trying to solve?

For companies in regulated industries, there is a need to document when employees learned, or at least had an opportunity to learn about certain policies or procedures. Every click on “Next” becomes a signature, and we need to collect and retain that proof.

In other environments, we don’t need employees or employers to prove anything. We are just providing opportunities to learn. Why lock anything then? Keep reading, I think there is a strong argument for certain types of locking even in this context.

What “lock navigation” means

Does it mean I cannot progress through the learning experience unless I follow a certain sequence? Or is it related to my ability to successfully complete assessments? And does it mean the sequence is forced every time, or just the first time I go through the learning experience?

I often find that in discussions about locking, opinions are heavily influenced by the capability of the software that is being used to implement the learning experience, as opposed to the requirements of that learning experience. We should be able to think beyond the confines of those annoying functionality hurdles. Hacking, experimenting, engaging with the LMS authors. Don’t let the software win. Or define what “locking” means.

The scope of locking

Another common assumption is that when we talk about “locking”, we mean “page-level locking”. But what about module-level locking, assignment completion locking, prerequisite locking, timed locking?

In the pharma industry, there are very good reasons why you would lock one learning experience unless another has been completed. In a complex web of SOPs, sequence locking brings structure and order.

One of the most seasoned online educators, The Open University UK, uses timed block locking: learners gain access to chunks of a course one or two weeks at a time. One reason for doing this is so that learners turn in their assignments at about the same time. But a more important reason is that social chatter stays in sync across learners. Because they are all hovering within a reasonably ample section of the course, the principles of self-directed learning are not breached, but the larger group still stays on topic and can have a coherent chat about it. Can you see this type of locking being used for, say, onboarding employees together across geographies?

So would I lock navigation? There are very good reasons for using various types of locks, and sometimes for no locks at all. It all depends on context, assumptions and goals.

Why I Won’t Sign the Serious eLearning Manifesto

If you are here, it is quite likely that you have read the “Serious eLearning Manifesto”. When I first saw a reference to it, I immediately left what I was doing, excited and determined to learn more. Then, disappointment. I can’t, I won’t sign the Manifesto. Here’s why.

Blame is not the answer

The Serious eLearning Manifesto devotes two paragraphs to blaming others. “Most elearning fails to live up to its promise”… “trends evoke a future of only negligible improvement”. Two paragraphs that, while short, constitute a whopping 33% of the manifesto. That’s right: out of six paragraphs, two blame the work of others.

I am sure the four “Instigators”, as they call themselves, are seasoned professionals. I am sure they have seen a lot more elearning than I have, including lots of poorly designed elearning. I have seen some of that too. However, I don’t claim to know the constraints and limitations that led to any particular piece of elearning. No matter how much I may know about the industry, I don’t know the specific circumstances behind bad examples of elearning.

For anyone interested in learning, we live in truly exciting times. Technology is finally converging with learning in meaningful ways. We have only scratched the surface. Many L&D departments, even some belonging to hi-tech companies, are still suffering from a painful dichotomy that is simply a consequence of this early convergence: L&D staff are either non-tech or tech, learning professionals or elearning professionals, but rarely both. The divide is individual and sometimes organizational, perpetuated by L&D hiring strategies. This causes tensions, inefficiencies, and yes, probably bad elearning. I believe this is transitional, and the profession will evolve to embrace technology while standing on solid adult learning theory and practice. It is circumstantial. I do not believe that “bad elearning” is a trend, particularly one that “evokes a future of only negligible improvement”. So I am not going to blame anyone for being in an L&D department in flux, trying to cope with the changes, let alone for their future work.

It would be easy to fall for the “us vs. them” rhetoric and somehow distance myself from the pack by signing a manifesto that blames bad elearning on others. But I believe that won’t help the profession at this crucial juncture. If there is one way we are going to drive substantial improvement in the field of elearning, it’s by sticking together. I won’t start that effort by proclaiming that there is a lot of bad elearning out there. Work together, learn together and win.

Not exclusive to elearning

Moving on to the Supporting Principles. I had a quick look at them and to my disappointment, there is nothing that I would not say of any type of learning experience. Assume for a moment that you haven’t seen the title, and read the Supporting Principles again, with “learning” (no leading “e”) as the general concept in mind. Anything that doesn’t belong? Nothing? Well, yes, that is what I thought too: this is a set of generic learning principles, equally at home in the classroom, in the field, in elearning, in simulations and any other learning experience. Looks like a “learning manifesto” set of principles to me. Don’t get me wrong: there is goodness in every one of those principles. I just don’t see them confined to elearning.

Are these common-sense principles being applied consistently to elearning? No. But the same goes for any other type of learning. Shall we go and blame them too, draft a “Serious Classroom Manifesto”? OK, I think you get my point.

A value proposition

I admire the elegance with which the Agile Manifesto was written. Although proposing a sharp U-turn in terms of how software projects are run, it does so in a gentle, inclusive, respectful way. “While we see the value in this, we value that more”.

But there’s more. The principles behind the Agile Manifesto stand the test of time. They do not hinge on circumstantial evidence that points to bad software development (although it’s out there). And by doing so, by sticking to values and not “trends”, the agile manifesto will outlive many “future trends”.

Politeness, respect, values, timeless principles, no pointing fingers. Is there a learning manifesto written in these terms? I will sign that.