Our eLearning Challenges blog series explores the common situations our clients face when deploying learning solutions. We use samples from real projects to help you uncover the best way to meet these challenges in your organization. This is Part 5.
No matter how confident you are about the learning solution you are producing, you need to verify what your target learners need. We’ve made this a regular part of all our eLearning projects. The job is not done until you have crafted an accurate learner profile and confirmed the course’s desired outcomes can realistically be met, given that profile.
Here are some of the ways we make sure we are hitting the mark with target learners:
Think Out Loud Sessions
The best way to see things from the learners’ point of view is to be present while they take the course and gauge their responses. We conducted a Think Out Loud session while developing a recent course for Cummins. We had just finished programming the first draft of the eLearning course and wanted to gather true learner feedback before starting on the second draft. Because the course contained a considerable amount of highly technical content, as well as a multi-layered post-test requiring a 100% passing score, we wanted to know with certainty what learners thought.
With a Think Out Loud session, you ideally want one observer for each learner participant. Our observers watched the learners take the course from start to finish, tracking their start and stop times for each module in the course, and for the post-test. We encouraged learners to literally “think out loud” when they came to a screen that confused them, or that was easy for them, or that they liked (or didn’t like!). We weren’t looking for feedback on the accuracy of the content or for wordsmithing suggestions; we wanted to know how they felt about the overall learner experience. After the learners completed the course, our observers used a Think Out Loud debrief worksheet to ask them all the same questions, and then we debriefed the experience as a larger group so everyone could share their perspectives and elaborate when necessary. We asked questions like:
“How easily did you navigate through the course? Did anything confuse you?”
“Did you always know what to click and when to click it?”
“Did you learn from the course? If so, can you describe the key things you learned?”
“How do you think your peers will react to this course?”
“Was the post-test too easy? Too difficult? Just right? Please explain.”
By having learners explain the reasoning behind what they did, we were able to:
- Test the length of the course: The course was originally supposed to take 45 minutes, but it took learners 90 minutes to work through the material. This was important to find out! (See the timing sketch just after this list.)
- Find out where navigation and directions are confusing: Is someone stuck because they didn’t read the directions? Did they have trouble finding the Next button? All of this helps us adjust the course to make it more user-friendly.
- Gauge the difficulty level of the post-test: Is 100% a realistic passing score? How much information can, and should, learners be expected to remember in order to pass the test? Are there any post-test questions that do not directly relate to a learning objective? A post-test that is impossible to pass only serves to frustrate the learner. We want to challenge learners, but we want to be sure we’re asking the right questions, and the right number of questions.
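If you capture observers’ start/stop notes digitally, rolling them up is simple arithmetic. Below is a minimal sketch in Python, assuming a hypothetical observer log of per-module clock times; the module names, the timestamps, and the 1.5× “runs long” threshold are all illustrative, not data from the Cummins project.

```python
from datetime import datetime

# Hypothetical observer notes from one learner's Think Out Loud
# session: (module, start, stop) clock times. A real session would
# produce one log like this per learner participant.
observer_log = [
    ("Module 1",  "09:00", "09:22"),
    ("Module 2",  "09:22", "09:51"),
    ("Module 3",  "09:51", "10:14"),
    ("Post-test", "10:14", "10:30"),
]

PLANNED_MINUTES = 45  # the original target seat time

def minutes(start: str, stop: str) -> int:
    """Elapsed whole minutes between two HH:MM clock times."""
    fmt = "%H:%M"
    delta = datetime.strptime(stop, fmt) - datetime.strptime(start, fmt)
    return int(delta.total_seconds() // 60)

total = 0
for module, start, stop in observer_log:
    elapsed = minutes(start, stop)
    total += elapsed
    print(f"{module:10} {elapsed:3d} min")

print(f"{'Total':10} {total:3d} min (planned: {PLANNED_MINUTES} min)")
if total > PLANNED_MINUTES * 1.5:
    print("Actual seat time far exceeds the plan: revisit course length.")
```

However you tally it, timing data like this turns “the course feels long” into a concrete number you can compare against the planned seat time, for every learner you observe.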
The session had some interesting takeaways. Everyone agreed that there was a lot of “required content” in the course, but no one thought the amount was unacceptable. The customer realized that, with all of the information learners needed to know, it was more important to provide them with easy-to-use reference tools than to expect them to memorize technical content. Learners also told us their favorite parts of the course were the more visual and interactive ones, especially the problem-solving activities.
So, what did we do with this feedback? We moved large content sections of the course into an interactive PDF resource tool and focused the course on visual scenarios, animations, and activities that required learners to use the tool to find the information they needed to complete the activities, and even to pass the post-test. Thanks to the Think Out Loud session, we were able to revise the course for a more positive learner experience that still met the overall learner outcomes. We were also able to give learners a user-friendly resource tool they can use on the job.
Learner Interviews and On-the-Job Observations
Focus groups and on-the-job learner observations are a terrific way to see things from the learner perspective at the very beginning of a project, as part of a good training analysis. Visiting client workplaces is a regular part of what we do… and vital to producing great learning solutions. We can’t create this stuff in a vacuum! Whoever your vendor is, make sure they spend some time onsite to get a feel for your company culture and the performance issue(s) that should be addressed.
We recently created a new-employee curriculum for Harlan Laboratories. Visiting the work sites and talking to employees was one of the most important analysis tasks in designing the new curriculum. Jennifer Bertram, the project manager, conducted two site visits… which helped her understand what the employees actually do during a typical day, and where they do it. She observed and questioned fifteen to twenty Harlan employees, all in the same role but with varying lengths of employment (new hires and seasoned employees alike). By asking questions like “What do you think you need to learn to do this task well?”, “What do you wish you had learned on the first day? The first week?”, “What mistakes are most common?”, “What challenges do you face?”, and “What tools would be helpful to you now?”, Jennifer was able to map out the optimal curriculum.
The site visits made the curriculum design workshop much more focused, because we came into it with our analysis findings and recommendations. The employee interviews made it apparent that employees needed (and wanted) more “just in time” quick-reference tools, but electronic tools were not going to work. Why? Because employees did not have access to computers or tablets as part of their day-to-day work. And since most employees came in without prior experience, the training needed to cover all the basics in a staggered, four-week approach.
Based on information gathered from the site visits and interviews, we created the following materials:
- A training flip book that Harlan trainers can use to show new employees “what good looks like.”
- A 3×5 spiral employee notebook that contains all the key messages from training, along with a place for learners to take notes. It easily fits into a scrub uniform pocket.
- Visual flash cards for studying key terminology and facts.
- A Trainer Guide that helps Harlan trainers lead their new employees through the four-week training program in a consistent, logical order and at the right pace.
On-site visits helped us identify these “low-tech” tools as the best learning solution for the employees. Spending time on site with the target learners was critical to getting this right!
Listen, Test, Refine
These aren’t the only ways to test your learning solutions. We also use usability testing, scenario-building workshops, and pilot programs to make sure learning hits the mark.
If you want to hit the mark with target learners, listen to them! Spend the time asking them questions and gathering information before starting a project. During development, take the time to conduct Think Out Loud sessions and perform usability testing. When you’re ready to launch, start with a pilot and use the results to make any final revisions.
Good points. Yeah, you have to listen to the students and see how people are actually using the course.
Crazy that the course took them twice as long to finish as expected, though!
Brilliant! The blog series, the idea of focusing on the learner’s needs from the very beginning, the different techniques from usability engineering applied to instructional design, and the challenges and solutions discussed are all extremely useful indeed. You outdid yourself in this post, Steve!
Why, thank you! It helps to have great examples to share.