
The Second Most Common Mistake (Arguably) in Maritime eLearning

Posted to Maritime Training Issues with Murray Goldberg on October 28, 2013


Introduction

As I wrote in the first article of this series, it is easy to make mistakes in the implementation of maritime eLearning. In my work over the last 6 or so years implementing eLearning in the maritime industry (and in the 10 years before then as an eLearning user, researcher and LMS designer/developer), I have become very aware of many of the more common implementation mistakes - some of them made by me. While it is true that making them can be very damaging to an eLearning program, they are all pretty easily avoided with just a bit of thought and knowledge. Thus, this short series of articles discusses some of the most damaging, yet most easily avoided, mistakes when implementing eLearning programs in the maritime industry.

The first article in this series discussed the practice of delivering unsupervised multiple-choice exams. Whether delivered online or on paper, this is generally a very poor practice, as there is no way to determine whether the answers submitted represent the knowledge of the person being tested. That trainee could have honestly written the exam, or could instead have relied on a cheat sheet, course materials or a knowledgeable friend when answering. The message in that article was clear - if you wish to rely on the integrity of the assessment results, then the examination must be supervised by a trusted person in a position of authority.

Being able to rely on exam results brings us to the topic of this article - arguably the second most common mistake in maritime eLearning implementation. In the spirit of the previous article, I'll first state the problem in one sentence, and then go on to explain:

"If you deliver exams to assess the knowledge of your seafarers - especially multiple choice exams -, make sure that no two trainees receive the same exam!"

This may seem kind of extreme - not to mention difficult to achieve. So before getting too upset with me, let me explain.

Why not?

The answer to this question is likely self-evident, but for completeness, the problem with delivering the same exam to more than one trainee is that people talk. That is, the first trainee to write the exam is going to share his or her memory of the exam questions with subsequent trainees. If trainees are left unattended while writing the exam (the subject of the previous article), then they may go so far as to write down the questions asked. Not everyone will do this, but it only takes one. After a few short meetings with other similarly motivated trainees, the correct answers will be determined and, before you know it, there will be a piece of paper circulating around the vessel with a complete set of questions and answers. Worse yet, in this internet age, these questions and answers can be distributed broadly and shared with "colleagues" anywhere in the world. Subsequent trainees interested only in "getting the grade" will bypass the learning altogether and proceed straight to the assessment, typing in the answers to the questions which are already known (and answered).

We know this happens. I don't need to explain why this is bad.

What can we do about it?

There are a couple of good solutions to this problem. First, let's look at assessment randomization. The goal here is to deliver different exams to different trainees, and technology can be of real help - in particular, a learning management system (LMS) or other assessment delivery system. Most modern LMSs have the ability to randomize examinations specifically to address this problem.

It works as follows. Instead of creating a specific, fixed exam in an LMS, the trainer or instructional designer creates a set of questions and enters them into the LMS "question database". The question database is divided into categories, and each question is placed into one category. Ideally, each question category will be composed of questions of roughly equal difficulty which all test the same competency or knowledge.

Once the question database is populated with questions, exams can be defined. The process of defining exams is quite simple. Here, the trainer (or, again, the instructional designer) indicates that a specific number of questions should be drawn from each of the categories chosen for that exam. For example, for a first-aid exam, the designer might indicate that 5 questions should be drawn from the category containing questions about dealing with broken bones, 8 questions from the category on cuts, 4 questions from the CPR category, and so on.

Now - each time that first-aid exam is delivered to a trainee, the LMS randomly selects the required number of questions from each of the specified categories and creates a unique exam for that trainee. In addition, if the question categories and questions are carefully selected, then each exam is guaranteed to test the required knowledge and be of roughly equal difficulty.
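To make the mechanics concrete, here is a minimal sketch in Python of how such a randomized exam might be assembled. The question bank contents, category names and blueprint counts are illustrative stand-ins based on the first-aid example above, not the data model or API of any particular LMS:

```python
import random

# Hypothetical question bank: category name -> list of question IDs.
# (A real LMS would store full question records, not just IDs.)
QUESTION_BANK = {
    "broken_bones": [f"bb-{i}" for i in range(1, 31)],
    "cuts":         [f"cut-{i}" for i in range(1, 41)],
    "cpr":          [f"cpr-{i}" for i in range(1, 21)],
}

# Exam "blueprint": how many questions to draw from each category,
# mirroring the first-aid example (5 broken bones, 8 cuts, 4 CPR).
FIRST_AID_BLUEPRINT = {"broken_bones": 5, "cuts": 8, "cpr": 4}

def build_exam(bank, blueprint):
    """Assemble a unique exam by sampling each category without replacement."""
    exam = []
    for category, count in blueprint.items():
        exam.extend(random.sample(bank[category], count))
    random.shuffle(exam)  # mix the categories so question order varies too
    return exam

# Each call produces a different exam for a different trainee.
print(build_exam(QUESTION_BANK, FIRST_AID_BLUEPRINT))
```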

There are some great advantages to this technique. First, if you have a reasonable number of questions in each category (say 5 - 10 times the number of questions required for each exam), then the reuse frequency of exam questions is low and it becomes very hard for trainees to share enough information with others to have a significant impact on exam results. Second, as a process of continuous improvement, you can gradually add more questions to each category and cull out-of-date or poor questions. The question database thus becomes a real company asset which grows with time. Third, as a by-product of having each question category cover a single competency, your LMS may be able to deliver analytics indicating how well each competency is being trained. This fine-grained information is much more useful than simply reporting the overall exam score.
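As a back-of-the-envelope check on that "5 - 10 times" guideline (my own illustrative arithmetic, not a figure from the article): if two exams each independently draw d questions from a category pool of n, they share d * d / n questions on average.

```python
def expected_shared_questions(pool_size, questions_drawn):
    """Expected number of questions two independently generated exams
    share within one category (hypergeometric expectation: d * d / n)."""
    return questions_drawn ** 2 / pool_size

# Pool 5x the draw (40 questions in the category, 8 drawn per exam):
print(expected_shared_questions(40, 8))   # -> 1.6 shared questions on average
# Pool 10x the draw (80 questions, 8 drawn per exam):
print(expected_shared_questions(80, 8))   # -> 0.8 shared questions on average
```

Even at the low end of the guideline, two trainees comparing notes would typically overlap on only one or two questions per category - not enough to bypass the learning.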

But what about the CBTs we use?

Many CBTs (computer-based training courses) are outstanding resources with expertly crafted content and assessments. However, most CBTs do not randomize exam questions. This does not make CBTs any less valuable, but it does mean that they, like any other valuable tool, must be used intelligently. What does this mean?

As a general rule, we must be cautious about relying too heavily on any technology on the assumption that it will do all of our work for us. Just as you would not set your autopilot and then take a nap until you arrive at your destination, we should not rely 100% on a CBT exam, an LMS exam, a demonstrative exam, a verbal exam, or any other single form of assessment. Each of these can be an excellent tool when employed as one part of a comprehensive assessment strategy, but incomplete or misleading when used alone.

For example, assessments at BC Ferries are four-phased. First, there is a supervised, randomized multiple-choice exam. Then, in addition, there is an oral scenario exam (e.g., "tell me what you would do if you came across a fire burning in a trash can on the car deck …"). Third, there is a demonstrative exam (e.g., "show me how you don that fire suit …"). And finally, in the case of deck crew, there is a meeting with the master. This four-part assessment program is carefully crafted to test different kinds of abilities, knowledge and attitude. If any one of these assessment techniques were used in isolation, it would be an incomplete assessment. So how does this apply to CBTs?

It does not have to be complicated. One easy and very effective technique is to use the CBTs and their assessments as usual, but then have a trainer or supervisor ask a few verbal questions of the trainee to test some aspects of knowledge covered by the CBT. Are the answers to the verbal questions consistent with the level of knowledge demonstrated in the CBT assessment? If so, then it is reasonable to conclude that the CBT assessment was performed honestly and that the results are a fair representation of the trainee's knowledge. If not, then more investigation is needed. This means asking more questions and/or applying an additional, more in-depth exam. Essentially you are doing whatever is required to gain some confidence that the trainee has learned the required knowledge, or to get to the point where it is clear that they have not.
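As an illustration only - the function name, scoring scale and threshold below are hypothetical, not anything the article or any LMS prescribes - the consistency check might be thought of like this:

```python
def review_cbt_result(cbt_score, oral_correct, oral_asked, tolerance=0.2):
    """Compare the CBT exam score (0.0 - 1.0) against a short oral
    spot-check by a supervisor; flag divergence for deeper investigation.
    The 0.2 tolerance is an arbitrary illustrative threshold."""
    oral_rate = oral_correct / oral_asked
    if abs(cbt_score - oral_rate) <= tolerance:
        return "consistent: CBT result looks like a fair representation"
    return "inconsistent: ask more questions or apply an in-depth exam"

# A high CBT score paired with weak oral answers triggers follow-up.
print(review_cbt_result(0.95, 1, 4))
```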

A CBT-based exam can be extremely effective when used in this way. In fact, the mere knowledge that trainees will be subject to a short oral exam administered by a supervisor following the CBT assessment is usually enough to ensure that they take the training seriously and learn the material. The CBT assessment then becomes formative (like a self-test) - helping the trainee assess whether he or she has learned the material to the degree required and is ready for the supervisor's assessment. Used this way, well-written CBTs and their assessments, even though not randomized, can be excellent instructional tools.

Must ALL kinds of exams conform to this rule?

The simple answer is "no" - but only under specific circumstances. For example, if you are in the position of delivering a single exam, one time, to a large number of trainees concurrently, then there is no problem with giving the same exam to all of them. In this case, assuming the exam is supervised, there is no possibility of sharing answers because the trainees are all writing the exam together. Having administered hundreds of exams for many thousands of university students, I can attest to the fact that some will do their very best to cheat. With enough eyes on the test-takers, however, you can uncover most attempts (if only they spent as much time studying as they spend thinking of creative ways to cheat …). Still, in my experience as a faculty member, while I might reuse one or two brilliantly crafted questions from last year's exam, by and large each year's exam was new. It had to be.

Conclusion

The simple message here is: don't keep re-delivering the same exam over and over - unless, that is, you verify the results and make it clear to the trainees that you will be doing so. In this way, you will be able to determine whether the results are reliable. The other message is to be careful never to rely solely on any one kind of assessment, be it electronic or "manual" (written or verbal). Only by combining assessment techniques can you take advantage of the disparate strengths of each and provide a comprehensive assessment experience.

# # #

About The Author:

Murray Goldberg is the founder and President of Marine Learning Systems (www.marinels.com), the creator of MarineLMS - the learning management system designed specifically for maritime industry training. Murray began research in eLearning in 1995 as a faculty member of Computer Science at the University of British Columbia. He went on to create WebCT, the world's first commercially successful LMS for higher education, serving 14 million students in 80 countries. Murray has won over a dozen university, national and international awards for teaching excellence and his pioneering contributions to the field of educational technology. Now, with Marine Learning Systems, Murray hopes to play a part in advancing the art and science of learning in the maritime industry.

Maritime Training: The full library of maritime training articles can be found here.

Blog Notifications: For the latest maritime training articles, visit our company blog here. You can receive notifications of new articles on our company blog by following the blog.

Maritime Mentoring: International Maritime Mentoring Community - Find a Mentor, Be a Mentor