
    Strategies for improving the quality of compliance assessments

    Published on 13 Jun 2019 by Vivek Dodd

    Assessments are critical components of corporate compliance programmes. They are used to evidence employee awareness and competence to internal stakeholders and regulators. They can also be used for pre-testing, personalising the training, and improving the training content over time.

    In this blog post, we explore some of the strategies Skillcast uses to create robust, high-quality assessments for corporate compliance training.

    If you’re a learning developer, trainer, or subject matter expert involved in planning online and in-person evaluations, then you should find this interesting.

    Testing beyond comprehension and knowledge-recall

    Most compliance assessments rely on multiple choice questions (MCQs). Despite their shortcomings, MCQs are easy to write and replicate to create variants, and easy to score and collate into aggregate statistics. However, to test knowledge and competence with a good degree of reliability - that is, for the MCQs to measure competence consistently - you need to write questions of a quality, consistency and difficulty level that are fit for purpose.

    Testing comprehension and recall of factual knowledge, eg types of fire extinguishers or penalties for fraud, may be possible with simple MCQs. However, compliance assessments should be more about testing "how employees will act" than about "what they know" - testing application rather than basic knowledge. For this reason, compliance assessments should feature scenario-based questions where learners are required to analyse, critique and make decisions.

    A typical scenario-based question starts with a case statement or vignette that can include a detailed background, even a separate document that the employee must study. It can also include a realistic dialogue between the protagonists (in plain text or video format) that the learner is required to consider. A policy document. An order form. You get the picture.

    The vignette can serve as the basis for asking one or more MCQs, each with its question stem and response choices. These choices should be written as decisions or critiques that the learner is asked to make.

    Writing and quality testing such scenario-based questions can be time-consuming and expensive, but this can be offset by creating variants of the questions for randomisation.

    Best practice for writing multiple choice questions

    • Each question must be related to a learning objective in the course
    • The question stem should state clearly the problem and only test a single idea
    • The details of the scenario should be kept in the question stem, and the response choices should be kept short
    • The distractors must be plausible, but the correct choice should be unambiguously the best answer
    • The distractors should incorporate common compliance errors that people are known to make, rather than being obviously wrong
    • The length of the response choices should be similar - if the correct choice is longer than the distractors in some questions, it must be shorter in others
    • All response choices, correct choice and distractors alike, should follow on grammatically and logically from the question stem
    • The question stem should be worded positively - if it's necessary to use a negative word like "not", it must be underlined or capitalised
    • There should be no double negatives, eg a question stem featuring a "not" or "except" and one or more choices also featuring a negative
    • Ideally, "All of the above" and "None of the above" should be avoided
    • You can also use dialogue as response choices - to make conversations, advice and decisions sound more natural
    • The position of the correct choice should vary randomly (in online assessments, the options can be jumbled each time the question is asked - see the sketch after this list)
    • The language should not use humour or colloquial terms that leave non-native speakers at a disadvantage
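
    As a rough illustration of that last point about jumbling, the sketch below shuffles the response choices each time a question is served while keeping track of where the correct answer ends up. The question record and field names are purely illustrative, not part of any Skillcast system.

```python
import random

# Hypothetical question record - the fields are illustrative only.
question = {
    "stem": "A supplier offers you two tickets to a sold-out concert. What should you do?",
    "choices": [
        "Politely decline and record the offer in the gifts register",   # correct choice
        "Accept, since concert tickets are not cash",
        "Accept, but only if the contract has already been signed",
        "Ask a colleague to accept the tickets on your behalf",
    ],
    "correct_index": 0,
}

def serve_question(q):
    """Return the stem plus choices in a fresh random order, remembering the correct one."""
    order = list(range(len(q["choices"])))
    random.shuffle(order)                       # jumble the options on every attempt
    choices = [q["choices"][i] for i in order]
    correct = order.index(q["correct_index"])   # new position of the correct choice
    return q["stem"], choices, correct

stem, choices, correct = serve_question(question)
```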

    Structured assessments

    Compliance assessments typically consist of a bank of MCQs from which a certain number are randomly selected for each test. The size of the question bank varies, but a 3:1 ratio (eg a bank of 30 questions for an assessment where 10 questions are asked) is generally considered to be sufficient.

    However, this random sampling approach is flawed and results in a low degree of reliability. When questions are picked from a single pool, there is a risk of the test including multiple questions on certain topics (eg gift-giving) but no questions on other topics (eg PEPs) covered in the course. Moreover, since the questions on some topics are likely to be less difficult than others, the overall difficulty of each test will vary.

    The best approach for addressing this shortcoming is to structure the assessment into multiple question banks - each of which aligns with a single action point that we want to achieve in the course. Action points are analogous to learning objectives but distinct. Whereas the learning objectives list what people will learn, the action points list what the company wants them to do (or not do) after being trained.

    The MCQs (preferably scenario-based) in a given question bank must test only the corresponding action point and be of the same level of difficulty. The test is then composed by randomly selecting a set number of questions (preferably one) from each question bank. This ensures that each instance of the assessment is consistent and each learner is tested on all the points that are important for compliance.
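
    As a rough sketch of the difference between the two approaches, the Python below contrasts single-pool sampling with structured sampling that draws a fixed number of questions per action point. The bank contents, tags and function names are illustrative assumptions, not an actual Skillcast question bank.

```python
import random
from collections import defaultdict

# Illustrative bank: each question is tagged with the action point it tests.
bank = [
    {"id": 1, "action_point": "gifts_hospitality"},
    {"id": 2, "action_point": "gifts_hospitality"},
    {"id": 3, "action_point": "peps"},
    {"id": 4, "action_point": "peps"},
    {"id": 5, "action_point": "facilitation_payments"},
    {"id": 6, "action_point": "facilitation_payments"},
]

def flat_random_test(bank, n):
    """Single-pool sampling: may over-test some action points and miss others entirely."""
    return random.sample(bank, n)

def structured_test(bank, per_point=1):
    """Structured sampling: draw a fixed number of questions from every action point."""
    by_point = defaultdict(list)
    for q in bank:
        by_point[q["action_point"]].append(q)
    test = []
    for questions in by_point.values():
        test.extend(random.sample(questions, per_point))
    random.shuffle(test)   # vary the question order between learners
    return test

# Every learner is now guaranteed one question per action point.
print(structured_test(bank))
```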

    Multiple true false questions

    True/false questions are regarded as being inferior to multiple choice questions. This is unjustified as numerous research studies have shown that the reliability score of multiple true false (MTF) questions is higher than that of MCQs.

    An MTF consists of a single vignette with a question stem followed by several items, each of which the employee must evaluate as true or false. The items may be presented together or one at a time - the latter is better for accessibility if the assessment is online.

    To compare the MTF format with MCQ, consider the example of a course on Bribery Prevention. Let's assume that this course has five action points and an assessment with two MCQs for each action point. Take one of these action points - say one related to the company's gifts and hospitality procedures. This would be tested with two MCQs, each with four response choices. The probability of an employee with no competency being able to select the correct answer randomly is 25% (it would be higher if the learner had enough partial understanding to eliminate one or more of the distractors). So the probability of this employee answering both questions correctly is 6.25%.

    Extrapolating this, if there were 1,000 employees in the company with poor or no competency in gifts and hospitality procedures, around 62 of them would nevertheless pass the assessment on this action point.
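
    The arithmetic behind these figures can be checked in a couple of lines (the numbers are simply those used in the example above):

```python
# Probability of a guesser picking the right answer on one 4-option MCQ.
p_single = 1 / 4                        # 25%

# Probability of guessing both MCQs on the action point correctly.
p_pass_point = p_single ** 2            # 0.0625, i.e. 6.25%

# Expected number of false positives among 1,000 non-competent employees.
false_positives = 1000 * p_pass_point   # 62.5, roughly 62 employees
```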

    This level of false positive error may be tolerable if the assessment comes after the course and if the purpose of the intervention is to raise awareness rather than assess competence. However, it would be intolerable if the assessment was being used for pre-testing.

    The purpose of pre-testing is to enable those who demonstrate competence on an action point to skip parts or the whole of the training content related to that action point. For companies to fulfil their compliance training obligations, they need to be able to demonstrate that the pre-test is robust.

    In the above example with a 6.25% false positive error rate, 62 out of every 1,000 employees with insufficient competency would be able to skip the content on gifts and hospitality procedures via a pre-test composed of two MCQs on this point.

    To reduce this false positive error rate, we use the MTF format. Each MCQ with four options can be replaced with an MTF with three, or even four, items, with no appreciable difference in the assessment duration or user experience. The probability of an employee with no competency guessing any single item correctly is 50%, so the probability of answering every item in two MTFs with three items each is 1.56% (this falls to 0.4% if the MTFs have four items each). This four-fold reduction in the false positive error rate makes MTF the format of choice for pre-testing.
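
    The same quick check applies to the MTF replacement, assuming each item is an independent 50/50 guess for a non-competent employee:

```python
# Two MTFs with three true/false items each: six independent 50/50 guesses.
p_mtf_3 = 0.5 ** (2 * 3)      # 0.015625, i.e. about 1.56%

# Two MTFs with four items each: eight guesses.
p_mtf_4 = 0.5 ** (2 * 4)      # 0.00390625, i.e. about 0.4%

# Reduction relative to the 6.25% false positive rate of the two MCQs.
reduction = 0.0625 / p_mtf_3  # 4.0, the four-fold reduction mentioned above
```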

    Learner confidence

    To reduce false positive errors further, we take a gamification approach to pre-testing, in which learners are invited to play a game for points (and optionally compete for places on a corporate leaderboard). In this game, we can allow learners to bet on their answer to some or all questions to earn additional points.

    The game dynamics can be fine-tuned with a variety of settings, including negative marking and the value of the bet. Irrespective of the points, this format of pre-testing adds a valuable new dimension - the confidence that each learner has in their responses. Using this confidence level alongside the MTF score can drive down the false positive error further and improve the reliability of the testing.
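
    As a rough illustration of how such confidence betting might be scored, the sketch below doubles the stake when a learner bets on an answer and applies negative marking to a confident wrong answer. The point values and multiplier are illustrative assumptions, not Skillcast's actual game settings.

```python
def score_response(correct: bool, base_points: int = 10,
                   bet: bool = False, bet_multiplier: int = 2,
                   negative_marking: bool = True) -> int:
    """Score one response, letting the learner bet points on their answer.

    A bet amplifies the gain for a correct answer and, with negative marking
    enabled, the loss for an incorrect one - so the final score reflects both
    competence and the learner's confidence in it.
    """
    stake = base_points * (bet_multiplier if bet else 1)
    if correct:
        return stake
    return -stake if negative_marking else 0

# A confident correct answer earns 20 points; a confident wrong one loses 20.
print(score_response(correct=True, bet=True))    # 20
print(score_response(correct=False, bet=True))   # -20
```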

    This article was originally published by T-CNews Online, an independent resource for people development and people regulation personnel within the UK financial services industry.
