Making e-exams accessible

1. What are e-exams?

The term e-examination refers to electronic examinations (or e-assessments) and is a concept that is still evolving in examination regulations. While written, oral and practical examinations can be conducted both digitally and in analogue form, e-examinations take place exclusively in digital form. E-examinations are digital examinations conducted in a self-contained system, such as Moodle. These systems can be accessed either on site, i.e. in person, or remotely (virtually).

E-exams are characterised by the fact that the entire process, from preparation through delivery to marking, must take place in the same system.

Figure: Schematic representation of the examination types, based on Malte Persike (own illustration)

E-examinations can be:

  • E-exams and quizzes: formative format, often conducted during the semester
  • Remote exams: taken from any location ("take-home exam"), with or without supervision
  • Scan exams: digitisation by scanning after the exam - not a digital examination in the strict sense
  • Hybrid examinations: a choice between, or a mixture of, these processing forms within the same examination

2. Examination formats

There are different examination formats for assessing students' learning success by means of an e-examination:

  • Closed-book exams: no aids other than those specified by the examiners are permitted
  • Cheat-sheet exams: own exam aids ("cheat sheets") are permitted
  • Open-book exams: textbooks and selected materials may be taken into the exam
  • Open-web exams: everything may be used, except live exchange with other students during the exam

Advantages of open exam formats: greater focus on competences, authentic assessment, use of parameterisation, constructive alignment. BUT: this must be practised! Subject-specific knowledge and competences must be taught.
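
Parameterisation means that each student receives an individual variant of a task: numerical values are drawn at random, while the correct solution is computed from the same formula. The following is a minimal, illustrative sketch; the task, value ranges and function name are invented for illustration and not taken from the source.

# Minimal sketch of a parameterised exam task: each student receives
# individual values, but the marking key follows from the same formula.
import random

def make_variant(seed: int) -> dict:
    """Generate one task variant together with its expected solution."""
    rng = random.Random(seed)                    # one seed per student -> reproducible variant
    principal = rng.randrange(1000, 5001, 100)   # starting capital in euros
    rate = rng.choice([0.02, 0.03, 0.04])        # annual interest rate
    years = rng.randint(2, 5)
    solution = round(principal * (1 + rate) ** years, 2)
    text = (f"A capital of {principal} EUR earns {rate:.0%} interest per year. "
            f"What is its value after {years} years? Round to two decimal places.")
    return {"text": text, "solution": solution}

# One variant per (hypothetical) student ID.
for student_id in [101, 102, 103]:
    print(make_variant(student_id))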

Attention: Observe the examination regulations of the respective universities!

3. Typical task formats in e-exams

The tasks within e-exams can take various formats. Teaching staff and lecturers can design the exams to suit the course type and purpose. The following formats can be used:

  • Open: Result entry, free text entry, scan task
  • Closed: multiple-choice task, single-choice task, Kprim task, matching question, classification task, cloze task ("CLOZE"), image annotation/hotspot/ImageMap
  • Specialised tasks such as formulas, coding etc.

3.1. Typical task formats in e-exams in Moodle

Although the Moodle LMS is not primarily designed for conducting e-exams, the "Test" activity (called "Quiz" in English-language Moodle) offers a sufficient range of functions for summative assessments. A large repertoire of question types is available to teaching staff, lecturers and others; a sketch of how such questions can be prepared for import follows the list:

  • Multiple-choice questions
  • Free-text (essay) questions
  • Matching questions
  • Numerical questions
  • Calculated questions
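
Besides being created directly in the quiz editor, most of these question types can also be prepared as plain text and imported into the question bank. A minimal sketch follows, assuming Moodle's GIFT text import format; the file name and the questions themselves are invented, and calculated questions are generally not covered by GIFT (they need the editor or the Moodle XML format instead).

# Minimal sketch: writing quiz questions in Moodle's GIFT text format so that
# they can be imported into the question bank. The question content is invented.

questions = [
    # Multiple-choice question: "=" marks the correct option, "~" the distractors.
    "::Capitals:: What is the capital of France? {=Paris ~London ~Berlin ~Madrid}",
    # Numerical question: "#" introduces the numerical answer, ":0.01" the tolerance.
    "::Pi:: Give the value of pi to two decimal places. {#3.14:0.01}",
    # Matching question: each "=left -> right" pair has to be matched by the student.
    "::Capitals match:: Match each country to its capital. {=Germany -> Berlin =Italy -> Rome =Spain -> Madrid}",
    # Essay (free-text) question: an empty pair of braces creates a manually graded answer.
    "::Open exams:: Discuss one advantage of open-book exams. {}",
]

# GIFT files are plain text; questions are separated by a blank line.
with open("question_bank.gift", "w", encoding="utf-8") as f:
    f.write("\n\n".join(questions) + "\n")

Such a file can then be imported via the question bank's import function by selecting the GIFT format; whether this workflow is available depends on the local Moodle installation.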

3.2. "Rules for multiple choice questions" (according to Malte Persike)

To ensure that students can actually answer multiple-choice questions fairly, the following rules must be observed (a rough automated pre-check is sketched after the list):

  • The question stem should be answerable without reading the answer options
  • No cue words ("never", "always" etc.)
  • Answer alternatives of similar length
  • Equally plausible answer alternatives
  • The correct answer appears in each position with equal probability, i.e. not always in the same place
  • Answer alternatives that are not too similar to one another
  • Few repetitions of words
  • Few negations
  • Four or more answer alternatives
  • State the number of correct answer options
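
Several of these rules can be checked mechanically before a question is released. The following rough sketch is purely illustrative; the function name, the cue-word list and the length threshold are assumptions chosen for demonstration, not part of Persike's rules.

# Rough, illustrative pre-check for a single multiple-choice question.
# Thresholds and the cue-word list are assumptions chosen for demonstration.

CUE_WORDS = {"never", "always", "all", "none", "only"}

def check_mc_question(stem: str, options: list[str], n_correct: int) -> list[str]:
    """Return a list of warnings for one multiple-choice question."""
    warnings = []
    if len(options) < 4:
        warnings.append("Fewer than four answer alternatives.")
    lengths = [len(option) for option in options]
    if max(lengths) > 2 * min(lengths):
        warnings.append("Answer alternatives differ strongly in length.")
    for option in options:
        if CUE_WORDS & set(option.lower().rstrip(".").split()):
            warnings.append(f"Possible cue word in option: {option!r}")
    if str(n_correct) not in stem:
        warnings.append("The stem does not state the number of correct options.")
    return warnings

# Example: a deliberately flawed question that triggers several warnings.
print(check_mc_question(
    stem="Which statements about e-exams are correct? (Select 2.)",
    options=["They always take place digitally.",
             "They are handed in on paper.",
             "They run in a self-contained system.",
             "No."],
    n_correct=2,
))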

4. Change in the examination culture

A change in examination culture is needed, and not just because of increasing digitalisation. Rising student numbers, changes to degree programmes and the coronavirus pandemic are driving this change in examination culture. Old structures must be adapted to the new circumstances and a new understanding of examinations must be created. Both familiar and newer models can be used for this.

4.1. Constructive alignment (according to John Biggs)

In constructive alignment, learning processes should match the learning outcome. Learning objectives, teaching and learning methods, learning activities and forms of assessment should be harmonised.

Figure: Schematic representation of constructive alignment, own illustration, based on Wildt and Wildt.

4.2. The SAMR model (according to Dr Ruben Puentedura)

The SAMR model is designed to help teaching staff, lecturers and examiners to digitalise their teaching and examinations step by step. The aim is first to improve and then to redesign teaching and examinations.

SAMR stands for the following four stages:

  • S = Substitution (replacement): Analogue forms are converted directly into digital forms, but not functionally adapted
  • A = Augmentation (extension/supplement): Tasks are functionally adapted, e.g. through the additional use of computers for the fulfilment of tasks
  • M = Modification (reorganisation): Tasks and thus the learning process are redesigned
  • R = Redefinition: new types of tasks and learning processes are used, e.g. programming during an exam

Attention: Technology competence must be available or learnt!

Figure: Schematic representation of the SAMR model

5. Online exams

The use of electronic information and communication channels is central to online examinations.

Online exams can be divided into so-called "take-home" exams (digital remote exams) and digital on-site exams.

Take-home exams are mainly, but not exclusively, used for asynchronous and longer examination tasks.

5.1. "Take-home" tests

As the name suggests, take-home exams do not take place at the university but at home. Such exams are comparable to written exams and should therefore contain tasks that can be completed in a relatively short time. Take-home exams are often set as open-book or "suitcase" exams, so aids are permitted. The examination task is issued and submitted electronically. There are two different types of take-home exams:

"Take-home" exams without supervision
  • Students themselves fear being disadvantaged by fellow students who cheat.
  • High preparation effort (since a pure knowledge test is not suitable for this format).
  • Working time and examination performance tend to be higher, e.g. because the home context is not a formal examination venue.
  • Cheating is controlled through: prevention of cheating, monitoring for cheating and detection of cheating.
  • Cheating is more common at a distance than in face-to-face settings.
  • However, cheating is always possible.
"Take-home" exams with supervision: online proctoring

Online proctoring is location-independent digital examination supervision using special software designed to prevent cheating. The software can record the exam in various forms (screenshots, video, audio, clickstream).

4 levels of proctoring:

  • Level 0: Video conferences with up to 50 students, no recording
  • Level 1: Proctoring with dedicated software, 1 camera, website logging, recording if necessary
  • Level 2: Proctoring with 1 camera, website and application logging plus computer lock-down, recording if necessary
  • Level 3: Proctoring with 2 cameras, complete activity logging plus computer lock-down, recording if necessary

Recordings must be viewed critically. The legal requirements vary depending on the federal state.

Proctoring can be used both remotely and on site, since on-site examinations often take place in screened-off examination rooms where direct observation is difficult.

5.2. Digital on-site examinations

Digital on-site examinations take place at the universities themselves. Various special features need to be taken into account in order to ensure successful implementation on campus. For example, the conditions regarding the premises, infrastructure and hardware must be clarified.

Required spatial and technical infrastructure:

  • Computer pools (fixed computer workstations)
  • Mobile pools (computer workstations that can be set up in different rooms)
  • BYOD - "Bring your own device" (students bring their own devices to take the exam)

Examination options:

  • Open book exams
  • Exams using third-party applications
  • E-exams in the same system

5.3. Results of a student survey

Student survey on online examinations at Bielefeld University

The student survey in the winter semester 2020/2021 showed that 64.5% of students were (very) satisfied with the organisation of the online examinations.

In 42.7% (N=2837) of the online exams, a test run took place prior to the online exam. 80.6% (N=2340) were (very) satisfied with the test runs. 73.7% (N=2837) of the students surveyed were provided with instructions or information material in the run-up to the online exams. And 85.7% (N=2749) found these (very) helpful. This presumably influenced the fact that the students felt (very) well prepared for the online examinations (69.1%, N=2194).

These results show that it also makes sense to prepare students well for the technical conditions of an online exam.

Nevertheless, 34% of respondents experienced technical problems and 27.9% organisational problems (N=3154). For 58.2%, the technical problems could be solved during the examinations. However, among the students for whom the technical difficulties could not be solved or could only be partially solved, almost 20% had to cancel the online examination prematurely.

Furthermore, the students stated that the demands of the online examinations were higher (42%, N=1889) and that the stress level was significantly higher than in face-to-face examinations (44.2%, N=1889).

Technical problems in particular can occur unexpectedly and cause considerable stress. These should be responded to appropriately.

According to the survey, the most popular examination format is an open-book online examination in the online examination system, at 49.1% (N=3042). However, only 31.6% had actually taken this form of examination (N=3762). In the winter semester 2021/22, most of the students surveyed were (very) satisfied with open-book examinations with free editing (79.5%, N=836). In this survey, 46.6% would also like open-book online examinations in the online examination system as a future examination format (N=1720).

6. Implementation of e-examinations

When implementing e-examinations, and digital examinations in general, the following measures are highly relevant: measures relating to examination law and data protection, technical and infrastructural measures, didactic and psychological measures, and organisational and logistical measures.

Figure: Four fields of action for digital examinations, taken and translated from Alexander Schulz.

6.1. Preliminary considerations before an e-exam

Before an e-exam can be carried out, a number of considerations must be made in advance and various factors must be clarified. The following aspects should be considered:

  • Preliminary technical considerations
  • Preliminary legal considerations
  • Preliminary didactic considerations
  • Premises, room bookings if necessary
  • Software, if necessary software training before first use

6.2. Effort and resources required for conducting e-examinations

When conducting e-exams, it is not only theoretical considerations that need to be taken into account; the necessary resources must also be provided:

  • Provide software and, if necessary, hardware.
  • A stable internet connection must be available. When working with the Safe Exam Browser, a second device is required for supervision via Zoom, for example.
  • Staff for monitoring, technical support and content-related questions.
  • Room issues: Rooms must be available and equipped on site; at home, students need the opportunity to take the exam in peace and quiet.
  • Training in digital skills

7. Barriers in digital exams

At the beginning of 2022, a survey on digital accessibility was conducted at the four project universities as part of the SHUFFLE project. Students, teaching staff, lecturers and managers took part in an online survey and in interviews.

In the online survey, students were asked, among other things, about barriers in digital examinations. 60% of the respondents who had already taken part in digital examinations had encountered the following barriers:

  • Processing time
  • Sensory overload
  • Stress and mental strain (due to insecurities about handling digital content and technology, fear of being observed, difficulty concentrating)
  • Problems with a stable internet connection
  • Inadequate hardware and software

The results of the survey indicate that fundamental problems may exist in digital examinations.

One student commented on the processing time as follows:

"Unfortunately, the given time for most online exams were extremely short. Professors should understand that we are humans too and we need time to read, think and answer the questions given. I think they are doing this to prevent us from "cheating", but we are human, it is not possible to respond in a second like robots. How can I focus on questions if I'm looking at the remaining time every ten seconds and my heart beats twice as fast as it should?"

The question "To what extent do you cope with the following aspect: e-exams on the learning platform?" revealed that only around a third of students cope (very) well with e-exams on learning platforms. Just under a third cope only partially with e-exams, and around a third do not cope (at all).

Figure: Diagram for the question "How do you cope with e-exams?", own illustration.

7.1. How can I make my exams as accessible as possible?

To ensure that the examination itself is also accessible and can be completed by all students, the following aspects must be taken into account:

  • Make sure that your exams are designed to be accessible (documents, Moodle tasks) and that the software to be used is accessible.
  • Make sure that students can read through all the tasks before they have to answer them. Provided a fixed order has no added didactic value, this allows them to decide for themselves in which order to complete the tasks.
  • If the order is fixed, this should be communicated to the students in advance.
  • For better orientation, a table of contents should be provided at the start of the exam.
  • Practise using the digital tools with your students so that they can concentrate on the content during the exam.
  • Indicate the processing time at the beginning of the exam and give timely notice during the exam of its approaching end.
  • Provide information on technical support.
  • If it does not conflict with the purpose of the examination, students should be able to choose which processing techniques they wish to use. These should also be described in detail and practised. Possible processing techniques for exams: scanning handwritten texts, uploading digital documents, typing on a PC, dictating to an assistant, etc.
  • Supervisors should be trained to recognise that students with disabilities may behave differently in online settings than other students. For example, they may use measuring devices or particular techniques to calm themselves down, go to the toilet more often, or be supported by an assistant.

7.2. Which question types in Moodle should I avoid?

Not all question types are suitable for use in e-exams, as some of them can present a barrier. Since not all students are able to answer every question type, the following table provides an overview of question types that should be avoided.

Table: Question types that are not accessible

  • Arrangement: Randomly ordered elements must be arranged into a meaningful order. Do not use because it is not accessible for people with visual impairments, motor impairments or mental illnesses; it cannot be operated with a keyboard and cannot be read with a screen reader.
  • Drag and drop onto image: Images or texts are dragged onto drop zones on a background image. Do not use because it is not accessible for people with visual impairments, motor impairments or mental illnesses; it cannot be operated with a keyboard and cannot be read with a screen reader.
  • Drag and drop into text: Missing words in the question text are filled in using drag and drop. Do not use because it is not accessible for people with visual impairments; it cannot be operated with a keyboard and cannot be read with a screen reader.
  • Drag and drop markers: Markers are dragged and dropped onto a background image. Do not use because it is not accessible for people with visual impairments, motor impairments or mental illnesses; it cannot be operated with a keyboard and cannot be read with a screen reader.
  • Extended cloze text: A gap-filling question type that offers drag-and-drop and drop-down answer options, including incorrect options that serve as distractors. Do not use because it is not accessible for people with motor impairments or mental illnesses.
  • Cloze text (Cloze): A very flexible question type; the question text must be coded to create gaps that embed multiple-choice, short-text or numerical questions (see the example after the table). Do not use because it is not accessible for people with motor impairments or mental illnesses.
  • Gap text selection: Missing words in the question text are filled in via drop-down menus. Do not use because it is not accessible for people with motor impairments or mental illnesses.
  • Mark words: Letters or words in a paragraph of text are highlighted. Do not use because it is not accessible for people with visual impairments, motor impairments or mental illnesses.
  • Matching: The answer to each sub-question must be selected from a list of possibilities. Do not use because it is not accessible for people with motor impairments or mental illnesses.

Source: FH Potsdam: Moodle question types of the activity "Test"
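
For context, the "coding" mentioned for the Cloze question type refers to inline answer codes embedded directly in the question text. The following sketch merely illustrates what such a coded text looks like; the content is invented, and the {...} codes follow Moodle's embedded-answers (Cloze) syntax, which should be checked against your own Moodle installation.

# Illustrative sketch of the inline coding used by "Embedded answers (Cloze)"
# questions in Moodle. The question content is invented.

cloze_text = (
    "E-exams take place in a "
    "{1:MULTICHOICE:=self-contained~paper-based~spoken} system. "
    "A handwritten exam that is digitised afterwards is called a "
    "{1:SHORTANSWER:=scan exam}. "
    "Persike recommends at least {1:NUMERICAL:=4} answer alternatives "
    "per multiple-choice question."
)

# The assembled text would be pasted into the question editor, where each
# {...} block becomes an interactive gap.
print(cloze_text)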

 

8. Tips and tricks

The Assessment Toolbox Bern offers, among other things, the option of searching for alternative assessment formats in a structured way (in German).

Tips for digital exams (PDF) are also available as a compact document.

On behalf of the E-Assessment NRW project, Prof. Dr Nikolaus Forgó, Simon Graupe and Julia Pfeiffenbring have produced recommendations for teaching staff and lecturers on how to deal with electronic examinations (PDF), which provide further information.

9. Sources

Bandtel, M., Baume, M., Brinkmann, E., Bedenlier, S., Budde, J., Eugster, B., Ghoneim, A., Halbherr, T., Persike, M., Rampelt, F., Reinmann, G., Sari, Z., Schulz, A. (eds.) (2021): Digitale Prüfungen in der Hochschule. Whitepaper einer Community Working Group aus Deutschland, Österreich und der Schweiz. Version 1.1. Berlin: Hochschulforum Digitalisierung. Available at: https://publikationen.bibliothek.kit.edu/1000138521/131305183, last accessed 08.05.2023.

Budde, Jannica; Tobor, Jens; Beyermann, Jasper (2023): Blickpunkt Digitale Prüfungen. Hochschulforum Digitalisierung. Available online as PDF at: https://hochschulforumdigitalisierung.de/sites/default/files/dateien/HFD_Blickpunkt_Digitale_Pruefungen.pdf, last accessed 08.05.2023.

Gattermann-Kasper, Meike; Peschke, Susanne (2022): Digitale Prüfungen inklusiv gestalten: Didaktische, technische und organisatorische Anforderungen. Presentation of 11.11.2022, available online as PDF at: https://www.studentenwerke.de/sites/default/files/praesentation_peschke_gattermann.pdf, last accessed 09.05.2023.

Krückeberg, Jörn & Markus, Holger (2013): Unterstützung von Leistungsbewertungen. In M. Krüger & M. Schmees (eds.), E-Assessments in der Hochschullehre. Frankfurt am Main: Peter Lang.

Leibniz-Institut für Wissensmedien (2021): Constructive Alignment. https://www.e-teaching.org/didaktik/konzeption/constructive-alignment, updated on 21.02.2023, last accessed 08.05.2023.

Persike, Malte (2021): Infopoint Hochschullehre: Digitale Prüfungen – Konzepte, Technik, Praxis, 24.09.2021. Hochschulforum Digitalisierung. Online video, last accessed 08.05.2023.

Schmees, Markus & Horn, Janine (2014): E-Assessments an Hochschulen. Ein Überblick. Szenarien. Praxis. E-Klausur-Recht. Band 1. Münster, New York: Waxmann.

Wildt, Johannes & Wildt, Beatrix (2011): Lernprozessorientiertes Prüfen im „Constructive Alignment”. In B. Berendt, H.-P. Voss & J. Wildt (eds.), Neues Handbuch Hochschullehre, Teil H: Prüfungen und Leistungskontrollen. Weiterentwicklung des Prüfungssystems in der Konsequenz des Bologna-Prozesses (pp. 1–46). Berlin: Raabe.