Designing a Secure Exam Management System (SEMS)

SEMS EXAM ENGINE CORE SERVICES AND FUNCTIONALITIES

The Quiz Engine embedded in Moodle is not built on a Service Oriented Architecture. It is implemented as a monolithic body of PHP code that must be accessed through standard web browsers, which are relatively slow on mobile devices and cannot address the exam security issues that exist in the m-learning environment. The Moodbile services extension to Moodle does not touch Moodle's Quiz Engine. Thus, we need to develop a new Quiz Engine that can be deployed as a service-oriented application, so that its services can be consumed by a mobile application designed to cater to m-learning-specific security requirements. It should also be integrable with Moodle/Moodbile in order to form a complete LMS that suits the m-learning environment and addresses all of its security issues. The core services of the proposed Exam Engine are discussed below.

1. Secure and Random Distribution of Exam Questions

This service provides the following functionalities:

1. Enabling the teacher to define a bank of exam questions and to link them to his/her subject through an appropriate interface (Subject's Question Bank Interface). For objective questions, each question may have a set of options; the teacher provides those options through the same interface and specifies the correct choices among them, enabling the exam engine to auto-evaluate students' answers. For descriptive questions, a text box (or possibly a sketching canvas) will appear below each question on the student's device screen to allow him/her to write/draw the answer; those answers are saved on the server side to be reviewed and evaluated later by the teacher. In addition, each question has a property specifying its difficulty level (say: A, B, C, D, and E).
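
The question-bank model described above can be sketched roughly as follows. All class and field names here are illustrative assumptions, not part of SEMS itself; an objective question carries options and a set of correct indices, while a descriptive question carries neither.

```python
from dataclasses import dataclass, field

@dataclass
class Question:
    text: str
    level: str                                   # difficulty level: "A".."E"
    options: list = field(default_factory=list)  # empty for descriptive questions
    correct: set = field(default_factory=set)    # indices of correct options

    @property
    def is_objective(self) -> bool:
        # a question with options is objective; otherwise it is descriptive
        return bool(self.options)

@dataclass
class QuestionBank:
    subject: str
    questions: list = field(default_factory=list)

    def add(self, q: Question):
        self.questions.append(q)

    def by_level(self, level: str):
        # used later when composing an exam with a given level mix
        return [q for q in self.questions if q.level == level]
```
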

2. Enabling the teacher to specify a subject's exam properties, such as date and time, duration, and the percentage of questions from each difficulty level in the exam paper, through an appropriate interface (Subject's Exam Setup Interface).

3. Securely authenticating and enrolling students into exams at the predefined date and time through the Exam Enrollment Interface, using any of the well-known secure authentication mechanisms. Multifactor authentication can be adopted for stronger security, as explained in Section 2.4.

4. Creating exam instances by randomly distributing exam questions to the enrolled students' mobile/tablet devices according to the predefined exam properties, such as the percentage of each question level. This means that questions do not reach students in the same order. Moreover, the multiple choices of each question, in the case of objective questions, are flipped randomly and delivered differently to each student. The Exam Server associates the exam questions with a message digest signed by its private key to ensure data integrity. The Exam Server also has to memorize the way it has distributed the questions to each student, in order to evaluate the answers once the students submit them back to the Exam Server. This process, illustrated in Fig. 2, guarantees that each student gets a different question order and makes cheating by "hand signals" practically impossible. The prepared question bank is reusable: teachers can always enrich their courses' question banks by adding new questions or upgrading old ones during the semester. At exam time, it is the responsibility of the Exam Server to create exam instances out of the question bank. Incorporating the "question level" concept helps the Exam Server prepare a balanced mix of questions while selecting them from the question bank.
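
The per-student instance creation in step 4 can be sketched as below, under the assumption that an exam is a list of question dicts with "text" and "options" keys. The server records each student's permutations (the "memorized" distribution) so answers can be mapped back at evaluation time; the function and key names are illustrative.

```python
import random

def make_instance(questions, student_id, seed=None):
    # a per-student RNG; seeding from the student id is an illustrative choice
    rng = random.Random(seed if seed is not None else hash(student_id))
    q_order = list(range(len(questions)))
    rng.shuffle(q_order)                       # each student gets a different question order
    instance, option_maps = [], []
    for qi in q_order:
        q = questions[qi]
        opt_order = list(range(len(q["options"])))
        rng.shuffle(opt_order)                 # options are flipped randomly per student
        instance.append({"text": q["text"],
                         "options": [q["options"][oi] for oi in opt_order]})
        option_maps.append(opt_order)
    # the server memorizes (q_order, option_maps) to evaluate answers later
    return instance, {"q_order": q_order, "option_maps": option_maps}
```
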

5. Students answer the exam questions through the Exam Client Software Interface. Their answers are then submitted to the Exam Server along with a signed message digest to ensure integrity.
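
The paper specifies a message digest signed with the Exam Server's private key; as a self-contained stand-in, the sketch below uses an HMAC over a shared secret to show the integrity check itself. The key handling and payload shape are assumptions for illustration only.

```python
import hmac, hashlib, json

def sign(payload: dict, key: bytes) -> str:
    # canonicalize the payload so both sides hash identical bytes
    blob = json.dumps(payload, sort_keys=True).encode()
    return hmac.new(key, blob, hashlib.sha256).hexdigest()

def verify(payload: dict, digest: str, key: bytes) -> bool:
    # constant-time comparison to avoid timing side channels
    return hmac.compare_digest(sign(payload, key), digest)
```
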

6. Processing students' answers to determine their grades in the test. The Exam Server evaluates students' answers against the questions' correct solutions predefined by the teacher, and then generates the appropriate reports.
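
Auto-evaluation in step 6 closes the loop with the distribution record memorized in step 4: the server undoes each student's question and option shuffles before comparing against the answer key. The record layout matches the `{"q_order", "option_maps"}` shape assumed earlier and is illustrative.

```python
def grade(answers, dist, correct_sets):
    # answers: the student's chosen option index per instance position
    # dist: the {"q_order", "option_maps"} record memorized at distribution time
    # correct_sets: per original question, the set of correct option indices
    score = 0
    for pos, ans in enumerate(answers):
        original_q = dist["q_order"][pos]               # undo the question shuffle
        original_opt = dist["option_maps"][pos][ans]    # undo the option shuffle
        if original_opt in correct_sets[original_q]:
            score += 1
    return score
```
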

7. Reporting: The Exam Engine has to generate a set of reports to enrich the assessment process, such as:
· Subject's Exam Report: reflects statistical information about a particular exam (students' grades; minimum, maximum, and average grade; etc.).
· Student's General Report: reflects general information about the performance of a particular student over the whole semester/year. It shows his/her scored marks in all subjects and calculates his/her GPA and other statistical values.
· Teacher's Report: shows the average performance of students in all the subjects given by a particular teacher.
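
A minimal sketch of the statistics named in the Subject's Exam Report; the function name and output keys are assumptions.

```python
from statistics import mean

def exam_report(grades):
    # summary statistics over one exam's grade list
    return {"min": min(grades),
            "max": max(grades),
            "average": round(mean(grades), 2),
            "count": len(grades)}
```
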

2. Turbo-Mode Assessment

This service can be useful for rapidly conducting arbitrary quizzes during class time. Assuming we have five levels of questions (A, B, C, D, and E), the Exam Server starts by asking each student questions of level C. According to the student's answers, it then increases or decreases the level of the questions in a reactive manner. As a result, a student's level can be determined using fewer questions and in a shorter time, much like a binary search.
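
The reactive level adjustment can be sketched as a one-step move per answer, starting from the middle level C. The assumption that A is the hardest level and E the easiest is ours; the paper does not fix an ordering.

```python
LEVELS = ["A", "B", "C", "D", "E"]   # assumed: A hardest ... E easiest

def next_level(current: str, answered_correctly: bool) -> str:
    # move one level harder on a correct answer, one level easier otherwise,
    # clamped at both ends (a binary-search-like probe of the student's level)
    i = LEVELS.index(current)
    i = max(0, i - 1) if answered_correctly else min(len(LEVELS) - 1, i + 1)
    return LEVELS[i]
```
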

3. Preventing the “Unattended Exam” Issue

In a Wi-Fi based network, we cannot guarantee that each student attends an exam from a dedicated classroom. A student can simply sit in a nearby room and log in to the exam system through the Wi-Fi network. He/she can subsequently open his/her course notes and use them to answer the questions illegally. To counter this issue, we propose the following strategies.

4. Providing an Appropriate Mechanism for Anti-impersonation

Student authentication for exam enrollment is a serious issue, especially when a large number of students attend the exam and the proctor does not know all of them personally. A student may employ an impersonator, providing him/her with the necessary credentials, to attend the exam on his/her behalf. To prevent impersonation, we recommend the adoption of a well-known biometric-based authentication technology, such as face recognition, to serve as a supplementary access key. Face recognition is a long-standing problem that has been studied extensively, and several well-established techniques have made it a very common authentication approach [30], [31], [42]. There are plenty of methods available in the literature, which can be classified as template-based vs. geometric-based, appearance-based vs. model-based, and holistic vs. piecemeal. Due to the ever-increasing use of mobile devices, new algorithms for secure authentication on such devices attract considerable attention from the research community [43], [44], [45]. The computational load imposed by face recognition algorithms is generally one of the key issues. However, the current computational power of mobile devices paves the way for real-time face recognition applications. Extensive research effort is dedicated to improving the real-time performance of face recognizers by utilizing the embedded GPUs available on mobile devices [46], [47], [48]. In SEMS, we plan to utilize the OpenCV library [49], which allows easier development of proven face recognition algorithms such as Eigenfaces, Fisherfaces, and Local Binary Pattern Histograms. OpenCV supports exploiting the parallel processing power of GPUs. In the proposed system, a face recognition module will be integrated with the OAuth protocol as a second authentication factor.
A student will first be authenticated using his/her own username and password, whereupon he/she will be prompted to pose in front of his/her mobile/tablet device camera. The software on the student's device will be responsible for capturing a proper face image. Since the current computational power of mobile devices allows us to implement the feature extraction stage of face recognition, we propose to implement this stage as a service on the mobile device. The extracted features will be sent to the server to be compared against the student's registered face features, and a confirmation will be sent back to the mobile device to approve the student's identity. Spoofing attacks are a concern with respect to face recognition security, but they are being addressed by the research community [50], [51]. There are plenty of techniques, such as liveness detection [52] and progressive authentication [53], which can easily be integrated into the face recognition module to counter spoofing. Thus, we believe that face recognition is a usable, highly secure, and efficient biometric-based authentication mechanism that can be adopted as a second authentication factor.
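
The server-side match step can be pictured, in a deliberately simplified form, as comparing the feature vector extracted on the device (e.g., by an OpenCV-based pipeline) against the enrolled template. The Euclidean-distance threshold below is purely an assumption for illustration; a real recognizer such as OpenCV's LBPH model reports its own confidence score instead.

```python
import math

def matches(enrolled, probe, threshold=10.0):
    # enrolled, probe: equal-length numeric feature vectors (assumed format)
    # accept when the Euclidean distance falls under an assumed threshold
    dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(enrolled, probe)))
    return dist <= threshold
```
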

5. Preventing Students from Exchanging Mobile/Tablet Devices during an Exam

Beyond all the enforced security mechanisms discussed earlier, and those to be discussed later in this paper, students might still attempt to cheat by simply exchanging their mobile/tablet devices after getting authenticated by the Exam Server. To prevent this, ECS re-authenticates students biometrically by asking them, on a random basis, to present their faces in front of the mobile camera. With this mechanism, students cannot exchange their devices during an exam after getting authenticated, since the system can ask them to re-confirm their identity at any point in time. Moreover, the proctor software will have the functionality to force a particular student attending an exam to be re-authenticated by the system if any suspicious case occurs. It can simply signal the corresponding student's ECS to re-initiate the authentication process; ECS will always respond to this signal when it comes from the exam's registered proctor device.
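
The random re-authentication schedule can be sketched as drawing challenge times uniformly over the exam window; the function and parameter names are illustrative, and a real deployment could also trigger challenges on proctor demand as described above.

```python
import random

def schedule_challenges(exam_minutes, n_challenges, rng=None):
    # draw n challenge times uniformly at random within the exam window,
    # returned in chronological order
    rng = rng or random.Random()
    return sorted(rng.uniform(0, exam_minutes) for _ in range(n_challenges))
```
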

6. Following the Widely Accepted Industrial Standards

The SEMS Exam Engine must conform to the well-known and widely adopted set of standards and specifications developed by the IMS Global Learning Consortium (IMS-GLC) [54]. IMS-GLC is a specification authoring organization comprising distributed computer learning system vendors, publishers, digital content vendors, government agencies, universities, training organizations, and other interested parties. It is a global, non-profit member organization supported by over 190 of the world's leaders in educational and learning technology. It has approved and published some 20 standards, which are the most widely used learning technology standards in higher education around the globe. These cover meta-data, content packaging, enterprise services, question & test, competencies, tools interoperability, sharable state persistence, vocabulary definition, and learning design. All IMS-GLC standards are available free of charge via the IMS-GLC web site and can be used without royalty. The IMS Question & Test Interoperability (QTI) specification enables the exchange of item, test, and results data between authoring tools, item banks, test construction tools, learning systems, and assessment delivery systems.