Open Access Library Journal
Vol.05 No.11(2018), Article ID:88708,10 pages
10.4236/oalib.1104919

Design and Analysis of an Electronic Platform for Examinations*

Daniel Paul Godwin1, Modi Bala2, Abdulfatai Habib1

1Department of Computer Science, Faculty of Science, Kebbi State University, Birnin Kebbi, Nigeria

2Department of Computer Science, Faculty of Science, Gombe State University, Gombe, Nigeria

Copyright © 2018 by authors and Open Access Library Inc.

This work is licensed under the Creative Commons Attribution International License (CC BY 4.0).

http://creativecommons.org/licenses/by/4.0/

Received: September 18, 2018; Accepted: November 20, 2018; Published: November 23, 2018

ABSTRACT

An alternative approach to conducting multiple-choice tests is the e-Examination platform, or simply the Computer-Based Test (CBT). The CBT is necessary considering the large population of students enrolled in Nigerian secondary schools and higher institutions of learning, although student numbers and the unique nature of some departmental courses still hinder its full implementation. This paper investigates the challenges attributed to the manual processes of conducting such tests and examinations. It also examines the potential for using student feedback in the validation of assessments, attempting to minimize the difficulties associated with it. The process was further modified to run in parallel, making it faster; this helped to minimize human error, improve the accuracy of the CBT and bring quality and transparency to the process. A survey of 230 students from various schools was taken to sample their opinions through a carefully worded questionnaire. The data collected were then collated and analyzed. The analysis showed that more than 95% of the students surveyed were already competent in the use of computers and the CBT platform. More than 90% of them found the platform easy to navigate and use, and about 98% said that the platform was a much better alternative to the manual process of conducting the same tests and examinations.

Subject Areas:

Software Engineering, Simulation of Software Functionality

Keywords:

CBA, CBT, Algorithm, SQL, Dreamweaver

1. Introduction

An e-Examination platform, or Computer-Based Assessment or Test (CBA or CBT), is an e-assessment: a computer-administered method for delivering examinations in which responses to questions are electronically recorded, assessed, or both. In this particular case, a standard pre-test-intervention-post-test design [1] was not utilized; rather, CBT here implies the use of devices such as computers, cell phones and iPads [2]. In recent times, CBTs have been introduced as new assessment platforms in some tertiary educational institutions within Nigeria. The medium is considered faster and more efficient than the traditional paper-and-pencil approach [3] [4]. Some tertiary educational institutions in Nigeria, such as Gombe State University, Gombe, and a few selected ICT-compliant secondary schools in Gombe, tested the software to that effect: about 230 students were sampled at random for a trial run with the home-grown software discussed in this paper. Other institutions, such as Kebbi State University, Kano State University, the University of Lagos, the National Open University of Nigeria, polytechnics and colleges of education, have also introduced CBT for their yearly entrance examinations, starting with the Post University Matriculation Examination (Post-UTME).

CBTs are also used for semester examinations, especially where classes and student numbers are very large. The system enables educators and trainers to author, schedule, deliver and report on surveys, quizzes, tests and other types of examinations. CBTs can be built as stand-alone systems or as parts of virtual learning environments accessible through the World Wide Web (WWW). A collection of tools is available on most CBT platforms to enable automatic marking of responses to multiple-choice questions.
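Automatic marking of multiple-choice responses reduces to comparing each response against an answer key. The sketch below illustrates this in Python; the question IDs, answer key and responses are hypothetical placeholders, not data from the platform.

```python
# Minimal sketch of automatic multiple-choice marking.
# The answer key and the candidate's responses are hypothetical.

def mark_responses(answer_key, responses):
    """Return (score, total) for one candidate's answers."""
    score = sum(1 for qid, correct in answer_key.items()
                if responses.get(qid) == correct)
    return score, len(answer_key)

answer_key = {"q1": "C", "q2": "A", "q3": "D"}  # hypothetical key
responses = {"q1": "C", "q2": "B", "q3": "D"}   # hypothetical answers

score, total = mark_responses(answer_key, responses)
print(f"{score}/{total}")  # 2/3
```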

A good example of a CBT platform is the Business Language Testing Service (BULATS), developed and managed by the English Department of Cambridge University, United Kingdom. The system is a highly sophisticated online test platform that determines a candidate's ability quickly and accurately using adaptive testing techniques. As a candidate progresses through the test, the computer selects subsequent questions on the basis of previous answers; the questions become progressively easier or more difficult until a consistent level of ability is reached. Many candidates find the individual, non-distracted environment, and often the immediate score report, attractive features of the CBT system. This paper discusses a CBT platform that incorporates unique tools to improve upon existing platforms.
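The adaptive scheduling described for BULATS can be illustrated with a toy rule that raises the difficulty after a correct answer and lowers it after a wrong one. This is only a simplified sketch of the idea, not Cambridge's actual algorithm; the difficulty bounds are assumptions.

```python
def next_difficulty(current, was_correct, lo=1, hi=10):
    """Toy adaptive rule: step difficulty up after a correct answer,
    down after a wrong one, clamped to the range [lo, hi]."""
    step = 1 if was_correct else -1
    return max(lo, min(hi, current + step))

# A candidate answering right, right, wrong, starting at difficulty 5:
d = 5
for correct in (True, True, False):
    d = next_difficulty(d, correct)
print(d)  # 6
```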

The remainder of this paper is organized thus: A brief summary of related works is described in Section 2. Section 3 contains the background of the work, and Section 4 defines how the proposed system is designed. Section 5 explains the method used for data collection. Section 6 discusses the result, while Section 7 concludes the discussion.

2. Related Works

Researchers have recently performed a large-scale review examining performance differences between CBT and paper-based tests. The review showed that, when a CBT is similar in format to a pencil-and-paper test, it has little or no effect on test performance [5]. From the student's perspective on CBT [4], there have been a number of mixed reactions, and previous research has shown that many people anticipated problems with CBT.

This paper argues that the inexorable advance of technology will force fundamental changes in the format and content of assessment. Education leaders in several states and numerous school districts are already acting on that implication, implementing technology-based tests for low- and high-stakes decisions in elementary and secondary schools and across all key content areas [6], which will ensure standardization of testing procedures [7]. It is difficult, however, to predict exactly when this transition will be complete. [3] has shown that CBT can expand the types of cognitive skills and processes measured. By allowing data to be collected during such examinations, an accurate distinction can be made between “omitted” and “not reached” items, and response latency can also be recorded. Furthermore, [3] showed that provision can be made on a CBT system to produce large print and audio for vision-impaired examinees [8].

CBTs are economical, accurate and time-bound. As such, primary, secondary and tertiary institutions can adopt this system to solve the challenges noted above. Examination bodies such as the Joint Admissions and Matriculation Board (JAMB) in Nigeria have already adopted a system that caters for their examinations across more than 500 CBT centers nationwide, which has helped to overcome challenges facing such examinations.

Another challenge facing CBT test designers and administrators is how to design and construct CBT software that is fair, reliable and capable of producing valid test results. CBT candidates can find it difficult to navigate backwards to rework problems. Some are resistant to the computerized testing process because they are accustomed to taking notes and circling questions. Others say that they read more quickly and more easily on paper [9] than on a glaring computer screen [10]. This paper therefore proposes a system capable of handling these problems, and more.

3. Background

A logarithmic algorithm, specifically a modified Quicksort, was used in implementing the system. The algorithm is based on the classic divide-and-conquer approach: it generates and processes a set of array indexes that can run simultaneously to generate questions and answers automatically.

Algorithm 1: Generation of survey questions and answers.

Begin

Repeat

DISPLAY "Type in the letter chosen, or 1 - 20 to finish"

DISPLAY "Question 1"

DISPLAY "A: Zero"
DISPLAY "B: Single double bond"
DISPLAY "C: More than one double bond"
DISPLAY "D: Two double bonds"
DISPLAY "Enter correct answer"

ACCEPT letter
IF letter = 'A' THEN Zero = Zero + 1
IF letter = 'B' THEN SingleDoubleBond = SingleDoubleBond + 1
IF letter = 'C' THEN MoreThanOneDoubleBond = MoreThanOneDoubleBond + 1
IF letter = 'D' THEN TwoDoubleBonds = TwoDoubleBonds + 1

UNTIL letter = 'A'

DISPLAY "Zero scored", Zero, "wrong answer"
DISPLAY "Single double bond scored", SingleDoubleBond, "wrong answer"
DISPLAY "More than one double bond scored", MoreThanOneDoubleBond, "correct answer"
DISPLAY "Two double bonds scored", TwoDoubleBonds, "wrong answer"

End Loop

End

Algorithm 2: Generation of Questions

begin

Step 1: Create an array A[ ] of size N

Step 2: Generate a random number "rand"

Step 3: CA-GRF {

count = 0

for i = 0 to N

if (rand is valid)

{

rand = test(rand, i)

A[i] = rand

count = count + 1

}

else

return 0

}

int test(rand, i)

{

int j = 0

for j = i + 1 to N

if (A[j] != rand)

return rand

else

{

regenerate rand until rand != A[j]

return rand

}

}

end begin
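A runnable interpretation of Algorithm 2, assuming its intent is to fill the array A[ ] with N distinct random question indexes drawn from a question pool (the pool size is an assumed parameter, not named in the original):

```python
import random

def generate_question_indexes(n, pool_size):
    """Fill an array with n distinct random indexes from 0..pool_size-1,
    retrying whenever a duplicate is drawn (this mirrors the duplicate
    check performed by test() in Algorithm 2)."""
    if n > pool_size:
        return []  # cannot draw n distinct values from a smaller pool
    a = []
    while len(a) < n:
        rand = random.randrange(pool_size)
        if rand not in a:  # the duplicate test
            a.append(rand)
    return a

indexes = generate_question_indexes(10, 50)
print(len(indexes), len(set(indexes)))  # 10 10
```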

4. Proposed System Design Objectives

The main objective of this research is to transform the manual examination process, the Pencil-Paper Test (PPT), into an automated system. Such a system gives the user an interactive design and implementation called an Online Computer-Based Test System (OCBTS). The specific objectives of this research are as follows:

1) To develop a CBT system that will automatically replace the manual form of examination system.

2) To develop a CBT system that automatically generates examination questions.

3) To develop a CBT system that automatically generates examination numbers for students.

4) To ascertain the operational effectiveness of the system.

5) To introduce a means of training students on or before the actual application of the CBT system, called Computer-Assisted Learning (CAL).

6) To develop a CBT system with enhanced security features to avoid exam malpractice.

7) To design a CBT system with real-time processing of results for candidates automatically.

8) To allow for bursary and online payments.

5. Methodology

The methodology of this paper covers the paper format, the questionnaires and the methods used in data collection.

5.1. Research Questions

The following five research questions as seen on Table 1 were formulated to address the problems identified in this study, namely:

1) What are the issues peculiar to the use of CBT among students?

2) What are the general constraints on the use of CBT for the assessment of students?

3) What are the effects of the test administration mode on students' performance, i.e. students' scores?

4) What is the relationship between prior computer experience and performance in CBTs?

5) What practices are helpful to improve the perception about CBTs?

Table 1. Paper format.

5.2. Software Specification Requirements

1) User Interface: PHP, XHTML, CSS, JQUERY

Client-side Scripting: JavaScript, PHP Scripting

Programming Language: PHP, ASP

IDE/Workbench/Tools: Adobe Dreamweaver CS6, NetBeans.

Database: MySQL (optionally SQLite or Oracle 10g).

Server Deployment: Apache 2.2.4, Apache Tomcat

2) Hardware Specification Requirements:

Monitor: 17-inch LCD screen (optional).

Processor: Pentium 3/4, Intel dual-core, or Core i7.

Hard Disk: 500 GB or 1 - 4 Terabyte.

RAM: 4 GB or more.

5.3. The System Design and Implementation

The system design, a major component of the system analysis process, comprises: the administrative account panel (i.e. for registration and login authentication), the user login panel, question generation, and PIN-code or application-number generation.

The workflow of the system enables the user to easily understand the question-generation process. The system also provides a graphical user interface (GUI), a simple interactive interface for entering details concerning question generation, i.e. input to the database. The system administrator registers users and grants them access rights to the system (Figure 1).
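The registration and login-authentication panel described above can be sketched with a salted-hash password check. The user store and field names below are illustrative assumptions, not the platform's actual schema, and a production system should prefer a dedicated password-hashing function such as bcrypt or Argon2.

```python
import hashlib
import os

def register(users, username, password):
    """Store a salted SHA-256 digest of the password (sketch only)."""
    salt = os.urandom(16)
    digest = hashlib.sha256(salt + password.encode()).hexdigest()
    users[username] = (salt, digest)

def login(users, username, password):
    """Return True only if the password matches the stored salted hash."""
    if username not in users:
        return False
    salt, digest = users[username]
    return hashlib.sha256(salt + password.encode()).hexdigest() == digest

users = {}
register(users, "admin", "s3cret")      # hypothetical credentials
print(login(users, "admin", "s3cret"))  # True
print(login(users, "admin", "wrong"))   # False
```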

The architecture of the system is structural in design and constitutes three essential parts: the GUI, Front-End and Back-End modules.

1) The GUI defines the structural design and how the system looks after implementation, providing a unique and interactive platform that suits user needs.

2) The Front-End (FE) comprises everything the user sees, including the design and the programming languages used in building the system, such as PHP, MySQL, HTML, Bootstrap and CSS.

Figure 1. System design architecture.

3) The Back-End (BE), otherwise called the Server-Side, handles system inputs, retrieval, editing and updates. It covers everything the user cannot see in the browser, such as the database and the servers used.

6. Results and Discussion

The test plan is basically a list of tests. A convenience sample of about 50 questions per student was taken on the developed CBT. Afterwards, a survey questionnaire was used for data collection. The data analysis demonstrated auspicious characteristics of the target context for the CBT implementation. A few students were not in favor of it, owing to the impaired validity of the test administration: they reported erroneous formulas, equations and structures in some test items. The test process deployed provides immediate scoring, speed and transparency in marking. It is important to note that the test cases cover all aspects of the question-generating system.

Description of Test Results

Tables 2-6 and Figures 2-16 present the results of the survey carried out on 230 students in total, conducted by the administrator of the platform. The questions and answers were carefully worded, selected and formulated to ascertain the students' level of CBT competence/awareness, the ease of use of the platform, and their preference for it over the manual process of examinations.

The three aforementioned levels of questioning were each ranked on a scale of 1 - 3: not too well (1), well (2) and very well (3). The idea is to compare the three levels of questioning, bearing in mind that age and gender could also be of great significance. The population of students concerned was well above 2000, so a random sample averaging 57 students per school (roughly 10%) was used to estimate the general opinion of the students. The students were randomly passing by while being issued

Table 2. Survey of 1 - 50 students for determining the performance of the CBT platform for Gombe State University, Gombe-Nigeria.

Table 3. Survey of 1 - 80 students for determining the performance of the CBT platform for Matrix International Academy, Gombe-Nigeria.

Table 4. Survey of 1 - 60 students for determining the performance of the CBT plat form for Yahaya Ahmed Schools, Gombe-Nigeria.

Table 5. Survey of 1 - 40 students for determining the performance of the CBT platform for Gombe High School, Gombe-Nigeria.

Table 6. Survey of 1 - 50 students for determining the performance of the CBT platform for Gombe International School, Gombe-Nigeria.

Figure 2. Age vs level of competence, ease of use and preference.

Figure 3. Comparison between the levels of competence, ease of use and preference.

Figure 4. The age distribution of surveyed students.

Figure 5. Age vs level of competence, ease of use and preference.

Figure 6. Comparison between the levels of competence, ease of use and preference.

Figure 7. The age distribution of surveyed students.

Figure 8. Age vs level of competence, ease of use and preference.

Figure 9. Comparison between the levels of competence, ease of use and preference.

Figure 10. The age distribution of surveyed students.

Figure 11. Age vs level of competence, ease of use and preference.

Figure 12. Comparison between the levels of competence, ease of use and preference.

Figure 13. The age distribution of surveyed students.

Figure 14. Age vs level of competence, ease of use and preference.

Figure 15. Comparison between the levels of competence, ease of use & preference.

Figure 16. The age distribution of surveyed students.

with invitations to attend a trial of the developed CBT platform using a demo test. In [11] [12], the instrument of the study consisted of four tests measuring problem solving, inductive reasoning, working memory and creativity; a questionnaire was also designed to capture participants' demographic data, learning strategies and ICT familiarity. In this paper, however, after the students sat for the demo test, their level of CBT competence/awareness, their ease of use/navigation, and their preference for CBT over the manual pen-and-paper process of examinations were recorded. From Table 2 and Figure 2, representing the 50 students selected from Gombe State University, Gombe, it can be clearly seen that age varied among the students, with about 85% of the students being very competent in, or at least very aware of, what a CBT platform is and how it works. About 90% of them said the platform was easy to navigate, and about 95% preferred the CBT platform to the conventional manual process of conducting tests and examinations. Figure 3 shows that more than 72% of them were generally satisfied on all three levels, while Figure 4, from the same institution, highlights the variation in the students' ages: more than 66% of them were above the age of 19, as seen in Figure 2 and Figure 4. This suggests that the older the students are, the more experience they have with CBT applications. However, the statistics portray a slightly different picture for secondary-school students, who are a little younger than their higher-institution counterparts, as seen in Figure 2.
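The per-school percentages reported in this section reduce to simple tallies over the 1 - 3 scale. A sketch of that aggregation, using hypothetical ratings rather than the actual survey data:

```python
def percent_at_least(ratings, threshold):
    """Percentage of ratings at or above a threshold on the 1 - 3 scale
    (1 = not too well, 2 = well, 3 = very well)."""
    hits = sum(1 for r in ratings if r >= threshold)
    return 100.0 * hits / len(ratings)

# Hypothetical ease-of-use ratings from 10 students:
ratings = [3, 3, 2, 3, 2, 3, 1, 3, 2, 3]
print(percent_at_least(ratings, 2))  # 90.0
```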

Table 3 and Figure 5 represent Matrix International Academy, where 80 students were surveyed during their test. The age distribution varied among the students, with about 70% of them aged above 13 years, as seen in Figure 5 and Figure 7. More than 88% of the students were very competent and very aware of how a CBT platform works, while about 20% of them did not find the platform easy to navigate. Furthermore, about 80% of the students preferred the CBT platform to the conventional manual process of conducting tests and examinations. Figure 6 shows that more than 80% of the students were generally satisfied with their level of CBT awareness, the ease of navigation, and their preference for CBT over the manual process of examinations.

Table 4 and Figure 8 represent Yahaya Ahmed Schools, where 60 students were surveyed during their test. The age distribution varied among the students, with about 60% of them aged above 13 years, as seen in Figure 8 and Figure 10. More than 85% of the students were very competent and very much aware of how a CBT platform works, and only about 6% of them did not find the platform easy to navigate. Furthermore, about 80% of the students preferred the CBT platform to the conventional manual process of conducting tests and examinations. Figure 9 shows that more than 96% of the students were generally satisfied with their level of CBT awareness, the ease of navigation, and their preference for CBT over the manual process of examinations. The branch of the school that was surveyed is female-only, as seen in Table 4.

Table 5 and Figure 11 represent Gombe High School, where 40 students were surveyed during their test. The age distribution also varied among the students, with about 62% of them aged above 13 years, as seen in Figure 11 and Figure 13. More than 90% of the students were very competent and very much aware of how a CBT platform works, and only about 5% of them did not find the platform easy to navigate. Furthermore, about 98% of the students preferred the CBT platform to the conventional manual process of conducting tests and examinations. Figure 12 shows that more than 95% of the students were generally satisfied with their level of CBT awareness, the ease of navigation, and their preference for CBT over the manual process of examinations.

Table 6 and Figure 14 represent Gombe International School, where 50 students were surveyed during their test. The age distribution also varied, with about 32% of the students aged between 14 and 16 years, as seen in Figure 14 and Figure 16, showing that a majority of these students are younger. Only about 16% of the students were not competent in, or aware of, how a CBT platform works, and only about 2% of them did not find the platform easy to navigate. Generally, almost all the students preferred the CBT platform to the conventional manual process of conducting tests and examinations, as seen in Figure 15.

7. Conclusions

This paper presented a logarithmic Quicksort-based algorithm for a highly constrained question-and-answer CBT system. The approach used a problem-specific domain representation, with stochastic and context-based reasoning, to obtain feasible results for questions answered by a student within a reasonable time.

The system uses a procedural method to generate questions and answers automatically, making it easy to understand and quick to process. The system completely eliminates the manual process of writing examinations using the conventional pen-and-paper style.

The system automatically handles the other time-consuming processes of generating examination scores and setting up examination questions, such that over time a considerable database of questions can be queried and reset at random for an arbitrary number of sessions. By implication, acquiring more infrastructure to accommodate more students is not required, because every session of the same examination is scrambled and reset for the next set of students sitting the same test or examination.
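The per-session scrambling described above can be sketched as drawing and shuffling a fresh subset of the question bank for every sitting; the bank contents and sizes here are hypothetical:

```python
import random

def scramble_session(question_bank, n_questions, seed=None):
    """Draw n_questions at random from the bank and shuffle their order,
    so every session sees a different selection and arrangement."""
    rng = random.Random(seed)
    picked = rng.sample(question_bank, n_questions)
    rng.shuffle(picked)
    return picked

bank = [f"Q{i}" for i in range(1, 101)]  # hypothetical 100-question bank
session_a = scramble_session(bank, 50, seed=1)
session_b = scramble_session(bank, 50, seed=2)
print(session_a != session_b)
```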

Future work will entail real-time generation, content analysis and reporting for management information system (MIS) purposes. It will also include aspects of machine learning, particularly the use of a Recurrent Neural Network (RNN) to determine and predict the level of collisions likely to occur whenever fresh questions are scrambled.

Acknowledgements

We acknowledge our Creator for the knowledge passed down. Our families, friends and colleagues are appreciated for their efforts. The open-source software communities are also acknowledged for providing freely downloadable software, particularly PHP, XHTML, CSS and jQuery for interface design, and Apache 2.2.4 and Apache Tomcat for service deployment. This has made research much easier.

Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this paper.

Cite this paper

Godwin, D.P., Bala, M. and Habib, A. (2018) Design and Analysis of an Electronic Platform for Examinations. Open Access Library Journal, 5: e4919. https://doi.org/10.4236/oalib.1104919

References

1. Baker, R.S., D’Mello, S.K., Rodrigo, M.M.T. and Graesser, A.C. (2010) Better to Be Frustrated Than Bored: The Incidence, Persistence, and Impact of Learners’ Cognitive-Affective States during Interactions with Three Different Computer-Based Learning Environments. International Journal of Human-Computer Studies, 68, 223-241.
https://doi.org/10.1016/j.ijhcs.2009.12.003

2. Lee, K.S.K., Wilson, S., Perry, J., Room, R., Callinan, S., Assan, R., Hayman, N., Chikritzhs, T., Gray, D., Wilkes, E., Jack, P. and Conigrave, K.M. (2018) Developing a Tablet Computer-Based Application (‘App’) to Measure Self-Reported Alcohol Consumption in Indigenous Australians. BMC Medical Informatics and Decision Making, 18, 8.
https://doi.org/10.1186/s12911-018-0583-0

3. Cynthia, G.P., Judith, A.S., John, C.K. and Tim, D. (2002) Practical Considerations in Computer-Based Testing. Springer-Verlag, New Jersey, 1-2.

4. Nugroho, R.A., Kusumawati, N.S. and Ambarwati, O.C. (2018) Students Perception on the Use of Computer Based Test. IOP Conference Series: Materials Science and Engineering, 306, Article ID: 012103.
https://doi.org/10.1088/1757-899X/306/1/012103

5. Darrell, L.B. (2003) The Impact of Computer-Based Testing on Student Attitudes and Behaviour. The Technology Source Archives, University of North Carolina, USA.
http://technologysource.org/article/impact_of_computerbased_testing_on_student_attitudes_and_behavior/

6. Bennett, R.E. (2002) Inexorable and Inevitable: The Continuing Story of Technology and Assessment. Journal of Technology, Learning, and Assessment, 1, 1-2.
https://ejournals.bc.edu/ojs/index.php/jtla/article/download/1667/1513

7. Dembitzer, L., Zelikovitz, S. and Kettler, R.J. (2018) Designing Computer-Based Assessments: Multidisciplinary Findings and Student Perspectives. International Journal of Educational Technology, 4, 20-31.
https://educationaltechnology.net/ijet/index.php/ijet/article/view/47

8. Prastikawat, F.A. and Huda, A. (2018) The Effect of Computer-Based Image Series Media toward Description Writing Skills of a Student with Intellectual Disability in the Grade VII SMPLB. Journal of ICSAR, 2, 52-56.

9. Lariana, R. and Wallace, P. (2002) Paper-Based versus Computer-Based Assessment: Key Factors Associated with the Test Mode Effect. British Journal of Educational Technology, 33, 593-602.
https://doi.org/10.1111/1467-8535.00294

10. Ridgeman, B., Lennon, M.L. and Jackenthal, A. (2001) Effects of Screen Size, Screen Resolution, and Display Rate on Computer-Based Test Performance. ETS Research Report Series, 2001, 1-23.

11. Butcher, J.N., Perry, J.N. and Atlis, M.M. (2000) Validity and Utility of Computer-Based Test Interpretation. Psychological Assessment, 12, 6-18.
https://doi.org/10.1037/1040-3590.12.1.6

12. Wu, H. and Molnár, G. (2018) Computer-Based Assessment of Chinese Students’ Component Skills of Problem Solving: A Pilot Study. Literacy, 1, 5.

NOTES

*CBTA Electronic Examinations Platform.