Enhancing University Student Engagement Using Online Multiple Choice Questions and Answers

Abstract

For many education providers, student engagement can be a major issue. Given the positive correlation between engagement and good performance, providers are continually looking for ways to engage students in the learning process. The growth of student digital literacy, the wide proliferation of online tools and the understanding of why online gaming can be addictive have combined to create a set of tools that providers can leverage to enhance engagement. One such tool is Peerwise, https://peerwise.cs.auckland.ac.nz/, an online, multiple choice question (MCQ) and answer tool in which students create questions that are answered by other students. Why use MCQs? Using MCQs tests knowledge, provides reassurance of learning, identifies gaps and makes this data available to student and provider. Students use this information to focus their time on areas requiring additional work [1], benefiting from the early feedback provided. Formative assessments using MCQs are beneficial in preparing students for summative testing and are appreciated and liked by students [2]. Providers can use this information to determine how the material is being received and react accordingly. Students use Peerwise to create MCQs that are answered, rated and commented on by their peers. Students' engagement in Peerwise earns trophies for contributing questions, for regular use and for providing feedback, all of which act to stimulate further engagement using the principles of gamification. Bournemouth University, a public university in the UK with over 18,000 students, has been embedding Peerwise in under-graduate and post-graduate units since 2014. The results experienced by Bournemouth University have been beneficial and are consistent with other studies of Peerwise [3] [4]. A statistically significant improvement was seen in one cohort of students compared to the previous year, when Peerwise was not used. However, no correlation was found between Peerwise participation and a student's unit mark. The processes followed by Bournemouth University and the advantages and disadvantages, backed by qualitative and quantitative data, will be presented so that other institutions can gain an informed view of the merits of Peerwise for their own teaching and learning environments.

Share and Cite:

Biggins, D., Crowley, E., Bolat, E., Dupac, M. and Dogan, H. (2015) Enhancing University Student Engagement Using Online Multiple Choice Questions and Answers. Open Journal of Social Sciences, 3, 71-76. doi: 10.4236/jss.2015.39011.

1. Introduction

How to encourage students to be more engaged in learning is a major issue for many tutors and is widely discussed in education literature. Strategic and surface approaches to learning [5] and the “plug and chug” method [6] are some of the ways in which students display a lack of engagement by focusing on what is needed to pass the unit and, in consequence, experience poor learning outcomes [7].

The influence of assessment on students' focus and attention is well documented by authors such as Biggs [8]. By choosing appropriate assessment strategies, students can be encouraged to take more interest in and develop a strong understanding of their subject. In addition to assessment, good teaching requires active student participation that engenders independence and control over the learning process [9].

The appropriate use of a blended learning environment, which integrates face-to-face teaching and online learning [10], is likely to change the approach of students to the unit and improve the learning experience [5]. This paper looks at one technology enhanced tool called Peerwise and recounts the experiences of using the application at Bournemouth University during the academic year 2014-2015.

2. What Is Peerwise?

Peerwise is an online repository of MCQs in which students create, share and answer questions.

A question can consist solely of text but can also include images or links to web resources such as videos. Students indicate the correct answer to their question and up to four wrong or distracting answers. Question creators are encouraged to include an explanation of why the answer is correct and why the distractors are not correct. Students are asked to rate questions for difficulty and quality. Questions can be tagged to group them into themes and categories. Students earn trophies for their participation and this acts as a motivator. Leaderboards of those creating the best questions and those with the most correct answers also act to engage participants and introduce an element of gamification.
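To make the elements just described concrete, the following minimal sketch models a question as a simple record. The field names are illustrative assumptions for exposition only, not Peerwise's actual data model:

```python
# A minimal sketch of the elements of a Peerwise question described above.
# Field names are illustrative assumptions, not Peerwise's actual schema.
from dataclasses import dataclass, field

@dataclass
class MCQ:
    stem: str                           # the question text
    correct_answer: str
    distractors: list[str]              # up to four wrong or distracting answers
    explanation: str = ""               # why the answer is correct and the distractors are not
    media_links: list[str] = field(default_factory=list)  # images, videos, web resources
    tags: list[str] = field(default_factory=list)         # themes and categories
    difficulty_ratings: list[int] = field(default_factory=list)  # peer ratings
    quality_ratings: list[int] = field(default_factory=list)     # peer ratings

# Example question a student might contribute.
q = MCQ(
    stem="Which taxonomy underpins constructive alignment in Biggs (2003)?",
    correct_answer="SOLO",
    distractors=["Bloom's", "Honey and Mumford's", "Maslow's"],
    tags=["assessment", "pedagogy"],
)
```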

Peerwise offers a self-directed learning pedagogy based on a tool that can be accessed at times that suit students and which provides immediate feedback to students, supported by explanations and links to further learning resources. This flexible learning aid is augmenting the more traditional tutor-student relationship which can be seen as demotivating by students [11]. Instead, a more personalised format enables students to develop their paths to knowledge in a very individual way [12].

Gibbs and Simpson [13] highlight the importance of timely feedback to students. Once each question has been answered, a student finds out if they are correct and how other students answered the question. This information allows a student to form a view on their own knowledge and also that of their peers. The recognition is important in indicating areas for future improvement and development, reinforcing existing learning and raising attainment [14].

3. Benefits for Students and Staff

Various studies have identified benefits of using Peerwise to both staff and students [3] [4] [9] [15] [16]. These benefits are summarised below:

For staff:

§ Peerwise involves students more in teaching and learning. Peerwise supports student-led learning based on socio-constructivism and co-creation of material where students can either work individually or in groups to create questions.

§ Can provide the tutor with both formative and summative assessment feedback.

§ Potentially reduced contact time. One of the purposes of test questions is to foster deep learning without incurring additional tutor time [15].

§ Quickly highlight topics or concepts that are troubling students.

§ Create reusable learning objects (RLOs) as the questions can be exported and imported.

§ Can be used for crowd-sourced exam questions.

§ While a small number of unit marks are usually allocated for the use of Peerwise to ensure participation, studies have found that students often exceed the minimum requirement [3].

§ Tutors control access to the repository and the user's role (supervisor or student).

§ No cost implications.

§ In many implementations of Peerwise, the responsibility for creating questions rests with students with little tutor involvement after the initial introduction. This means that Peerwise can be highly efficient.

For students:

§ Peerwise promotes a self-directed, independent approach to learning where the student takes the initiative to formulate and achieve goals they set for themselves, determine the quality of their own work and the work of others and successfully filter information to satisfy their needs [4].

§ Creating and answering questions develops creativity, independence, knowledge and understanding. Question creation forces students to reflect on the unit’s learning outcomes.

§ Access to a repository of pertinent questions, organised by category, difficulty and quality.

§ Empirical testing of students’ knowledge [3]. Receive immediate feedback on questions answered. In viewing the responses of their peers, students can self-assess their knowledge level.

§ Utilise current technology to access Peerwise (smart phone, tablet or computer). Use Peerwise wherever they are and whenever suits them best (24 hour access).

§ Work individually or in groups to create or answer questions.

§ Use Peerwise confidentially because their real identity is not displayed. This creates a climate where students can be comfortable and confident in using Peerwise.

§ Students with different learning styles, for example activists, reflectors, theorists and pragmatists [17], will each be able to benefit from the use of Peerwise. Activists are likely to find Peerwise challenging, reflectors have time to consider questions, theorists will enjoy the structure and methodology of Peerwise and pragmatists will perceive how using Peerwise will enable the achievement of their goals.

§ Encourages participation through gamification. There are 25 trophies available in Peerwise.

The evidence presented has been supportive of Peerwise. However, there are potential issues:

§ While studies have demonstrated the benefits of peer instruction, improving learning twofold when compared to the lecture format [6], some researchers have identified issues of trust. While students have implicit trust in tutors, they are less trusting of the knowledge of their peers. However, students' trust in tutors can lead to acceptance of what is said without any critical assessment, behaviour which acts to suppress deep learning [15].

§ Peerwise is a standalone application and is not yet integrated with other applications. This means that students have to follow links and log into Peerwise separately from their virtual learning environment (VLE). When new questions are raised, there is no mechanism to alert students.

§ Concerns can be raised about the quality of questions created in Peerwise. With no tutor to oversee the questions, will students create simple, poor or incorrect questions?

4. How Bournemouth University Embedded Peerwise

Bournemouth University embedded Peerwise in one under-graduate and two post-graduate units in 2014-15. The embedding process was the same on each occasion. At the start of each unit, a scaffolding session was held. It was felt important to set and communicate expectations to students [10] because, while the application is intuitive to use, there are benefits from comprehensively scaffolding the tool for students. The scaffolding session covered 8 areas:

1) Rationale. The reasons for using Peerwise. The high level of constructive alignment between the unit content, assessment and Peerwise was reinforced in the minds of students.

2) Functionality. How to use Peerwise. Reassurance that the identity of question creators could not be seen by peers.

3) Examples. Examples of good and poor quality questions. Quality is the extent to which a question is an effective and efficient means to acquire the knowledge required for the unit [16]. In terms of the SOLO taxonomy [8], questions should tend towards the higher levels that are relational and require students to integrate, analyse and apply their knowledge. Questions at the other end of the taxonomy, which are unistructural and test memory and recognition, will also be required but these will be less prevalent in the repository. The examples were also framed in terms of Bloom's taxonomy [18], with emphasis given to creating questions that test the higher levels of the taxonomy. It makes intuitive sense to provide examples; however, they may be unnecessary. Purchase et al. [19] report how a repository of adequate quality was created without any instruction on what constitutes a quality question.

4) Creativity. A prompt to use text, images and links to provide an engaging question.

5) Usage expectations. The practice and live use of Peerwise during the semester.

6) Assessment. How the use of Peerwise would be assessed. To encourage students to use Peerwise, the unit assessment marks were changed to allocate 10% of the unit mark to its use. The marks were banded based on participation, which allowed the tutor to allocate a mark without an undue assessment overhead. Different marking scenarios were trialled but they all had in common that they accounted for 10% of the unit mark. In one unit, for example, 4 marks were given for creating 5 questions and correctly answering 10 questions during the semester, and 10 marks for creating 30 questions and correctly answering 60 questions (a sketch of one such banded scheme follows this list).

7) Issue management. The process by which any issues could be highlighted and managed.

8) Feedback mechanism. How the students can feedback their views on Peerwise.
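As an illustration of the banded marking described in item 6, here is a minimal sketch. The two bands are taken from the example given there; any intermediate bands are omitted because the paper does not specify them:

```python
# A minimal sketch of a banded participation mark, assuming hypothetical
# thresholds based on the example in item 6; actual bands varied by unit.

# Each band: (min questions created, min questions answered correctly, marks).
# Bands are listed from highest to lowest so the first match wins.
BANDS = [
    (30, 60, 10),  # top band from the example in item 6
    (5, 10, 4),    # lowest band from the example in item 6
]

def participation_mark(created: int, answered_correctly: int) -> int:
    """Return the marks for the highest band whose thresholds the student meets."""
    for min_created, min_answered, marks in BANDS:
        if created >= min_created and answered_correctly >= min_answered:
            return marks
    return 0  # below the lowest band earns no participation marks

print(participation_mark(created=30, answered_correctly=75))  # -> 10
print(participation_mark(created=6, answered_correctly=12))   # -> 4
```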

5. Experience in the 2014-2015 Teaching Year

After the scaffolding session, students were given a week to trial Peerwise before the test questions were removed and live use commenced. Table 1 shows the metadata for the three units.

The different numbers of questions created per student by unit (17.1, 27.7 and 38.1 respectively) reflected the different requirements for gaining the 10 marks for participation. The requirement was increased in each subsequent unit. The different student levels, under-graduate and masters, were factored in. As these are 20-credit units, the amount of time that could be allocated to the use of Peerwise in the semester was also taken into consideration.

The profiles of questions created and answered across the semester showed that, in some of the units, the initial scaffolding had been beneficial in encouraging students to create questions early in the semester so that there were a variety and range of questions to be answered. Figure 1 shows the profile for questions created in the under-graduate unit.

Figure 1 also shows that many questions were created at the end of the semester as the unit deadline approached. It is unlikely that many of these questions were answered by other students, thus reducing their usefulness. In the third unit, the profile of questions created was much less balanced. Figure 2 shows the post-graduate project management unit.

The unbalanced profile in Figure 2 is likely to have reduced the benefit of using Peerwise for this cohort of students and suggests that the scaffolding was inadequate in this, the third use of Peerwise by the same tutor.

The feedback from students on the use of Peerwise was positive. For most students, it was the first time they had used Peerwise and they reported it was easy to use, beneficial to their learning and that Peerwise had increased their engagement with the unit. While the majority of students fulfilled the assignment brief to gain the 10 marks possible, some students went much further. In the under-graduate unit, four students answered over 100 questions, one student more than 200 and a single student answered 385 questions during the semester.

The potential concern about poor quality questions was only marginally borne out. If students created poor questions, they were rated as such by fellow students and these questions were then bypassed by students looking for better quality questions. This supports the view that students are effective judges of question quality and that there is a willingness to accept the judgements of other students on what is a quality question [16]. Where question creators indicated the wrong answer to a question, feedback from other students encouraged them to revise and correct the question, and on one occasion led to a very lively seminar debate.

Many students reported that making questions for their peers elevated their role in the unit and they enjoyed those feelings. There were no concerns raised about learning from other students.

Table 1. Peerwise trials.

Figure 1. Profile of questions created by day in the under-graduate unit.

Figure 2. Profile of questions created by day in the post-graduate project management unit.

In looking for evidence of the benefit of using Peerwise, correlation tests were undertaken to look for a link between a student's overall grade mark for the unit and the number of questions answered. No statistically significant correlations were found (Pearson product-moment correlation coefficients: unit 1, r = +0.120, n = 50, p = 0.408; unit 2, r = −0.233, n = 11, p = 0.490; unit 3, r = +0.475, n = 12, p = 0.118). In addition to small sample sizes, a possible explanation for this comes from the fact that the three units are all assessed by coursework. If the assessment had involved an examination, it is possible that the use of Peerwise would have correlated more positively with the students' marks. In unit 1, data from a self-assessment undertaken at the start of the unit and repeated at the end created a measure of student improvement. When the data from the cohort that did not use Peerwise were compared with the Peerwise cohort, a statistically significant improvement was seen in the Peerwise cohort (t = −2.385, df = 65, one-tailed p = 0.01).
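For readers wishing to replicate this analysis, the following minimal sketch shows the two tests used above, a Pearson product-moment correlation and an independent-samples t-test, applied to entirely hypothetical data (the study's raw data are not reproduced here):

```python
# A minimal sketch of the statistical tests reported above, using
# hypothetical data; scipy.stats provides both tests.
from scipy import stats

# Hypothetical per-student data for one unit (not the study's real data).
questions_answered = [12, 60, 35, 8, 90, 45, 22, 70, 15, 55]
unit_marks         = [58, 72, 65, 52, 74, 68, 60, 71, 55, 66]

# Pearson product-moment correlation between participation and unit mark.
r, p = stats.pearsonr(questions_answered, unit_marks)
print(f"r = {r:+.3f}, n = {len(unit_marks)}, p = {p:.3f}")

# One-tailed independent-samples t-test comparing self-assessed improvement
# in a cohort without Peerwise against a cohort with Peerwise
# (alternative="less" tests whether the first group's mean is lower).
improvement_no_peerwise   = [4, 6, 3, 5, 7, 4, 5]  # hypothetical scores
improvement_with_peerwise = [7, 8, 6, 9, 8, 7, 9]  # hypothetical scores
t, p_one_tailed = stats.ttest_ind(
    improvement_no_peerwise, improvement_with_peerwise, alternative="less"
)
print(f"t = {t:.3f}, one-tailed p = {p_one_tailed:.3f}")
```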

6. Conclusion

The last academic year has seen the successful but limited use of Peerwise at Bournemouth University. The Peerwise team at Bournemouth University is working to develop Peerwise through six initiatives:

1) Promote to staff. Raise the profile of Peerwise to other tutors at Bournemouth University so that they may understand and assess the suitability of Peerwise for their own units.

2) Improve the embedding of Peerwise. For the next academic year, use the lessons learnt to improve the scaffolding and use of Peerwise at Bournemouth University.

3) Generic resources. A staff-student co-creation project is currently working to create 500 questions in each of the two university-wide repositories; academic writing and research methods. Please contact peerwise@bournemouth.ac.uk for more information and access to these repositories.

4) Benefit measurement. Identify ways in which the benefit of Peerwise can be measured.

5) Integrate with the VLE to simplify access using the Learning Tools Interoperability (LTI) facility.

6) Incorporate in a MOOC. In developing its first Massive Open Online Course (MOOC), Bournemouth University will be incorporating Peerwise for assessment and revision.

Conflicts of Interest

The authors declare no conflicts of interest.

References

[1] Fielding, M. (2001) Students as Radical Agents of Change. Journal of Educational Change, 2, 123-141. http://dx.doi.org/10.1023/A:1017949213447
[2] Foos, P.W. (1989) Effects of Student-Written Questions on Student Test Performance. Teaching of Psychology, 16, 77-78. http://dx.doi.org/10.1207/s15328023top1602_10
[3] Denny, P. (2010) Motivating Online Collaborative Learning. ITiCSE’10: Proceedings of the 15th Annual Conference on Innovation and Technology in Computer Science Education, Ankara, June 2010, 26-30. http://dx.doi.org/10.1145/1822090.1822176
[4] Luxton-Reilly, A., Denny, P., Plimmer, B. and Sheehan, R. (2012) Activities, Affordances and Attitude: How Student- Generated Questions Assist Learning. Proceedings of the 17th ACM Annual Conference on Innovation and Technology in Computer Science Education, 3-5 July 2012. http://dx.doi.org/10.1145/2325296.2325302
[5] Bloxham, S. (2007) The Busy Teacher Educator's Guide to Assessment. http://dera.ioe.ac.uk/13028/
[6] Simon, B. and Cutts, Q. (2012) Peer Instruction: A Teaching Method to Foster Deep Understanding. Communications of the ACM, 55, 27-29. http://dx.doi.org/10.1145/2076450.2076459
[7] Entwistle, N. (2000) Promoting Deep Learning through Teaching and Assessment: Conceptual Frameworks and Educational Contexts. 1st Annual Conference ESRC Teaching and Learning Research Programme (TLRP), University of Leicester, November 2000. http://www.tlrp.org/acadpub/Entwistle2000.pdf
[8] Biggs, J. (2003) Aligning Teaching and Assessing to Course Objectives. Teaching and Learning in Higher Education: New Trends and Innovations, University of Aveiro, 13-17 April 2003.
[9] Denny, P., Luxton-Reilly, A. and Hamer, J. (2008) Student Use of the PeerWise System. ITiCSE '08: Proceedings of the 13th Annual Conference on Innovation and Technology in Computer Science Education, Madrid, 30 June-2 July 2008, 73-77. http://dx.doi.org/10.1145/1384271.1384293
[10] Hinton, D. and Cooner, T.S. (2014) Blended Learning Design Planner v1.2 Resource Pack. Design for Inquiry-Based Blended Learning (DIBL), University of Birmingham, Birmingham. http://www.birmingham.ac.uk/Documents/college-social-sciences/social-policy/CEIMH/DiBLPlannerResourcePack.pdf
[11] Hanrahan, M.U. (1998) The Effect of Learning Environment Factors on Students' Motivation and Learning. International Journal of Science Education, 20, 737-753. http://eprints.qut.edu.au/1352/#
[12] Biggs, J. and Moore, P. (1993) The Process of Learning. Prentice Hall, New York.
[13] Gibbs, G. and Simpson, C. (2004) Conditions under Which Assessment Supports Students’ Learning. Learning and Teaching in Higher Education (LATHE), 1, 3-31. http://insight.glos.ac.uk/tli/resources/lathe/Documents/issue%201/articles/simpson.pdf
[14] Hounsell, D. (2007) Towards More Sustainable Feedback to Students. In: Falchikov, N. and Boud, D., Eds., Rethinking Assessment in Higher Education, Routledge, London, 101-113.
[15] Draper, S.W. (2009) Catalytic Assessment: Understanding How MCQs and EVS Can Foster Deep Learning. British Journal of Educational Technology, 40. http://dx.doi.org/10.1111/j.1467-8535.2008.00920.x
[16] Denny, P., Luxton-Reilly, A. and Simon, B. (2009) Quality of Student Contributed Questions Using Peerwise. ACE '09: Proceedings of the 11th Australasian Conference on Computing Education, Wellington, 55-63. http://dl.acm.org/citation.cfm?id=1862724
[17] Swailes, S. and Senior, B. (1999) The Dimensionality of Honey and Mumford's Learning Style Questionnaire. International Journal of Selection and Assessment. http://dx.doi.org/10.1111/1468-2389.00099
[18] Anderson, L.W. and Krathwohl, D.R. (2001) A Taxonomy for Learning, Teaching, and Assessing. Longman, New York.
[19] Purchase, H., Hamer, J., Denny, P. and Luxton-Reilly, A. (2010) The Quality of a PeerWise MCQ Repository. ACE '10: Proceedings of the 12th Australasian Conference on Computing Education, Brisbane, 137-146. http://dl.acm.org/citation.cfm?id=1862219.1862238
