Comparison of Student Engagement in a Large First-Year Statistics Course between Flipped and Face-to-Face Instruction
1. Introduction
The unprecedented global COVID-19 pandemic has introduced additional complexity to student engagement. Before the pandemic outbreak, most universities implemented a blended learning approach to accommodate large and diverse student cohorts through an effective mix of synchronous (such as face-to-face or streaming online lectures) and asynchronous (such as recorded videos and reading materials) teaching. However, at the beginning and the peak of the outbreak, strict lockdown restrictions required all learning activities to be delivered fully online, with earlier face-to-face learning moving to online learning using Zoom (Zoom, 2023) or similar platforms. This abrupt transition to online learning and teaching created setbacks for student engagement, as it hindered participation, particularly in collaborative tasks such as group-based activities (Davidson, 2019; Graham & Misanchuk, 2004; Jaques & Salmon, 2007). Meanwhile, academics started to rethink and redesign their instructional approaches as they realised that more was required than simply moving the pedagogy from one medium to another to ensure quality teaching (Findlay-Thompson & Mombourquette, 2014; Henriksen et al., 2020). As a result, many experimented with and adopted a flipped classroom model. At its core, this instructional paradigm attempts to engage students by moving the transmission of knowledge outside the lecture hall via technology and reading materials (DeVaney, 2009; Dunham et al., 2018) while bringing interactive learning activities, such as discussions of real-world examples, into the class (Bergmann & Sams, 2012; Brunsell & Horejsi, 2013; Tomas et al., 2019; Young, 2011).
Flipping the traditional classroom is not new (Velegol et al., 2015), and advocates of this approach believe it has the potential to resonate well with the characteristics of today's learners: firstly, it utilises 21st-century technological tools that are familiar to students; secondly, it offers students involved in extracurricular activities, and those with jobs and caring responsibilities, the flexibility to study around their schedules (Clark, 2015). Other proponents claim the theoretical foundations of the flipped model stem from the comprehensive literature on student-centred learning (Bishop & Verleger, 2013), maximising proactive engagement and class participation, which ultimately increases learner autonomy and enjoyment (Graham & Misanchuk, 2004).
Many instructors in Science, Technology, Engineering, and Mathematics (STEM) subjects have trialled flipped classrooms and reported improved student retention (Gilboy et al., 2015; Hope, 2016), content understanding (Dhawan, 2020; McLean et al., 2016; Nielsen et al., 2018; Wilson, 2013; Winquist & Carlson, 2014), student engagement (Burke & Fedorek, 2017; Clark, 2015; Khan & Watson, 2018), and student-teacher interactions (Graham & Misanchuk, 2004). While various disciplines have adopted flipped classrooms (Baker, 2016; Johnson & Renner, 2012; Johnson et al., 2020; Strayer, 2007), reports on the flipped classroom model in the field of statistics remain limited. There is only one paper in the journal Teaching Statistics that uses the term "flipped classroom", in which the authors (Turner & Dabney, 2015) suggested that some simulation activities could be used during a flipped class for discussion, or beforehand as part of students' preparation. However, they did not present or report any implications or results of implementing their suggestions in a flipped classroom. There is evidence of flipped classroom implementations in statistics classrooms (Khan & Watson, 2018; Wilson, 2013), although the literature is scarce. Besides the need for more rigorous research on the flipped classroom in statistics education, there is growing evidence of resistance from teachers and students to the flipped classroom (Wilson, 2020). Some attribute this resistance to the isolated opinions of their study participants (Comber & Brady-Van den Bos, 2018; Davenport, 2018), while many claim it is a pedagogy suited to only some learners (Brooks et al., 2020; Long et al., 2017; Sammel et al., 2018). A study in the higher-education context found that resistance to the flipped model primarily lies in the perceived or actual increase in workload and in equity concerns (Wilson, 2020).
With such varying quality and mixed results, the flipped classroom should be further studied to provide concrete evidence as to whether it is a pedagogy worth pursuing to enhance student engagement and performance in statistics classrooms.
The flipped classroom model may be beneficial for introductory statistics courses offered to students with a wide range of majors, as non-math majors with weak mathematical skills and backgrounds often express "statistical anxiety". As a result, many students are unmotivated, anxious, and apprehensive, which undermines their class engagement (Onwuegbuzie & Wilson, 2003). However, the self-paced off-class activities in the flipped model, such as watching recorded lectures and reading textbooks, give students extra time to become familiar with the topics, while in-class interactive learning activities on authentic problems can help students make real-world connections. Positive perceptions of the relevance and usefulness of statistics in introductory statistics courses have been linked to a greater likelihood of effective use of statistics in students' future studies and jobs (Dove, 2013; Songsore & White, 2018; Tucker, 2012). Statistics instructors worldwide have already added the flipped model to their pedagogical toolbox. During the 11th International Conference on Teaching Statistics (ICOTS 11) in Rosario, the first author ran an online survey with one question: "Did you use the flipped classroom for statistics teaching?" Of 27 responses, nearly half answered "Yes" (n = 13) and the other half answered "No" (n = 14). It is therefore important to understand the implications of using a flipped classroom in statistics classes.
In this paper, we shed light on the benefits and challenges of the flipped classroom model, with carefully crafted learning activities designed to engage first-year statistics students in collaborative learning in an online learning environment. The question we aim to answer is: "Does the flipped classroom with online learning work as well as, or better than, traditional face-to-face classes for a large first-year statistics unit?" Most flipped classroom implementations do not include a control group for comparison. In contrast, this paper controls for factors including the lecturer and the learning resources (which mostly remained the same). Therefore, future research and design questions can be posed based on the evidence presented in this paper.
2. Background and Data Sources
In 2023, 65 per cent of the students joined Macquarie, a public research university located in Sydney, Australia, directly from high school or after a gap year (Macquarie University, 2024b). Most first-year students arrive in class expecting that learning will be the same as in high school. While many students transition smoothly, thrive, and succeed, about 15 per cent drop out (Price, 2019). It is well known that Tall Poppy Syndrome is widespread in Australia (Feather, 1989), which negatively impacts students' interactions with their peers and lecturers. The Australasian Survey of Student Engagement (Australian Council for Educational Research, 2009), the largest cross-institutional survey of first-year students in Australia, found that compared to their American counterparts, students in Australia were less likely to raise questions in class or participate in active learning activities. This may be because Australian students want to avoid being identified as a Tall Poppy. On the other hand, participation in higher education has increased significantly, fostered by successive government policies to encourage access. Alongside the increased student enrolment, the growing diversity of commencing students in terms of their race, ethnicity, family income, and parental education is expected to have a bearing on their engagement (Yorke, 2006; Yorke & Bernard, 2008). The university's infrastructure, curriculum, and academic and professional staff are equally crucial to inclusiveness (Lage et al., 2000) and student engagement. Consequently, academics in Australia have a growing interest in understanding the factors associated with first-year students' engagement, retention, and success. Astin (1999) defines student engagement as "the amount of physical and psychological energy the student devotes to the academic experience", which implies that student engagement should be evaluated in a broad socio-cultural context (Yorke, 2006).
A first-year business statistics unit with enrolments of more than 1000 students per semester at a major metropolitan university in Sydney, Australia, was chosen for this study. Data for students in 2019 (before COVID) and 2021 (during COVID) were extracted from the learning management system (the LMS, Moodle). Most of the students who undertake this unit study towards a Bachelor of Commerce degree, in which the unit is an essential component. The topics of the unit are similar to those of many first-year statistics units around the world, and include descriptive statistics, graphical displays, the two-sample t-test, the paired t-test, simple linear regression and chi-square tests. Each week, the unit has a two-hour lecture; a one-hour tutorial, where computer use is limited to what students bring to class; and a one-hour computer-lab-based practical, where each student has access to a university computer and software for analysing data sets. During tutorials, students work with their peers and tutors to solve the presented problems, while during practical classes students mostly use Excel to process and analyse data, drawing conclusions from the results of the analysis with the help of their tutor. Attendance (participation) at lectures is not compulsory, but participation in tutorials and practicals is compulsory and a hurdle. A hurdle is defined in our university policy as "an activity for which a minimum level of performance or participation is a condition of passing the unit in which it occurs" (Macquarie University, 2024a). For the participation hurdles in our unit, if students fail the participation requirement (participating in fewer than 10 of the 12 practicals and/or tutorials), their highest possible unit mark, even with full marks for every assessment task, is 49, with a grade of Fail Hurdle (FH) (Macquarie University, 2024a). Therefore, we expect more engagement in tutorials and practicals than in lectures.
The 2019 semester represented a usual offering of the unit. All learning activities were scheduled in person, except that the on-campus lectures were live-streamed and recorded for later viewing (Table 1). The assessments comprised: 1) five formative online non-invigilated quizzes with ten attempts (10%); 2) two computer-based class tests consisting of multiple-choice and numerical questions, either based on given scenarios or on downloading and analysing data sets. Tutors invigilated both tests during the practical classes, with the first test taking place in mid-semester (15%) and the second close to the end of the semester (25%), each with a time limit; and 3) a pen-and-paper final exam (50%), which mainly included explanation questions and calculations (using a calculator) based on given computer outputs or scenarios, scheduled at the end of the semester and invigilated by the exam officers. Most questions in the online quizzes and class tests were created using the R Exams package (Zeileis et al., 2014), which randomised the numbers and scenarios to generate different versions of the same questions, increasing engagement and learning opportunities for students (Talley & Scherer, 2013). Randomisation of the Moodle quizzes also safeguarded academic integrity by reducing the possibility of students sharing questions and answers, since students took the class tests during the test weeks (Weeks 7 and 11) spread across 40 practical classes.
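The R Exams package is R software and its actual templates are not shown here; purely as an illustrative sketch of the underlying idea (the same question rendered with randomised numbers and scenarios, each variant carrying its own correct answer), a minimal hypothetical Python version might look like this:

```python
import random

def make_quiz_variant(seed):
    """Generate one randomised variant of a simple numerical question.

    Illustrative sketch only: the unit used the R Exams package, not this
    code. The function name and scenarios are hypothetical.
    """
    rng = random.Random(seed)
    scenario = rng.choice(["daily cafe sales", "commute times", "quiz scores"])
    values = [round(rng.uniform(10, 100), 1) for _ in range(5)]
    answer = round(sum(values) / len(values), 2)  # correct sample mean
    question = (f"The following {scenario} were recorded: {values}. "
                f"Compute the sample mean (2 dp).")
    return {"question": question, "answer": answer}

# Each student (seed) receives a different but statistically equivalent
# version of the question, which hampers answer sharing.
variant_a = make_quiz_variant(seed=1)
variant_b = make_quiz_variant(seed=2)
```

Because each variant is generated deterministically from its seed, the marking system can regenerate the correct answer for any version it issued.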
Table 1. Comparison of learning activities in two cohorts.
| Activity | 2019 | 2021 |
| --- | --- | --- |
| Lectures | On-campus face-to-face; recordings of lectures available online after the lectures; lecture quiz to record participation | Pre-recorded lectures (available online), which students were expected to watch before the synchronous lectures; synchronous lectures on Zoom; recordings of synchronous lectures available online after the lectures; Zoom report for attendance |
| Tutorials | On-campus face-to-face; in-class participation quizzes used to record participation; no recordings available | Scheduled Zoom classes; in-class participation quizzes used to record participation; recordings available each week; off-class participation quizzes could be completed for a missed class |
| Practicals | On-campus face-to-face; in-class participation quizzes used to record participation; no recordings available | Scheduled Zoom classes; in-class participation quizzes used to record participation; recordings available each week; off-class participation quizzes could be completed for a missed class |
| In-semester assessments | Five formative online non-invigilated quizzes with ten attempts (10%); two computer-based class tests consisting of multiple-choice and numerical questions, either based on given scenarios or on downloading and analysing data sets (40%), invigilated during the face-to-face practical classes | Five formative online non-invigilated quizzes with ten attempts (10%); two computer-based class tests consisting of multiple-choice and numerical questions, either based on given scenarios or on downloading and analysing data sets (40%), invigilated online during Zoom classes |
| Final exam | In person; pen and paper; invigilated; each student answered the same questions; 50% | Off campus; online using Moodle; non-invigilated; different versions of questions given to students; 50% |
In 2019, students’ participation in synchronous lectures, attending either in-person or via the online live lecture, was recorded with a one-question quiz in Moodle asking, “Are you attending the lecture today?” The lecturers randomly provided weekly passwords during the two-hour lecture to ensure only the students present in class could attempt the quiz. Students’ participation in tutorials and practicals was also recorded using Moodle quizzes, which students could only access during their class times with passwords given out by their tutors. We allowed students to submit their worked solutions to the tutorial and practical class problems to count towards their missed classes and to reduce failures due to failing the participation hurdle. Student engagement with their learning in 2019 is used as a benchmark group, since it represented how the introductory business statistics unit at Macquarie University was conducted under normal circumstances.
At the beginning of the COVID-19 outbreak in 2020, we trialled a fully online flipped classroom, which gave us some ideas about what would be possible and suitable for online learning. During the second wave of the pandemic in Sydney (mid-2021), strict lockdowns again meant all learning activities were shifted online and conducted on Zoom. Therefore, we used the 2021 cohort of students as the comparison group. In the online flipped classroom, we spent significantly less time lecturing, moving away from the "sage on the stage" towards the "guide on the side". The lectures generally started with a summary of the weekly topic, followed by discussions of quizzes, homework, and extension questions, where students were encouraged to take the lead. Some extension questions were created from questions posted by students in the discussion forums within the LMS (authentic problems created by students). In summary, the 2021 online interactive lectures were not traditional lectures; they were partly student driven and partly teacher driven, though the teacher-driven part was itself based on student questions posted in the LMS (indirectly student driven). To ensure students were exposed to the necessary foundational knowledge to become active participants in class, we released pre-recorded lectures, targeting the factual and conceptual knowledge of the weekly topic and including examples of how the statistical questions could be answered, a week before the live lecture.
The content of the tutorials and practicals remained the same as in 2019; however, we provided recordings of the tutorials and practicals, released in Moodle at the end of each week. Students could use these recordings as learning resources for revision or for catching up on a missed class. In addition, we designed off-class participation quizzes in Moodle, allowing students to make up for missed classes. Completing these quizzes required students to watch the pre-recorded tutorial or practical videos and work through the class problems, ensuring the learning outcomes were achieved. Students could complete the off-class participation quizzes at their own pace before the end of the semester, and any serious attempt was counted as participation. In other words, these off-class quizzes served as formative assessments for students' self-assessment and ensured their participation was counted.
The assessment items (five quizzes, two class tests and one final exam) remained the same as in the 2019 offering, except that all assessments were conducted online (off campus). Although the class tests changed from on-campus computer-based to online via students' personal computers, they were still invigilated by tutors, on Zoom rather than in class. While there was no variation in assessed content, given that the unit's learning outcomes remained consistent across both years, a notable difference lay in the delivery of the final exam. Due to the lockdown restrictions, the final exam in 2021 was conducted via a Moodle online quiz, non-invigilated and timed, but could be completed within a specified time window (e.g., a 2-hour exam could be completed within a 6-hour window). In contrast to the paper-based exam in 2019, where all students received the same questions with the same numerical values and scenarios, in 2021 we used the R Exams package (Zeileis et al., 2014) to design questions with randomised numbers and scenarios to reduce breaches of academic integrity, similar to the design of the class tests. To keep the student voice as part of their final assessment, we created open-ended questions which required students to put their thinking in words, explain their decisions and justify why they had chosen specific options from those given. Student engagement with their learning in 2021 is used as the experimental group, since it represented a changed learning environment.
This paper examines student engagement in lectures, tutorials and practicals. In 2019, in the traditional classroom, student engagement in lectures was differentiated between synchronous and asynchronous participation. Synchronous participation meant attending in person or via the live stream; alternatively, students could engage asynchronously by watching the lecture recordings after the live-streamed class. The definition differs slightly in the changed learning environment of the online flipped classroom. In 2021, synchronous participation required students to watch the pre-recorded lectures, to obtain the foundational content of the class, before attending the interactive live lecture; alternatively, students could engage asynchronously by watching the pre-recorded lectures followed by the recordings of the live lectures. Asynchronous engagement, involving the review of class recordings after streaming, was evaluated in both years using data extracted from the Echo360 Analytics platform (https://echo360.com/). We used the video-view metric as a measure of asynchronous engagement: for each student, the metric recorded the number of times the student clicked play for a given class, and we considered students engaged if their video views exceeded one. We acknowledge that this measure may inflate asynchronous engagement, as students may click play and then exit the video; however, it is the most comprehensive information available through the Echo360 platform. In 2019, synchronous lecture participation was measured by the completion of in-class quizzes. Since the live interactive lectures were conducted on Zoom in 2021, students' Zoom login data were used to measure their live lecture participation. Lastly, video views from Echo360 Analytics were used to measure asynchronous lecture participation in 2019, and asynchronous participation in both the pre-recorded lectures and the recorded live lectures in 2021.
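The video-view rule described above (a student counts as asynchronously engaged with a class when their view count exceeds one) can be sketched as follows; the column names and sample data are hypothetical illustrations, not the real Echo360 export format:

```python
import pandas as pd

# Hypothetical per-student, per-week view counts; in the study these came
# from the Echo360 Analytics export.
views = pd.DataFrame({
    "student_id":  ["s1", "s1", "s2", "s3"],
    "week":        [1,    2,    1,    1],
    "video_views": [3,    0,    1,    2],   # times "play" was clicked
})

# Engagement rule from the text: video views must exceed one.
views["engaged"] = views["video_views"] > 1

# Weekly asynchronous engagement rate (% of listed students engaged).
weekly_rate = (views.groupby("week")["engaged"].mean() * 100).round(1)
```

As the paper notes, this rule may overcount engagement (a student can click play twice and watch nothing), but it is a simple, reproducible threshold on the available data.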
Besides lectures, student engagement in other learning activities, such as tutorials and practicals, is also integral to learning outcomes. We differentiate in-class participation from off-class participation, where students complete specific tasks to make up for missed classes. In both years, in-class participation in tutorials and practicals was recorded by in-class quizzes with passwords given out by tutors. In 2019, off-class participation was recorded through students' submission of worked solutions. In 2021, we also "flipped" the tutorials and practicals, making recordings of these classes available to students at the end of each week, so that students could "attend" and complete the online off-class quizzes to make up their participation or to reinforce their learning.
3. Results
The LMS data does not include any demographic information (such as gender and age) about the students, so we are unable to report it. The total number of students was similar in the two semesters (control and experimental groups) but slightly lower in 2021, possibly due to the impact of COVID-19 on international student enrolments, especially those from China. To set the scene for the results, the grade distribution of the students in 2019 and 2021 is provided in Table 2. The passing grades (HD, D, Cr and P) were similar in both cohorts, though slightly lower in 2021. To make a fair comparison of student success across the two years, a scaled row was added to the table, rescaling the 2021 data to align with the total number of students in 2019. The total failure rate in 2021 (32.4%) was 16.4 percentage points higher than in 2019 (16%).
Table 2. Grade distribution for the 2019 and 2021 student cohorts. (* Scaled: the 2021 cohort size is adjusted to match the 2019 cohort, with each category proportionally adjusted.)
| Year | High Distinction (HD) | Distinction (D) | Credit (Cr) | Pass (P) | Fail (F) | Fail Absent (FA) | Fail Hurdle (FH) | Fail Withdrawn (FW) | Total |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 2019 | 12% | 19% | 20% | 33% | 10% | 5% | - | 1% | 1195 |
| 2021 | 11% | 14% | 18% | 29% | 26% | - | 2% | - | 1032 |
| 2021* | 12.7% | 16.2% | 20.8% | 33.6% | 30.1% | | 2.3% | | 1195 |
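The proportional rescaling behind the 2021* row can be sketched as follows. The counts below are reconstructed from the reported percentages and cohort sizes, so they are approximate rather than the authors' exact figures:

```python
# Rescale the 2021 grade counts to the 2019 cohort size (1195 students).
# Counts are reconstructed from Table 2's rounded percentages (approximate).
n_2019, n_2021 = 1195, 1032
pct_2021 = {"HD": 11, "D": 14, "Cr": 18, "P": 29, "F": 26, "FH": 2}

# Approximate raw 2021 counts per grade.
counts_2021 = {g: round(p / 100 * n_2021) for g, p in pct_2021.items()}

# Scale every grade's count by the same factor so the scaled cohort
# totals (approximately) 1195 students, as in the 2021* row.
scale = n_2019 / n_2021
scaled = {g: round(c * scale) for g, c in counts_2021.items()}
```

Small discrepancies between these reconstructed values and the published 2021* row are expected, since the published percentages are rounded.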
In our university, there are different kinds of failure grades, as seen in Table 2 (Macquarie University, 2024a). F is given when a student does not meet the defined standards at an appropriate level and does not provide evidence of attaining the learning outcomes of the unit of study, with a mark between 0 and 49. FA is also associated with a mark between 0 and 49, but is given when a student fails to complete an assessment task (which might have helped the student pass the unit); most commonly, FA is given when a student does not sit the final exam, so it can serve as a proxy for student disengagement with assessments. FH identifies students who fail a unit due to failing a hurdle assessment, differentiating normal failures from hurdle failures. It is given only to students whose raw mark is over 50 but who fail all available attempts of at least one hurdle assessment; their mark is then brought down to 49 regardless of the original mark. Finally, FW is awarded when a student withdraws from a unit after the Census Date (the last day to withdraw from a unit without academic penalty).
The grades were not independent of the year of study (χ²(4) = 49.8, p < 0.0001); note that we collapsed all fail grades to "Fail" for this analysis. In the following sections, we evaluate whether reduced student engagement in learning activities is associated with a higher proportion of failure. We had limited access to student-level engagement data; therefore, we use student-level data wherever possible and aggregated cohort-level data otherwise (more details are provided later in this section). Student engagement is analysed from two perspectives: the weekly trend (Section 3.1) and the distribution of student engagement for a given learning activity (Section 3.2).
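The reported test of independence can be approximately reproduced from Table 2. The sketch below reconstructs the counts from the rounded percentages (all fail grades collapsed to "Fail"), so the resulting statistic only roughly matches the reported χ²(4) = 49.8; in practice a library routine such as scipy's contingency-table test would also return the p-value:

```python
# Approximate counts reconstructed from Table 2's percentages.
#           HD    D    Cr    P   Fail
obs = [[143, 227, 239, 394, 191],   # 2019 (n ≈ 1195)
       [114, 144, 186, 299, 289]]   # 2021 (n ≈ 1032)

def chi2_statistic(table):
    """Pearson chi-square statistic for a two-way contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, o in enumerate(row):
            e = row_totals[i] * col_totals[j] / grand  # expected count
            stat += (o - e) ** 2 / e
    return stat

chi2 = chi2_statistic(obs)  # df = (2 - 1) * (5 - 1) = 4
```

With these reconstructed counts the statistic comes out close to 50, consistent with the reported value; the largest contributions come from the Fail column, reflecting the higher 2021 failure rate.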
3.1. The Weekly Trend of Student Engagement
We begin this section by examining the weekly student engagement in lectures, a critical learning activity, and compare it to in-class and off-class student engagement in tutorials and practicals (other learning activities). In addition, we evaluate student engagement in two settings, synchronous and asynchronous, as mixed findings have been reported regarding the impact of synchronous and asynchronous environments on students' learning experience. Some studies (Northey et al., 2015; Slootmaeckers et al., 2014) observed no significant difference in student achievement between students who attended synchronously and those who watched the recorded lectures (asynchronous attendance). In contrast, other researchers (Nieuwoudt, 2020; Strayer, 2007) found that students who actively participated in both synchronous and asynchronous learning activities demonstrated higher engagement and achieved better learning outcomes than those who only attended face-to-face classes.
In 2019, the weekly student engagement in synchronous and asynchronous lectures was represented by the percentage of student engagement over the 13-week semester in each setting. We calculated the percentage as the total number of students who participated synchronously or asynchronously out of the whole student cohort. The number of in-class quiz completions approximated synchronous attendance (in person or via the live stream), while the number of video views measured lecture engagement in the asynchronous setting.
In 2021, students could engage by attending live interactive lectures (online and synchronous), or asynchronously via the pre-recorded lectures and the recordings of live lectures (after class time). While the weekly student engagement in asynchronous contexts was measured in the same way as in 2019, the weekly student engagement in the live interactive lectures in 2021 was calculated using the Zoom login data. At that time, we did not require students to join the Zoom class using only their student emails, which created some difficulties when matching student performance via their emails; these problems are discussed further in the next section. Weekly student engagement in learning activities (lectures, tutorials and practicals) during a usual offering of the unit in 2019 (before COVID) and the flipped classroom offering in 2021 (during COVID) shows that lectures were poorly attended compared to in-class attendance in the other learning activities (Figure 1).
As expected, lecture participation peaked at the beginning of each semester. Interestingly, given the same weekly topics taught in 2019 and 2021, at the beginning of the semester (week 1 to week 3), a higher percentage of students in the flipped classroom (2021) tuned into live Zoom lectures compared to those who attended the on-campus classes in the traditional classroom model (2019).
Figure 1. Weekly student engagement in lectures in 2019 (left panel, a traditional classroom) and in 2021 (right panel, an online flipped classroom). Note that due to a technical issue in Week 3 and a public holiday in Week 9, Zoom login data were not available, resulting in gaps in these weeks for the live interactive lectures.
Week 13's revision lecture was better attended, both synchronously and asynchronously, by the 2019 cohort than by the 2021 cohort. This might be because in 2019 an earlier semester's paper-based exam was worked through with students during the revision lecture, while in 2021 an online practice final exam with three attempts, similar to the practice class tests students had experienced during the semester, was made available two weeks before the end of the semester, with correct answers shown after each attempt. Time-poor students might have decided to use the online resources instead of attending the revision lecture. In 2019, a rise in student engagement during the middle and end of the semester in the asynchronous learning environment compensated for the gradual decline in synchronous lecture participation. In the traditional classroom, there was little difference in learning content and activities between synchronous and asynchronous settings, which may explain the inverse relationship between synchronous and asynchronous student engagement in lectures. In the online flipped classroom in 2021, although a higher percentage of students engaged asynchronously via the pre-recorded lectures teaching factual and conceptual knowledge, this engagement quickly declined towards the end of the semester. Still, some students extended their learning by attending the live interactive lectures, which emphasised the application of knowledge to real-world problems. The proportion of students who watched the pre-recorded lectures outweighed live lecture attendance between Week 1 and Week 12, while in Week 13 more students watched the recorded live lecture. Thus, it is reasonable to assume that most students who engaged synchronously also watched the pre-recorded lectures and therefore participated in a flipped classroom during these weeks.
In-class student engagement in the other learning activities remained high, around 80%, in most weeks; nevertheless, a downward trend was noticeable towards the end of the semester in both years (Figure 2). It should be noted that the two peaks in practical-class engagement in Weeks 7 and 11 in both years coincide with class tests 1 and 2, respectively, when students were expected to take their class tests during their practical classes. Given the hurdle requirement of attending at least 10 of the 12 classes, the drop in in-class participation was offset by an uptick in off-class participation (Figure 2). For example, the low tutorial attendance in Week 9 of 2019 was accompanied by a small peak in off-class participation. Off-class participation in 2019 meant that students could submit their solutions to the problems from the missed classes, whereas in 2021 students were required to answer off-class participation quizzes based on the problems from the missed learning activities. In either case, students had to self-study the learning resources within the LMS, such as lecture notes, videos, and readings, before independently solving the problems. As a result, we expect some learning to occur for those absent from their classes.
Figure 2. Weekly engagement in Practical (left) and Tutorial (right) where solid lines represent in-class participation while the dashed lines represent the off-class participation. The black and blue lines represent the 2019 and 2021 student cohorts, respectively.
3.2. Individual-Level Student Engagement in Learning Activities
To investigate students’ preferred mode of learning (asynchronous or synchronous), we analysed the distributions of student engagement for a given learning activity (i.e., lecture, practical or tutorial) for both years.
In 2019, more than 81% of students (978 out of 1195) engaged in asynchronous lectures more than once, in contrast to only 38% (452 students) who participated in synchronous settings at least once (Figure 3). Among those who engaged asynchronously, the median engagement was seven lectures, compared with a mere two classes for the synchronous group. Of the students who engaged asynchronously, 135 participated in all 13 classes, while only six students engaged synchronously in all 13 classes.
Figure 3. The distribution of student engagement in lectures in 2019 (left) and in 2021 (right).
In the online flipped classroom in 2021, more than 81% of the students (957 out of 1032) engaged with the lecture materials at a foundational level at least once over the 13-week teaching period (though data were available for 11 weeks only), as evidenced by the distribution of student engagement in pre-recorded lectures (Figure 3, right panel, middle boxplot). However, student engagement with higher-order learning, either synchronously (live interactive lecture) or asynchronously (recording of the live interactive lecture), was much lower: only 447 and 545 students, respectively, attended these classes at least once. Six students engaged in all 11 live Zoom lectures, 117 were involved in self-paced learning by watching all 13 weeks’ pre-recorded lectures, and only nine students watched all of the recorded live lectures.
As participation in weekly tutorials and practicals is mandatory (a minimum of 10 of the 12 classes is required to pass the unit), it is not surprising that in-class student engagement was high: more than 75% of the students attended more than ten and nine classes in 2019 and 2021, respectively. This also explains the low utilisation of off-class learning activities, which were primarily designed for students to meet the participation requirement.
All 1195 students in 2019 engaged in Practicals and Tutorials in person at least once over the 12 teaching weeks (Figure 4, left); there were no such activities in Week 1, hence 12 rather than 13 weeks. A small number of students in 2021 did not participate in any of the Practicals (n = 2) and/or Tutorials (n = 24). Nevertheless, the distributions in both years are similar, although the variability in 2021 is more prominent (Figure 4, right panel).
Figure 4. The distribution of student engagement in in-class other learning activities in 2019 (left) and in 2021 (right).
Students could use off-class participation to make up for missed Practicals and/or Tutorials. In 2019 (Figure 5, left panel), 156 (13%) and 113 (9%) students engaged in Practicals and Tutorials, respectively, by completing off-class homework submissions. In 2021 (Figure 5, right panel), there was greater uptake of off-class participation activities, both for Practicals, 373 (36%), and for Tutorials, 180 (17%).
Figure 5. The distribution of student engagement in off-class other learning activities in 2019 (left) and in 2021 (right).
3.3. Student Engagement and Their Learning Outcomes
By matching student engagement and final grade data via student emails, we can explore associations between student participation in learning activities and learning outcomes (grades) in different learning modes (asynchronous or synchronous). A student is included in the following analysis if they attended synchronous and/or asynchronous lectures at least once. It is worth noting that a student could appear in both panels of Figure 6, provided that they attended at least one synchronous lecture and watched at least one recorded (asynchronous) class, not necessarily in the same week. Based on the data, among students who attended all 13 asynchronous lectures, 102 did not attend any synchronous lectures. Therefore, it is more informative to present the grade distribution as defined in Figure 6, albeit double-counting some students. Three students were excluded from the synchronous group (i.e., a total of 449 students instead of 452, as in Figure 3) and 14 students from the asynchronous group (i.e., a total of 964 instead of 978, as in Figure 3) due to missing data. Figure 6 demonstrates that in 2019, whether synchronous or asynchronous, there is a positive relationship between student engagement and academic performance. This positive association is more prominent in the asynchronous group (Figure 6, right panel), given its larger sample size, since it includes every student who ever watched a recorded lecture, whether or not they attended the synchronous lecture.
Figure 6. The grade distributions for different modes of lecture participation in 2019.
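The matching and grouping described above can be sketched as follows. This is an illustrative example only, not the authors’ actual pipeline: the column names (`email`, `week`, `mode`, `grade`) and the toy records are hypothetical, and the inner merge mimics how students with missing grade records drop out of the analysis.

```python
# Hypothetical sketch: match weekly engagement records to final grades by
# student email, then count engaged weeks per student and per mode.
import pandas as pd

engagement = pd.DataFrame({
    "email": ["a@uni.edu", "a@uni.edu", "b@uni.edu", "c@uni.edu"],
    "week":  [1, 2, 1, 3],
    "mode":  ["sync", "async", "async", "async"],
})

grades = pd.DataFrame({
    "email": ["a@uni.edu", "b@uni.edu"],  # c@uni.edu has no grade record
    "grade": ["D", "P"],
})

# Count distinct engaged weeks per student and mode; a student appears
# under both modes if they engaged in each at least once (cf. Figure 6).
counts = (engagement.groupby(["email", "mode"])["week"]
          .nunique().reset_index(name="weeks_engaged"))

# An inner merge keeps only students with grade data, mirroring the
# exclusion of students with missing records from each group.
matched = counts.merge(grades, on="email", how="inner")
print(matched)
```

Here `c@uni.edu` is dropped by the inner merge, just as the 3 synchronous and 14 asynchronous students with missing data were excluded.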
The left and middle panels of Figure 7 show the performance of students who had watched at least one pre-recorded lecture and those who had participated in live interactive lectures at least once, respectively. Note that the maximum possible engagement is 11 due to the missing Zoom data in Weeks 3 and 9. The flipped classroom model contributes positively to student performance (median engagement decreases from the highest grade category to the lowest, left to right). As expected, students highly engaged with the pre-recorded lectures were the top performers. However, some students with substantial flipped classroom participation (live lectures) still failed the unit. Admittedly, a closer examination of students’ prior academic records may provide more insight.
Figure 7. The grade distributions for different modes of lecture participation in 2021.
In an ideal world, for the optimal learning experience in an online flipped classroom, students would learn the basic knowledge outside class time via the pre-recorded lectures and come to the interactive class to deepen their understanding. In reality, this did not often happen. Out of 1032 students, 19% had a complete online flipped classroom experience at least once, by watching the pre-recorded lecture and participating in the interactive lecture in the same week. Without examining these students’ engagement in other learning activities, we note that one in five failed the unit (Figure 8), which is lower than the overall fail rate for the unit (28%, Table 2) but nevertheless still very high. For the students who had attended the flipped classroom at least once and still failed the unit (n = 41) (Figure 8, F and FH), participation in other learning activities was reasonable, as minimum attendance was required to pass. However, when we looked closely at their flipped classroom participation, we noticed that more than half (n = 25) engaged in a complete flipped classroom only once, one-third (n = 13) only twice, and three of them only three times, which may explain their poor learning outcomes.
Figure 8. The grade distributions of students who had at least once a complete online flipped classroom experience in the same given week in 2021.
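The “complete flipped classroom week” criterion above (watched that week’s pre-recorded lecture *and* joined that week’s live interactive lecture) can be expressed as a per-week set intersection. This is a minimal sketch under assumed data structures; the student identifiers and week-to-attendee mappings are hypothetical.

```python
# Hypothetical week -> set of student ids who watched the pre-recorded lecture.
prerecorded = {
    1: {"s1", "s2", "s3"},
    2: {"s1", "s3"},
}
# Hypothetical week -> set of student ids who joined the live Zoom lecture.
live = {
    1: {"s1", "s4"},
    2: {"s3"},
}

# A student has a complete flipped experience in a week only if they appear
# in BOTH sets for that week (same-week requirement).
complete_by_week = {w: prerecorded.get(w, set()) & live.get(w, set())
                    for w in prerecorded}

# Students with at least one complete flipped week (the 19% group).
ever_complete = set().union(*complete_by_week.values())
print(ever_complete)
```

Note that `s2` watched the Week 1 recording and `s4` attended the Week 1 live class, but neither did both, so neither counts as having a complete flipped week.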
4. Discussion and Conclusion
Most of our students in both cohorts were from Generation Z (Gen Z; born between 1995 and 2010). It is argued that their learning should be practical and facilitated, including individual/independent learning components that enable them to set their own pace (Hope, 2016). In addition, Gen Z is colloquially called zoomers (Slang Dictionary, 2020), perhaps because it is the generation most exposed to online collaboration, communication and education tools. The flipped classroom would therefore seem to be a well-suited learning environment for them, since it addresses their learning needs by giving them the opportunity to learn independently (pre-recorded lectures) and to undertake practical learning activities during online interactive classes, practicals and tutorials.
The question we aimed to answer is, “Does the flipped classroom with online learning work as well as or better than traditional face-to-face classes for a large first-year statistics unit?” In terms of student engagement, our results showed that most students watched pre-recorded lectures to acquire basic concepts and principles. Only a few of them engaged in live lectures focussing on application and discussion, defeating the flipped classroom’s purpose to cultivate deep and active learning. Poor student engagement in the online flipped classroom (the 2021 cohort) also weakened their learning outcomes compared to their peers who studied in a traditional face-to-face classroom. However, we need to consider the following factors and contexts to unpack the flipped classroom’s effects on student engagement and learning. Firstly, the flipped classroom conventionally utilises a blended learning approach, meaning there is a combination of online (pre-class homework such as watching online lectures) and in-person learning activities. However, the COVID-19 lockdown in the middle of 2021 meant that our university cancelled in-person instruction and transitioned into the special circumstances delivery mode, also known as emergency remote teaching (ERT). Studies (Brooks et al., 2020; Hodges et al., 2020) have shown that ERT in response to COVID-19 should be distinguished from the usual online delivery mode as in the latter, students can self-select to enrol in on-campus or online study modes. In contrast, in the ERT, students and teachers are required to transition into online learning abruptly. A survey by Inside Higher Education (Inside Higher Education, 2020) revealed that 97 university presidents agreed that maintaining student engagement was challenging when classes were shifted online during the pandemic. 
Secondly, the 2021 student cohort had been learning online on and off for at least three semesters, and most felt fatigued by the time the strict lockdown took place in mid-June 2021. Researchers have attributed students’ struggles with engagement when learning transitioned online during the COVID-19 pandemic to the lack of peer connection and to technological issues (Krause & Coates, 2008). The two reasons above may thus contribute to the worsened engagement and learning outcomes of the online flipped classroom compared to the traditional classroom.
First-year students’ resistance to the flipped classroom, caused by the challenges of transitioning into university, may mediate the reported weak (negative) association between student engagement and the flipped classroom. First-year students need to adjust to an unfamiliar learning and teaching environment that focusses on student autonomy and on the application and extension of knowledge, which contradicts their expectations of teacher-centred instruction with explicit guidance. On top of that, in a flipped classroom, first-year students are expected to learn most of the knowledge, guided by academics through recorded lectures and reading materials, before attending a live interactive class. Such activities require better time management, self-study skills and learning strategies, which are among the major concerns of first-year students (Kift et al., 2010). Therefore, students may resist participating in the flipped classroom because they have yet to acquire these necessary skills. The capability to manage one’s time, study and strategies is fundamental to success in the first year and is closely related to student engagement outcomes (Krause & Coates, 2008). Although first-year students should take ownership of their time, instructors and universities could help commencing students form a realistic sense of the expectations around time management and the effort required to engage meaningfully in learning activities (Van der Meer et al., 2010). Besides assisting students in meeting the challenges of university life, we must communicate clearly and explicitly with our students. Researchers have advocated that academics explain the concept and rationale of a flipped classroom to students to get them on board (Findlay-Thompson & Mombourquette, 2014; Moffett, 2014).
In addition, asking questions in live classes, especially in a class more than ten times bigger than their previous experiences in high school, could be daunting. However, we did observe students asking questions individually before or after class, perhaps because they wanted to avoid being identified as a tall poppy. The authors also observed that more students asked questions during the interactive online Zoom classes than in on-campus face-to-face classes, although we did not measure how many more questions came from the students. A future qualitative study with focus groups or interviews might shed light on why students choose not to engage in lectures, what would make lectures more attractive to them, and how lecturers can make classrooms more inclusive. Since students were reluctant to ask questions, in the 2019 and 2021 versions of the unit we used Kahoot for in-class quizzes to identify students’ knowledge gaps and misunderstandings and to tackle them as early as possible. However, given that lecture attendance was low, we were unable to reach all students. We are currently revising the delivery of the unit to ensure active learning and more student buy-in to engage with the learning activities. The newer version of the unit, delivered in 2024, includes pre-lecture activities (such as short quizzes), post-lecture activities (such as homework exercises) and additional readings to extend the learning of students who want to achieve the highest grade (High Distinction).
Despite unsatisfactory student engagement in the flipped classroom, a substantial percentage of students engaged in small group peer and tutor-assisted classes (tutorial and practical) during the pandemic, peaking in weeks where graded assessments were held. Informed by these observations, it may be worthwhile to consider implementing the flipped classroom in these small-class settings. Given we already have a suite of videos on tutorials and practicals currently used as off-class supplementary resources, we could repurpose these pre-recordings as pre-class activities while reserving more in-class time for peer and tutor-assisted activities. Adapting the usual flipped classroom to the small-class setting could address students’ learning anxiety and meet their needs better due to a more intimate learning environment. This proposal echoes the flipped classroom continuum framework (Tomas et al., 2019), where the successful implementation of the flipped classroom depends on students’ learning needs and their readiness for it. Therefore, for first-year students, a mix of traditional lectures and flipped small-class activities may be suitable to scaffold learning activities.
Our analysis had some data-related limitations in measuring student engagement. Our measure of student engagement in lectures in 2019 and 2021 is crude, and the data quality could be better (e.g., we had to use cohort-level data where individual data were not available). Nevertheless, effort was put into maintaining consistent definitions and data analysis, and some simplification was needed to undertake the analysis. For example, the 2021 measure of engagement in live interactive lectures is overly optimistic: as long as a student logged into the Zoom interactive class, we counted them as engaged. Although Zoom recorded how long each student attended the two-hour live lecture in 2021, we chose not to use these data, to keep the measure comparable with 2019, when we used a quiz and did not know how long students remained in the classroom. It would also be challenging to justify appropriate time cut-offs for quantifying different levels of engagement (low, medium and high) at which we could be confident that student engagement and learning occur, which explains the use of the binary definition. In addition, the Echo360 video views did not record the time stamp of asynchronous engagement. Further analysis may reveal whether those who routinely engaged asynchronously (watching the pre-recorded lectures and the recorded live lectures) in the same week as the class achieved more or less the same learning outcomes as those who engaged in the flipped classroom.
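The binary engagement definition discussed above can be made concrete with a small sketch. The field names and minute values here are hypothetical; the point is only that any Zoom login in a week counts as engaged, regardless of duration, keeping the 2021 measure comparable with the 2019 quiz-based one.

```python
# Hypothetical minutes each student attended of a two-hour live lecture.
zoom_minutes = {"s1": 115, "s2": 3, "s3": 0}

# Binary engagement: a student counts as engaged if they logged in at all,
# deliberately ignoring attendance duration (no low/medium/high cut-offs).
engaged = {sid: minutes > 0 for sid, minutes in zoom_minutes.items()}

print(engaged)  # note s2 counts as engaged despite attending only 3 minutes
```

Duration-based cut-offs would require defensible thresholds for “meaningful” engagement, which, as noted above, are hard to justify.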
The results of this study contribute to discussions of the usefulness (or not) of implementing flipped classrooms in large first-year statistics units. We presented the results of two distinct cohorts of students, in a traditional and a flipped classroom, with similar learning materials and the same teachers. Teachers in higher education should carefully weigh the advantages and disadvantages of the flipped classroom experience for first-year students when planning future implementations that benefit students. To achieve better student learning outcomes, lecturers could aim to develop a teacher-student rapport similar to that established between students and their tutors in smaller class settings (Hope, 2016). Additionally, graded or non-graded in-class quizzes during lectures may incentivise students to attend and engage with the lecture activities.
We conclude that the flipped classroom could be an alternative to traditional face-to-face lectures. However, we cannot confidently answer the question, “Does the flipped classroom with online learning work as well as or better than traditional face-to-face classes for a large first-year statistics unit?” Nevertheless, the flipped classroom could be used to maintain student engagement in learning, especially in small learning environments such as tutorials. The flipped classroom model gives students the autonomy to learn whenever they want within the semester by accessing the learning materials wherever they are. However, for ill-prepared, disorganised students, the flipped classroom model can be detrimental: they might fall behind on their scheduled learning activities, relying on the availability of learning materials but never setting aside time to master their learning. A final remark: as crucial as it is for instructors to adopt and experiment with flipped classrooms, a whole-of-institution approach is required to support first-year student transition, engagement and learning in higher education.