Emergency Remote Teaching and Programme Intervention: Towards a Human-Machine Pedagogy Based on Interactive Learning Documents

P. Stiefenhofer^{1}, L. Xie^{1}, K. Wildish-Jones^{2}

^{1}Department of Economics, Newcastle University Business School, Newcastle upon Tyne, UK.

^{2}Department of Chemical Engineering, Newcastle University School of Engineering, Newcastle upon Tyne, UK.

**DOI:** 10.4236/ce.2022.1311234

Teaching restrictions due to the COVID-19 pandemic rapidly shifted the global learning environment, demanding that educators implement technology-based solutions in their instruction. In this case study, we discuss how Interactive Learning Documents (ILDs), based on R-Markdown and R-Shiny technologies, can contribute to building student confidence in the transition from secondary to tertiary education under the conditions of Emergency Remote Teaching. We propose a human-machine based active learning pedagogy implemented in a stage one Introductory Statistics course delivered to social science students. Our results show that Interactive Learning Documents help ease students' transition to university education. Three factors contribute to this result: 1) ILDs help students reduce statistics anxiety, 2) ILDs support learning by efficiently organising large quantities of complex material, and 3) ILDs effectively engage students by providing a variety of dynamic interactions with the subject.

Keywords

Active Learning, R-Shiny Apps, R-Markdown, Statistics, Pedagogy

Share and Cite:

Stiefenhofer, P., Xie, L. and Wildish-Jones, K. (2022) Emergency Remote Teaching and Programme Intervention: Towards a Human-Machine Pedagogy Based on Interactive Learning Documents. *Creative Education*, **13**, 3690-3714. doi: 10.4236/ce.2022.1311234.

1. Introduction

The first-year student university experience is of utmost importance for students and universities alike as it is inherently linked to student satisfaction, retention, and academic success (Schütze, Bartyn, & Tapsell, 2021). There has long been a concern, however, that A-levels are too narrow a preparation for the demands of higher education (Roberts & Higgins, 1992), and with an increasingly diverse student body (Larsen et al., 2021) universities have a shared responsibility to support students in their transition to university education (Nel, Troskie-de Bruin, & Bitzer, 2009). While most students manage the transition into university life successfully, there is a significant minority (20% - 30%) who consistently experience academic and personal difficulties and for whom the change in academic culture is an unpleasant experience (Lowe & Cook, 2003). Students experiencing heightened psychological distress because of a changing learning environment are likely to be negatively affected in their competence and feeling of self-belief (Christie et al., 2006). To support these students and improve retention rates during the transition year, universities develop and implement local and institutional interventions (Thomas et al., 2017). The structure of an intervention programme, according to Thomas et al. (2017), typically starts pre-university-entry and has an emphasis on engagement (Kearney, 2019) with an explicit, discipline-specific, academic purpose, with an additional role of assisting students in developing friendships and links with academic and support staff (Meehan & Howells, 2018). A wealth of literature exists on what constitutes best practice in transitioning programmes, and a consensus has emerged favouring a holistic approach, one that links academic preparedness with affective elements such as self-efficacy, resourcefulness, a feeling of belonging, and the development of a sense of identity as a member of a learning community (Larsen et al., 2021).
Generally, these programmes are designed for on-site, in-person mentoring (McWilliams & Allan, 2014; Curtis et al., 2017; O'Rourke et al., 2021) and, owing to the rapid shift to emergency remote teaching caused by the COVID-19 pandemic, they lost much of their usual effectiveness. Consequently, educators are now challenged to find innovative ways to integrate intervention mechanisms into their course curriculum to help ease the students' transition. In this paper, we develop human-machine interactions that can be integrated as intervention mechanisms into a stage one statistics course to support students in their transition to a new learning environment.

In addition to the complex challenges associated with a typical transition to university, students find themselves burdened with a further layer of difficulty caused by the conditions imposed upon them by the COVID-19 pandemic. The widespread lockdowns of societies across the world resulted in universities being forced to rapidly close campuses and shift to emergency remote teaching at incredibly short notice. Neither students nor educators were well-prepared for this sudden shift to online teaching (Kyne & Thompson, 2020). Jayalath et al. (2020) study factors influencing a sudden transition to online teaching during the COVID-19 pandemic and find that science and technology related modules are much more challenging to deliver online. Despite online statistics teaching pedagogies existing as a research domain since the late 1990s, and despite many universities already employing a blended learning approach with some degree of online teaching (Mills & Raju, 2011), universities were generally not prepared for a sudden switch to full remote teaching. Fernando, Patrizia, & Tiziana (2020) qualitatively analyse technological, pedagogical, and social challenges of emergency remote teaching. Pedagogical challenges arose because short transition times to remote teaching prevented students and educators from acquiring the necessary digital skills. Students also reported ineffective organisation of learning material (structured content versus an abundance of online resources) and a lack of opportunity for interactive learning (Johnson et al., 2009). In this paper, we partially address two of the challenges associated with an abrupt shift to online teaching. We design an HTML interface, based on R-Markdown and R-Shiny technologies, enabling educators to efficiently organise large and complex quantities of learning material, and build specific pedagogical tools used to effectively engage students with a variety of dynamic interactive learning options.

Further to the issues related to the transition into university education during a pandemic mentioned above, social science students often find little motivation and passion for the subject of statistics and struggle to successfully transition to university level statistics (Sizemore & Lewandowski, 2009; Perlman & McCann, 1999). Moreover, nearly 75% of students report some degree of statistics anxiety while enrolled in quantitative courses (Kinkead, Miller, & Hammett, 2016; Onwuegbuzie & Wilson, 2003). Recent years have seen an increase in the number of articles on statistics anxiety in the literature, as researchers have recognised that statistics anxiety is a multidimensional construct that has debilitative effects on academic performance (Onwuegbuzie & Wilson, 2003). Liu and Yuan (2021) show that attitudes towards statistics are a significant predictor of statistics anxiety in an online or hybrid learning environment. A meta-analysis of the SATS-28 and SATS-36 surveys (Schau et al., 1995; Schau, 2003) shows that there is a statistically significant positive relationship between attitudes towards learning statistics and “Affect”. The Affect dimension of the SATS survey measures students' feelings concerning statistics (Emmioğlu & Capa-Aydin, 2012). Positive attitudes towards learning statistics are positively correlated with performance, student retention, satisfaction, and motivation to continue to engage with statistics (Ashaari et al., 2011; Vanhoof, 2010; Rahnaward Ghulami, Hamid, & Zakaria, 2015; Law, 2021). These observations motivate us to develop human-machine based intervention strategies at course level to help social science students reduce statistics anxiety and increase performance and retention during the transition year under the teaching restrictions imposed by the COVID-19 pandemic.
We hypothesise that a carefully designed human-machine pedagogy based on Interactive Learning Documents, which addresses some of the dimensions of the SATS-36 survey (Schau et al., 1995; Schau, 2003) and the factors of online system anxiety (Liu & Yuan, 2021), will help social science students reduce statistics anxiety during the transition to university under the conditions of a pandemic. Other interventions have been developed to achieve similar aims, including the one-minute strategy (Chiou, Wang, & Lee, 2014), innovative teaching methods (Einbinder, 2014), and, similarly to ours, active learning strategies (Rapp-McCall & Anyikwa, 2016).

Within our research, we consider a constructivist approach to learning statistics, which holds that learners actively co-construct their own knowledge in a human-machine interaction, where reality is determined by the student's own learning experiences (Kratochwill et al., 1999). As a way of learning, constructivism is widely applied to active learning settings and has been effectively espoused as a learning environment in science education (Duit, 1996; Mvududu, 2005). We address how constructivism can be used in conjunction with emergency remote teaching and active learning pedagogy to assist anxious students in their transition to university. Active learning has been found to be an effective alternative to traditional classroom teaching (Bonwell & Eison, 1991). In this research, we develop interactive learning documents integrating active learning within the constructivist paradigm in a manner accessible by remote learning. The documents aim to effectively organise large and complex learning material and provide students with dynamic interactions with learning content in a way that is less intimidating than a typical class environment.

Research on R-Markdown as a learning technology in statistics is sparse. Bray, Çetinkaya-Rundel, & Stangl (2014) discuss the advantages of teaching statistics in a reproducible paradigm with R-Markdown, finding that R-Markdown allows instructors to assess the reasoning as well as the final outcome of a statistical analysis. Baumer et al. (2014) use R-Markdown in an introductory statistics module as a new technology that makes creating fully reproducible statistical analyses simple, and provide evidence that R-Markdown can be used effectively as an instruction tool within the reproducible paradigm. Finch et al. (2021) report the successful use of R-Markdown documents in teaching an introductory statistics course at tertiary level: from a group of 44 students, of whom the majority (60%) had no prior experience with R, 89% reported that R-Markdown was a useful aid to their learning.

R-Shiny applications have been utilised in education research previously, with uses ranging from specific types of statistical analysis (Elliott & Elliott, 2020) to enhancing active student participation in group discussions. Doi et al. (2016) developed and published online a set of R-Shiny applications as a teaching tool for in-class concept demonstration, highlighting their interactive, dynamic, visually appealing, and user-friendly features and demonstrating their pedagogical use. R-Shiny applications have also been used as an additional resource when teaching challenging statistical concepts: Williams and Williams (2018) created an R-Shiny application to assist students in learning difficult concepts such as confidence intervals. Aiming to improve the future development of R-Shiny applications for teaching introductory statistics, Gonzales et al. (2018) conducted a qualitative study in which students' perspectives on learning statistics with R-Shiny were assessed across ten R-Shiny applications developed for this purpose. From this study, a list of suggestions for R-Shiny application development for teaching introductory statistics was produced for instructors interested in developing their own applications.

In this paper, we take a mixed approach to designing learning material by combining R-Markdown with R-Shiny, utilising the “learnr” package in R. We develop R-Markdown files (RMD) and embed R-Shiny interactive elements inside them. The files are distributed to students as HTML documents via the shinyapps.io cloud. A key purpose of our ILDs is to support students in their transition from high school to university under the conditions of Emergency Remote Teaching, when intervention programmes fail to operate effectively due to a rapid shift to online teaching. Our human-machine based pedagogy addresses this by utilising ILDs which 1) help students effectively organise large and complex quantities of learning material, 2) provide a wide range of opportunities for active learning via dynamic interactive elements, and 3) help students reduce statistics anxiety. The organisation of the paper is as follows. Section two introduces the methodology. Section three discusses the main results, followed by discussion and conclusion. The survey questions are provided in Appendix I and some snapshots of an ILD are provided in Appendix II.

2. Methodology

We develop a set of Interactive Learning Documents for statistics instruction. A link to a prototype ILD was distributed via Shinyapps.io to a random set of 32 students out of a cohort of 215 enrolled on the Econ1007 Introductory Statistics course at Newcastle University Business School, with a request to provide feedback about technical, pedagogical, and practical aspects of ILDs. Data was collected via an online survey.

The purpose of the prototype ILD was to substitute the existing weekly homework problem set with a new human-machine based interactive learning activity. The ILD made it possible to study features which contribute to an effective learning intervention in an emergency remote teaching environment, helping to reduce statistics anxiety in the transition to higher education.

ILDs are built with the open-source RStudio software. They are R-Markdown files (RMD) produced with the “learnr” package, with embedded R-Shiny interactive elements, and are distributed to students via the Shinyapps.io server (RStudio Cloud). A typical ILD required on average 1500 lines of R code and 36 hours to develop. Students could access ILDs for evaluation via any web browser on any appropriate technological platform, including mobile phones. Following inspection of the prototype ILD, students responded to a survey consisting of 8 groups of questions containing 32 questions in total. The outcome of this survey is intended to further shape the development of ILDs and our human-machine pedagogy based on them.
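As an illustration of the construction described above, the following is a minimal, hypothetical learnr document skeleton; the topic, question text, and chunk names are invented for illustration and are not taken from the actual ILD source. It combines narrative content, a multiple-choice quiz, and an editable code box in one R-Markdown file that renders to an interactive HTML document:

````markdown
---
title: "Sampling Distributions (illustrative ILD skeleton)"
output: learnr::tutorial
runtime: shiny_prerendered
---

```{r setup, include=FALSE}
library(learnr)
```

## The Sample Mean

Narrative content, figures, or an embedded pdf/video can be placed here.

```{r mean-quiz, echo=FALSE}
quiz(
  question("Which statistic is an unbiased estimator of the population mean?",
    answer("The sample mean", correct = TRUE),
    answer("The sample median"),
    allow_retry = TRUE
  )
)
```

```{r mean-exercise, exercise=TRUE}
# Students can edit and run this code box directly in the browser,
# without installing R locally
mean(rnorm(100, mean = 5, sd = 2))
```
````

Publishing such a file to shinyapps.io yields the browser-accessible, stateful document described in this section.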

The interactive learning activities within the documents range from multiple choice questions, which assess understanding of the taught content and definitions provided within the document, to interactive applications in which students can change the parameters of sets of data and see, in real time, how this affects the final result, whether a graph or a diagram. Interactive Learning Documents may include any or all of the following human-machine interactions:

1) Effective display and organisation of learning content (Figure A1). Learning content is organised via a menu allowing users to move between document sections and learn content swiftly. Moreover, ILDs retain user activity information, so a user can pick up learning at any point in the future and continue where it was left off. A “Start Over” button also allows users to delete all information and redo the ILD exercises as many times as they like (Figure 1, bottom left). Learning content can not only be directly embedded within the HTML R-Markdown file but may also be displayed via scrollable embedded pdf files. Figures and illustrations (Figure A1) support organising and displaying learning content, and equations (Figure A2 and Figure A3) can be written in LaTeX style. 2) Code boxes (Figure A4) can be used to learn coding, apply existing code to derive solutions to given exercises, or illustrate real-world problems. R code can be executed directly within the HTML document; users do not need to download R or RStudio to run code. 3) Multiple choice quizzes and solutions (Figure A2 and Figure A5) are used to reinforce learning and understanding. 4) Interactive videos (Figure A6). R-Markdown supported services include YouTube and Vimeo. 5) R-Shiny components (Figure 1) enable users to dynamically interact with functions and data in ways that would not be possible in static book form. Each “learnr”-based ILD is an R-Shiny interactive document, which means that interactive learning documents can be deployed in all of the same ways as R-Shiny applications, including locally on an end-user's machine (including mobile phones, Figure A5) or on shinyapps.io.

Figure 1. Snapshot of ILD app showing a dynamic interactive activity.

3. Results

3.1. Descriptive Statistics

In total, 32 Introductory Statistics students enrolled on the module Econ1007 participated in the evaluation of a prototype ILD, of whom 43.4% (14) are female and 56.6% (18) male. The majority of students report average (53%) or above average (19%) mathematical pre-requisite skills; a substantial minority of 28% report below average self-assessed mathematical skills. A large proportion of the cohort, 81.2% (26), spent a significant amount of time with the ILD in order to evaluate it effectively, while 18.8% (6) evaluated the features of the ILD in less than 10 minutes.

Figure 2 shows that the ILD is easy to use (bar plot (a)) and to navigate (bar plot (e)), and that it effectively engages students in learning statistics (bar plot (b)). While 59.4% of the students find the ILD visually appealing, a significant number suggest improving that aspect of the ILD. Bar plot (c) also suggests that the current prototype lacks a “fun to use” factor.

Figure 2. Set of questions Q4 about overall experience of ILD features.

Figure 3 shows that 97% of the students enjoy the compact organisation of large and complex learning material via ILDs (bar plot (a)); 3% are indifferent regarding this feature. 90.6% of all students like or strongly like the set of short multiple-choice questions designed to self-check basic understanding of complex statistics concepts (bar plot (b)). While very few students are indifferent between providing or not providing solutions to multiple choice questions, most students express preferences in favour of solutions, with 81.43% of all students liking or strongly liking them (bar plot (c)). The short multiple-choice questions, designed to assist students in better understanding definitions, can be repeated indefinitely until a student decides to progress to the next exercise. An inherent feature of the ILD is that students can pick up work at any point where it was left off, or refresh it with one click on the “Start Over” button at the bottom of the side-bar menu (Figure 1).

Figure 3. Set of questions Q5 about learning with ILDs.

Figure 4. Set of questions Q5 about learning/teaching pedagogy.

Figure 4, bar plot (a) shows that not all students derive utility from using pre-written R code to check solutions to exercises; 21.9% of all students have no preference between having or not having this feature. Scrollable embedded pdf files, which provide detailed mathematical derivations of complex problems and proofs of basic theorems or solutions to exercises, are highly valued by 71.9% and valued by 21.9% of all survey participants (bar plot (b)). Bar plot (c) shows that 75% of the students express strong preferences for learning activities including interactive dynamic visualisations where they can choose the values of parameters. 15.1% like this feature, leaving less than 1% indifferent towards it. No student reports any level of dislike for any of the features listed in Figure 4. An example of interactive dynamic visualisations is shown in Figure 1. The menu on the left side further illustrates 1) the easy-to-use and 2) easy-to-navigate properties of the ILD (Figure 2, bar plots (a) and (e)).

Figure 5, bar plot (a) shows that most students (84.4%) like the applications section of the ILD, which contains stylised real-world problems with detailed video solutions (Figure A6). In these videos, an instructor provides a step-by-step derivation of solutions to exercises using various pedagogical methods including PowerPoint slides, Excel spreadsheets, a whiteboard, etc. A small proportion of students express strong preferences in favour of this feature and another small proportion are indifferent between having or not having it. Bar plot (b) shows that students only marginally derive utility from code boxes in which they can use pre-existing code or write their own code to check answers to exercises (Figure A4); roughly half of the students do not explicitly derive utility from this learning feature. Nearly 83% of all students like or strongly like this learning feature while 17% are indifferent (bar plot (c)).

Figure 6 presents a summary of students' feelings concerning statistics in the context of a human-machine pedagogy based on Interactive Learning Documents. The variables q21 - q26 jointly measure one of the six dimensions of the SATS-36 survey called “Affect”. Unlike the SATS-36 survey, the variables report relative differences between students working with ILDs and their usual homework type.

Figure 5. Set of questions Q5 about learning/teaching pedagogy.

Figure 6. Set of questions Q6: Measuring “Affect”.

Figure 6, bar plot (a) shows a positive relative effect of learning with ILDs on students “liking statistics” (84.4%); 5.6% show no or a negative relative effect. Similarly, bar plot (d) shows a positive relative impact of learning with ILDs on “liking taking statistics courses” (82%). 3% of the students report that ILDs strongly help them “reduce insecurity” when having to solve statistics problems, supported by 82% expressing some positive effect and 15% reporting no or a negative effect (bar plot (b)). ILDs are reported to moderately “reduce the feeling of being scared by statistics” (85%); 15% of the students express an indifferent or negative effect of learning with ILDs on “reducing statistics anxiety” (bar plot (e)). The next two variables show a weak effect of the proposed learning activity on “reducing statistics anxiety”: bar plot (f) shows that 53% of students do not feel that ILDs “help them reduce stress during statistics class”, and bar plot (c) shows that 47% report that ILDs do not “help reduce frustration” when going over statistics tests in class.

Table 1, q27 shows that a significant percentage of the students (91%) prefer learning with interactive learning apps to their existing homework type. During the academic year 2020-21, homework consisted of solving multiple choice questions which required calculations; worked-out solutions to these problems were not distributed to students. An online discussion forum was set up, and students were encouraged to interact with each other and to discuss homework. A small proportion, 9% of the students, are indifferent or do not wish to substitute the existing homework type with ILDs. A large proportion of survey respondents agree (q32, 25%) and strongly agree (q32, 72%) that ILDs provide significant variation in their interactive learning. The responses to question q33 reveal that most students agree (44%) and strongly agree (44%) that ILDs are also more effective for exam revision than static paper-and-pencil based weekly problem sets; 12%, however, are indifferent or disagree with this statement. Question q28 assesses whether ILDs improve student confidence in learning statistics: 38% strongly agree and 50% agree that ILDs build confidence in learning statistics, while 12% are indifferent or disagree.

3.2. Cronbach’s Alpha and Satisfaction Rates

We construct a variable representing reduction of statistics anxiety as indicated in Table 1 and calculate Cronbach’s alpha by

Table 1. Question set Q8. Agree or disagree questions about learning with ILDs.

$\alpha = \frac{K}{K-1}\left(1-\frac{\Sigma Var}{Var_{T}}\right)$,

where *K* is the number of questions in each category, Σ*Var* is the sum of the variances associated with each question, and *Var _{T}* is the variance associated with the observed total score. Cronbach's alpha is used to measure the reliability and consistency of our data and to determine how closely elements of a group are related; a Cronbach's alpha of 0.7 or higher is considered sufficient to ensure satisfactory internal consistency. We then calculate for each question a satisfaction rate and provide the associated standard error. The satisfaction rate is given by

${S}_{R}=\left(1-\frac{{A}_{TV}}{{M}_{PV}}\right)\times 100$,

where *A _{TV}* is the average total value and *M _{PV}* is the maximum possible value.
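As a sketch of the two statistics above (in Python rather than the original software, with invented toy data), the satisfaction rate is read as the percentage $(1 - A_{TV}/M_{PV})\times 100$; recall that on the survey's scale a low score indicates agreement:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x questions matrix of Likert scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each question
    total_var = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def satisfaction_rate(items: np.ndarray, max_per_item: int = 5) -> float:
    """S_R = (1 - A_TV / M_PV) * 100, where A_TV is the average total
    score and M_PV the maximum possible total score."""
    a_tv = items.sum(axis=1).mean()
    m_pv = max_per_item * items.shape[1]
    return (1 - a_tv / m_pv) * 100

# Toy example (invented data): 4 respondents, 3 questions, 5-point scale
scores = np.array([[1, 2, 1],
                   [2, 2, 1],
                   [1, 1, 2],
                   [2, 3, 2]])
print(cronbach_alpha(scores), satisfaction_rate(scores))
```

With the paper's survey matrix in place of `scores`, these two functions reproduce the alpha and satisfaction figures reported in Tables 2-4.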

Table 2 shows that overall, 76.88% of the students believe that ILDs help them reduce statistics anxiety. A Cronbach's alpha of 0.9917 strongly confirms the consistency of this result. An overall mean value of 2.02 with a variance of 0.79 indicates that the majority of students are strongly convinced of this result. Table 2 also reports the mean value and variance for each question, where a low mean indicates a higher level of satisfaction. The variables q21 - q26 represent the SATS measure of the “Affect” dimension, with a mean of 2.05 and variance of 1.02, suggesting that ILDs help students reduce statistics anxiety. Questions q23 and q26, which both relate to in-class activities, show significantly lower satisfaction rates.

Table 2. Reduction of statistics anxiety. *K* = 10, *Var*(*T*) = 73.76, alpha = 0.9917.

Table 3. Organisation of learning content. *K* = 6, *Var*(*T*) = 15.17, alpha = 0.9919.

Table 3 shows that overall, 87.5% of the students believe that ILDs effectively organise complex and large quantities of learning material. This result is strongly confirmed by a Cronbach's alpha of 0.9919. An overall mean of 1.8 with variance 0.44 indicates that the majority of students are very strongly or strongly convinced of this result. Among all elements, with a mean of 1.44 and variance of 0.31, students report that recalling definitions and results in compact form contributes most strongly to the effective organisation of learning material in ILDs (q9).

Table 4. Variation of interactive learning. *K* = 7, *Var*(*T*) = 15.34, alpha = 0.9964.

Table 4 shows that overall, 83.04% of the students believe that ILDs provide sufficient opportunities for interactive learning. This result is strongly confirmed by a Cronbach's alpha of 0.9964. An overall mean of 1.82 with variance 0.32 suggests that the majority of students strongly agree or agree with this result. Among all elements, with a mean of 2.53 and variance of 0.44, code boxes, in which students can use pre-existing code or write their own, are least effective in explaining the main result (q16). On the other hand, with a mean of 1.31 and variance of 0.28, students report that ILDs help them prepare effectively for exams (q32).

3.3. Test for Research Question 1

In the Q-Q plot (Figure 7), the data points lie close to the straight line, and the Kolmogorov-Smirnov (K. S) test (Table 5) shows that the *p*-value is slightly greater than 0.05. Hence, we conclude that our data are normally distributed, and we can use the one-sample t-test (Table 6) to test our hypothesis.

The *p*-value in Table 6 is significantly less than 0.05; hence we reject the null hypothesis, indicating a mean response significantly below the test value of 3.
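The normality check and one-sample test used in Sections 3.3-3.5 can be sketched as follows; this is a Python/scipy illustration with simulated data standing in for the survey responses, not the authors' original analysis:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Simulated Likert-derived scores centred below the neutral test value of 3
responses = rng.normal(loc=2.0, scale=0.8, size=32)

# Kolmogorov-Smirnov test against a normal distribution fitted to the data.
# (Note: estimating the parameters from the same data means the standard
# K-S p-value is only approximate; a Lilliefors correction is more exact.)
ks_stat, ks_p = stats.kstest(responses, "norm",
                             args=(responses.mean(), responses.std(ddof=1)))

# One-sample t-test with test value 3 (the neutral midpoint of the scale)
t_stat, t_p = stats.ttest_1samp(responses, popmean=3.0)

print(f"KS p = {ks_p:.3f}, t = {t_stat:.2f}, p = {t_p:.3g}")
```

A K-S *p*-value above 0.05 leaves normality unrejected, and a negative, significant t-statistic corresponds to the reported rejection of the null at the test value of 3.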

3.4. Test for Research Question 2

In the Q-Q plot (Figure 8), the data points lie close to the straight line, and the K. S test (Table 7) shows that the *p*-value is slightly greater than 0.05. Hence, we conclude that our data are normally distributed, and we can use the one-sample t-test (Table 8) to test our hypothesis.

The *p*-value in Table 8 is significantly less than 0.05; hence we reject the null hypothesis, indicating a mean response significantly below the test value of 3.

3.5. Test for Research Question 3

In the Q-Q plot (Figure 9), the data points lie close to the straight line, and the K. S test (Table 9) shows that the *p*-value is slightly greater than 0.05. Hence, we conclude that our data are normally distributed, and we can use the one-sample t-test to test our hypothesis.

The *p*-value in Table 10 is significantly less than 0.05; hence we reject the null hypothesis, indicating a mean response significantly below the test value of 3.

Figure 7. Q-Q plot, research question 1.

Table 5. K. S test for research Question 1.

Table 6. One sample t-test for research Question 1, test value = 3.

Figure 8. Q-Q plot, research question 2.

Table 7. K. S test for research Question 2.

Table 8. One sample t-test for research Question 2, test value = 3.

Figure 9. Q-Q plot, research question 3.

Table 9. K. S test for research Question 3.

Table 10. One sample t-test for research Question 3, test value = 3.

3.6. Ordinary Least Squares

We use a simple ordinary least squares model to further study the usefulness of ILDs as an element of a remote intervention programme at module level that helps students in their transition to university. We show that there is a positive relationship between time spent using the ILDs and reducing anxiety, organisation of learning material, and variation in active learning. We consider

$y={\beta}_{0}+{\beta}_{1}x+u$,

where *y* represents the dependent variable of interest, *x* is the amount of time a student spends investigating the ILD, ${\beta}_{0}$ and ${\beta}_{1}$ are coefficients, and *u* is an error term. The dependent variables are constructed as the sum of a weighted average of a set of variables as follows: “Reducing Anxiety” (ReAn): q6, q21 - q26, q28 - q29, q31, q34. “Organisation of Information” (OrIn): q4, q5, q8 - q11, q27, q30, q33. “Interactive Learning” (InLe): q7, q12 - q17, q32.
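The slope and its t-statistic for such a simple regression can be computed directly; the following is a sketch in Python with invented data (variable names like `time_spent` and `score` are illustrative, not the paper's survey variables):

```python
import numpy as np

def ols_slope(x: np.ndarray, y: np.ndarray):
    """Fit y = b0 + b1*x + u by OLS; return (b0, b1, t-statistic of b1)."""
    n = len(x)
    x_bar, y_bar = x.mean(), y.mean()
    sxx = ((x - x_bar) ** 2).sum()
    b1 = ((x - x_bar) * (y - y_bar)).sum() / sxx
    b0 = y_bar - b1 * x_bar
    resid = y - (b0 + b1 * x)
    s2 = (resid ** 2).sum() / (n - 2)   # residual variance
    t = b1 / np.sqrt(s2 / sxx)          # t-statistic for H0: b1 = 0
    return b0, b1, t

# Invented data: minutes spent with the ILD vs. a composite outcome score
time_spent = np.array([5, 10, 12, 20, 25, 30, 40, 45], dtype=float)
score = np.array([2.0, 2.5, 2.4, 3.1, 3.0, 3.8, 4.2, 4.5])
b0, b1, t = ols_slope(time_spent, score)
print(f"slope = {b1:.3f}, t = {t:.2f}")
```

Comparing |t| with the critical value at the 5% level gives the significance statements reported for plots (a)-(c) of Figure 10.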

Figure 10. Analysis of reduction of anxiety (a), organisation of learning content (b), and variation of interactive learning (c).

Figure 10, plot (a) shows the effect of learning statistics with ILDs, relative to the usual learning type, on reducing statistics anxiety. Students who spend a significant amount of time learning with ILDs tend to report stronger evidence of statistics anxiety alleviation than students who spend less time engaging with ILDs. The regression slope coefficient of 4.22 is associated with a t-statistic of 3.9, suggesting high statistical significance. Plot (b) suggests that students who spend more time learning with the ILD are more likely to benefit from the effective organisation of large and complex learning material than students who spend less time with it; the slope coefficient, however, is statistically insignificant at the 5% level. Plot (c) shows that there is a positive relationship between time spent learning with the ILD and variation of interactive learning: a slope coefficient of 0.78 is associated with a t-statistic of 3.138 and hence is statistically significant at the 5% level.

Figure 11. Pre-entry mathematics level.

Figure 11 shows that the self-assessed pre-entry mathematics level has no statistically significant effect on students’ preference for learning with ILDs, suggesting that ILDs do not disadvantage groups of students with different levels of prior preparation.

4. Conclusion

The results of this study suggest that ILDs served as an effective Emergency Remote Teaching tool for educators who wished to engage students in a human-machine interaction-based pedagogy during the teaching restrictions imposed by the global health pandemic. The results also suggest that ILDs may positively contribute to student satisfaction and retention within the wider context of an elementary statistics course. In this paper, we assessed the extent to which ILDs served as an effective module-level programme intervention supporting students in their transition to university. We showed that an ILD-based pedagogy can help educators achieve this aim 1) by utilising R-Markdown with embedded R-Shiny technologies, which can effectively organise large quantities of complex learning material, and 2) by providing students with ample opportunity to engage in active learning through dynamic interactions. Our ILD-based pedagogy directly addresses two key weaknesses of ERT discussed in Jayalath et al. (2020) with positive results. 3) We also showed that ILDs help students reduce statistics anxiety, another desired element of a programme intervention (Kinkead, Miller, & Hammett, 2016; Onwuegbuzie & Wilson, 2003). ILDs were not designed to replace intervention programmes; within the context of an ERT policy, however, they were expected to support malfunctioning intervention programmes which, owing to the slow transition to an online setting, were only partially successful in easing the student transition to university. A further benefit of ILDs is that they can easily be incorporated into a blended learning environment, with several positive impacts on student learning. ILDs allow students to co-create knowledge in a human-machine-based interaction that includes dynamic interactive Shiny elements and code boxes.
While the latter feature offers students an opportunity to co-create their own knowledge and understanding within the constructivist paradigm, running pre-existing code to obtain solutions to exercises does not appear to convince students of its learning effect. This observation suggests that these features might be more effective in modules that require R coding as part of the syllabus, which was not the case in our scenario. To improve this feature, we propose suppressing code visibility in the code boxes and providing only a “run code” button. A further advantage of this solution is that ILDs will run much more smoothly on mobile phones, as copying and pasting code is no longer required. Both q23 and q26 address students’ feelings during in-class learning activities. We interpret the results for these variables with some caution because, in our experiment, students did not attend class but accessed the apps remotely. Further research is required to investigate how ILDs can help reduce statistics anxiety during in-class activities.
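As an illustration of the proposed change, a chunk in the R-Markdown source of an ILD can be hidden from the reader while its output is still rendered, and learnr-style exercise chunks render with a “Run Code” button so students need not copy and paste code. The chunk names, options, and code below are a sketch under these assumptions, not the ILDs’ actual source.

````markdown
```{r hidden-demo, echo=FALSE}
# echo=FALSE suppresses this code in the rendered ILD;
# only the output (here, a plot) is shown to the student.
hist(rnorm(100), main = "Sampling distribution demo")
```

```{r run-only-demo, exercise=TRUE}
# A learnr exercise chunk renders with a "Run Code" button,
# so students can execute pre-written code without editing it.
t.test(rnorm(30), mu = 0)
```
````

With `echo=FALSE` the code never reaches the mobile interface at all, which is consistent with the smoother mobile-phone behaviour anticipated above.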

Funding

This research was funded by the NUBS Digital Innovation in the Curriculum Fund.

Appendix I

Q1: I’m

Male, female, prefer not to say

Q2: My mathematics university entry level is

Below average, average, above average

Q3: How much time did you spend investigating the Interactive Learning App?

Less than 10 minutes, between 10 and 20 minutes, more than 20 minutes

Q4: What sentences describe your experience with the Interactive Learning App?

q4: [It’s easy to use]

q5: [I feel engaged with my homework]

q6: [It’s fun to use]

q7: [It’s visually appealing]

q8: [It’s easy to navigate]

Q5: Comment on the features of the Interactive Learning App (Recall that the App is expected to replace weekly homework activities, not lectures/tutorials/seminars).

q9: [Recall of definitions and results in compact form]

q10: [Short questions to check understanding of definitions]

q11: [Solutions to multiple choice questions]

q12: [Using pre-written R code to check solutions]

q13: [Embedded pdf files with detailed worked out examples]

q14: [Interactive Dynamic Visualisations where I can choose parameters]

q15: [Applications with video solutions]

q16: [Code boxes (where I can use existing code or write own code)]

q17: [Pre-written code that generates graphs and solutions to problems]

Q6:

q18: Briefly explain what you like most about the App

q19: Briefly explain what you like least about the App

q20: What further functionalities would you like the App to have?

Q7: Compared to this year’s homework activity what is the expected effect of learning with Interactive Learning Apps on the following: q21 - q26

q21: [I will like statistics]

q22: [I will feel insecure when I have to do statistics problems]

q23: [I will get frustrated going over statistics tests in class]

q24: [I will enjoy taking statistics courses]

q25: [I’m scared by statistics]

q26: [I will be under stress during statistics class]

Q8: Agree or disagree questions: All questions are to be compared to this year’s homework activity (quizzes without solutions)

q27: [I prefer Interactive Learning Apps (ILDs) to the existing homework type]

q28: [Learning with ILDs improves my confidence]

q29: [ILDs help me better prepare for exams]

q30: [I learn more effectively with ILDs]

q31: [ILDs help me better understand difficult statistics concepts]

q32: [ILDs provide more variation in learning]

q33: [ILDs are more effective for exam revision]

q34: [ILDs increase my learning motivation]

Appendix II

Figure A1. Organisation of learning material.

Figure A2. Question to check understanding of definition with solution.

Figure A3. Embedded scrollable pdf files.

Figure A4. Code boxes to be used in different ways.

Figure A5. Mobile phone ILD interface.

Figure A6. Interactive videos.

Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this paper.

References

[1] Ashaari, N. S., Judi, H. M., Mohamed, H., & Tengku Wook, M. T. (2011). Student’s Attitude towards Statistics Course. Procedia-Social and Behavioral Sciences, 18, 287-294. https://doi.org/10.1016/j.sbspro.2011.05.041

[2] Baumer, B., Cetinkaya-Rundel, M., Bray, A., Loi, L., & Horton, N. J. (2014). R Markdown: Integrating a Reproducible Analysis Tool into Introductory Statistics. Technology Innovations in Statistics Education, 8. https://doi.org/10.5070/T581020118

[3] Bonwell, C. C., & Eison, J. A. (1991). Active Learning: Creating Excitement in the Classroom. ERIC Publications.

[4] Bray, A., Cetinkaya-Rundel, M., & Stangl, D. (2014). Taking a Chance in the Classroom: Five Concrete Reasons Your Students Should Be Learning to Analyze Data in the Reproducible Paradigm. Chance (New York), 27, 53-56. https://doi.org/10.1080/09332480.2014.965635

[5] Chiou, C.-C., Wang, Y.-M., & Lee, L.-T. (2014). Reducing Statistics Anxiety and Enhancing Statistics Learning Achievement: Effectiveness of a One-Minute Strategy. Psychological Reports, 115, 297-310. https://doi.org/10.2466/11.04.PR0.115c12z3

[6] Christie, H., Cree, V. E., Hounsell, J., McCune, V., & Tett, L. (2006). From College to University: Looking Backwards, Looking Forwards. Research in Post-Compulsory Education, 11, 351-365. https://doi.org/10.1080/13596740600916591

[7] Curtis, E., Wikaire, E., Jiang, Y., McMillan, L., Loto, R., Fonua, S., Herbert, R. et al. (2017). Open to Critique: Predictive Effects of Academic Outcomes from a Bridging/Foundation Programme on First-Year Degree-Level Study. Assessment and Evaluation in Higher Education, 42, 151-167. https://doi.org/10.1080/02602938.2015.1087463

[8] Doi, J. (2016). Web Application Teaching Tools for Statistics Using R and Shiny. Technology Innovations in Statistics Education, 9, 1-32. https://doi.org/10.5070/T591027492

[9] Duit, R. (1996). The Constructivist View in Science Education—What It Has to Offer and What Should Not Be Expected. Investigacoes em Ensino de Ciências, 1, 40-75.

[10] Einbinder, S. D. (2014). Reducing Research Anxiety among MSW Students. Journal of Teaching in Social Work, 34, 2-16. https://doi.org/10.1080/08841233.2013.863263

[11] Elliott, M. S., & Elliott, L. M. (2020). Developing R Shiny Web Applications for Extension Education. Applied Economics Teaching Resources, 2, 9-19.

[12] Emmioglu, E., & Capa-Aydin, Y. (2012). Attitudes and Achievement in Statistics: A Meta-Analysis Study. Statistics Education Research Journal, 11, 95-102. https://doi.org/10.52041/serj.v11i2.332

[13] Fernando, F., Patrizia, G., & Tiziana, G. (2020). Online Learning and Emergency Remote Teaching: Opportunities and Challenges in Emergency Situations. Societies (Basel, Switzerland), 10, Article 86. https://doi.org/10.3390/soc10040086

[14] Finch, S., Gordon, I., & Patrick, C. (2021). Taking the aRghhhh Out of Teaching Statistics with R: Using R Markdown. Teaching Statistics, 43, S143-S147. https://doi.org/10.1111/test.12251

[15] González, J. A., López, M., Cobo, E., & Cortés, J. (2018). Assessing Shiny Apps through Student Feedback: Recommendations from a Qualitative Study. Computer Applications in Engineering Education, 26, 1813-1824. https://doi.org/10.1002/cae.21932

[16] Jayalath, C., Wickramasinghe, U., Kottage, H., & Somaratna, G. (2020). Factors Influencing Orderly Transition to Online Deliveries during COVID19 Pandemic Impact. Asian Journal of Education and Social Studies, 9, 10-24. https://doi.org/10.9734/ajess/2020/v9i230242

[17] Johnson, H. D., Dasgupta, N., Zhang, H., & Evans, M. A. (2009). Internet Approach versus Lecture and Lab-Based Approach for Teaching an Introductory Statistical Methods Course: Students’ Opinions. Teaching Statistics, 31, 21-26. https://doi.org/10.1111/j.1467-9639.2009.00335.x

[18] Kearney, S. (2019). Transforming the First-Year Experience through Self and Peer Assessment. Journal of University Teaching and Learning Practice, 16, Article 3. https://doi.org/10.53761/1.16.5.3

[19] Kinkead, K. J., Miller, H., & Hammett, R. (2016). Adult Perceptions of In-Class Collaborative Problem Solving as Mitigation for Statistics Anxiety. The Journal of Continuing Higher Education, 64, 101-111. https://doi.org/10.1080/07377363.2016.1178057

[20] Kratochwill, T. R., Littlefield Cook, J., Travers, J. F., & Elliott, S. N. (1999). Educational Psychology: Effective Teaching, Effective Learning (3rd ed.). McGraw-Hill College.

[21] Kyne, S. H., & Thompson, C. D. (2020). The COVID Cohort: Student Transition to University in the Face of a Global Pandemic. Journal of Chemical Education, 97, 3381-3385. https://doi.org/10.1021/acs.jchemed.0c00769

[22] Larsen, A., Cox, S., Bridge, C., Horvath, D., Emmerling, M., & Abrahams, C. (2021). Short, Multi-Modal, Pre-Commencement Transition Programs for a Diverse STEM Cohort. Journal of University Teaching and Learning Practice, 18, Article 5. https://doi.org/10.53761/1.18.3.5

[23] Law, M. (2021). Student’s Attitude and Satisfaction towards Transformative Learning: A Research Study on Emergency Remote Learning in Tertiary Education. Creative Education, 12, 494-528. https://doi.org/10.4236/ce.2021.123035

[24] Liu, M., & Yuan, R. (2021). Changes in and Effects of Foreign Language Classroom Anxiety and Listening Anxiety on Chinese Undergraduate Students’ English Proficiency in the COVID-19 Context. Frontiers in Psychology, 12, Article ID: 670824. https://doi.org/10.3389/fpsyg.2021.670824

[25] Lowe, H., & Cook, A. (2003). Mind the Gap: Are Students Prepared for Higher Education? Journal of Further and Higher Education, 27, 53-76. https://doi.org/10.1080/03098770305629

[26] McWilliams, R., & Allan, Q. (2014). Embedding Academic Literacy Skills: Towards a Best Practice Model. Journal of University Teaching and Learning Practice, 11, Article 8. https://doi.org/10.53761/1.11.3.8

[27] Meehan, C., & Howells, K. (2018). “What Really Matters to Freshers?”: Evaluation of First Year Student Experience of Transition into University. Journal of Further and Higher Education, 42, 893-907. https://doi.org/10.1080/0309877X.2017.1323194

[28] Mills, J. D., & Raju, D. (2011). Teaching Statistics Online: A Decade’s Review of the Literature about What Works. Journal of Statistics Education, 19, 1-28. https://doi.org/10.1080/10691898.2011.11889613

[29] Mvududu, N. (2005). Constructivism in the Statistics Classroom: From Theory to Practice. Teaching Statistics, 27, 49-54. https://doi.org/10.1111/j.1467-9639.2005.00208.x

[30] Nel, C., Troskie-de Bruin, C., & Bitzer, E. (2009). Students’ Transition from School to University: Possibilities for a Pre-University Intervention. South African Journal of Higher Education, 23, 974-991. https://doi.org/10.4314/sajhe.v23i5.48811

[31] O’Rourke, R. H., Doré, I., Sylvester, B. D., & Sabiston, C. M. (2021). Flourishing or Physical Activity?: Identifying Temporal Precedence in Supporting the Transition to University. Journal of American College Health, 1-6. https://doi.org/10.1080/07448481.2021.1879815

[32] Onwuegbuzie, A. J., & Wilson, V. A. (2003). Statistics Anxiety: Nature, Etiology, Antecedents, Effects, & Treatments—A Comprehensive Review of the Literature. Teaching in Higher Education, 8, 195-209. https://doi.org/10.1080/1356251032000052447

[33] Perlman, B., & McCann, L. I. (1999). The Most Frequently Listed Courses in the Undergraduate Psychology Curriculum. Teaching of Psychology, 26, 177-182. https://doi.org/10.1207/S15328023TOP260303

[34] Rahnaward Ghulami, H., Hamid, M., & Zakaria, R. (2015). Students’ Attitudes towards Learning Statistics. AIP Conference Proceedings, 1660, Article ID: 050035. https://doi.org/10.1063/1.4915668

[35] Rapp-McCall, L. A., & Anyikwa, V. (2016). Active Learning Strategies and Instructor Presence in an Online Research Methods Course: Can We Decrease Anxiety and Enhance Knowledge? Advances in Social Work, 17, 1-14. https://doi.org/10.18060/20871

[36] Roberts, D., & Higgins, T. (1992). Higher Education: The Student Experience—The Findings of a Research Programme into the Views and Experiences of Students in Higher Education. Heist Research.

[37] Schau, C. (2003). Students’ Attitudes: The “Other” Important Outcome in Statistics Education. In 2003 Joint Statistical Meetings—Section on Statistical Education (pp. 3673-3683).

[38] Schau, C., Stevens, J., Dauphinee, T., & Vecchio, A. D. (1995). The Development and Validation of the Survey of Attitudes toward Statistics. Educational and Psychological Measurement, 55, 868-875. https://doi.org/10.1177/0013164495055005022

[39] Schütze, H., Bartyn, J., & Tapsell, A. (2021). Increasing Self-Efficacy to Improve the Transition to University: An Australian Case Study. Journal of Further and Higher Education, 45, 845-856. https://doi.org/10.1080/0309877X.2020.1826034

[40] Sizemore, O. J., & Lewandowski, G. W. (2009). Learning Might Not Equal Liking: Research Methods Course Changes Knowledge but Not Attitudes. Teaching of Psychology, 36, 90-95. https://doi.org/10.1080/00986280902739727

[41] Thomas, L., Hill, M., O’Mahony, J., & Yorke, M. (2017). Supporting Student Success: Strategies for Institutional Change: What Works? Student Retention and Success Programme. Higher Education Academy.

[42] Vanhoof, S. (2010). Statistics Attitudes in University Students: Structure, Stability and Relationship with Achievement. Katholieke Universiteit Leuven.

[43] Williams, I. J., & Williams, K. K. (2018). Using an R Shiny to Enhance the Learning Experience of Confidence Intervals. Teaching Statistics, 40, 24-28. https://doi.org/10.1111/test.12145


Copyright © 2023 by authors and Scientific Research Publishing Inc.

This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.