Social Presence as a Mediator of Cognitive Presence in Blended Learning: Roles of Dialogue Facilitation, Direct Instruction, and Teaching Design
1. Introduction
In recent years, the rapid advancement of educational information technology has moved blended learning from theoretical concept to practical necessity. The Guidelines for College English Teaching (2020 Edition) emphasize that teachers should create diversified teaching environments and implement blended teaching approaches. As widely enrolled general education offerings, elective English courses serve a broad student population. Building high-quality learning communities that transcend temporal and spatial limitations is therefore a critical and urgent challenge.
Garrison and colleagues first proposed the Community of Inquiry (CoI) model in 1999 [1], and it has since become one of the fundamental theories in blended learning scholarship. The model's central claim is simple but powerful: successful learning results from the dynamic interplay of three elements: teaching presence, social presence, and cognitive presence. Although social and cognitive presence form the foundation of critical thinking, both are rooted in effective teaching presence [2]. Teaching presence is the structural foundation of the CoI model [3]; it directly affects how students engage cognitively, form a sense of community, and ultimately learn [4], which makes it critical to the creation of effective learning environments [5]. Previous research consistently shows that teaching presence influences social and cognitive presence [1] [6], and that teaching and social presence are both significant predictors of cognitive presence [5] [7] [8]. However, some crucial questions remain unresolved. Most studies have examined these relationships unidirectionally, focusing either on the effect of teaching presence on social presence or on the effect of social presence on cognitive presence; few have examined all three CoI dimensions simultaneously [5] [9]. Moreover, although previous studies have confirmed that social presence mediates between teaching presence and cognitive presence in online contexts [10] [11], it is unclear whether this mediation holds for specific sub-dimensions of teaching presence. Furthermore, most findings come from fully online environments, with few attempts to replicate them in blended settings, particularly English electives. This study addresses these gaps by examining how the sub-dimensions of teaching presence (teaching design, dialogue facilitation, and direct instruction) affect social and cognitive presence in blended environments. Using the CoI framework, we developed a research model (Figure 1) to investigate these relationships. Our findings offer concrete suggestions for improving blended learning communities and their learning outcomes.
Note: TD = Teaching Design; DF = Dialogue Facilitation; DI = Direct Instruction; TP = Teaching Presence; SP = Social Presence; CP = Cognitive Presence.
Figure 1. The proposed framework.
2. Literature Review
2.1. Sub-Dimensions of Teaching Presence and Social Presence
In the CoI framework, teaching presence is crucial. It refers to the design, facilitation, and direction of cognitive and social processes toward learning outcomes that are personally meaningful and educationally worthwhile [12]. The concept reflects the instructor's role and comprises three components: teaching design, dialogue facilitation, and direct instruction [13]. These components shape how students interact online and engage cognitively, both of which are central to social presence [14]. Social presence, in turn, describes learners' ability to project themselves authentically, both socially and emotionally, in blended learning settings; it encompasses emotional expression, open communication, and group cohesion (Rourke et al., 2001) [15]. Previous studies show that the teaching methods applied can affect how learners perceive social presence [16], and a strong link has been noted between the instructor's role and students' social presence [8]. Instructors' perceptions of social presence also align closely with what students experience, so instructors' predictions about their students' experiences are often quite accurate [17] [18]. Yet little research has examined how the three components of teaching presence (teaching design, dialogue facilitation, and direct instruction) specifically affect social presence, especially in blended learning contexts. To help fill this gap, the following hypotheses are formulated:
Hypothesis 1: Teaching design has a positive and significant influence on social presence.
Hypothesis 2: Dialogue facilitation has a positive and significant influence on social presence.
Hypothesis 3: Direct instruction has a positive and significant influence on social presence.
2.2. Sub-Dimensions of Teaching Presence and Cognitive Presence
Cognitive presence refers to the extent to which learners construct understanding through discussion and reflection within a community of inquiry. It unfolds in four phases: triggering event, exploration, integration, and resolution. First, a problem or task is identified; then information is gathered and examined; next, various perspectives are synthesized; and finally, solutions are developed and applied [12]. Research suggests that teaching presence relates positively to cognitive presence, especially in online and blended learning environments [6] [11]. The three components of teaching presence (teaching design, dialogue facilitation, and direct instruction) play crucial roles in helping learners gain knowledge, build understanding, and develop deep insights [5] [19], and prompt teacher responses to student involvement have been found to improve learning outcomes [14] [20]. However, existing investigations of how aspects of teaching presence influence learning engagement have mostly relied on simple two-variable comparisons that examine isolated relationships. They have not fully considered the interactions among multiple dimensions, particularly in blended learning environments, where how teaching methods shape cognitive processes remains understudied. To address this gap, the following hypotheses are proposed:
Hypothesis 4: Teaching design has a positive and significant impact on cognitive presence.
Hypothesis 5: Dialogue facilitation has a positive and significant impact on cognitive presence.
Hypothesis 6: Direct instruction has a positive and significant impact on cognitive presence.
2.3. The Mediating Role of Social Presence
Many researchers have carried out empirical studies of social presence. These studies indicate important connections between social presence and teaching outcomes, as well as learner capabilities [7] [9] [21]. Social presence not only supports open discussion, interpersonal interaction, and group-based exploratory learning in educational communities, but also acts as a bridge connecting instructional methods with how students think and learn [10] [11] [14]. This bridging role exists because social presence is closely tied to educators' guidance of instruction, particularly in building and overseeing learning groups, while also creating the basis for learners to improve their thinking processes during group-based educational activities. The connecting function of social presence can operate through multiple channels, such as developing supportive learning spaces, defining common academic objectives, promoting teamwork, strengthening group cohesion, and encouraging reflection after class [11] [22] [23]. Although earlier studies confirm social presence's role as a connector between teaching methods and learning engagement, questions remain about exactly how different aspects of teaching practice interact with thinking processes. For this reason, the following hypotheses are proposed:
Hypothesis 7: Social presence mediates the relationship between teaching design and cognitive presence.
Hypothesis 8: Social presence mediates the relationship between dialogue facilitation and cognitive presence.
Hypothesis 9: Social presence mediates the relationship between direct instruction and cognitive presence.
3. Methodology
3.1. Subjects
This study examined students in blended (online plus in-person) learning programs across four universities in Eastern China. A pilot sample of 76 students first completed the survey to verify that it worked properly. The main investigation then surveyed 312 students enrolled in a blended English-Chinese news translation course, a special program run by the researchers. Participants were aged 19 to 22, all with roughly 1.5 to 2 years of blended learning experience. The group included 163 male and 149 female students; third-year students made up about two-thirds (65.7%) and seniors the rest. Launched in fall 2024, the 14-week course comprises 30 hours of classroom time plus 14 hours of online learning activities, so enrolled students must use the online resources to consolidate what they learn through the different teaching approaches.
3.2. Scales
3.2.1. Instruments
Building on previous research, this study used an 11-point rating scale ranging from 0 ("Strongly Disagree") to 10 ("Strongly Agree"), which offers finer granularity than the usual 5- or 7-point scales and reveals clearer patterns in the data. Studies by Leung (2011) and Oulo (2017) have shown that this format captures small but important differences in how students perceive and evaluate themselves, especially in college settings [24] [25]. To capture the different aspects of the learning environment, we used a Chinese version of the Community of Inquiry survey originally developed by Professor Lan's research team. The survey contains 27 items covering three areas: teaching presence (13 items), social presence (5 items), and cognitive presence (9 items). The teaching presence section measures three components (teaching design, 4 items; dialogue facilitation, 6 items; direct instruction, 3 items), which together reflect how well teaching activities are organized. The social presence section assesses how comfortably students express themselves and work with classmates, while the cognitive presence section examines how students analyze and solve demanding problems. Previous uses of this Chinese version have demonstrated its reliability and validity, consistent with Lan's original findings. The full instrument is presented in Table 1.
Table 1. Community of Inquiry scale.

| No. | Items | Source |
| --- | --- | --- |
| **Teaching design** | | |
| 1 | The instructor proficiently communicated the central themes of the course. | Lan et al. (2018a) [26] |
| 2 | The instructor clearly articulated the fundamental objectives of the course. | |
| 3 | The instructor articulated explicit guidelines on how to partake in the educational activities linked to the course content. | |
| 4 | The instructor lucidly outlined the key dates and timelines for study engagements. | |
| **Dialogue facilitation** | | |
| 1 | The instructor assisted me in recognizing points of consensus and divergence on the course topics, thereby enriching my learning journey. | Lan et al. (2018a) [26] |
| 2 | The instructor guided me in understanding the course material by assisting me in clarifying my thinking. | |
| 3 | The facilitator supported me in participating in productive discussions. | |
| 4 | The instructor directed me to stay focused, which promoted my academic advancement. | |
| 5 | The instructor encouraged me to explore new viewpoints and innovative ideas. | |
| 6 | The instructor's interventions cultivated a sense of community in my learning process. | |
| **Direct instruction** | | |
| 1 | The instructor facilitated discussions on pertinent course topics in a manner that enhanced my learning. | Lan et al. (2018a) [26] |
| 2 | The instructor's feedback enabled me to recognize my strengths and areas for improvement. | |
| 3 | The instructor provided timely feedback. | |
| **Social presence** | | |
| 1 | Online communication and digital exchange platforms function as optimal channels for communities to interact with each other. | Lan et al. (2018a) [26] |
| 2 | I find it comfortable to communicate through online platforms. | |
| 3 | I feel at ease when collaborating in discussions. | |
| 4 | I am comfortable during interactions with peers. | |
| 5 | Even when there are discrepancies with other course participants, I maintain a sense of trust with ease. | |
| **Cognitive presence** | | |
| 1 | The instructor's questions have sparked my interest in the course. | Lan et al. (2018a) [26] |
| 2 | Engaging in course activities has ignited my passion for exploration. | |
| 3 | I am motivated to investigate issues related to the course content more deeply. | |
| 4 | Brainstorming and researching relevant information have assisted me in addressing problems associated with the course content. | |
| 5 | Engaging in online forums has significantly broadened my comprehension of diverse viewpoints and insights. | |
| 6 | Effectively incorporating new information enables me to tackle questions that emerge from course tasks. | |
| 7 | The course assignments facilitate the development of analytical approaches and problem-solving techniques. | |
| 8 | Essential to my understanding of the course's core concepts has been the reflection on both the course materials and the discussions. | |
| 9 | I have discovered that the knowledge gained from this course can be effectively applied to my professional work or other related activities. | |
3.2.2. Pilot Test
In the pilot test, exploratory factor analysis was conducted using SPSS 26.0 software to optimize the reliability and validity of the questionnaire and to eliminate redundant or ineffective items. The specific criteria included: cumulative explained variance of no less than 50%, a p-value less than 0.050 for Bartlett’s Test of Sphericity, a Kaiser-Meyer-Olkin (KMO) measure greater than 0.600, factor loadings of no less than 0.300, and eigenvalues of no less than 1.00 [27]-[29]. Additionally, any items with a Cronbach’s alpha coefficient below 0.700 after analysis were removed [30]. Following this process, preliminary results indicated that the questionnaire exhibited strong reliability and validity, with all retained items meeting the aforementioned standards. Specific results for each scale are detailed in Table 2.
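For readers who wish to reproduce this screening step, the sketch below shows how the same criteria could be checked in Python with the open-source factor_analyzer package. The file name, column layout, and rotation choice are illustrative assumptions, not details reported by the study.

```python
# Illustrative sketch of the pilot screening criteria; not the authors' script.
# Assumes a hypothetical file "pilot_items.csv" holding one subscale's items.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

items = pd.read_csv("pilot_items.csv")

# Sampling adequacy and sphericity checks.
chi2, bartlett_p = calculate_bartlett_sphericity(items)
_, kmo_model = calculate_kmo(items)
print(f"Bartlett p = {bartlett_p:.3f} (criterion: < 0.050)")
print(f"KMO = {kmo_model:.3f} (criterion: > 0.600)")

# Eigenvalues come from the item correlation matrix; keep factors >= 1.00.
fa = FactorAnalyzer(n_factors=1, rotation=None)
fa.fit(items)
eigenvalues, _ = fa.get_eigenvalues()
n_factors = int((eigenvalues >= 1.00).sum())

# Re-fit and inspect loadings (>= 0.300) and cumulative variance (>= 50%).
fa = FactorAnalyzer(n_factors=n_factors, rotation="varimax")
fa.fit(items)
print("Loadings:\n", np.round(fa.loadings_, 3))
_, _, cumulative = fa.get_factor_variance()
print(f"Cumulative variance explained = {cumulative[-1]:.1%}")

# Cronbach's alpha (criterion: >= 0.700), from the classical formula.
k = items.shape[1]
alpha = k / (k - 1) * (1 - items.var(ddof=1).sum() / items.sum(axis=1).var(ddof=1))
print(f"Cronbach's alpha = {alpha:.3f}")
```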
Table 2. Results of pilot study.

| Scale | Cronbach's alpha | KMO | Bartlett's test of sphericity (p) | Cumulative variance explained | Smallest item communality | Eigenvalue |
| --- | --- | --- | --- | --- | --- | --- |
| Teaching Design | 0.866 | 0.783 | 0.000 | 72% | 0.464 | ≥1.00 |
| Dialogue Facilitation | 0.916 | 0.904 | 0.000 | 71% | 0.628 | ≥1.00 |
| Direct Instruction | 0.842 | 0.710 | 0.000 | 76% | 0.733 | ≥1.00 |
| Social Presence | 0.779 | 0.737 | 0.000 | 54% | 0.392 | ≥1.00 |
| Cognitive Presence | 0.866 | 0.899 | 0.000 | 61% | 0.367 | ≥1.00 |
3.3. Data Collection and Analysis
In December 2024, the survey was distributed to all students via the Wenjuanxing platform, preceded by detailed explanations from instructors. Of the 333 responses, 312 valid questionnaires were retained after filtering out invalid entries, a validity rate of 93.7%. Following Arbaugh (2007) and Garrison et al. (2010), the study controlled for participants' age and prior blended learning experience so that these factors would not confound the measurement of variations in cognitive presence [10] [31].
To address common method variance, several precautionary steps were taken. Because all constructs were measured with self-report questionnaires, the survey was administered anonymously, the items were presented in randomized order, and their purposes were partially concealed. In addition, Harman's single-factor test was used to gauge the severity of any shared method bias [32]: in an unrotated factor analysis of all 25 retained items, the first factor accounted for 45.914% of the variance, below the 50% threshold commonly used by researchers [33] [34]. Together with the reasonable measurement accuracy shown in Table 3, this suggests that common method bias did not dominate the study's outcomes, although some minor influence cannot be ruled out.
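As an illustration of what Harman's single-factor test involves, the following minimal sketch extracts one unrotated factor and reports its share of variance; the data file and item layout are assumed, not taken from the study.

```python
# Illustrative Harman's single-factor test; file name and layout are assumptions.
import pandas as pd
from factor_analyzer import FactorAnalyzer

items = pd.read_csv("all_items.csv")   # hypothetical file with all retained items

fa = FactorAnalyzer(n_factors=1, rotation=None)   # single unrotated factor
fa.fit(items)
_, proportion, _ = fa.get_factor_variance()
print(f"First factor explains {proportion[0]:.1%} of the variance "
      "(criterion: below 50%)")
```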
For data analysis, partial least squares structural equation modeling (PLS-SEM) was selected for several reasons: (a) its suitability for exploratory theory building; (b) its appropriateness for predictive analysis; (c) its effectiveness for complex predictive models with multiple variables; and (d) its ability to handle relationships between variables in datasets with non-normal distributions, since it imposes no strict assumptions about normality or randomness. The current study meets these conditions. Measurement and structural models were evaluated with the PLS approach in SmartPLS 4.0. Hypotheses were tested with the standard PLS algorithm, and the statistical significance of the estimated parameters was assessed with a bootstrap procedure of 5000 resamples, following Hair et al. (2011) [34]. This process verifies how well the theoretical framework fits the observed data while accounting for sampling variability.
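To make the bootstrap logic concrete, here is a conceptual sketch of how a single standardized path could be bootstrapped with 5000 resamples. SmartPLS re-estimates the full PLS model in every resample, so this two-variable reduction and the simulated scores are assumptions made purely for illustration.

```python
# Conceptual sketch of the bootstrap significance test, reduced to a single
# standardized path between two construct scores.
import numpy as np

rng = np.random.default_rng(42)

def bootstrap_path(x, y, n_boot=5000):
    beta_hat = np.corrcoef(x, y)[0, 1]           # standardized coefficient
    n = len(x)
    boots = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, n)              # resample cases with replacement
        boots[b] = np.corrcoef(x[idx], y[idx])[0, 1]
    t_stat = beta_hat / boots.std(ddof=1)        # bootstrap t-value
    ci_low, ci_high = np.percentile(boots, [2.5, 97.5])
    return beta_hat, t_stat, (ci_low, ci_high)

# Toy usage with simulated construct scores (n = 312, matching the sample size).
n = 312
tp = rng.normal(size=n)                          # e.g., a teaching presence score
sp = 0.6 * tp + rng.normal(scale=0.8, size=n)    # e.g., a social presence score
beta, t, ci = bootstrap_path(tp, sp)
print(f"beta = {beta:.3f}, t = {t:.2f}, 95% CI = [{ci[0]:.3f}, {ci[1]:.3f}]")
```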
4. Results
4.1. Evaluation of the Measurement Model
This research used a two-stage method, following Hair et al. (2017) [35]. The first stage assessed convergent validity and reliability. Convergence was considered acceptable when factor loadings for all variables reached 0.700 or higher (Hair et al., 2014) [36]; items with loadings below 0.700 were considered for removal only if their deletion improved overall reliability, while items with loadings below 0.400 were always eliminated (Hair et al., 2017) [35]. In addition, composite reliability had to exceed the 0.700 threshold recommended by Gefen et al. (2000) [37], and, per Fornell and Larcker (1981), the average variance extracted had to exceed 0.500 [38]. After the selective removal of items with loadings below 0.700, the refined model met all of these criteria, as shown in Table 3 and Figure 2.
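For reference, the two reliability statistics reported in Table 3 have standard closed-form definitions. This small sketch computes them from a set of hypothetical standardized loadings; the numbers are illustrative, not the study's estimates.

```python
# Standard definitions behind the Table 3 statistics (illustrative loadings).
import numpy as np

def composite_reliability(loadings):
    # CR = (sum of loadings)^2 / [(sum of loadings)^2 + sum of error variances]
    lam = np.asarray(loadings)
    return lam.sum() ** 2 / (lam.sum() ** 2 + (1 - lam ** 2).sum())

def average_variance_extracted(loadings):
    # AVE = mean of squared standardized loadings
    lam = np.asarray(loadings)
    return (lam ** 2).mean()

example_loadings = [0.78, 0.82, 0.85, 0.81]   # hypothetical standardized loadings
print(f"CR  = {composite_reliability(example_loadings):.3f} (criterion: > 0.700)")
print(f"AVE = {average_variance_extracted(example_loadings):.3f} (criterion: > 0.500)")
```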
Table 3. Evaluation of the measurement model.

| Variable | Cronbach's alpha | rho_A | Composite reliability | Average variance extracted (AVE) |
| --- | --- | --- | --- | --- |
| Teaching Design | 0.832 | 0.843 | 0.887 | 0.664 |
| Dialogue Facilitation | 0.849 | 0.852 | 0.888 | 0.570 |
| Direct Instruction | 0.801 | 0.802 | 0.883 | 0.716 |
| Social Presence | 0.812 | 0.815 | 0.869 | 0.571 |
| Cognitive Presence | 0.893 | 0.897 | 0.916 | 0.609 |
Figure 2. Partial least squares path model with path coefficients and R² values (n = 312).
4.2. Evaluation of Discriminant Validity
The analysis then assessed discriminant validity using the heterotrait-monotrait (HTMT) ratio [39]. Following Kline's (2011) guidelines, discriminant validity is established when HTMT values stay below 0.900 [40]. The results ranged from 0.680 to 0.895, well under that cutoff, indicating that the constructs are empirically distinct (see Table 4). This check confirmed that the model is both accurate and trustworthy for research purposes.
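For readers unfamiliar with the HTMT statistic, the following sketch computes it from an item-level correlation matrix: the mean of the between-construct item correlations is divided by the geometric mean of the within-construct averages. The simulated data and construct sizes are assumptions for illustration only.

```python
# Illustrative HTMT computation from an item correlation matrix
# (simplified version of the Henseler et al. formula; inputs are simulated).
import numpy as np

def htmt(corr, idx_a, idx_b):
    """corr: item correlation matrix; idx_a/idx_b: item indices of two constructs."""
    corr = np.asarray(corr)
    hetero = corr[np.ix_(idx_a, idx_b)].mean()     # between-construct correlations

    def mono(idx):
        block = corr[np.ix_(idx, idx)]
        return block[np.triu_indices(len(idx), k=1)].mean()  # within-construct

    return hetero / np.sqrt(mono(idx_a) * mono(idx_b))

# Toy example: items 0-2 load on one construct, items 3-5 on a correlated one.
rng = np.random.default_rng(0)
f1 = rng.normal(size=500)
f2 = 0.5 * f1 + rng.normal(scale=0.9, size=500)
items = np.column_stack([f1, f1, f1, f2, f2, f2]) + rng.normal(scale=0.7, size=(500, 6))
r = np.corrcoef(items, rowvar=False)
print(f"HTMT = {htmt(r, [0, 1, 2], [3, 4, 5]):.3f} (criterion: < 0.900)")
```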
Table 4. Heterotrait-monotrait ratio results.

| | CP | DI | DF | SP | TD |
| --- | --- | --- | --- | --- | --- |
| CP | | | | | |
| DI | 0.790 | | | | |
| DF | 0.843 | 0.838 | | | |
| SP | 0.865 | 0.739 | 0.888 | | |
| TD | 0.680 | 0.895 | 0.747 | 0.707 | |
4.3. Evaluation of the Structural Model
The study first checked for collinearity in the structural model, and all predictor constructs met acceptable standards: variance inflation factors ranged from 1.486 to 2.710, above the baseline of 1.000 but safely below the critical cutoff of 5.000 (Hair et al., 2017) [35]. This indicates that multicollinearity was not a problem. In addition, all dimension weights exceeded the recommended 0.100 minimum, as shown in Figure 2 [35].
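As a minimal illustration of this collinearity check, the sketch below computes variance inflation factors for three correlated predictor scores with statsmodels; the simulated scores stand in for the study's construct scores, which are not public.

```python
# Illustrative VIF check for three predictor constructs (simulated scores).
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(1)
n = 312
td = rng.normal(size=n)
scores = pd.DataFrame({
    "TD": td,
    "DF": 0.6 * td + rng.normal(scale=0.8, size=n),   # deliberately correlated
    "DI": 0.5 * td + rng.normal(scale=0.9, size=n),
})

X = sm.add_constant(scores).to_numpy()
for i, name in enumerate(scores.columns, start=1):    # skip the constant column
    print(f"VIF({name}) = {variance_inflation_factor(X, i):.3f} (criterion: < 5.000)")
```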
Following this, a bootstrap procedure with 5000 resamples was applied, per Hair et al. (2017), to estimate the beta coefficients, p values, t values, and the corresponding bootstrap confidence intervals [35]. The analysis adopted the critical values for a one-tailed t-test: 1.645 for p < 0.05, 2.327 for p < 0.01, and 3.092 for p < 0.001 (Hair et al., 2017) [35]. The bootstrapping results, reported in Table 5 and Figure 2, supported hypotheses H1, H2, H5, H6, H7, and H8, while H3, H4, and H9 were not supported.
4.4. Coefficient of Determination and Predictive Relevance
The coefficient of determination (R²) indicates how accurately a model predicts, by measuring how closely predicted values match actual results [41]. R² ranges from 0 to 1, with higher values meaning better prediction. According to Hair's benchmarks (2016), 0.250 indicates weak explanatory power, 0.500 moderate, and 0.750 strong. This study obtained R² values of 0.584 for social presence and 0.662 for cognitive presence (see Figure 2), indicating solid, moderate-level explanatory power.
Table 5. Evaluation of structural model (n = 312).

| Path | Standardized path coefficient | Sample mean (M) | Standard deviation | t-statistic | p-value | Finding |
| --- | --- | --- | --- | --- | --- | --- |
| TD -> SP | 0.170*** | 0.170 | 0.052 | 3.305 | 0.000 | H1 supported |
| DF -> SP | 0.588*** | 0.590 | 0.045 | 13.172 | 0.000 | H2 supported |
| DI -> SP | 0.072 | 0.072 | 0.067 | 1.075 | 0.141 | H3 not supported |
| TD -> CP | 0.030 | 0.030 | 0.049 | 0.608 | 0.272 | H4 not supported |
| DF -> CP | 0.281*** | 0.281 | 0.045 | 6.180 | 0.000 | H5 supported |
| DI -> CP | 0.225*** | 0.224 | 0.050 | 4.491 | 0.000 | H6 supported |
| TD -> SP -> CP | 0.065** | 0.064 | 0.022 | 3.005 | 0.001 | H7 supported |
| DF -> SP -> CP | 0.225*** | 0.224 | 0.037 | 6.029 | 0.000 | H8 supported |
| DI -> SP -> CP | 0.027 | 0.027 | 0.026 | 1.053 | 0.146 | H9 not supported |

*p < 0.05, t > 1.645; **p < 0.01, t > 2.327; ***p < 0.001, t > 3.092 (one-tailed).
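To clarify how the indirect paths in Table 5 are evaluated, the sketch below bootstraps a product-of-paths indirect effect on simulated scores. The regression-based estimation is a simplification of the full PLS procedure, and all variable names and numbers are illustrative assumptions.

```python
# Sketch of testing an indirect path such as TD -> SP -> CP: the statistic is
# the product of the a-path (SP on TD) and b-path (CP on SP, controlling for
# TD), bootstrapped over cases. All scores are simulated for illustration.
import numpy as np

rng = np.random.default_rng(7)
n = 312
td = rng.normal(size=n)
sp = 0.4 * td + rng.normal(scale=0.9, size=n)
cp = 0.5 * sp + 0.1 * td + rng.normal(scale=0.8, size=n)

def indirect_effect(td, sp, cp):
    a = np.polyfit(td, sp, 1)[0]                        # a-path slope
    X = np.column_stack([np.ones_like(td), sp, td])
    b = np.linalg.lstsq(X, cp, rcond=None)[0][1]        # b-path slope
    return a * b

boots = np.empty(5000)
for i in range(5000):
    idx = rng.integers(0, n, n)                         # resample with replacement
    boots[i] = indirect_effect(td[idx], sp[idx], cp[idx])

ci_low, ci_high = np.percentile(boots, [2.5, 97.5])
print(f"indirect = {indirect_effect(td, sp, cp):.3f}, "
      f"95% CI = [{ci_low:.3f}, {ci_high:.3f}]")        # significant if CI excludes 0
```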
The Q² statistic, first proposed by Stone (1974) and later emphasized by Henseler and Fassott (2009), is a key measure of a model's predictive relevance [42] [43]. In PLS analysis it shows how well the model's variables predict outcomes: positive Q² values indicate predictive relevance, with values above 0.35 considered strong, above 0.15 moderate, and above 0.02 weak [44]. The Q² values in this study (social presence = 0.323, cognitive presence = 0.395) demonstrate solid predictive capability, approaching the strong benchmark.
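To convey the intuition behind Q², the following simplified sketch compares held-out prediction errors against a mean-only baseline. It is a k-fold analogue of the blindfolding procedure SmartPLS applies to indicators, run on simulated construct scores, and is not the study's computation.

```python
# Simplified illustration of the Stone-Geisser Q² logic: compare squared errors
# of held-out predictions (SSE) with errors of a trivial mean prediction (SSO).
import numpy as np

rng = np.random.default_rng(3)
n = 312
x = rng.normal(size=n)                         # exogenous construct score
y = 0.7 * x + rng.normal(scale=0.7, size=n)    # endogenous construct score

sse = sso = 0.0
for test in np.array_split(rng.permutation(n), 7):   # omission distance ~7
    train = np.setdiff1d(np.arange(n), test)
    slope, intercept = np.polyfit(x[train], y[train], 1)
    pred = slope * x[test] + intercept
    sse += ((y[test] - pred) ** 2).sum()
    sso += ((y[test] - y[train].mean()) ** 2).sum()

print(f"Q² = {1 - sse / sso:.3f} (>0.02 weak, >0.15 moderate, >0.35 strong)")
```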
5. Discussion
In higher education English programs, blended learning has become a prominent topic, prompting researchers to investigate how mixed online and classroom environments shape student outcomes: what makes these learning communities work, and how their different elements affect academic performance. The Community of Inquiry framework offers a useful way to break this down into three key pieces: how teachers design and guide the course (teaching presence), how students interact (social presence), and how deep thinking happens (cognitive presence). Together, these elements create the full learning experience by combining thoughtful course design, peer connection, and genuine intellectual engagement. While existing research demonstrates that teaching presence significantly affects both the social and cognitive aspects of learning [2] [5] [6] [11], there is still too little research unpacking exactly how specific teaching components affect the other elements of the learning experience. Previous studies [5] [9] have begun examining how particular teaching methods influence classroom dynamics, and how these social factors in turn shape students' thinking processes. The problem is that most of this work has relied on simple one-factor correlation analyses, missing the bigger picture of how these elements interact within the Community of Inquiry framework, especially the indirect effects that may quietly shape the learning process. To bridge this gap, the current study uses the Community of Inquiry framework to examine how specific teaching approaches influence both social dynamics and critical thinking in blended learning environments, mapping how different instructional methods shape classroom interactions and ultimately affect learning outcomes. In practical terms, this research aims to show educators which teaching strategies boost collaboration and learning, giving them concrete insights for improving their courses.
First, this study examined how the different teaching components affect social presence in blended learning using structural modeling. The findings reveal that both teaching design and dialogue facilitation boost students' social engagement, with dialogue facilitation showing a much stronger effect (0.588) than teaching design (0.170; see Figure 2). These results align with prior research [44] [45] while extending its relevance from fully online classes to today's hybrid classrooms. Put simply, in blended learning environments, actively encouraging student dialogue does far more to build meaningful peer connections than perfecting syllabus design alone; teachers should prioritize sparking peer-to-peer discussion if they want to strengthen teamwork and engagement in blended courses. Direct instruction, the third teaching component, shows a positive but statistically non-significant link to social presence (effect size 0.072). This mirrors Costley's (2015) finding that direct teaching styles do little to strengthen classroom connections [46], and Alemdag's (2022) study of instructional videos, which likewise did little to boost social ties [47]. The likely reason is that real social engagement thrives on back-and-forth dialogue, collaborative exploration, and mutual interaction in learning communities [11] [12]. Direct teaching delivers content efficiently, but it is essentially one-way: good for information transfer, poor at creating the two-way exchanges that build lasting student relationships.
The study also examined how teaching methods affect students' cognitive presence in blended learning. Dialogue facilitation makes the biggest difference (path coefficient 0.281), followed by direct instruction (0.225), while teaching design has little effect (0.030). In other words, lively discussion and clear explanation noticeably deepen thinking, whereas how the instructor structures course materials matters far less, findings that partly align with earlier research by Bai & Gu (2021) and others [4] [48]. Several practical considerations explain this pattern. Course design occurs mainly before classes commence [5]: educators develop materials and structure activities in advance, whereas dialogue facilitation and instructional delivery evolve throughout the course. During active learning phases, instructors provide ongoing student support and just-in-time guidance [49]. While curricular frameworks remain relatively stable, teachers dynamically adjust their discussion techniques and instructional approaches in response to emerging student needs and changing classroom contexts. This adaptability explains why real-time pedagogical interactions consistently outperform static course materials in fostering higher-order cognitive engagement. The results highlight classroom discussion as the foundation for developing critical thinking: teachers act as discussion leaders whose main job is to spark meaningful exchanges among students, and through ongoing dialogue they help learners build knowledge together. Direct instruction draws on educators' expertise to deliver specialized knowledge and methodological guidance, which proves equally vital for cognitive growth, a finding supported by Zhu et al. (2019) [50]. From the curriculum perspective, students typically focus on absorbing new information and applying it to course-related problems during class; they rarely progress to applying knowledge in new contexts, because most course designs neither demand advanced critical thinking nor create space for deeper exploration. This limitation explains why course structure shows weaker ties to cognitive development [5]. When class materials only test immediate application without stretching students' thinking, intellectual growth plateaus, which makes live teaching moments throughout the course far more influential than pre-planned structural elements.
The study also explored how social presence bridges teaching methods and critical thinking. The analysis revealed that peer connection serves as a vital link, connecting both course design and class discussion to cognitive growth, yet it fails to mediate the relationship between direct instruction and thinking skills. This aligns with existing research on indirect connections [14]: how a course is structured influences thinking through classroom relationships rather than directly. While confirming social presence's known mediating role, these findings shed new light on the interplay between teaching approaches, peer dynamics, and cognitive development, addressing gaps in Garrison's foundational 2010 framework. The study uncovered three distinct mechanisms: 1) teaching design boosts thinking skills only through improved peer interaction; 2) dialogue facilitation enhances learning both directly and by strengthening student connections; and 3) direct instruction works on its own. This implies a clear effectiveness ranking: facilitating discussion matters most, followed by explicit teaching, with course structure third. These findings advance blended learning theory by showing how different teaching approaches contribute to deeper learning through collaborative knowledge building. For teachers, they provide concrete evidence for optimizing course delivery: by understanding which teaching elements work through peer interaction and which work through direct delivery, educators can design blended learning experiences more deliberately. Ultimately, this research expands our grasp of classroom dynamics while giving teachers practical strategies, like those discussed by Jia & Li (2020), for intentionally boosting students' critical thinking through targeted instructional choices [14].
The study shows that in university-level English courses, the three teaching components (teaching design, dialogue facilitation, and direct instruction) can significantly boost both deep learning and classroom community. To achieve this, educators should: 1) design courses like "architects," creating materials and discussion topics that spark critical thinking; and 2) build interactive support systems through activities that foster real collaboration, helping students work together effectively while engaging deeply with the material. To enhance classroom discussion, instructors should act as "co-editors," presenting students with real-world scenarios that challenge them to explore and synthesize practical solutions; exposure to authentic problem-solving contexts helps bridge the gap between theory and practice [51]. When giving feedback, educators become "navigators," guiding self-reflection and peer discussion; through careful questioning and interpretation, they empower students to connect ideas and expand their thinking. Where needed, teachers should provide clear explanations and extra resources, using structured support to help students overcome learning hurdles. This approach helps learners connect classroom concepts to real-world problems, making their education more meaningful; in short, teachers become the bridge between textbook knowledge and everyday application [5].
While blended learning has become common in university English courses, there is still room for improvement [52]. This study used structural equation modeling to examine how teaching methods affect both classroom connection and learning outcomes, pinpointing what works. The findings show that dialogue facilitation makes the biggest difference, boosting both peer relationships and understanding directly. Direct instruction comes next: it shapes how students process information but does not rely on peer interaction the way discussion-based methods do. Teaching design, finally, affects learning outcomes only through social connection. These findings align with previous research emphasizing the importance of teaching strategies [5] [9]. The current literature offers little work that combines teaching methods, social engagement, and learning processes in hybrid education under one framework; this study therefore proposes an integrated model tying these elements together, building on known relationships. Based on the results, we suggest extending this work to other subjects and educational settings to better understand these dynamics.
6. Limitations and Future Directions
This study acknowledges limitations in three key domains. Methodologically, reliance on cross-sectional questionnaire data constrains causal inference; while structural equation modeling captures complex patterns, future work should adopt cross-lagged designs or intervention experiments (e.g., randomized controlled field trials) to strengthen causal claims. Geographically, the Eastern China sampling may reflect regional digitalization policies; comparative studies across central and western institutions or cross-border cases (e.g., differences within the Guangdong-Hong Kong-Macao Greater Bay Area) could test the framework's cultural adaptability. Methodological expansion should also integrate mixed-methods approaches: classroom video analysis could capture micro-interactions of teaching presence, while learning journals might trace cognitive presence trajectories; such triangulation would enhance the ecological validity of social presence measurement. Theoretically, decomposing social presence (e.g., emotional support vs. task collaboration) and the subdimensions of cognitive presence could refine the mediation pathways. Discipline-specific mechanisms, such as the negotiation of technical terms in translation courses, may generate unique moderating effects requiring tailored measurement tools.
Acknowledgements
Funding for this research was provided through several grants. Notably, it included support from the 2025 Zhejiang Provincial Educational Science Planning Project (Project No. 2025SCG401), and the First-class Undergraduate Course initiative in Zhejiang Province for the courses "A Survey of Major English-speaking Countries" (Project No. 689, 2019) and "English Newspaper Reading I" (Project No. 507, 2020). Additionally, the project benefited from the Research Project on the Data-Driven Online-Offline Blended Teaching Model for ESP Courses (Project No. SXSJG202302).
Conflicts of Interest
The author declares no conflicts of interest.