Extending the Community of Inquiry Framework: Exploring the Roles of Agentic Engagement in Blended College English Learning
1. Introduction
Recent advances in educational technology have made blended learning a necessity rather than an option. The 2020 College English Teaching Guidelines require instructors to actively create diverse learning environments using blended models. Yet as a core required course reaching millions of students nationwide, college English still struggles to build high-quality learning communities that transcend constraints of time and space, a critical issue noted by Jia and Gao (2023).
The Community of Inquiry framework, a cornerstone theory for blended learning in higher education, holds that effective teaching requires three key elements working together: teacher guidance, peer connections, and critical thinking (Garrison et al., 1999). Researchers have noted a gap, however: while the model explains how classroom environments and social dynamics support advanced learning, it overlooks how individual students' motivations and behaviors shape their blended learning experience (Shea & Bidjerano, 2010; Lan et al., 2018a). Research shows that students' active participation in English learning remains moderate at best (Gao & He, 2020; Li, 2021), and problem behaviors such as cutting corners, becoming distracted, shifting goals, losing focus, and even cheating (Gong et al., 2018; Yang & Dai, 2021; Aini & Ciptaningrum, 2024) are key contributors to poor learning outcomes. This raises a crucial question: does agentic engagement actually drive success in blended English courses?
Agentic engagement focuses on the positive aspects of student learning, referring to learners' proactive or constructive contributions to their education during the learning process. It involves personalized participation intended to enhance instructional conditions and improve the learning environment (Reeve & Tseng, 2011; Reeve, 2013). Theoretically, mastering English is a long-term endeavor in which teachers must connect vocabulary, content, and real-world applications, making student agency crucial for lasting progress (Guo & Li, 2018) and a key predictor of success (Montenegro, 2017). Research shows that students' active participation is strongly connected to teacher support, peer interaction, and English learning outcomes, and even helps explain how teacher involvement boosts achievement (Jiang & Zhang, 2021; Kucuk & Richardson, 2019; Ye, 2023; Shi, 2023; Littler, 2024). However, these findings come mostly from high school classrooms or online college courses, leaving open questions about blended English learning. This study therefore aims to: (a) propose treating agentic engagement as its own key element in the Community of Inquiry framework, and (b) test how it interacts with the model's traditional three components in blended settings.
2. Literature Review
2.1. Community of Inquiry Framework
The Community of Inquiry framework, developed by Canadian researcher Garrison and colleagues, examines how teaching presence, social presence, and cognitive presence work together to develop critical thinking skills (Garrison et al., 1999). Here, "teaching presence" refers to how students perceive instructional activities during their learning process and includes three key elements: designing and organizing lessons, guiding discussions, and providing direct instruction. Essentially, it highlights the teacher's role in leading and supporting learning within this framework. Social presence captures how students connect and build trust within their learning community through interaction; it comprises three key aspects: emotional expression (sharing feelings), open communication (honest dialogue), and group cohesion (working together as a team). Cognitive presence represents students' meaning-making process through ongoing reflection and discussion, unfolding in four stages: triggering curiosity, exploring ideas, connecting concepts, and reaching solutions.
The Community of Inquiry framework emerged from blended learning experiences (Garrison et al., 1999). Early research mainly focused on either developing theoretical models (Shen & Sheng, 2015; Qiao, 2017) or testing how these models worked in practice (Lu et al., 2018; Wang & Liu, 2019). What is missing are thorough, data-driven studies examining blended learning through multiple lenses, especially for college English courses (Wu, 2017; Lan et al., 2020), where this approach matters most (He & Huang, 2023; Jia & Gao, 2023). Moreover, while teaching presence, social presence, and cognitive presence are recognized as key elements in constructing blended learning environments, the Community of Inquiry framework has not adequately addressed how individual learners' agency influences the learning process. To address this gap, the study brings student agency into the model, specifically examining blended college English courses. Using real classroom data, the study reveals how learners actively contribute to building effective blended learning communities.
2.2. Agentic Engagement
Student engagement measures how connected learners feel to their school’s people, activities, and values (Skinner et al., 2009). It works through three interconnected channels: emotional engagement (their positive/negative feelings during learning), cognitive engagement (how strategically they approach studying), and behavioral engagement (visible effort and commitment to academic work). Essentially, it captures whether students are psychologically invested and actively participating in their education (Skinner et al., 2009).
Building on the three traditional engagement types, agentic engagement adds a crucial fourth dimension: students taking charge of their learning through initiative and independence (Montenegro, 2017). This means learners proactively shape their education by improving lessons and classroom environments through personal involvement (Reeve & Tseng, 2011; Reeve, 2013; Guo & Li, 2018). At its core, it is about students becoming co-creators of their learning experience by voicing preferences, asking questions, and communicating needs to build better classrooms together with teachers (Wakefield, 2016). Reeve's (2013) study of Korean classrooms revealed a key insight: student engagement does not simply follow the teacher's lead but emerges through dynamic back-and-forth interactions between learners and instructors (Reeve, 2013). This work identified four engagement styles: self-driven initiative ("I'll tackle this myself"), teacher-supported learning ("Guide me through this"), collaborative co-creation ("Let's work on this together"), and peer-assisted growth ("We'll help each other learn"), each playing a distinct role in effective classrooms (Guo & Li, 2018).
2.3. Agentic Engagement and the Community of Inquiry Framework
Recent studies highlight how quality learning hinges on three key elements: effective teaching, dynamic peer interactions, and students' prior knowledge (Lan et al., 2020). However, current frameworks often overlook other crucial factors, such as learning strategies and beneficial study habits that do not involve direct teaching, especially in diverse global classrooms where these non-instructional elements significantly boost outcomes (Lan et al., 2020; Stenbom et al., 2016). Research shows that self-regulated learning, in which students actively manage their study approaches (Hu & Kuh, 2002; Richardson & Long, 2003; Richardson & Newby, 2006), outperforms traditional teaching methods in long-term knowledge retention (Dixon, 2015) while also enhancing other valuable learning behaviors (Ma et al., 2015). Because blended college English courses demand high student involvement, active participation becomes crucial (Al-Samarraie & Saeed, 2018; Qian, 2022; Tay, 2016). This study therefore aims to expand the Community of Inquiry framework to include students' agentic engagement as a key ingredient in blended learning. By weaving this element into the model, we aim to uncover how individual initiative contributes to successful learning communities, ultimately helping create higher-quality blended classrooms.
At the heart of blended learning, teaching presence captures how instructors shape the experience through three key actions: designing courses, guiding discussions, and delivering direct instruction. This core element of the Community of Inquiry framework (Garrison & Akyol, 2013) consistently boosts students' critical thinking, sense of belonging, and perceived learning outcomes (Garrison & Arbaugh, 2007). Recent English language studies confirm that teachers' investment, instructional style, and classroom discourse directly fuel student initiative, with research by Ye (2023), Jiang and Zhang (2021), and Min (2022) all showing strong positive connections. Strong course design, skilled discussion facilitation, and clear instruction all help students take charge of their learning (Garrison & Cleveland-Innes, 2005; Krpanec et al., 2024; Yang, 2023; Liu & Guo, 2021). However, most of this evidence comes from high school English classes, leaving a gap in understanding how college blended English courses affect student initiative. The first hypothesis (H1; see Figure 1) therefore states that teaching presence positively predicts students' agentic engagement.
Within the Community of Inquiry framework, teaching presence is enacted through designing and organizing instruction, facilitating discourse, and providing direct instruction, and several studies have shown that these elements directly influence teaching outcomes and how learners handle course material (Arbaugh & Benbunan-Fich, 2006; Garrison & Arbaugh, 2007; Sun & Yang, 2023; Miao & Ma, 2022; Miao et al., 2022). Social presence, in turn, rests on three components: building genuine emotional relationships, fostering mutually respectful and open interaction, and creating group cohesion around shared goals and collaborative support within the learning community (Garrison et al., 1999; Garrison & Akyol, 2013). In short, expressing true feelings, promoting open communication, and creating strong team cohesion all depend on establishing personal relationships between teachers and students (Garrison & Arbaugh, 2007). Research has shown that students who participate more proactively tend to establish stronger social relationships and perform better in blended learning environments; however, existing research has not thoroughly investigated how agentic student participation shapes peer dynamics in blended learning environments (Reeve & Jang, 2022; Gao & He, 2020). Based on the discussion above, the study proposes that students' agentic engagement positively predicts their social presence (H2; see Figure 1).
Rooted in Dewey’s inquiry-based learning model, cognitive presence measures how deeply students engage in meaningful discussion, critical thinking, and knowledge building (Garrison et al., 1999, 2001). It is the core of the learning experience, what Vaughan and Garrison (2005) and Kucuk and Richardson (2019) call “the heart of successful education.” Blended learning, however, demands more student investment than traditional classes, requiring greater self-direction to manage time and attention effectively (Wu & Chen, 2017). Whereas traditional lectures let students passively receive information, blended environments require learners to actively shape their understanding through consistent participation. Studies confirm that when students take charge of their learning (agentic engagement), both their performance and self-motivation improve (Reeve & Tseng, 2011; Reeve, 2013), along with classroom dynamics, showing how students and teachers can transform learning environments together (Reeve, 2013). Yet research has not fully explored how this initiative affects cognitive presence in blended courses. The third hypothesis (H3; see Figure 1) therefore states that students’ agentic engagement positively predicts their cognitive presence.
The Community of Inquiry model holds that meaningful learning occurs when teaching presence, social presence, and cognitive presence work together in carefully designed blended courses (Garrison & Arbaugh, 2007; Garrison & Cleveland-Innes, 2005; Joo et al., 2011; Meyer, 2014). Specifically: 1) social presence builds emotional bonds through open communication and group cohesion (Garrison et al., 2000; Garrison & Akyol, 2013; Kucuk & Richardson, 2019); 2) strong teaching presence creates classroom communities in which students actively participate (Joo et al., 2013; Kucuk & Richardson, 2019) and reliably predicts both students’ agentic engagement and their emotional investment in coursework (Reeve, 2013); and 3) cognitive presence drives meaningful knowledge construction through active learning strategies such as critical thinking, in which students build understanding through experience, interaction, and discussion (Joo et al., 2013; Kanuka & Garrison, 2004), a process that also connects to emotional engagement. Importantly, research confirms that social presence significantly bridges teaching presence and cognitive presence, working both directly and through sequential mediation effects (Jia & Gao, 2023). Students’ agentic engagement and emotional investment in learning are also closely connected (Gao & He, 2020), and agentic engagement can help explain how teaching quality affects academic achievement (Ye, 2023). These findings suggest that proactive student behaviors interact with all three Community of Inquiry elements (teaching, social, and cognitive presence). The study therefore proposes three mediation hypotheses (see Figure 1): Hypothesis 4: Agentic engagement mediates the relationship between teaching presence and social presence. Hypothesis 5: Agentic engagement mediates the relationship between teaching presence and cognitive presence. Hypothesis 6: Agentic engagement and social presence sequentially mediate the relationship between teaching presence and cognitive presence.
Building on these insights, this study expands the Community of Inquiry framework by incorporating student agency (agentic engagement) as a new component, aiming to theoretically examine its multiple roles in blended learning communities and test six specific model relationships (shown in Figure 1).
Figure 1. The proposed framework. Note: TP = Teaching Presence; AE = Agentic Engagement; SP = Social Presence; CP = Cognitive Presence.
3. Method
3.1. Participants
This study focuses on non-English major students in blended English programs across four universities in Eastern China. Initially, 76 students helped test the questionnaire’s validity. For the main study, researchers purposively selected 380 participants from an elective course, “Chinese-English News Translation,” in Zhejiang Province, all taught by the lead researcher. Participants were aged 19-22 and had 1-2 years of blended learning experience; the sample comprised 179 male and 201 female students (70.8% juniors, 29.2% seniors). The 15-week spring 2024 course combined 34 in-person hours with 12 online self-study hours, requiring consistent use of digital resources.
3.2. Measures
3.2.1. Scales
Building on prior research, this study uses an 11-point rating scale (0 = “strongly disagree” to 10 = “strongly agree”), which has been reported to outperform traditional 5- and 7-point scales in clarity and data accuracy, especially for detailed student self-assessments (Leung, 2011; Gulo, 2017). To measure agentic engagement, the study adopted Guo and Li’s (2018) Foreign Language Learning Agentic Engagement Scale, which tracks four behaviors: self-directed learning, teacher-supported learning, collaborative learning with instructors, and peer-assisted learning, collectively explaining 71.027% of the variance (see Table 1 for the full item list).
Table 1. Agentic engagement instrument.
| No. | Items | Source |
| --- | --- | --- |
| Self-motivated Study | | |
| 1 | I face and solve difficulties in English learning positively. | Guo & Li (2018) |
| 2 | I improve my English proficiency through various methods. | |
| 3 | I cultivate a positive attitude towards English learning. | |
| 4 | I do not make efforts to study English. | |
| Assisting Teacher in Teaching | | |
| 1 | I provide the teacher with some English materials and relevant information. | Guo & Li (2018) |
| 2 | I offer suggestions or feedback to the teacher on how to improve English teaching effectiveness. | |
| 3 | I inform the English teacher about my learning interests or needs. | |
| Cooperating with Teacher in Teaching | | |
| 1 | I pay close attention during English classes. | Guo & Li (2018) |
| 2 | I actively complete the English learning tasks assigned by the teacher. | |
| 3 | I actively participate in classroom activities organized by the English teacher. | |
| 4 | I preview the content that the English teacher will cover in class. | |
| Assisting Peers in Learning | | |
| 1 | I encourage my classmates to learn English. | Guo & Li (2018) |
| 2 | I share relevant English learning materials or information with my classmates. | |
| 3 | I help my classmates overcome difficulties in English learning. | |
This study employs the Chinese-adapted Community of Inquiry scale (Lan et al., 2018b), a carefully translated and validated instrument tested with Chinese university students. The 27-item instrument covers three key dimensions, with 13 items on teaching presence, 5 on social presence, and 9 on cognitive presence, and has demonstrated strong reliability, validity, and structural soundness in Chinese educational contexts (Lan et al., 2018b) (see Table 2 for the full item list).
Table 2. Community of inquiry framework instrument.
| No. | Items | Source |
| --- | --- | --- |
| Teaching presence | | |
| 1 | The facilitator distinctly conveyed pivotal course subjects. | Lan et al. (2018b) |
| 2 | The facilitator explicitly articulated essential course objectives. | |
| 3 | The facilitator supplied precise guidance on engaging in course learning activities. | |
| 4 | The facilitator clearly indicated critical deadlines and timeframes for learning activities. | |
| 5 | The facilitator assisted in pinpointing areas of concurrence and discrepancy on course subjects, aiding my learning. | |
| 6 | The facilitator skillfully guided the class to understand the course material in a way that clarified my thoughts. | |
| 7 | The facilitator kept students engaged and encouraged productive discussions among participants. | |
| 8 | The facilitator made sure everyone stayed on track, which really helped my learning. | |
| 9 | The facilitator motivated participants to explore novel concepts of this course. | |
| 10 | The facilitator's actions bolstered the cultivation of a communal spirit among participants. | |
| 11 | The facilitator sharpened discussions on relevant topics, which improved my understanding. | |
| 12 | The facilitator also gave me helpful feedback that highlighted my strengths and weaknesses related to the course goals. | |
| 13 | The facilitator always provided feedback in a timely manner. | |
| Social presence | | |
| 1 | Web-based or online communication serves as an outstanding platform for social engagement. | Lan et al. (2018b) |
| 2 | I found it comfortable to converse via the online platform. | |
| 3 | I felt at ease participating in course discussions. | |
| 4 | I was comfortable engaging with other course participants. | |
| 5 | I was comfortable expressing disagreement with other participants while preserving a sense of trust. | |
| Cognitive presence | | |
| 1 | Challenges presented heightened my interest in course topics. | Lan et al. (2018b) |
| 2 | Course activities sparked my curiosity. | |
| 3 | I felt driven to investigate content-related inquiries. | |
| 4 | Brainstorming and locating pertinent information assisted me in addressing content-related questions. | |
| 5 | Online dialogues were beneficial in aiding my appreciation of diverse viewpoints. | |
| 6 | Integrating new information helped me respond to questions posed in course activities. | |
| 7 | Educational activities facilitated the construction of explanations or solutions. | |
| 8 | Reflecting on course material and discussions enhanced my comprehension of foundational concepts in this class. | |
| 9 | I am able to apply the knowledge gained in this course to my work or other activities unrelated to class. | |
3.2.2. Preliminary Study
The preliminary testing phase employed SPSS 26.0 for exploratory factor analysis to refine the survey by eliminating unclear or redundant items. The analysis applied strict quality criteria: the model was required to explain at least 50% of the variance, demonstrate statistical significance on Bartlett’s test of sphericity (p < 0.05), achieve a Kaiser-Meyer-Olkin (KMO) measure above 0.600, maintain factor loadings exceeding 0.300, and preserve eigenvalues of at least 1.00 (Barrett & Morgan, 2005; Hair et al., 2006; Pallant, 2011). Additionally, all items with Cronbach’s alpha coefficients below 0.700 were removed following analysis (Hair et al., 2010). This refinement process yielded a questionnaire demonstrating strong reliability and validity, with all retained items meeting the specified standards; detailed metrics for each scale are presented in Table 3.
Table 3. Preliminary study outcomes.
| Scale | Cronbach's alpha | KMO | Bartlett's test of sphericity (p) | Cumulative variance explained | Smallest item communality | Eigenvalue |
| --- | --- | --- | --- | --- | --- | --- |
| Teaching Presence | 0.914 | 0.889 | 0.000 | 61% | 0.440 | ≥1.00 |
| Agentic Engagement | 0.937 | 0.910 | 0.000 | 55% | 0.472 | ≥1.00 |
| Social Presence | 0.798 | 0.783 | 0.000 | 63% | 0.587 | ≥1.00 |
| Cognitive Presence | 0.880 | 0.868 | 0.000 | 63% | 0.475 | ≥1.00 |
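As a minimal sketch of how the reliability and factorability metrics in Table 3 can be computed (not the exact SPSS 26.0 procedure used in the study), the code below calculates Cronbach's alpha and the eigenvalues and cumulative variance of one scale's item correlation matrix. The file name `pilot_responses.csv` and column names `tp1`...`tp13` are hypothetical placeholders.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a DataFrame whose columns are the items of one scale."""
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def eigen_summary(items: pd.DataFrame) -> pd.DataFrame:
    """Eigenvalues of the item correlation matrix and cumulative variance explained."""
    eigvals = np.linalg.eigvalsh(items.corr().values)[::-1]   # descending order
    cum_var = np.cumsum(eigvals) / eigvals.sum()
    return pd.DataFrame({"eigenvalue": eigvals, "cumulative_variance": cum_var})

# Hypothetical usage: columns tp1..tp13 hold the teaching-presence items (0-10 ratings).
# df = pd.read_csv("pilot_responses.csv")
# tp_items = df[[f"tp{i}" for i in range(1, 14)]]
# print(cronbach_alpha(tp_items))        # retained scales required alpha >= 0.700
# print(eigen_summary(tp_items).head())  # factors retained with eigenvalue >= 1.00
```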
3.3. Data Gathering and Data Examination
In May 2024, following detailed instructor guidance, the study administered the survey via the Wenjuanxing platform to all enrolled students. Of 406 collected responses, 380 valid questionnaires remained after quality screening, an effective response rate of 93.6%. The analysis applied Arbaugh’s (2007) and Garrison et al.’s (2010) theoretical frameworks, while specifically accounting for participants’ age and blended learning experience to assess these factors’ potential influence on variations in cognitive presence.
The research team implemented multiple safeguards against common method bias. Using validated self-report scales for data collection helped reduce measurement error, while maintaining respondent anonymity and randomizing question order minimized response patterns. The measurement results in Table 4 confirm the model’s strong validity, showing no significant bias effects. Harman’s single-factor test (Podsakoff & Organ, 1986) further supported this conclusion: an exploratory factor analysis of all 29 items revealed that the first factor explained only 46% of the variance, below the 50% threshold (Podsakoff & Organ, 1986; Hair et al., 1998), indicating minimal impact of common method bias on the findings.
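A minimal sketch of Harman's single-factor test is shown below, assuming all retained questionnaire items are columns of a pandas DataFrame (the name `all_items` is hypothetical). It operationalizes the test as the share of total variance captured by the first unrotated component of the item correlation matrix, a common approximation rather than the study's exact SPSS routine.

```python
import numpy as np
import pandas as pd

def harman_single_factor_share(all_items: pd.DataFrame) -> float:
    """Proportion of total variance captured by the first unrotated component
    of the item correlation matrix (one common operationalization of Harman's test)."""
    eigvals = np.linalg.eigvalsh(all_items.corr().values)
    return float(eigvals.max() / eigvals.sum())

# Hypothetical usage: `all_items` holds the 29 retained items.
# share = harman_single_factor_share(all_items)
# print(f"First factor explains {share:.1%} of variance")  # < 50% suggests limited common method bias
```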
Table 4. Measurement model results.
| Variable | Cronbach's alpha | rho_A | Composite reliability | Average variance extracted (AVE) |
| --- | --- | --- | --- | --- |
| Teaching Presence | 0.893 | 0.898 | 0.914 | 0.572 |
| Agentic Engagement | 0.932 | 0.929 | 0.938 | 0.578 |
| Social Presence | 0.799 | 0.811 | 0.869 | 0.623 |
| Cognitive Presence | 0.847 | 0.851 | 0.887 | 0.567 |
This study employs Partial Least Squares Structural Equation Modeling (PLS-SEM) for data analysis due to its unique strengths: building new theories, making predictions, testing complex multivariate models, and handling non-normal data distributions, particularly when standard statistical assumptions don’t apply. These capabilities perfectly match our research needs. Using SmartPLS 4.0 software, we validated both measurement and structural models through PLS techniques within the SEM framework (Hair et al., 2017). For hypothesis testing, standard PLS algorithms combined with 5000 bootstrap iterations assessed statistical significance, following Hair et al.’s (2011) recommended practices.
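The bootstrap logic can be illustrated with the generic sketch below, not SmartPLS 4.0 itself: case-wise resampling 5,000 times, re-estimating a simple standardized path on each resample, and forming a t-statistic from the bootstrap standard error. The variable names (`tp`, `ae`) and the use of composite mean scores are illustrative assumptions.

```python
import numpy as np

def bootstrap_path(x: np.ndarray, y: np.ndarray, n_boot: int = 5000, seed: int = 42):
    """Bootstrap a standardized simple-regression path coefficient from x to y."""
    rng = np.random.default_rng(seed)
    n = len(x)

    def std_beta(xs, ys):
        xs = (xs - xs.mean()) / xs.std(ddof=1)
        ys = (ys - ys.mean()) / ys.std(ddof=1)
        return np.polyfit(xs, ys, 1)[0]      # slope of the standardized regression

    beta = std_beta(x, y)
    boot = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, n)           # resample cases with replacement
        boot[b] = std_beta(x[idx], y[idx])
    se = boot.std(ddof=1)
    t_stat = beta / se                        # original estimate / bootstrap SE
    return beta, se, t_stat

# Hypothetical usage with composite mean scores per respondent:
# beta, se, t = bootstrap_path(tp, ae)
# One-tailed criteria cited in the study: t > 1.645 (p < .05), 2.327 (p < .01), 3.092 (p < .001).
```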
4. Results
4.1. Measurement Model Results
Following Hair et al.’s (2017) guidelines, this study implemented a two-step validation strategy. First, it confirmed the measurement model’s reliability and validity against the following criteria: factor loadings ideally above 0.700 (Hair et al., 2014), with items loading between 0.400 and 0.700 retained only if they improved overall consistency and items below 0.400 removed per standard practice; composite reliability above 0.700 (Gefen et al., 2000); and average variance extracted (AVE) above 0.500 (Fornell & Larcker, 1981). After removing items that failed these loading criteria and making other necessary refinements, the final model met all of the quality standards described above.
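For reference, composite reliability and AVE can be computed directly from standardized outer loadings using the standard formulas; the sketch below uses an illustrative loading vector, not the study's actual values.

```python
import numpy as np

def composite_reliability(loadings: np.ndarray) -> float:
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    lam_sum_sq = loadings.sum() ** 2
    error_var = (1 - loadings ** 2).sum()
    return float(lam_sum_sq / (lam_sum_sq + error_var))

def average_variance_extracted(loadings: np.ndarray) -> float:
    """AVE = mean of the squared standardized loadings."""
    return float((loadings ** 2).mean())

# Illustrative loadings for a five-item construct (hypothetical values):
lam = np.array([0.78, 0.81, 0.74, 0.80, 0.77])
print(composite_reliability(lam) > 0.700)        # CR threshold (Gefen et al., 2000)
print(average_variance_extracted(lam) > 0.500)   # AVE threshold (Fornell & Larcker, 1981)
```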
4.2. Discriminant Validity Results
To check if our model truly distinguishes between different concepts, the study used Henseler et al.’s (2016) HTMT method, a modern approach for testing measurement quality. Following Kline’s (2011) standards, this research confirmed clear separation between constructs since all HTMT scores (ranging 0.746 - 0.864) stayed safely below the 0.900 cutoff. These results give us confidence that our model reliably measures what it claims to measure, meeting all key validity benchmarks.
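A minimal sketch of the HTMT ratio for a pair of constructs is given below, assuming `items_a` and `items_b` are pandas DataFrames whose columns are the items of the two constructs. It follows the general HTMT formula rather than reproducing SmartPLS output.

```python
import numpy as np
import pandas as pd

def htmt(items_a: pd.DataFrame, items_b: pd.DataFrame) -> float:
    """Heterotrait-monotrait ratio of correlations between two constructs."""
    corr = pd.concat([items_a, items_b], axis=1).corr().values
    ka, kb = items_a.shape[1], items_b.shape[1]

    hetero = corr[:ka, ka:]                           # between-construct item correlations
    mono_a = corr[:ka, :ka][np.triu_indices(ka, 1)]   # within-construct A (upper triangle)
    mono_b = corr[ka:, ka:][np.triu_indices(kb, 1)]   # within-construct B (upper triangle)

    return float(hetero.mean() / np.sqrt(mono_a.mean() * mono_b.mean()))

# Hypothetical usage with item-level DataFrames tp_items and ae_items:
# print(htmt(tp_items, ae_items) < 0.900)  # discriminant validity criterion used in the study
```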
4.3. Structural Model Results
As a first step in hypothesis testing, the study checked the structural model for multicollinearity among all predictor constructs. Variance inflation factor (VIF) values ranged from 1.522 to 3.274: above the baseline of 1.000 but well below the more conservative limit of 5.000 recommended by Hair and colleagues (2017), indicating no serious multicollinearity and confirming that the model meets the basic quality requirements. To test the hypothesized paths, the study estimated beta coefficients, t-values, p-values, and confidence intervals using the bootstrapping procedure recommended by Hair et al. (2017), applying one-tailed significance criteria of t > 1.645 (p < 0.05), t > 2.327 (p < 0.01), and t > 3.092 (p < 0.001). As shown in Table 5 and Figure 2, all paths were statistically significant, providing empirical support for all six proposed hypotheses (H1-H6); the patterns observed in the data are consistent with the theoretical assumptions of the research model.
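The collinearity check can be reproduced in outline with statsmodels' variance_inflation_factor; the construct-score DataFrame `scores` and its column names are hypothetical.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

def vif_table(scores: pd.DataFrame) -> pd.Series:
    """VIF for each predictor construct score (values below 5.000 indicate acceptable collinearity)."""
    X = sm.add_constant(scores)
    vifs = {col: variance_inflation_factor(X.values, i)
            for i, col in enumerate(X.columns) if col != "const"}
    return pd.Series(vifs)

# Hypothetical usage with composite construct scores:
# scores = df[["TP", "AE", "SP"]]   # e.g., predictors of cognitive presence
# print(vif_table(scores))
```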
Table 5. Structural model results (n = 380).
| Path | Standardized path coefficients | Sample mean (M) | Standard deviation | t-statistics | p-values | Findings |
| --- | --- | --- | --- | --- | --- | --- |
| TP -> AE | 0.696*** | 0.698 | 0.027 | 25.550 | 0.000 | H1 validated |
| AE -> SP | 0.547*** | 0.547 | 0.044 | 12.343 | 0.000 | H2 validated |
| AE -> CP | 0.324*** | 0.325 | 0.044 | 7.341 | 0.000 | H3 validated |
| TP -> AE -> SP | 0.381*** | 0.382 | 0.036 | 10.708 | 0.000 | H4 validated |
| TP -> AE -> CP | 0.226*** | 0.226 | 0.029 | 7.748 | 0.000 | H5 validated |
| TP -> AE -> SP -> CP | 0.099*** | 0.099 | 0.018 | 5.515 | 0.000 | H6 validated |
*p < 0.05, t > 1.645; **p < 0.01, t > 2.327; ***p < 0.001, t > 3.092 (one tailed).
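As a consistency check on Table 5, each two-segment indirect effect equals the product of its constituent direct path coefficients (standard mediation arithmetic), which matches the reported values to rounding:

$$0.696 \times 0.547 \approx 0.381 \ (\text{TP} \to \text{AE} \to \text{SP}), \qquad 0.696 \times 0.324 \approx 0.226 \ (\text{TP} \to \text{AE} \to \text{CP}).$$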
Figure 2. Partial least squares path modeling of relationship coefficients and explained variance (R2) (n = 380).
4.4. R2 and Q2 Results
The R-squared value (R2) assesses predictive validity, reflecting the proportion of variance in an endogenous construct explained by the model, as described in prior research (Hair et al., 2016). Values above zero indicate some explanatory power, and higher values indicate stronger prediction. By conventional interpretation guidelines, values around 0.750 indicate substantial predictive ability, around 0.500 moderate, and around 0.250 weak. In the present study, R2 reached 0.480 for agentic engagement, 0.600 for social presence, and 0.678 for cognitive presence; as shown in Figure 2, the model therefore explains a reasonable share of variance across all endogenous constructs and remains consistent with conventional evaluation standards.
Stone’s (1974) work established Q2 as a standard index of predictive relevance, later confirmed by Henseler and Fassott’s (2009) research on its role in model evaluation. This blindfolding technique in PLS analysis tests whether latent variables actually predict outcomes, with values above 0.35 indicating strong predictive relevance, above 0.15 moderate, and above 0.02 weak (Hair et al., 2011). The Q2 values reached 0.273 for agentic engagement (AE), 0.366 for social presence (SP), and 0.380 for cognitive presence (CP), indicating good predictive relevance for the model.
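The Q2 statistic can be sketched in simplified form as 1 - SSE/SSO. The version below is a cross-validation-style analogue of the blindfolding procedure reported by SmartPLS, not the exact omission-distance algorithm, and all variable names are illustrative.

```python
import numpy as np

def q_squared(y_true: np.ndarray, y_pred: np.ndarray, train_mean: float) -> float:
    """Q^2 = 1 - SSE / SSO: prediction error relative to a naive training-mean benchmark.
    A simplified analogue of the blindfolding Q^2, assuming held-out predictions are available."""
    sse = np.sum((y_true - y_pred) ** 2)        # squared prediction errors
    sso = np.sum((y_true - train_mean) ** 2)    # squared errors of the naive benchmark
    return float(1.0 - sse / sso)

# Interpretation thresholds cited in the study (Hair et al., 2011):
# > 0.35 strong, > 0.15 moderate, > 0.02 weak predictive relevance.
```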
5. Discussion
In college English education, blended learning has become a key focus, prompting researchers to examine its theoretical foundations and student impacts. While the Community of Inquiry framework effectively addresses teaching, social, and cognitive aspects of blended environments, studies have largely overlooked individual learner differences within this model. Early attempts to bridge this gap focused narrowly on online settings, theoretical modeling, or specific disciplines (Lan et al., 2018a, 2020; Wu & Chen, 2017; Xie et al., 2024), leaving blended language learning contexts underexplored. While external environmental factors clearly help build quality learning communities (Chen & Sang, 2018), mastering English requires sustained personal effort-making individual factors like student agency crucial for success (He & Huang, 2023; Guo & Liu, 2016). This study therefore expands the Community of Inquiry framework by adding learner agency, using empirical data to reveal how students actively contribute to blended English learning communities. Specifically, the study examines how proactive student behaviors interact with the framework’s three core dimensions (teaching, social, and cognitive presence).
The findings from this study emphasize the significant role played by student initiative within inquiry-based learning communities. Recognizing learners’ active involvement not only enhances the Community of Inquiry framework but also strengthens its explanatory power for collaborative learning processes. The results demonstrate that educational design fundamentally shapes inquiry environments (Garrison & Akyol, 2013) while reinforcing community connections, as seen in foundational studies by Garrison and colleagues (Garrison & Archer, 2007). For instance, research by Jiang and Zhang (2021) reveals that certain learning approaches, particularly two distinct English-learning methods, significantly boost proactive participation. Subsequent investigations by Ye (2023) and Min (2022) further illustrate how both patterns of student involvement and variations in instructor communication styles directly influence proactive engagement in blended classroom settings. This research confirms instructors’ central role in blended learning contexts, showing how thoughtfully designed courses and teaching strategies empower students to actively participate in knowledge construction. These insights not only expand existing theoretical frameworks but also deepen our comprehension of collaborative learning dynamics. The study particularly highlights how students’ proactive behaviors strengthen peer relationships in blended environments, mirroring findings from online learning contexts where active participation enhances social connectedness (Miao & Ma, 2022). The analysis reveals a strong positive relationship between students’ self-driven participation and peer interaction in blended learning, broadening our understanding of how learner initiative shapes social dynamics in collaborative settings. Importantly, the findings also indicate that proactive behaviors directly improve critical thinking abilities, aligning with previous studies showing that individual initiative enhances advanced learning outcomes (Lan et al., 2018a, 2020). These results suggest that student-led contributions benefit academic achievement across different blended learning designs, confirming the value of fostering agency in modern educational environments.
Existing research suggests that active student participation serves as a bridge between teaching methods, peer collaboration, and critical analysis skills; in other words, this proactive behavior is an important connector that transforms instructional input into advanced learning outcomes through classroom interaction (Kucuk & Richardson, 2019; Reeve, 2013). Teacher support promotes this cyclical process, facilitates communication among peers (Garrison & Akyol, 2013; Joo et al., 2013), and enhances analytical skills through learners’ participation and contribution in blended learning environments (Gao & He, 2020). This study addresses an important research gap by exploring the mechanisms linking student-driven behavior in blended learning with the three core components of the Community of Inquiry framework. The findings emphasize learners’ positive role in shaping open dialogue, knowledge formation, and collaboration across learning groups. By improving course structure, cultivating independent English learning behavior, and refining joint inquiry methods, the study proposes practical strategies to optimize blended learning outcomes while providing educators with concrete tools to enhance participation and learning in blended classroom environments.
Improving quality remains essential if blended learning is to serve as the foundation of the higher education English curriculum (Ellis et al., 2016; Han & Ellis, 2019). This study clarifies the central role of students’ agentic engagement within the Community of Inquiry framework. The results suggest that teachers should flexibly design space for guiding students, combining structured instruction, community interaction, and opportunities for independent learning to create an “autonomous but supported” learning experience. Effective blended learning means building a community (Jia & Gao, 2023; Gao & He, 2020) that supports students in thinking independently, developing autonomous learning strategies, staying motivated, and pursuing higher-order critical thinking. Teachers can strengthen peer connections through interactive activities such as online discussions, in-person topic sharing, peer review, and post-class reflection (Shea & Bidjerano, 2010), approaches that simultaneously build critical thinking skills. Students, for their part, should actively engage in classroom interaction and collaborate with instructors to create more supportive learning environments. This two-way participation makes blended learning both more effective and more sustainable, ultimately leading to better educational outcomes.
6. Limitations and Implications
While this study advances research by incorporating agentic engagement into the Community of Inquiry framework and examining its relationships with the core presences, its limitations should be acknowledged. The current conceptualization of agentic engagement primarily focuses on students’ positive individual states. Future research should develop a more comprehensive learning presence model that accounts for both positive and negative learning states. Incorporating assessments of negative states (Sharma & Sarkar, 2020; Sumarsono et al., 2021) would enable researchers to better understand the nuanced relationships between learning presence and other Community of Inquiry elements.
Second, the participant pool was limited to undergraduate students currently enrolled in college, excluding other important groups engaged in blended English learning. Research shows significant differences in cognitive presence between undergraduate and graduate students (Garrison et al., 2010). In addition, the current study collected only descriptive statistics on participants’ gender, age, and academic background and did not incorporate these factors into the model. Future studies should therefore examine diverse learner populations and additional variables to draw more comprehensive conclusions about blended learning effectiveness.
A third limitation stems from relying solely on a single survey administered to students. The absence of multiple longitudinal assessments restricts causal inferences. While advanced structural modeling techniques provide valuable insights, future research should implement causal evaluations within the Community of Inquiry framework to better understand the dynamic interactions among its components.
Acknowledgements
This research was supported by several grants, including funding for the 2025 Zhejiang Provincial Educational Science Planning Project (Research on the Innovative Application of Generative Artificial Intelligence Technology in News Translation and Compilation Teaching, No. 2025SCG401), the First-class Undergraduate Course in Zhejiang Province, specifically for A Survey of Major English-speaking Countries (Project No. 689 in 2019), English Newspaper Reading I (Project No. 507 in 2020), and support from the Research Project focused on the Data-Driven Online-Offline Blended Teaching Model for ESP Courses (Project No. SXSJG202302).