Extending the Community of Inquiry Framework: Exploring the Roles of Agentic Engagement in Blended College English Learning

Abstract

In today’s tech-driven education landscape, blended learning has become essential for college English courses. This study examines what really boosts student outcomes using the Community of Inquiry framework (which focuses on teaching presence, social connections, and critical thinking). Through data analysis with SPSS 26.0 and SmartPLS 4, we found three key things: 1) how teachers structure lessons directly improves classroom interactions and deep learning, 2) quality discussions amplify teaching’s impact on both peer relationships and understanding, and 3) social connections serve as the vital bridge between teaching methods and cognitive growth. This study enhances the Community of Inquiry framework by revealing how student agency drives blended learning success. The findings give educators concrete evidence that empowering learners’ personal initiative optimizes classroom environments and boosts outcomes in college English courses.

Cite as:

Jin, F. (2025) Extending the Community of Inquiry Framework: Exploring the Roles of Agentic Engagement in Blended College English Learning. Open Journal of Social Sciences, 13, 313-333. doi: 10.4236/jss.2025.134019.

1. Introduction

Recent advances in educational technology have transformed blended learning from an option to a must-have. The 2020 College English Teaching Guidelines now require instructors to actively create diverse learning environments using blended models. But here’s the challenge: as a core required course reaching millions nationwide, college English still struggles to build high-quality learning communities that transcend time and space constraints, a critical issue noted by Jia and Gao (2023).

The Community of Inquiry framework, a cornerstone theory for blended learning in higher education, shows that effective teaching requires three key elements working together: teacher guidance, peer connections, and critical thinking (Garrison et al., 1999). But researchers have noticed a gap: while this model explains how classroom environments and social dynamics support advanced learning, it overlooks how individual students’ motivations and behaviors shape their blended learning experience (Shea & Bidjerano, 2010; Lan et al., 2018a). Research shows students’ active participation in English learning remains moderate at best (Gao & He, 2020; Li, 2021). Problem behaviors like cutting corners, getting distracted, shifting goals, losing focus, and even cheating (Gong et al., 2018; Yang & Dai, 2021; Aini & Ciptaningrum, 2024) are key reasons for poor learning outcomes. This raises a crucial question: Does agentic engagement actually drive success in blended English courses?

Agentic engagement focuses on the positive aspects of student learning, referring to learners’ proactive or constructive contributions to their education during the learning process. It involves personalized participation to enhance instructional conditions and improve the learning environment (Reeve & Tseng, 2011; Reeve, 2013). Theoretically, mastering English is a marathon requiring teachers to connect vocabulary, content, and real-world applications, making student agency crucial for lasting progress (Guo & Li, 2018) and a key predictor of success (Montenegro, 2017). Research clearly shows that students’ active participation strongly connects to teacher support, peer interaction, and English learning outcomes, even helping explain how teacher involvement boosts achievement (Jiang & Zhang, 2021; Kucuk & Richardson, 2019; Ye, 2023; Shi, 2023; Littler, 2024). But these findings come mostly from high school classrooms or online college courses, leaving open questions about blended English learning. That’s why this study aims to: (a) propose treating student initiative as its own key element in the Community of Inquiry framework, and (b) test how it actually interacts with the model’s traditional three components in blended settings.

2. Literature Review

2.1. Community of Inquiry Framework

The Community of Inquiry framework, developed by Canadian researcher Garrison and colleagues, examines how teaching presence, social presence, and cognitive presence work together to develop critical thinking skills (Garrison et al., 1999). Here, “teaching presence” refers to how students perceive instructional activities during their learning process, which includes three key elements: designing and organizing lessons, guiding discussions, and providing direct instruction. Essentially, this highlights the teacher’s role in leading and supporting learning within this framework. Social presence captures how students connect and build trust within their learning community through interactions-it includes three key aspects: emotional expression (sharing feelings), open communication (honest dialogue), and group cohesion (sticking together as a team). Cognitive presence represents students’ meaning-making process through ongoing reflection and discussion, unfolding in four natural stages: triggering curiosity, exploring ideas, connecting concepts, and reaching solutions.

The Community of Inquiry framework emerged from blended learning experiences (Garrison et al., 1999). Early research mainly focused on either developing theoretical models (Shen & Sheng, 2015; Qiao, 2017) or testing how these models worked in practice (Lu et al., 2018; Wang & Liu, 2019). What’s missing are thorough, data-driven studies examining blended learning through multiple lenses-especially for college English courses (Wu, 2017; Lan et al., 2020) where this approach matters most (He & Huang, 2023; Jia & Gao, 2023). Moreover, while teaching presence, social presence, and cognitive presence are recognized as key elements in constructing blended learning environments, the Community of Inquiry framework has not adequately addressed how individual learners’ agency influences the learning process. To address this gap, the study brings student agency into the model, specifically examining blended college English courses. Using real classroom data, the study reveals how learners actively contribute to building effective blended learning communities.

2.2. Agentic Engagement

Student engagement measures how connected learners feel to their school’s people, activities, and values (Skinner et al., 2009). It works through three interconnected channels: emotional engagement (their positive/negative feelings during learning), cognitive engagement (how strategically they approach studying), and behavioral engagement (visible effort and commitment to academic work). Essentially, it captures whether students are psychologically invested and actively participating in their education (Skinner et al., 2009).

Building on the three traditional engagement types, agentic engagement adds a crucial fourth dimension-students taking charge of their learning through initiative and independence (Montenegro, 2017). This means learners proactively shape their education by improving lessons and classroom environments through personal involvement (Reeve & Tseng, 2011; Reeve, 2013; Guo & Li, 2018). At its core, it’s about students becoming co-creators of their learning experience by voicing preferences, asking questions, and communicating needs to build better classrooms together with teachers (Wakefield, 2016). Back in 2013, Reeve’s groundbreaking Korean classroom study revealed a key insight: true student engagement isn’t just about following teachers’ lead, but emerges through dynamic back-and-forth interactions between learners and instructors (Reeve, 2013). This work identified four powerful engagement styles: self-driven initiative (“I’ll tackle this myself”), teacher-supported learning (“Guide me through this”), collaborative co-creation (“Let’s work on this together”), and peer-assisted growth (“We’ll help each other learn”), each playing distinct roles in effective classrooms (Guo & Li, 2018).

2.3. Agentic Engagement and the Community of Inquiry Framework

Recent studies highlight how quality learning hinges on three key elements: effective teaching, dynamic peer interactions, and students’ prior knowledge (Lan et al., 2020). However, current frameworks often overlook other crucial factors like learning strategies and beneficial study habits that don’t involve direct teaching-especially in diverse global classrooms where these non-instructional elements significantly boost outcomes (Lan et al., 2020; Stenbom et al., 2016). What really drives academic success? Research shows self-regulated learning, where students actively manage their study approaches (Hu & Kuh, 2002; Richardson & Long, 2003; Richardson & Newby, 2006), outperforms traditional teaching methods in long-term knowledge retention (Dixon, 2015), while also enhancing other valuable learning behaviors (Ma et al., 2015). Since blended college English courses demand high student involvement, active participation becomes absolutely crucial (Al-Samarraie & Saeed, 2018; Qian, 2022; Tay, 2016). That’s why this study aims to expand the Community of Inquiry framework to include students’ personal initiative (agentic engagement) as a key ingredient in blended learning. By weaving this element into the model, we aim to uncover how individual drive contributes to successful learning communities, ultimately helping create higher-quality blended classrooms.

At the heart of blended learning, teaching presence captures how instructors shape the experience through three key actions: designing courses, guiding discussions, and delivering direct instruction. This core element of the Community of Inquiry framework (Garrison & Akyol, 2013) consistently boosts students’ critical thinking, sense of belonging, and perceived learning outcomes (Garrison & Arbaugh, 2007). Recent English language studies confirm that teachers’ investment, instructional style, and classroom discourse directly fuel student initiative-with research by Ye (2023), Jiang & Zhang (2021), and Min (2022) all showing strong positive connections. Strong course design, skilled discussion leading, and clear instruction all help students take charge of their learning (Garrison & Cleveland-Innes, 2005; Krpanec et al., 2024; Yang, 2023; Liu & Guo, 2021). But here’s the catch-most evidence comes from high school English classes, leaving a gap in understanding how college blended English courses affect student initiative. That’s why our first key hypothesis (see Figure 1) states: Teaching presence positively predicts students’ agentic engagement.

Within the Community of Inquiry framework, interactive learning environments take shape through three main elements: expressing genuine emotions, maintaining open discussion, and deliberately designing course structures (Arbaugh & Benbunan-Fich, 2006; Garrison & Arbaugh, 2007). Several studies have shown that these factors directly influence teaching outcomes and how learners handle course material (Sun & Yang, 2023; Miao & Ma, 2022; Miao et al., 2022). On this basis, social presence rests on three components: building genuine emotional relationships, fostering mutually respectful interaction, and developing group cohesion around shared goals and collaborative support within the learning community (Garrison et al., 1999; Garrison & Akyol, 2013). In short, expressing true feelings, promoting open communication, and creating strong group cohesion all depend on establishing personal relationships between teachers and students (Garrison & Arbaugh, 2007). Research has shown that students who behave more proactively tend to establish stronger social relationships and perform better in blended learning environments. Yet existing research has not thoroughly investigated how agentic student participation shapes peer dynamics in blended settings (Reeve & Jang, 2022; Gao & He, 2020). Based on the discussion above, the study proposes that students’ agentic engagement positively predicts their social presence (H2, see Figure 1).

Rooted in Dewey’s inquiry-based learning model, cognitive presence measures how deeply students engage in meaningful discussions, critical thinking, and knowledge-building (Garrison et al., 1999, 2001). It’s the core of the learning experience, what Vaughan and Garrison (2005) and Kucuk and Richardson (2019) call “the heart of successful education.” But blended learning demands more student investment than traditional classes, requiring greater self-direction to manage time and attention effectively (Wu & Chen, 2017). Essentially, while traditional lectures let students passively receive information, blended environments need learners to actively shape their understanding through consistent participation. Studies confirm that when students take charge of their learning (agentic engagement), it boosts both their performance and self-motivation (Reeve & Tseng, 2011; Reeve, 2013) while creating better classroom dynamics-showing how students and teachers can transform learning environments together (Reeve, 2013). But research hasn’t fully explored how this initiative affects deeper thinking (cognitive presence) in blended courses. That’s why our third hypothesis (see Figure 1) states: Students’ agentic engagement positively predicts their cognitive presence.

The Community of Inquiry model shows that real learning happens when three key elements work together: teacher guidance, peer connections, and critical thinking, all carefully designed into blended courses (Garrison & Arbaugh, 2007; Garrison & Cleveland-Innes, 2005; Joo et al., 2011; Meyer, 2014). Here’s how it works: 1) social presence builds emotional bonds through open communication and teamwork (Garrison et al., 2000; Garrison & Akyol, 2013; Kucuk & Richardson, 2019); 2) strong teaching presence creates classroom communities where students actively participate (Joo et al., 2013; Kucuk & Richardson, 2019), and it serves as a reliable indicator of both students’ proactive learning behaviors (agentic engagement) and their emotional investment in coursework (Reeve, 2013); and 3) cognitive presence drives meaningful knowledge construction through active learning strategies like critical thinking, where students build understanding through experiences, interactions, and discussions (Joo et al., 2013; Kanuka & Garrison, 2004). This thinking process also connects to emotional engagement. Importantly, research confirms that social presence significantly bridges teaching presence and critical thinking development, working both directly and through sequential mediation effects (Jia & Gao, 2023). Research also shows that students’ proactive engagement (agentic engagement) and emotional investment in learning are closely connected (Gao & He, 2020) and can help explain how teaching quality impacts academic achievement (Ye, 2023). These findings suggest proactive student behaviors interact with all three Community of Inquiry elements (teaching, social, and cognitive presence). The study therefore proposes three specific relationships (see Figure 1): Hypothesis 4: Agentic engagement mediates the relationship between teaching presence and social presence. Hypothesis 5: Agentic engagement mediates the relationship between teaching presence and cognitive presence. Hypothesis 6: Agentic engagement and social presence sequentially mediate the relationship between teaching presence and cognitive presence.

Building on these insights, this study expands the Community of Inquiry framework by incorporating student agency (agentic engagement) as a new component, aiming to theoretically examine its multiple roles in blended learning communities and test six specific model relationships (shown in Figure 1).

Figure 1. The proposed framework. Note: TP = Teaching Presence; AE = Agentic Engagement; SP = Social Presence; CP = Cognitive Presence.

3. Method

3.1. Participants

This study focuses on non-English major students in blended English programs across four universities in Eastern China. Initially, 76 students helped test the questionnaire’s validity. For the main study, researchers intentionally selected 380 participants from an elective course-“Chinese-English News Translation” in Zhejiang province, all taught by the lead researcher. Participants aged 19 - 22 had 1 - 2 years of blended learning experience, with the sample comprising 179 male and 201 female students (70.8% juniors, 29.2% seniors). The 15-week spring 2024 course combined 34 in-person hours with 12 online self-study hours, requiring consistent use of digital resources.

3.2. Measures

3.2.1. Scales

Building on prior research, this study uses an 11-point rating scale (0 = “strongly disagree” to 10 = “strongly agree”), which outperforms traditional 5- or 7-point scales in clarity and data accuracy, especially for detailed student self-assessments (Leung, 2011; Gulo, 2017). To measure student initiative, the study adopted Guo and Li’s (2018) Foreign Language Learning Agentic Engagement Scale, which tracks four behaviors: self-directed learning, teacher-supported learning, collaborative learning with instructors, and peer-assisted learning, together explaining 71.027% of the total variance (see Table 1 for full details).

Table 1. Agentic engagement instrument (source: Guo & Li, 2018).

Self-motivated Study
1. I face and solve difficulties in English learning positively.
2. I improve my English proficiency through various methods.
3. I cultivate a positive attitude towards English learning.
4. I do not make efforts to study English.

Assisting Teacher in Teaching
1. I provide the teacher with some English materials and relevant information.
2. I offer suggestions or feedback to the teacher on how to improve English teaching effectiveness.
3. I inform the English teacher about my learning interests or needs.

Cooperating with Teacher in Teaching
1. I pay close attention during English classes.
2. I actively complete the English learning tasks assigned by the teacher.
3. I actively participate in classroom activities organized by the English teacher.
4. I preview the content that the English teacher will cover in class.

Assisting Peers in Learning
1. I encourage my classmates to learn English.
2. I share relevant English learning materials or information with my classmates.
3. I help my classmates overcome difficulties in English learning.

This study employs the Chinese-adapted Community of Inquiry scale (Lan et al., 2018b), a meticulously translated and validated measurement tool tested with Chinese university students. The 27-item instrument covers three key dimensions: 13 items on teaching presence, 5 on social presence, and 9 assessing cognitive presence-demonstrating strong reliability, validity, and structural soundness (Lan et al., 2018b) in Chinese educational contexts (see Table 2 for detailed metrics).

Table 2. Community of Inquiry framework instrument (source: Lan et al., 2018b).

Teaching presence
1. The facilitator distinctly conveyed pivotal course subjects.
2. The facilitator explicitly articulated essential course objectives.
3. The facilitator supplied precise guidance on engaging in course learning activities.
4. The facilitator clearly indicated critical deadlines and timeframes for learning activities.
5. The facilitator assisted in pinpointing areas of concurrence and discrepancy on course subjects, aiding my learning.
6. The facilitator skillfully guided the class to understand the course material in a way that clarified my thoughts.
7. The facilitator kept students engaged and encouraged productive discussions among participants.
8. The facilitator made sure everyone stayed on track, which really helped my learning.
9. The facilitator motivated participants to explore novel concepts of this course.
10. The facilitator’s actions bolstered the cultivation of a communal spirit among participants.
11. The facilitator sharpened discussions on relevant topics, which improved my understanding.
12. The facilitator also gave me helpful feedback that highlighted my strengths and weaknesses related to the course goals.
13. The facilitator always provided feedback in a timely manner.

Social presence
1. Web-based or online communication serves as an outstanding platform for social engagement.
2. I found it comfortable to converse via the online platform.
3. I felt at ease participating in course discussions.
4. I was comfortable engaging with other course participants.
5. I was comfortable expressing disagreement with other participants while preserving a sense of trust.

Cognitive presence
1. Challenges presented heightened my interest in course topics.
2. Course activities sparked my curiosity.
3. I felt driven to investigate content-related inquiries.
4. Brainstorming and locating pertinent information assisted me in addressing content-related questions.
5. Online dialogues were beneficial in aiding my appreciation of diverse viewpoints.
6. Integrating new information helped me respond to questions posed in course activities.
7. Educational activities facilitated the construction of explanations or solutions.
8. Reflecting on course material and discussions enhanced my comprehension of foundational concepts in this class.
9. I am able to apply the knowledge gained in this course to my work or other activities unrelated to class.

3.2.2. Preliminary Study

The preliminary testing phase employed SPSS 26.0 for exploratory factor analysis to refine the survey by eliminating unclear or redundant items. The analysis established strict quality criteria: the model was required to explain at least 50% of variance, demonstrate statistical significance on Bartlett’s test (p < 0.05), achieve a Kaiser-Meyer-Olkin measure above 0.600, maintain factor loadings exceeding 0.300, and preserve eigenvalues of at least 1.00 (Barrett & Morgan, 2005; Hair et al., 2006; Pallant, 2011). Additionally, all items with Cronbach’s alpha coefficients below 0.700 were removed following analysis (Hair et al., 2010). This refinement process yielded a questionnaire with strong reliability and validity, with all retained items meeting the specified standards; detailed metrics for each scale are presented in Table 3.

Table 3. Preliminary study outcomes.

| Scale | Cronbach’s alpha | KMO | Bartlett’s test of sphericity (sig.) | Cumulative variance explained | Smallest item communality | Eigenvalue |
| --- | --- | --- | --- | --- | --- | --- |
| Teaching Presence | 0.914 | 0.889 | 0.000 | 61% | 0.440 | ≥1.00 |
| Agentic Engagement | 0.937 | 0.910 | 0.000 | 55% | 0.472 | ≥1.00 |
| Social Presence | 0.798 | 0.783 | 0.000 | 63% | 0.587 | ≥1.00 |
| Cognitive Presence | 0.880 | 0.868 | 0.000 | 63% | 0.475 | ≥1.00 |
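The item screening above was carried out in SPSS 26.0, but its core reliability criterion is straightforward to reproduce. The sketch below is a minimal illustration of computing Cronbach’s alpha for one subscale from an item-response matrix; the simulated responses and variable names are hypothetical placeholders, not the study’s pilot data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the subscale
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses on the 0-10 scale (rows = students, columns = items); pilot n = 76.
rng = np.random.default_rng(0)
latent = rng.normal(6, 2, size=(76, 1))                      # shared trait behind the items
scores = np.clip(latent + rng.normal(0, 1.5, (76, 4)), 0, 10)

alpha = cronbach_alpha(scores)
print(f"Cronbach's alpha = {alpha:.3f}")   # subscales below 0.700 were flagged for removal
```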

3.3. Data Gathering and Data Examination

In May 2024, following detailed instructor guidance, the study administered surveys via the Wenjuanxing platform to all enrolled students. From 406 collected responses, 380 valid questionnaires remained after quality screening, an effective response rate of 93.6%. The analysis applied Arbaugh’s (2007) and Garrison et al.’s (2010) theoretical frameworks while accounting for participants’ age and blended learning experience, given these factors’ potential influence on perceptions of the presence constructs.

The research team implemented multiple safeguards against common method bias. Using self-report scales for cognitive data collection helped reduce measurement errors, while maintaining respondent anonymity and randomizing question order minimized response patterns. Table 4 data confirms the model’s strong validity, showing no significant bias effects. Harman’s single-factor test (Podsakoff & Organ, 1986) further validated this-exploratory analysis of all 29 items revealed the first factor explained only 46% of variance, well below the 50% threshold (Podsakoff & Organ, 1986; Hair et al., 1998), demonstrating minimal bias impact on findings.
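Harman’s single-factor test amounts to extracting a single unrotated factor from all items and checking whether it explains less than half of the total variance. The sketch below uses the first principal component as a stand-in for the unrotated factor and randomly generated placeholder data instead of the actual 29-item responses; the study’s own test was run in SPSS.

```python
import numpy as np
from sklearn.decomposition import PCA

def harman_single_factor(item_matrix: np.ndarray) -> float:
    """Share of total variance captured by the first unrotated component."""
    X = np.asarray(item_matrix, dtype=float)
    X = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)   # standardize items
    pca = PCA(n_components=1).fit(X)
    return float(pca.explained_variance_ratio_[0])

# Hypothetical (380 respondents x 29 items) score matrix
responses = np.random.default_rng(1).normal(size=(380, 29))
share = harman_single_factor(responses)
print(f"First factor explains {share:.1%} of variance "
      f"({'acceptable' if share < 0.50 else 'possible common method bias'})")
```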

Table 4. Measurement model results.

| Variable | Cronbach’s alpha | rho_A | Composite reliability | Average variance extracted (AVE) |
| --- | --- | --- | --- | --- |
| Teaching Presence | 0.893 | 0.898 | 0.914 | 0.572 |
| Agentic Engagement | 0.932 | 0.929 | 0.938 | 0.578 |
| Social Presence | 0.799 | 0.811 | 0.869 | 0.623 |
| Cognitive Presence | 0.847 | 0.851 | 0.887 | 0.567 |

This study employs Partial Least Squares Structural Equation Modeling (PLS-SEM) for data analysis due to its unique strengths: building new theories, making predictions, testing complex multivariate models, and handling non-normal data distributions, particularly when standard statistical assumptions don’t apply. These capabilities perfectly match our research needs. Using SmartPLS 4.0 software, we validated both measurement and structural models through PLS techniques within the SEM framework (Hair et al., 2017). For hypothesis testing, standard PLS algorithms combined with 5000 bootstrap iterations assessed statistical significance, following Hair et al.’s (2011) recommended practices.
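The estimation itself was performed in SmartPLS 4.0, but the logic of the 5000-sample bootstrap can be sketched with a simplified stand-in: resample cases with replacement, re-estimate a path each time, and form a t-statistic from the resampling distribution. The example below uses an ordinary least-squares slope on standardized scores as the path estimator purely for illustration; it is not the SmartPLS algorithm, and all names and data are hypothetical.

```python
import numpy as np

def bootstrap_path(x: np.ndarray, y: np.ndarray, n_boot: int = 5000, seed: int = 0):
    """Bootstrap a standardized path coefficient (OLS slope on z-scores)."""
    rng = np.random.default_rng(seed)
    zx = (x - x.mean()) / x.std(ddof=1)
    zy = (y - y.mean()) / y.std(ddof=1)
    full_beta = float(np.polyfit(zx, zy, 1)[0])        # original-sample estimate
    boots = np.empty(n_boot)
    n = len(x)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)               # resample cases with replacement
        boots[b] = np.polyfit(zx[idx], zy[idx], 1)[0]
    se = boots.std(ddof=1)
    t_stat = full_beta / se                            # compare with 1.645 / 2.327 / 3.092 (one-tailed)
    return full_beta, se, t_stat

# Hypothetical construct scores for 380 students
rng = np.random.default_rng(2)
tp = rng.normal(size=380)
ae = 0.7 * tp + rng.normal(scale=0.7, size=380)
print(bootstrap_path(tp, ae))
```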

4. Results

4.1. Measurement Model Results

Following Hair et al.’s (2017) guidelines, this study implemented a two-step validation strategy. The first step confirmed the measurement model’s reliability and validity by checking factor loadings, ideally above 0.700 (Hair et al., 2014), although items loading between 0.400 and 0.700 were retained if they improved overall consistency; composite reliability scores exceeding 0.700 (Gefen et al., 2000); and average variance extracted (AVE) values above 0.500 (Fornell & Larcker, 1981). Items loading below 0.400 were removed per standard practice. After removing low-loading items and making other necessary refinements, the final model met all of the quality standards listed above.
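Composite reliability and AVE follow directly from the standardized factor loadings, so the thresholds above can be checked with the standard formulas (Fornell & Larcker, 1981). Below is a minimal sketch with made-up placeholder loadings rather than the study’s actual estimates.

```python
import numpy as np

def composite_reliability(loadings) -> float:
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    lam = np.asarray(loadings, dtype=float)
    errors = 1.0 - lam ** 2            # error variance of each standardized indicator
    return float(lam.sum() ** 2 / (lam.sum() ** 2 + errors.sum()))

def average_variance_extracted(loadings) -> float:
    """AVE = mean of the squared standardized loadings."""
    lam = np.asarray(loadings, dtype=float)
    return float((lam ** 2).mean())

# Placeholder loadings for one construct; thresholds: CR > 0.700, AVE > 0.500
loadings = [0.78, 0.74, 0.81, 0.69, 0.72]
print(f"CR  = {composite_reliability(loadings):.3f}")
print(f"AVE = {average_variance_extracted(loadings):.3f}")
```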

4.2. Discriminant Validity Results

To check if our model truly distinguishes between different concepts, the study used Henseler et al.’s (2016) HTMT method, a modern approach for testing measurement quality. Following Kline’s (2011) standards, this research confirmed clear separation between constructs since all HTMT scores (ranging 0.746 - 0.864) stayed safely below the 0.900 cutoff. These results give us confidence that our model reliably measures what it claims to measure, meeting all key validity benchmarks.
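The HTMT ratio compares the average correlation between items of two different constructs with the average correlations among items within each construct; values below 0.900 indicate adequate discriminant validity. A minimal sketch, assuming an item correlation matrix and index lists marking which items belong to which construct (all data and names are hypothetical):

```python
import numpy as np

def htmt(corr: np.ndarray, items_a: list, items_b: list) -> float:
    """Heterotrait-monotrait ratio for two constructs, given an item correlation matrix."""
    R = np.abs(np.asarray(corr, dtype=float))
    hetero = R[np.ix_(items_a, items_b)].mean()          # between-construct item correlations

    def mean_within(items):
        sub = R[np.ix_(items, items)]
        iu = np.triu_indices(len(items), k=1)            # unique within-construct pairs
        return sub[iu].mean()

    return float(hetero / np.sqrt(mean_within(items_a) * mean_within(items_b)))

# Hypothetical scores: 5 teaching-presence items and 3 social-presence items for 380 students
rng = np.random.default_rng(3)
f1 = rng.normal(size=(380, 1))
f2 = 0.5 * f1 + rng.normal(size=(380, 1))
items = np.hstack([f1 + 0.6 * rng.normal(size=(380, 5)),
                   f2 + 0.6 * rng.normal(size=(380, 3))])
R = np.corrcoef(items, rowvar=False)
print(f"HTMT(TP, SP) = {htmt(R, list(range(5)), list(range(5, 8))):.3f}")  # should fall below 0.900
```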

4.3. Structural Model Results

In the first stage of the structural model assessment, the study checked for multicollinearity among the predictor constructs. Variance inflation factor (VIF) values ranged from 1.522 to 3.274, above the 1.000 baseline but well below the more conservative 5.000 threshold, indicating that multicollinearity is not a concern and that the model can be trusted under Hair et al.’s (2017) guidelines. To test the hypothesized paths, the study estimated path coefficients (β), t-values, p-values, and confidence intervals with the bootstrapping procedure recommended by Hair et al. (2017), applying one-tailed critical values of t > 1.645 (p < 0.05), t > 2.327 (p < 0.01), and t > 3.092 (p < 0.001). As shown in Table 5 and Figure 2, all paths were statistically significant, providing empirical support for all six proposed hypotheses (H1-H6); the patterns observed in the data are consistent with the theoretical assumptions of the research model.
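As an aside, the VIF screening step can be reproduced on latent-variable scores exported from the PLS model. The sketch below uses simulated placeholder scores and statsmodels; the variable names are illustrative, and the thresholds mirror those applied above.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Hypothetical latent-variable scores for the predictors of cognitive presence
rng = np.random.default_rng(4)
tp = rng.normal(size=380)
ae = 0.70 * tp + rng.normal(scale=0.7, size=380)
sp = 0.55 * ae + rng.normal(scale=0.8, size=380)
scores = pd.DataFrame({"TP": tp, "AE": ae, "SP": sp})

X = sm.add_constant(scores)              # include an intercept so VIFs are not inflated
vifs = {col: variance_inflation_factor(X.values, i)
        for i, col in enumerate(X.columns) if col != "const"}
print(vifs)                              # values between 1.000 and 5.000 indicate no collinearity concern
```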

Table 5. Structural model results (n = 380).

| Path | Standardized path coefficient | Sample mean (M) | Standard deviation | t-statistics | p-values | Findings |
| --- | --- | --- | --- | --- | --- | --- |
| TP -> AE | 0.696*** | 0.698 | 0.027 | 25.550 | 0.000 | H1 validated |
| AE -> SP | 0.547*** | 0.547 | 0.044 | 12.343 | 0.000 | H2 validated |
| AE -> CP | 0.324*** | 0.325 | 0.044 | 7.341 | 0.000 | H3 validated |
| TP -> AE -> SP | 0.381*** | 0.382 | 0.036 | 10.708 | 0.000 | H4 validated |
| TP -> AE -> CP | 0.226*** | 0.226 | 0.029 | 7.748 | 0.000 | H5 validated |
| TP -> AE -> SP -> CP | 0.099*** | 0.099 | 0.018 | 5.515 | 0.000 | H6 validated |

*p < 0.05, t > 1.645; **p < 0.01, t > 2.327; ***p < 0.001, t > 3.092 (one tailed).

Figure 2. Partial least squares path modeling of relationship coefficients and explained variance (R2) (n = 380).

4.4. R2 and Q2 Results

The coefficient of determination (R2) evaluates predictive validity by indicating the proportion of variance in an endogenous construct that the model explains (Hair et al., 2016). Values are greater than zero, and higher values correspond to stronger explanatory power: roughly 0.750 indicates substantial, 0.500 moderate, and 0.250 weak explanatory ability. In this study, R2 reached 0.480 for agentic engagement, 0.600 for social presence, and 0.678 for cognitive presence (see Figure 2), indicating reasonable explanatory power across the model’s dimensions and consistency with conventional evaluation benchmarks.

Stone’s (1974) work established Q2 as a standard measure of predictive relevance, later reinforced by Henseler and Fassott (2009), who demonstrated its role in evaluating PLS models. This blindfolding technique tests whether the model’s predictor constructs actually predict the endogenous outcomes, with values above 0.35 indicating large predictive relevance, 0.15 medium, and 0.02 small (Hair et al., 2011). Q2 reached 0.273 for agentic engagement (AE), 0.366 for social presence (SP), and 0.380 for cognitive presence (CP), indicating that the model has satisfactory predictive relevance.

5. Discussion

In college English education, blended learning has become a key focus, prompting researchers to examine its theoretical foundations and student impacts. While the Community of Inquiry framework effectively addresses teaching, social, and cognitive aspects of blended environments, studies have largely overlooked individual learner differences within this model. Early attempts to bridge this gap focused narrowly on online settings, theoretical modeling, or specific disciplines (Lan et al., 2018a, 2020; Wu & Chen, 2017; Xie et al., 2024), leaving blended language learning contexts underexplored. While external environmental factors clearly help build quality learning communities (Chen & Sang, 2018), mastering English requires sustained personal effort-making individual factors like student agency crucial for success (He & Huang, 2023; Guo & Liu, 2016). This study therefore expands the Community of Inquiry framework by adding learner agency, using empirical data to reveal how students actively contribute to blended English learning communities. Specifically, the study examines how proactive student behaviors interact with the framework’s three core dimensions (teaching, social, and cognitive presence).

The findings from this study emphasize the significant role played by student initiative within inquiry-based learning communities. Recognizing learners’ active involvement not only enhances the Community of Inquiry framework but also strengthens its explanatory capabilities, that is to say, it improves our understanding of collaborative learning processes. Results demonstrate that educational design fundamentally shapes inquiry environments (Garrison & Akyol, 2013) while reinforcing community connections, as seen in foundational studies by Garrison and colleagues (Garrison & Archer, 2007). For instance, research by Jiang and Zhang (2021) reveals that certain learning approaches, particularly two distinct English-learning methods, significantly boost proactive participation. Subsequent investigations by Ye (2023) and Min (2022) further illustrate how both patterns of student involvement and variations in instructor communication styles directly influence proactive engagement in blended classroom settings. This research confirms instructors’ central role in blended learning contexts, showing how thoughtfully designed courses and teaching strategies empower students to actively participate in knowledge construction. These insights not only expand existing theoretical frameworks but also transform our comprehension of collaborative learning dynamics. The study particularly highlights how students’ proactive behaviors strengthen peer relationships in blended environments, mirroring findings from online learning contexts where active participation enhances social connectedness (Miao & Ma, 2022). The analysis reveals a strong positive correlation between students’ self-driven participation and peer interactions in blended learning, broadening our understanding of how learner initiative shapes social dynamics in collaborative settings. Importantly, the findings also indicate that proactive behaviors directly improve critical thinking abilities, aligning with previous studies showing individual initiative enhances advanced learning outcomes (Lan et al., 2018a, 2020). These results suggest that student-led contributions equally benefit academic achievements across different blended learning designs, to put it simply, confirming the universal value of fostering agency in modern educational environments.

Existing research suggests that active student participation serves as a bridge between teaching methods, peer collaboration, and critical analysis skills; that is, this proactive behavior is an important connector that transforms educational input into advanced learning outcomes through classroom interaction (Kucuk & Richardson, 2019; Reeve, 2013). Teacher support promotes this cyclical process, facilitates communication among peers (Garrison & Akyol, 2013; Joo et al., 2013), and enhances analytical skills through learners’ participation and contributions in blended learning environments (Gao & He, 2020). This study addresses an important research gap by exploring the mechanisms linking student-driven behavior in blended learning with the three basic components of the Community of Inquiry framework. The findings emphasize learners’ positive role in shaping open dialogue, knowledge-building processes, and collaboration across learning groups. By improving course design, cultivating independent English learning behavior, and refining joint inquiry methods, the study proposes practical strategies for optimizing blended learning outcomes while providing educators with concrete tools to enhance participation and learning in blended classroom environments.

Improving quality remains essential now that blended learning underpins the higher education English curriculum (Ellis et al., 2016; Han & Ellis, 2019). This study clarifies the central role of students’ agentic engagement within the Community of Inquiry framework. The results suggest that teachers should flexibly design space for guiding students, combining structured teaching, community interaction, and independent learning opportunities to create an “autonomous but supportive” learning experience. Effective blended learning means creating a community (Jia & Gao, 2023; Gao & He, 2020) that supports students in thinking independently, developing self-directed learning methods, staying motivated, and engaging in higher-order critical thinking. Teachers can strengthen peer connections through interactive activities such as online discussions, in-person topic sharing, peer reviews, and post-class reflections (Shea & Bidjerano, 2010), approaches that simultaneously build critical thinking skills. Students, in turn, should actively engage in classroom interactions and collaborate with instructors to create more supportive learning environments. This two-way participation makes blended learning both more effective and sustainable, ultimately leading to better educational outcomes.

6. Limitation and Implication

While this study advances research by incorporating agentic engagement into the Community of Inquiry framework and examining its relationships with the core presences, its limitations should be acknowledged. The current conceptualization of agentic engagement primarily focuses on students’ positive individual states. Future research should develop a more comprehensive learning presence model that accounts for both positive and negative learning states. Incorporating assessments of negative states (Sharma & Sarkar, 2020; Sumarsono et al., 2021) would enable researchers to better understand the nuanced relationships between learning presence and other Community of Inquiry elements.

Secondly, the participant pool was limited to undergraduate students currently enrolled in college, excluding other important groups engaged in blended English learning. Research shows significant differences in cognitive presence between undergraduate and graduate students (Garrison et al., 2010). In addition, the study collected only basic descriptive statistics on participants’ gender, age, and academic background and did not incorporate these factors into the model. Future studies should therefore examine more diverse learner populations and additional variables to draw more comprehensive conclusions about blended learning effectiveness.

A third limitation stems from relying solely on a single survey administered to students. The absence of multiple longitudinal assessments restricts causal inferences. While advanced structural modeling techniques provide valuable insights, future research should implement causal evaluations within the Community of Inquiry framework to better understand the dynamic interactions among its components.

Acknowledgements

This research was supported by several grants, including funding for the 2025 Zhejiang Provincial Educational Science Planning Project (Research on the Innovative Application of Generative Artificial Intelligence Technology in News Translation and Compilation Teaching, No. 2025SCG401), the First-class Undergraduate Course in Zhejiang Province, specifically for A Survey of Major English-speaking Countries (Project No. 689 in 2019), English Newspaper Reading I (Project No. 507 in 2020), and support from the Research Project focused on the Data-Driven Online-Offline Blended Teaching Model for ESP Courses (Project No. SXSJG202302).

Conflicts of Interest

The author declares no conflicts of interest regarding the publication of this paper.

References

[1] Aini, S. A., & Ciptaningrum, D. S. (2024). Exploring Students’ Engagement in Blended Learning. Britain International of Linguistics Arts and Education (BIoLAE) Journal, 6, 47-61.
[2] Al-Samarraie, H., & Saeed, N. (2018). A Systematic Review of Cloud Computing Tools for Collaborative Learning: Opportunities and Challenges to the Blended-Learning Environment. Computers & Education, 124, 77-91.
https://doi.org/10.1016/j.compedu.2018.05.016
[3] Arbaugh, B. (2007). An Empirical Verification of the Community of Inquiry Framework. Journal of Asynchronous Learning Networks, 11, 73-85.
https://doi.org/10.24059/olj.v11i1.1738
[4] Arbaugh, J. B., & Benbunan-Fich, R. (2006). An Investigation of Epistemological and Social Dimensions of Teaching in Online Learning Environments. Academy of Management Learning & Education, 5, 435-447.
https://doi.org/10.5465/amle.2006.23473204
[5] Barrett, K. C., & Morgan Jr., G. A. (2005). SPSS for Intermediate Statistics: Use and Interpretation. Psychology Press.
[6] Chen, M., & Sang, X. H. (2018). An Empirical Study on College Students’ Perception and Satisfaction with Blended Teaching Mode Reform Courses. Modern Distance Education, No. 5, 57-64.
[7] Dixson, M. D. (2015). Measuring Student Engagement in the Online Course: The Online Student Engagement Scale (OSE). Online Learning, 19, 1-15.
https://doi.org/10.24059/olj.v19i4.561
[8] Ellis, R. A., Pardo, A., & Han, F. (2016). Quality in Blended Learning Environments—Significant Differences in How Students Approach Learning Collaborations. Computers & Education, 102, 90-102.
https://doi.org/10.1016/j.compedu.2016.07.006
[9] Fornell, C., & Larcker, D. F. (1981). Structural Equation Models with Unobservable Variables and Measurement Error: Algebra and Statistics. Journal of Marketing Research, 18, 382-388.
https://doi.org/10.1177/002224378101800313
[10] Gao, F., & He, Q. (2020). A Study on the Relationship between Agency Engagement, Affective Engagement, and Academic Achievement among Non-English Major Students at Teacher Education Institutions: A Case Study of Shaanxi Normal University. Journal of Chengdu Normal University, No. 10, 14-20.
[11] Garrison, D. R., & Akyol, Z. (2013). The Community of Inquiry Theoretical Framework. In M. G. Moore (Ed.), Handbook of Distance Education (pp. 104-119). Routledge.
[12] Garrison, D. R., & Arbaugh, J. B. (2007). Researching the Community of Inquiry Framework: Review, Issues, and Future Directions. The Internet and Higher Education, 10, 157-172.
https://doi.org/10.1016/j.iheduc.2007.04.001
[13] Garrison, D. R., & Cleveland-Innes, M. (2005). Facilitating Cognitive Presence in Online Learning: Interaction Is Not Enough. American Journal of Distance Education, 19, 133-148.
https://doi.org/10.1207/s15389286ajde1903_2
[14] Garrison, D. R., Anderson, T., & Archer, W. (1999). Critical Inquiry in a Text-Based Environment: Computer Conferencing in Higher Education. The Internet and Higher Education, 2, 87-105.
https://doi.org/10.1016/s1096-7516(00)00016-6
[15] Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical Inquiry in a Text-Based Environment: Computer Conferencing in Higher Education. The Internet and Higher Education, 2, 87-105.
https://doi.org/10.1016/s1096-7516(00)00016-6
[16] Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical Thinking, Cognitive Presence, and Computer Conferencing in Distance Education. American Journal of Distance Education, 15, 7-23.
https://doi.org/10.1080/08923640109527071
[17] Garrison, D. R., Cleveland-Innes, M., & Fung, T. S. (2010). Exploring Causal Relationships among Teaching, Cognitive and Social Presence: Student Perceptions of the Community of Inquiry Framework. The Internet and Higher Education, 13, 31-36.
https://doi.org/10.1016/j.iheduc.2009.10.002
[18] Gefen, D., Straub, D., & Boudreau, M. (2000). Structural Equation Modeling and Regression: Guidelines for Research Practice. Communications of the Association for Information Systems, 4, 1-79.
https://doi.org/10.17705/1cais.00407
[19] Gong, Ch. H., Li, Q., & Gong, Y. (2018). Issues on Learning Engagement in Smart Learning Environment. e-Education Research, No. 6, 83-89.
[20] Guo, J. D., & Li, Y. (2018). Development and Validation of an Instrument for Measuring Learner Agency in Foreign Language Learning. Foreign Language Education, No. 5, 66-69.
[21] Guo, J. D., & Liu, L. (2016). Devotion to FL Learning: Connotation, Structure and Research Perspective. Journal of Jiangxi Normal University (Philosophy and Social Sciences Edition), No. 6, 181-185.
[22] Hair Jr, J., Sarstedt, M., Hopkins, L., & Kuppelwieser, G. V. (2014). Partial Least Squares Structural Equation Modeling (PLS-SEM): An Emerging Tool in Business Research. European Business Review, 26, 106-121.
https://doi.org/10.1108/ebr-10-2013-0128
[23] Hair, J. F., Anderson, R. E., & Babin, B. J. (2010). Multivariate Data Analysis: A Global Perspective (7th ed.). Pearson Prentice Hall.
[24] Hair, J. F., Anderson, R. E., Tatham, R. L., & Black, W. C. (1998). Multivariate Data Analysis (5th ed.). Prentice Hall.
[25] Hair, J. F., Hult, G. T. M., & Ringle, C. M. (2016). A Primer on Partial Least Squares Structural Equation Modeling (PLS-SEM). Sage Publications.
[26] Hair, J. F., Hult, G. T. M., Ringle, C. M., & Sarstedt, M. (2017). A Primer on Partial Least Squares Structural Equation Modelling (PLS-SEM) (2nd ed.). SAGE Publication.
[27] Hair, J. F., Ringle, C. M., & Sarstedt, M. (2011). PLS-SEM: Indeed a Silver Bullet. Journal of Marketing Theory and Practice, 19, 139-152.
https://doi.org/10.2753/mtp1069-6679190202
[28] Hair, J. S., Black, W. C., Babin, B. J., Anderson, R. E., & Tatham, R. L. (2006). Multivariate Data Analysis. Prentice-Hall.
[29] Han, F., & Ellis, R. A. (2019). Identifying Consistent Patterns of Quality Learning Discussions in Blended Learning. The Internet and Higher Education, 40, 12-19.
https://doi.org/10.1016/j.iheduc.2018.09.002
[30] He, Y. Y., & Huang, Y. Y. (2023). The Relationship between Metacognition and the Community of Inquiry Model: An Exploration Based on Blended English Teaching Practices at the University Level. Journal of Kunming Metallurgy College, No. 1, 78-86.
[31] Henseler, J., & Fassott, G. (2009). Testing Moderating Effects in PLS Path Models: An Illustration of Available Procedures. In V. Esposito Vinzi, W. W. Chin, J. Henseler, & H. Wang (Eds.), Handbook of Partial Least Squares: Concept Methods and Applications (pp. 713-735). Springer.
https://doi.org/10.1007/978-3-540-32827-8_31
[32] Henseler, J., Ringle, C. M., & Sarstedt, M. (2016). Testing Measurement Invariance of Composites Using Partial Least Squares. International Marketing Review, 33, 405-431.
https://doi.org/10.1108/imr-09-2014-0304
[33] Hu, S., & Kuh, G. D. (2002). Being (Dis)engaged in Educationally Purposeful Activities: The Influences of Student and Institutional Characteristics. Research in Higher Education, 43, 555-575.
https://doi.org/10.1023/a:1020114231387
[34] Jia, W., & Gao, X. Y. (2023). Research on the University English Blended Learning Com-munity Model: Analysis of the Mediating Effects of Affective Presence and Social Presence. Foreign Language World, No. 4, 82-90.
[35] Jiang, A. L., & Zhang, L. J. (2021). University Teachers’ Teaching Style and Their Students’ Agentic Engagement in EFL Learning in China: A Self-Determination Theory and Achievement Goal Theory Integrated Perspective. Frontiers in Psychology, 12, Article ID: 704269.
https://doi.org/10.3389/fpsyg.2021.704269
[36] Joo, Y. J., Joung, S., & Kim, E. K. (2013). Structural Relationships among e-Learners’ Sense of Presence, Usage, Flow, Satisfaction, and Persistence. Educational Technology & Society, 16, 310-324.
[37] Joo, Y. J., Lim, K. Y., & Kim, E. K. (2011). Online University Students’ Satisfaction and Persistence: Examining Perceived Level of Presence, Usefulness and Ease of Use as Predictors in a Structural Model. Computers & Education, 57, 1654-1664.
https://doi.org/10.1016/j.compedu.2011.02.008
[38] Kanuka, H., & Garrison, D. R. (2004). Cognitive Presence in Online Learning. Journal of Computing in Higher Education, 15, 21-39.
https://doi.org/10.1007/bf02940928
[39] Kline, R. B. (2011). Principles and Practice of Structural Equation Modelling. Guilford Press.
[40] Krpanec, E., Popović, D., & Babarović, T. (2024). How Can Teachers Encourage Students’ Agentic Engagement? The Role of Autonomy-Supportive Teaching and Students’ Autonomous Motivation.
https://www.researchgate.net/publication/374949745
[41] Kucuk, S., & Richardson, J. C. (2019). A Structural Equation Model of Predictors of Online Learners’ Engagement and Satisfaction. Online Learning, 23, 196-216.
https://doi.org/10.24059/olj.v23i2.1455
[42] Lan, G. H., Zhong, Q. J., Lv, C. J., Song, Y. T., & Wei, J. C. (2018b). Construction of a Chinese Version of the Community of Inquiry Measurement Instrument. Open Edu-cation Research, No. 3, 68-76.
[43] Lan, G. Sh., Zhong, Q. J., Guo, Q., & Kong, X. K. (2020). Research on the Relationship between Self-Efficacy, Self-Regulated Learning and Community of Inquiry Model-Based on Blended Learning in Online Learning Space. China Educational Technology, No. 12, 44-54.
[44] Lan, G. Sh., Zhong, Q. J., Lv, C. J., & Song, Y. T. (2018a). Exploring Relationships between Learning Presence and Community of Inquiry Model. Open Education Research, No. 5, 92-107.
[45] Leung, S. (2011). A Comparison of Psychometric Properties and Normality in 4-, 5-, 6-, and 11-Point Likert Scales. Journal of Social Service Research, 37, 412-421.
https://doi.org/10.1080/01488376.2011.580697
[46] Li, H. Y. (2021). A Study on English Learning Engagement of Non-English Major College Students in a Blended Learning Environment. Chongqing Normal University.
[47] Littler, M. (2024). Social, Cognitive, and Teaching Presence as Predictors of Online Stu-dent Engagement among MSN Students.
[48] Liu, X. H., & Guo, J. D. (2021). Teacher Support, Interaction Engagement and Learning Enjoyment in On-Line EFL Teaching. Foreign Languages bimonthly, No. 5, 34-42+160.
[49] Lu, D., Jie, Y. G., Tang, Y. W., & Yang, X. (2018). Research on a New Blended Learning Mode Oriented by Critical Thinking-Taking English Writing Teaching as an Example. China Educational Technology, No. 6, 135-140.
[50] Ma, J., Han, X., Yang, J., & Cheng, J. (2015). Examining the Necessary Condition for Engagement in an Online Learning Environment Based on Learning Analytics Approach: The Role of the Instructor. The Internet and Higher Education, 24, 26-34.
https://doi.org/10.1016/j.iheduc.2014.09.005
[51] Meyer, K. A. (2014). Student Engagement in Online Learning: What Works and Why. ASHE Higher Education Report, 40, 1-114.
https://doi.org/10.1002/aehe.20018
[52] Miao, J., & Ma, L. (2022). Students’ Online Interaction, Self-Regulation, and Learning Engagement in Higher Education: The Importance of Social Presence to Online Learning. Frontiers in Psychology, 13, Article ID: 815220.
https://doi.org/10.3389/fpsyg.2022.815220
[53] Miao, J., Chang, J., & Ma, L. (2022). Teacher-Student Interaction, Student-Student Interaction and Social Presence: Their Impacts on Learning Engagement in Online Learning Environments. The Journal of Genetic Psychology, 183, 514-526.
https://doi.org/10.1080/00221325.2022.2094211
[54] Min, Y. (2022). A Study on the Impact of High School English Teachers’ Discourse on Students’ Learner Agency Engagement in English Learning. Hunan University.
[55] Montenegro, A. (2017). Understanding the Concept of Agentic Engagement. Colombian Applied Linguistics Journal, 19, 117-128.
https://doi.org/10.14483/calj.v19n1.10472
[56] Pallant, J. (2011). SPSS Survival Manual: A Step by Step Guide to Data Analysis Using SPSS (4th ed.). Allen and Unwin.
[57] Podsakoff, P. M., & Organ, D. W. (1986). Self-Reports in Organizational Research: Problems and Prospects. Journal of Management, 12, 531-544.
https://doi.org/10.1177/014920638601200408
[58] Qian, J. (2022). Blended Learning Attitudes, Engagement, and Effectiveness among Chinese EFL College Students. Education, 6, 138-145.
[59] Qiao, H. J. (2017). Research on Blended English Learning Model Based on Inquiry Society System. Technology Enhanced Foreign Language, No. 4, 43-48.
[60] Reeve, J. (2013). How Students Create Motivationally Supportive Learning Environments for Themselves: The Concept of Agentic Engagement. Journal of Educational Psychology, 105, 579-595.
https://doi.org/10.1037/a0032690
[61] Reeve, J., & Jang, H. (2022). Agentic Engagement. In Handbook of Research on Student Engagement (pp. 95-107). Springer International Publishing.
https://doi.org/10.1007/978-3-031-07853-8_5
[62] Reeve, J., & Tseng, C. (2011). Agency as a Fourth Aspect of Students’ Engagement during Learning Activities. Contemporary Educational Psychology, 36, 257-267.
https://doi.org/10.1016/j.cedpsych.2011.05.002
[63] Richardson, J. C., & Newby, T. (2006). The Role of Students’ Cognitive Engagement in Online Learning. American Journal of Distance Education, 20, 23-37.
https://doi.org/10.1207/s15389286ajde2001_3
[64] Richardson, J. T. E., & Long, G. L. (2003). Academic Engagement and Perceptions of Quality in Distance Education. Open Learning: The Journal of Open, Distance and e-Learning, 18, 223-244.
https://doi.org/10.1080/0268051032000131008
[65] Sharma, S., & Sarkar, P. (2020). Efficiency of Blended Learning in Reduction of Anxiety: With Special Reference to High School Students. International Journal of Grid and Distributed Computing, 13, 277-285.
[66] Shea, P., & Bidjerano, T. (2010). Learning Presence: Towards a Theory of Self-Efficacy, Self-Regulation, and the Development of a Communities of Inquiry in Online and Blended Learning Environments. Computers & Education, 55, 1721-1731.
https://doi.org/10.1016/j.compedu.2010.07.017
[67] Shen, Y., & Sheng, Y. D. (2015). College English Flipped Classroom Teaching Based on Inquiry Community System. Foreign Language World, No. 4, 81-89.
[68] Shi, X. W. (2023). An Investigation into Learner Agency Engagement in English Learning among High School Students. Yunnan Normal University.
[69] Skinner, E. A., Kindermann, T. A., & Furrer, C. J. (2009). A Motivational Perspective on Engagement and Disaffection: Conceptualization and Assessment of Children’s Behavioral and Emotional Participation in Academic Activities in the Classroom. Educational and Psychological Measurement, 69, 493-525.
https://doi.org/10.1177/0013164408323233
[70] Stenbom, S., Jansson, M., & Hulkko, A. (2016). Revising the Community of Inquiry Framework for the Analysis of One-to-One Online Learning Relationships. The International Review of Research in Open and Distributed Learning, 17, 36-53.
https://doi.org/10.19173/irrodl.v17i3.2068
[71] Stone, M. (1974). Cross-Validatory Choice and Assessment of Statistical Predictions. Journal of the Royal Statistical Society Series B: Statistical Methodology, 36, 111-133.
https://doi.org/10.1111/j.2517-6161.1974.tb00994.x
[72] Sumarsono, D., Haryadi, H., & Bagis, A. K. (2021). When Blended Learning Is Forced in the amid of Covid-19: What Happen on EFL Learners’ Speaking Anxiety? Journal of Languages and Language Teaching, 9, 305-315.
https://doi.org/10.33394/jollt.v9i3.3906
[73] Sun, Z., & Yang, Y. (2023). The Mediating Role of Learner Empowerment in the Relationship between the Community of Inquiry and Online Learning Outcomes. The Internet and Higher Education, 58, Article ID: 100911.
https://doi.org/10.1016/j.iheduc.2023.100911
[74] Tay, H. Y. (2016). Investigating Engagement in a Blended Learning Course. Cogent Education, 3, Article ID: 1135772.
https://doi.org/10.1080/2331186x.2015.1135772
[75] Vaughan, N., & Garrison, D. R. (2005). Creating Cognitive Presence in a Blended Faculty Development Community. The Internet and Higher Education, 8, 1-12.
https://doi.org/10.1016/j.iheduc.2004.11.001
[76] Vaughn Gulo, E. (2017). University Students’ Attitudes as Measured by the Semantic Differential. The Journal of Educational Research, 60, 152-158.
https://doi.org/10.1080/00220671.1966.10883462
[77] Wakefield, C. R. (2016). Agentic Engagement, Teacher Support, and Classmate Related-ness—A Reciprocal Path to Student Achievement.
[78] Wang, Y. K., & Liu, B. (2019). A Practical Study of Online-Offline Mixed Teaching of Academic English for Postgraduates. Foreign Languages and Their Teaching, No. 5, 10-19.
[79] Wu, X. E., & Chen, X. H. (2017). On the Effects of Presence on Online Learning Performance. Modern Distance Education, No. 8, 66-73.
[80] Wu, Y. J. (2017). Research on Influential Factors and Its Measurement of Learners’ Online Deep Learning. e-Education Research, 38, 57-63.
[81] Xie, Y. T., Xiao, Y. H., & Zhong, Y. Y. (2024). The Relationship between Preservice Early Childhood Educators’ Professional Identity and Online Self-Regulated Learning: A Study Based on the Theory of Communities of Inquiry. Journal of Shaanxi Xueqian Normal University, No. 9, 68-75.
[82] Yang, G., & Dai, Ch. H. (2021). The Analysis of the Dimensions and Influencing Paths of University Students’ Online English Learning Engagement. Foreign Languages and Their Teaching, No. 4, 113-123+150-151.
[83] Yang, Y. T. (2023). A Retrospective Study on the Changes in Agency Engagement of Chinese Second Language Learners from the Perspective of Complex Dynamic Systems Theory. Nanchang: Nanchang University.
[84] Ye, H. Q. (2023). A Study on the Relationship between High School Students’ Perceived Teacher Engagement, Learner Agency Engagement, and English Achievement. Guangdong Polytechnic Normal University.

Copyright © 2025 by authors and Scientific Research Publishing Inc.


This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.