Exploring the Link between ChatGPT Utilization, Academic Resilience, and Academic Achievement of STEM Students
1. Introduction
The rapid evolution of artificial intelligence (AI) has sparked critical discourse on its implications across various sectors, particularly in education. As AI tools become increasingly accessible, their role in reshaping traditional pedagogical practices demands closer examination. In recent years, AI has taken center stage in educational reform, gradually integrating into teaching and learning processes. Among the most advanced AI applications, ChatGPT has emerged as a leading tool capable of generating high-quality prose within seconds [1]. Its unique capabilities have prompted predictions about its impact on student assessment, academic performance, and broader educational structures. While ChatGPT offers the potential to enhance learning experiences, educators are challenged to adapt their instructional approaches and evaluation strategies in light of the changing demands of living, working, and learning in an AI-driven era [2]. However, the benefits of ChatGPT are counterbalanced by concerns about its misuse. One study warns that the unethical use of ChatGPT may result in diminished learning and cognitive development among students, advising educators to avoid assigning theory-based take-home tasks that are vulnerable to AI-assisted plagiarism [3].
ChatGPT, launched in November 2022, is an open-access AI model renowned for its capacity to comprehend and generate natural language at a human-like level [4] [5]. Its proficiency in offering concise explanations and structured responses makes it a powerful tool for academic communication [6] [7]. The rapid adoption of ChatGPT across educational settings reflects a broader trend in which students integrate AI tools into their academic routines [8]. This uptake has sparked a dual response from educators: curiosity about its pedagogical potential and concern about academic integrity and ethical implications [9]. Concerns over plagiarism and the authenticity of student work have led to ongoing debates about the responsible use of generative AI in academic environments [1]. Despite these concerns, ChatGPT has shown promise in enhancing student productivity and comprehension. Studies report that users benefit from its support in concept clarification, text translation, and question formulation, which enhances their understanding of subject matter [10]. Students generally perceive ChatGPT as a valuable tool with applications across educational, social, and personal contexts [11] [12]. However, the widespread use of ChatGPT also necessitates clear institutional guidelines to ensure the responsible and effective integration of this technology into the learning process [13].
Academic resilience, defined as a psychological construct enabling individuals to adapt positively to academic challenges [14], is often shaped by socioeconomic, emotional, and institutional factors [15]. In the educational context, resilience refers to a student’s ability to manage academic pressure, setbacks, and adversity while maintaining motivation and academic performance. Studies suggest that interventions aimed at fostering self-esteem, cultivating a sense of school belonging, and prioritizing academic support can enhance students’ resilience [16]-[18]. Previous studies have also statistically associated resilience with academic performance [19]; for instance, student teachers who performed well academically placed greater value and importance on the subject matter. More recently, academic resilience has been studied as a mediating variable between psychological constructs and academic outcomes. Evidence has shown its mediating role between perfectionism and performance [20], as well as between various forms of social support and academic success [21]. As a protective mechanism, resilience can buffer students from the adverse effects of stress, enabling them to achieve favorable academic outcomes even in unfavorable conditions [22]-[24].
In the field of biology, large language models like ChatGPT have been evaluated for their potential to support specialized academic tasks. Research shows that such tools are effective in contributing to scholarly discussions, identifying research trends, and enhancing academic writing [25]. Moreover, a range of educational technologies, including computer-assisted instruction, simulation programs, AI-enhanced textbooks, and game-based curricula, have been employed to improve biology instruction, generally yielding positive results, with the notable exception of computer-assisted instruction [26] [27]. Although integrated technologies have shown promise for lower-achieving biology students, significant disparities persist in academic outcomes across different student groups [27]. These inconsistencies suggest a need to investigate not only the technological tools used in education but also the contextual and psychological factors, such as academic resilience, that mediate their effects. As ChatGPT continues to emerge as a widely adopted AI tool, its influence on learning outcomes in biology warrants empirical scrutiny, particularly when considering diverse student profiles. Recent research has begun to investigate the impact of ChatGPT on high school students. A survey of 70 students revealed that ChatGPT is widely regarded as a valuable tool for academic, social, and personal purposes, although students also expressed a need for clearer usage guidelines [28]. Some studies have reported positive effects of ChatGPT on student performance and engagement in specific subjects, such as electromagnetism [29]. Others have found improved academic results in control groups that did not use ChatGPT, raising questions about the conditions under which AI enhances learning [30]. Moreover, ethical concerns remain, particularly regarding ChatGPT’s potential to contribute to cognitive stagnation when used inappropriately [3].
The impact of ChatGPT on student learning appears to vary based on academic resilience (AR), which refers to the learner’s ability to achieve goals despite adverse conditions. A previous study highlighted that students with higher AR use ChatGPT to develop ideas and supplement their studies with other resources, whereas those with lower AR tend to rely on it as their primary source of information [31]. This divergence suggests that academic resilience moderates how students engage with generative AI tools and may influence the resulting academic outcomes. Given its significant role in educational outcomes, academic achievement remains a focal point of research. It encompasses both objective metrics, such as grades and test scores, and subjective dimensions, including student satisfaction and engagement [32]-[35]. As schools and policymakers strive to improve educational systems, identifying reliable predictors of academic achievement, such as resilience and the responsible use of AI, becomes essential [36]-[39].
Although existing literature indicates that ChatGPT can support teaching and learning, students’ perspectives on its use are often mixed. While many students are aware of ChatGPT, they seldom use it regularly for academic purposes, and skepticism remains about its long-term educational value without proper institutional guidance [40]. Interestingly, the academic resilience of students appears to influence how they approach and evaluate AI-generated content. Those with higher resilience verify information more rigorously and remain mindful of the ethical implications of relying on ChatGPT [31]. Although limited, emerging literature also suggests that AI has the potential to enhance emotional and community resilience [41] [42].
Furthermore, ChatGPT’s personalized and immediate responses may enhance students’ motivation by providing accessible academic support. This flexibility enables learners to pursue clarification and guidance beyond conventional classroom hours, potentially fostering academic confidence and persistence [43]. Other studies confirm that academic resilience significantly correlates with improved academic achievement, engagement, and well-being among students in diverse educational contexts [44]-[46]. Nonetheless, contradictory findings also exist, with some studies reporting no significant link between resilience and academic outcomes such as GPA [47].
Although the potential of generative AI in education is evident, empirical evidence on ChatGPT’s pedagogical value remains inconclusive, especially when considering variations in academic resilience. While several AI-driven tools have shown positive effects on biology education, inconsistencies in academic outcomes suggest that additional mediating factors need to be examined. Notably, the mediating influence of academic resilience on the relationship between ChatGPT use and academic performance has yet to be comprehensively studied. This study therefore examines students’ utilization of ChatGPT and investigates whether it is related to academic achievement in general biology and whether that relationship is mediated by academic resilience.
1.1. Conceptual Framework
Within the context of this study, which examines the relationship between ChatGPT utilization and academic achievement, with academic resilience as a mediating variable, academic resilience is conceptualized as a student’s capacity to maintain or regain high levels of academic functioning in the face of adversity. It is within these moments of academic difficulty that the role of ChatGPT may manifest differently, depending on how students engage with the tool. In this framework, ChatGPT utilization serves as the independent variable and is operationalized across three dimensions: knowledge (students’ awareness and understanding of ChatGPT’s educational applications), attitudes (their perceptions and dispositions toward its academic use), and practices (their actual behavioral engagement with the tool in educational contexts). These dimensions were examined concerning the dependent variable—students’ academic achievement in general biology. Academic resilience, serving as the mediating variable, was included to explore whether it influences or explains the nature and strength of the relationship between ChatGPT utilization and academic performance. This framework guided the study’s investigation into whether ChatGPT functions effectively as an academic support tool in biology education and whether students’ levels of academic resilience shape its impact on student achievement (See Figure 1).
Figure 1. Research paradigm.
1.2. Research Objectives
This study aimed to examine the multifaceted dimensions of ChatGPT utilization among students, specifically focusing on their level of knowledge regarding its educational applications, their attitudes toward the tool, and their actual practices in using it for academic purposes. In parallel, the research sought to assess students’ academic resilience by exploring three core dimensions: perseverance, reflective and adaptive help-seeking behaviors, and their negative affect and emotional responses. The study also described the students’ academic achievement in general biology, serving as the primary academic outcome. Beyond descriptive analysis, the research investigated the relationships among key variables: the association between ChatGPT utilization and academic achievement in general biology; the relationship between academic resilience and academic achievement; and the link between ChatGPT utilization and academic resilience. Furthermore, the study aimed to determine whether the utilization of ChatGPT and academic resilience significantly predicts students’ academic performance in general biology. Lastly, it examined the mediating role of academic resilience in the relationship between ChatGPT utilization and academic achievement, thereby providing insights into how psychological factors may influence the educational value of AI-assisted learning.
2. Methodology
2.1. Research Design
This study employed a descriptive-correlational research design to investigate the relationship between ChatGPT utilization and academic achievement in general biology, with academic resilience serving as a potential mediating variable among Senior High School (SHS) students. The descriptive-correlational design aims to describe the nature and strength of relationships among variables without inferring causality [48]. This approach enabled the researchers to measure students’ knowledge, attitudes, and practices related to ChatGPT, assess their academic resilience, and determine whether resilience mediates the link between ChatGPT use and academic performance, thereby addressing key gaps in AI-supported educational research.
2.2. Respondents of the Study
The respondents of the study were selected using purposive sampling. Purposive sampling is a type of non-probability sampling technique that involves selecting respondents who fit the researchers’ criteria of being most suitable for the study. The participants of this study were 230 Science, Technology, Engineering, and Mathematics (STEM) students, aged 16 to 19, from three selected schools in Cabanatuan City, Nueva Ecija, Philippines. In this study, respondents were required to meet the following inclusion criteria: they must be enrolled in Senior High School, belong to the Science, Technology, Engineering, and Mathematics (STEM) academic strand, and be currently taking the General Biology 2 subject. This selection technique enabled the researchers to obtain respondents who could provide the necessary data for analysis, thereby allowing the research questions to be answered. The selected schools offer the STEM strand with strong technological integration and are recognized for their excellent science performance, supported by enhanced resources, particularly in General Biology.
2.3. Research Instrument
This study employed two validated instruments adapted from established sources: 1) the Knowledge, Attitudes, and Practices Regarding the Educational Use of ChatGPT Questionnaire (KAP-CQ), and 2) the Academic Resilience Scale (ARS-30).
The KAP-CQ was designed to assess students’ utilization of ChatGPT as an academic support tool. It comprised three dimensions: knowledge of ChatGPT’s educational use, attitudes toward the tool, and actual academic practices involving it. The instrument was revised from its original 39 items to 30 items based on expert recommendations. Validation was conducted by two PhD holders in education and one Master Teacher II. A pilot test yielded an overall Cronbach’s alpha of 0.77, indicating acceptable reliability. Further reliability testing revealed the following Cronbach’s alpha values for each subscale: knowledge (α = 0.72), attitudes (α = 0.77), and practices (α = 0.79), all of which fall within acceptable thresholds.
The ARS-30, adapted to assess academic resilience, is a 30-item, four-point Likert scale instrument subdivided into three domains: perseverance, reflective and adaptive help-seeking, and negative affect and emotional response. Originally developed to measure psychological resilience in the face of academic adversity [14], the ARS-30 was also reviewed by two PhD holders and one Master Teacher II. The pilot study established an overall Cronbach’s alpha of 0.85, confirming its reliability. Reliability testing for the ARS-30 subscales in this study yielded the following Cronbach’s alpha values: perseverance (α = 0.74), reflecting and adaptive help-seeking (α = 0.76), and negative affect and emotional response (α = 0.79), supporting the internal consistency of each dimension. These instruments were administered to assess students’ academic resilience, their perceptions of ChatGPT, and their usage behavior, serving as key variables in the analysis.
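As a transparency aid, the following minimal sketch shows one common way to compute a Cronbach’s alpha of the kind reported above; the function and the small response matrix are illustrative only and do not reproduce the study’s instrument data.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of Likert scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                                # number of items in the subscale
    item_variances = scores.var(axis=0, ddof=1)        # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)    # variance of the summed scale score
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

# Illustrative 4-point Likert responses (6 respondents x 5 items); not the study's data.
demo = np.array([
    [3, 4, 3, 4, 3],
    [2, 3, 3, 3, 2],
    [4, 4, 4, 3, 4],
    [3, 3, 2, 3, 3],
    [4, 4, 4, 4, 4],
    [2, 2, 3, 2, 3],
])
print(round(cronbach_alpha(demo), 2))
```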
2.4. Data Collection Procedure
Prior to data collection, the researchers secured permission from the Dean of the College of Education and obtained approval from the school principals of the selected institutions. Upon approval, informed consent forms were distributed to participants, detailing the study’s objectives, procedures, potential risks and benefits, and assurances of confidentiality. A pilot test was conducted with a STEM class from a different school to assess the reliability and validity of the research instruments, under the supervision of teachers and with oversight from the researchers. The main data collection was conducted over one week, during which the research team, divided into two groups, administered the instruments across the assigned schools. Participants were first briefed on the purpose and procedures of the study to ensure informed participation. Following data collection, responses were encoded and subjected to appropriate statistical analyses. The resulting data were then presented, interpreted, and evaluated concerning the study’s research questions and objectives.
2.5. Data Analysis
To obtain accurate and reliable findings aligned with the research objectives, several statistical treatments were employed. Descriptive statistics, specifically the arithmetic mean, were used to measure students’ levels of ChatGPT utilization and academic resilience. Similarly, the arithmetic mean and percentage distributions were calculated to describe students’ academic achievement in general biology. Pearson’s r correlation was applied to determine the presence and strength of significant relationships among ChatGPT utilization, academic resilience, and academic achievement. To assess whether ChatGPT utilization and academic resilience significantly predict academic achievement, a multiple linear regression analysis was conducted. Furthermore, to explore the mediating role of academic resilience in the relationship between ChatGPT utilization and academic performance, the modern approach to inference about intervening variable effects proposed by Hayes and Preacher (2008) [49] was utilized. These statistical tools allowed for a comprehensive analysis of the direct and indirect relationships among variables, supporting the study’s goal of identifying predictive and mediating factors in AI-assisted learning.
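For readers who wish to see how such an analysis pipeline might look in practice, the sketch below computes scale means and Pearson’s r with its p-value using placeholder data and hypothetical variable names; it is not the authors’ analysis script.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Placeholder data with hypothetical column names; the study's dataset is not reproduced here.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "knowledge": rng.uniform(1, 4, 230),
    "attitudes": rng.uniform(1, 4, 230),
    "practices": rng.uniform(1, 4, 230),
    "grade": rng.uniform(80, 100, 230),
})

# Descriptive statistics: arithmetic means of each scale.
print(df.mean().round(2))

# Pearson's r between each ChatGPT-utilization dimension and the biology grade,
# tested at the 5% level of significance.
for col in ["knowledge", "attitudes", "practices"]:
    r, p = stats.pearsonr(df[col], df["grade"])
    verdict = "significant" if p < 0.05 else "not significant"
    print(f"{col}: r = {r:.3f}, p = {p:.3f} ({verdict})")
```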
2.6. Ethical Consideration
The researchers conducted the study in accordance with ethical principles to ensure the authenticity and reliability of the data. The researchers protected the privacy, safety, and health of every respondent and ensured that neither undue pressure nor excessive benefits were offered to respondents for participating in the study during the data collection process. The purpose and scope of the study were explained to the respondents, and every participant gave informed consent. All answers were kept anonymous and reported as group data to preserve privacy. For the sake of institutional confidentiality, the names of the selected schools were also withheld. The collected data were used only for scholarly research.
3. Results and Discussions
3.1. ChatGPT Utilization
1) Knowledge regarding educational use of ChatGPT
Table 1 indicates that students demonstrated a moderate level of knowledge regarding the utilization of ChatGPT, with an overall mean score of 3.05. Among the items, the highest-rated statement was item no. 5, “ChatGPT is available for free” (M = 3.58), while the lowest was item no. 6, “ChatGPT can only provide text-based responses” (M = 2.16).
At the time of data collection, ChatGPT’s image-generation features were exclusive to the paid version (ChatGPT 4.0), so the low score on this item suggests that many students were unaware of the distinctions between the free and premium versions. Economic considerations and perceived utility influence willingness to pay for AI tools like ChatGPT [50]. This moderate level of knowledge aligns with previous research, which shows that students possess varying degrees of familiarity with ChatGPT’s capabilities [51]. Similarly, other studies reported low to moderate awareness among both students and faculty, often attributed to limited training or misconceptions about AI functionality [52]-[54]. By contrast, knowledge levels were found to be low in developing contexts [55] and higher among students with frequent exposure to the tool [56]. Overall, the findings suggest that while students are generally aware of ChatGPT, there are significant gaps in their understanding of its advanced features and limitations.
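The cut-offs used to translate mean scores into verbal descriptions are not stated in the paper. The sketch below assumes equal-width bands on the four-point scale, which happen to be consistent with every mean and label reported in Tables 1 and 4-6, and should be read as an illustration rather than the authors’ actual scoring rule.

```python
# Assumed equal-width bands for a 4-point scale; the paper does not state its cut-offs,
# but these bands are consistent with every mean/label pair reported in the tables.
def verbal_description(mean_score: float) -> str:
    if mean_score <= 1.75:
        return "Very Low"
    if mean_score <= 2.50:
        return "Low"
    if mean_score <= 3.25:
        return "Moderate"
    return "High"

# Sample means from Table 1: item 6, the overall mean, and item 5.
for m in (2.16, 3.05, 3.58):
    print(m, verbal_description(m))   # Low, Moderate, High
```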
2) Attitude regarding educational use of ChatGPT
Table 2 reveals that students generally hold a positive attitude toward the utilization of ChatGPT, as reflected by an overall mean score of 2.84.
Table 1. Knowledge regarding the educational use of ChatGPT.
| Statements | Mean | Verbal Description |
| --- | --- | --- |
| 1. ChatGPT uses artificial intelligence to generate human-like responses. | 3.17 | Moderate Knowledge |
| 2. ChatGPT responses are 100% accurate. | 3.31 | High Knowledge |
| 3. ChatGPT is designed to provide human-like conversations. | 2.85 | Moderate Knowledge |
| 4. ChatGPT is trained on a diverse range of topics. | 3.23 | Moderate Knowledge |
| 5. ChatGPT is a commercial product and not available for free. | 3.58 | High Knowledge |
| 6. ChatGPT can only provide text-based responses. | 2.16 | Low Knowledge |
| 7. ChatGPT can help teachers with lesson planning. | 2.92 | Moderate Knowledge |
| 8. ChatGPT can be used to assist students with their coursework. | 3.22 | Moderate Knowledge |
| 9. ChatGPT can be integrated with virtual learning environments. | 3.00 | Moderate Knowledge |
| 10. ChatGPT can provide additional teaching resources and learning materials for students. | 3.02 | Moderate Knowledge |
| Overall Mean | 3.05 | Moderate Knowledge |
Table 2. Attitudes regarding educational use of ChatGPT.
While several items indicated relatively positive to very positive attitudes, responses to specific items suggest a more nuanced perspective. Items no. 1 (“I trust the responses provided by ChatGPT”, M = 2.42), no. 2 (“I find ChatGPT responses to be accurate”, M = 2.43), no. 5 (“I believe that ChatGPT should be banned in all schools and academic institutions”, M = 2.07), and no. 8 (“I think people who use ChatGPT for academic purposes are cheating”, M = 2.39) received neutral ratings.
These findings suggest limited trust in ChatGPT’s output, coupled with a rejection of its complete prohibition in academic settings. The disapproval of banning ChatGPT supports findings that students often oppose restrictions and may attempt to circumvent institutional policies [57]. However, this contrasts with evidence from other studies where students expressed concerns over fairness and advocated for banning AI tools in classrooms [58]. Notably, the same study concluded that students favored the development of strict regulations and ethical guidance, aligning with the present findings, which show strong support for responsible use and education regarding ChatGPT. Prior studies also report moderately positive attitudes toward ChatGPT, influenced by trust in its accuracy, ethical considerations, and institutional stances [52]-[54]. Conversely, negative attitudes were more common among students with lower levels of knowledge [55]. In contrast, consistently positive attitudes were found among those with higher AI familiarity, especially when accompanied by ethical instruction [56]. These results suggest that while students generally view ChatGPT favorably, their trust and acceptance are shaped by ethical, institutional, and informational factors.
3) Educational practices regarding the educational use of ChatGPT
Table 3 indicates that students exhibit moderately positive educational practices in utilizing ChatGPT, with an overall mean score of 2.78. Notably, item no. 10 (“I use ChatGPT to learn a language”) received the lowest mean score of 2.08, suggesting limited use of ChatGPT for language acquisition.
Table 3. Educational practices regarding the educational use of ChatGPT.
This finding highlights a missed opportunity, as existing research demonstrates ChatGPT’s effectiveness in supporting language-related tasks [59]. Despite this specific underuse, the general pattern reflects moderate engagement with ChatGPT for academic purposes, aligning with prior findings that students hold favorable views of its instructional value [60]. Several studies similarly report moderate to low levels of actual usage, often attributed to students’ limited familiarity with the tool’s capabilities [52]-[54]. Suboptimal usage has also been linked to insufficient knowledge. In contrast, consistent and productive practices were more common among students who regularly engaged in writing activities and had greater exposure to the tool [55] [56]. Overall, while students appear open to incorporating ChatGPT into their learning, its integration across a broader range of academic tasks remains moderate.
3.2. Academic Resilience
1) Perseverance
Table 4 indicates that students exhibit high levels of academic resilience in terms of perseverance, with an overall mean score of 3.54. Item-level scores showed minimal variation, suggesting consistency across responses. Notably, item no. 6 (“I would still pursue my career plans”) recorded the highest mean score of 3.60, reflecting strong commitment to long-term goals despite challenges.
Table 4. Perseverance.
| Statements | Mean | Verbal Description |
| --- | --- | --- |
| 1. I would work harder. | 3.43 | High Resilience |
| 2. I would keep trying. | 3.52 | High Resilience |
| 3. I would use the feedback to improve my work. | 3.57 | High Resilience |
| 4. I would not give up. | 3.55 | High Resilience |
| 5. I would try to think of new solutions. | 3.52 | High Resilience |
| 6. I would still pursue my career plans. | 3.60 | High Resilience |
| 7. I would use the situation to motivate myself. | 3.54 | High Resilience |
| 8. I would maintain my long-term goals and ambitions. | 3.59 | High Resilience |
| 9. I would see the situation as a challenge. | 3.50 | High Resilience |
| 10. I would look forward to showing that I can improve my grades. | 3.58 | High Resilience |
| Overall Mean | 3.54 | High Resilience |
This finding aligns with prior research indicating that individuals with greater grit—conceptualized as sustained perseverance toward long-term objectives—tend to demonstrate fewer career shifts [61]. Similar patterns have been observed among nursing students, where high perseverance scores were linked to sustained academic motivation and goal orientation [62]. Other studies also report elevated perseverance levels among students, supporting the notion of enduring effort and resilience in academic contexts. Overall, these findings suggest that the respondents maintain a strong sense of purpose and determination, enabling them to persist in their academic and professional aspirations despite potential obstacles.
2) Reflecting and adaptive help-seeking
Table 5 shows that students demonstrate high resilience in terms of reflecting and adaptive help-seeking, with an overall mean score of 3.40. This suggests that learners generally engage in self-reflection and seek appropriate support when facing academic difficulties. However, item no. 6 (“I would seek help from my teachers”) yielded a slightly lower mean of 3.10, interpreted as moderate resilience.
Table 5. Reflecting and adaptive help-seeking.
| Statements | Mean | Verbal Description |
| --- | --- | --- |
| 1. I would try to think more about my strengths and weaknesses to help me work better. | 3.59 | High Resilience |
| 2. I would give myself encouragement. | 3.51 | High Resilience |
| 3. I would seek encouragement from my family and friends. | 3.29 | High Resilience |
| 4. I would try different ways to study. | 3.40 | High Resilience |
| 5. I would set my own goals for achievement. | 3.56 | High Resilience |
| 6. I would seek help from my teachers. | 3.10 | Moderate Resilience |
| 7. I would start to monitor and evaluate my achievements and effort. | 3.39 | High Resilience |
| 8. I would start to self-impose rewards and punishments depending on my performance. | 3.28 | High Resilience |
| 9. I would use my past successes to help motivate myself. | 3.52 | High Resilience |
| 10. I would do my best to stop thinking negative thoughts. | 3.40 | High Resilience |
| Overall Mean | 3.40 | High Resilience |
This finding contrasts with studies indicating that motivational factors significantly mediate students’ willingness to seek instrumental help, particularly from instructors [63]. A possible explanation is the increasing reliance of students on informal help-seeking channels, such as peers or online platforms, as instructor interaction may be perceived as intimidating or formal [64]. Additionally, perceived threat has been shown to inhibit help-seeking behavior, especially among students who exhibit avoidant tendencies [65]. Despite this outlier, the overall results suggest that the students in the study exhibit higher levels of adaptive help-seeking than typically reported in the literature.
3) Negative affect and emotional response
Table 6 indicates that students exhibit moderate levels of resilience in terms of negative affect and emotional response, with an overall mean score of 2.85. Notably, item no. 10 (“I would accept the teacher’s feedback”) was the only item classified under high resilience, with a mean score of 3.35. This suggests that students are more emotionally resilient when processing feedback from educators, highlighting the significance of constructive teacher-student interactions. Prior research supports the idea that positive and supportive feedback from educators enhances students’ motivation and perseverance [66]. Emotional regulation, particularly in the context of interpersonal relationships such as those with teachers, is associated with improved responses to both negative and positive experiences [67]. Moreover, students’ emotional reactions to academic tasks—ranging from classwork to homework—play a critical role in shaping their academic outcomes and life satisfaction, especially when positive emotions are involved [68]. Generally, the findings indicate a moderate capacity among students to manage negative emotions, consistent with existing literature on emotional resilience in academic settings.
Table 6. Negative affect and emotional response.
| Statements | Mean | Verbal Description |
| --- | --- | --- |
| 1. I would not feel like everything was ruined and was going wrong. | 2.95 | Moderate Resilience |
| 2. I would not begin to think my chances of success at university were poor. | 2.92 | Moderate Resilience |
| 3. I would probably not get depressed. | 2.73 | Moderate Resilience |
| 4. I would not be very disappointed. | 2.57 | Moderate Resilience |
| 5. I would not begin to think my chances of getting the job I want were poor. | 2.84 | Moderate Resilience |
| 6. I would probably not get annoyed. | 2.51 | Moderate Resilience |
| 7. I would not stop myself from panicking. | 2.53 | Moderate Resilience |
| 8. I would see the situation as temporary. | 3.04 | Moderate Resilience |
| 9. I would not blame the teacher. | 3.06 | Moderate Resilience |
| 10. I would accept the teacher’s feedback. | 3.35 | High Resilience |
| Overall Mean | 2.85 | Moderate Resilience |
3.3. Academic Achievement in General Biology
Table 7 presents the academic achievement levels of the respondents in General Biology. Out of 230 students, 154 (66.96%) achieved outstanding grades, ranging from 90 to 100, followed by 69 students (30%) who obtained very satisfactory marks between 85 and 89. A small number, six students (2.61%), received satisfactory grades of 80 - 84. No student scored within the fairly satisfactory range of 75 - 79, while only one student (0.43%) received a grade below 75, classified as “did not meet expectations”.
Table 7. Academic achievement in general biology.
| Grades | f | Percentage | Verbal Description | Mean |
| --- | --- | --- | --- | --- |
| 90 - 100 | 154 | 66.96% | Outstanding | |
| 85 - 89 | 69 | 30.00% | Very Satisfactory | |
| 80 - 84 | 6 | 2.61% | Satisfactory | |
| 75 - 79 | 0 | 0.00% | Fairly Satisfactory | |
| Below 75 | 1 | 0.43% | Did not meet expectations | |
| Total | 230 | 100.00% | | 90.43 |
The overall mean grade was 90.43, which was described as outstanding. A comparable study reported an average grade of 89.56, classified as very satisfactory but closely aligned with the present findings [69]. Another investigation involving Grade 9 students reported a mean biology score of 73.57 with a standard deviation of 13.83, indicating only satisfactory performance [70]. The contrast between these studies underscores the exceptional academic performance of the current respondents. These results suggest that the participants consistently demonstrated high achievement in General Biology, surpassing the performance levels reported in related literature.
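To illustrate how a frequency-and-percentage table of this kind can be produced, the sketch below bins hypothetical grades into the bands used in Table 7; the simulated grades are placeholders and do not reproduce the respondents’ records.

```python
import numpy as np
import pandas as pd

# Hypothetical final grades for 230 students; not the study's actual records.
rng = np.random.default_rng(1)
grades = pd.Series(rng.normal(90.4, 3.5, 230).round(0)).clip(70, 100)

# Grade bands used in Table 7.
bands = pd.cut(
    grades,
    bins=[0, 74.99, 79.99, 84.99, 89.99, 100],
    labels=["Did not meet expectations", "Fairly Satisfactory",
            "Satisfactory", "Very Satisfactory", "Outstanding"],
)

summary = bands.value_counts().sort_index()
table = pd.DataFrame({
    "f": summary,
    "Percentage": (summary / len(grades) * 100).round(2),
})
print(table)
print("Mean grade:", round(grades.mean(), 2))
```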
3.4. Relationship between Students’ ChatGPT Utilization and Academic Achievement in General Biology
Table 8 presents the statistical analysis examining the relationship between students’ use of ChatGPT and their academic achievement in General Biology. The study initially hypothesized that no significant relationship exists between these variables. ChatGPT utilization was measured across three dimensions: knowledge regarding its educational use, attitudes toward ChatGPT, and educational practices involving ChatGPT. Pearson’s r correlation was used at a 5% significance level to assess the association between each dimension and students’ academic performance in General Biology.
Table 8. Relationship between students’ ChatGPT utilization and academic achievement in general biology.
| ChatGPT Utilization | r | p | Verbal Description | Decision |
| --- | --- | --- | --- | --- |
| Knowledge regarding educational use of ChatGPT | 0.131* | 0.046 | Significant | Reject H0 |
| Attitudes regarding ChatGPT | −0.001 | 0.990 | Not significant | Accept H0 |
| Educational practices regarding ChatGPT | 0.016 | 0.807 | Not significant | Accept H0 |

Note: *p < 0.05, **p < 0.01.
Results reveal that only one dimension—knowledge regarding the educational use of ChatGPT—demonstrated a statistically significant positive relationship with students’ biology grades (r = 0.131, p = 0.046). This suggests that students with a greater understanding of ChatGPT’s educational applications tend to perform slightly better in General Biology. Conversely, the dimensions of attitudes (r = −0.001, p = 0.990) and educational practices (r = 0.016, p = 0.807) showed no significant association with academic performance. This finding aligns with prior literature indicating that various educational technologies, such as computer-mediated simulations, AI-integrated texts, and game-based curricula, can have a positive influence on student achievement [26]. Furthermore, existing research supports that awareness and competence in using information and communication technology are significantly associated with improved academic outcomes [71]. Thus, the results suggest that academic benefits from ChatGPT are likely contingent upon students’ informed and competent use of the tool.
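As a quick arithmetic check of how a correlation of this size reaches significance with 230 respondents, the reported r can be converted into a t statistic and a two-tailed p-value; the small difference from the reported p = 0.046 is attributable to rounding.

```python
from math import sqrt
from scipy import stats

r, n = 0.131, 230                         # reported correlation and sample size
t = r * sqrt(n - 2) / sqrt(1 - r**2)      # t statistic for testing H0: rho = 0
p = 2 * stats.t.sf(abs(t), df=n - 2)      # two-tailed p-value
print(f"t = {t:.3f}, p = {p:.3f}")        # roughly p = 0.047, matching the reported value up to rounding
```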
3.5. Relationship between Students’ Academic Resilience and Their Academic Achievement in General Biology
Table 9 presents the results of the statistical analysis examining the relationship between students’ academic resilience and their academic achievement in General Biology. The study tested the null hypothesis that there is no significant relationship between these two variables. Academic resilience was assessed across three dimensions: perseverance, reflecting and adaptive help-seeking, and negative affect and emotional response. Using Pearson’s r correlation at the 5% significance level, the relationship of each dimension with students’ General Biology 2 grades was evaluated.
Table 9. Relationship between students’ academic resilience and their academic achievement in general biology.
| Academic Resilience | r | p | Verbal Description | Decision |
| --- | --- | --- | --- | --- |
| Perseverance | 0.067 | 0.311 | Not significant | Accept H0 |
| Reflecting and adaptive help-seeking | 0.027 | 0.688 | Not significant | Accept H0 |
| Negative affect and emotional response | −0.001 | 0.985 | Not significant | Accept H0 |
Note: *p < 0.05, **p < 0.01.
The findings indicate that none of the three dimensions exhibited a statistically significant correlation with academic achievement: perseverance (r = 0.067, p = 0.311), reflecting and adaptive help-seeking (r = 0.027, p = 0.688), and negative affect and emotional response (r = −0.001, p = 0.985). These results suggest that academic resilience, as operationalized in this study, does not significantly influence academic performance in General Biology. This outcome contrasts with prior research conducted among Moroccan students, which reported a significant positive relationship between academic resilience and academic achievement [72]. Similarly, another study involving Minawao refugee students found that academic resilience had a substantial impact on academic success [73]. The divergence in findings may be attributed to contextual, cultural, or curricular differences across educational settings, suggesting the need for further investigation into the role of academic resilience in different learning environments.
3.6. Relationship between Students’ ChatGPT Utilization and Academic Resilience
Table 10 presents the results of the statistical analysis exploring the potential significant relationships between students’ ChatGPT utilization and academic resilience. Both constructs were assessed through three dimensions each: ChatGPT utilization included knowledge regarding its educational use, attitudes toward ChatGPT, and educational practices involving ChatGPT; academic resilience comprised perseverance, reflecting and adaptive help-seeking, and negative affect and emotional response. A total of nine pairwise relationships were analyzed using Pearson’s correlation to identify significant associations.
Table 10. Relationship between students’ ChatGPT utilization and academic resilience.
| ChatGPT Utilization | Perseverance r (p) | Reflecting and adaptive help-seeking r (p) | Negative affect and emotional response r (p) |
| --- | --- | --- | --- |
| Knowledge regarding educational use of ChatGPT | 0.131* (0.046) | 0.041 (0.536) | 0.113 (0.088) |
| Attitudes regarding ChatGPT | 0.141* (0.032) | 0.151* (0.022) | 0.057 (0.390) |
| Educational practices regarding ChatGPT | 0.062 (0.349) | 0.028 (0.669) | −0.012 (0.857) |
Note: *p < 0.05, **p < 0.01.
Among these, only three relationships yielded statistically significant results. First, students’ knowledge regarding the educational use of ChatGPT was found to have a positive and significant correlation with perseverance (r = 0.131, p = 0.046). Second, students’ attitudes toward ChatGPT were significantly and positively associated with both perseverance (r = 0.141, p = 0.032) and reflective and adaptive help-seeking (r = 0.151, p = 0.022). The remaining six relationships did not reach statistical significance (p > 0.05), indicating no substantial association among those specific dimension pairs.
These findings align with previous research demonstrating that technology-based education can positively influence academic perseverance and learner motivation [74]. Students who are knowledgeable about and hold positive attitudes toward educational technologies are more likely to engage meaningfully with them. As such, the development of proper knowledge and constructive attitudes is essential for the successful integration of educational tools like ChatGPT. Furthermore, the observed relationship between positive attitudes toward ChatGPT and adaptive help-seeking resonates with findings that, among preservice teachers, attitudes toward technology significantly shape reflective practices and professional growth [75]. This suggests that fostering favorable dispositions toward educational technologies not only enhances perseverance but also supports students’ capacity for adaptive academic strategies.
3.7. ChatGPT Utilization and Academic Resilience as Predictors of Academic Achievement in General Biology
Table 11 displays the results of the multiple regression analysis conducted to determine the extent to which the components of ChatGPT utilization (knowledge, attitudes, and educational practices) and academic resilience (perseverance, reflecting and adaptive help-seeking, and negative affect and emotional response) predict students’ academic achievement in General Biology. The model yielded an adjusted R2 value of 0.012, indicating that only 1.2% of the variance in academic achievement can be explained by the combined predictors, representing a small and negligible effect size. The overall model was not statistically significant, F = 1.454, p = 0.195, suggesting that, collectively, ChatGPT utilization and academic resilience do not significantly predict students’ academic performance in General Biology.
Table 11. ChatGPT utilization and academic resilience as predictors of academic achievement in general biology.
| Variable | B | β | SE | t | p |
| --- | --- | --- | --- | --- | --- |
| Knowledge regarding educational use of ChatGPT | 2.081 | 0.209 | 0.762 | 2.730 | 0.007 |
| Attitudes regarding ChatGPT | −0.601 | −0.055 | 0.769 | 0.782 | 0.435 |
| Educational practices regarding ChatGPT | −0.420 | −0.067 | 0.465 | 0.905 | 0.367 |
| Perseverance | 0.465 | 0.061 | 0.672 | 0.692 | 0.490 |
| Reflecting and adaptive help-seeking | −0.020 | −0.003 | 0.639 | 0.031 | 0.975 |
| Negative affect and emotional response | −0.246 | −0.035 | 0.481 | 0.511 | 0.610 |

Model summary: Adjusted R2 = 0.012, F = 1.454, p = 0.195.
Note: *p < 0.05, **p < 0.01.
However, individual predictor analysis revealed that only knowledge regarding the educational use of ChatGPT significantly contributed to the model, B = 2.081, β = 0.209, SE = 0.762, t = 2.730, p = 0.007. This positive coefficient suggests that students with a higher understanding of the educational applications of ChatGPT tend to achieve higher academic outcomes in General Biology. The remaining variables—attitudes toward ChatGPT (p = 0.435), educational practices involving ChatGPT (p = 0.367), perseverance (p = 0.490), reflecting and adaptive help-seeking (p = 0.975), and negative affect and emotional response (p = 0.610)—were not statistically significant predictors.
These findings align with prior research indicating that awareness and informed use of technological tools such as AI and computer-mediated platforms contribute positively to academic outcomes [26] [72]. As noted in earlier studies, technological efficacy is maximized when users possess sufficient knowledge and competence in its application [71]. Hence, while general use or positive attitude alone may not suffice, targeted knowledge of ChatGPT’s educational features appears to play a crucial role in enhancing academic achievement in science learning contexts.
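For illustration, the sketch below shows how the model-level and predictor-level statistics reported in Table 11 are typically obtained with ordinary least squares; the variable names and simulated data are placeholders rather than the study’s dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Placeholder predictors and grades for 230 respondents; not the study's data.
rng = np.random.default_rng(2)
predictors = ["knowledge", "attitudes", "practices",
              "perseverance", "help_seeking", "negative_affect"]
df = pd.DataFrame(rng.uniform(1, 4, size=(230, 6)), columns=predictors)
df["grade"] = rng.uniform(80, 100, 230)

X = sm.add_constant(df[predictors])          # add the intercept term
model = sm.OLS(df["grade"], X).fit()

# Model-level statistics of the kind reported in Table 11.
print("Adjusted R2:", round(model.rsquared_adj, 3),
      "F:", round(model.fvalue, 3), "p:", round(model.f_pvalue, 3))

# Predictor-level statistics: unstandardized B, SE, t, and p for each variable.
print(pd.DataFrame({"B": model.params, "SE": model.bse,
                    "t": model.tvalues, "p": model.pvalues}).round(3))
```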
3.8. Academic Resilience is not a Mediator of the Relationship between ChatGPT Utilization and Academic Achievement in General Biology
This study hypothesized that academic resilience does not mediate the relationship between ChatGPT utilization and academic achievement in General Biology. Following the procedure for testing mediation proposed by Hayes and Preacher (2008), a series of preliminary analyses was conducted to evaluate the relationships among the independent variable (ChatGPT utilization), the mediating variable (academic resilience), and the dependent variable (academic achievement in General Biology) [49]. Step one requires establishing a significant relationship between the independent variable and the dependent variable. The findings indicated that only one component of ChatGPT utilization, knowledge regarding its educational use, was significantly and positively associated with academic achievement (B = 2.081, p = 0.007).
Step two of this procedure necessitates a significant relationship between the independent variable and the proposed mediator. In this study, only partial significance was observed: knowledge and attitudes regarding ChatGPT were significantly correlated with perseverance and reflective help-seeking, respectively. However, these associations did not extend to academic resilience as a whole construct. Step three requires the mediator to significantly predict the dependent variable while controlling for the independent variable. This condition was not satisfied, as none of the academic resilience dimensions significantly predicted academic achievement (p > 0.05). Lastly, in step four, the indirect effect through the mediator must be statistically significant. Given that the mediator (academic resilience) was not significantly associated with academic achievement, no significant indirect effect could be established; thus, no mediation can be claimed.
In summary, the conditions necessary for establishing a mediating effect were not fulfilled [49]. While knowledge of ChatGPT’s educational use is a significant predictor of academic achievement, academic resilience fails to act as a mediating variable in this relationship. This contrasts with findings from prior studies, where academic resilience has served as a mediator in various academic contexts. For instance, resilience has been found to mediate the relationship between perfectionism and academic performance among Generation Z students [20] and between social support and academic outcomes in university populations [21].
The lack of mediation in the present study may suggest that while academic resilience is a valuable psychological trait, its influence on academic performance may be more context-dependent and may require stronger or different antecedents to manifest a mediating role. These findings underscore the importance of domain-specific knowledge, such as familiarity with educational technology, as a more immediate and actionable factor in promoting academic success in science education.
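Because Hayes and Preacher’s approach tests the indirect effect directly, typically with a bootstrap confidence interval, the following sketch illustrates that idea for a simple single-mediator model; the function and variable names are hypothetical, and this is not the authors’ analysis script.

```python
import numpy as np
import statsmodels.api as sm

def bootstrap_indirect_effect(x, m, y, n_boot=5000, seed=0):
    """95% percentile bootstrap CI for the indirect effect a*b in x -> m -> y."""
    rng = np.random.default_rng(seed)
    x, m, y = (np.asarray(v, dtype=float) for v in (x, m, y))
    n = len(x)
    indirect = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, n, n)                                   # resample respondents
        a = sm.OLS(m[idx], sm.add_constant(x[idx])).fit().params[1]   # path a: x -> m
        design = sm.add_constant(np.column_stack([x[idx], m[idx]]))
        b = sm.OLS(y[idx], design).fit().params[2]                    # path b: m -> y, controlling for x
        indirect[i] = a * b
    return np.percentile(indirect, [2.5, 97.5])   # mediation is supported only if the CI excludes 0

# Example call with hypothetical arrays (e.g., ChatGPT-utilization scores, resilience scores, grades):
# lo, hi = bootstrap_indirect_effect(chatgpt_scores, resilience_scores, grades)
```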
4. Conclusions
This study examined the relationship between students’ use of ChatGPT and their academic achievement in General Biology, with academic resilience considered as a potential mediating variable. Findings revealed that students demonstrated moderate knowledge of ChatGPT’s educational use, moderately positive attitudes, and moderate levels of integration into academic practices. However, usage remained limited in specific domains, such as language learning. Despite these moderate engagement levels, students generally achieved outstanding performance in General Biology, indicating high academic competence within the sample.
Regarding academic resilience, students demonstrated high levels of perseverance and adaptive help-seeking behaviors, while responses related to emotional regulation reflected only moderate resilience. Notably, among the three components of ChatGPT utilization (knowledge, attitudes, and practices), only knowledge of ChatGPT’s educational applications had a significant positive relationship with academic achievement. Attitudes and practices, while moderately favorable, did not correlate with performance outcomes. Similarly, none of the three dimensions of academic resilience showed a significant predictive relationship with academic achievement.
Furthermore, regression analysis revealed that knowledge of ChatGPT use was the only factor that significantly predicted academic achievement. The remaining predictors, including all resilience dimensions, failed to demonstrate a statistically significant effect. Mediation analysis confirmed that academic resilience did not mediate the relationship between ChatGPT utilization and academic achievement, as the conditions necessary for mediation were not satisfied.
These findings highlight the crucial role of digital literacy and informed use of AI tools over mere exposure or generalized resilience traits. While resilience may support broader learning outcomes, academic achievement in content-specific areas, such as biology, appears to be more strongly influenced by a student’s understanding and strategic use of educational technologies. Therefore, educational institutions should prioritize structured training on AI tools, such as ChatGPT, to maximize their pedagogical value.
However, this study is subject to several limitations that should be considered. The use of purposive sampling restricts the generalizability of the findings. Additionally, the reliance on self-report measures introduces the possibility of response bias due to participants’ subjective perceptions and varying levels of self-awareness. Furthermore, the cross-sectional design captures data at a single point in time, limiting the ability to draw causal inferences or observe developmental trends related to ChatGPT utilization, academic resilience, and academic achievement. These limitations underscore the need for more robust, longitudinal, and mixed-method studies to build a deeper and more nuanced understanding of AI’s role in educational contexts. Future research should further investigate the contextual and pedagogical variables that influence the integration of AI in education, particularly in STEM disciplines.
5. Recommendations
Drawing on the study’s findings, several practical recommendations are offered to guide students, educators, policymakers, and future researchers in maximizing the educational potential of ChatGPT and other AI tools.
For students, it is recommended that they actively develop a deeper understanding of ChatGPT’s educational applications. Since only knowledge about the tool significantly predicted academic achievement, students should be encouraged not just to use ChatGPT passively, but to engage critically with its functions, strengths, and limitations. Integrating AI tools into study routines should be complemented by reflection, verification of outputs, and ethical usage.
For teachers, professional development programs should be implemented to equip educators with strategies for integrating ChatGPT effectively into instruction. Teachers should also guide students on the responsible use of AI, emphasizing digital literacy, critical thinking, and academic integrity. Classroom activities can include AI-enhanced exercises that teach students how to prompt, interpret, and evaluate AI-generated content.
For educational policymakers, there is a clear need to institutionalize digital literacy initiatives that specifically address the use of generative AI in educational settings. Policies should promote equitable access to AI tools, establish guidelines for ethical use, and incorporate AI literacy into the curriculum. Emphasis should be placed on developing students’ knowledge and evaluative skills rather than restricting AI usage outright.
For future researchers, studies should be extended to include diverse student populations across different grade levels, academic strands, and disciplines to validate and expand upon the current findings. Further investigation is also recommended into specific competencies related to ChatGPT knowledge that most directly impact academic performance. Moreover, exploring other potential mediators or moderators, such as motivation, self-efficacy, or digital access, could provide a more comprehensive understanding of how AI tools influence learning outcomes.
Overall, these recommendations aim to ensure that ChatGPT and similar technologies are used not just as tools for convenience but as catalysts for meaningful, responsible, and effective learning.
Conflicts of Interest
The authors declare no conflicts of interest.