Value-Added Assessment in English Language Teaching: A Study and Implementation under the OBE Framework
1. Introduction
In October 2020, the Central Committee of the Communist Party of China and the State Council promulgated the General Plan for Deepening the Reform of Education Evaluation in the New Era. This policy document represents a seminal and strategic initiative, providing a comprehensive and evidence-based framework for the systematic reform of education evaluation. It introduces the concept of the “Four Evaluations,” which includes enhancing result evaluation, strengthening process evaluation, exploring value-added evaluation, and perfecting comprehensive evaluation [1]. These dimensions collectively signal a paradigm shift from a predominantly outcome-oriented approach to a more holistic and process-oriented evaluation framework. By integrating these multifaceted strategies, the plan offers new opportunities for advancing the reform of curriculum and teaching evaluation, thereby contributing to a more equitable, effective, and contextually responsive educational landscape.
In May 2023, the Office of the Ministry of Education promulgated the Action Plan for Deepening the Reform of Curriculum and Teaching in Basic Education. This policy document represents a significant advancement in educational reform, advocating for the integration of process-oriented and value-added evaluations within the framework of teaching assessment. It underscores the critical role of evaluation in guiding instructional practices, diagnosing learning needs, facilitating continuous improvement, and providing motivational support for both educators and students [2]. Contemporary educational discourse has increasingly emphasized the importance of student-centered evaluation approaches that prioritize holistic development and individual growth trajectories. In this context, the traditional reliance on result-oriented evaluation—characterized by its focus on summative assessments and standardized metrics—has been widely acknowledged as inadequate for capturing the complexities of learning processes and the diverse needs of students. As a result, there is a pressing need to innovate and diversify evaluation methods by prioritizing value-added evaluation. This shift is intended to leverage evaluation as a catalyst for instructional improvement and student learning, thereby facilitating the comprehensive development of students.
By integrating process-oriented and value-added evaluations, the Action Plan aims to address the limitations of traditional evaluation models and promote a more dynamic, responsive, and contextually sensitive approach to educational assessment. This approach not only provides a more nuanced understanding of educational effectiveness but also encourages educators to focus on the continuous improvement of teaching practices and the individual development of each student.
Value-added evaluation (VAE) represents a paradigmatic shift in educational assessment, centering on the measurement of student progress and change rather than static outcomes. By examining the “net effect” of educational interventions on student learning, VAE provides a nuanced evaluation of the magnitude of improvement and the effort expended in both student learning and teacher instruction. This approach is characterized by two interrelated dimensions. First, VAE prioritizes the measurement of student growth and progress. It integrates students’ initial proficiency levels into the assessment framework, using learning gains as the evaluative criterion. This focus on “value-added” redirects attention from absolute performance outcomes to relative improvement trajectories, thereby operationalizing the principle of “not comparing results, but comparing progress” [3]. Second, VAE assesses the impact of educational processes on student progress. It accounts for and isolates factors that influence student growth independently of educational interventions, thereby evaluating the effort and effectiveness of educational practices, teaching, and learning. This dimension emphasizes the “net effect” of educational inputs, realizing the principle of “not comparing background, but comparing effort” [3]. By foregrounding these dual emphases, VAE offers a more dynamic and contextually responsive approach to educational assessment, challenging traditional models that prioritize static outcomes over individual growth and educational impact.
2. Literature Review
2.1. International Research
The massification of higher education in the United States during the 1960s engendered pervasive apprehensions about the quality of university instruction and the efficacy of student learning outcomes. This context catalyzed the development of value-added assessment, a methodological framework designed to scientifically evaluate the value generated by educational institutions and the learning gains achieved by students. In 1971, economist Eric Hanushek introduced the concept of “value-added” into the educational domain, pioneering the application of input-output analyses to teacher effectiveness models. This seminal work marked the inception of value-added thinking within the realm of teacher evaluation, laying the groundwork for subsequent advancements in this area.
Over the ensuing decades, value-added assessment garnered significant attention, prompting numerous regions to embark on practical explorations of its application. A landmark development occurred in 1992 when Tennessee introduced the Tennessee Value-Added Assessment System (TVAAS), becoming the first state to implement a value-added evaluation system for schools [4]. Building on the foundational principles of TVAAS, the Education Value-Added Assessment System (EVAAS) was subsequently developed, further refining the methodology and expanding its applicability [5]. A critical advancement in the field was made by Douglas Betebenner, who proposed the Student Growth Percentile (SGP) model. This model revolutionized value-added assessment by enabling comparisons of student growth across diverse initial proficiency levels, thereby addressing a key challenge in the implementation of value-added approaches [6] [7]. By facilitating the measurement of individual student progress relative to their peers, the SGP model significantly enhanced the feasibility and utility of value-added evaluation, establishing itself as a foundational tool in the field.
As the demand for robust and contextually sensitive evaluation methods grew, researchers developed a range of value-added models tailored to different data scenarios and research assumptions. These include covariate-adjusted models, which account for extraneous variables that may influence student outcomes; differential value-added models, which examine variations in educational impact across different student subgroups; and cross-classified models, which accommodate complex educational structures and nested data [8]-[10]. Collectively, these advancements have enriched the value-added assessment toolkit, enhancing its capacity to provide nuanced insights into educational effectiveness.
The evolution of value-added assessment reflects a paradigmatic shift in educational research and policy, moving beyond static measures of student achievement to focus on the dynamic processes of learning and the incremental impact of educational interventions. As a result, value-added approaches have garnered increasing attention from scholars, practitioners, and policymakers, who recognize their potential to provide actionable evidence for improving educational quality and equity.
2.2. Domestic Research
The exploration of value-added evaluation (VAE) in China can be traced back to the late 1970s in the Taiwan region and to the mid-to-late 1980s in the Hong Kong SAR. By the 1990s, mainland scholars had initiated research on VAE, progressively elucidating its connotations, principles, and significance. VAE is fundamentally concerned with assessing the relative progress made by a student cohort in a given school compared to cohorts in other schools over a specific period, while accounting for students’ initial performance levels [11]. This approach is encapsulated in the definition of value-added as “the relative progress achieved by individual students in various or specific areas, compared across schools with similar starting points, within a defined period of education” [12].
In the realm of basic education, empirical studies have predominantly focused on the college and high school entrance examination results of schools in specific regions. Additionally, research on VAE in vocational education has gradually emerged. In higher education, VAE began to gain traction in 2012, leveraging various direct and indirect measurement tools. Scholars have employed multiple regression and hierarchical linear modeling to investigate the value-added in students’ ability levels. For instance, Xue Xinhua utilized hierarchical linear modeling to conduct research on the educational effectiveness of higher education institutions [13].
Since the promulgation of the General Plan for Deepening the Reform of Education Evaluation in the New Era in 2020, research on value-added evaluation (VAE) in China has expanded significantly in scope and depth. This policy initiative has catalyzed a diversification of VAE research, with intensified explorations across multiple educational domains. In the realm of basic education, VAE has extended to encompass areas such as aesthetic and moral education, reflecting a broader recognition of the multifaceted nature of student development [14]. Additionally, research has focused on value-added assessments of both individual subjects and comprehensive academic performance, highlighting the need for nuanced evaluations that capture the complexity of educational outcomes [14]. In higher education, there has been a deepened discourse on transcending the “five only,” a critique of the overreliance on narrow metrics such as degrees, titles, honors, papers, and projects. Scholars have engaged in critical reflections on the current state of VAE and have proposed targeted strategies and pathways to address prevailing challenges and limitations [14].
Parallel to these developments, there is a growing emphasis on leveraging digital technologies to enhance the efficacy and applicability of value-added evaluation. For instance, Zhang Yuchen et al. provided an in-depth examination of the Student Growth Percentile (SGP) and Student Growth Objectives (SGO) models developed in New Jersey, USA [15]. This work underscores the potential of advanced analytical frameworks to inform VAE practices. Zhu Ke et al. conducted a longitudinal analysis of the developmental stages of value-added models within an international context and offered a comparative study of ten seminal value-added models, elucidating their respective strengths and limitations [16]. Building on these foundational contributions, subsequent research has introduced various innovative VAE models, including the multi-level linear quantile regression estimation method (SGPs) [17], the simplified percentile growth model [18], the teacher network collaboration effectiveness model [19], and the student learning value-added evaluation model [20].
Despite these advancements, the localization of empirical research using these value-added models remains an underexplored area. The adaptation and application of these models within the Chinese educational context present unique challenges and opportunities. Future research must therefore prioritize the development of contextually responsive methodologies that integrate global best practices with local educational realities. This endeavor will require rigorous empirical investigations to validate the applicability and effectiveness of these models in diverse educational settings. Ultimately, such efforts will contribute to the refinement of VAE as a robust tool for enhancing educational quality and equity.
3. Research Design
3.1. Outcome-Based Education (OBE) Philosophy
Outcome-Based Education (OBE) is an educational paradigm that centers on student learning outcomes as the primary drivers of curriculum design and instructional practice. By prioritizing student-centered approaches and outcome-oriented goals, OBE emphasizes the alignment of educational activities with clearly defined learning objectives. This alignment is achieved through backward design, which begins with the identification of desired learning outcomes and subsequently informs the development of instructional strategies, assessment methods, and resource allocation. Forward implementation ensures that these design principles are systematically integrated into the teaching and learning process. OBE advocates for the organization of learning activities and the evaluation of learning outcomes around predetermined educational objectives, thereby fostering a cohesive and goal-oriented educational experience. Additionally, it establishes a robust framework for continuous improvement, creating a dynamic evaluation system that integrates assessment, feedback, and enhancement. This system is designed to promote teaching through evaluation, learning through evaluation, and improvement through evaluation, thereby enhancing the overall quality of education.
Within this framework, educators are encouraged to transition from traditional teaching-centered models to learning-centered approaches, focusing on the developmental needs and growth trajectories of students. Educational resources and teaching activities are strategically allocated and arranged with an emphasis on learning outcomes and personal development. By adopting a student-outcome orientation, clear learning outcome standards are established. The curriculum and teaching components are designed backward from these outcomes, ensuring that faculty and resources are aligned to support student success.
Continuous improvement is a central tenet of OBE, with comprehensive and process-oriented tracking and evaluation of teaching activities. Evaluation results are systematically utilized to inform teaching enhancements, forming a closed loop of “assessment—feedback—improvement.” This cyclical process drives the continuous enhancement of professional talent cultivation capabilities and quality, ensuring that educational programs remain responsive to evolving student needs and academic standards.
3.2. Design Approach
The design approach is anchored in the educational principles of student-centeredness, outcome orientation, and continuous improvement. It employs a backward design methodology, beginning with the formulation of concrete and measurable learning objectives that are explicitly aligned with desired student outcomes. These objectives are designed to be teachable, learnable, and assessable, ensuring clarity and feasibility in the instructional process.
The teaching process is structured around an activity-based learning framework, with students positioned as the central agents of their educational experience. Learning activities are carefully sequenced to facilitate a progression from lower-order to higher-order thinking skills, encompassing comprehension and understanding activities, application and practice activities, and transfer and innovation activities. To enhance the teaching and learning experience, the design approach emphasizes the deep integration of information technology into classroom instruction. Digital platforms and tools are leveraged to empower teaching practices, providing students with dynamic, interactive, and personalized learning opportunities. This fusion of technology and pedagogy aims to create a more engaging and effective learning environment.
A diversified evaluation system is implemented, incorporating both formative and summative assessments to provide a comprehensive understanding of student progress. Building on these traditional assessment methods, the approach also explores value-added evaluation, which focuses on measuring the incremental progress and growth of students over time. By emphasizing student growth rather than solely relying on static outcomes, this holistic evaluation framework aims to provide actionable insights into the effectiveness of instructional practices and the development of student capabilities.
3.3. Research Subjects and Methods
The present study focuses on students majoring in English from the 2022 cohort as the primary research subjects. To investigate the value-added dimension of their academic performance, the study employs a rigorous methodological framework. Specifically, paired-samples analysis is conducted using SPSS software to compare students’ initial and final academic outcomes over the semester. This statistical approach allows for a detailed examination of individual progress over time. In addition to the paired-samples analysis, individual student performance trajectories are visualized through line graphs. These graphical representations provide a clear depiction of each student’s developmental path throughout the semester, highlighting the nuances of their academic growth.
By combining these analytical techniques, the study aims to offer a comprehensive account of student progress, moving beyond mere summative assessments to capture the incremental changes and developmental milestones achieved over the course of the semester. This methodological design is intended to provide robust evidence of the value-added dimension of the educational experience, thereby contributing to a more nuanced understanding of student development.
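To illustrate how such an analysis could be reproduced outside SPSS, the following minimal Python sketch runs a paired-samples t-test on pre- and post-semester scores and reports the mean per-student gain. The CSV file name and the column names (student_id, pre, post) are assumptions made for illustration; the study itself performed the equivalent analysis in SPSS.

```python
# Minimal sketch of the paired pre/post comparison described above.
# Assumes a CSV with one row per student and columns "student_id", "pre", "post";
# the file and column names are hypothetical.
import pandas as pd
from scipy import stats

df = pd.read_csv("lesson_plan_scores.csv")     # hypothetical data file
df["gain"] = df["post"] - df["pre"]            # per-student value-added gain

t, p = stats.ttest_rel(df["pre"], df["post"])  # paired-samples t-test
print(f"mean gain = {df['gain'].mean():.2f}, t = {t:.3f}, p = {p:.4f}")
```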
4. Research Implementation
4.1. Formulation of Teaching Objectives
In accordance with the principles of Outcome-Based Education (OBE) and the methodology of backward design, the development of teaching objectives was guided by student needs, graduation requirements, and specific performance indicators. These objectives were explicitly defined to delineate the precise learning outcomes that students should achieve in the domains of knowledge, skills, and values. For instance, in the course A Course in English Language Teaching, the course-specific objectives were meticulously aligned with the broader graduation requirements. Specifically, Course Objective 3 was directly mapped to Graduation Requirement Indicator 4.2 [Basic Skills], thereby establishing a clear and actionable correlation between the course content and overarching educational goals.
As shown in Table 1, Graduation Requirement Indicator 4.2 [Basic Skills] emphasizes the development of essential teaching skills, including effective communication in Chinese and English, the ability to develop teaching resources using modern educational technologies, and the capacity to design and evaluate teaching activities in a student-centered manner. Course Objective 3 is specifically designed to meet this broader graduation requirement by focusing on the practical application of teaching skills tailored to the developmental needs of middle school students. It integrates the development of teaching resources, the creation of effective language contexts, and the implementation of teaching activities to provide students with hands-on experience in educational practice.

Table 1. Correspondence between course objectives and graduation requirement indicators.

Graduation Requirement Indicator 4.2 [Basic Skills]: Graduates are expected to demonstrate proficient oral and written communication skills in both Chinese and English. They should be capable of utilizing subject-specific pedagogical knowledge and modern educational technologies to develop teaching resources. Graduates should design, implement, and evaluate teaching activities in a student-centered manner, thereby gaining practical teaching experience and developing foundational teaching competencies.

Course Objective 3: In accordance with the developmental characteristics of middle school students, and in alignment with the English curriculum standards and textbook systems, students will develop preliminary English course resources. They will create appropriate language contexts and conduct effective teaching design, implementation, and evaluation to acquire foundational teaching experience.
4.2. Design and Implementation of Teaching Activities
In the design phase of teaching activities, this study is guided by Course Objective 3 and grounded in the principles of Outcome-Based Education (OBE). A systematic and deliberate approach is employed to develop a diverse array of teaching activities, including lesson plans and simulated teaching exercises. By elucidating the intrinsic connections between course objectives and teaching activities, each activity is meticulously aligned with the overarching course goals, thereby providing students with a clear and coherent learning trajectory and robust instructional support.
During the implementation phase, the study employs a multi-dimensional evaluation framework that integrates value-added assessment, performance assessment, and summative assessment. This comprehensive and systematic evaluation approach enables a holistic understanding of students’ learning progress and outcomes. Moreover, it facilitates real-time feedback mechanisms that allow for timely adjustments to teaching strategies. Such an adaptive and responsive approach not only optimizes the implementation of teaching activities but also ensures their effectiveness and alignment with course objectives, thereby enhancing the overall quality and impact of the educational experience.
Table 2. Alignment of course objectives with teaching activities and assessment methods.

Course Objective 3: In accordance with the developmental characteristics of middle school students, and in alignment with the English curriculum standards and textbook systems, students will develop preliminary English course resources. They will create appropriate language contexts and conduct effective teaching design, implementation, and evaluation to acquire foundational teaching experience.

Pre-test activity: Initial Lesson Plan (at the beginning of the semester).
Post-test activities: Final Lesson Plan (at the end of the semester); Simulated Teaching (at the end of the semester); Final Exam (at the end of the semester).
Assessment methods: Value-Added Assessment and Performance Assessment (Initial and Final Lesson Plans); Performance Assessment (Simulated Teaching); Summative Assessment (Final Exam).
Table 2 displays the alignment of course objectives with teaching activities and assessment methods. The Initial Lesson Plan, as the pre-test activity, assesses students’ initial capabilities in lesson planning and their understanding of the course requirements at the beginning of the semester. The Final Lesson Plan, as the post-test activity, assesses the development and refinement of students’ lesson planning abilities by the end of the semester, reflecting their progress and learning. The Simulated Teaching and Final Exam provide a comprehensive evaluation of the knowledge and skills acquired throughout the semester, offering a holistic view of students’ learning outcomes.
Value-Added Assessment measures the incremental progress and growth of students’ abilities over the semester, focusing on the development of teaching skills and lesson planning. Performance Assessment evaluates students’ practical teaching skills and lesson planning through simulated teaching and lesson plans, providing insights into their practical competencies. Summative Assessment provides a comprehensive evaluation of students’ learning outcomes at the end of the semester through the final exam, offering a definitive measure of their overall achievement.
4.3. Evaluation Indicators for Teaching Activities
In pursuit of a more precise assessment of students’ evolving teaching capabilities, this study endeavors to surmount the inherent limitations of conventional evaluation methodologies. Such limitations often manifest in the form of insufficient sensitivity to nuanced improvements and potential biases that may compromise the objectivity of the assessment outcomes. To address these concerns, this research introduces a refined evaluation framework that prioritizes scientific validity and impartiality. Aligned with the overarching instructional objectives, this study has meticulously formulated a set of observation points and evaluation criteria specifically tailored for teaching activities. These criteria are designed to provide a comprehensive and granular record of students’ performance during teaching practice, capturing both quantitative and qualitative dimensions of their teaching competencies. The evaluation indicators for lesson planning and simulated teaching are outlined in Table 3 and Table 4.
Table 3. Evaluation indicators for lesson planning.

Evaluation content: Teaching Design Competence.
Textbook Analysis (10 points): Demonstrates the ability to analyze the content from the perspectives of What, Why, and How, and to construct an informative diagram that visually represents the key elements of the material.
Student Analysis (10 points): Exhibits proficiency in analyzing students’ learning characteristics by identifying their current knowledge, gaps in understanding, and areas of curiosity. Proposes actionable strategies to address these needs effectively.
Teaching Objectives (20 points): Establishes clear and achievable teaching objectives that are aligned with curriculum standards, informed by student analysis, and reflective of the core content of the textbook.
Teaching Process (30 points): Designs a coherent and engaging teaching process that integrates the core competencies of the English discipline and adheres to the principles of English learning activities. Demonstrates a logical flow and effective use of instructional time.
Subject-Based Education (20 points): Incorporates educational activities that promote moral development and curriculum integration. Demonstrates how the English discipline can be leveraged to foster holistic student development, reflecting the educational values embedded within the subject matter.
Integration of Information Technology and Teaching (10 points): Shows initial competence in utilizing modern educational technologies and digital platforms to develop and enhance course resources. Effectively integrates these tools into the teaching process to support student learning and engagement.
Table 4. Evaluation indicators for simulated teaching.

Evaluation content: Teaching Competence.
Pedagogical Presence (15 points): Exhibits high levels of enthusiasm and engagement, effectively capturing students’ attention through dynamic and expressive language. Demonstrates clear, fluent, and logically structured communication, complemented by natural and appropriate body language to convey instructional content with precision.
Objective Orientation (25 points): Shows a clear understanding of students’ prior knowledge and experiences, integrating these elements to highlight teaching objectives. Effectively identifies and emphasizes key content areas and challenging concepts, ensuring that instructional focus is maintained throughout the lesson.
Curriculum Integration and Educational Value (30 points): Seamlessly integrates course-based ideological and political education (Course Ideology) into the lesson, emphasizing the development of students’ critical thinking, problem-solving abilities, and ethical reasoning. Demonstrates a commitment to fostering holistic student development through value-oriented teaching practices.
Application of Technology and Instructional Techniques (20 points): Skillfully employs information technology to enhance the teaching process, ensuring that digital tools and resources are used effectively to support learning objectives. Demonstrates versatility in the application of teaching methods, adapting strategies to meet diverse student needs and promote active engagement.
Clarity and Organization (10 points): Reflects the instructional design intent through a well-organized and purposeful layout. Features neat, legible handwriting with no spelling or grammatical errors, ensuring that key information is clearly presented and easily accessible to students.
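For illustration, the rubrics in Table 3 and Table 4 can be encoded as weighted score sheets whose criterion maxima sum to 100 points. The sketch below takes the criterion names and point values directly from the tables; the data structures and the score function are hypothetical conveniences, not tooling used in the study.

```python
# Hypothetical encoding of the Table 3 and Table 4 rubrics as score sheets.
LESSON_PLAN_RUBRIC = {            # Table 3, maxima total 100 points
    "Textbook Analysis": 10,
    "Student Analysis": 10,
    "Teaching Objectives": 20,
    "Teaching Process": 30,
    "Subject-Based Education": 20,
    "Integration of Information Technology and Teaching": 10,
}

SIMULATED_TEACHING_RUBRIC = {     # Table 4, maxima total 100 points
    "Pedagogical Presence": 15,
    "Objective Orientation": 25,
    "Curriculum Integration and Educational Value": 30,
    "Application of Technology and Instructional Techniques": 20,
    "Clarity and Organization": 10,
}

def score(rubric, awarded):
    """Sum awarded points, keeping each criterion within its maximum."""
    assert sum(rubric.values()) == 100, "rubric maxima should total 100 points"
    total = 0.0
    for criterion, maximum in rubric.items():
        points = awarded.get(criterion, 0.0)
        if not 0 <= points <= maximum:
            raise ValueError(f"{criterion}: {points} exceeds the {maximum}-point maximum")
        total += points
    return total

# Example: one rater's marks for a single lesson plan.
example = {"Textbook Analysis": 8, "Student Analysis": 9, "Teaching Objectives": 17,
           "Teaching Process": 26, "Subject-Based Education": 16,
           "Integration of Information Technology and Teaching": 8}
print(score(LESSON_PLAN_RUBRIC, example))  # 84.0
```

Recording rater marks in such a structured form would make it straightforward to feed the pre- and post-test rubric totals into the value-added analyses reported in Section 5.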
5. Data Collection and Analysis
5.1. Assessment of Achievement for Course Objective 3
The degree to which course objectives are achieved serves as a critical indicator for evaluating the fulfillment of graduation requirements. This assessment is foundational in determining whether students have attained the competencies outlined in the curriculum. Given its significance, the application of rigorous and methodologically sound approaches to measure the achievement of course objectives is essential.
For Course Objective 3, the level of achievement is determined through a composite evaluation framework that integrates both performance-based and summative assessments conducted at the conclusion of the course. Performance assessments capture the practical application of skills and knowledge, while summative assessments provide a comprehensive overview of students’ learning outcomes. Collectively, these assessments offer a robust and multi-dimensional perspective on the extent to which Course Objective 3 has been achieved.
Table 5. Achievement of Course Objective 3 for A Course in English Language Teaching.

Graduation Requirement Indicator 4.2 [Basic Skills]: Graduates are expected to demonstrate proficient oral and written communication skills in both Chinese and English. They should be capable of utilizing subject-specific pedagogical knowledge and modern educational technologies to develop teaching resources. Graduates should design, implement, and evaluate teaching activities in a student-centered manner, thereby gaining practical teaching experience and developing foundational teaching competencies.

Course Objective 3: In accordance with the developmental characteristics of middle school students, and in alignment with the English curriculum standards and textbook systems, students will develop preliminary English course resources. They will create appropriate language contexts and conduct effective teaching design, implementation, and evaluation to acquire foundational teaching experience.

Assessment methods (average score / total score, weight):
Performance Assessment: Lesson Plan, 54.34 / 60, weight 0.3
Performance Assessment: Simulated Teaching, 52.99 / 60, weight 0.3
Summative Assessment: Final Exam, 41.25 / 52, weight 0.4
Achievement level of Course Objective 3: 0.85
As depicted in Table 5, the correspondence between Course Objective 3 and Graduation Requirement Indicator 4.2 [Basic Skills] is meticulously delineated, alongside the evaluative metrics and criteria employed to assess Course Objective 3. The achievement level of Course Objective 3 is determined through the following computation: Achievement Level = (54.34/60) × 0.3 + (52.99/60) × 0.3 + (41.25/52) × 0.4 ≈ 0.85.
This composite score signifies that Course Objective 3 has been attained, thereby affirming the alignment between the instructional aims and the broader curricular requirements.
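For transparency, the weighted computation can be reproduced in a few lines; the numbers below are taken directly from Table 5, while the helper function is purely illustrative.

```python
# Illustrative recomputation of the Table 5 achievement level.
def achievement_level(components):
    """Each component is (average score, total score, weight); weights must sum to 1."""
    assert abs(sum(w for _, _, w in components) - 1.0) < 1e-9
    return sum((avg / total) * w for avg, total, w in components)

course_objective_3 = [
    (54.34, 60, 0.3),  # Performance Assessment: Lesson Plan
    (52.99, 60, 0.3),  # Performance Assessment: Simulated Teaching
    (41.25, 52, 0.4),  # Summative Assessment: Final Exam
]
print(round(achievement_level(course_objective_3), 2))  # 0.85
```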
Nonetheless, the development of targeted, personalized, and customized learning interventions for individual students remains a critical area for further inquiry. Future research should focus on the granular analysis of student performance data to identify specific learning needs and inform the design of bespoke instructional strategies. Such an approach is essential for maximizing educational outcomes and ensuring that each student receives the support necessary to achieve their full potential.
5.2. Holistic Value-Added Analysis of Lesson Plan
To precisely assess the development of students’ teaching abilities, this study utilized SPSS software to conduct independent-sample t-tests on both experimental and control groups at the beginning and end of the semester. This analytical approach was employed to quantify and compare the differences in teaching capabilities between the two groups. The results obtained from these analyses are presented in Table 6.
Table 6. Comparisons of pedagogical competency development: independent samples t-test.

Pre-test: Experimental Group 83.97 ± 3.89; Control Group 82.81 ± 3.82; t = 1.583, p = 0.1117
Post-test: Experimental Group 87.81 ± 1.82; Control Group 85.42 ± 3.02; t = 2.956, p = 0.009
During the pre-test phase, the experimental and control groups achieved average scores of 83.97 ± 3.89 and 82.81 ± 3.82, respectively. T-test results (T = 1.583, P = 0.1117) showed no statistically significant difference between the groups, indicating similar pre-intervention learning levels and a fair experimental baseline. In the post-test phase, the average scores of the experimental and control groups increased to 87.81 ± 1.82 and 85.42 ± 3.02, respectively. A second independent samples t-test (T = 2.956, P = 0.009) revealed a statistically significant difference in scores between the two groups. Specifically, the experimental group’s scores were significantly higher than those of the control group, strongly suggesting that the experimental intervention had a substantial positive impact on the experimental group’s performance.
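A comparable between-group check could be scripted outside SPSS as follows. The sketch assumes the same hypothetical CSV as in Section 3.3, with an additional group column distinguishing the experimental and control cohorts; it is an illustrative equivalent, not the procedure actually run in the study.

```python
# Illustrative independent-samples t-tests at pre-test and post-test.
# Assumes a CSV with columns "group" ("experimental"/"control"), "pre", "post";
# file and column names are hypothetical.
import pandas as pd
from scipy import stats

df = pd.read_csv("lesson_plan_scores.csv")
exp = df[df["group"] == "experimental"]
ctrl = df[df["group"] == "control"]

for phase in ("pre", "post"):
    t, p = stats.ttest_ind(exp[phase], ctrl[phase])  # Student's independent-samples t-test
    print(f"{phase}-test: t = {t:.3f}, p = {p:.4f}")
```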
To quantify the value-added effect in students’ lesson plan design capabilities, this study utilized SPSS software to perform a paired-samples t-test analysis on the scores of lesson plan designs assessed at the beginning and end of the semester. The results of this analysis, presented in Table 7, offer a detailed statistical characterization of the changes in lesson plan design scores, including their associated significance levels. These findings elucidate the extent to which the instructional interventions and learning experiences throughout the semester contributed to the development of students’ lesson planning competencies.
Table 7. Longitudinal changes in instructional design proficiency: paired samples t-test.

Pre-Test: 83.97 ± 3.89
Post-Test: 87.81 ± 1.82
t = 4.549, p = 0.001
Over the course of the semester, the mean score for lesson plan design exhibited a significant increase from 83.97 to 87.81, corresponding to a value-added gain of 3.84 points. This enhancement underscores the marked improvement in students’ lesson plan design competencies following a semester of teaching practice. Additionally, the standard deviation decreased from 3.89 at the beginning of the semester to 1.82 at the end, indicating a reduction in the dispersion of lesson plan quality around the mean. This trend suggests a convergence in quality, reflecting greater consistency in students’ lesson plan design abilities. Such consistency is indicative of enhanced standardization and normalization in students’ approach to lesson plan design. The paired-sample t-test results further substantiate these findings, revealing a highly significant difference between the initial and final lesson plan design scores (T = 4.549, P = 0.001). This statistical significance underscores the substantial improvement in lesson plan design capabilities achieved over the semester, thereby confirming the effectiveness of the instructional interventions and learning experiences in fostering students’ pedagogical skills.
This study employed paired-sample t-test analysis to evaluate the progression of students’ lesson plan design capabilities over the course of the semester. The results indicate a statistically significant improvement in students’ overall lesson plan design proficiency at the semester’s conclusion compared to its commencement, thereby substantiating a meaningful value-added effect. This finding underscores the efficacy of instructional interventions in fostering students’ pedagogical design skills and holds significant implications for enhancing teaching quality through optimized lesson plan design. However, acknowledging the heterogeneity of student learning trajectories, the specific value-added outcomes for individual students warrant a more granular analysis. Future research should therefore incorporate individual value-added trajectory analyses to provide a comprehensive understanding of each student’s developmental progress. Such an approach would facilitate the identification of personalized learning needs and inform targeted instructional strategies, thereby maximizing educational outcomes for all students.
5.3. Analysis of Individual Value-Added Variations in Lesson Plan
To elucidate the individual value-added profiles of students, this study utilizes individualized line graphs to longitudinally track and compare the value-added changes in lesson plan design scores. As depicted in Figure 1, the analysis provides a granular examination of the fluctuations and trajectories in individual student performance. This approach reveals the nuanced variations in learning progress, highlighting the heterogeneity of educational gains among students.
Figure 1. Individual value-added trajectories of students.
At the beginning of the semester, scores (depicted by the blue line) were distributed between 75 and 92, exhibiting notable variability. This fluctuation reflects the initial heterogeneity among students in their lesson plan design competencies. In contrast, scores at the end of the semester (depicted by the red line) were distributed between 85 and 95, indicating a substantial overall improvement in academic performance. This result suggests that, after a semester of lesson plan design and simulated teaching practice, students’ lesson planning performance advanced significantly. The improvement is evident not only in the higher scores but also in a deeper understanding and mastery of instructional design.
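A plot of this kind can be generated directly from the paired score data. The sketch below assumes the same hypothetical CSV used earlier and only approximates how Figure 1 could be reproduced; the actual figure was produced with the study’s own tooling.

```python
# Illustrative reproduction of an individual value-added trajectory plot (cf. Figure 1).
# Assumes a CSV with columns "student_id", "pre", "post"; names are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("lesson_plan_scores.csv").sort_values("student_id")

plt.plot(df["student_id"], df["pre"], color="blue", marker="o", label="Beginning of semester")
plt.plot(df["student_id"], df["post"], color="red", marker="o", label="End of semester")
plt.xlabel("Student number")
plt.ylabel("Lesson plan design score")
plt.title("Individual value-added trajectories")
plt.legend()
plt.tight_layout()
plt.show()
```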
Despite the overall improvement, individual differences persist. Notably, students such as numbers 12, 26, and 27 demonstrated substantial progress in their lesson plan design, with value-added gains exceeding the group mean. This indicates that the instructional activities of the semester had a positive and significant impact on their development. Interviews with these students revealed that the interactive, practical, and contextual nature of the lesson plan design activities enhanced their motivation and engagement, thereby contributing to their academic gains.
Conversely, a group of students, including numbers 9 and 23, experienced a decline in their scores. This regression may be attributed to various factors, such as individual attitudes toward learning, adaptation to lesson plan design activities, and personal learning strategies. These factors may influence students’ understanding and mastery of the simulated teaching content, ultimately affecting their academic performance.
Additionally, a stable development group, including students such as numbers 10 and 20, exhibited value-added gains that did not reach statistical significance. However, compared to students with similar initial scores, the extent of their value-added progress requires further investigation. In future teaching practices, it is essential to employ more detailed data models to analyze these differences. This approach will allow for a deeper focus on individual student differences and the implementation of differentiated instructional strategies to enhance the effectiveness of simulated teaching activities. Such efforts are crucial for maximizing educational outcomes and ensuring that all students receive the support necessary to achieve their full potential.
6. Conclusions
This study has been intricately anchored in the principles of Outcome-Based Education (OBE), conducting a rigorous exploration across two pivotal dimensions: the achievement of course objectives and value-added evaluation. Adhering to a student-centered and outcome-oriented approach, the study meticulously delineated course objectives and employed backward design to systematically orchestrate teaching activities. By leveraging a multifaceted assessment framework, the study dynamically monitored the entire teaching process. At the end of the semester, the study integrated performance-based and summative assessment results to conduct a precise analysis of the degree of course objective achievement. This comprehensive evaluation strategy effectively promoted teaching through assessment and fostered continuous instructional improvement by using assessment to drive change.
The integration of value-added evaluation has placed a strong emphasis on individual student growth and progress during the learning process, focusing on the potential for student value-added. By comparing pre- and post-test data and employing paired-sample t-tests using SPSS software, the study identified a significant overall improvement in students’ lesson plan design capabilities. However, further individual trajectory analyses revealed heterogeneous outcomes: while some students demonstrated substantial progress, even reaching a “ceiling effect,” others stagnated or regressed. Through interviews with individual students and multi-dimensional analyses of value-added data, examination responses, and process-oriented learning materials, the study explored the impact of teaching methods and student engagement on learning outcomes.
Nevertheless, the present study is constrained by limitations in the selection of research participants, characterized by a modest sample size and a limited disciplinary scope. These limitations may restrict the broader applicability of the findings. In future research, it is essential to expand the sample size and broaden the disciplinary range to enhance the generalizability and representativeness of the results. Future research will continue to explore how to leverage information technology to build a dynamic student growth tracking system and integrate value-added models to deeply analyze individual student development trajectories. This approach will provide more targeted support for instructional improvement and student personalization, ultimately enhancing the alignment between teaching practices and student needs.
Funding
Research on the Reform and Innovation of English Teacher Education Curriculum under the ‘Comprehensive Ideological and Political Education’ Framework: A 2022 General Project of Shandong Province Undergraduate Teaching Reform (Project No.: M2022004).
Research on Reform and Innovation of Curriculum-based Ideological and Political Education in Teacher Training Courses for English Majors in Higher Education: A 2022 General Undergraduate Teaching Reform Project of Qilu Normal University (Project No.: JG202221).
Conflict of Interest
The author declares no conflicts of interest.