An Examination of Field Experiences as They Relate to InTASC Standards: A Retrospective Pilot Study for an Educator Preparation Provider

This paper outlines the efforts made by a teacher preparation program to examine the way field experiences are implemented and structured. A retrospective approach is taken to examine the Educator Preparation Provider's (EPP) current practices and to structure field experiences with greater intentionality. A pilot study is designed for the teacher preparation program to align experiences to applicable InTASC standards and to better define requirements. Throughout the paper, a strong emphasis is placed on field and clinical practice as an integral part of the preparation of pre-service teachers.


Introduction
Educator Preparation Providers (EPPs) cope with a variety of pressures when it comes to higher education quality assurance: the tension of meeting rigorous accreditation standards, the difficulty of keeping up with federal and state demands, as well as the struggle to create new and improved approaches to assessment (Ewell, 2009). As Darling-Hammond (2014) states, "The question how to strengthen teacher education is increasingly at the forefront of U.S. education policy-making, as the demands on teachers to teach ever more challenging curriculum to ever more diverse learners continue to increase exponentially" (p. 547). A large part of these demands is centered on field and clinical practice. For example, the Council for Accreditation of Educator Preparation (CAEP) dedicates an entire standard to "Clinical Partnerships and Practice" (CAEP, 2015).
The standard states that high-quality clinical practice is essential for aspiring teachers to acquire the appropriate knowledge, skills, and dispositions to have a positive impact on P-12 students (CAEP, 2015). Based on a need to promote continuous improvement efforts in the area of field experiences, an examination of current practices in field experience and clinical practice was warranted due to increasing demands and national calls for reform (Capraro, Capraro, & Helfeldt, 2010). Darling-Hammond (2014) states, "Efforts to improve teacher education have recently focused in on the importance of well-supervised clinical practice as a critical element of effective preparation" (p. 547). Outside entities such as accreditors and policy-makers have placed a strong emphasis on field experiences being an integral part of EPPs (Capraro, Capraro, & Helfeldt, 2010; Darling-Hammond, 2014). Teacher preparation programs must respond to this call and rethink the way learning experiences are structured so that pre-service teachers have opportunities to better integrate theory and practice in real-life classroom settings. In this study specifically, a retrospective approach was taken to examine the EPP's current field experiences in order to improve their standard alignment, content, and sequencing. Alignment was the focus of the first step in analyzing current practices.

Meeting Accreditation Standards
Educator Preparation Providers' continuous improvement efforts are motivated by several factors, including positive student outcomes and seeking and maintaining accreditation. Accreditation, the process of external quality review, is carried out by a variety of councils and associations for specific purposes.
CAEP, for example, reviews EPPs as a part of both institutional and professional accreditation cycles. Efforts related to field experience and clinical practice are directly related to CAEP Standard 2 (Clinical Partnerships and Practice) and the Interstate Teacher Assessment and Support Consortium (InTASC) categories (Salazar, 2015). InTASC "offers a new vision for preparing, supporting, evaluating, and rewarding teachers along their careers" (Salazar, 2015, Slide 10, para. 1) and identifies four categories of core teaching standards: the Learner and Learning, Content, Instructional Practice, and Professional Responsibility. For the purpose of this study, improvement efforts with regard to field experiences were focused on integrating InTASC categories as recommended by CAEP in order to promote continuous improvement of the EPP.

Current Practices
The EPP is a moderately sized, regional university located in the southern United States. Current student enrollment is 6488, with 5896 undergraduate students and 592 graduate students. Ninety-three percent of the student population originates from in-state, 5% from out-of-state, and 2% are international students. Each program within the EPP requires a certain number of field experience hours per level. Hour requirements for each level are specified by each course within a program to satisfy state guidelines. Implementation of field experiences is scaffolded; therefore, upper-level courses include a greater number of level 3 experiences. Charts were developed for initial programs within the EPP, which specify the hours and levels of experiences for courses. Before the EPP examined current practices, faculty were well versed in field experiences within their own courses but ill-informed about the implementation of field experiences across coursework and programs. This lack of awareness hindered their ability to scaffold candidates' experiences properly.

Pre-service teachers are placed for their experiences by the EPP's Field Experience Coordinator, and data for each experience are entered manually into the EPP's assessment system, LiveText, by the candidates. Pre-service teachers enter self-reported data about their experiences using an online survey format within LiveText. These data include, but are not limited to, the level of the field experience; the location and date of the field experience; the subject area(s) and grade level(s) in which the experience took place; the ethnicity and gender of the supervising school personnel; and the duration of the experience (see Appendix A for the Field Experience Demographics Form). Field experience data are exported annually as a part of the EPP's unit data collection processes. Data are also disaggregated by program and shared with faculty and program coordinators.
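The annual disaggregation by program described above amounts to grouping the exported self-report records and tallying experiences and hours per program. A minimal sketch of that step follows; the record structure and field names are illustrative assumptions, not LiveText's actual export schema.

```python
# Hypothetical records mimicking an annual export of candidates'
# self-reported field experience data. Field names are illustrative.
records = [
    {"program": "Elementary Education", "level": 3, "hours": 4},
    {"program": "Elementary Education", "level": 1, "hours": 2},
    {"program": "Early Childhood Education", "level": 2, "hours": 3},
]

# Disaggregate by program: count experiences and total hours per program.
by_program = {}
for r in records:
    prog = by_program.setdefault(r["program"], {"experiences": 0, "hours": 0})
    prog["experiences"] += 1
    prog["hours"] += r["hours"]

for program, totals in sorted(by_program.items()):
    print(f'{program}: {totals["experiences"]} experiences, {totals["hours"]} hours')
```

A per-program summary like this is the form in which data would be shared with faculty and program coordinators.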
Despite these efforts, the current processes for field experiences within the EPP leave some areas of concern. While specific courses are assigned a minimum number of hours to be earned within programs, no other specifications were aligned to these field experiences. The only information available was self-reported data from LiveText. This warranted an investigation into current practices. Additionally, the EPP was aware that all institutions must make a more conscious effort to establish better action-based measures. The question arose: How can the EPP provide more actionable data to improve field experiences?

The Pilot
Within the EPP, faculty members are assigned to specific CAEP standards according to their areas of expertise. The CAEP Standard 2 committee is co-chaired by two faculty members: the Field Experience Coordinator and the Director of Student Teaching. Based on prompting from state evaluators, the question of how to provide more actionable data to improve field experiences was posed to the Standard 2 Committee. In Spring 2016, this committee met to discuss concerns related to field experience and clinical practice.
The EPP was prompted to examine the scope and sequence of pre-service field experiences following feedback from the EPP's on-site state review process.
Feedback from the outside review team indicated that there was a sufficient number of experiences within programs, but other areas of concern needed to be addressed. As Capraro, Capraro, and Helfeldt (2010) state, "Bridging the gap between theory and practice does not automatically occur simply as a result of participating in field experiences" (p. 132). In this regard, the Standard 2 Committee found that there was some disconnect between coursework and experiences. Additionally, the committee found there could be greater collaboration between faculty, with a focus on the sequencing and alignment of experiences. This applied to the progression of experiences within courses, between courses, and throughout programs. The Standard 2 Committee agreed to pilot a study that would examine the alignment of field experiences to the sound underpinnings of InTASC categories and provide in-depth descriptions of the field experiences required in select teacher education courses.

Methodology
The CAEP Standard 2 Committee met in Spring 2016 to determine the procedure for the pilot study. The committee was composed of the Director of Student Teaching, who taught secondary ELA; the Field Experience Coordinator; the Assessment Coordinator; and three faculty members who represented early childhood education, elementary education, middle school, and secondary education. The committee determined that the pilot study would begin with faculty participants who would complete a "field experience matrix" (see Appendix B for the Field Experience Matrix). This matrix required each participating member to align courses to InTASC categories and to identify certain elements of field experiences within their courses. The committee determined that, in order to allow field experience data to be collected across programs, all six members of the committee would be asked to participate in the study. The six faculty members represented a strong cross-section of courses offered by the EPP. In order to thoroughly analyze the required experiences in each course, the committee developed a data collection matrix that summarized important elements of field experiences. The elements were identified as the Order in which the field experience was offered; the Level (1, 2, or 3); the Type (video, observation, small groups, tutoring, interview, case study, or whole-class instruction); the Quantity (required hours in the field); the Relationship to the InTASC Category (the InTASC category most closely aligned to the experience, as determined by the professor); and a Description (a summary of tasks required of candidates in the field). Participating faculty members were given two weeks to complete the matrices for their courses.
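The matrix elements named above form a simple record structure. As one way to picture a single matrix row, the sketch below models those six elements; the class and field names are illustrative choices, not part of the committee's instrument.

```python
from dataclasses import dataclass

# Illustrative representation of one row of the field experience matrix.
# Field names mirror the six elements described in the text.
@dataclass
class MatrixRow:
    order: int             # sequence in which the experience is completed
    level: int             # 1, 2, or 3
    type: str              # e.g., "observation", "tutoring", "whole-class instruction"
    quantity: int          # required hours in the field
    intasc_category: str   # single best-aligned InTASC category
    description: str       # summary of tasks required of candidates

row = MatrixRow(order=1, level=1, type="observation", quantity=2,
                intasc_category="The Learner and Learning",
                description="Observe classroom routines and learner engagement.")
print(row.intasc_category)
```

Constraining `intasc_category` to a single value anticipates the single-selection dropdown later added to LiveText.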
At the completion of phase one of the pilot study (end of Spring 2016), the Standard 2 Committee collected matrices from three of the six committee members for six different courses (Appendix C). The following courses/faculty members agreed to participate in the pilot study: 1) EDUC 312: Planning for Teaching in Multicultural Classrooms, which all undergraduate candidates must successfully complete with a grade of C or higher in order to progress to methodology courses; 2) EDUC 421: Current Practices and Strategies in Teaching, which all certification-only candidates must pass with a C or higher in order to progress to methodology coursework; 3) FCED 239: Preschool Practicum, which undergraduate candidates in early childhood education must successfully complete prior to student teaching; 4) EDCI 573: Curriculum and Methods for Early Childhood Special Education, which is completed by candidates in the Master's Degree in Early Childhood Education; 5) EDCI 579: Practicum in Early Childhood Education, which is also completed by candidates in the Master's Degree in Early Childhood Education; and 6) EDCI 580: Interdisciplinary and Interagency Teaming in Early Childhood Education. These courses represented a cross-section of candidate classification, major, and coursework (undergraduate, graduate, and certification-only).
The committee met to review the matrices and determine next steps for phase two of the pilot: implementation in summer courses. Even though faculty members had aligned field experiences to InTASC standards by course on the matrices, data for these pilot courses had to be collected and aggregated through candidate self-reporting in LiveText. Based on the collected matrices, corresponding changes were made to the field experience form in LiveText that pre-service teachers complete after conducting their experiences. To collect alignment data in phase two of the pilot, the Assessment Coordinator added a dropdown menu in which pre-service teachers chose the one of the four InTASC categories designated by their instructor for the nature of each field experience. As a part of the pilot, any members of the committee teaching courses that required field experiences were asked to complete a field experience matrix and to have students document the appropriate InTASC category in their field experience forms.
Before collecting alignment data via LiveText, the committee analyzed and compared the matrices and made the following observation: faculty descriptions, and the number of InTASC categories aligned to each experience, varied by instructor. This presented two issues: 1) the dropdown menu in LiveText would allow only one InTASC category to be assigned to each experience, and 2) some descriptions might not include adequate information about each experience and its purpose. To remedy the first issue, the committee decided that each experience should be aligned to the single InTASC category that applied best to it. To address the second issue, in the event that a more detailed description was needed, the committee chair would meet with faculty members to obtain additional information.
As candidate reporting of field experience alignment to InTASC standards was critical to the analysis of data, the Assessment Coordinator worked with the three faculty members involved in the pilot study on how to implement information from the matrices in their courses. The faculty members were asked to review with their candidates the rationale for pre-service field experiences, the specific alignment to InTASC standards, and why this alignment is critical in producing teachers who create success for each K-12 student in their future classrooms.
Pilot faculty taught their six courses in Summer 2016 and integrated InTASC content, and procedural tasks for reporting this content in LiveText, throughout their respective courses. Candidate data collection was completed in July 2016.

Results
At the completion of Summer 2016, the Assessment Coordinator pulled data from LiveText to examine the implementation of the pilot using the six matrices from the three participating faculty members. Table 1 and Table 2 provide a summary of those data as they relate to InTASC alignment. Raw data are provided in Appendix D. Candidates who completed observation experiences (FE1) indicated that these experiences were most indicative of instructional practices in the classroom (46%) and least indicative of content (5%). Candidates who completed tutoring experiences in the classroom (FE2) indicated that these experiences were most reflective of the InTASC category "Instructional Practice" (41%) and least reflective of the InTASC category "Content" (13%). Candidates who completed whole class instruction (FE3) indicated that these field experiences were most closely aligned to "Instructional Practice" (47%), with the lowest alignment to "The Learner and Learning" (6%), followed closely by "Content" (14%). "Professional Responsibility" was consistently represented in alignment across field experience levels, with 37% in FE1, 30% in FE2, and 33% in FE3. The InTASC category "Instructional Practice" was also relatively stable across field experience levels at 46% (FE1), 41% (FE2), and 47% (FE3).
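The percentages reported in Tables 1 and 2 are straightforward to compute from the candidate self-reports: tally InTASC category selections within each field experience type, then divide by that type's total. A minimal sketch follows; the sample records are invented for illustration and do not reproduce the study's actual data (which appear in its Appendix D).

```python
from collections import Counter, defaultdict

# Hypothetical candidate self-reports: (field experience type, chosen InTASC category).
reports = [
    ("FE1", "Instructional Practice"), ("FE1", "Professional Responsibility"),
    ("FE1", "Instructional Practice"), ("FE1", "Content"),
    ("FE2", "Instructional Practice"), ("FE2", "The Learner and Learning"),
]

# Tally category counts within each field experience type.
counts = defaultdict(Counter)
for fe, category in reports:
    counts[fe][category] += 1

# Convert counts into rounded within-type percentages.
for fe in sorted(counts):
    total = sum(counts[fe].values())
    for category, n in counts[fe].most_common():
        print(f"{fe} {category}: {round(100 * n / total)}%")
```

Because percentages are computed within each field experience type, they are comparable across FE1, FE2, and FE3 even when the types have different numbers of reports.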

Discussion
Even with a small sample size of six teacher education courses participating in this pilot study, an adequate response rate was received, allowing the authors to draw tentative conclusions leading to future research opportunities. In reviewing the categorization of field experiences to InTASC, the authors observed that "Instructional Practice" was consistently ranked as the highest competency noted in observations (FE1), tutoring (FE2), and whole group teaching (FE3). "Content" was consistently ranked as the lowest competency in FE1 and FE2, while "The Learner and Learning" was the lowest ranked category in FE3.
This disparity in the representation of InTASC categories gives the authors pause, as all four categories should ideally be scaffolded and sequenced with a different emphasis throughout the programs. As candidates progress through their teacher preparation programs, field experiences should be sequenced with equal priority given to "The Learner and Learning," "Content," "Instructional Practice," and "Professional Responsibility" at different points throughout the coursework. If consistent focus is given to one category ("Instructional Practice") in all three types of field experiences, candidates may not have the foundational content and pedagogy needed to be successful in the classroom. In sequencing field experiences, it is important for faculty to review all courses, the order in which they are completed, and how "Content" and "The Learner and Learning" can be more predominantly represented in FE1 and FE2 experiences, with "Instructional Practice" and "Professional Responsibility" represented across FE2 and FE3 experiences. This intentional design of field experiences will provide candidates with growth and scaffolding across their programs. While these data were limited to the faculty participating in the study, the committee found enough evidence to expand the pilot to all faculty within the EPP. Additionally, with data from an entire academic year, the committee would be able to evaluate data that are more representative of the entire pre-service teacher population and of field experiences and clinical practice in general.

Limitations
There were several limitations within this pilot study. One limitation was the implementation of the study within the summer semester. The availability of field experience placements is limited because most PK-12 schools are out for the summer, which affects what courses are offered and how field experiences are assigned. Additionally, the summer months are restricted to a much smaller sample that may not be representative of the EPP's entire pre-service teacher population. Secondly, the findings of this pilot, already limited, demonstrated stronger internal validity than external validity. Since the study was restricted to one EPP, there is limited evidence that this pilot and future recommendations could be successfully implemented by other education providers. The third limitation was the use of only three faculty members to begin the pilot study. Stronger conclusions cannot be drawn from the evidence until all faculty members within the EPP are documenting their current practices and implementing the changes in their courses. This, in conjunction with an entire academic year of data, will provide more valid findings and actionable figures. Additionally, it will give the EPP a holistic view of the alignment, content, and sequencing of all field experiences for each program as well as the unit.

Recommendations
Staying true to the nature of a pilot study, there are multiple recommendations for the EPP that are vital to future research efforts and the continuation of this initiative. The first recommendation is to move from the pilot study to full implementation within the EPP. Significant findings cannot be made until all faculty are involved in the new processes and more data are collected. The second recommendation is to collect data on an annual basis. A full year of data, including the summer, fall, and spring semesters of an academic year, would provide a larger sample size and a more accurate representation of pre-service teachers and their experience in the field. Furthermore, this will integrate well into the EPP's already established assessment cycle, with field experience data collection in the summer, analysis in August, collaboration with stakeholders in October, and recommendations and proposed changes in November. Most significantly, this pilot study did not explore best practices for sequencing field experiences for pre-service teachers. This restructuring of field experiences is an area ripe for research by this team in future studies and should be based on thoughtful scaffolding of InTASC standards across experiences and programs. With a state-mandated shift to a one-year residency program rather than one semester of student teaching, it is essential for the authors to move beyond the examination of field experiences as they relate to InTASC standards and to determine how to restructure field experiences for each program based on the InTASC classification. Using data from the pilot study will guide faculty members in a comprehensive examination of the scope and sequence of field experiences so that candidates enter their culminating semesters of residency with the knowledge, skills, and dispositions to be successful.

Summary
This pilot study served as the EPP's foundation for examining field experiences and their relationships to InTASC categories as reported through matrices and candidate self-report. The CAEP Standard 2 Committee found the information provided by the participating faculty to be beneficial to the unit's self-study. The committee also found that if matrices were provided by all faculty, the EPP could continue its self-evaluation in a more comprehensive manner. In addition, these efforts can increase collaboration among faculty and reduce the disconnect between instructors, coursework, and experiences. Despite the past ambiguity surrounding the various field experiences being implemented within the EPP, a retrospective approach was taken to better define, align, and sequence these practices. The EPP plans to follow through with the committee's recommendations to expand the pilot to all faculty members, commence efforts to evaluate and restructure field experiences, and document and report on its findings.

Appendix A-Field Experience Demographics Form (Excerpt)
1) Enter a name for the field experience(s).
2) Today's Date.
3) Your First Name.

Appendix B-Field Experience Matrix
Instructor: ______ Course: ______

Columns: Order | Level | Type | Quantity | Relationship to InTASC Category | Description

Order-The order in which the field experiences are completed. Most likely, this will be in numerical order going down as 1, 2, 3, etc.
Type-Options are observation, video, interview, small groups, tutoring, whole class instruction, lesson implementation, etc.
Quantity-How many of these field experiences are completed in the course.
Relationship to InTASC Category-Options are "The Learner and Learning," "Content Knowledge," "Instructional Practice," or "Professional Responsibility." Candidates must choose one of these four options, designated by you, the instructor, in their field experience forms in LiveText.

Sample descriptions:
The candidate will apply FE2 (Small Group Instruction), which aligns with the InTASC category (The Learner and Learning-all 3 standards) because it relates to teacher comprehension of individual development, understanding of individual differences, and the inclusion of diverse populations, as well as the importance of collaborative learning. The candidate will also apply FE2 (Small Group Instruction), which aligns with the InTASC category (Instructional Practice-specifically Standard #8) because it relates to the implementation of a variety of instructional strategies to build academic skills in meaningful ways. The candidate will also apply FE2 (Small Group Instruction), which aligns with the InTASC category (Professional Responsibility-both standards) because the candidate is required to interview the supervising teacher regarding the state evaluation tool; this aligns with engaging in professional learning and collaboration.

The candidate will apply FE3 (Whole Group Instruction), which aligns with the InTASC category (Content-both standards) because it relates to teaching the academic content using differing perspectives to engage learners. The candidate will apply FE3 (Whole Group Instruction), which aligns with the InTASC category (Instructional Practice-both standards) because it relates to assessment, planning, and instructional strategies.
The candidate will also apply FE2 (Small Group Instruction), which aligns with the InTASC category (Professional Responsibility-both standards) because the graduate candidate is required to interview the supervising teacher regarding the state evaluation tool-this aligns with engaging in professional learning and collaboration.