
Template for Initial Teacher – Section III. May 18, 2016

EDUCATION PROFESSIONAL STANDARDS BOARD

III. Program Profile: This profile describes a program category, which includes potential variations of program offerings. Each instance or variation must be distinguished among the others in order to ensure regulatory compliance. Please see the “Program Review Technical Guide” for additional details.

Program Identification

Name of the Program Category: Middle School - English

Grade Levels: (check all that apply) ☐ B-P ☐ P-5 X 5-9 ☐ 5-12 ☐ 8-12 ☐ P-12

Program Classification: (check all that apply) X Undergraduate ☐ Undergraduate – Cert Only ☐ Graduate ☐ Graduate – Cert Only

Program Route: (check all that apply) X Traditional ☐ Option 6 ☐ Option 7

Program Sites: (check all that apply) X Main/Residential Campus ☐ Off-Site Campus (list each location)

Campus Name: Lindsey Wilson College
City: Columbia

Delivery Modes: (check all that apply) X Face-to-Face Only ☐ Online Only ☐ Hybrid

EPP Submission Coordinator:
Name: Dr. David W. Moffett
Phone: 270-384-8135
Email: [email protected]

If Option 6 - provide Program Coordinator:
Name: ___________________________
Phone: ___________________________
Email: ____________________________

Name of Institution Program Template Part III Page 1


Program Experiences

Program Innovations: (Optional) Program-Initiated Innovations. These innovations may span the most recent three years and should include all variations within this program category.

Professional and content courses in the program have remained the same since the 2013 NCATE visit. With the core-level general education courses, the total program ranges from 120 to 124 hours, 120 hours being the college-required minimum.

Program Curriculum:

Each EPP must inform a potential candidate about the program’s content, performance expectations and assessment processes.

How does the EPP communicate the following program requirements? Identify below: required coursework and electives, certification and/or degree result, admission requirements, exit requirements, and the Praxis II test disclaimer. If the EPP offers multiple program routes for this category and certification, include each variation.

The EPP utilizes “degree contracts” (program course plans) to advise students across their program experiences. The EPP also uses course sequence plans in advising. Freshmen have first-year advisors outside the EPP; the EPP works closely with these external first-year advisors to ensure Education students are advised on which courses to take. In the two required courses Education students take prior to being considered for admission into the program, instructors provide program requirement information. The 15 criteria for admission consideration are also available in the annual college catalog, and this same information is provided in the EPP's Student Handbook, which is available online. Advisors regularly meet with Education students to advise them regarding the Praxis I examinations, the criteria for admission into the EPP (Stage 1), the criteria for admission into Stage 2 (upon completion of all required program coursework and candidacy for practicum and student teaching), the relevant Praxis II examinations, and the criteria for successful program and degree completion (Stage 3). Additionally, the EPP provides field experience and student teaching handbooks to students.

Admission criteria for each program code in this category: This must include admission criteria such as GPA, admission assessments, evidence of Code of Ethics and Character and Fitness Review. Reference the applicable program code(s) and regulations (i.e., 16 KAR 5:020, 16 KAR 9:080, 16 KAR 9:090) and the “Program Review Technical Guide” for additional details. Information provided below should correlate to the QAS documentation.

Admission criteria are available to students via the college catalog, both in hard copy and online. Additionally, prior to being admitted to the program, students sign a statement affirming that they have read the EPP student handbook, including the admission criteria. This signed form is placed in each student's hard-copy file maintained by the EPP. These are the criteria:

Pre-Entry to Education Program (Criteria for Admission)
1. Application for admission to teacher education program
2. Completion of 30 hours of semester work*
3. Complete 1 semester @ LWC
4. *or cumulative GPA of 2.75 over the last 36 hrs of coursework
5. Completion of Code of Ethics


6. EDUC 3403/3413 Fundamentals
7. Complete 3 Core Praxis I tests
8. Complete required courses:
   Engl 1013/1023 writing studies class
   Comm 2103 public speaking class
   Math class
9. Candidate handbook acknowledgement
10. 3 Pre-entry Dispositions:
   Fundamentals class
   Teaching Profession class
   Content class
11. Must have acceptable scores on disposition assessment
12. Stage 1 Portfolio - critical thinking & creativity
13. Stage 1 Portfolio - performance & interview
14. On-demand writing w/ satisfactory performance
15. 30 hours field experience w/ satisfactory scores

Pre-Student Teaching Experiences: (Option 6 will skip this section)
How does the program ensure candidates' pre-student teaching experiences meet the requirements as outlined in 16 KAR 5:040 Section 3(3)?

(a) Engagement with diverse populations of students which include:
1. Students from a minimum of two (2) different ethnic or cultural groups of which the candidate would not be considered a member;
2. Students from different socioeconomic groups;
3. English language learners;
4. Students with disabilities; and
5. Students from across elementary, middle school, and secondary grade levels;

(b) Observation in schools and related agencies, including:
1. Family Resource Centers; or
2. Youth Service Centers;

(c) Student tutoring;
(d) Interaction with families of students;
(e) Attendance at school board and school-based council meetings;
(f) Participation in a school-based professional learning community; and
(g) Opportunities to assist teachers or other school professionals.

Clinical Requirements by Course / State Regulation / Field Requirements by Course

EPSB regulation requires that 200 field hours be completed prior to student teaching. Requirements are listed in Table I; the distribution of each course's field requirements can be found in Table II. Due dates for completing portions of the field hours throughout the semester will be set by the Education Division. Instructors must require students to complete assignments according to these due dates.

Table I – Middle Grades (reconstructed from a garbled conversion; cell-to-column alignment is approximate)

Columns: Course | Ethnic (2 groups) | Soc Ec | ELL/ESL | Disab | E/M/S ** | Family Resource Center | Camp Safari @ ACES/Casey | Tutor @ACPC | Family | SB/SBC | PLC | Observe Master Teacher | Assist Teacher | Video

EDUC 2123 Teach Prof: E/M/S; 10 Hrs
EDUC 3123 Principles of Lifelong Learning
EDUC 3143 Except Learner: 10 Hrs; E/M/S
EDUC 3403 Fundamentals: X; Tutor 3 Hrs; E/M/S; 2; 3; 12 Hrs
EDUC 3523 Reading/Writing Content Area: X; X; 3 Hrs; X; E/M/S Mid Sec Music; 5 hr MS; 2 Hrs; 20 Hrs placement at grade level, reading; X
EDUC 4103 Measurement & Assessment: E/M/S; 15 Hrs (25 Art); work with school assessment/CIITS/PGES/data analysis
EDUC 4463 Class Mgmt: E/M/S; 1 hr; 2.5 Hrs Teach/Tutor; 2 Hrs Read Adair; 1.5 hr; 23 Hrs (12 Hrs P-5) placement in major; X
EDUC 4433 Curriculum & Methodology: *X; S; 3 Hrs Read Adair; X; 27 Hrs; X; X; 30
Practicum: X; X; X; X; 40
Total hours per program: 1604603 (figures run together in the source)
Total Field Hours Required Prior to Student Teaching: 200

*Participating/attending Professional Learning Community must be included (Russell County incorporates as common planning).

** Audit of placement across field experiences occurs at the start of EDUC 4263/4463 Classroom Management, to ensure all pre-service candidates have experiences across all levels.

Placement Note: When qualified teachers are limited, or the instructor wants several teacher candidates to work with a particular teacher, arrangements will be made with the public school teacher to take several teacher candidates on a rotating basis. The instructor will create a schedule in conjunction with the public school teacher assigning candidates specific class periods and/or dates to complete the assignment. Candidates must complete the field assignment as scheduled.
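The 200-hour threshold described above is a simple arithmetic constraint on logged hours. A minimal sketch of how a program might tally a candidate's field hours follows; the hour values and function names are illustrative placeholders, not the EPP's official Table I figures or tracking system.

```python
# Hypothetical tracker for the EPSB 200-field-hour requirement.
# Course hour values below are illustrative, not official figures.

REQUIRED_PRE_STUDENT_TEACHING_HOURS = 200

def total_field_hours(hours_by_course):
    """Sum completed field-experience hours across courses."""
    return sum(hours_by_course.values())

def cleared_for_student_teaching(hours_by_course):
    """True once a candidate's logged hours meet the EPSB minimum."""
    return total_field_hours(hours_by_course) >= REQUIRED_PRE_STUDENT_TEACHING_HOURS

log = {
    "EDUC 2123": 10,
    "EDUC 3143": 10,
    "EDUC 3403": 12,
    "EDUC 3523": 20,
    "EDUC 4103": 15,
    "EDUC 4463": 23,
    "EDUC 4433": 27,
    "Practicum": 40,
}
print(total_field_hours(log))             # 157
print(cleared_for_student_teaching(log))  # False: more hours still required
```

A real system would also enforce the per-course due dates set by the Education Division, not just the overall total.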

Describe the culminating Clinical/Professional Experiences for each instance in this program category: Reference the regulation 16 KAR 5:040 Section 6 about professional experiences. The Option 6 instance of this program category can ignore this section since the program must use KTIP as the culminating experience.

The P-5, Physical Education/Health, Physical Education, Middle Grades Education, Music Education, and Art Education programs have placements across two levels (P-5, MGE, 8-12) during the student teaching semester. The 8-12, Secondary Education program has one placement across the semester. Student teachers have a total of 70 days, 6.5-7 hours per day. Some placements are seven weeks and some are eight; duration depends on the calendar and when student teachers reach the 70th day. Student teachers initially observe and assist teachers. Then they begin their teaching experience by teaching one class, and eventually they solo teach (this sequence is available in the EPP's Student Teacher Handbook). Candidates design and teach a ten-day unit plan of instruction. They also develop a Leadership and Professional Growth Plan. Additionally, they write and evaluate an ethical case study. They also evaluate a video lesson of their teaching and complete their credential file. All student teachers co-teach. They solo teach for a total of four weeks: those with two placements solo teach for two weeks in each placement, and 8-12, Secondary Education students solo teach for four weeks in their single placement.

Exit requirements for each instance in this program category: This must include exit assessments (i.e., KTIP assessment, portfolio/work sample, GPA; if the program requires passing or taking the Praxis II for program completion, list it here). Reference CAEP 3.5 and 3.6.

Stage 3 (Completed Education Program) - Graduated from LWC

2.75 Minimum Cumulative GPA
2.75 Minimum Professional GPA
2.75 Minimum Content GPA
Successful completion of Stage 3 Exit Portfolio
Must have completed:
   Student Teaching
   All Professional courses
   All Content courses
   All General Education courses
   All Elective hours
Pre-service candidate completion of CA-1 application form
Transcript audited and approved by Registrar
Graduation from college

Note: Pre-service candidates are not required to pass Praxis II prior to graduating from the college. Pre-service candidates are required to register for Praxis II tests in the semester prior to student teaching. Upon successful passing of all required Praxis II tests:
EPP Certification Officer sends CA-1 application to EPSB
Exit student from EPSB website
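The three GPA minima in the exit criteria above are independent conditions: each must reach 2.75 on its own. A short sketch of that check follows; the function and argument names are assumptions for illustration, not part of the EPP's actual records system.

```python
# Illustrative Stage 3 exit-GPA check; names are assumptions, not the EPP's system.

GPA_MIN = 2.75

def meets_exit_gpa(cumulative, professional, content):
    """All three GPA categories must independently reach the 2.75 minimum."""
    return all(g >= GPA_MIN for g in (cumulative, professional, content))

print(meets_exit_gpa(3.10, 2.90, 2.75))  # True
print(meets_exit_gpa(3.40, 2.70, 3.00))  # False: professional GPA below minimum
```

Note that a high cumulative GPA cannot compensate for a professional or content GPA below 2.75.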

Kentucky P-12 Curriculum Requirements

The following information is gathered in accordance with Kentucky Senate Bill 1 - http://www.lrc.ky.gov/record/09RS/SB1.htm and the associated legislation tied to this bill.

How does the EPP ensure each candidate’s knowledge/proficiency of the Kentucky Academic Standards (KAS)? How does the EPP measure the depth of knowledge of each candidate?

The Kentucky Academic Standards (KAS) are introduced early in all programs. Candidates study KAS across subject areas in the beginning education courses. When candidates declare their majors, their focus of study becomes the Kentucky Academic Standards for their content area(s). Kentucky Academic Standards are utilized in all lesson plans, units, learning centers, web quests, and all other items designed by candidates for instruction of P-12 students. The instructor for each class assesses all candidate work to make sure the standards are used in an acceptable manner. The EPP evaluates the candidates' use of the standards at each of the three stage assessments. The stage assessments measure the KAS depth of knowledge of each candidate through portfolio evaluations (Stages 1, 2, and 3), an interview (Stage 1), and a pre-service candidate presentation (Stage 2).

Briefly describe how candidates use the Kentucky P-12 curriculum framework and the Kentucky P-12 assessment system to guide instruction.

Across each pre-service candidate's program, students create lesson and unit plans driven by the Kentucky P-12 curriculum framework. Students select the standards that drive the components of the lesson plans. Naturally, the P-12 curriculum is aligned with the P-12 assessment system, and students' lesson and unit plans reflect this alignment. Field experiences, practicum, and student teaching supplement pre-service candidates' knowledge of the Kentucky P-12 curriculum framework and assessment system as well. Additionally, the EPP features a standalone assessment course, which all pre-service candidates take; the course includes instruction in the P-12 K-PREP assessment, the ACT, and College and Career Readiness.

Provide evidence (KTIP assessments/portfolio/other data) of candidates’ use of the KAS framework in lesson plans (include lesson plan format if not using the current KTIP format).

The EPP uses the KTIP instrument to assess teaching episodes of all its pre-service candidates. The EPP's lesson plan is based on the KTIP plan but has additional content that explicitly features the detailed use of the KAS framework. The lesson plan is found in the EPP Candidate Handbook and Student Teaching Handbook.

Provide evidence of candidates' abilities to create and use formative and summative assessments to guide instruction toward mastery of the Kentucky P-12 curriculum framework.

The EPP's pre-service candidates receive instruction in lesson plan design in the early Fundamentals courses. Their lesson plans become part of the Stage 1 Portfolio judged by committees for students being admitted into the programs. In the early courses, pre-service candidates are instructed on the use of the Kentucky P-12 Curriculum Framework. Candidates learn to design instruction using formative and summative assessment strategies.

As candidates progress through their programs, they design units of instruction that range from five to nine days of formative assessment, with a summative assessment at the end of the unit. Ultimately, all pre-service candidates create ten-day unit plans of instruction containing pretests, formative assessments, and summative posttests. These unit plans become part of the students' Stage 2 portfolios, which are judged by committees for entry into Stage 2. During the student teaching semester, the student teachers design and teach a ten-day unit. The student teacher also assesses and evaluates the results of the unit taught.


Courses

Use the “COURSES” tab on the Program Review Spreadsheet

Provide a list of the program courses (include all courses in the curriculum guide; General Education courses are not required). Ensure that the courses are identified and linked to each program category and program code on the “Program Review Spreadsheet”. When completing the “COURSES” tab, the EPP can enter all courses for all programs in one spreadsheet.

Clinical Educators

Use the “Clinical Educators” tab on the Program Review Spreadsheet

Provide a list of all Clinical Educators who prepare candidates in this program category. Include full-time and part-time faculty; identify the adjunct teachers; do not include cooperating teachers. These should be members who are directly involved with program delivery. Ensure that each educator is identified and linked to one or more program categories. When completing the “Clinical Educators” tab, the EPP can enter all educators for all programs in one spreadsheet.

Key Assessment Areas

Use the “Assessments” tab on the Program Review Spreadsheet

In this section, identify the assessments used to generate program data to demonstrate mastery of the Kentucky Teacher Standards. For each assessment area, indicate the type or form of the assessment and when it is administered in the program. EPPs must identify the assessments for each assessment area to demonstrate meeting the Kentucky Teacher Standards. Reference the “Program Review Technical Guide” for additional details. When completing the “Assessments Initial” tab, the EPP can either enter all assessments for all initial programs in one spreadsheet (this approach requires that each assessment is tagged to specific program codes), or enter the assessments for each program code in a separate spreadsheet.

Align to Standards

Use the SPA tabs on the Program Review Spreadsheet

The purpose of the alignment section is to indicate where the program courses address the applicable Specialty Professional Standards. Some programs will be expected to demonstrate alignment with multiple SPAs (i.e., ACEI, NCTM, ILA, ISTE, etc.). The Program Review Spreadsheet provides each of the major standard areas, including the SPAs to be used to show this alignment. This alignment provides direction and guidance for the evaluation of addressing all the standards through the program review process. Many EPPs have their own alignment tables and combine standards through various crosswalks – these may be attached as an addendum and may replace the alignment tables in the Program Review Spreadsheet. (Assessments are aligned with the KTS and the course alignments are for the SPA.)


Evidence and analysis

Repeat this section for each assessment

Evidence for meeting standards - For each instance in this program category, provide a narrative about the eight (8) assessment areas, discuss the instrument, scoring guide/criteria, and alignment to the Kentucky Teacher Standards. The narrative provides a rationale for how the assessment area demonstrates candidate mastery of the standards. Many EPPs study their assessments on a periodic basis and develop comprehensive reports and graphs; this report may be attached as an addendum and may be used to replace the table questions below only if all equivalent information is provided. When completing this section, the EPP will copy this table eight (8) times for each instance in this program category. If the assessments are the same for each instance, then declare in your narrative that they are the same, or only show those assessments which are different. Reference the “Program Review Technical Guide” for additional details.

Assessment Title: The eight (8) new CAEP annual reporting measures (CAEP Handbook, March 2016, p. 63), within CAEP Component 5.4, are the EPP's assessment areas as of the 2016-2017 academic year. The assessments are the same across all program categories. Impact measures are embedded within the EPP's Stages 1, 2, 3, and In-Service Candidate data. The outcome measures result from state and national data. The CAEP-driven measures are as follows:

Impact measures:

1. P-12 student learning/development (portion of EPP Stage 3 “Prescribed Action Research”)
2. Observations of teaching effectiveness (portion of EPP Stages 2 & 3 IPR teaching evaluations)
3. Employer satisfaction and completer persistence (In-Service Candidates)
4. Completer satisfaction (Student Teacher survey portion of Stage 3)

Outcome measures:

5. Completer rate
6. Licensure rate
7. Employment rate
8. Consumer information

Descriptions of the assessments:

Impact measures:

1. P-12 student learning/development: During the student teaching semester, the student teachers design and teach a ten-day unit plan of instruction. The unit includes a day-one pretest and a day-ten posttest. Student teachers analyze these data to determine disaggregated (by student) and aggregated (class-wide) gains and losses in student learning. These prescribed action research projects are being formalized beginning in the 2016-2017 academic year, for both CAEP Standard 2 clinical practice scholarship requirements and college-wide sharing of student research.
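The pretest/posttest analysis described above reduces to two computations: a per-student (disaggregated) gain and a change in the class mean (aggregated). A minimal sketch follows; the scores are hypothetical, and the function names are illustrative rather than part of the EPP's actual action-research template.

```python
# Sketch of the pretest/posttest gain analysis described above.
# Scores are hypothetical; each student teacher designs the actual unit assessments.

def student_gains(pre, post):
    """Disaggregated (per-student) gain: posttest score minus pretest score."""
    return {name: post[name] - pre[name] for name in pre}

def class_gain(pre, post):
    """Aggregated (class-wide) gain: change in the class mean score."""
    return sum(post.values()) / len(post) - sum(pre.values()) / len(pre)

pre  = {"A": 40, "B": 55, "C": 62}   # day-one pretest scores
post = {"A": 70, "B": 68, "C": 90}   # day-ten posttest scores

print(student_gains(pre, post))        # {'A': 30, 'B': 13, 'C': 28}
print(round(class_gain(pre, post), 1)) # 23.7
```

Disaggregating first matters: a positive class-wide mean gain can mask individual students whose scores declined.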

2. Observations of teaching effectiveness: The EPP observes student teachers four times during their experience(s). The four-part IPR instrument is being used starting in the 2016-2017 academic year by both college supervisors (CSs) and cooperating teachers (CTs). Also beginning in the 2016-2017 academic year, pre-service candidates shall be observed prior to student teaching by a CS and CT, using part 1 of the IPR in their first mutually observed teaching episode and parts 1, 2, and 3 of the IPR in their second observed teaching episode. Video teaching episodes of students are also evaluated in courses prior to student teaching.

3. Employer satisfaction and completer persistence: EPSB and KACTE are collaborating to create the database system, “heat maps”, and in-service candidate data reporting that will be shared with the EPP.

4. Completer satisfaction: Upon completion of their programs, student teachers complete a survey provided for the EPP. Beginning in the spring semester of academic year 2016-2017, as part of the enhanced P-12/EPP CAEP Standard 2 requirement, the EPP shall conduct annual focus groups of teachers in area school districts to receive additional satisfaction feedback, for continuous improvement purposes.

Outcome measures:

5. Completer rate: Data are shared between the EPP, EPSB, and ETS to determine annual completer rates. These rates are analyzed by the EPP, for continuous improvement purposes, and shared in annual reports.

6. Licensure rate: This rate is calculated by both EPSB and the EPP. The rate is analyzed by the EPP for continuous improvement purposes.

7. Employment rate: Data are shared between the EPP and EPSB resulting in employment rate statistics. CAEP Standard 3 requirements are considered when analyzing these data, in that the EPP’s strategic plan regarding program majors is directly affected by annual results. These data also inform college-wide recruiting efforts, related to CAEP Standard 3.

8. Consumer information: While state created consumer information is not used by CAEP for the purposes of accreditation approval, the EPP anticipates receiving and analyzing these data to inform its continuous improvement, and to share with interested audiences.

Assessment description: The EPP utilizes the same four key assessment instruments across all programs. Three of the four are KTS-driven, resulting from preparation for, and successful completion of, Stages 1, 2, and 3 across pre-service candidates' programs. The fourth key assessment is the IPR teaching evaluation instrument. Below, the three Stages and the clinical portion are titled by each of the four key assessments, and the documents supporting, and culminating in, each of the three Stage areas and the clinical portion are listed. The KTS-driven key assessments are utilized across each undergraduate, initial certification program, across Stages 1, 2, and 3. The fourth common key assessment is the IPR PGES teaching evaluation instrument, which is used in a “stairstep” fashion in pre-student teaching episodes and student teaching. Each of the four key EPP assessment instruments, common across all undergraduate, initial certification programs offered by the EPP, is attached in support of the program reports.

Assessments that inform and culminate within the four EPP Key Assessments. Note: These same assessments are used across all initial certification undergraduate programs:

Pre-Stage 1 instruments:
Stage 1 Portfolio - critical thinking & creativity

Stage 1 - Entry Portfolio Scoring Rubric Stage 1 (see attachment)


Stage 1 Portfolio - performance & interview
Stage 1 Entrance Interview Assessment Teacher Ed Program (see attachment)

Evaluation of non-teaching student field performance - Teach Profession course (see attachment)
Evaluation of student field performance - Fundamentals course (see attachment)
Self-Assessment of Dispositions (see attachment)
Stage 1 Record of Field Experience Hours (see attachment)
Stage 1 Field Experience log - cover sheet (see attachment)
The assessments above culminate in Key Assessment #1, Stage 1 Summative Instrument

Pre-Stage 2 Instruments:
Summary Interim Disposition Assessment (see attachment)
Stage 1&2: Field Experience log - cover sheet (see attachment)
Stage 2 Evaluation Teaching in field experience (see attachment)
Stage 2 Evaluation Non-teaching in field experience (see attachment)
Stage 1&2 Record of Field Experience Hours (see attachment)
Intern Performance Record (IPR) Observation Evidence and Rating for Domain 1 (see attachment)
Intern Performance Record (IPR) Observation Evidence and Rating for Domains 1, 2, & 3 (see attachment)
Portfolio Scoring Rubric Stage 2 (see attachment)
Professional Growth Plan (PGP)-LWC Teacher Performance Assessment (see attachment)
Leadership/Service Plan-LWC Teacher Performance Assessment (see attachment)
Stage 2 Presentation Oral Communication Rubric (see attachment)
Stage 2 Written Communication Value Rubric (see attachment)
Ethical Reasoning Value Rubric (see attachment)
Stage 2 Presentation/Portfolio Committee Summary Score Sheet (see attachment)
Disposition Summary Stage 2 (see attachment)
Stage 2 Disposition Assessment (see attachment)
The assessments above culminate in Key Assessment #2, Stage 2 Summative Instrument

Pre-Stage 3 Instruments:
LWC Student Teacher: Teacher program review & evaluation (see attachment)
Student Teacher/Completer Survey (see attachment)
Principal Exit Review and Evaluation (see attachment)
Intern Performance Record (IPR) Observation Evidence and Rating for Domains 1-4 (see attachment)
Formative Student Teaching Evaluation form (see attachment)
Summative Student Teaching Evaluation Instrument (see attachment)
EDUC 5133 Essay Scoring Rubric (see attachment)
EDUC 5133 Selected Response (see attachment)
EDUC 5133 Performance Assessment Rubric (see attachment)
The assessments above culminate in Key Assessment #3, Stage 3 Student Teaching Summative Scoring Rubric

Key Assessment #4, Intern Performance Record (IPR) Instrument

Pre-student teaching first observation: IPR Domain 1
Pre-student teaching second observation: IPR Domains 1, 2, and 3
Student Teaching observations: IPR Domains 1, 2, 3, and 4

Discuss the data analysis for this assessment: Explain how the assessment data supports/validates a candidate’s ability through the progressions of this program:


Key Assessment 1: For Pre-Service Candidates to be admitted to the program, the rater & interrater averaged scores for the interview and portfolio must be at least at the Acceptable level. Dispositions must be at the Acceptable level.

Key Assessment 2: For Pre-Service Candidates to be admitted to Student Teaching, the rater & two interrater score averages must be at the Target level for the student presentation. The rater & interrater score averages for the student portfolio must be at the Target level. Dispositions must be at the Target level.

Key Assessment 3: For Pre-Service Candidates to complete the program, the rater & two interrater score averages must be at the Target level for the portfolio. The rater and interrater score averages must be at the Target level for Student Teaching. The Principal interview and survey must be received. The Student Teacher survey must be received.

Key Assessment 4: IPR. For Pre-Service Candidates, the rater & interrater score averages for pre-student teaching practice episode 1 must be at the IPR's "Low Developing" level; the rater & interrater score averages for pre-student teaching practice episode 2 must be at the IPR's "Low Developing" level; and the summative rater & interrater score averages for student teaching must be at the IPR's "Developing" level.

Provide a link to the assessment scoring guide or rubric. (Not required for Praxis II)
Please see the four key assessments attached.
Discuss how the reliability and validity of this assessment have been established and supported.
The EPP's four key assessments are pre-validated instruments: items 1, 2, and 3 are Kentucky Teacher Standards-driven, and item 4 is the state IPR instrument. Reliability was not evident in earlier iterations of the EPP's database design. Now, with the new Quality Assurance System database design, at least one rater score and one interrater score are recorded for every assessment item, making it possible to run statistical analyses that ensure scores meet the required agreement level of .80 or greater.


Summary Analysis for Program

Provide a holistic summary and rationale for how all key assessment areas demonstrate the program’s overall quality, and how each candidate has demonstrated appropriate performance of the Kentucky Teacher Standards. Many EPPs study their assessments on a periodic basis and develop comprehensive reports and graphs; this report may be attached as an addendum and replaces the analysis summary and improvement sections below. If the EPP chooses to append EPP-designed reports, a narrative description/interpretation of the report(s) must be included.

As previously mentioned, three of the four EPP key assessments are KTS-driven. This enables pre-service candidates to demonstrate appropriate performances at Stages 1, 2, and 3. If, for any reason, pre-service candidates do not demonstrate appropriate performances at a given Stage, remediation opportunities are provided. The fourth key assessment is the IPR instrument, used in a stair-step fashion across pre-student teaching episodes and student teaching. While the EPP has three cycles of data to analyze, those data do not support measures of reliability, so the results are questionable. In the newly designed Quality Assurance System, reliability measures are in place and measurable.

Continuous Improvement Plan for this program category: Provide an explanation of how assessment data are/were used to improve this program.

The shift from EPSB/NCATE accreditation to EPSB/CAEP accreditation is challenging, and being one of the first two EPPs in Kentucky to experience the new accreditation process compounds the challenge. It also provides opportunities: the aforementioned shifts in the eight key assessments, the drive toward the new quality assurance system, and the urgency to make needed changes. While the data representing the three most recent cycles of the prior quality assurance system are not perfectly aligned with the new accreditation requirements, they provide much to discuss regarding continuous improvement. In some cases, what did not exist in the "old" system informs potential improvement more than what did exist. Further, as exhibited in the following narrative, much of the current continuous improvement plan, and the resulting action plans, grew out of feedback, discussions, and insights from Stages committee members, students, faculty, and staff. A mixed-methods data approach, then, is used and valued in the EPP's continuous improvement strategies.

Assessment Validity and Reliability

The EPP had a successful National Council for Accreditation of Teacher Education (NCATE) visit four years ago. Three areas for improvement (AFIs) were identified by EPSB in 2010; one was the need to ensure assessment instruments possess validity. Beyond possible validity issues, the new division chair discovered reliability deficits across assessments and in the continuous assessment system. Validity types include internal, external, test, content, construct, criterion, and face validity (Springer, 2010). EPPs and their accreditors are concerned with content, construct, and criterion validity. Content validity refers to how well an assessment covers all facets of what it intends to measure, while construct validity refers to the extent to which an assessment measures the construct it intends to measure. In other words, content validity is concerned with the merit of each component of an assessment, while construct validity is concerned with how well the assessment as a whole measures what it is supposed to measure. Criterion validity measures the accuracy of tagging the assessments used to the relevant state and national standards.

Reliability measures the quality of the procedures used in scoring assessments and indicates the degree of accuracy of that scoring (Springer, 2010). Raters and interraters score assessments blind to each other's scoring, and scorers are trained before scoring to ensure consistent knowledge of the assessment scoring levels. The minimum acceptable level of scoring agreement is .80: particular assessment scores must be the same at least 80 percent of the time for the results to be considered reliable. If the agreement rate is less than .80, the scores are not reliable.
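The .80 exact-agreement rule described above can be sketched as a simple calculation. The following is a minimal illustration with hypothetical rubric scores, not the EPP's actual statistical procedure:

```python
def agreement_rate(rater_scores, interrater_scores):
    """Fraction of items on which the two blind scorers gave
    the identical rubric score (exact percent agreement)."""
    if len(rater_scores) != len(interrater_scores):
        raise ValueError("score lists must be the same length")
    matches = sum(a == b for a, b in zip(rater_scores, interrater_scores))
    return matches / len(rater_scores)

# Hypothetical rubric levels assigned to ten assessment components
# by a rater and a blind interrater:
rater      = [3, 2, 3, 4, 2, 3, 3, 4, 2, 3]
interrater = [3, 2, 3, 3, 2, 3, 3, 4, 2, 3]

rate = agreement_rate(rater, interrater)
print(rate, rate >= 0.80)  # 0.9 True
```

Note that exact percent agreement is the simplest possible reliability statistic; measures such as Cohen's kappa, which correct for chance agreement, could also be run once both scores are stored for every item.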

As mentioned, one of the EPP's AFIs was that the assessments in use had not been tested for validity. The validity issue persisted through, and including, the 2015-2016 academic year, when EPSB shared the AFI with the new division chair. A seminal measure used to test content validity is Lawshe's Content Validity Ratio (1975): CVR = (n_e - N/2) / (N/2), where n_e is the number of experts rating a component "Essential" and N is the total number of experts. The formula yields values from +1.0 to -1.0. Experts score the components of an assessment on a 3-point scale: "Essential," "Useful, But Not Essential," and "Not Necessary." Simply put, if half or more of the experts rate an assessment component "Essential," the CVR is non-negative and the item possesses content validity.
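Lawshe's formula can be sketched directly. The panel size and "Essential" count below are hypothetical, used only to illustrate the arithmetic:

```python
def content_validity_ratio(n_essential, n_experts):
    """Lawshe's CVR: ranges from -1.0 (no expert rates the
    component essential) to +1.0 (every expert does)."""
    half = n_experts / 2
    return (n_essential - half) / half

# A hypothetical 10-member expert panel in which 8 members
# rate an assessment component "Essential":
print(content_validity_ratio(8, 10))  # 0.6
```

With 8 of 10 experts rating the component "Essential," the CVR is (8 - 5) / 5 = 0.6, a positive value indicating content validity under the half-or-more rule.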

So, who are the experts who evaluate the assessments? In this EPP's case, the experts are the members of the council appointed by the EPP to receive recommendations for program changes and to approve students for admission into the program and into student teaching. This council, known as the Teacher Education Committee (TEC), meets twice a year and serves as the EPP's panel of experts. To best align with the new CAEP Standard component 5.5, the TEC should be composed of the EPP full-time faculty, an adjunct representative if applicable, P-12 school district partner leaders, P-12 school district teachers, recent graduates of the program who are teaching, and students in the program. This new TEC member configuration will be in place in fall semester 2016. The chair, with the data manager's assistance, amassed all assessments used across pre-service students' program experiences. Interestingly, the audit revealed that none of the assessments used by the EPP is self-created; the key assessment instruments across EPP Stages 1, 2, and 3 are Kentucky Teacher Standards-driven.

It should be noted that state and national assessments used by EPPs do not have to be tested for content validity, because they have already been externally validated. Surveys also do not need to be measured for validity, since they measure perceptions. Assessments created by the EPP are the focus of content validity testing. Therefore, none of the key assessment instruments used by the EPP across Stages 1, 2, and 3 needs to be tested for content validity.

Construct validity, as previously mentioned, measures the extent to which an assessment measures what it intends to measure. Upon careful examination, it became evident that construct validity was the culprit in the previous accreditation results, leading to the AFI.

In academic year 2015-2016 it was discovered that an assessment instrument in one of the EPP's assessment stages had construct validity issues. Although the EPP utilizes the pre-validated Kentucky Teacher Standards for the interview of pre-service candidates seeking to enter the student teaching stage, so that no content validity test was needed for the state-pre-validated instrument, the interview itself possessed a construct validity issue: an interview did not best measure what a pre-service candidate had accomplished since being admitted to the program. This resulted in a Stage 2 assessment shift from student interviews to student presentations of work produced across their programs, still driven by the Kentucky standards.

The review of tagging course syllabi to standards revealed possible issues with criterion validity. The EPP requests directives from EPSB regarding standards deficits evident in the crosswalk provided.

CAEP Standard 5.2 is particularly relevant to the EPP's assessment validity and reliability. The component states, "The provider's quality assurance system relies on relevant, verifiable, representative, cumulative and actionable measures, and produces empirical evidence that interpretations of data are valid and consistent" (CAEP Handbook, 2016). In other words, the assessment system is only as good as its data. If the data are valid and reliable, then the foundation needed for legitimate analyses is in place, and those analyses lead to decisions driven by the spirit and intention of continuous improvement.


Based on what was discovered, there were issues with both construct validity and scoring reliability, and there may also be criterion validity issues. These issues provide the EPP with continuous improvement opportunities.

In the next yearly report, the EPP will address the validity AFI with the above information, if the validity AFI persists beyond what the EPP has shared in this report.

Reconfiguring Program Assessments

Construct validity measures the extent to which an assessment measures what it intends to measure. In the fall semester of 2015-2016, the new chair discovered construct validity issues. Students in the last semester of their required course work, prior to their practicum and student teaching experiences, prepare to qualify for admission into Stage 2 of the program, where pre-service candidates complete their practicum and student teaching. Since the last accreditation visit, students have prepared electronic portfolios and prepared for an interview driven by the state standards.

Stage 2 committee members consist of a combination of full-time Education faculty, local P-12 district faculty or administrators, and Professors from other divisions across the college. Before the study's first-semester Stage 2 experience, faculty serving on a Stage 2 committee from beyond the Education division told the Investigator they were sacrificing half of their week-long fall break to review the electronic portfolios, and that this would be the last time they would do so.

Once in the interviews, a P-12 teacher who served on committees told the Investigator they had tried to avoid the emails sent by the Education division to serve on the committees. They had served on Stage 2 Committees previously and wished not to do so again. Since emails were being ignored, one of the Education division faculty members made a personal visit to the teacher’s school, located the teacher, and secured their consent to serve on the committee.

Among the P-12 teacher's concerns was that the Stage 2 interview was entirely scripted, and that they had been told they could not deviate from the state standards-driven interview script. The same concern regarding the tight scripting of the Stage 2 interviews was subsequently expressed by an "across the college" Professor who had previously served on a Stage 2 committee; the metaphor they used for the Stage 2 interview process was being in a "straightjacket." An additional across-the-college Professor stated they had spent half of their fall break reviewing Stage 2 portfolios and would not do it again. These committee members from beyond the EPP faculty and adjuncts did not want to participate in future Stage 2 student interviews. Clearly, Stage 2 committee members beyond the EPP held many concerns, some of which were directly related to the construct validity problem.

Additionally, during the first semester of the study, some of the students preparing for the Stage 2 entry process seemed to not take it seriously. The chair reviewed four of the electronic portfolios. The students were assigned to share their education philosophies from their early introduction to education course and a second, very recent education philosophy. In a pretest/posttest kind of way, these philosophy of education papers were supposed to reveal the knowledge, skills, and dispositional growth of the students across their programs. In all four cases, the first and second philosophy of education papers were exactly the same. Students shared a letter of introduction in their electronic portfolios. In one of the four cases a student described how they had always wanted to be a mathematics teacher, even though their program major was social science. Such carelessness abounded across the artifacts in these pre-service candidates’ portfolios.

Students' videos were supposed to represent their best teaching of a lesson and were used to judge whether they were ready to student teach. In all four cases, the videos were from a teaching attempt early in the students' programs, and it was obvious they were first attempts at teaching. Additionally, the students' unit plans of instruction were mundane and ordinary. All of the artifacts suggested the students were merely going through the motions. The poor quality of these students' Stage 2 electronic portfolios surely intensified committee members' complaints about the many hours it took to review the product.

To compound the problems with the electronic portfolios, and the substantial time committee members invested in reviewing them, the portfolios were not the focus of the Stage 2 entry interview. Instead, the tightly scripted, state standards-driven interview was general in nature. The questions did not focus on the particular student work; the naturally general nature of the standards produced general interview answers. There was a construct validity disconnect between the students' portfolio work and the state standards interview.

Based on the considerable feedback from Stage 2 committee members, in fall semester 2015-2016, along with the chair’s conclusions regarding the questionable quality of some of the students’ electronic portfolio product, changes in the Stage 2 process were incorporated in the second semester of the study.

In spring semester 2015-2016 the Stage 2 continuous improvement plan included:
- assurance of the quality of the second philosophy of education paper in the course where the paper is produced;
- providing the electronic portfolio earlier in students' programs, rather than two weeks prior to the planned interviews;
- a shift from committee members interviewing Stage 2 candidates, driven by the state standards, to Stage 2 candidates presenting their prescribed course product within the same state standards framework;
- ending the requirement that P-12 members and across-the-college Professors invest hours reviewing the electronic portfolios, while still providing them access; and
- the needed changes in the EPP-created assessment instrument, from evaluating interviews to evaluating presentations.

Change is difficult, and some education faculty and students resisted the change from Stage 2 student interviews to student presentations. In one case a student broke down crying and was escorted to the education division chair by a division faculty member. The student expressed how they had made many life sacrifices to pursue a teaching degree and career, and felt that the change from interview to presentation doomed them and that they would not succeed in their goal of becoming a teacher. The meeting participants were the chair, the instructor of the course, and the student. The chair expressed to the faculty member that their support was needed in this change, heard the student's concerns, and overcame the student's objections; the student subsequently had a successful presentation experience. Buy-in by faculty is critical to the success of change initiatives and EPP continuous improvement, because students look to faculty regarding how to respond to substantive program changes.

In meetings leading to the Stage 2 changes, the chair requested that the Stage 2 student presentations be made available to admitted students in the EPP. The chair provided the simile of a music recital for what the Stage 2 student presentations could become. In other words, students seeking admittance to Stage 2 would share their best work and presentation (a performance) to the committee and student audience. Students observing the presentations would gain ideas for their future presentations. This student legacy approach has the potential for continuous improvement of the Stage 2 student presentations.

Most of the faculty resisted the chair's recommendation to immediately open the Stage 2 student presentations to fellow students in the programs. They felt the change from interviews to presentations would be enough of a challenge for the students in the first semester. The chair consented and agreed to wait until fall semester 2016 to make the Stage 2 student presentations available to other education students. Beginning in fall semester 2016, the Stage 2 presentations can be attended by EPP admitted students as well as the Stage 2 committees.

The chair visited the class where students were preparing for the Stage 2 presentations and discussed the new presentation approach. All students who presented in the new Stage 2 presentation model were successful. Feedback from spring semester 2015-2016 Stage 2 committee members, especially those beyond the EPP faculty, was positive. The change from scripted student interviews to student presentations, with open-ended question-and-answer opportunities at the end of each presentation, will likely make external committee members more willing to participate in future Stage 2 episodes.

The change in Stage 2 from state standards-driven student interviews to state standards-driven student presentations addressed and remedied the obvious construct validity problem. The feedback from committee members, together with the Investigator's review of several electronic student portfolios, constituted the data analyzed in reaching the decision for the needed changes.

CAEP encourages EPPs to collect and analyze data, leading to changes with the intention of improvement. This example where the EPP addressed the Stage 2 construct validity problem should epitomize what is desired by the accreditor, in regards to analyzing data and taking steps to continuously improve.

Data Collection and Analysis

Beyond the issue with the EPP's instrument construct validity, reliability problems with the assessment data became obvious to the new chair in fall semester 2015-2016. As previously described, a construct validity problem was identified and remedied; reliability issues compounded that validity problem.

The assessment system, now known within CAEP Standards guidelines as the quality assurance system, relied on hard copy evaluation instruments being submitted to the EPP's data manager by the raters and interraters. For a variety of reasons, not all hard copy evaluations were collected, tallied, and placed in pre-service candidates' hard copy files. Without all of the needed completed evaluation instruments, the data manager could not compute assessment score averages. Additionally, the database design did not include data cells for both rater and interrater scores. Because those cells were missing, statistical tests for reliability could not be run; therefore, the reliability of scores for the EPP's assessments could not be established.
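A minimal sketch of the record design implied by the new Quality Assurance System, in which every assessment item stores both a rater and an interrater score so that agreement statistics can be computed. The field names and sample records here are hypothetical, not the EPP's actual schema:

```python
from dataclasses import dataclass

@dataclass
class AssessmentItemScore:
    candidate_id: str
    item: str               # e.g. a rubric component label
    rater_score: int        # primary rater's rubric level
    interrater_score: int   # blind second rater's rubric level

    def agrees(self) -> bool:
        # True when the two blind scorers assigned the same level
        return self.rater_score == self.interrater_score

# With both scores stored per item, an agreement rate can be
# computed over any set of records (hypothetical sample data):
records = [
    AssessmentItemScore("C001", "Component 1", 3, 3),
    AssessmentItemScore("C001", "Component 2", 2, 3),
]
rate = sum(r.agrees() for r in records) / len(records)
print(rate)  # 0.5
```

The design point is simply that reliability testing becomes possible only once both scores exist side by side in the database, which the prior hard-copy system did not guarantee.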

CAEP requires submission of three cycles of data, either semesters or academic years, from the quality assurance system, to be analyzed by the EPP and to result in decisions about how best to continuously improve. Of course, if the data do not possess reliability, then subsequent analyses and decisions may well be flawed. Beyond the data reliability issues, there were changes in the data manager position that impacted the assessment system. In spring semester 2015, the EPP data manager suddenly died; data entry for spring 2014-2015 was subsequently non-existent until the midpoint of the 2015-2016 academic year. Then, the data manager hired in June 2015 departed in April 2016, and a third data manager began at the end of academic year 2015-2016. Such turnover in the high-stakes EPP data manager position obviously affects the ability to collect and analyze data, which in turn impedes continuous improvement decisions.

Beyond the hard copy data collection problem, the missing database data cells, and the data manager turnover, EPP faculty, overwhelmed by workloads, often had to score assessment instruments in haste. The chair personally came to know this hasty-scoring reality early on due to workload demands: the scoring of assessments simply had to be subordinated in order for faculty to succeed at their many other tasks. As mentioned earlier, faculty college breaks had been used for evaluating electronic portfolios. This calendar design, in which breaks intended for rejuvenation were used instead for assessment of student product, further compromised the portfolio scoring. Time-pinched faculty added to the data issues; as hard as faculty work during the academic calendar, they do need their scheduled breaks. Scheduling review of student portfolios during needed academic calendar breaks is illogical and detrimental to both faculty and scoring reliability.

CAEP requires three cycles of assessment data. The EPP will have three cycles of data, but due to the issues discussed above those data will be suspect. However, analyses of the issues have led to continuous improvement remedies. For EPPs with CAEP accreditation visits in fall 2017, the new quality assurance system is expected to be in place and operational. CAEP does not necessarily expect the three cycles of data used in the accreditation visit to reflect the new standards; instead, the shortcomings of the former system are to be described, along with a description and operationalization of the new system. The quality assurance system that aligns with the new CAEP standards must be operational at the time of the self-study, one year prior to the accreditation visit. The self-study shall take place in fall 2016-2017.

Electronic Portfolios

In the first semester of 2015-2016, students received their electronic portfolios two weeks prior to their Stage 2 interviews. Many artifacts, drawn from across the students' three years of courses, were expected to be placed in the portfolio within this short window, and in some cases students had misplaced the items along their program journeys. Due to the construct validity issues previously discussed, much of the work students expended on the portfolios seemed futile: the interviews often did not target the product placed in the portfolios.

Several of the programs have academic majors outside of the EPP, and students in those majors obviously have far fewer courses within the EPP. The higher-level EPP courses they do have are "back-ended," meaning the high-stakes courses in which many of their artifacts are produced are taken in a relatively short period of time. Receiving access to the Stage 2 portfolios only two weeks prior to the student interviews compounded this intensive creation of portfolio artifacts. Further, students in these programs, with many courses outside of the EPP, often did not have all the artifacts they were supposed to have when entering the class where the portfolios were built, for a number of reasons.

Additionally, students from all programs are placed in one class, reflecting two course sections, where the portfolios are finalized. Students with more classes within the EPP, and with Education Instructors familiar with the program requirements, are at a natural advantage when it comes to having the needed product for the portfolios. They have more courses across their academic programs, and greater guidance, where the required artifacts can be created. Receiving the electronic portfolio two weeks prior to their Stage 2 interviews caused considerable anxiety for many of the students, especially those whose majors were outside of the EPP. These problems contributed to the former lack of quality evident in some of the students’ portfolio product.

In response to these data and their analysis, among the changes implemented during the second semester of the academic year study was providing the electronic portfolios to students earlier in their programs. While this did not address the immediate problem the current Stage 2 candidates faced, its intent is to reduce students' problems and anxieties with the Stage 2 electronic portfolios, and their product, in future semesters and years.

Education students are advised to take the Technology in Education course early in their programs, and the course was a logical place to provide students with their electronic portfolios. Students who receive their electronic portfolios early are more likely to place the required artifacts in them as each course is completed, which may also reduce the problem of lost documents and the difficulties students with majors outside of the EPP have experienced with the electronic portfolios. The instructor of the Technology in Education course welcomed the opportunity to guide students through accessing and creating their electronic portfolios.

One of the items required in the electronic portfolio is a video of the student's best teaching episode. In fall semester 2015-2016, an online software application for video viewing was sent to Stage 2 committee members, who were instructed to download it so they could view students' video teaching episodes. There were many problems with the video software: certain browsers were not compatible with it, and the difficulties downloading the needed software reduced committee members' viewing of the videos. It also added to the aforementioned frustrations over the time needed to review the electronic portfolios.

The students with majors outside the EPP typically had only one video from which to choose for their best performance. The course in which the final touches are made on the electronic portfolio required a video teaching episode, but the deadline for submitting that video fell one week after the electronic portfolios were due. This caused many students, especially those with academic majors beyond the EPP, not to have at least a second video from which to choose for their portfolios. The already daunting task of loading all required components into the electronic portfolios during the short two-week window subordinated decisions by students to videotape a more recent, and better, teaching episode. So, in some cases, the video performances were students' first-time attempts at teaching. In most cases, first-time teaching episodes are not good evidence that pre-service candidates are ready for student teaching. Within the previously mentioned Stage 2 construct validity issue were videos of teaching episodes exhibiting content validity issues.

CAEP expects student product to be assessed in valid and reliable ways. For this to occur, students, EPP faculty, and external review committee members must have the time and opportunity to produce and carefully evaluate the product, and the product should reflect the students' best attempts to satisfy the requirements and standards. The change in when electronic portfolios are provided to students resulted from collecting data, analyzing those data, and making a decision intended to continuously improve the process. Data shall continue to be collected and analyzed to determine whether the changes produce the intended outcomes.

In future semesters the number of clinical observations of students will increase. These formalized clinical observation opportunities should give students more chances to video their practice teaching episodes, yielding a stronger pool of teaching episodes from which to select the best for their electronic portfolios.

Clinical Experiences

While it is common for nationally accredited EPPs to require clinically observed teaching episodes prior to student teaching, none were evident during the 2015-2016 academic year. Nor was a common assessment used by college supervisors and cooperating teachers during student teaching, let alone before it. Therefore, there was no possibility of measuring reliability in the scoring of students' teaching.

In the continuous assessment design approved four years prior to this study, the EPP had planned to have two teaching episodes clinically supervised prior to student teaching. During the 2015-2016 academic year, the new chair of the division discovered the prior plan, with the help of the data manager, and asked EPP faculty why the pre-student-teaching episodes were not occurring. The faculty told the new chair that the P-12 school teachers did not like them, so the approved plan had been abandoned.

The chair also asked the faculty why no common observation instrument was used during student teaching. The faculty reported that the co-teaching model does not allow the cooperating teacher to observe the student teacher, since the two are always teaching together; as a result, the cooperating teacher needed a different assessment instrument than the college supervisor, who visits the class to formally observe the student teacher four times. The new chair suspected the faculty had misunderstood the co-teaching model and followed up with the state professional standards board for clarification. The board representative confirmed that the faculty's interpretation of the co-teaching model was a misinterpretation of the regulation: student teachers do in fact have standalone teaching experiences, including their teaching of a 10-day unit plan of instruction.

Since cooperating teachers can observe their student teachers, they can in fact use the same observation instrument used by the college supervisors. Naturally, this provides the opportunity to measure the reliability of the scoring, an opportunity that did not exist when the observers used different assessment instruments. Now that the same clinical assessment instrument will be used in student teaching, scoring reliability can be measured. The assessment being used is the state's IPR assessment instrument for P-12 teachers, making it a pre-validated instrument. Still, under the guidelines of new CAEP Standard 2, there should be consensus between the P-12 partner districts and the EPP regarding the instrument's use.
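Once cooperating teachers and college supervisors score the same episodes on the same instrument, inter-rater agreement can be estimated with a standard statistic such as Cohen's kappa. A minimal sketch follows; the rubric scores and panel size are hypothetical, not data from this EPP.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same teaching episodes."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: proportion of episodes scored identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement, from each rater's marginal score distribution.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical rubric scores (1-4) from a cooperating teacher and a
# college supervisor observing the same ten teaching episodes.
ct = [3, 4, 2, 3, 3, 4, 2, 3, 4, 3]
cs = [3, 4, 2, 3, 2, 4, 2, 3, 4, 4]
print(round(cohens_kappa(ct, cs), 2))
```

Kappa corrects raw percent agreement for agreement expected by chance, which matters when most scores cluster at one or two rubric levels.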

Having no teaching episodes observed by both cooperating teachers and college supervisors, using the same assessment instrument, prior to student teaching is a problem for the EPP. The other EPP in the state preparing for a fall 2017 accreditation visit currently has three teaching episodes in its programs prior to student teaching, observed by both cooperating teachers and college supervisors using a common observation instrument. This coming academic year, the year prior to its CAEP visit, that EPP is increasing the number of supervised teaching episodes prior to student teaching from three to six. It is critical that this EPP have at least the previously planned, and approved, two observed teaching episodes prior to student teaching. Plans call for including the two observed teaching episodes beginning in the fall semester of 2016-2017.

Abbreviated versions of the pre-validated state IPR observation instrument are planned for use in the two episodes.

New CAEP Standard 2, which is focused on P-12/EPP partnerships, including clinical practice, is being shared with partner school district leaders beyond this study, along with an abridged version of it created by the EPP chair. Much must be done in a short time to reconfigure the EPP/P-12 partnerships so they are enhanced and meet the components of the new Standard 2.

P-12 School Partnerships

CAEP Standard 2 is focused on P-12/EPP partnerships (CAEP Handbook, 2016). Beyond students’ clinical experiences, CAEP standard 2, comprised of three components, includes such things as the P-12/EPP shared responsibility model, P-12/EPP co-construction of observational instruments and clinical experiences, EPP-provided required online clinical training availability, and analyses of P-12/EPP partnership clinical data.

Component 2.1 requires shared responsibility between P-12 schools and LWC Education for continuous improvement of candidate preparation. Evidence for 2.1 includes a description of the partnerships (MOU), along with documentation that the partnership is being implemented as described. Artifacts include: schedules of joint meetings (in our case TEC meetings, and other partner meetings as needed), documentation of stakeholder involvement, a shared responsibility model, technology-based collaborations, evidence of co-constructed observational instruments (a content validity exercise), and evidence of co-constructed candidate expectations during clinical experiences. Component 2.2 measures: evidence of co-selection of college supervisors (CSs) and cooperating teachers (CTs); evidence of criteria for the selection of clinical educators, including recent field experience and currency in relevant research; clinical educator orientation available both in person and online; performance evaluations of CSs, CTs, and candidates; collected data used to modify clinical experiences; and available records of remediation and/or counseling out of candidates.

Component 2.3 measures ongoing analyses of data for what works and what does not, cross-referencing findings and conclusions, and continuously answering the research question: what is it about the clinical experiences (their depth, breadth, diversity, coherence, and duration) that can be associated with the observed outcomes?
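The content validity exercise for co-constructed observational instruments could follow Lawshe's (1975) content validity ratio, in which panelists rate each instrument item as essential or not. A minimal sketch follows; the item names and panel counts are hypothetical illustrations, not this EPP's data.

```python
def content_validity_ratio(n_essential, n_panelists):
    """Lawshe's CVR = (n_e - N/2) / (N/2), where n_e is the number of
    panelists rating an item 'essential' and N is the panel size."""
    half = n_panelists / 2
    return (n_essential - half) / half

# Hypothetical: 10 P-12/EPP panelists rate each observation-instrument item.
for item, n_e in [("questioning techniques", 9), ("wait time", 6)]:
    print(item, round(content_validity_ratio(n_e, 10), 2))
```

CVR ranges from -1 to +1; items with low or negative values (fewer than half the panel rating them essential) are candidates for revision or removal before the instrument is adopted.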

New CAEP Standard component 5.5 demands a reconfiguration of the committee, or council, that reviews EPP recommendations on a regularly scheduled basis. The committee should be comprised of P-12 district/school/clinical education leaders, P-12 educators, recent graduates (alumni) of the Education Division teaching in P-12 schools, current education students, full-time Education Division faculty, EPP program coordinators, an Education Division adjunct representative, and, optionally, an ex-officio college administration member.

During the 2015-2016 academic year, unsurprisingly, the teacher education committee did not reflect the new CAEP-prescribed membership. Attendance by members beyond the EPP was low or non-existent. The committee met on the Thursday of the first week of each semester; by then, students to be considered for Stage 1 admission into the programs were already enrolled in the restricted professional courses, and student teachers to be considered for recommendation were already placed in P-12 school settings. Thus, the committee's recommendations were little more than ceremonial.

Beyond the time of this report, the EPP needs to reconsider the timing of the meetings so the recommendations have substance, in addition to reconstituting the committee's membership. The most difficult part of this continuous improvement attempt will be reconciling the timing of student teachers' placement in settings with their approval for student teaching. With the newly enhanced P-12/EPP partnerships, however, it may be possible to give the EPP's approval of student teachers more substance.

Quality Assurance System Revisions

CAEP Standard 5, comprised of five components, describes the new quality assurance system. Previously, these data collection designs were known as continuous assessment systems, but CAEP has given them this new label.

Outcomes for CAEP Standards 1 through 4 are continuously measured by the quality assurance system. These data are monitored through CAEP's eight annual reporting measures (CAEP Handbook, March 2016, p. 63). EPPs with accreditation visits in fall 2017 are not necessarily expected to have CAEP-aligned data from previous data cycles. However, EPPs with these first accreditation visits are expected to have their quality assurance systems in place, and operational, at the time they create their self-studies, one year prior to their visits (the fall semester of 2016-2017 in this EPP's case).

CAEP expects EPPs to have three cycles of data that have been collected, analyzed, and used to make decisions for continuous improvement. The cycles can be either semesters or academic years, which is new in teacher education accreditation; previously, cycles could only be entire academic years.

The EPP in this study does in fact have three cycles of data. However, there are issues with these data, which have been described herein and will also be shared in the CAEP self-study where applicable. An issue with construct validity became evident within the Stage 2 process. It was also discovered that there was no evidence of reliability in the scores from assessment and teaching observation instruments, and no opportunity to test reliability statistically. While these items should, and do, elicit substantial concern, they give the EPP a baseline to write about in the fall 2016-2017 self-study, along with plans for continuous improvement.

Criterion validity issues became apparent when the new chair learned of SPA crosswalk requirements for some courses from EPSB guest speakers at a KACTE conference in Louisville on June 28, 2016. The new chair discerned that Kentucky is a "program review state," with SPAs embedded within the state requirements. This prompted the chair to guide the new EPP office associate in using the EPP crosswalk grid to audit the inclusion of SPA standards within, and across, courses. Because CAEP is centered on the InTASC standards, and the state program report review includes the SPAs, these became critical audit items in the EPP's crosswalk.

The accreditation timeline for the EPP is such that the program reports are due in summer 2016 (this summer). EPP faculty, as well as content faculty, are not on campus during the summer. The EPP chair therefore seeks guidance and direction from the reviewers of the program syllabi in this matter. If there are standards-linked deficits in any of the syllabi, the EPP chair requests prompts and instructions from the syllabi reviewers in written, email form so they may be shared with relevant faculty within the EPP and across the college. Reviewers should find the EPP-provided crosswalk helpful in assessing the needed standards. Upon receiving any needed revisions from the reviewers, the chair will seek guidance from the college's VPAA, and others, for the needed remedies.

References

CAEP (2016). CAEP accreditation handbook (Version 3, March 2016). Washington, DC: Council for the Accreditation of Educator Preparation.

Lawshe, C. H. (1975). A quantitative approach to content validity. Personnel Psychology, 28, 563-575.

Springer, K. (2010). Educational research. Hoboken, NJ: Wiley & Sons.


Springer, K. Educational Research. (2010). Wiley & Sons, Hoboken, N.J..


Option 6

If this program category has an Option 6 alternative route, then the following data are also required:

Include a narrative to describe how the alternative route program differs from the traditional route program:

(Provide a narrative here)

Option 6 Mentoring Experiences: (limit of 2000 characters)(Per KAR 9:080 Section 3)

Your response text can be all in one section; however, you must address each item.

1. Provide evidence of selection criteria and evaluation of University and District mentors.

2. Explain the process through which at least 15 annual observation hours (minimum 5 for university faculty, minimum 5 by district-based mentor, minimum 5 additional by university faculty or district-based mentor) are assigned to the mentors. If the program uses a template for the mentoring plan that is submitted to the EPSB for certification, please attach a copy of that template.

3. Explain how the hours are monitored and reported.

4. Describe how support will be offered to the candidate during in-class and out-of-class time to assist the candidate in meeting the teacher’s instructional responsibilities.

5. Describe the process established to maintain regular communication with the employing school to assist the candidate and address identified areas of improvement.

Option 6 ONLY - How does the EPP (Provider) monitor and support candidate completion through KTIP? (Per 16 KAR 9:080. University-based alternative certification program - Sections 3 and 7)

(limit to 1000 characters)
