2019 EPP Annual Report CAEP ID: 12127 AACTE SID: 400
Institution: Black Hills State University
Unit: School of Education
Section 1. AIMS Profile After reviewing and/or updating the Educator Preparation Provider's (EPP's) profile in AIMS, check the box to indicate that the information available is accurate.
1.1 In AIMS, the following information is current and accurate... Agree Disagree
1.1.1 Contact person 1.1.2 EPP characteristics 1.1.3 Program listings
Section 2. Program Completers 2.1 How many candidates completed programs that prepared them to work in preschool through grade 12 settings during Academic Year 2017-2018?
Enter a numeric value for each textbox.
2.1.1 Number of completers in programs leading to initial teacher certification or licensure [1]: 132
2.1.2 Number of completers in advanced programs or programs leading to a degree, endorsement, or some other credential that prepares the holder to serve in P-12 schools (Do not include those completers counted above.) [2]: 48
Total number of program completers 180
[1] For a description of the scope for Initial-Licensure Programs, see Policy 3.01 in the Accreditation Policy Manual.
[2] For a description of the scope for Advanced-Level Programs, see Policy 3.02 in the Accreditation Policy Manual.
Section 3. Substantive Changes Have any of the following substantive changes occurred at your educator preparation provider or institution/organization during the 2017-2018 academic year?
3.1 Changes in the established mission or objectives of the institution/organization or the EPP
3.2 Any change in the legal status, form of control, or ownership of the EPP.
3.3 The addition of programs of study at a degree or credential level different from those that were offered when most recently accredited
3.4 The addition of courses or programs that represent a significant departure, in terms of either content or delivery, from those that were offered when most recently accredited
3.5 A contract with other providers for direct instructional services, including any teach-out agreements
Any change that means the EPP no longer satisfies accreditation standards or requirements:
3.6 Change in regional accreditation status
3.7 Change in state program approval
Course Details:
Preparation, applicable to all content areas, in effectively integrating technology into instruction and active student learning.
New Tech class.docx
Description:
This non-methods course presents major concepts and theories in the various social studies disciplines, including geography, anthropology, sociology, world and U.S. history, government, citizenship, democracy, and economics. It builds an understanding of research and inquiry in social studies specific to working with K-8 students.
New SS for elem class.docx
Section 4: 1st Table Insert
Section 4: 2nd Table Insert
Section 4: 3rd Table Insert
Section 4: 4th Table Insert
Section 4: 5th Table Insert
Section 4: 6th Table Insert
Tables for section 4.docx
Substitute Teaching
When asked to substitute teach, the intern may do so for their clinical educator only. For long-term subbing, a three-way agreement among the principal, the clinical educator, and the intern is required. Some districts do not permit interns to substitute teach during their internship. Any payment received for substituting is based upon school district policy. Exceptional situations will be approved through the Office of Field Experiences.
School of Education Policies and Procedures
3.4.2 Clinical educators in P-12 schools retain full responsibility over their classrooms and delegate instructional responsibility to the student teaching intern on a temporary basis. However, student teaching interns are not to be used as substitute teachers in the building unless all the following conditions are met:
a. Student teaching interns may substitute only for their clinical educators unless school district personnel, the university supervisor, and the Director of Field Experiences have granted permission.
b. A four-way agreement exists among the school district administration, clinical educator, university supervisor, and student teaching intern.
c. Substitute payment is based upon district policy.
These are the two policies found; the number of parties required in the agreement (three or four) must be reconciled.
Thursday November 1, 2018
President’s Conference Room in the Student Union
Attendance: (19 Total) Julie Mathiesen, Brian Harting, Pete Wilson, Jay Beagle, Kristen Strissel, Kirk Easton, Joe Hauge, Tonya Mullaney, Julie Hatling, Sharman Adams, Micheline Nelson, Amy Fuqua,
Betsy Silva, Brian Eberhard, Jarrett Moore, Louise Yoho, Janet Briggs, Denice Turner and Rod Custer
1) Welcome by Dean Adams and Chair Custer
2) Introductions of Council
3) CAEP update from Betsy Silva: the final decision will be made October 22, 2018; it will then take 30 days to finalize and send out.
4) Faculty update: four new education hires for Fall 2018: Elizabeth Christian, Melissa Schwiesow, Sydney Haugland, and Janet Briggs, plus Dave Zanton, the Little Jackets Child Care Coordinator. Rod also introduced the three new education faculty hired in Fall 2017: Louise Yoho, Carin Neitzel, and Jarrett Moore.
5) Louise Yoho spoke briefly about the new master's degree program, which is designed for people without an undergraduate degree in education. It is to roll out in Fall 2019.
6) Denice Turner talked about the 3 strands of value: how we care about ourselves, others, and the world. Input was given by the council.
a) We need to teach our student teachers to have more:
· Confidence
· Resilience
· Leadership
· Capacity to be successful
· Ability to learn from failure
· Ability to differentiate instruction
· Data collection skills
· What can we give our teachers to help them excel in the classroom?
· Ask teachers what they need for their classroom
· Bring schools onto the BHSU campus for mini conferences
· Help teachers understand how to write grants
8) Input from the council about technology in schools:
· What can teachers in the field offer our students?
· Bring teachers from the field to the Methods Block
· Teachers are inexperienced with online reports
· Collaborate with Methods Block students by bringing teachers from the field to a methods block class
· Coding, and the number of jobs filled vs. the number of jobs being created in computer science
· Create workshops for pre-service teachers at no cost
9) Micheline gave update on the EDC/SDACTE and DOE
· EDC
· Which programs are sustainable?
· Open Resources – Preview open resources for general education courses (Chemistry, Biology, Physics, History, Political Science) OER
· SDACTE
· Discussion of who teaches the SD Code of Ethics for Teachers, and when
· Discussion of whether student teachers may take long-term positions
· DOE: Kathy Riedy
· SDAR Article 24:53 Preparation Program Approval
· Programs now overlap K-8 and 5-12: institutions need to make sure the middle school years are addressed in the programs
· 24:53:07:03 Certification programs: should we keep industrial technology and agriculture as teaching programs, or just CTE?
· Need to add Mass Communication/Journalism
· Suicide awareness before licensure or license renewal
· Title II reporting: need to include the collaborative online certification candidates (a certification, not a degree)
· Looked at the new standards for Elementary preparation: no standards to prepare elementary PE, Music, Health, Art, etc.
· The English as New Language Endorsement codes:
· 01508: Elementary English Language Learner
· 01608: Grades 5-8 English Language Learning
· 01608: English Language Learning (Secondary)
Meeting adjourned at 1:30 PM
Meeting Notes from Advisory Council Meeting 11-05-18 - cert now 5-12 mini conference.docx
Meeting Notes
Friday March 23, 2018
President’s Conf. Room/Student Union
Attendance: (12 total) Julie Mathiesen, Ryan Young, Brian Harting, Pete Wilson, Jay Beagle, Kristen Strissel, Kirk Easton, Sharman Adams, Betsy Silva, Micheline Nelson, Amy Fuqua, and Rod Custer
1) Welcome by Dean Adams and Chair Custer
2) We received the CAEP report back and are responding to AFIs. The CAEP visit is April 15-17. Feedback from the Advisory Council is welcome.
3) Faculty Update: three new Education faculty were hired for Fall 2017: Louise Yoho (SPED), Jarrett Moore (Secondary ED), and Carin Neitzel (ED PSYC). All three were unfamiliar with snow. Our faculty retiring in June are Mary Cooper, Sue McGrath, Cynthia Chandler, and Kathleen Matthew. New faculty hired so far for Fall 2018: S. Haugland and M. Schwiesow. We are interviewing for Early Childhood, Science Education, and the Child Care Coordinator positions.
4) Rod Custer will email the “Call for Comment” link to the Advisory Council. We need the schools to go to the link and enter their responses.
5) Betsy Silva gave a CAEP update. All 5 CAEP Standards were met. We received formative feedback on Feb 9, 2018. Betsy talked about using the PPAT, which she said is more authentic and cutting edge. We are addressing the 504 disconnect. Gen Eds went from 45 to 31. The council discussed how to determine the content of the new technology course, and talked about the importance of having the mindset to go out and find new technology and then implement it effectively so it has an impact on what students learn.
6) Brian Eberhard talked about graduate-level programming. Brian is the MSCI coordinator. He said the professional-growth portfolio is the most popular option for the Master's Program; the least popular are Publication and Thesis. He has 7 programs and 71 current students. The Specialist Generalist is a challenge and doesn't meet the needs of students in this area. SD Counts is running out of money. Brian did a research project on what works better, traditional teaching or progressive teaching; the result showed traditional was better. He did this research to improve how effectively we place student teachers out in the field.
7) Ryan Amys, Project Second, works with students who are looking to change careers into the teaching field. He works with vets and recent grads. He is working with students in 11 states and sees growth with virtually no marketing; the growth is due to simple word of mouth. He is piloting new software called "Go React": students shoot video of themselves teaching, and Ryan can jump into the video right where he needs to and offer feedback. He feels "Go React" could be used in practicum.
8) Micheline Nelson talked about the PPC (Professional Progress Committee). She meets with this group on big issues or appeals. From 2014-2017 we had 500 candidates; 13% (42 students) were dropped or counseled out. The key is to document everything. Right now Micheline is monitoring 4 students. Situations are transparent. Micheline and the council discussed background checks and the requirements of local schools.
9) Rod Custer provided data on Distance Education. We have 175 courses, and 83 are online, almost 50%. There are no dual-credit courses in education. Field and Block courses are all face to face. We have to offer online courses to keep up with other universities. The online courses are 1 to 3 credit hours, but most are 3 credit hours. The minimum enrollment to run an online course is set at 10 but may increase to 12; the graduate-level minimum enrollment is 7. We are keeping an eye on the minimum enrollment requirement, as it will affect the number of courses being offered. We are allowed 4% under-enrolled courses campus-wide. We use IDEA surveys to measure the quality of our courses.
Notes respectfully submitted by Jean Osborn 3/26/18
AC Meeting Notes Spring 2018 - Tech class content.docx
Section 4: 1st Table Insert
Section 4: 2nd Table Insert
Section 4: 3rd Table Insert
Section 4: 4th Table Insert
Section 4: 5th Table Insert
Section 4: 6th Table Insert
Tables for section 4.docx
Section 4. Display of Annual Reporting Measures. Annual Reporting Measures (CAEP Component 5.4 | A.5.4)
Impact Measures (CAEP Standard 4):
1. Impact on P-12 learning and development (Component 4.1)
2. Indicators of teaching effectiveness (Component 4.2)
3. Satisfaction of employers and employment milestones (Component 4.3 | A.4.1)
4. Satisfaction of completers (Component 4.4 | A.4.2)
Outcome Measures:
5. Graduation Rates (initial & advanced levels)
6. Ability of completers to meet licensing (certification) and any additional state requirements; Title II (initial & advanced levels)
7. Ability of completers to be hired in education positions for which they have prepared (initial & advanced levels)
8. Student loan default rates and other consumer information (initial & advanced levels)
4.1 Provide a link or links that demonstrate data relevant to each of the Annual Reporting Measures are public-friendly and prominently displayed on the educator preparation provider's website.
Link 1: http://www.bhsu.edu/Academics/The-Colleges/College-of-Education-and-Behavioral-Sciences/School-of-Education/Accreditation
Description of data accessible via link: updating to meet new expectations for CAEP; expected posting June 2019
Tag the Annual Reporting Measure(s) represented in the link above to the appropriate preparation level(s) (initial and/or advanced, as offered by the EPP) and corresponding measure number.
Level \ Annual Reporting Measure: 1. 2. 3. 4. 5. 6. 7. 8.
Initial-Licensure Programs
Advanced-Level Programs
4.2 Summarize data and trends from the data linked above, reflecting on the prompts below.
What has the provider learned from reviewing its Annual Reporting Measures over the past three years?
Discuss any emerging, long-term, expected, or unexpected trends. Discuss any programmatic/provider-wide changes being planned as a result of these data. Are benchmarks available for comparison? Are measures widely shared? How? With whom?
The EPP at Black Hills State University has developed strategies to gather multiple measures of Completer Impact on P-12 Learning and Development and Teacher Effectiveness. Due to South Dakota Codified Law 13-42-70, a law ensuring privacy protection for P-12 students and teachers in SD, the State DOE limits data release to the EPP. The law prohibits sharing school and district evaluation data of P-12 students or teachers that is disaggregated further than grade level. Therefore, it was necessary for the EPP to create a unique process for direct and indirect data collection to assess program and completer impact on P-12 learning and development.

Direct measures provide data from teacher performance and value-added assessments. These measures include Student Learning Outcome (SLO) achievement, employer evaluations, Teacher Effectiveness (TE) Ratings, action research, and South Dakota Department of Education Student Teacher Accountability and Reporting System (STARS, aggregate data) measures of proficiency and growth for P-12 learning and development. Indirect measures include employer, alumni, and graduate surveys and advisory committee recommendations, as well as partner input. The SLO and TE-Rating assessments are research-based evaluations developed by the South Dakota Commission on Teaching and Learning (CTL). The CTL's work also led to the South Dakota DOE policy of aligning educational expectations and evaluations with the Charlotte Danielson Framework (SD DOE 24:57:0201), providing another layer of construct validity. Training for the CTL-created SLO and TE processes is provided through the SD DOE.

Privacy law parameters for collection of SLO and TE-Rating data were discussed with superintendents and SD Board of Regents legal counsel. The resultant process required personal contact between P-12 school faculty and EPP faculty for a direct transfer of data. The EPP purposefully selected two local districts with a history of strong partnership and a prevalence of EPP-prepared in-service teachers. A unique characteristic of the chosen schools is that several grades and subjects have at least 50% EPP-prepared teachers; thus these teachers strongly influence SLO and TE-Rating data as an actual measure of completer impact on P-12 student-learning growth. The EPP recognizes the limitations and possible biases associated with volunteer data.
To maintain consistency in measuring the program impact and teacher effectiveness of EPP-prepared in-service teachers on P-12 learning and development, the EPP uses STARS data for the same two districts as the SLO and TE-Rating. STARS provides a variety of academic and demographic data through a centralized system that disaggregates student data to grade level; only aggregate proficiency data are reported due to privacy laws. STARS proficiency and growth data are reported for Math, English Language Arts (ELA), and (as of 2017-18) Science. Combining proficiency with growth data annually and across time provides a more complete picture of impact when integrated with benchmarks of State averages and ranges. Together, measures of performance and growth provide information on both impact on learners and teacher effectiveness.

In addition to the measurement data above, measures of student growth were collected and analyzed using action research. This research was conducted by alumni enrolled in the Master of Science in Curriculum and Instruction degree program, in their respective classrooms. All research required Institutional Review Board (IRB) approval to ensure rigor. Only data from EPP-prepared teachers' research are included in this analysis. Action research outcomes demonstrated improvement or neutral outcomes in P-12 learning across subject areas and grade levels.

The data below show the performance and growth rates for the 2017-2018 academic year (2017-18 is the first year of data for science, so it has no growth rate). Performance data are at or above the State average with the exception of District 1 science. Growth rates for each district and subject are also at or above State averages. The strong math growth rate for District 2 is certainly notable. Coupling the growth data with performance is important for the interpretation. Doc attached, Section 4: 1st Table Insert
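The benchmark comparison described above reduces to checking each district's proficiency and growth rates against the corresponding State averages. A minimal sketch of that logic in Python follows; the subjects are those named in the report, but every numeric value is a hypothetical placeholder, not actual STARS data:

# Minimal sketch of the State-benchmark comparison described in this report.
# All numeric values are hypothetical placeholders, not actual STARS data.
state_avg = {
    ("Math", "proficiency"): 0.46, ("Math", "growth"): 0.50,
    ("ELA", "proficiency"): 0.53, ("ELA", "growth"): 0.50,
}
district_1 = {
    ("Math", "proficiency"): 0.49, ("Math", "growth"): 0.55,
    ("ELA", "proficiency"): 0.54, ("ELA", "growth"): 0.52,
}

# Flag each subject/measure pair as at/above or below the State average.
for (subject, measure), value in district_1.items():
    benchmark = state_avg[(subject, measure)]
    status = "at/above State average" if value >= benchmark else "below State average"
    print(f"District 1 {subject} {measure}: {value:.0%} vs. {benchmark:.0%} ({status})")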
Measures of students' college and career readiness in each district indicate a dichotomy of performance, yet positive impact on student learning by EPP-prepared teachers. State DOE benchmarks for Math and English are noted in the data table. District 1 students met the State average ELA benchmark and exceeded the State average Math benchmark. District 2 students performed below the State average benchmarks for ELA and Math. District 2 exceeded the State average benchmark for Career Readiness, while District 1 performed below it.
Doc attached, Section 4: 2nd-5th Table Inserts
Integration of STARS, SLO, and action-research data strengthens the conclusion that EPP-prepared teachers contributed to P-12 student-learning growth and were effective teachers. Findings from SLO data for the two selected partner districts supported both the aggregate STARS data and the action-research data in determining completers' positive impact on P-12 student learning. The SLO data demonstrated that EPP-prepared teachers met and exceeded expectations for student growth within their classrooms. STARS data demonstrated both proficiency and growth for students in schools with a preponderance of EPP-prepared in-service teachers. Additionally, action-research data collected by EPP-prepared in-service teachers demonstrated positive student growth. The three data sources with varying methodologies demonstrate consistency of data and support the validity of the outcome that EPP-prepared in-service teachers demonstrated a positive impact on P-12 student growth.
The supervisor TE-Rating provides another measure of impact on student learning. Although district TE-Rating reporting varies within parameters allowed by the SD DOE, at minimum districts must evaluate teacher effectiveness relative to the state-mandated Danielson Framework; additional research-based teacher-effectiveness evaluation options may be added as approved by the SD DOE. Although the TE-Rating instrument is consistent, districts are given flexibility in scoring. District 1 used a 4-point scale, 4 being the highest; District 2 used a descriptive interpretation of unsatisfactory, basic, proficient, and distinguished. All teachers included demonstrated proficient or stronger TE-Ratings in the classroom. This finding is consistent with the positive evidence of growth relative to performance and the high percentages of students meeting college- and career-ready expectations. Taken together with the growth, proficiency, and college and career ready data, EPP-prepared teachers demonstrate a positive impact on student learning.
Another critical component of measuring EPP impact is candidates' perception of their preparedness for the teaching profession. Indirect measures of perceptions of program preparation include graduate, alumni, and employer surveys and advisory council input. The EPP currently administers surveys directly aligned with the InTASC Standards, one of the EPP's foundations; the surveys therefore relate directly to the EPP curriculum. The same survey question content, with audience variations, is administered to each respondent group (completers, alumni, and principals) to enable direct comparison of responses. Principals and alumni receive the survey on offset three-year cycles, and completers receive it at the SOE graduation exit seminar.
Doc attached, Section 4: 6th Table Insert
Employer data: 2019; Alumni data: 2020
All education program completers meet or exceed licensing requirements due to EPP integration of SD DOE certification requirements within curriculum, policy, and graduation requirements. Policy and graduation requirements are inclusive of the SD Board of Regents mandates. This ensures that completers meet and exceed certification requirements and are fully qualified for employment in the content area and level for which they were prepared. Doc attached, Section 4: 5th Table Insert
The EPP and State collaborated to develop composite majors to serve the needs of small (10-80 senior class), rural, and tribal schools. The general need for teachers in small and rural school districts in South Dakota led to an SD DOE rules change that will be implemented in 2019: secondary certification will change from 7-12 to 5-12. The EPP is preparing for the curricular changes; however, this could impact in-service teachers who are assigned students in grades 5 and 6 but do not have middle-level preparation. This could certainly impact both graduate and employer satisfaction (4.3, 4.4) as well as Impact on P-12 Learning and Indicators of Teaching Effectiveness.

Also impacting graduation rates, the EPP monitors developmental progression of content, pedagogy, and dispositions to help candidates with decisions related to admission, continuation, and graduation in the education profession. Candidates unable to meet EPP requirements are counseled out, removed, or required to remediate prior to graduation as needed. This contributes positively to the strong graduation rate. Preparation in each candidate's field of interest, aligned with certification and licensure requirements, helps with career placement and, subsequently, the ability of graduates to repay loans. The BHSU 2012-2015 (most recent Federal data released) aggregate default rate was 9.3% (aggregate means all BHSU students, inclusive of all years and all programs university-wide; schools or degrees are not reported). The national rate was 10.8%, SD 12.9%, public 2-3 year institutions 16.7%, and 4-year institutions 7.1%.
https://www2.ed.gov/offices/OSFAP/defaultmanagement/cdr.html
Section 5. Areas for Improvement, Weaknesses, and/or Stipulations Summarize EPP activities and the outcomes of those activities as they relate to correcting the areas cited in the last Accreditation Action/Decision Report.
CAEP: Areas for Improvement (ITP) 5 Provider Quality Assurance and Continuous Improvement
The EPP did not provide sufficient information regarding validity and reliability. (component 5.2)
The EPP received accreditation with no stipulations, yet one area for improvement was noted: the EPP-created Professional Disposition Assessment (PDA) and satisfaction survey instrument lacked sufficient evidence of validity. Each tool was built on the Charlotte Danielson Framework for Teaching and the InTASC Standards, yet the newly reconstructed tools had not yet been determined to have research-based content validity and reliability. Although the final decision was not received until November 15, 2018, the EPP began working toward remediation of the AFI based on the FFR.
Two EPP faculty developed a research proposal to determine validity and reliability for the PDA; it has been submitted to the human subjects committee for approval. The faculty completed the required CITI training pertinent to this type of study and are awaiting approval. Determining PDA validity and reliability will occur across the summer and the 2019-2020 academic year. The survey instrument will progress through the same process beginning in Fall 2020, per the EPP Selected Improvement Plan.
Section 6. Continuous Improvement CAEP Standard 5
The provider maintains a quality assurance system comprised of valid data from multiple measures, including evidence of candidates' and completers' positive impact on P-12 student learning and development. The provider supports continuous improvement that is sustained and evidence-based, and that evaluates the effectiveness of its completers. The provider uses the results of inquiry and data collection to establish priorities, enhance program elements and capacity, and test innovations to improve completers' impact on P-12 student learning and development.
CAEP Standard 5, Component 5.3 The provider regularly and systematically assesses performance against its goals and relevant standards, tracks results over time, tests innovations and the effects of selection criteria on subsequent progress and completion, and uses results to improve program elements and processes.
6.1 Summarize any data-driven EPP-wide or programmatic modifications, innovations, or changes planned, worked on, or completed in the last academic year. This is an opportunity to share targeted continuous improvement efforts your EPP is proud of. Focus on one to three major efforts the EPP made and the relationship among data examined, changes, and studying the results of those changes.
Describe how the EPP regularly and systematically assessed its performance against its goals or the CAEP standards. What innovations or changes did the EPP implement as a result of that review? How are progress and results tracked? How will the EPP know the degree to which changes are improvements?
The following questions were created from the March 2016 handbook for initial-level programs (sufficiency criteria for Standard 5, component 5.3) and may be helpful in cataloguing continuous improvement.
What quality assurance system data did the provider review?
What patterns across preparation programs (both strengths and weaknesses) did the provider identify?
How did the provider use data/evidence for continuous improvement?
How did the provider test innovations?
What specific examples show that changes and program modifications can be linked back to evidence/data?
How did the provider document explicit investigation of selection criteria used for Standard 3 in relation to candidate progress and completion?
How did the provider document that data-driven changes are ongoing and based on systematic assessment of performance, and/or that innovations result in overall positive trends of improvement for EPPs, their candidates, and P-12 students?
The following prompt is derived from the September 2017 handbook for advanced-level programs: How was stakeholders' feedback and input sought and incorporated into the evaluation, research, and decision-making activities?
QUALITY ASSURANCE: ASSESSMENT INSTRUMENTS, STANDARDS, AND FRAMEWORK
This EPP's approach to quality assurance is continually evolving to reflect research-based best educational practice as well as EPP goals and accreditation expectations. Proprietary and EPP-created measures combine to create verifiable, relevant, and actionable analyses of candidate, program (licensure discipline), and EPP outcomes. The proprietary assessments used by the EPP are 1) the ETS Praxis discipline content exams, 2) the Principles of Learning and Teaching (PLT) exam, and 3) the Praxis Performance Assessment of Teachers (PPAT). ETS provides national benchmarks for each assessment, which the EPP uses for candidate performance comparison (discussed later in the report under annual reviews). EPP-created measures include the Professional Dispositions Assessment (PDA), Formative and Final Clinical Evaluations (FA), and surveys of employers, alumni, and completers. Minimum benchmarks and expectations determined by the EPP with partners add accountability. Non-academic performance information is included within the PPAT and PLT and supplemented by the PDA and surveys. The FA report summarizes the formative assessment rubric content as a synopsis of candidate growth through the clinical experience.
PROGRAM REVIEW PROCESS AND OUTCOMES
Parallel to the EPP report, program and candidate data reviews occur annually. At the program level, the School of Education Assessment Committee reviews, analyzes, and interprets data, then shares the reports with content and EPP faculty for discussion, recommendations, and action. All proprietary and EPP-created assessment data for transition point (TP) 3 and exit are included in the annual disaggregated program assessment reports prepared by the SOE Assessment Committee. Candidates complete a PDA, a clinical educator and university supervisor performance survey, and a program survey at TP 3, program exit. All reports complement the proprietary assessments and stipulated course and GPA requirements. Clinical educators and university supervisors complete a PDA as well as formative and final appraisals during TP 3. The number of formative and final appraisals depends upon the type of placement and intern performance. For the purposes of research and triangulation, the PDA items are identical for each member of the internship team, and the survey completed by graduates is the same as that submitted to alumni and employers.
Data and benchmark comparisons for proprietary and EPP-created measures are the foundation of the annual program reports. The reporting process includes an annual analysis of each program and a grand analysis covering:
· Two measures of content: 1) the proprietary certification exam and 2) a second measure determined per program need
· The pedagogy-based proprietary exam
· Ability to plan instruction
· Effect on student learning
· Professional dispositions
· Clinical internship
Each program report provides evidence and analysis to demonstrate CAEP SPA or State expectations. The SOE Assessment Committee reviews trends across time and programs, interprets benchmark comparisons, and implements changes deemed necessary from the data analysis. The assessment committee members initially collaborate with program coordinators across colleges for discussion and interpretation of the data. Next, discussion expands to incorporate data from alumni, employer, and graduate surveys, and input gleaned from partners (e.g., superintendents, principals, diverse population representatives, community content specialists, teacher fair representatives). The EPP discusses the recommendations in detail, then determines actions by EPP vote.
QUALITY ASSURANCE: TESTING INNOVATIONS AND IMPLEMENTING EVIDENCE-DRIVEN CHANGES
Actions noted across time that led to curricular changes over the past 2 academic years include an additional capstone technology course, an assessment course, and a behavior management course. Approvals at the campus level and the Board of Regents system level were achieved, and the courses are being added according to BOR policy; this means that teacher candidates who entered the university under the 2018 Academic Catalog will be required to complete the new courses. It will be 3-5 years before strong data are obtained regarding the implementation of these courses. Additionally, faculty recognized a need for elementary education majors to have more content in social studies, which resulted in the addition of ELED 361, Social Science for Elementary Teachers. Outcomes of this course will begin to be seen within the next academic year of Praxis content results.
Results for social studies content specific to ELED will be notable in the social studies subcategory of the Praxis content exam. This will require specifically tracking candidates who did and did not take the new course until all candidates are required to take it. The impact of the technology, behavior management, and assessment courses will be monitored via specific items on the PPAT, the Clinical Internship Final Appraisal, the PDA, and surveys of employers and completers. Direct alignment of scores on proprietary tests with ratings from the surveys and PDA will determine the impact (positive, neutral, or negative) of each course.
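As a minimal sketch of the cohort tracking described above, the analysis amounts to splitting candidates by whether they took ELED 361 and comparing mean Praxis social studies subcategory scores. The records, field names, and scores below are hypothetical placeholders, not EPP data:

from statistics import mean

# Hypothetical candidate records: whether each candidate took ELED 361
# and their Praxis social studies subcategory score (placeholder values).
candidates = [
    {"took_eled361": True, "ss_subscore": 171},
    {"took_eled361": True, "ss_subscore": 168},
    {"took_eled361": False, "ss_subscore": 160},
    {"took_eled361": False, "ss_subscore": 164},
]

# Split the cohort by course participation and compare group means.
took = [c["ss_subscore"] for c in candidates if c["took_eled361"]]
not_took = [c["ss_subscore"] for c in candidates if not c["took_eled361"]]

print(f"Took ELED 361: n={len(took)}, mean subscore={mean(took):.1f}")
print(f"Did not take ELED 361: n={len(not_took)}, mean subscore={mean(not_took):.1f}")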
The EPP is transitioning to reporting and monitoring the eight outcome and impact measures; of specific interest here are Sections 4.1 and 4.2 of this report, which detail the data collection and analysis of EPP completers' impact on P-12 learning and development and teaching effectiveness.
CANDIDATE REVIEW PROCESS AND OUTCOMES
As with the EPP and program levels, candidate-level analysis occurs annually. Assessment-driven actions at the candidate level occur at each of three transition points (TPs) or as a result of documented concerns.
Evidence for TP decisions is based on formative and summative feedback data provided by clinical educators, EPP and university faculty, supervisors, and proprietary assessment outcomes. The EPP has an established formal process to review data, inform candidates of decisions, and adjudicate disagreements.
Successful candidates must meet benchmarks for performance, knowledge, and dispositional assessments at each of three transition points (TPs). Candidates passing through TP1 have achieved EPP-set benchmarks, while TPs 2 and 3 require meeting national proprietary assessment benchmarks as well as the EPP's. The nationally determined benchmark at TP2 is the SDDOE-mandated Praxis content exam, which is required prior to a clinical internship as an affirmation of content competence for placement sites. The TP3 national benchmarks required for exit are the SDDOE-mandated Praxis PLT and PPAT. The EPP recently collaborated with the SDDOE and BOR universities to set a state cut-score for the PPAT. Our EPP piloted the PPAT in 2015, and it is now fully implemented.
QUALITY ASSURANCE PARTNER COLLABORATIONS AND CO-CONSTRUCTION
Recommendation/EPP action:
a) Recommendation: establish policies allowing clinical interns to substitute teach during their internship. EPP action: discussion and refinement of the School of Education Policies and Procedures 3.4.2 a-c and the Student Teaching Handbook, page 15: Substitute Teaching.
b) Recommendation: completers should have experience with the SLO and TE-Rating processes. EPP action: methods faculty are discussing and determining the appropriate depth and placement in the curriculum; the SDDOE is reviewing the viability of these two practices, which will impact the EPP's inclusion of training.
c) Recommendation: more experience with locally/regionally used data platforms. EPP action: a newly developed course, EDFN 375, Methods of Technology Integration, provides a capstone technology experience for candidates that will include numerous local and regional data platforms.
f) Recommendation: the possibility of conference-type sharing on the BHSU campus. EPP action: needs and feasibility investigation.
Tag the standard(s) or component(s) to which the data or changes apply.
1.3 Application of content and pedagogical knowledge
1.5 Model and apply technology standards
5.2 Quality assurance system relies on measures yielding reliable, valid, and actionable data
5.4 Measures of completer impact are analyzed, shared, and used in decision-making
Upload data results or documentation of data-driven changes.
New_Tech_class.docx
New_SS_for_elem_class.docx
Tables_for_section_4.docx
sub_rules.docx
Meeting_Notes_from_Advisory_Council_Meeting_110518__cert_now_512_mini_conference.docx
AC_Meeting_Notes_Spring_2018__Tech_class_content.docx
Tables_for_section_4(1).docx
6.2 Would the provider be willing to share highlights, new initiatives, assessments, research, scholarship, or service activities during a CAEP Conference or in other CAEP Communications?
Yes No
6.3 Optional Comments
The web design intent is to create a table of the eight reporting measures with links to detailed data.
Section 8: Preparer's Authorization. By checking the box below, I indicate that I am authorized by the EPP to complete the 2019 EPP Annual Report.
I am authorized to complete this report.
Report Preparer's Information
Name: Betsy Silva
Position: Faculty/Assessment/CAEP Coordinator
Phone: 605-645-6673
E-mail: [email protected]
I understand that all the information that is provided to CAEP from EPPs seeking initial accreditation, continuing accreditation or having completed the accreditation process is considered the property of CAEP and may be used for training, research and data review. CAEP reserves the right to compile and issue data derived from accreditation documents.
CAEP Accreditation Policy
Policy 6.01 Annual Report
An EPP must submit an Annual Report to maintain accreditation or accreditation-eligibility. The report is opened for data entry each year in January. EPPs are given 90 days from the date of system availability to complete the report.
CAEP is required to collect and apply the data from the Annual Report to:
1. Monitor whether the EPP continues to meet the CAEP Standards between site visits.
2. Review and analyze stipulations and any AFIs submitted with evidence that they were addressed.
3. Monitor reports of substantive changes.
4. Collect headcount completer data, including for distance learning programs.
5. Monitor how the EPP publicly reports candidate performance data and other consumer information on its website.
CAEP accreditation staff conduct annual analysis of AFIs and/or stipulations and the decisions of the Accreditation Council to assess consistency.
Failure to submit an Annual Report will result in referral to the Accreditation Council for review. Adverse action may result.
Policy 8.05 Misleading or Incorrect Statements
The EPP is responsible for the adequacy and accuracy of all information submitted by the EPP for accreditation purposes, including program reviews, self-study reports, formative feedback reports and addendums and site visit report responses, and information made available to prospective candidates and the public. In particular, information displayed by the EPP pertaining to its accreditation and Title II decision, term, consumer information, or candidate performance (e.g., standardized test results, job placement rates, and licensing examination rates) must be accurate and current.
When CAEP becomes aware that an accredited EPP has misrepresented any action taken by CAEP with respect to the EPP and/or its accreditation, or uses accreditation reports or materials in a false or misleading manner, the EPP will be contacted and directed to issue a corrective communication. Failure to correct misleading or inaccurate statements can lead to adverse action.
Acknowledge