School of Education - Accreditation Self-Study Report

Standard 2: Data Collection, Analysis, and Evaluation

The unit collects candidate data at the following transition points.

  • Pre-Admission – A strong feature of our programs is the collection of data on candidates upon their first contact with an education program, via the Entry Survey (Exhibit 2.4.a.4), which all students complete in their introductory course. The Entry Survey is the first of the four common metrics tools administered to candidates and helps the unit better understand entering candidates. In addition to the Entry Survey, candidates participate in an early field experience in their introductory course. Data are collected on candidates’ dispositions and early skills in this experience (Exhibits 2.4.b.4; 2.4.b.5; 2.4.b.6). Candidates are encouraged to attempt their MTLE Basic Skills tests early so that, if necessary, they can retake the exams and participate in remediation opportunities (Exhibits 2.4.a.5; 2.4.b.7; 2.4.b.8).
  • Admission – Admission to teacher education is a two-part process. First, eligible candidates are admitted to individual programs. The admission standards for each program are published in the undergraduate or graduate catalogs (Exhibit 2.4.b.1). As part of the unit’s new criteria, each program assesses written and oral communication and candidate dispositions. Many programs have received mini-grants to support collaboration with P-12 partners to determine how best to assess critical skills and dispositions (Exhibit 2.4.b.9). Once candidates are admitted to their program, they are eligible for admission to teacher education (Exhibit 2.4.b.2), which requires a cumulative grade point average of 2.75 and a minimum score of 220 (240 is passing) on the MTLE Basic Skills test. Candidates who do not attain the minimum score must receive permission to proceed from the Student Relations Coordinator after jointly completing a comprehensive success plan.
  • Progression in Program – As candidates progress through a program, they complete key assessments. Each program identifies a minimum of three key assessments that are used to monitor candidate progress in the program of study and to measure the necessary knowledge, skills, and dispositions. Key assessments are included in program review documents submitted to the Minnesota Board of Teaching (Exhibit 2.4.a.3).
  • Eligibility to Student Teach – Eligibility for student teaching requires candidates to meet all program requirements and pass the MTLE Basic Skills test. The Office of Clinical Experiences and the Student Relations Office jointly monitor and support candidates as they apply to student teach. An appeal process is in place for candidates who petition to move forward without passing Basic Skills (Exhibit 2.4.b.10).
  • Exit – At exit from an initial licensure program, the following data are collected.
    • Exit Survey (Exhibit 2.4.a.6) – This is the second of the four common metrics instruments administered. It is completed at the end of student teaching and measures candidates’ perceptions of program quality.
    • Performance-Based Assessment (Exhibit 2.4.a.7) – This instrument, completed by both the cooperating teacher and the university supervisor, is aligned with state standards. At the advanced level, practicum evaluations are collected from field supervisors.
    • edTPA (Exhibit 2.4.a.8) – Minnesota adopted edTPA as an authentic assessment of candidate performance during student teaching. The edTPA is a nationally normed and standardized performance-based instrument measuring teaching performance in five domains: Planning, Instructing, Assessing, Analyzing Teaching, and Academic Language. The edTPA rubrics have been cross-walked with Minnesota standards (Exhibit 2.4.a.9).
    • Minnesota Teacher Licensure Examinations – Candidates complete the MTLE Pedagogy and Content examinations at the culmination of their coursework. These examinations are required for Minnesota licensure (Exhibit 2.4.b.13).
  • Induction – Program completers are contacted approximately one year after program completion. The contact information gathered at program exit is shared through a partnership with our Career Services Office, which makes the initial contact with completers. Through this partnership we located 90% of our program completers in 2012-13 and 87% in 2013-14. Each year a report is completed providing a snapshot of employment for our graduates (Exhibits 2.4.b.11; 2.4.b.12).
    • At the time of the initial contact, Career Services gathers general information about the completer’s employment status. Completers are told to expect a survey in the near future and are encouraged to complete it.
    • A few weeks after the initial contact by Career Services, a link to the Transition to Teaching Survey (TTS) (Exhibit 2.4.a.14) is sent. The TTS is the third of the common metrics instruments and is nearly identical to the Exit Survey, providing insight into how perceptions of preparation change after one year of professional practice. Summary data exist for both the institution and the NExT aggregate (Exhibit 2.4.b.14). Aggregate data have been helpful in gauging program and unit performance and identifying areas for improvement.
    • Per our Institutional Review Board, candidates completing the Transition to Teaching Survey are asked to consent to their supervisor being contacted regarding the quality of their preparation. The Supervisor Survey (Exhibit 2.4.a.15) is the last of the common metrics instruments. This survey provides data regarding supervisor views of the preparation provided by SCSU (Exhibit 2.4.b.15). The unit has discussed at length the extremely low response rate on this survey and ways to improve it.
  • Other School Professionals – Candidates in school counseling and educational administration programs are evaluated by university faculty and practicum supervisors based on standards set by other accrediting bodies and professional associations (CACREP and BOSA). School counseling candidates are assessed using the School Counseling Internship Student Rating Form (Exhibit 2.4.a.16), and candidates in educational administration programs are evaluated using the Situational Panel Assessment (Exhibit 2.4.a.17).

Reliability and Validity

Over the years, several studies have been conducted on the quality of the instruments used in our assessment system.

  • Internal consistency reliabilities for INTASC-based domain scores, all above or very near .80, were collected and reported for (a) the “old” Completer instrument (Exhibit 2.4.c.1), (b) the Performance-Based Summative Assessment (Exhibit 2.4.c.2), and (c) the Cooperating Teacher instrument (Exhibit 2.4.c.3).
  • Scale validity and internal consistency data are available for versions of the Transition to Teaching Survey; the instrument proved to have defensible scales with internal consistency reliabilities in the .9 range (Exhibit 2.4.c.4). The same data are available for the Entry Survey (Exhibit 2.4.c.5), the Exit Survey (Exhibit 2.4.c.6), and the Supervisor Survey (Exhibit 2.4.c.7).
  • A study of the correlations between the edTPA and the MTLE Basic Skills – Writing test was completed in preparation for administering the edTPA, to determine whether the edTPA “operated” independently of basic writing skills (Exhibit 2.4.c.9). In addition, we examined the relationship between edTPA scores and our internal summative instrument. While we consider the Performance-Based Summative Assessment reasonably reliable and valid, the low correlations suggest that the two tools measure different aspects of performance (Exhibit 2.4.c.10).
  • An internal study examined edTPA performance both locally and nationally (Exhibit 2.4.c.11).
  • A study examined correlations between MTLE Basic Skills passing scores and various entry criteria, including candidates’ composite ACT scores (Exhibits 2.4.c.8; 2.4.c.12).
