Data Collection, Analysis, and Evaluation
The unit has worked diligently to establish a system of data collection that provides information valuable for evaluating candidate, program, and unit performance. However, over the past seven years, the unit discovered that it was collecting too much data and had not been sufficiently selective about the types of data collected or attentive to the impact of the dissemination schedule. Therefore, adjustments have been made to the data collection and dissemination schedule within the Unit Assessment System to better serve the internal needs of the unit.
The data collected within the Assessment System are illustrated in a document titled Key Assessments Inventory. Candidates are assessed through formative and summative assessments. Data are collected from a variety of sources, including candidates, cooperating teachers, principals, superintendents, university supervisors, faculty, counselors, and national testing services. Data collected on a semester-by-semester basis are typically reported at the unit and program levels at least twice a year. These reports include data sorted into tables and figures, along with a written interpretive narrative to guide unit members. Some data, such as Praxis I and Praxis II information, are disseminated annually. The dissemination system within the Unit Assessment System was established to ensure that members of the unit receive data on candidate and unit performance annually.
Data on our off-campus programs are also collected and disseminated through formal analysis. A research study was recently conducted on the North Branch Partnership with Anoka-Ramsey Community College and the North Branch School District. Findings suggested that candidate performance in the program equaled on-campus candidates' accomplishments in most cases and exceeded them in others. The Portal Partnership between Special Education (SCSU) and Anoka-Ramsey Community College will be assessed similarly as candidates near the end of the project.
The Unit Assessment System itself is evaluated and modified on an ongoing basis. For example, unit-wide surveys have traditionally been distributed by mail at the end of each semester; the data are then entered into statistical programs, analyzed, reported, and disseminated on the schedule specified in the Key Assessments Inventory. However, consumer feedback and low return rates have dictated changes in procedures over the past two academic years. Because of declining return rates with mail surveys, the self-report instrument is now collected at a professional development day event that is required for student teachers. This change has resulted in a return rate of close to 100 percent. Another compelling example is the decision to collect the Unit Operations Survey within randomly selected capstone courses during the last half of each semester rather than through a traditional mail survey. Cooperating Teacher Surveys continue to be mailed, as do follow-up studies with candidates and employers. Praxis I and Praxis II data are delivered to the unit by the Educational Testing Service (ETS); the data are assembled and delivered to unit representatives each November. Finally, other data are compiled in the units where they are collected (e.g., the Office of Clinical Experiences) and disseminated through the defined process.
The responsibility for data collection, analysis, and dissemination is shared between the Director of Assessment and Accreditation and the Associate Dean. Reassigned time has also been provided to a faculty member to serve as a research analyst. In addition, two to four graduate assistants are employed to manage data.
The unit incorporates a variety of information technologies to manage the data within the Assessment System, including ISRS, a data management system within the MnSCU System. Data from follow-up studies, unit operations, and performance-based assessments are maintained in an extensive and comprehensive database on a server within the College of Education. Praxis data are entered into the ISRS System and become part of each candidate's permanent record. The Office of Clinical Experiences uses a data system specifically designed to track the diversity of candidate placements as well as to monitor candidate performance. The unit experimented with an external vendor for collection of both formative and summative data on student teaching performance; however, several challenges emerged that could not be resolved. As a result, the unit continues to develop alternative internal routes to the system described above.
Finally, the Associate Dean is responsible for maintaining a system for tracking, analyzing, and reporting student complaints. This system was initiated in 2006 and reported for the first time in 2007.