Electronic Network Update

Project 2: Assessment of Student Learning in Programs

1. Describe the accomplishments you made in the past six months and the current status of each of your Student Learning Projects. (100-200 words)  

We have three student learning projects. One of these, General Education Assessment, is essentially dormant right now. The assessment plan for the General Education program has been approved, but the program’s structure has not. When the structure is approved, perhaps later this year, we will devote more attention to this project. Our primary student learning project is Assessment of Student Learning in Programs. Most of our effort has been devoted to this project, but we have also given substantial attention to our project on the University Assessment System. These two projects are closely related. We see the University Assessment System project as less important in its own right than as a contributor to the Assessment of Student Learning in Programs project: to improve program-level assessment, we need to make certain institution-level changes. Here is a list of major accomplishments since our last update in November 2007.

  • Barbara Walvoord visited our campus for two days in January 2008, presenting workshops on program assessment, general education, and grading. Combined attendance at these workshops exceeded 200 faculty and staff. The workshops received high evaluation scores and, anecdotally, changed some attendees’ opinions about assessment. 
  • We have made substantial progress in posting the assessment plans of academic programs on the Assessment website. Many programs have revised their assessment plans or completed missing components.
  • A large percentage of academic programs used the recommended template for their annual assessment report this year. Although a few programs still use other formats, we have succeeded in getting the template established as the “default” format.
  • We developed materials to train assessment peer consultants, and we trained 20 peer consultants in January and 16 more in October. So far the Assessment Peer Consulting program has served five programs. Peer consultants meet monthly during the academic year, and they are developing case materials to be used for training purposes at these meetings.
  • Funding for assessment grants was continued for this year. The reports from last year’s grants document that each of the 14 grants had a positive impact on program assessment.
  • We reorganized the Assessment website and developed an online handbook that explains assessment expectations on campus and provides tips on how to fulfill these.
  • SCSU now subscribes to Quality Matters. In August several faculty members received training in how to use the rubric to evaluate online courses.

2. What are your next steps towards realizing the results for each of these Student Learning Projects? For example, who will do what, when? (100-200 words)

  • The College of Science and Engineering (COSE) Assessment Committee will pilot assessment of assessment this year. The committee developed a rubric, based upon our annual report template, that it will use to provide feedback and suggestions to programs in the college. Other college committees will begin providing feedback to programs within the next two years.
  • The Assessment Steering Committee will complete a position paper on rewards for assessment work this year. The purpose will be to examine the reward structure for assessment work done by faculty and staff and to clarify how and under what conditions assessment work at the program level contributes to evaluation criteria specified in their collective bargaining agreements.
  • Program review guidelines will be revised to coordinate more closely with assessment and academic planning. Although Academic Affairs is primarily responsible for moving this forward, it will require coordination among several committees.
  • SCSU is participating in Foundations of Excellence this year. In connection with this project, we will inventory existing first-year assessment resources and develop new ones. Several members of the Assessment Steering Committee will be serving on Foundations of Excellence committees.
  • An academic action planning group on institutional outcomes recommended that SCSU establish institutional learning outcomes, and this recommendation has been incorporated into the institutional work plan for this fiscal year.
  • We will begin publication of an assessment newsletter this year. The Assessment Office will be responsible for implementation. Members of the Assessment Steering Committee will play a major role in identifying useful content, and most of the writing will be done by faculty and staff members.
  • The Assessment Steering Committee established a new program this year, Advancing Program Assessment through Discussion (APAD), which is designed to encourage discussions about assessment. This program subsidizes the purchase of books to be used for program-level discussions of assessment.
  • We expect to host CLA in the Classroom training on our campus this spring. The Office of Institutional Effectiveness is coordinating this effort. CLA requires that we have 20 trainees. Once we have identified the source of financial support for these trainees, we will schedule the training. After the training, those trained will be able to train others to adapt CLA performance tasks and rubrics for classroom use.

3a. Describe any "effective practice(s)" that resulted from your work on assessing student learning.
3b. What makes this an “effective practice” at your institution? Specifically, we are asking you to contextualize the “effective practice” so others will understand why it was successful. This will help others decide if and how these practices could be customized to meet their own specific needs.

  • Our annual report template is in its second year. Based upon faculty input, we made a few minor changes to the template this year, and we changed the reporting deadline from late spring to mid-September. These changes likely contributed to greater use of the template this year, as well as to the stronger reports we received. The template was used by more than 80% of the reporting academic programs in every college except the College of Business, which developed its own form based upon the template. The template includes a question about program-level discussions among faculty. We used an open-ended question last year but replaced it with a checklist this year. Perhaps in part due to this change, more programs reported that they had discussions about assessment during the past year. This year, almost 90% of reporting programs had discussions of program-level assessment. 
  • The committee structure for assessment of academic programs has become increasingly effective. This is especially evident in the handling of annual assessment reports from programs. Last year program reports went to college assessment directors, who submitted a summary report as well as the program reports to the University Assessment Director. The University Assessment Director included the summary report in the institutional assessment report, but the program reports were not made public. This year the college assessment directors did not forward the program reports; these are kept at the program and college level. This has alleviated fears among some faculty members that the information in the program reports could be used against them, and it probably contributed to the stronger reports we received this year. College committees also are beginning to take on additional responsibility for assessing assessment. The College of Science and Engineering will be providing feedback to programs on this year’s reports, and other colleges will do this in the near future.
  • On one of the professional development days at the beginning of spring semester, we host an Assessment Luncheon to which we invite members of assessment committees, assessment peer consultants, and administrators. This event brings together people from across the campus to talk about assessment. We assign seating at round tables and provide discussion questions pertaining to issues about which the Assessment Steering Committee would like feedback. A committee member at each table records the responses. Committee members receive a listing of responses. These have been helpful to us as we discuss issues related to the table discussions.
  • Members of the Assessment Steering Committee offer several workshops on assessment-related topics each semester. These provide a resource to departments and units that want to improve their assessment practices. Equally important, they provide an opportunity to discuss assessment across department and unit lines.
  • We trained 20 assessment peer consultants in January 2008 and trained 16 more in October, just as this update is being submitted. The 11 hours of training focus on assessment principles, institutional policies and resources, analysis of case materials, and role playing of peer consulting scenarios. Those trained in January found all parts of the training to be useful, but they cited the case materials (SWOT analysis of assessment practices in their own programs and role playing of three cases) as especially useful. We are developing additional case materials for use at monthly meetings of the peer consultants. 
  • The Assessment Peer Consulting program provides an opportunity for faculty and staff members who are interested in assessment to interact with each other during the training, in consultations, and in meetings of the peer consultants. The opportunity for interaction across academic/non-academic lines has been especially valuable. Along with other mechanisms for communicating about assessment across department and unit lines, the Assessment Peer Consulting program is helping to create a synergy that is gradually changing the culture of the institution.
  • An assessment grant competition was reinstituted in 2007-08 after a one-year hiatus. Fourteen grants were awarded in 2007-08. The proposal deadline for this year’s competition is November 10. Although the grants are small ($2,000 cap), they are enough to encourage faculty members to make room in their busy schedules for assessment projects. The reports submitted by last year’s grant recipients document that each grant resulted in improvements in program assessment and/or student learning. Most of the grant recipients are presenting their projects on faculty development days. In August, a panel including recipients of five grants was well attended and well received. Another session with grant recipients is planned for January 2009.

4. How has this project engaged stakeholders (e.g., faculty, administrators, staff, students) in assessing and improving student learning?

  • The Walvoord workshops attracted a large number of faculty members, staff, and administrators. Interest in Walvoord’s approach has stimulated discussions in some programs. This led us to initiate the Advancing Program Assessment Through Discussion (APAD) program this year, which provides copies of Walvoord’s assessment books or another book specified by the department to support discussions about program assessment.
  • SCSU went through an extensive academic planning process last year. One consequence is that institutional planning, reporting, and budgeting processes are becoming increasingly integrated. For example, future assessment reports will be incorporated into a larger departmental report that encompasses all departmental activities.
  • The Assessment Luncheon, the Assessment Peer Consulting program, and assessment workshops bring together faculty, staff, and administrators to discuss and address assessment issues.
  • We plan to host CLA in the Classroom training in the spring. Several deans have expressed interest in having faculty members in their colleges participate, and it is likely that these deans will provide financial support for this participation.
  • The assessment grant program has administrative support. The Provost made $25,000 available for assessment grants last year and $20,000 this year. Last year the Dean of Continuing Studies also made available $5,000 to support proposals focusing on online learning.

5. What has been the impact of your Academy work on the institution, faculty and staff, students, learning and teaching, the culture...etc.?

  • We are currently processing information from annual assessment reports. We do not yet have statistical descriptions of the improvements from last year, but the quality of the reports clearly has improved. Fewer programs failed to submit an annual report, more programs used the recommended template for their report, and more programs reported that they are using assessment findings for improvement.
  • Many programs revised their assessment plans or added missing components in the past year. The resource on the Assessment website that links to program assessment plans is becoming more complete.
  • The workshops offered by Barbara Walvoord last January were well received and generated discussions among faculty and staff members that are continuing into this year.
  • The assessment peer consulting program is generating interest among the faculty and staff. Last January we trained 20 assessment peer consultants, 10 of whom were on the Assessment Steering Committee. This October we trained 16 more, only two of whom were on the Assessment Steering Committee. After being trained, peer consultants met twice last spring, and they have met once so far this academic year. Several are developing case materials that will be used for role playing or discussion at future meetings. These meetings, as well as the training and the consultations with programs, provide a mechanism for communicating across department and unit lines about assessment, and this seems to be having a synergistic effect.
  • Fourteen assessment grants were funded in 2007-08. The reports filed by the grant recipients document improvements in assessment practices and/or student learning in each case.
  • Recipients of 2007-08 assessment grants were given an opportunity to present their work on a faculty development day. Most accepted this invitation. Recipients of five grants presented in a panel session during convocation week. The session was well attended and well received. A roundtable session is planned in January for the remaining grant recipients.

6. What challenges are you still facing in regards to assessing and improving student learning?

Departments and programs are realizing that assessment is not only “politically” important but also that they and their students can benefit from the process. 

Improvements still need to be made in the following areas:

  • Promoting discussions of program-level assessment among faculty and staff
  • Increasing the percentage of programs that use assessment findings to improve assessment practices and student learning
  • Providing constructive feedback to programs about their assessment efforts
  • Generating increased commitment to assessment among the faculty
  • Clarifying the contributions of assessment work to teaching effectiveness and scholarship, and ensuring that these contributions are properly rewarded in the faculty evaluation process
  • Ensuring consistent resources to support assessment, such as ongoing funding for assessment grants and compensation for teams of general education faculty engaged in time-consuming summer assessment projects

7. If you need specific help to stimulate progress on this action project, explain your need(s) and who to contact?

If you can provide any help with these problems, please contact Jim Sherohman.

  • More faculty buy-in is needed to approach our targets for the project, but we are limited in what we can do to speed up this process. Among the underlying issues are time/workload, faculty ownership of data, the lack of an institutionalized mechanism for program-level discussions about assessment, and a sporadic history of support for assessment. We feel that we have made substantial progress on this, and we expect to make additional progress this year through a position paper on rewards for assessment work. However, we still see a need for improvement.
  • Coordinating institutional, HLC-based assessment policies with the assessment requirements of specialized or professional accrediting bodies (NCATE, AACSB, ABET, etc.) remains a challenge. We have made some progress but are extremely interested in how other institutions are handling this issue. It affects two of our five colleges most strongly (Business and Education). Faculty members in those colleges have an increased workload related to accreditation-related assessment.