The Assessment System designed in 2003 was predicated on an "organic" approach to assessing student learning, one that assumed regular change would occur. As a result, the process has continued to be iterative and participatory. As described in the Assessment Plan, each academic year involves several levels of review of the system itself:
At the area/program level, each area faculty group, moderated by an Area Facilitator, discusses the prior year's data in the fall of the following year. At that event, the participating faculty have the opportunity to make appropriate changes to program or area outcomes, rubrics, or assessment tools.
At the institutional level, the faculty discusses data and conducts projects on two general education competencies each year. In a series of discussion events, the faculty has the opportunity to recommend changes to the competencies themselves. In fall 2008, for example, a recommendation came from area-level discussions to eliminate ethical reasoning as a competency. The issue was brought to a meeting of the faculty; both sides of the argument were presented and discussed, and the resulting vote confirmed the competency. The same process led the faculty to revise the research competency's objectives in a way that clarified its goals for student outcomes.
At the system level, the Assessment Plan and the charge of the Assessment Committee call for the Faculty Core Team to conduct an evaluation of Sauk's assessment system each spring (Appendix). The Core Team, based on its evaluation of system data, creates an annual report that includes recommendations for change (if any) and a plan of action for the next academic year. The report is subsequently considered at the spring meeting of the full Assessment Committee, which consists of the Core Team, the Academic Vice President, the Dean of Institutional Research and Planning, and all of the Academic Deans. The annual evaluation ensures systematic oversight of the assessment system and has resulted in several major alterations to the system:
The 2005 and 2006 reports were directed to the Organizational Planning and Improvement Committee (OPIC). By 2006, a shift in the OPIC charge and revisions to the new Operational Plan Template led to a recommendation that submission to OPIC was no longer necessary, and subsequent reports have been acted upon by the Assessment Committee.
In 2008, the Core Team, believing that the major work of design was complete, subsumed a separate General Education subcommittee and began more direct oversight of the competencies. It indicated that it would “refocus its creative energies from design to the improvement of teaching and learning through application of the assessment data,” primarily by recommending and facilitating professional development related to the Gen Ed competencies.
In spring 2009, the Core Team called for a Gap Analysis of the system. This report was considered at a special meeting of the Assessment Committee in the fall and resulted in a major revision of the 2005 system, which is being implemented as the self-study occurs.
At the administrative level, all of the academic administrators are members of the Assessment Committee and have an important role in discussing, revising, and approving the annual reports of the Core Team. In addition, as part of the 2009 Gap Analysis, key administrators were invited to assess the system against the HLC Matrix of Implementation, the results of which are reported in Figure 3ii below. A periodic repetition of this evaluation has been added to the Core Team’s annual review checklist to strengthen the administrative role in reviewing the assessment process, particularly those aspects, like Board support, that are beyond the purview of the faculty.