According to the Sauk Valley Community College Assessment Plan (2010), the college’s assessment system “exists to measure the degree [to] which our instructional practices work in support of the organizing principles of the college, including the Mission, Vision, and Shared Values.” To carry out this directive, the assessment system articulates four institution-level assessment goals. These overarching goals were developed by the faculty to guide the development of the 2003 Assessment Plan and have remained the organizing principle of the system:
The 2003 assessment plan divided the transfer goal into Disciplines (any A.S. degree offered in the catalog) and Areas (the General Education Core Curriculum areas described in the catalog). In this design, faculty in each discipline established a set of outcomes and collected, analyzed, and acted upon the data gathered through classroom assessment. Areas would then, according to the plan, aggregate the data from across the appropriate disciplines to analyze and act upon in relation to specific outcomes established at the area level. A 2009 Gap Analysis showed that the system was too complex to be maintained: the small, multi-disciplinary faculty was overburdened, and instructors in single-person disciplines found little value in assessing in isolation. As a result, the 100% participation of full-time faculty in collecting discipline-level data that marked the first two years of the system had eroded. As can be seen in Figure 3i, significant holes had developed in the discipline data: of the 35 disciplines, 83% had no documentation of new data in FY09.
The same analysis found, however, that where a group of faculty was working together on a project, whether a multi-faculty discipline or an area, the data was more rigorously collected, discussed, and acted upon. In fact, in FY10, area projects had been conducted in 100% of the transfer areas. As a result, in 2010, the system was revised so that instead of assessing each discipline, the focus of assessment efforts shifted to area-level projects. This adjustment is appropriate to a community college, where transfer degrees are focused on providing the GECC and a sampling of discipline-level work.
The General Education Competencies (Ethics, Mathematics and Quantitative Reasoning, Problem Solving, Communications, Technology, and Research) were developed by the faculty for the 2003 Assessment Plan. These outcomes speak directly to the responsibility of the college, as well as to its obligation to assist students in developing the habits of mind that society values in the formation of citizens in a democracy (4B.1). The competencies are referred to in assessment documentation as a “golden thread” woven through the strands of coursework. As a result of this conception, the data for assessing the competencies is drawn from college-level classrooms across the curriculum and generally not from a specific course where direct instruction is provided. So, for example, assessment events for research are carried out in classes with research project requirements, such as literature, nursing, or biology, rather than in Composition II (ENG 103), where research writing is taught. This approach arises from the faculty’s desire to assess how students acquire and apply the competencies across educational experiences, both to confirm retention of each competency and to inform direct instruction.
Instructors are asked to choose two competencies they value and to collect and report data on them annually from appropriate college-level courses. The data is aggregated over a three-year period and discussed during the year that the competency comes up in the cycle. The faculty uses the data as a catalyst for discussions, both cross-curricular and within their areas, following the timeline that coordinates these discussions with the planning and budget cycles. By design, the results of this analysis and discussion make their way either into Operational Plans or back to the full faculty for action. Several examples demonstrate that, over time, this process has consistently provided opportunities for improvement:
Cyclical assessment of the competencies has also enabled the development of institution-wide projects. Selected and administered by the Core Team, the institution-wide assessments show promise of becoming a valuable instrument of institutional improvement. Three such projects have been undertaken:
As with the transfer degrees, the A.A.S. degree faculty created specific outcomes for each individual program in the design of the 2003 assessment system. They did not create area-level assessments because no compelling need appeared at that time for aggregating data across programs. Nursing, as a multi-faculty program with a highly developed assessment process that predates the college-wide system, was able to benefit from program-level discussions; however, the technology programs and some of the business programs, each with a single-person faculty, struggled with program assessment in the same way that single-person transfer disciplines did. As a remedy, the career degrees have added appropriate cross-curricular aggregation in order to benefit from the discussions and institution-level influence that area-level assessment has shown to be valuable. The program faculty met at the fall 2010 in-service to create a combined objective sheet that establishes outcomes based on the needs of employers. Because the self-study is still in process, the system has not yet collected data, but the end product will provide common outcomes for which data will be collected and aggregated. For example, all of the career programs share the outcome that students will exhibit professional habits and behaviors in the workplace. Based on internship experiences, clinical settings, or capstone courses, instructors or supervisors will assess dependability, social appropriateness, cooperation, and initiative. These shared concerns for employment readiness can be discussed profitably by the diverse programs and acted upon to the benefit of student learning and institutional improvement. The career programs will continue to assess various external data for their individual programs and have the option to retain program-specific skill assessments as the redesign matures.
As an open-enrollment college, Sauk regards preparing underprepared students to succeed as vital to carrying out its Mission. The assessment system outcomes for developmental students, defined as those enrolled in any developmental-level reading, composition, or math course, were established for the 2003 assessment plan by the developmental faculty. The outcomes address both academic skills—specifically, passage of exit testing that demonstrates college-level readiness—and other success skills.
Mathematics has a highly developed assessment tool for its testing, and the full-time faculty regularly discuss and act on the data collected, as attested by documents on file in the assessment folder. At the time of the self-study, however, no reading or English exit testing data had been filed in the folder, even though evidence exists that the data has been regularly collected. Developmental Taskforce minutes, area operational planning, and a recent program review all reveal that the assessment data is being effectively applied, especially to oversight and revision of placement cutoff scores and to changes in exit testing made over the two previous years. In 2009, the Core Team acted to repair this documentation gap by inviting the Director of Academic Development to become an active member of the Core Team.