Sauk Valley Community College
HLC Self-Study Document

September 19-21, 2011

2C: Ongoing Evaluation and Assessment

Sauk Valley Community College’s ongoing evaluation and assessment processes provide reliable evidence of institutional effectiveness that clearly informs strategies for continuous improvement.

2C.1: Effective Systems Support Continuous Improvement

The 2002 Visit Team found Sauk engaged in collecting data, but without a system that assured that analysis of that data contributed to continuous improvement. As a result of that finding, the college recognized that much of its planning was done in “silos,” creating isolated documents in isolated settings. It set about transforming itself into a learning organization where continuous improvement could be achieved and documented through interlocked planning systems. In 2006, the Focused Visit Team recognized that transformation and made recommendations for continued growth of the systems for strategic planning and assessment. In addition, a new president added his perspectives, which effected continued changes in the strategic planning system, some of which are being set into motion for the first time during the self-study process. The primary initiative undertaken has been focused on re-engineering and linking multiple internal processes into a single efficient system and on integrating daily activities with institutional planning.

Sauk is clearly maintaining effective systems for collecting, analyzing, and using organizational information. Most importantly, the components of the planning system are linked in such a way that the data from each informs annual budget decisions that in turn support the Strategic Directions. Each of the four primary planning and evaluation cycle components includes a specific point in the timeline where prior performance is revisited in setting future plans (Appendix):

  • Strategic planning: OPIC is charged with reviewing “institutional progress toward achieving Strategic Goals,” which includes reviewing each year’s Operational Plan, Annual Report, Program Review Report, the status of past budget requests coming from program review and operational planning, and the status of Key Performance Indicators (Appendix). During its fall 2010 review, OPIC discovered some issues related to budget request feedback that required improved communication.
  • Operational planning: Planning is a two-part endeavor in which units evaluate the results of their plans for the year that is ending and plan the next year’s activities. All reports and plans are submitted to the unit’s supervising administrator, who provides feedback as appropriate and approves them. The reports and plans are also submitted to the Dean of Institutional Research and Planning, who reviews them and requests revisions to assure that activities are aligned with objectives, that outcomes are measurable, and that the reported results are pertinent and appropriate. The timeline requires that academic areas submit drafts in April and final plans in September; support departments and offices submit their reports and plans in July.
  • Program review: An evaluation loop is built into the program review process. Throughout the template, the review teams are asked to report on the results of their previous efforts and identify future efforts. Supervising administrators and the Program Review Committee evaluate the reviews and request modifications or additional information as appropriate; the review is not approved until it is complete. The Dean of Institutional Research and Planning provides the final feedback loop by verifying that planned actions are transferred into Operational Plans.
  • Assessment of academic achievement: Assessment of student learning is faculty driven and based on the learning outcomes set at the institutional, area, program, or course level (3A.3). Data collected from the various streams is discussed and acted upon by the faculty. Data collected through individual coursework often does not have a budgetary need; however, discussion of area-level and institutional-level data may result in curricular proposals or budget requests, which are initiated in Operational Plans whenever analysis shows that action is appropriate. The faculty annually collects data, which is analyzed and discussed at the beginning of the following academic year. Based on that discussion, the area determines whether to repeat the assessment or to assess another outcome. By the time of its program review, each area is to have assessed all of its objectives at least once. The Sauk Valley Community College Assessment Plan establishes a schedule for assessment tasks that ensures that data flows into the operational planning system in such a way that it reaches the budget (Appendix).

Appropriate internal and external data is made available for strategic, operational, program review, and budgetary planning. Information Services creates variations of data sets as requested when a need is determined. The Board of Trustees, the President's Cabinet, and the academic areas and support departments clearly examine data. Reviews of data contribute to organizational improvement when the findings are used for goal creation in assessment, Operational Plans, or program review. Decisions or steps toward implementation for improvement are more difficult to trace from President's Cabinet and Board of Trustees meetings, as those decisions are housed in minutes and do not necessarily flow into a planning document with Strategic Goals attached.

2C.2: Program Review Supports Continuous Improvement

All of Sauk's offices and academic areas undergo evaluation annually in the process of examining data for their Operational Plans and in preparing each year’s budget requests. A more in-depth analysis is conducted every five years as a part of the ICCB-required program review for all instructional and non-instructional units. Sauk uses this requirement to provide a comprehensive analysis and evaluation of each unit.

At the 2002 HLC comprehensive evaluation, the program review process was disconnected from other efforts of the college and suffered from some of the same weaknesses as other systems: Data was unclear and inconsistent. Action plans were often weak and few were implemented. After being approved, many of the reviews were misplaced or lost.

Three nearly simultaneous circumstances resulted in an overhaul of the system:

  • The new college president declared that the completed reviews contained useless narrative based on little or questionable data and ordered changes.
  • Learning organization principles that were transforming the committee structure and the planning system revealed the weakness of the existing system.
  • The ICCB released new Program Review Guidelines in 2007.

As a result, in FY09, program review was redefined as a tool for continuous improvement. A unique approach was used to revise this process: instead of forming a single cross-institutional committee, the Dean of Institutional Research and Planning facilitated a process that used specialized small groups. For example, the business faculty and college accountants discussed and designed the structure of the financial data that review teams analyze during a program review. The templates, revised again slightly for FY11, were aligned with the new Strategic Plan (Appendix). The resulting program review process guides each review team through a data-based examination of its current status in order to make plans for future improvement. Other features enhance the evaluation’s value to the evaluated area and to college planning efforts:

  • Quality and viability issues are separated, which allows the review to address each separately.
  • A copy of the report that is submitted to the ICCB is also provided to OPIC as information for use in strategic planning.
  • Improvement plans and initiatives become a part of the appropriate Operational Plan and, as a result, are directly linked with the Strategic Plan.
  • When equipment, personnel, or facility needs are identified, the Program Review Guidelines direct teams to complete the appropriate budget request form to submit with the completed review, so it can be considered as a part of the budget allocation process.

While most program reviews have not resulted in major improvements, many minor improvements have been documented, including the following examples:

  • The FY10 Financial Assistance Office review reported that the office was using TELNET for faster student loan processing, verification, and audit reports.
  • The FY09 Accounting review reported the development of two online courses: ACC 101, and ACC 102.
  • The FY08 Mathematics review reported extensive end-of-course assessment data collection and analysis procedures, as well as the development of an Associate of Arts in Teaching for secondary mathematics.
  • The FY08 Property Maintenance Specialist review led to the program being discontinued due to low enrollment.

The program review process has also been consulted for unanticipated purposes, such as identifying facility needs that were shared with site planners.

As a part of the process since FY09, the chair of each program review team has been asked to complete an evaluation of the process. Responses indicate that the objectives for revising the program review format have been achieved. During the first three years of the new format, over 70% of team chairs reported that the program review was linked to other internal processes, 90% ranked the new format as an improvement over the old format, and 75% also rated the new program review as being valuable to the program.

The revised program review process is a positive approach to reviewing a program’s current status, making plans to achieve future improvements, and assuring that requests with budgetary impact get considered as part of the budget allocation process. It is expected that the process and templates will continue to be reviewed and revised slightly every year to keep the process an effective tool for continuous improvement.

2C.3: Evaluation of Institutional Effectiveness

Within Sauk's various planning and evaluation processes, benchmarks are set and results reported in ways appropriate to the system:

  • An important part of operational planning training is to orient employees to identify concrete, measurable results and avoid generalized, abstract results on the Operational Plans. The Dean of Institutional Research and Planning reviews submitted plans for their compliance with the creation of appropriate benchmarks. Results are recorded on the Operational Plan Form at the end of the plan year, in preparation for setting the next year’s objectives.
  • Program reviews use program-specific data to analyze need, quality, and cost effectiveness and then to make plans. The types of data used are consistent among all instructional programs to allow for cross-program comparisons. Results of the program review are posted on the Sauk webpage as well as reported to ICCB.
  • Faculty involved in assessment of student learning set the expected performance for the objectives and review these when discussing their findings. The results of CAAP allow Sauk graduates to be evaluated against a set of peer colleges and against state and national norms. Findings are stored in a digital system for access by the faculty.

The college measures its overall institutional effectiveness against ICCB benchmarks and the goals set by the Illinois Board of Higher Education (IBHE) through two required reports, submitted annually to ICCB: the Underrepresented Groups Report and the Performance Report. Although these reports have gone through several transitions during the past few years as the IBHE has revised its goals, Sauk intends to dovetail the IBHE and ICCB benchmarking process with the Strategic Directions.

In the meantime, the FY11 revision to the college’s strategic planning system incorporated a new awareness that the college should be accountable to its own Strategic Goals in addition to those imposed by the state. To that end, meaningful institutional key performance indicators (KPI) were identified by OPIC and are articulated in the Strategic Directions. Sauk’s KPIs measure the success of the college’s overall efforts:

  • Transfer rate - The number of students who begin their baccalaureate degrees at Sauk and successfully transfer to universities.
  • Employment rate - The number of students who enroll to develop occupational skills and successfully obtain employment.
  • Credit hours generated - The number of credits generated at the end of the year provides a measure of student success and determines the state’s apportionment funding.
  • Number of certificate and degree program completions - The number of completions is a measure of student success and Sauk’s efforts to make students successful.
  • Proportion of departments that operate within budget - The number of departments that do not exceed their budget is a measure of financial management.

The KPIs represent institution-level, measurable benchmarks appropriate to the broad academic and fiscal charge of the college. They focus staff on Sauk's priorities and should make it easier to determine the extent to which Strategic Goals have been achieved. Although the first cycle of the FY11 planning system will not reach the point of measuring against these benchmarks until summer 2011, the plan design calls for the administration and OPIC to examine current-year rates against the prior year and a multi-year average. From that evaluation, Sauk will generate an annual report that will become the foundation for future planning and will be published on the website (Appendix).

2C.4: Support for Planning and Assessment

Between the 2002 Reaffirmation of Accreditation Visit and the 2006 Focused Visit, Sauk made a significant investment in staff time to develop new systems for both assessment and strategic planning. In the Focused Visit report, the college expressed concern that the level of investment was not sustainable over the long term, but recognized that both systems would have to be supported adequately to maintain them. As the systems have matured, assessment and planning processes are clearly supported through multiple avenues:

  • Personnel: The budget includes compensation for several leadership positions in assessment and planning:
    • The Dean of Institutional Research and Planning and an administrative assistant oversee and maintain the various planning processes, including the committee structure, as job duties.
    • Area Facilitators are faculty leaders in operational planning, program review, and assessment who receive compensation for that leadership.
    • The Assessment Core Team, the faculty committee that oversees the faculty assessment of academic achievement, requests compensation for major tasks the group deems necessary, including data management and plan revision. Once approved, these requests are directed into the budget process and have received appropriate levels of funding.
  • Time: Sauk employees are provided the time to contribute to the planning and assessment processes. Recent examples include strategic planning focus groups in spring 2010, departmental and academic area meetings, participation on OPIC and other cross-institutional committees, and in-service activities related to planning. In addition, the college has negotiated a bi-monthly faculty meeting hour that allows time for essential full-time faculty discussions related to the assessment and planning processes.
  • Data collection: While some data results have been consistent enough to warrant a move away from the burdensome expense of annual testing to sustainable levels of periodic testing, Sauk budgets regularly for the cost of several data collection tools:
    • CAAP: After providing the ACT CAAP test annually, the faculty determined that a sample every three years was adequate to provide external confirmation of several competencies. The college funds the cost of the testing and incentives for participants. For example, in spring 2010, each student participating in the test received a $5 meal card at the cafeteria, and students who scored above national norms had their names placed in a drawing for two prizes of $100.
    • Noel Levitz Student Satisfaction Inventory (SSI): The SSI, which surveys students’ opinions on topics such as class scheduling, the registration process, resources, staff, and safety, is scheduled to be conducted every four years. However, due to budget issues and staff changes, Sauk had a 6-year gap between the two most recent surveys. The results of the 2010 survey were analyzed by the Dean of Student Services. Activities planned in response have been noted in the FY11 Operational Plan.
    • Assessment data storage: Sauk maintains in-house storage of its assessment data as a better-fitting alternative to a purchased system. The system is developed and maintained by compensated faculty on the Assessment Core Team, supported by Information Systems staff.
    • Professional development: Employees are not specifically encouraged to participate in external professional development related to planning, although in-service agendas provide a record of in-house sessions on strategic planning. For example, the spring 2010 in-service included a presentation by OPIC members to the college community on the new Strategic Directions and the alignment of the planning systems. The annual reports from the Assessment Committee show that regular opportunities for assessment-related professional development are provided, both through conference attendance (for example, the Illinois Community College Assessment Fair) and through in-house events, especially those related to the general education competencies.

Despite budgetary challenges, Sauk continues to provide adequate monetary support for strategic planning and for assessment of student learning. The mechanisms currently in place will remain vital to the college's ability to maintain its structured planning and assessment processes and continue the progress that has occurred over the last five years.