Outcomes/measures tracked and tools utilized
At the institutional level, KPIs are used to measure students’ success and act as a basis for accountability. Performance on these measures is compared to national peer institutions, and based on the outcomes, different areas of the College develop a set of improvement strategies to help achieve identified targets.
Additionally, Outcomes Assessment at the College is a catalyst for improving teaching and learning. The faculty-driven assessment process has a central focus on improving learning and success. Program Review allows program faculty and staff to lead a purposeful and continuous cycle of improvement. The summative evaluation of vitality for each academic program is made by the appropriate division dean and acts as the basis for prioritization of academic initiatives and resource allocations.
Summary results of assessments
Key Performance Indicators: Figures 1.8, 1.9, and 1.10 measure graduation and transfer success for first-time, full-time and part-time students, along with persistence, compared to peer institutions as indicators of student success.
Student Learning Outcomes: In the first two years of data collection in the general education curriculum, more than 19,000 students were assessed in multiple courses and disciplines. This assessment process is reported on more fully in 1P1, including aggregated data for the first two years of the general education assessment as well as data for non-general education courses and career and technical programs.
Program Vitality: At JCCC, Academic Program Review, Planning and Development allows program faculty and staff to lead a purposeful and continuous cycle of improvement through two related processes: Comprehensive Academic Program Review and Annual Planning and Development. During Fall 2015, the focus of program review was extended to include purposeful, annual program action planning and development, and incorporated a summary recommendation on program vitality from the division deans. Figure 1.12 below shows a summary of the deans' vitality recommendations for the academic year 2015–2016.
Comparison of results with internal targets and external benchmarks
Key Performance Indicators: The College's performance on KPIs is compared against peer institutions. Based on comparative performance, different areas of the College community develop improvement strategies to help achieve targets. The institutional benchmark has been established at the 75th percentile of community college performance nationwide. As noted in the figures above, the College has met the benchmark for Fall-to-Spring persistence but needs further improvement in Fall-to-Fall persistence. In the other categories, the College has approached or exceeded the external benchmarks related to graduation rates.
Student Learning Outcomes: The College has established institutional benchmarks for direct and indirect assessments of student learning outcomes. The initial targets for the College are based on performance data from like institutions. As shown in Figure 1.14, the percentage of students performing at the mastery level (40%) is well above the initial benchmark for mastery (at least 10–15% of students), while the percentage of students progressing on SLOs (36%) is well below the benchmark for progressing (at least 65–70% of students). Student performance assessed as low/no mastery (24%) is slightly higher than the benchmark (less than 20% of students).
Program Vitality: Internal benchmarks have not been established for the program vitality indicators of demand, quality, and resource utilization assessed during the Academic Program Review, Planning and Development process. In this process, programs are provided with an institutional summary of program data elements and asked to provide a self-assessment of vitality. This information allows programs to consider programmatic data as it relates to the College data overall and can be based on a transfer or career program emphasis. Deans meet with programs to discuss the summary vitality recommendation and its impact on budget recommendations.
While there are external benchmarks for institutional level comparison, external benchmarks for programs have not been formally established. Programs are prompted to consider external data, such as information available from the Kansas Board of Regents, U.S. Department of Labor, accreditation agencies, and professional/discipline organizations for comparison. Programs are able to consider how their outcomes potentially contribute to and/or impact institutional outcomes. Moving forward, the College is working toward incorporating external data from the National Higher Education Benchmarking Institute to provide additional comparison data.
Interpretation of results and insights gained
Key Performance Indicators: The results for the College's KPIs show the following:
- Full-time student graduation and transfer rates for first-time, degree-seeking students have improved and are slightly above the target of the 75th percentile of community college performance nationwide.
- Part-time student graduation and transfer rates for first-time, degree-seeking students showed improvements until last year, when they dipped below the 75th percentile of community college performance nationwide.
- Fall-to-Fall persistence rates have remained consistent over the last five years, varying between 45% and 47%.
- The GPA of students transferring to the University of Kansas has improved slightly to 3.00, meeting the College's target exactly.
- All five student satisfaction indicators showed significant progress between 2014 and 2015. The College is now performing at or significantly above the national median on all five indicators.
Student Learning Outcomes: General education and career and technical education programs are at different stages of maturity in using assessment data. Through annual reporting within Program Review, as well as progress reports provided to the Office of Outcomes Assessment, evidence of student learning and curriculum changes are tracked and reported to improve student learning and capture emerging themes in the data. The general education assessment work will complete a three-year cycle of data collection in the 2016–2017 academic year. Assessment instruments and the usefulness of the data gathered have matured during the cycle and have generated positive results. Many of the non-general education departments are also maturing in their use of assessment instruments and implementation of the results. Incorporating the reporting of assessment activities within the Program Review cycle has been the greatest means of boosting assessment work within the College.
Program Vitality: Program vitality will continue to be refined after the first dean’s review in Summer 2016 and initial discussions with and feedback from faculty in August and September 2016. The concept of program vitality was newly introduced as part of the Program Review implementation in AY 2015–2016. Over the previous few years, increased efforts had been made to encourage faculty to look at their program data, and the new review process established the expectation that data would drive decisions on prioritization of initiatives and resources. With the exception of a few academic programs, the academic programs at the College assess the demand, quality, and resource utilization as acceptable or exceptional. In most cases, the data supported the conclusions. As a result of the review, the rationale for budget resources was stronger for the proposed AY 2016–2017 budget than the proposed AY 2015–2016 budget.
Discussions with advisory board members, businesses, and industries have intensified. Feedback from advisory board members indicates strong agreement with the implementation of recommendations and confirms that programs are meeting market demand. Many of the programs with a career emphasis have undergone significant curriculum changes within the last several years, including Electrical Technology, Cosmetology, Interactive Media, Computer Web, HVAC, Interior Design, and Sleep Technology.