Case Study: Transforming the Outcomes Process to Make Meaningful Change

In fall 2013, all academic programs at Flagler College, a private four-year liberal arts institution located in St. Augustine, Florida, began the process of standardizing course-level outcomes assessment. Following extensive work to redefine program mission statements and clarify student learning outcomes for each course, the logical next step was to assess and report on them.

The Office of Institutional Analytics, Effectiveness, and Planning (IAEP) facilitated this process, with the goal of producing outcomes data for faculty members at the course and program level. Doing so would make it easier to identify areas of focus for change or improvement. According to Dr. Will Miller, Executive Director of IAEP, however, the first attempts at assessing student learning outcomes faced various obstacles. “The process involved using an Excel spreadsheet to capture and assess the information. There were also a lot of issues with the raw data, as much of it was inconsistent, making for a very cumbersome and time-consuming process. If we were lucky enough to even capture it all, it still took us roughly four or five months to report the information out to the campus community and individual stakeholders.”

Ultimately, the IAEP staff realized they couldn’t turn the data around fast enough to give faculty time for reflection or change. Dr. Miller and his staff knew they needed a better solution. When IAEP set out to identify a different approach, they had specific goals in mind but weren’t sure one system could meet them all. “We wanted something that would not only make the process easy and intuitive for our faculty, but would give us the ability to actually do something meaningful with our outcomes data,” Dr. Miller explained. To his surprise, they quickly found their solution in Campus Labs® Outcomes, offered as part of a comprehensive assessment platform that seamlessly connected with their accreditation and program review tools.

From a process standpoint, a good deal was already in place before the Outcomes toolset was implemented on campus. “We had curriculum maps and program assessments, but nothing very meaningful or direct…nothing that faculty could actually look at and build something off of to help them do their job better.” While the landscape was ripe for a better outcomes assessment process, faculty were still hesitant. They feared an increased workload, and even the judgment that might come with a new process. Would they be perceived as too lenient if too many students were meeting their course outcomes? Or perhaps they would be seen as too demanding if not enough students were meeting them. “It took a bit of championing to get the ball rolling.”

Once faculty started using the Outcomes toolset and saw all of the easily accessible and valuable data, the fears soon dissipated and buy-in happened quickly. “The feedback has been overwhelmingly positive,” Dr. Miller reported. “The faculty sentiment is that it is a simple and clear process.”

While the campus-wide adoption of Outcomes has been helpful to the IAEP Office for data collection and tracking, it’s the nearly endless analytical and reporting possibilities from the data that are most exciting. “From our perspective, the aggregation data capabilities are incredibly valuable. It’s one thing to see outcomes assessment at a course or program level, but being able to see it at the institution level is a game-changer.”

One of the most valuable benefits of using Campus Labs Outcomes has been the ability to visualize the breakdown of outcomes within Bloom’s taxonomy by different levels: academic major, program, division or school, and the overall institution. “The report is clear, whereas eyeballing a curriculum map for issues can often feel subjective for faculty.” Thanks to the Outcomes reporting feature, gaps in learning are easily identified. This became immediately evident when IAEP looked specifically at the general education program outcomes. “Seeing the gaps in the curriculum generated immediate conversation amongst the general education faculty about the goals of the curriculum,” Dr. Miller said. “There are now concrete plans in place to make adjustments in preparation for the upcoming academic year to more closely align the learning outcomes with the curriculum to better impact student learning.”

The Outcomes toolset has reshaped the landscape of assessment expectations campus-wide. “Now that we have a solution in place to effectively collect and report data in a timely manner, the expectations on campus have been raised. Moving forward, faculty will be asked to specifically address how they are using the outcomes data to impact student learning. For example, is the data telling them their curriculum meets the intended learning goals, or should the outcomes be adjusted?”

The Office of Institutional Analytics, Effectiveness, and Planning at Flagler had conducted extensive research before selecting Outcomes. Already reaping the benefits of a transformed assessment process, Dr. Miller and his team couldn’t be happier with their decision. “It was evident very early on in our research that it was the best tool available. Not only would we have an easy-to-learn and intuitive tool to conduct outcomes assessment, but we could actually do something meaningful with the data, and in a timely manner, in order to better impact student learning. This tool has truly been the key to assessment for us, campus-wide.”