
CSU Faculty Dashboards - Early-stage Research

During my time in the Center for Usability in Design and Accessibility (CUDA), I worked on a project for the CSU Chancellor's Office involving usability research on early versions of their CSU Faculty Dashboards.

Background

CUDA was hired by the CSU Chancellor's Office to conduct usability research on early versions of the CSU Faculty Dashboards. These dashboards showed information about student performance in various departments over time (e.g., pass/fail rates) and the breakdown of student genders and ethnicities. The dashboards are intended to show faculty members where equity gaps may exist so that they can address those gaps and help improve student performance in their classes.


Research Objectives and Goals

Two research studies, a usability study and a focus-group-style interview, sought to gather initial impressions of and feedback on the early mockups of the dashboard pages, which included a prototype chatbot feature. Areas of interest were:

  • Did faculty members make use of the chatbot feature? If so, what were their initial impressions and feedback?

  • What initial impressions and feedback did faculty members have about the data presented?


Seven faculty members were recruited for the usability study, and three pairs of faculty members (six individuals in total) were recruited for the focus-group interviews. Participants came from a variety of departments and CSU campuses, and a mix of "expert" and "novice" users of the Faculty Dashboards was recruited.


For testing, I worked alongside a fellow lab member to develop usability test scripts, facilitate testing sessions, analyze qualitative and quantitative data, summarize findings in reports, and present findings to the CSU Chancellor's Office.


Research Methods

For both the usability study and the focus-group-style interview, participants were asked to use the think-aloud method and share their thoughts aloud as they performed tasks and/or viewed dashboard pages.


In the usability study, participants were asked to assume the role of a committee member on a student success task force and explore the CSU Student Success Faculty Dashboard. They performed three tasks:

  1. Find factors that affect student success and timely graduation in their department.

  2. Find information on equity gaps in their department and report their findings back to the committee.

  3. Look into course GPA gaps between different ethnicities in their own department.


Performance metrics were collected and analyzed, such as the number of clicks for navigation and presentation (e.g., zooming in on graphs), time on task, and the number of times participants used the chatbot feature. Usability issues were categorized as major issues, minor issues, or showstoppers, and comments about the chatbot were categorized as positive, negative, or neutral.
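
To illustrate, here is a minimal Python sketch of how per-task metrics like these could be summarized. The session data and field names below are hypothetical, not the actual study logs or analysis scripts:

    from statistics import mean

    # Hypothetical session logs; field names and values are illustrative only
    sessions = [
        {"task": 1, "clicks": 14, "seconds": 210, "chatbot_uses": 1},
        {"task": 1, "clicks": 9,  "seconds": 150, "chatbot_uses": 0},
        {"task": 2, "clicks": 22, "seconds": 340, "chatbot_uses": 2},
        {"task": 2, "clicks": 17, "seconds": 280, "chatbot_uses": 0},
    ]

    # Summarize clicks, time on task, and chatbot use per task
    for task in sorted({s["task"] for s in sessions}):
        rows = [s for s in sessions if s["task"] == task]
        print(f"Task {task}: "
              f"mean clicks {mean(r['clicks'] for r in rows):.1f}, "
              f"mean time on task {mean(r['seconds'] for r in rows):.0f}s, "
              f"total chatbot uses {sum(r['chatbot_uses'] for r in rows)}")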


In the focus-group-style evaluations, participants were asked to view the dashboard webpages as if the data reflected their own students. Their initial impressions and feedback throughout the session were analyzed and placed into one of five categories: affective, social, cognitive, employment, and recommendation/criticism.


At the end of both evaluations, participants completed a Google Forms survey consisting of the System Usability Scale (SUS) questions.
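
For reference, SUS responses are conventionally scored on a 0-100 scale. Below is a minimal Python sketch of the standard SUS scoring, assuming the usual ten items rated 1-5; the example responses are hypothetical, not actual participant data:

    def sus_score(responses):
        """Standard SUS scoring: ten Likert items rated 1-5.

        Odd-numbered items (positively worded) contribute (response - 1);
        even-numbered items (negatively worded) contribute (5 - response).
        The summed contributions, scaled by 2.5, give a 0-100 score.
        """
        assert len(responses) == 10
        total = sum((r - 1) if i % 2 == 0 else (5 - r)  # 0-based: even index = odd item
                    for i, r in enumerate(responses))
        return total * 2.5

    # Hypothetical responses from one participant (not real study data)
    print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0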


Outcomes and Recommendations

Based on observations and participant feedback, notable findings and recommendations from both evaluations were as follows:

  • Although there were no showstoppers, participants generally wanted more options and filters when viewing student data. For example, participants wanted to know more about the individual students in courses with high failure rates, or to compare students from a particular major against the entire campus. Recommendations included adding more data options and filters so that users can make more comparisons and/or view trends over time, giving them a deeper understanding of student performance and of actions to take.

  • Some participants commented that X- or Y-axis labels were missing or unclear, or that chart icons were not descriptive enough. Recommendations included clearly defining graph axes and icons.

  • A few participants noted that some terminology and data were outdated. For example, “Achievement gap” should be updated to “Opportunity gap” to place the onus on the education system rather than the student. Recommendations included ensuring that terminology does not blame students and that the courses shown are up to date with what is actually offered at each campus and department.

  • All participants reacted positively to the chatbot's greeting but noted that its functionality and responses needed improvement, as it relied heavily on specific keywords to pull responses. Recommendations included having the chatbot explicitly state that it does not have an answer and provide relevant contact information, instead of giving an incorrect response. Alternatively, a quick informational video or training could be provided to users when they first use the Faculty Dashboard.
