How Early Alert and Student Success Initiatives Fail


Spoiler alert: The biggest killer of early alert programs is poor information flow. Here's a model for approaching that flow differently.

When Early Alert Programs Lack a Strong Underlying Framework of Data

We look to research-based best practices and bring program after program to our campuses to address the needs of specific high-risk populations, falling prey to the so-called "program of the month" syndrome. We analyze our retention data at the end of the year, identify a group that needs help, and add a new program or service to address the issues of that student demographic. And so we try new programs and services but can't seem to really change the retention and graduation rates of the student body. Why?

Without a strong underlying framework of data and information flow, programs can easily become information silos—places where good data go to die.

While our programs may prefer to see themselves as cylinders of excellence rather than silos, the lack of information flow and real-time communication between our programs and services reduces both their effectiveness and their efficiency.

A lack of data makes it difficult to identify the right student for outreach at the right time and get them to the right service to support their success. A lack of information flow results in multiple programs or services reaching out to the same student while other students fall through our safety nets.

Addressing these underlying data and information flow problems is critical to designing and implementing an effective student success program.

The Integrated Data and Information Flow (IDIF) Model

[Figure: The Integrated Data and Information Flow (IDIF) Model]

The Integrated Data and Information Flow (IDIF) model identifies four critical steps in the design of any comprehensive student success program. What is often overlooked, however, is the arrow: the flow of data and information that links those four steps. Without a focus on the arrow, on how each step supplies the data and information needed to take critical action at the next step, your early alert or student success program cannot be as effective and efficient as possible.

Here, technology can play a crucial role. It can be used both to create an information and communications hub, keeping everyone on the same page with respect to the student, and to automate as much of the IDIF model as possible, preserving staff and faculty time for high-value activities such as building student relationships and engagement.
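For illustration, here is a minimal sketch (in Python) of what "the arrow" looks like when the four steps are wired together around a shared hub. The function names and the trivial stand-ins for each step are placeholders for whatever tools or manual processes an institution actually uses, not a reference implementation; the point is simply that each step's output becomes the next step's input and everything lands in one shared record.

```python
# Minimal sketch of "the arrow": each step hands its output to the next, and a
# shared hub keeps every office looking at the same record. All names here are
# illustrative placeholders.
def run_idif_cycle(collect, analyze, reach_out, assess, hub):
    """One pass through the IDIF loop over a shared information hub."""
    hub["alerts"] = collect()                  # 1. data collection
    hub["queue"] = analyze(hub["alerts"])      # 2. data analysis / prioritization
    hub["contacts"] = reach_out(hub["queue"])  # 3. outreach and intervention
    hub["metrics"] = assess(hub)               # 4. assessment and improvement
    return hub

# Example wiring with trivial stand-ins for each step:
hub = run_idif_cycle(
    collect=lambda: [("S001", "missed_first_two_assignments")],
    analyze=lambda alerts: [student_id for student_id, _ in alerts],
    reach_out=lambda queue: {student_id: "nudge_sent" for student_id in queue},
    assess=lambda h: {"alerts": len(h["alerts"]), "contacted": len(h["contacts"])},
    hub={},
)
print(hub["metrics"])  # -> {'alerts': 1, 'contacted': 1}
```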

1. Data collection:

To consistently identify the right student at the right time, data from early alert sensors must flow into the system without being impeded by silos or other obstructions.

As not all students become at risk in the same way, multiple data inputs are needed and should be customized to each institution's student population. Both proactive and reactive identification strategies must be used. Proactive strategies focus on the earliest indicators that a student may be at high risk of developing academic difficulties. Reactive strategies focus on actual student behaviors (or the lack thereof) during the semester to determine who is actually at risk of failure.

Key takeaway: Maximize actionable information while minimizing the amount of data collected; this reduces the burden on data collectors and helps lower resistance.
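As a concrete illustration, the sketch below shows one way a shared alert record might capture both proactive and reactive indicators so they flow into a single hub rather than separate office-level silos. The field names and example indicators are assumptions for illustration only; each institution would define its own.

```python
# Minimal sketch of a shared alert record combining proactive and reactive
# indicators. Field names and example indicators are illustrative assumptions,
# not a prescribed schema.
from dataclasses import dataclass
from datetime import date

@dataclass
class AlertRecord:
    student_id: str
    source: str       # who raised the flag (faculty, advisor, LMS data feed)
    indicator: str    # e.g., "no_lms_login_14_days", "midterm_grade_D_or_F"
    proactive: bool   # True = pre-term risk indicator, False = in-term behavior
    raised_on: date
    notes: str = ""

# Proactive indicator, known before classes start
pre_term_flag = AlertRecord(
    student_id="S001",
    source="admissions_data_feed",
    indicator="placement_below_college_math",
    proactive=True,
    raised_on=date(2024, 8, 15),
)

# Reactive indicator, based on observed in-term behavior
in_term_flag = AlertRecord(
    student_id="S001",
    source="faculty_early_alert_form",
    indicator="missed_first_two_assignments",
    proactive=False,
    raised_on=date(2024, 9, 20),
    notes="Instructor reports no response to two emails.",
)

# Both flags land in one shared collection so the analysis and outreach steps
# see the whole picture for the student.
alert_hub = [pre_term_flag, in_term_flag]
```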

2. Data analysis:

Data collection and analysis must be tightly coupled in real time and should be automated to the extent possible to preserve staff time. The goal of data analysis is to identify and prioritize the right student for outreach at the right time.

Key takeaway: Setting appropriate prioritization thresholds is critical to avoid overloading and demotivating the support services staff.
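The sketch below illustrates the idea of a prioritization threshold: risk signals are weighted, aggregated per student, and only students at or above a capacity-driven threshold enter the outreach queue. The weights and the threshold value are placeholders that each institution would tune to its own data and staffing levels.

```python
# Minimal sketch of threshold-based prioritization. Weights and threshold are
# illustrative assumptions an institution would calibrate for itself.
from collections import defaultdict

# Illustrative weights per indicator; higher = stronger risk signal.
INDICATOR_WEIGHTS = {
    "placement_below_college_math": 1,
    "missed_first_two_assignments": 3,
    "no_lms_login_14_days": 4,
    "midterm_grade_D_or_F": 5,
}

# The threshold is a tuning knob: set it so the resulting outreach queue
# matches the capacity of the support staff who must work it.
OUTREACH_THRESHOLD = 4

def prioritize(alerts):
    """alerts: iterable of (student_id, indicator) pairs.
    Returns students at or above the threshold, highest risk score first."""
    scores = defaultdict(int)
    for student_id, indicator in alerts:
        scores[student_id] += INDICATOR_WEIGHTS.get(indicator, 1)
    flagged = [(sid, score) for sid, score in scores.items()
               if score >= OUTREACH_THRESHOLD]
    return sorted(flagged, key=lambda pair: pair[1], reverse=True)

print(prioritize([
    ("S001", "placement_below_college_math"),
    ("S001", "missed_first_two_assignments"),
    ("S002", "no_lms_login_14_days"),
]))
# Both students reach the threshold of 4 and appear in the outreach queue.
```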

DIG DEEPER

If you are finding this article useful, check out these two recorded webinars from Academic Impressions:

Three Solutions for Impacting STEM Retention

Assessing the Effectiveness of Your Retention Programming

3. Outreach:

The goal of outreach is to successfully contact the right student and get them to the right service for the right intervention. To impact student outcomes, data collection, analysis, outreach and intervention must occur as quickly as possible so the student can remediate their situation prior to actual failure. Not all students are engaged at the institution in the same way. Information on the student's situation must reach all possible points of engagement so that the person with the highest likelihood of success can reach out to the student.

Key takeaway: Automating the first outreach attempts can increase the number of students who self-correct before developing more difficult issues.
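As an illustration, the sketch below shows one way an automated first nudge might be assembled and routed through the contact most likely to reach the student. The template wording, channel, and routing rule are assumptions; a real implementation would plug into the institution's existing email, SMS, or advising tools.

```python
# Minimal sketch of an automated first-outreach nudge. The template, channel,
# and routing rule are illustrative assumptions.
NUDGE_TEMPLATE = (
    "Hi {first_name}, we noticed {reason}. "
    "Your {contact_role}, {contact_name}, would like to check in. "
    "You can reply to this message or book a time here: {scheduling_link}"
)

def build_first_nudge(student, reason, contacts, scheduling_link):
    """Route the first, automated contact attempt through the person most
    likely to reach the student (e.g., a coach they already know), falling
    back to a general advisor otherwise."""
    contact = contacts.get("success_coach") or contacts.get("advisor")
    return {
        "to": student["preferred_channel"],  # e.g., campus email or SMS
        "body": NUDGE_TEMPLATE.format(
            first_name=student["first_name"],
            reason=reason,
            contact_role=contact["role"],
            contact_name=contact["name"],
            scheduling_link=scheduling_link,
        ),
    }

message = build_first_nudge(
    student={"first_name": "Jordan", "preferred_channel": "jordan@example.edu"},
    reason="you haven't logged into the course site in two weeks",
    contacts={"advisor": {"role": "advisor", "name": "Sam Lee"}},
    scheduling_link="https://example.edu/book-advising",
)
print(message["body"])
```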

4. Assessment and improvement:

Performance measures must consist of both implementation measures (are people implementing the new behaviors?) and program measures (is the program having an impact?). Programs can fail because they don't work or because people fail to adopt the new behaviors (implementation failure). Early-, middle- and late-stage indicators of student success must be used to demonstrate short-term wins, ensuring continued stakeholder buy-in and motivation.

Key takeaway: Measure what matters, when it matters, both to keep improving and to maintain stakeholder buy-in.
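To make the distinction concrete, the sketch below pairs illustrative implementation and program measures with early, middle and late stages. The specific metrics are examples only, not a prescribed scorecard.

```python
# Minimal sketch of an assessment plan tracking both kinds of measures across
# stages. The metrics listed are illustrative assumptions.
assessment_plan = {
    # Implementation measures: are people actually doing the new behaviors?
    "implementation": {
        "early":  "share of course sections submitting at least one progress report",
        "middle": "median hours from alert to first outreach attempt",
        "late":   "share of flagged students with a documented intervention",
    },
    # Program measures: is the program changing student outcomes?
    "program": {
        "early":  "week-4 course engagement among flagged students",
        "middle": "midterm DFW rate in gateway courses",
        "late":   "term-to-term retention of students who received an intervention",
    },
}

def report(plan):
    """Print the plan by stage so short-term wins surface early for stakeholders."""
    for stage in ("early", "middle", "late"):
        print(f"{stage.title()} indicators:")
        for kind, metrics in plan.items():
            print(f"  {kind}: {metrics[stage]}")

report(assessment_plan)
```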

Changing How We Work

The IDIF model provides institutions with a framework for both designing and implementing a coordinated, comprehensive student success program. By understanding how data and information need to flow between and within various constituencies and support offices, institutions can translate research-based best practices into effective institutional action. And most importantly, by identifying the right student at the right time and getting them the right intervention, institutions can dramatically increase the success of all of their students.