Higher Ed Impact

Identifying At-Risk Students: What Data Are You Looking At?

Renewed national and public pressure on college completion rates is fueling a continuing surge of interest in "early warning" intervention programs for college students deemed at risk of withdrawal or failure. The earlier an academically at-risk student is identified, the better the prognosis for their success in college. Early alert systems, implemented within the first 4-8 weeks of a term, can be instrumental in initiating interventions that support student success and increase retention.

However, with so many studies offering data on the factors that influence student attrition, it can be challenging to sort through the available information and determine which indicators deserve the most attention, both to proactively identify students who may be at risk prior to enrollment and to drive early alert systems in the first weeks of a semester.

This week, we interviewed Jennifer Jones, a clinical assistant professor and, most recently, director of academic retention at the University of Alabama. Jones has developed a comprehensive and strategic approach to identifying at-risk students.

Predictive Historical Data

The past decade has yielded abundant studies citing factors that can contribute to the likelihood of student attrition. Cohorts often deemed at risk in the published research include the academically under-prepared, students who have taken a gap year between high school and college, students who work full-time, students who are enrolled only part-time, financially independent students who must bear the full cost of their education, students with family obligations, minority populations, and first-generation students, among others. Before the semester even begins, it is worth preparing to track students who fall into more than one of these cohorts. Many adult, nontraditional learners, for instance, will have several of these contributing "at risk" factors -- work, family, and financial obligations may all be competing priorities that exert pressure on their ability to complete a degree.

However, Jones cautions that looking to national trends alone is not enough. "Don't take the national data as gospel. Too often, we forget to look at our own data," Jones warns. The national trends may offer pointers as to what indicators to check for -- but then you still have to check for them. To key in on the most critical indicators of student attrition, Jones recommends relying more heavily on your institution's historical data to identify the specific cohorts on your campus that may be at risk. Be wary of your own (and others') assumptions, and review 4-8 years of institutional data on the persistence rates of student cohorts you expect to persist at low rates -- and also of cohorts you expect to persist at high rates. Because of demographics unique to your institution, you may find surprises.
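
As a rough illustration of this kind of review, the sketch below computes persistence rates by cohort from hypothetical student records. The field names (entry_year, cohort, persisted) and the sample data are illustrative assumptions, not any particular institution's schema.

    # Minimal sketch: persistence rates by cohort from hypothetical student records.
    # Field names (entry_year, cohort, persisted) are illustrative, not a real schema.
    from collections import defaultdict

    def persistence_by_cohort(records, years):
        counts = defaultdict(lambda: [0, 0])  # cohort -> [persisted, total]
        for r in records:
            if r["entry_year"] not in years:
                continue
            counts[r["cohort"]][1] += 1
            if r["persisted"]:
                counts[r["cohort"]][0] += 1
        return {c: p / t for c, (p, t) in counts.items()}

    # Review several years of data, then compare the results against expectations.
    sample = [
        {"entry_year": 2006, "cohort": "honors", "persisted": False},
        {"entry_year": 2006, "cohort": "honors", "persisted": True},
        {"entry_year": 2007, "cohort": "first-generation", "persisted": True},
    ]
    for cohort, rate in sorted(persistence_by_cohort(sample, range(2005, 2011)).items()):
        print(f"{cohort}: {rate:.0%}")  # first-generation: 100%, honors: 50%

Reviewing cohorts you expect to perform well alongside those you expect to struggle is what surfaces the surprises Jones describes.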

SCENARIO: HONORS STUDENTS WHO DON'T PERSIST

It can be tempting to assume that honors students will have a high persistence rate -- and accordingly, to take them for granted. But suppose an enrollment manager at a large state institution examines 6 years of historical data for the institution's honors students, and finds that they are actually persisting at a much lower rate than expected.

Why would students who are academically advanced be less likely to persist?

Further analysis reveals that a large percentage of the honors students were recruited from out of state. Despite an exemplary academic record, these students live at a greater distance from family and from their support networks. There are other factors at play, as well. Despite an excellent high school GPA (which looks good on paper), some of the students lack study skills and are finding the transition from high school to college more difficult than they expected -- while still other honors students are finding their first-year college classes insufficiently challenging.

The lesson to be drawn, Jones suggests, is that you need to identify your own institution's at-risk cohorts. Use the national data to help direct you as to where to start looking, but challenge your assumptions and place the greatest reliance on your own historical data.

DFW Assessment

Beyond examining particular student cohorts, Jones recommends looking at historical data for particular courses. A few institutions have made great strides in their ability to predict and identify at-risk students by means of DFW assessment: identifying which first-year courses show the highest drop, fail, and withdrawal rates. Your historical data can empower you to predict which first-year students are likely to face the most difficulty, based on which courses they are registering for. For example, if you know in advance that your math courses have a high DFW rate, you can move proactively to support the students enrolled in them with supplemental instruction, tutoring, and other interventions. "You can offer the students opportunities to achieve greater success," Jones remarks, "before they even have the chance to fail."
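
A hedged sketch of what such a DFW assessment might look like in practice: it computes drop/fail/withdrawal rates per course from hypothetical grade records, then flags incoming registrations in courses above a chosen threshold. The field layout, grade codes, and 25% threshold are assumptions for illustration.

    # Minimal sketch of a DFW assessment, assuming grade records of (course, grade)
    # and registrations of (student_id, course). Grade codes and threshold are assumptions.
    from collections import defaultdict

    DFW_GRADES = {"D", "F", "W"}  # adjust to local grading and withdrawal codes

    def dfw_rates(grade_records):
        counts = defaultdict(lambda: [0, 0])  # course -> [dfw, total]
        for course, grade in grade_records:
            counts[course][1] += 1
            if grade in DFW_GRADES:
                counts[course][0] += 1
        return {course: d / t for course, (d, t) in counts.items()}

    def flag_registrations(registrations, rates, threshold=0.25):
        # Return (student_id, course) pairs registered in high-DFW courses.
        return [(s, c) for s, c in registrations if rates.get(c, 0) >= threshold]

    history = [("MATH 101", "F"), ("MATH 101", "C"), ("ENGL 101", "B"), ("MATH 101", "W")]
    incoming = [("s001", "MATH 101"), ("s002", "ENGL 101")]
    print(flag_registrations(incoming, dfw_rates(history)))  # [('s001', 'MATH 101')]

Flagged registrations can then feed the proactive supports Jones mentions, such as supplemental instruction and tutoring.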

Real-time Data: What's Most Critical

Finally, you can generate alerts that trigger particular interventions for at-risk students based on data provided in real time during the term. Examples can include:

  • Invite faculty to alert you to warning signs within the early weeks (e.g., missed classes or signs of depression)
  • Identify the weeks in the term that have the highest withdrawal rates, and reach out to students who withdraw
  • Design alerts based on midterm grade reports (does a student have one C-? Two? Three? You can set up tiers of alerts with different priorities -- see the sketch following this list)
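
To make the tiered-alert idea concrete, here is a minimal sketch that assigns an alert priority based on how many midterm grades fall at or below a cutoff. The C- cutoff, the grade ordering, and the tier boundaries are assumptions each institution would set for itself.

    # Minimal sketch: tier midterm-grade alerts by the count of grades at or below C-.
    # The cutoff and tier boundaries are illustrative assumptions.
    GRADE_ORDER = ["A", "A-", "B+", "B", "B-", "C+", "C", "C-", "D+", "D", "F"]

    def low_grade_count(midterm_grades, cutoff="C-"):
        cutoff_rank = GRADE_ORDER.index(cutoff)
        return sum(1 for g in midterm_grades if GRADE_ORDER.index(g) >= cutoff_rank)

    def alert_tier(midterm_grades):
        low = low_grade_count(midterm_grades)
        if low >= 3:
            return "high"    # e.g., direct outreach from an advisor
        if low == 2:
            return "medium"  # e.g., referral to tutoring or supplemental instruction
        if low == 1:
            return "low"     # e.g., automated check-in email
        return None

    print(alert_tier(["B", "C-", "D"]))  # 'medium'

Each tier can then trigger a different intervention, from an automated check-in to a personal call from an advisor.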

LEARN MORE ABOUT IDENTIFYING AND INTERVENING WITH AT-RISK STUDENTS

This "work and learn" program will share a framework or model for designing a system, show how practitioners have implemented these different steps in the model, and give you a chance to brainstorm and outline your future system.

Designing Early Alert Systems for At-Risk Students
Atlanta, GA
February 28 - March 2, 2011

About the Authors

Daniel Fusch, Director of Publications & Research

Daniel provides strategic direction and content for AI’s electronic publication Higher Ed Impact, including market research and interviews with leading subject matter experts on critical issues. Since the publication’s launch in 2009, Daniel has written more than 200 articles on strategic issues ranging from student recruitment and retention to development and capital planning. If you have a question or a comment about this article, feel free to contact Daniel at daniel@academicimpressions.com.