DFW Rates and You: Rethinking Support for At-Risk Students

In a recent interview with Academic Impressions, Bernadette Jungblut, West Virginia University’s director of assessment and retention, noted with some dismay that institutions too frequently use data on individual courses’ D/Fail/Withdraw (DFW) rates primarily to evaluate faculty performance, rather than partnering with faculty to examine those rates for clues to the specific challenges students are facing.

Jungblut suggests that historical and current DFW rates are a particularly effective indicator, one that can inform proactive rather than punitive action. Indeed, many institutions have begun identifying students taking courses with high DFW rates as “at risk.” On one level, this is a useful move, provided it prompts both faculty and those charged with student retention to closely monitor real-time, operational data on students in those courses.
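As a rough illustration of that flagging step, here is a minimal Python (pandas) sketch. It assumes two hypothetical tables that the article does not describe: history, with one row per past enrollment (columns course_id and grade), and roster, with current enrollments (columns student_id and course_id); the 30% threshold is likewise illustrative.

    import pandas as pd

    DFW_GRADES = {"D", "F", "W"}

    def flag_at_risk(history: pd.DataFrame, roster: pd.DataFrame,
                     threshold: float = 0.30) -> pd.DataFrame:
        """Return current enrollments in courses whose historical DFW
        rate exceeds the threshold (names and threshold illustrative)."""
        # Historical DFW rate per course: share of final grades in {D, F, W}.
        rates = (history.assign(dfw=history["grade"].isin(DFW_GRADES))
                        .groupby("course_id")["dfw"].mean())
        high_dfw_courses = rates[rates > threshold].index
        # Flag every student currently enrolled in one of those courses.
        return roster[roster["course_id"].isin(high_dfw_courses)]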

But this is only a preliminary move, taking a “broad brush stroke” approach to tracking the students taking these courses. With a limited amount of digging into student data, it’s possible to take a much more sophisticated and effective approach to identifying and supporting at-risk students in high-DFW courses.

Jungblut offers these ideas.

Taking a Closer Look at Your Data

Have you looked at your historical DFW data and segmented it by level of student preparedness?

For example, West Virginia University assigns each entering student an institutional rating on a five-point scale of academic preparedness, from IR-1 (highest) to IR-5 (lowest). If your historical data show that a given course’s DFWs come almost entirely from IR-4 and IR-5 students, many of those students may simply be unprepared to succeed in the course. You could respond with developmental courses or with an intensive inter-session that helps students get up to speed.
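As a rough sketch of that segmentation (not WVU’s actual tooling), the same kind of enrollment history could be grouped by preparedness rating. The table and column names here (history, course_id, ir_level, grade) are hypothetical.

    import pandas as pd

    DFW_GRADES = {"D", "F", "W"}

    def dfw_by_preparedness(history: pd.DataFrame, course_id: str) -> pd.Series:
        """DFW rate for one course, broken out by institutional rating
        (1 = most prepared, 5 = least prepared)."""
        course = history[history["course_id"] == course_id]
        return (course.assign(dfw=course["grade"].isin(DFW_GRADES))
                      .groupby("ir_level")["dfw"].mean()
                      .rename("dfw_rate"))

A rate that climbs steeply from IR-1 to IR-5 points toward preparedness as the culprit; a flatter profile suggests looking elsewhere, as discussed next.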

But what if your DFW rates have been spread more evenly across both the academically prepared and the unprepared? The issue might be pedagogy, but it might equally be a matter of setting the right expectations for students. Perhaps you need to:

  • Establish a summer bridge or “STEM boot camp” program to help entering students prepare for the rigor and workload of the first term. (For an example, see our August 2012 article on summer bridge programs.)
  • Address the issue with a smarter approach to developmental advising. For example, are all entering students who are interested in a STEM major being advised to take two labs and a math course in their first term (in order to graduate in four years)? If those courses have high DFW rates, reconsider that first term course load. Would it be better to advise a lighter first term and, if the student needs to complete in four years, advise summer session courses? How necessary is it that students take two labs and math in their first term?

Interviewing Your Faculty

Besides taking a more sophisticated look at your quantitative data, Jungblut advocates for a qualitative approach as well — focus groups with students and interviews with faculty. Approaching faculty as partners and as knowledgeable resources can lead to some crucial discoveries.

Jungblut cites the case of an introductory chemistry course with a high DFW rate. A full suite of interventions and student supports had been put in place, including facilitated and peer-led team tutoring, extended faculty office hours, and support from teaching assistants. Yet the DFW rates were not dropping, much to the faculty’s frustration. Jungblut began interviewing faculty to learn what students most needed in order to succeed in the course. The chemistry faculty concurred that two issues were driving the high failure rate:

  • Some students lacked essential mathematical skills.
  • Students were realizing this lack too late in the semester.

West Virginia University is now piloting a project in which the chemistry course is split into two eight-week courses rather than the usual single 16-week course. If students don’t pass the first eight-week segment, a series of diagnostics is triggered to determine what is preventing their success. Some students are then routed to an eight-week course during the second half of the term to develop the missing skills and knowledge. A student who passes that skills course then retakes the chemistry course.

In this way, rather than sitting in the chemistry course for 16 weeks, failing, and then trying to fill the skill gap the following term, a student who would not have succeeded spends the second half of the term acquiring the skills needed to succeed, while continuing to earn credit. “This cuts the timeline in half,” Jungblut notes.

Another Example

The interviews with faculty are critical because quantitative data alone may not tell you why students are struggling. For instance, you might find that preparing students for some majors involves expectation-setting as much as it involves addressing gaps in skills or knowledge.

Jungblut cites the example of undergraduate geology courses at West Virginia University. For some time, these courses had seen high DFW rates, even though the students did not appear to lack the foundational skills needed to succeed. Jungblut then interviewed the faculty in the department, asking what students most needed to attend to during the course. Faculty expressed concern that students seemed to be trying to write down everything said during lectures, when what faculty actually wanted was for students to focus on the geology drawings and descriptions provided in lecture. Many students seemed unaware of faculty expectations about how to extract key information from lectures.

As one step toward a solution, the geology department is preparing an online video tutorial for students in which faculty will clarify these expectations.