August 5, 2019; EdSurge
A Gates Foundation-backed program that uses data to help college advisors improve their students' outcomes, the Integrated Planning and Advising for Student Success initiative, or iPASS for short, is not showing the hoped-for results.
One might begin to think the foundation has taken on the unenviable task of testing every likely failure among grand educational notions. Here again, the research that found the program had no discernible impact places much of the blame on implementation, lest we conclude the idea itself was a clunker like the others (see here and here).
Conducted by the Community College Research Center at Teachers College, Columbia University and published by MDRC, an education research nonprofit, the study explored the effects of iPASS interventions at Fresno State in California, the University of North Carolina at Charlotte, and Montgomery County Community College in Pennsylvania.
Across these three schools, the study looked at how 8,011 students, split roughly evenly between an iPASS group and a control group, fared over two semesters.
At Fresno State and UNCC, “the iPASS enhancements produced no statistically significant effects on students’ short-term educational outcomes,” wrote the authors. Perhaps more deflating: “Across the three institutions, large proportions of students who were identified as being at high risk still earn Ds or Fs, or do not persist into subsequent semesters of college.”
Some of the tactics the program used to make the intervention stick—like not registering students before they checked in with advisors—actually deterred registrations rather than improving retention. At Montgomery County Community College, iPASS students earned fewer credits than control group students. Write the authors, “the mechanics of the registration hold may have negatively affected enrollment in seven-week courses that began mid-semester.” This was compounded by inaccuracies in the data advisors used to predict problems: “Some students who seemed to be performing well had been determined to be at risk.”
Writes EdSurge:
The findings from this report are unfortunately not uncommon as others have also tried—and struggled—to effectively implement early-alert systems. Tallahassee Community College in Florida is now trying its third system after faculty said they “hated” previous efforts. Elsewhere, programs based around “nudging,” or touching base with students more frequently to keep them engaged, can have the opposite effect and push students to leave school. These challenges have raised questions about the feasibility of using purely data-driven approaches to quantify whether a student is likely to succeed or fail.
Another classic Gates blunder that might have been foreseen.—Ruth McCambridge