Indiana has built the data systems that could answer this question. What it has not done is make the answers visible.
On paper, the state is well equipped. Indiana maintains a State Longitudinal Data System that links K–12 records with higher education and workforce data. In principle, it can track students across years and answer precisely the kinds of questions teachers and families care about. What happens to students who struggle in middle school? Where are they five or ten years later? Do early academic failures predict later disengagement, or do many students recover?
Yet despite this infrastructure, there do not appear to be publicly available, Indiana-specific longitudinal studies that follow eighth graders who fail ILEARN forward into adulthood. Instead, what we get are two partial substitutes: snapshot reports of yearly proficiency and broad national research that gestures toward likely outcomes without grounding them in Indiana cohorts.
That gap matters.
What Indiana Actually Tracks
Indiana’s longitudinal system was designed for exactly this purpose. It links student data across grade levels, postsecondary enrollment, and employment so that long-term outcomes can be studied over time. The architecture exists. The data exist. Researchers working under state approval can access them.
Classroom teachers and the general public, however, cannot. The analyses that might tell us how eighth-grade performance connects to graduation, persistence, or later stability largely remain internal or buried in technical research projects. The system is built to see the future. We are simply not shown what it sees.
What Is Publicly Reported Instead
Public reporting focuses almost entirely on annual proficiency rates. In recent years, roughly forty percent of Indiana students in grades three through eight have scored proficient in English Language Arts on ILEARN, with seventh and eighth grades among the weakest points. Middle school ELA performance has slipped rather than improved, and state officials regularly name late middle school as a critical area of concern.
What those dashboards do not show is what happens next. There is no public cohort analysis that follows non-proficient eighth graders forward. No simple accounting of how many eventually graduate on time, enroll in college, or disengage altogether. The conversation remains fixed on this year’s scores, not on students’ lives.
What Longitudinal Research Tells Us More Broadly
National and Indiana-based longitudinal research does point in a consistent direction: chronic absenteeism, course failure, and low middle school achievement are strongly associated with later risks, including high school disengagement, higher dropout rates, and diminished postsecondary participation.
Indiana researchers have used earlier assessment systems and diagnostic tools to study growth over time and the effects of interventions. These studies are valuable, but they still stop short of telling a story that parents or teachers can easily grasp. We do not see five- or ten-year outcome narratives tied to a concrete marker like failing eighth-grade ILEARN.
Why This Question Still Matters
The stated purpose of statewide longitudinal data systems is not merely accountability. It is wisdom. These systems exist so states can judge whether educational decisions actually improve students’ long-term learning, attainment, and well-being.
Yet the public conversation remains trapped in annual percentages. We argue about whether scores went up or down by two points, while the deeper moral question goes unanswered. Where are these children later? Did the system help them grow, or did it quietly lose them?
Teachers ask this question not out of cynicism but out of responsibility. We see students struggle at thirteen and fourteen and wonder whether the labels we attach to them will harden into trajectories.
What Can Be Done Now
If someone wants a definitive statewide answer, the most direct path is through researchers already working with Indiana’s longitudinal data. University partners and IES-funded projects are the ones capable of running exactly this analysis, if they choose to ask the question.
At the local level, however, a district-based approach may be more realistic and more humane. Following one or two eighth-grade cohorts through graduation and into early postsecondary life would yield insights that are immediately actionable; the sketch below outlines what such a follow-forward could look like. It would also restore a sense of proportion. Not every student who fails a test fails at life. But some patterns are real, and ignoring them helps no one.
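As a rough illustration only: the sketch below assumes a hypothetical district extract, grade8_cohort_2017.csv, with invented column names such as ilearn_ela_level, graduated_on_time, and enrolled_postsecondary_1yr (the kind of fields a student information system joined to National Student Clearinghouse matches might supply). It simply compares later outcomes for students who did and did not reach proficiency in eighth grade; it describes the shape of the analysis, not any actual Indiana data system.

```python
# A minimal cohort follow-forward sketch. The file and every column name
# below are hypothetical stand-ins, not a real district or state schema.
import pandas as pd

# One row per student in a single eighth-grade cohort.
cohort = pd.read_csv("grade8_cohort_2017.csv")

# Flag students who did not reach proficiency on eighth-grade ELA ILEARN.
# (The level labels here are assumed, not quoted from ILEARN reporting.)
cohort["not_proficient"] = cohort["ilearn_ela_level"].isin(
    ["Below Proficiency", "Approaching Proficiency"]
)

# Compare later outcomes for the non-proficient and proficient groups:
# the mean of each 0/1 outcome column is the share of students with it.
outcomes = (
    cohort.groupby("not_proficient")[
        ["graduated_on_time", "enrolled_postsecondary_1yr"]
    ]
    .mean()
    .round(3)
)

print(outcomes)
```

Even an analysis this simple, run on a real local cohort, would shift the conversation from this year's percentages to what actually happened to last decade's eighth graders.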
Indiana has the data. What it lacks is the will to translate that data into stories that tell us whether our schools are truly serving the children who struggle the most.
And until we ask where those children are years later, we are only pretending to measure what matters.