Delaware’s record-low dropout rate has been cast as a signature accomplishment. A closer look at the numbers, however, reveals a more complicated story.
The number was good. The timing was even better.
On Feb. 17 of this year, Delaware’s Department of Education announced the lowest dropout rate in state history.
Of the roughly 38,900 students who enrolled in Delaware high schools in fall 2013, just 817—or 2.1 percent—left school during the following year. Two years earlier those figures had been 1,527 and 3.9 percent, respectively. In just 24 months, Delaware managed to nearly halve the number of students dropping out of its high schools.
At least that’s what the numbers suggested.
Two days after the announcement, U.S. Secretary of Education Arne Duncan swept into Wilmington for what felt, at times, like a victory tour.
“We didn’t plan it this way. We scheduled this months ago,” said Duncan in a luncheon address to the Rotary Club of Wilmington. “But seeing the paper the last couple days — high school graduation rates at an all-time high, dropout rates at 30-year lows, AP [advanced placement test] passing rates at all-time highs — that’s extraordinary progress.”
Gov. Jack Markell chimed in with a similar sentiment.
“There are amazing things going on,” Markell said to applause from the crowd packed into the DuPont Hotel’s Gold Ballroom.
In the months since, Delaware’s Department of Education has enthusiastically and repeatedly cited the declining dropout rate as evidence of progress. Officials there have also used it as a shield against lawmakers and special-interest groups who question the state’s reform agenda.
But the department’s sunny interpretation obscures a deeper truth: Delaware’s tumbling dropout rate is as much a triumph of better bookkeeping as it is a triumph of better education. In fact, evidence suggests that Delaware has not done a significantly better job over the last two years of keeping students in school.
The state has, however, done a much better job tracking and labeling its students.
“Any kind of claim of a spectacular drop in the dropout rate probably isn’t true,” says Jeff Klein, coordinator of research, development and evaluation with the Appoquinimink School District. “It’s just that we were never capturing what the true dropout rate was.”
Thanks to large-scale data cleanup, a dropout rate that was once artificially inflated has fallen sharply. Today’s dropout rate is likely more accurate than it’s ever been, but to consider it a marker of real school improvement would be a generous interpretation of the facts.
“Some of it is more about record keeping and accounting than it is necessarily about student achievement,” says Jeff Menzer, former principal at William Penn High School and a current employee with the Colonial School District. “It doesn’t sound good, but it’s true.”
A simple fraction
The dropout rate is a simple fraction. More specifically, it is the total number of dropouts divided by the total number of high school students.
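Using the figures cited above (817 dropouts among roughly 38,900 students enrolled in fall 2013), a minimal sketch of that calculation:

```python
def dropout_rate(dropouts, enrolled):
    """Dropout rate as a percentage: total dropouts / total enrolled."""
    return 100 * dropouts / enrolled

# Figures reported by Delaware's Department of Education, per the article
print(round(dropout_rate(817, 38_900), 1))  # 2.1 percent for 2013-14
```

The same arithmetic with the earlier figures (1,527 dropouts) reproduces the 3.9 percent rate reported two years before.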
In the past, state officials would tell each district its dropout rate at the end of the school year. Districts could contest that number if they wanted and ask to see the list of students being counted as dropouts. They could then appeal on a student-by-student basis, offering evidence that the students had not, in fact, dropped out.
Such requests were rare. Adrian Peoples, a research and technology associate with Delaware’s Department of Education, estimates the state received roughly “five-to-10” appeals in a typical year.
“In the past, all we gave you was the graduation rate,” Peoples says. “There wasn’t much to appeal.”
Two years ago, the state introduced a new computer application called the “Dropout Verification System,” or DVS.
Through DVS, the state now sends each district and charter school a list of students it believes to have dropped out based on enrollment data. The state then reminds districts to review that list and submit appeals when they believe a student has been erroneously labeled a dropout. (In 2014, the state sent three reminder e-mails and even extended the reporting deadline.) If the district can provide evidence that the student in question transferred or died, the state removes the student from the dropout rolls.
As soon as districts started digging into the numbers, one thing became clear: Hundreds of students were being mislabeled.
In 2013, DVS’s first year, districts submitted 436 successful dropout appeals, according to figures provided by the Delaware Department of Education. The next year, as more districts embraced the new system, the state approved 825 dropout appeals.
That’s 825 students who weren’t dropouts, but probably would have been labeled as dropouts prior to the implementation of DVS.
Had districts submitted the same number of appeals in 2012, the state’s dropout rate would have likely been less than 2 percent (see * for further explanation below) — a far cry from the 3.9 percent figure reported at the time, and actually a smidge lower than the current dropout rate.
During the three years before the state introduced DVS, the dropout rate was essentially flat — fluctuating between 3.7 and 3.9 percent. In the first year of DVS’ existence, the state’s dropout rate fell a full point — from 3.9 percent to 2.9 percent. The next year it dipped again, to 2.1 percent.
Not surprisingly, the districts driving much of that improvement were the same districts submitting appeals.
The Christina School District had 113 successful appeals in 2013. Its dropout rate fell from 9.3 percent to 6.1 percent. In 2014, Christina submitted 273 appeals, and its dropout rate fell to 3.4 percent.
The Caesar Rodney School District didn’t submit a single dropout appeal in 2013, and its dropout rate rose from 3.3 percent to 4.4 percent. The next year, it sent in 68 successful appeals and the district dropout rate plummeted to 1.4 percent.
The Capital, Woodbridge, Smyrna, New Castle County VoTech, and Cape Henlopen School Districts all displayed similar patterns: as the appeals rate went up, the dropout rate went down.
‘An anticipated result’
Correlation, of course, does not equal causation. But the numbers do show that as districts began submitting more appeals through the Dropout Verification System, the number of students classified as dropouts began to fall for the first time in years.
“This is an anticipated result,” says Peoples, who helped develop DVS for the state. “We certainly have very committed district and school level folks that are very, very interested in getting those numbers right.”
Dan Weinles, supervisor of research and evaluation with the Christina School District, is one of those district-level folks.
“We want our data to be accurate,” Weinles says. “And I believe that our graduation and dropout rates were inaccurate prior to this.”
Christina is one of the state’s poorer districts, and its student population is uncommonly mobile. The burden falls on school-level employees to properly document each transfer. And that leads to mistakes, apparently a lot of them.
“Before we had this more intense and transparent process of reviewing and appealing erroneous cases obviously there were a ton of errors in the mix,” Weinles says.
Once Weinles knew exactly which students the state considered dropouts, he urged each of the district’s high schools to review the list and provide contrary documentation. In some cases, the evidence was easy to find — so easy, Weinles says, that he often discovered so-called “dropouts” listed as transfers in other state databases. Tougher cases required phone calls and e-mails to schools in other districts or other states to hunt down necessary documents.
The detective work wasn’t easy, but it paid dividends. In two years, Christina shed 238 dropouts, a figure that accounts for roughly a third of the state’s total dropout decline between 2012 and 2014.
“This cleanup process has, I’m sure, contributed greatly to those declines in dropout rates,” Weinles says.
Getting into ‘the weeds’
Other districts took similar measures.
The New Castle County Vocational Technical School District began asking parents to appear at its high schools in person before withdrawing students. That allowed the district to better track student movement and gather evidence for future appeals.
The district’s dropout rate dipped two percentage points the year after Delaware introduced DVS.
“It’s not, ‘Oh, here’s this innovative thing we’re doing,’” says Kathy Demarest, the district’s community relations and public information officer. “The new and innovative thing is that we’re tightening up our reporting.”
She adds, “If you don’t get into the weeds, you don’t know what’s there.”
Courtney Voshell, an administrator at Dover High School who recently became school principal, made sure her staff knew exactly what happened to students once they transferred.
“I have become insane about the paperwork,” she says.
When students transfer out of Dover High, their names go into a separate file. Every month, Voshell meets with guidance counselors to ensure they are keeping close tabs on those transfer students so they can produce appeals evidence when needed. The school also added a guidance office clerk who helps sort through all the potential appeals cases. Those resources are vital for Dover High, a large school with a relatively transient population.
“You play the game. You track the kids,” says Voshell. “And it’s a little bit sad because the kids are numbers and they’re not kids.”
Dover High School is the only comprehensive high school in the Capital School District. The first year DVS went live, the district submitted 32 successful appeals and its dropout rate fell from 4.9 percent to 2.4 percent. Capital submitted the same number of appeals the next year, and its dropout rate rose to 2.7 percent.
The Smyrna School District tasked a secretary at the local high school with finding and filing evidence for dropout rate appeals.
“It really falls on the schools to do that tracking,” says Sandy Shalk, the district’s director of curriculum. “You have to spend a lot of time on the phones just finding out where [the students] are.”
The Colonial School District added a registrar’s office in order to better keep track of transfer students. The effort didn’t lead to many dropout rate appeals, but it did help the district to clean up another data point—its graduation rate.
In 2012, the state introduced a computer tool called the Cohort Management System (CoMS). Though separate from DVS, CoMS functions in much the same way. Each year, CoMS generates a list of students the state believes are due to graduate in a given district. Districts can then appeal if they think students are listed in error.
Colonial School District submitted over 100 appeals for the graduating classes of 2013 and 2014. In 2014, the district graduation rate climbed 10 percentage points.
The state graduation rate also rose sharply in 2014, going from 79.9 percent to 84.4 percent.
The appeals process likely played a role in that spike. Districts submitted 1,346 successful graduation rate appeals in 2014, according to figures provided by the Delaware Department of Education. That’s more than a thousand cases of students erroneously counted as non-graduates in a graduation cohort that ended up having just over 9,700 students.
For the 2013 graduating class, districts submitted 1,297 successful appeals. In 2012, the number was 497.
The correlation between graduation rate appeals and graduation rate improvement is not nearly as strong as the correlation between dropout rate appeals and dropout rate improvement. Though appeals rose modestly in 2014, they did not rise enough to account for the larger jump in the state’s graduation rate.
Mix of factors
When approached for this story, state officials agreed that data cleanup has helped boost the state’s graduation rate and lower its dropout rate. They argue, though, that the better numbers also reflect, at least in part, substantive educational improvements.
“The state credits its improved graduation and dropout rates to a combination of factors,” department spokesperson Alison May wrote in response to an information request from Newsworks/WHYY.
May pointed first to “stronger data systems” that have allowed districts to intervene when students demonstrate at-risk behaviors. Though CoMS and DVS don’t allow for this sort of real-time monitoring, the state has distributed a third application—dubbed the Dropout Early Warning System—that “allows schools and districts to identify students who may be at risk of dropping out.”
Even before the state developed the Dropout Early Warning System, there were encouraging signs. A 2014 study by the Center for Education Policy Research at Harvard University found that the percentage of Delaware freshmen deemed “off-track” in ninth grade declined from 19 percent in 2007-08 to 12 percent in 2011-12.
Those gains, while welcome, do not account completely for Delaware’s improved dropout rates. Rather they have combined with “better reporting,” May says, to drive the dropout number down.
“The improved systems also have helped districts and charter schools report cleaner data, i.e. students that previously were being counted as drop outs may actually have been transfers but districts/charters were not reporting as such and thus they were being counted against their rates,” she wrote.
The case of the Appoquinimink School District shows how difficult it can be to untangle these two potential causes.
The district recently introduced a sophisticated data tool that allows it to identify potential dropouts starting as early as kindergarten. Modeled on software provided by the state, Appoquinimink’s new early-warning system uses academic and enrollment indicators to determine which students may need intervention.
Between 2012 and 2013, the district’s dropout rate fell by 1.4 percentage points. The district also submitted 21 successful dropout rate appeals in 2013.
Jeff Klein, the district’s data guru, believes the new early-warning software played some role in the decreased dropout rate, but he knows that can’t totally explain the sharp dip.
“I don’t think it’s any coincidence that the drop occurred at the same time [DVS] started,” Klein says.
Klein says he looks for “slow and steady” improvements rather than sudden changes if he’s trying to evaluate the effectiveness of a policy or intervention.
“You gotta take a long-term view, which I know isn’t really en vogue anymore,” says Klein. “But that’s real change.”
State board takes note
When department officials presented the dropout and graduation data to the State Board of Education in February, at least one board member, Patrick Heffernan, called attention to these data-cleanup efforts.
“I don’t want to downplay the improvements that we’ve made, but the question is: do these numbers reflect more success with the kids or more success with the administrators tracking the kids and proving that Melvin went to Maryland?” Heffernan asked.
Heffernan also wondered whether districts and schools should be focusing resources on appeals when that same energy could be redirected elsewhere.
“Is it worth 100 hours to track a kid down to make sure he’s counted correctly?” Heffernan said. “Or is it better to spend that 100 hours teaching the kids that you’ve got?”
For the moment, many districts seem to believe that 100 hours is better spent scrubbing data. And although the appeals offer a short-term stopgap for districts looking to rehab their graduation and dropout rates, many are working to catch more mistakes on the front end so that students are never misclassified.
‘You can control this’
When asked why they commit resources to tracking students no longer under their purview, district and school leaders say they can’t afford to do otherwise. In an era when data and accountability matter more than ever, districts fear ugly numbers, particularly ugly numbers that can be fairly easily remedied.
“You can control this,” says Dover High’s Courtney Voshell of the school’s dropout and graduation rates. “The testing is a whole other ball game. But you can control this.”
To be clear, Dover High has made real changes to help improve its graduation rates. They include rejiggering the school’s master schedule to allow for more credit recovery and making sure students take advantage of online course options. But data cleanup has also been vital, particularly when it comes to the dropout rate.
Dover High was named a Partnership Zone School for the 2011-12 school year due to low performance. The designation brought extra money to the school, along with added layers of scrutiny. Voshell and her staff know they can’t let their numbers slip, not with higher-ups watching and not with parents now looking at data points such as dropout rate when they choose between district, magnet, vocational, charter, and private schools.
“You have school profiles and school choice,” Voshell says. “Now those numbers attract kids. The paradigm has shifted.”
In this new environment, schools want to make sure their data is pristine.
“If you don’t tell your story, somebody else will,” Voshell says. “There’s too much at stake.”
A success story, but what kind?
In one sense, the falling dropout rate is evidence of real success. Working together, state and district officials have improved data collection and given the public a far more reliable read on school performance.
The question is one of interpretation.
State officials have used Delaware’s dropout rate to justify their policy choices and ward off criticism. They have equated the improved figure with improved outcomes for high school students. But any such notion comes with serious caveats.
That’s the catch with data, says Kathy Demarest of the New Castle County Vocational Technical School District, “it can imply things that aren’t true.”
* Explanation from above: It is impossible to know exactly what the dropout rate would have been, for two reasons. First, the state doesn’t know exactly how many appeals it received in the years before DVS, although the number was likely close to zero. Second, it’s impossible to know how many of the appeals would have been for students who transferred out of state and thus would have been removed from both the numerator (total number of dropouts) and the denominator (total number of students). Even under the most conservative estimate, assuming that all 825 appeals would have been for students who left the state, the dropout rate in 2012 still would have been around 1.9 percent. In other words, subtracting roughly 800 students from both the 2012 numerator and the 2012 denominator yields a rate of about 1.9 percent.
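The footnote’s counterfactual can be sketched in a few lines. The 2012 enrollment figure is not given in the article, so it is inferred here from the reported 1,527 dropouts and 3.9 percent rate; the adjustment subtracts roughly 800 students from both the numerator and the denominator, per the footnote’s conservative assumption that the appealed students left the state.

```python
dropouts_2012 = 1_527
# Enrollment inferred from the reported 3.9 percent rate (an assumption,
# not a figure published by the state): about 39,154 students.
enrolled_2012 = round(dropouts_2012 / 0.039)

# Conservative counterfactual: ~800 appealed students assumed to have
# transferred out of state, so they leave both numerator and denominator.
appeals = 800
adjusted = 100 * (dropouts_2012 - appeals) / (enrolled_2012 - appeals)
print(round(adjusted, 1))  # about 1.9 percent, as the footnote states
```

Note that if some of those students had instead transferred within Delaware, they would stay in the denominator, and the counterfactual rate would come out slightly lower still.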