The subtext of Mr. Steinhauer’s argument is this: the "flaw" in PARCC is its accuracy. Based on objective measures, N.J. schools are not adequately preparing a large number of students for college and careers. Therefore, if we follow his logic, high school diplomas don't need to signify readiness. What, then, does a high school diploma mean in New Jersey? Does it signify that a student showed up for class for thirteen years? Or does it signify that a student is prepared for study beyond secondary school?
Last year’s PARCC results were troubling for New Jersey’s educational community. While scores on our old HSPA tests were comforting -- in 2013, HSPA’s last year, 84% of 11th graders reached proficiency benchmarks in math and 93% achieved proficiency in language arts -- PARCC scores painted a starkly different picture. In 2014, PARCC’s debut year, only 41% of 11th graders were rated proficient or above in language arts.
Notably, NJEA and other affiliated lobbyists (Mr. Steinhauer's list includes Education Law Center, Save Our Schools-NJ, and Opt Out N.J.) never opposed HSPA tests or demanded changes to state law. That's because HSPA made N.J. schools look good and PARCC scores make N.J.’s public school system look bad. (Other states’ scores on both PARCC and Smarter Balanced, the other consortium that produces Common Core-aligned assessments, were just as depressing.) So, then, is PARCC a “flawed test”?
Not according to research just released by Education Next. This study, commissioned by the Massachusetts Executive Office of Education, compares the accuracy of PARCC with that of MCAS, Massachusetts’s old college-readiness test. Massachusetts is generally acknowledged to have the best state education system in the country (see rankings from Quality Counts), and MCAS has long been considered the most rigorous high school standardized test.
One conclusion from the Education Next report:
Ultimately, we found that the PARCC and MCAS 10th-grade exams do equally well at predicting students’ college success, as measured by first-year grades and by the probability that a student needs remediation after entering college. Scores on both tests, in both math and English language arts (ELA), are positively correlated with students’ college outcomes, and the differences between the predictive validity of PARCC and MCAS scores are modest. However, we found one important difference between the two exams: PARCC’s cutoff scores for college- and career-readiness in math are set at a higher level than the MCAS proficiency cutoff and are better aligned with what it takes to earn “B” grades in college math. That is, while more students fail to meet the PARCC cutoff, those who do meet PARCC’s college-readiness standard have better college grades than students who meet the MCAS proficiency standard.

In other words, PARCC tests aren’t flawed. They are accurate predictors of student college success.
There are other indicators of PARCC’s accuracy. The highly regarded NAEP assessments correlate closely with PARCC results. So do SAT scores: last year Assistant Education Commissioner Bari Erlichson reported that “44 percent of students who took the SAT in 2015 met the standards for career and college readiness,” which is remarkably close to PARCC results.
The union leadership’s frenzy upon the unveiling of N.J.’s mediocre record of student proficiency isn’t about flawed tests. It’s about resistance to change. Mr. Steinhauer writes that we should award diplomas based on “the professional judgment of the educators who actually work with and know their students.” Sometimes that works. Sometimes it doesn’t. Last year WNYC profiled Wendy Cruz, a high school student in Camden who got straight A’s but couldn’t pass the HSPA. “I’ve been studying my whole life,” she told WNYC, “and I never got left back or anything.”
Wendy and her family deserve honest information. PARCC gives us a chance to offer that honesty, and that’s at the heart of Mr. Steinhauer’s opposition.