Welcome back to Pearson is Everywhere.
Last time we looked at how Pearson lowered the bar on the GED after test results plummeted nearly 90% once the test was aligned to the Common Core. Now we find that the results from the Common Core PARCC test also have issues.
The PARCC is mainly given online, but it is also offered with pencil and paper. A report discussed at EdWeek finds that students who used pencil and paper scored significantly higher than those who took the test online.
PARCC officials seemed to blame ‘familiarity’ with the system for some of the gaps:
“There is some evidence that, in part, the [score] differences we’re seeing may be explained by students’ familiarity with the computer-delivery system,” Nellhaus said.
In general, the pattern of lower scores for students who took PARCC exams by computer is the most pronounced in English/language arts and middle- and upper-grades math. – EdWeek
So not only has the test itself been controversial in quality, but now the online platform itself is showing it has an impact? What this is saying is that kids have to be good at the online tool in order to score well? Wow. This is not good news for the “digital Ed” crowd.
It’s also not good for Pearson, whose stock has taken hits for the last few years over Common Core and which has been the subject of undercover videos exposing the textbook publisher as being ‘all about the money’. Those undercover videos also brought to light more detail on allegations that Pearson engaged in bid rigging in California.
Multiple states found “significant” differences between the pencil-and-paper version and the online version:
In December, the Illinois state board of education found that 43 percent of students there who took the PARCC English/language arts exam on paper scored proficient or above, compared with 36 percent of students who took the exam online. The state board has not sought to determine the cause of those score differences.
Meanwhile, in Maryland’s 111,000-student Baltimore County schools, district officials found similar differences, then used statistical techniques to isolate the impact of the test format.
They found a strong “mode effect” in numerous grade-subject combinations: Baltimore County middle-grades students who took the paper-based version of the PARCC English/language arts exam, for example, scored almost 14 points higher than students who had equivalent demographic and academic backgrounds but took the computer-based test.
“The differences are significant enough that it makes it hard to make meaningful comparisons between students and [schools] at some grade levels,” said Russell Brown, the district’s chief accountability and performance-management officer. “I think it draws into question the validity of the first year’s results for PARCC.” – EdWeek
Four out of five students who took the PARCC did so online.
*This article was originally posted at StopCommonCoreNC.org