Spring is here, which also means it's standardized-testing season. While No. 2 pencils and Scantron bubble sheets were once assessment staples, many states have upgraded to computer-based assessments in recent years. The transition, unfortunately, has not been entirely smooth. With the new format, new concerns, including broadband infrastructure, data security, and a variety of digital glitches, have entered the equation.
Since 2013, FairTest, the National Center for Fair and Open Testing, has been documenting examples of computer malfunctions. The organization concludes that legislation, specifically that built around the Common Core State Standards, pushed tech into the marketplace far too quickly, and that the companies being paid to create digital tests simply aren't ready for the platform. "It makes no sense to attach high-stakes consequences to such deeply flawed tools," reads a post on the FairTest website.
So what trends has the organization documented over the past three years? Let's take a look at some graphs.
This first graph shows a surge in testing glitches. While one might expect the number of glitches to decline as companies learn from their mistakes, that isn't quite true yet; it would appear that test companies have not yet hit the "apex of error." The increase also supports FairTest's argument that states have pushed digital tests into schools even though not all districts are prepared to handle the sudden surge in tech. To be fair, many schools lack the broadband infrastructure to support testing on that scale, so the blame can't be placed entirely on the companies behind the exams.
According to FairTest's documentation, 27 states, more than half the nation, have reported testing glitches since 2013. Several in that group have struggled with repeated issues, Indiana among them: the state experienced problems in all three years it contracted CTB/McGraw-Hill for its exams. Earlier this month, WISH 8 reported on five Indiana schools where students were unable to take practice exams because they were either kicked off the server or because the exam was running slowly. Indiana has four reported test malfunctions despite only three years of data collection because 2015 produced two separate rounds of reported issues, one in January and another in the spring.
Ultimately, the data shows an imperfect system, which is particularly troubling given how much is tied to these tests. Also worth considering is the mental toll that computer malfunctions take on students. Will students be as sharp on their third or fourth go-around with a test? Chances are, probably not.