State testing has long been the bane of Ohio's primary and secondary students. But those students are likely unaware that most of their hard work is graded by computers.
Artificial intelligence (AI) programs, so-called "robo graders," score about three-fourths of completed state tests. That includes the essay portions, which the AI scores by comparing their thematic content to model responses written by people.
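The comparison described above is typically some form of text-similarity scoring. The vendor's actual method is not public; the following is a minimal, purely illustrative sketch of the general idea, using simple word-count cosine similarity against hypothetical scored model responses:

```python
import math
from collections import Counter

def cosine_similarity(a: str, b: str) -> float:
    """Cosine similarity between the word-count vectors of two texts."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in set(va) & set(vb))
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

def score_essay(essay: str, model_responses: dict[int, str]) -> int:
    """Assign the score whose model response the essay most resembles.

    `model_responses` maps a score level to a human-written exemplar;
    both the scoring scale and the exemplars here are hypothetical.
    """
    return max(model_responses,
               key=lambda score: cosine_similarity(essay, model_responses[score]))

# Hypothetical exemplars for a two-level rubric:
models = {4: "industrial revolution changed labor", 1: "i like dogs"}
print(score_essay("the industrial revolution changed labor markets", models))
```

Real automated essay scorers are far more sophisticated (grammar features, trained statistical models), but the core idea of comparing a response against human-scored exemplars is the same.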
What concerns some educators is that AI scoring can directly affect candidates for high school graduation and students under the state’s Third Grade Reading Guarantee. The latter determines whether third graders advance to fourth grade.
They also question why computer grading of state tests was allegedly never announced by the Ohio Department of Education or debated publicly before its initiation. Reportedly, it came to light only after anomalies in scoring were discovered last year by individual school districts.
Both Paolo DeMaria, state superintendent, and Brian Roget, interim director of the ODE Office of Curriculum and Assessment, were unavailable for comment.
Chris Lake, Swanton Local Schools superintendent, said he questions all aspects of the state testing program.
“How many millions of dollars have been spent to create the vast array of state tests, and now the computer programs to grade them?” he said. “How did this particular company get the contract for the State of Ohio and how much money do they rake in each year to provide the tests and grading services? Am I shocked that computers are being used to grade these tests? No, not really.”
Lake said students would be better served by having an actual person grade the tests' essay portions. He said that while parents have not asked about computer grading, "I doubt many people are even aware of it."
He added, “Again, nothing shocks me when it comes to state testing. I hate to sound so cynical but it’s just another opportunity for someone to make money off of the public education system in Ohio.”
Archbold schools superintendent Jayson Selgo thinks the computer grading, which is meant to expedite test scoring and results, could be a benefit.
“In my opinion, that is helpful to school districts and allows us to begin reviewing the data and identifying areas for improvement quicker,” he said. “…If AI grading can be more efficient and accurate than human graders, then I certainly support the decision to expedite the process and allow districts to have the results sooner. I understand there is some discrepancy on the accuracy, but until I see data one way or another I will reserve my official support or opposition.”
Selgo said the claim that no public opinion on AI grading was sought before it began is, itself, a matter of opinion. He said the ODE’s Board of Education is appointed to represent the public.
“Therefore, I would suspect that ODE would argue they did allow stakeholders a part of the decision making process,” he said.
To his knowledge, no complaints about assessment grading have been filed.
District 47 State Representative Derek Merrin declined to comment, saying he’s not familiar enough with Ohio’s test grading process.
Jim Hoops (R-Napoleon), District 81 state representative, said he generally doesn't have a problem with AI grading, "because (it) can get the test results back to the schools much faster." In one sense it's beneficial, Hoops said, since computers can't show bias against the test taker.
Hoops said to his knowledge both the question-and-answer and essay portions of a state test are examined by a person if any issues arise. He conceded he’s not familiar enough with AI grading of test essays to comment.
“If there are some issues, and people feel it’s not fair, I do have some issues with it,” he said. “I’d have a big concern if I started hearing from schools about it. Testing is a big issue, and realizing the number of tests that kids have to take is unreal.”
He noted that there can also be issues with human essay graders. “They can look at things differently. There can be a bias,” he said.
Because he's unsure of the circumstances, Hoops reserves judgment about reports that the ODE didn't garner public opinion about computer grading. But he said DeMaria "has been doing a pretty good job reaching out to the schools and trying to communicate."
Reach David J. Coehrs at 419-335-2010.