The remote software company ProctorU will stop offering universities proctoring services that rely solely on artificial intelligence (AI) to monitor tests. The company website explains that depending exclusively on AI for this task has led to an increased likelihood of false positives for cheating.
“What makes AI a good fit for automating repetitive tasks also makes it unsuited for handling nuanced human interactions,” the site explained. “Although AI can pick up on repetitive actions just like humans can, it cannot interpret those actions as being either meaningless or dishonest.”
ProctorU also pointed out that software-only systems can add to the workload of overburdened administrators, who often lack experience detecting misconduct through the application.
ProctorU said it plans to dedicate more than six weeks of training to all of its proctors and to add "several layers of human oversight and supervision to every exam" to guard against human bias.
Mistrust in Remote Proctoring Software
Universities that adopted remote proctoring software during the pandemic have since discontinued its use over usability and privacy concerns. Earlier this year, the University of Illinois stopped using Proctorio software after numerous complaints from students and faculty.
Students have also complained about the discomfort of being proctored by someone they cannot see. Last October, Slate reported on the experience of a young student who had to give an unseen proctor a video tour of her testing space.
“They can see you, but you can’t see them, which I didn’t feel good about,” Madi Mollico, who was taking the Graduate Record Examination, told Slate.
Surveillance anxiety, privacy violations, and discrimination concerns brought about by remote proctoring software have prompted researchers to encourage schools to invest in alternative methods to curb online cheating.