Every second, the surveillance cameras installed in each classroom at Niulanshan First Secondary School in Beijing snap a photo. The images are then fed into the Classroom Care System, an “emotion recognition” program developed by Hanwang Technology. It identifies each student’s face and analyzes their behavior: a student rifling through their desk might be labeled “distracted,” while another looking at the board would be labeled “focused.” Other behavioral categories include answering questions, interacting with other students, writing, and sleeping. Teachers and parents receive a weekly report through a mobile app, which can be unsparing: In one, a student who had answered just a single question in his English class was called out for low participation — despite the app recording him as “focused” 94% of the time.
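To make the mechanics concrete, here is a minimal, purely illustrative sketch in Python. It is not Hanwang's actual code; every name, label, and threshold is invented. It simply shows how per-second behaviour labels could be rolled up into the kind of weekly report described above, and how a crude cut-off rule can flag a student for "low participation" even when the same data records him as "focused" 94% of the time.

```python
from collections import Counter

# Hypothetical behaviour categories, mirroring those described in the article.
LABELS = ["focused", "distracted", "answering questions",
          "interacting", "writing", "sleeping"]

def weekly_report(per_second_labels, questions_answered,
                  participation_threshold=3):
    """Aggregate a week of per-second labels into a summary report.

    per_second_labels: one label string per captured frame.
    questions_answered: how many questions the student answered that week.
    participation_threshold: an invented cut-off for flagging low participation.
    """
    counts = Counter(per_second_labels)
    total = len(per_second_labels)
    report = {label: round(100 * counts[label] / total, 1) for label in LABELS}
    # A naive rule: participation is judged only by questions answered,
    # regardless of how "focused" the same data says the student was.
    report["low_participation"] = questions_answered < participation_threshold
    return report

# Toy data: a student recorded as "focused" 94% of the time
# who answered a single question in class.
frames = ["focused"] * 94 + ["distracted"] * 6
print(weekly_report(frames, questions_answered=1))
# -> {'focused': 94.0, 'distracted': 6.0, ..., 'low_participation': True}
```

The point of the sketch is that the "judgement" lives in an arbitrary threshold someone typed into the code, which is exactly the kind of opaque design decision the rest of this post worries about.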
Now imagine coupling this with analysis of every mouse click and keystroke and we have total control, or rather the illusion of control, since the conclusions the AI draws may rest on in-built biases (like the student called out for low participation in the example above). The idea that algorithms can accurately assess a student's emotions by analysing facial expressions is ridiculous; we all know how hard it is to read another person's face. Yet once AI starts making the decisions, it becomes almost impossible to question them, because we tend to see computers as impartial and infallible. However questionable such technologies may be, there is big money to be made: the article claims that the emotion recognition market could be worth more than $33 billion by 2023. Money talks. For further serious concerns about facial recognition, see an article in Mashable, 9 scary revelations from 40 years of facial recognition research.
There are, of course, positive applications of AI in education, but the key questions in every case are who owns the data, on what terms, and whether students have the right to be forgotten. We all need to be very cautious about letting these technologies into the classroom.