From a business magazine called Fast Company comes a good opinion piece discussing “Why schools need to abandon facial recognition, not double down on it”.
It’s written by two fellows from the NAACP Legal Defense Fund, who link to a growing body of research indicating that these systems are heavily biased and frequently inaccurate. That’s pretty typical in the AI business these days.
Welcoming facial recognition into our children’s classrooms creates situations ripe for discrimination based on flimsy science. Emerging research is clear that facial recognition technology is inaccurate and reproduces age, race, and ethnicity biases. It also performs more poorly on children as compared to adults due, in part, to facial changes that occur during adolescence. Yet companies continue aggressively marketing facial recognition as a cost-effective public safety solution without disclosing these tools’ inaccuracies and racial and gender biases.
At least one company is selling something called “affect recognition”, an AI-based system that it implies can read student minds by analyzing their faces and interpreting their emotions.
Affect recognition technology relies on the flawed premise that observable differences in physical characteristics among individuals and groups can be measured, quantified, and interpreted in ways that offer insights into a person’s intellect, morality, or trustworthiness. As such, affect recognition falsely assigns scientific significance to racial differences in ways that reproduce racial hierarchy and social inequality. A person’s character, emotions, and “risk level” cannot be discerned from their body or face.
While this system is pitched as a tool to improve learning, it is more likely to be used, Minority-Report-style,¹ to “catch” students who are planning to do something “wrong”. At least according to the unseen, unquestioned algorithms.
When these relatively new AI systems are layered on top of the growing use of video and audio surveillance in schools, and when police agencies are given direct access to the data, you have a situation that “criminalizes students of color” and actually interferes with the learning process.
Read the whole article and then ask your school or district about the surveillance systems they are using or planning to buy.
Community stakeholders can become a powerful voice in pressuring their local governments to join the growing list of cities and states that have banned police use of facial recognition technology. Students—children—should not be monitored by surveillance technology that is flawed, reproduces biased outcomes, and is ill-equipped to do anything beyond erode public trust and safety.
Yep!
The image, of course, is from the movie Minority Report, directed by Steven Spielberg, starring Tom Cruise, and based on a short story by Philip K. Dick.
1. Since this software is likely to be deployed in schools with high minority populations, the reference to that movie seems especially apt.