When I was still working for the overly-large school district, one of the things our office did was interpret the terms of service and privacy policies for the ever-growing stream of websites and applications teachers were bringing into their classrooms. At least we did the best we could.
Back when I worked on the instruction side of the overly-large school district, we tried to maintain a good relationship with the people in IT.
Even so, we were pretty sure that most of them really didn’t understand the people they were supposed to be supporting. That was especially true of the folks working on network security.
Every year, in the week after New Year’s Day, the Consumer Electronics Show (CES) sets up in almost every corner of the massive Las Vegas Convention Center. The trade show normally (as in not during a pandemic) hosts tens of thousands of people who are there to see the latest tech gadgets companies plan to release in the near future.1
From a business magazine called Fast Company comes a good opinion piece discussing “Why schools need to abandon facial recognition, not double down on it”.
It’s written by two fellows from the NAACP Legal Defense Fund, who link to a growing body of research indicating that these systems use software that is heavily biased and frequently inaccurate. Which is pretty typical in the AI business these days.
At the start of the pandemic, way back in March 2020 when there was much confusion around how the virus was transmitted, many people decided public surfaces must be at least partially to blame.
Which led stores, hotels, airports, and other public spaces to jump into a very conspicuous effort to disinfect every surface in sight. They assigned workers to continually wipe down everything someone might touch, leaving a distinct disinfectant odor hanging in the air everywhere you went.