People have rightly been demanding better security in our nation’s schools to guard against mass shooting situations. In response, some tech companies have been pitching cameras with facial recognition software and limited artificial intelligence to keep an eye on who is coming and going on campus. This technology is still relatively new and imperfect, but it’s been improving of late. Being able to quickly scan the people entering and leaving the school, comparing the faces to those who are supposed to be there and looking for strangers – and particularly those on watch lists – might prevent a tragedy or at least hasten the arrival of help.
Still, as the Washington Post article points out, privacy activists are already raising the alarm and claiming that this is a terrible idea.
“We’ve gotten no answers to all these questions: Under what conditions can a kid’s face be put into the system? Does the district need parental consent? Who can do a facial-recognition search?” said Jim Shultz, whose 15-year-old daughter goes to a high school in Upstate New York that is paying millions to install a surveillance network offering facial recognition. “It’s as if somebody presented them with a cool new car and they didn’t bother to look under the hood.”…
Andrew Ferguson, a law professor at the University of the District of Columbia, said surveillance companies are preying on the dread of community leaders by selling experimental “security theater” systems that only offer the appearance of safer schools.
“These companies are taking advantage of the genuine fear and almost impotence of parents who want to protect their kids,” he said, “and they’re selling them surveillance technology at a cost that will do very little to protect them.”
Are there still bugs to be worked out with these systems and challenges specific to using them in schools? As with any new technology, of course there are. Children's faces change quickly as they grow and are harder to pick out than those of adults. And the software still makes mistakes. There will always be room for improvement. But is that any reason not to at least try? If the system can be told to watch for the face of a particular person who has made threats, and it can identify that person coming in the door, it could mean the difference between life and death. If a child goes missing, knowing when they were last seen on campus could provide a significant lead to authorities.
But most of these complaints seem to be focused on the privacy issue. If we let that stop us, we're not going to make much progress. Once again this seems to boil down to a conversation that liberals never want to have. It plays out something like this:
Liberal: We need to protect the children in our schools!
Conservative: Agreed. We should have armed security guards.
L: Nope. That just brings more guns into the system. Start banning all the guns.
C: How about if we lock some of the doors and control access at the rest?
L: Nope. Might be a fire hazard. Start banning all the guns.
C: Maybe if we put in this facial recognition thing to keep an eye out for known predators and threats?
L: Nope. Might violate somebody’s privacy. Start banning all the guns.
In the end, the people supporting all of these “save the students” rallies appear to have only one thing in mind. They want new laws banning more guns. That’s kind of a pity because there are some other options under discussion which might actually make the schools more secure. They just don’t want to hear about them.