As police departments across the U.S. weigh the use of facial recognition software, several communities are raising privacy concerns over the technology, which can identify people in real time through a combination of artificial intelligence, surveillance cameras and a database of photos for comparison.
Last week, San Francisco – an epicenter of tech innovation – became the first U.S. city to ban the use of facial recognition technology by police.
A few days later, Georgetown Law’s Center for Privacy & Technology released a report on facial recognition software possessed by police departments in Chicago and Detroit.
Using documents from the software vendor DataWorks Plus, the report claims the Chicago Police Department and Chicago Transit Authority have had facial recognition capabilities since at least 2016, and that “the limited information that is available suggests that Chicago is home to the most widespread face surveillance system in the United States today.”
But as the report notes, CPD has been mum on its use of the nascent technology, having previously claimed not to use it.
A CPD spokesperson told WTTW News facial recognition is “seldom used” due to its inaccuracy.
“We do have access to facial recognition systems but given the technological limitations, it is seldom used,” the spokesperson said via email Tuesday. “When it is used, it’s only after a crime has occurred.”
The spokesperson went on to say the facial recognition software isn't used on citywide surveillance cameras and that it draws on a database of photos from CPD and the Illinois Department of Corrections.
Attorney Karen Sheley of the American Civil Liberties Union of Illinois said either scenario is troubling, since the Georgetown report points out an apparent contract between CPD and DataWorks Plus for real-time face-tracking software.
“They’ve been paying for what looks like real-time technology, so they’re either paying for it and they can’t actually use it, which is a waste of money,” Sheley said. “Or they have been using it and they’re not being upfront about it – I don’t know which.”
Northwestern University computer science professor and artificial intelligence researcher Kris Hammond said not all facial-recognition uses are dark and dystopian – take missing or kidnapped persons, for instance.
“The issue is not the technology itself – it’s what you want to do with it; what world do you want to build with this?” Hammond said. “I have to admit, I don’t mind a world in which if I lose somebody, and I’m worried about them, I can find them and I can find them immediately and the government will help me with that.”
Hammond and Sheley join us to share their thoughts on facial recognition software and the questions the technology raises.
Follow Evan Garcia on Twitter: @EvanRGarcia