There are many things contemporary technology does for us that we simply take for granted. Take the city of San Francisco, which banned facial recognition software back in May. It was a move that made sense from a privacy standpoint; officials just forgot that many city employees carry such software around in their pockets, in the form of Apple’s Face ID unlock feature. It turns out that banning the technology is harder than anyone really thought, which is why Brookline, Massachusetts was careful to exempt “personal devices used by city officials” from its own ban.

Facial recognition has been a tricky subject since its inception. While San Francisco was the first city to ban it outright, the technology has long been criticized both for its bias (it is considered “alarmingly inaccurate” at recognizing minorities, women, and transgender people) and for its role in expanding government surveillance. The ban was meant to stop secret surveillance and protect individual liberties, but that has proven more difficult than anticipated because the software is nearly everywhere.

After the ban, city employees had to figure out whether they owned any facial recognition technology, Wired reports. The problem was sitting in their pockets: Apple’s Face ID unlock feature, which was now illegal even when turned off. San Francisco supervisors recently voted an amendment into effect allowing iPhones and other products with the technology “so long as other features are deemed critically necessary and there are no viable alternatives.” So while city employees are now allowed to use iPhones, the unlocking method itself remains forbidden.

A big part of the reason for the ban, as the outlet reports, was the San Francisco Police Department. It had a mug shot database combined with facial recognition software, along with a facial recognition server running through 2020, and reports say it was also exploring an upgrade to the system. The public was kept entirely in the dark about this, even though an internal email about a new facial recognition engine was sent the very same day the ban was proposed. Facial recognition is also used by the Chinese government to monitor its citizens; one of its most notable applications has been the surveillance of Uighur Muslims, which has since been dubbed “automated racism.”

Of course, private citizens and businesses are still allowed to use facial recognition software in their daily lives (remember when Taylor Swift’s security team used the technology to scan her concert crowds for stalkers?). When confronted with the words “facial recognition,” especially with “surveillance” thrown into the mix, many of us would be hard-pressed to name an example from our daily lives, but the iPhones in San Francisco show exactly how mistaken we are. Anyway, that’s enough of that. Facebook just asked me if I wanted to tag my friend in the photo I posted earlier today.

Source: Wired