Facial Recognition Ban

Updated: Dec 29, 2020

In a landmark move, Portland, Oregon, has enacted the nation's most stringent, sweeping facial recognition ban. That's good news for people concerned about creeping surveillance and the erosion of civil liberties.

Last week, the city council unanimously passed two new ordinances that prohibit both private companies and city bureaus from using the surveillance software "in places of public accommodation."

The ban on use by government agencies took effect immediately, while the ban on private use goes into effect on January 1, 2021. Private individuals are not subject to the rules, so personal biometric technology, like Apple's Face ID, is still permissible.

"Portland residents and visitors should enjoy access to public spaces with a reasonable assumption of anonymity and personal privacy," the municipal legislation states, in part. "This is particularly true for those who have been historically over-surveilled and experience surveillance technologies differently."

From a practical standpoint, that means government agencies, like the Portland Police Bureau, cannot run facial recognition software on footage from body cameras, dash cams, or any other surveillance equipment. And under the second ordinance, private entities like hotels, 24-hour convenience stores, and even airports (Delta uses facial recognition to check in passengers) cannot employ the technology.

That last provision is notable because it represents a significant leap beyond existing facial recognition bans in other U.S. cities, which are limited in scope. Boston, San Francisco, and Oakland, California, all have bans in place, but they apply only to government agencies.

Portland City Commissioner Jo Ann Hardesty voiced particular concern about the potential misuse of the technology, and the resulting infringement of civil liberties, in a statement she posted on Twitter just before the city council passed the law.

"No one should be unfairly thrust into the criminal justice system because the tech algorithm misidentified an innocent person."

Original source: Popular Mechanics