Privacy & Data Security Law News

Police Use of Facial Recognition Tech Approved in Sweden

Oct. 25, 2019, 8:22 PM

Sweden will permit police departments to use facial recognition technology to help identify criminal suspects but will limit how long they can keep the data.

The Oct. 24 decision of Sweden’s Data Protection Authority clashes with bans approved this year in four U.S. cities prohibiting local government use of facial recognition tech. U.S. officials at the local, state, and federal levels continue to debate appropriate uses of the technology.

Police in Sweden had requested an advance decision on whether they could use the technology under Swedish law and submitted an impact assessment to the regulator. The DPA’s ruling allows law enforcement to use facial recognition to compare facial images from closed-circuit TV footage to biometric databases of known criminals, for example.

The DPA said the impact assessment provided the necessary privacy assurances for it to approve the technology. The ruling doesn’t detail specific privacy protection measures included in the impact assessment, and the DPA has yet to publish the assessment.

“It is clear that the Police Authority has taken appropriate technical and organizational measures to protect the personal data being processed, in particular against unauthorized processing, loss, destruction or other unintentional damage,” the DPA said in its decision.

According to the DPA, the processing and storage measures contained in the impact assessment comply with Sweden’s Crime Data Act, which follows the EU’s Data Protection Law Enforcement Directive. Identifying criminal suspects through facial recognition tools is more effective than identifications made by officers “and is thus a means of allowing them to carry out their duties more effectively,” the DPA said in its decision, so the processing doesn’t breach the Crime Data Act.

The EU directive, which Sweden follows, clarifies the rules for law enforcement’s processing of data on crimes and criminals. It limits the collection of personal data to specified, explicit, and legitimate purposes and tightens international transfers of crime-related data. The directive defines biometrics but doesn’t include specific language on facial recognition.

The DPA said police must set a specific time limit for storing biometric data from cameras. The police impact assessment didn’t include any information about how long law enforcement can retain facial recognition data, according to the ruling.

Private Sector Use

The data authority is expected to rule on the use of facial recognition tools in the private sector, DPA spokesman Per Lovgren said. “This technology can be used to identify people in new ways, and brings with it new privacy concerns,” he said.

The EU’s General Data Protection Regulation (GDPR) would govern private sector use of the technology, Jesper Lund, chairman of Denmark’s IT-Political Association, said. Unlike the law enforcement directive, the GDPR implies that facial recognition technology may only be used with the consent of the data subject, he said.

“I would expect to see a higher bar set for the private sector,” Lund said. “It’s early in the game, and there is bound to be fragmented practice with DPA decisions in Europe.”

Companies likely will have to offer alternatives to customers or employees who don’t want their faces scanned, because the GDPR requires voluntary consent, Lund said.

“Despite the challenges posed by the GDPR, the interest in this technology among private companies seems to be growing rapidly,” he said. “In the private security industry, facial recognition technology can replace manual supervision of CCTV cameras and enforce blacklists.”

The Swedish DPA said it intends to conduct follow-up inspections on law enforcement authorities’ use of the technology.

To contact the reporter on this story: Marcus Hoy in Copenhagen at correspondents@bloomberglaw.com

To contact the editors responsible for this story: Melissa B. Robinson at mrobinson@bloomberglaw.com; Rebecca Baker at rbaker@bloomberglaw.com; Keith Perine at kperine@bloomberglaw.com