Some Canadian police forces are using facial recognition technology to help solve crimes, but others say human rights and privacy concerns have made them hesitant to adopt the powerful digital tool.
The uneven application of the technology and lax rules governing its use have led legal and AI experts to call on the federal government to establish national standards.
“Until we can better manage the risks associated with using this technology, there should be a moratorium or a series of bans on how and where it can be used,” said Kristen Thomasen, a law professor at the University of British Columbia.
Moreover, uneven regulation of new biometric technologies creates situations in which some citizens’ privacy rights are more protected than others.
“I think the fact that police forces are taking different approaches raises concerns about inequality across the country and how people are treated. It also highlights the continued importance of some sort of federal action,” she said.
Facial recognition systems are a type of biometric technology that uses AI to identify people by comparing images or videos of their faces (often captured by security cameras) with existing images in a database. The technology has been a controversial tool used by police.
In 2021, the Office of the Privacy Commissioner of Canada found that the RCMP had violated privacy laws by using the technology without informing the public. That same year, Toronto police acknowledged that some of its officers had used facial recognition software without informing its chief. In both cases, the technology was provided by U.S. company Clearview AI, whose database consisted of billions of images collected from the internet without the consent of the people whose images were used.
Last month, Ontario’s York and Peel police announced they had begun deploying facial recognition technology provided by French multinational company IDEMIA. York police Constable Kevin Nebrija said in an interview that the tool “will help speed up investigations and identify suspects sooner,” adding that when it comes to privacy, “nothing’s changed because we have security cameras all around.”
But in neighbouring Quebec, Montreal’s police chief, Fady Dagher, said his department would not deploy such biometric tools without discussing a range of issues, from human rights to privacy.
“There’s going to be a lot of discussion before we even consider introducing it,” Dagher said in a recent interview.
Nebrija stressed that police have consulted Ontario’s privacy commissioner on best practices, adding that any images police use are “lawfully obtained,” either with the cooperation of security camera owners or through a court order.
Court approval the ‘gold standard’: expert
While York police say its officers are seeking judicial authorization, Kate Robertson, a senior research fellow at the University of Toronto’s Citizen Lab, said Canadian police have a history of doing just the opposite.
Since it was revealed that Toronto police had used Clearview AI between 2019 and 2020, Robertson said she is “not yet aware of any police agency in Canada that has received prior approval from a judge to use facial recognition technology in an investigation.”
Getting the go-ahead from a court, usually in the form of a warrant, would be the “gold standard for privacy protection in criminal investigations,” Robertson said, ensuring that any use of facial recognition tools is properly balanced against freedom of expression, freedom of assembly and other Charter rights.
The federal government has no jurisdiction over provincial or municipal police forces, but it could amend the Criminal Code to include legal requirements for facial recognition software, just as it has amended laws to address voice recording technology that could be used for surveillance.
In 2022, the heads of Canada’s federal, provincial and territorial privacy commissioners urged lawmakers to establish a legal framework for the appropriate use of facial recognition technology, including empowering independent oversight bodies, banning mass surveillance and limiting how long images can be stored in databases.
Meanwhile, the federal Department of Innovation, Science and Economic Development said Canadian law could “potentially” restrict companies from collecting personal information under the Personal Information Protection and Electronic Documents Act (PIPEDA).
“For example, when police forces, including the RCMP, outsource activities that use personal information to private companies conducting commercial activities, those activities may be regulated by PIPEDA, including services related to facial recognition technology,” the department said.
Quebec provincial police also have a contract with IDEMIA, but have not disclosed how they are using the company’s technology.
“The automated facial matching system is not used to verify the identity of individuals. The tool is used in criminal investigations and is restricted to data sheets of individuals whose fingerprints have been taken under the Identification of Criminals Act,” police said in an emailed statement.
AI governance expert Ana Brandusescu said Ottawa and the country’s police forces have not heeded calls for better governance, transparency and accountability in the procurement of facial recognition technology.
“Law enforcement agencies are not listening to academics, to civil society experts, to people with lived experience, to people who have been directly affected,” she said.