Arab Canada News
Published: June 30, 2024
Some police agencies in Canada use facial recognition technology to help solve crimes, while other police forces say that concerns about human rights and privacy prevent them from using powerful digital tools.
This uneven adoption of the technology, and the loose rules governing its use, has led legal experts and AI specialists to call on the federal government to set national standards.
Kristin Tomsen, a law professor at the University of British Columbia, says: "Until we have a better handle on the risks involved in the use of this technology, there should be a moratorium or a range of bans on how and where it can be used."
Additionally, the patchwork of regulations surrounding emerging biometric technologies has created situations in which the privacy rights of some citizens are better protected than those of others.
She said: "I think the fact that different police services are taking different steps raises concerns about inequality in how people are treated across the country. But it also highlights the continued importance of some form of federal action."
Facial recognition systems are a form of biometric technology that uses artificial intelligence to identify individuals by comparing images or video of their faces - which are often captured by security cameras - with existing images of them in databases. This technology has been a controversial tool in the hands of the police.
In 2021, the Office of the Privacy Commissioner of Canada found that the RCMP violated privacy laws when it used the technology without public knowledge. In the same year, the Toronto Police admitted that some of its officers used facial recognition software without informing their superiors. In both cases, the technology was provided by the American company Clearview AI, whose database consists of billions of images scraped from the internet without the consent of those whose images were used.
Last month, York and Peel police in Ontario announced that they had begun implementing facial recognition technology provided by the French multinational Idemia. In an interview, York police officer Kevin Nbriga said the tools "help speed up investigations and identify suspects more quickly," adding that with regard to privacy, "nothing has changed because security cameras are everywhere."
However, in neighboring Quebec, Montreal Police Chief Fadi Dague says the force will not adopt these biometric identification tools without discussing issues ranging from human rights to privacy.
Dague said in a recent interview: "This will be something that will require a lot of discussion before we consider implementing it."
Meanwhile, Nbriga confirmed that the department consulted the Ontario Privacy Commissioner for best practices, adding that the images police obtain will be "lawfully obtained," either through cooperation with security camera owners or by obtaining court orders for the images.
Despite York police's insistence that officers will seek judicial authority, Kate Robertson, a senior researcher at the Citizen Lab at the University of Toronto, says that Canadian police forces have a history of doing exactly the opposite.
Since Toronto police's use of Clearview AI between 2019 and 2020 came to light, Robertson said, she remains "unaware of any police service in Canada obtaining prior judicial approval to use facial recognition technology in its investigations."
According to Robertson, getting the green light from a court, typically in the form of a court order, represents the "gold standard for privacy protection in criminal investigations." It ensures that any use of a facial recognition tool is appropriately balanced against the rights to freedom of expression and assembly and other rights outlined in the Charter.
Although the federal government does not have jurisdiction over provincial and municipal police forces, it can amend the Criminal Code to include legal requirements for facial recognition programs in the same way that it updated the law to address audio recording techniques that can be used for surveillance.
In 2022, Canada's federal and provincial privacy commissioners called on lawmakers to establish a legal framework for the appropriate use of facial recognition technology, including empowering independent oversight bodies, banning mass surveillance, and limiting how long images can be retained in databases.
Meanwhile, the federal economic development department said that companies' collection of personal information is "likely" regulated under Canadian law, namely the Personal Information Protection and Electronic Documents Act, or PIPEDA.
The department said, "For example, if a police force, including the RCMP, contracts for activities that use personal information with a private company engaged in business activities, those activities are likely to be regulated by PIPEDA, including services related to facial recognition technologies."
The Quebec Provincial Police also have a contract with Idemia, but they did not specify exactly how they use the company’s technology.
In an emailed statement, the force said its automated facial recognition system "is not used to verify individuals' identities. This tool is used in criminal investigations and is limited to the records of individuals who have been fingerprinted under the Identification of Criminals Act."
AI governance expert Ana Brandusescu says that Ottawa and police forces across the country have not responded to calls for improved governance, transparency, and accountability in the procurement of facial recognition technology.
She said: "Law enforcement is not listening to academics, civil society experts, people with lived experience, or those directly harmed."