Arab Canada News
Published: August 17, 2024
The Canada Border Services Agency plans to implement an application that uses facial recognition technology to track individuals who have been ordered deported from the country.
The mobile reporting application will use biometric data to confirm the person's identity and log their location data when they use the application to check in. Documents obtained through access to information indicate that the Canada Border Services Agency proposed such an application as early as 2021.
A spokesperson for the agency confirmed that an application called ReportIn will be launched this fall. The agency later stated that the application could also be used by permanent residents and foreign citizens who are undergoing inadmissibility hearings.
Experts have raised numerous concerns, questioning whether users can meaningfully consent and pointing to the secrecy surrounding how the technology makes its decisions.
Each year, about 2,000 individuals ordered to leave the country fail to appear, meaning that the Canada Border Services Agency "has to spend significant resources investigating and locating these clients and, in some cases, detaining them," according to a 2021 document.
The agency proposed the smartphone application as an "ideal solution."
It stated that receiving regular updates through the application about the "person's residence, employment, and family status, among other things, will allow the Canada Border Services Agency to obtain relevant information that can be used to contact and monitor the client for any early indications of non-compliance."
"Moreover, given the automated operation, the client is likely to feel engaged and will recognize the level of visibility the border services have in their case."
Furthermore, the document noted that "if the client fails to appear for deportation, the information collected through the application will provide good investigative leads to locate the client."
The algorithmic impact assessment of the project, not yet published on the federal government's website, noted that the biometric voice-reporting technology the CBSA previously used is being phased out due to "failed technology," and that the ReportIn application was developed to replace it.
It stated that "the person's biometric facial data and location, provided by sensors and/or the GPS in the mobile device/smartphone," are recorded through the ReportIn application and then sent to the CBSA's backend system.
Once individuals submit their images, the "facial comparison algorithm" will generate a similarity score with a reference image.
If the system does not confirm the facial match, it triggers a process for officers to investigate the case.
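The documents describe this flow only at a high level. As an illustrative sketch, the logic they outline, a similarity score compared against a reference image, with unconfirmed matches routed to an officer, might look like the following. The function name and the threshold value are assumptions for illustration; they are not taken from the CBSA documents.

```python
# Hypothetical sketch of the check-in flow described in the documents:
# a facial-comparison algorithm produces a similarity score against a
# reference image; scores below a threshold trigger officer review.
# The threshold and names here are illustrative assumptions.

MATCH_THRESHOLD = 0.99  # assumed cutoff, not stated in the CBSA documents

def route_checkin(similarity_score: float) -> str:
    """Return how a check-in is handled based on the face-match score."""
    if similarity_score >= MATCH_THRESHOLD:
        return "confirmed"       # identity verified automatically
    return "officer_review"      # unconfirmed match is escalated to an officer

# Example: a near-perfect match is confirmed; a weak one is escalated.
print(route_checkin(0.999))  # confirmed
print(route_checkin(0.62))   # officer_review
```

The key design point reported in the article is that a sub-threshold score does not reject the person outright; it only opens a process in which officers investigate the case.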
The document states, "The location of individuals is also collected each time they report in, and if the individual fails to comply with their conditions." The document indicated that individuals would not be "continuously tracked."
The application uses technology from Amazon Web Services. This raised concerns for Brenda McPhail, director of executive education in McMaster University's public policy in digital society program.
She noted that while many facial recognition companies offer their algorithms for testing to the National Institute of Standards and Technology, Amazon has never voluntarily done so.
A spokesperson for Amazon Web Services stated that Amazon Rekognition "has been extensively tested - including by third parties such as Credo AI, a company specializing in responsible AI, and iBeta Quality Assurance."
The spokesperson added that Amazon Rekognition is a "widely available cloud-based system and therefore cannot be downloaded, as described in the NIST participation guidelines."
The spokesperson explained, "That is why our Rekognition Face Liveness was submitted instead to iBeta Lab," which is certified by the institute as an independent testing lab, "for testing against industry standards."
The CBSA document states that the algorithm used will be a trade secret. In a situation that could have life-altering consequences, McPhail asked whether "it is appropriate to use a tool protected by trade secrets or proprietary information that denies people the right to understand how decisions about them are truly made."
Kristen Thomasen, assistant professor and chair in law, robotics, and society at the University of Windsor, indicated that the reference to trade secrets suggests there may be legal barriers to disclosing information about the system.
She explained that there has been concern for years about individuals being subject to errors in systems and legally barred from obtaining further information due to intellectual property protection.
CBSA spokesperson Maria Ladouceur explained that the agency "developed this smartphone application to allow foreign citizens and permanent residents subject to immigration enforcement conditions to report in without having to appear in person at a CBSA office."
She stated that the agency "worked closely in consultation" with the Office of the Privacy Commissioner regarding the application. "Registration in ReportIn will be voluntary, and users will need to consent to use the application and use their images for identity verification."
Petra Molnar, Deputy Director of the Refugee Law Lab at York University, noted that there is a power imbalance between the agency implementing the application and the individuals on the receiving end.
"Can anyone truly give consent in this situation where there is a significant power differential?"
Ladouceur stated that if an individual does not wish to participate, they can report in person as an alternative.
Thomasen also warned about the risk of errors in facial recognition technology, noting that the risk is higher for individuals with darker skin.
Molnar remarked that "it is very concerning that there is no discussion about human rights impacts in the documents."
A CBSA spokesperson stated that Credo AI reviewed the program for bias against demographic groups, finding a facial match rate of 99.9 percent across six different demographic groups, adding that the application "will be continuously tested after launch to assess accuracy and performance."
The final decision will be made by a human, with officers overseeing all requests, but experts have noted that humans tend to trust judgments made by technology.
Thomasen stated that there is a "fairly well-recognized psychological tendency ... for people to defer to the expertise of the computer system," because computer systems are perceived as less biased or more accurate.