Key findings, concerns, and recommendations from the House of Commons Standing Committee on Access to Information, Privacy and Ethics on the use and impact of facial recognition technology.
Citizen Lab fellow Cynthia Khoo appeared before the House of Commons Standing Committee on Access to Information, Privacy and Ethics (ETHI) as a witness in the Committee’s study on the use and impact of facial recognition technology. She was invited to provide testimony on the potential harms and human rights implications of facial recognition, including recommendations for how the Government of Canada should regulate such technology.
Algorithmic policing technologies, including facial recognition, have arrived or are coming to Canadian cities and provinces, and they are doing so quickly. We have identified a number of significant policy, practice, and legal deficits related to the use of algorithmic policing technologies in Canada, including imminent or foreseeable impacts on human rights and fundamental freedoms, among them the rights to privacy, liberty, and equality, as well as expressive and associational freedoms.
Citizen Lab researchers reviewed the consultation materials, including the “Technical Paper” and the “Discussion Guide” associated with the government’s proposal to address what it has referred to as “online harms.” We provide the following comments in response to that consultation process.
To contribute to the IPC’s deliberations in the triaging of its strategic priorities, this submission provides particularized input with respect to the IPC’s public interest mandate to oversee law enforcement authorities’ use of algorithmic policing technology in Ontario.
A collection of records and letters from freedom of information requests submitted to various federal and provincial departments, and municipal police services in Canada.
This report examines algorithmic technologies that are designed for use in criminal law enforcement systems, including a human rights and constitutional law analysis of the potential use of algorithmic policing technologies.
This report provides an in-depth legal and policy analysis of technology-facilitated intimate partner surveillance (IPS) under Canadian law. Stalkerware apps are designed to facilitate remote surveillance of an individual’s mobile device use, with the surveillance often carried out covertly or advertised as such. Despite increasing recognition of the prevalence of technology-enabled intimate partner abuse and harassment, the legality of the creation, sale, and use of consumer-level spyware apps has not yet been closely considered by Canadian courts, legislators, or regulators.
This report was collaboratively written by researchers from computer science, political science, criminology, law, and journalism studies. Reflecting their expertise, the report is divided into several parts, each focusing on specific aspects of the consumer spyware ecosystem: technical elements associated with stalkerware applications, stalkerware companies’ marketing activities and public policies, and these companies’ compliance with Canadian federal commercial privacy legislation.
However, the NEB’s failure to address any of the questions in the Citizen Lab’s letter is unfortunate, as making such information available would be in the public interest even if the NEB has decided not to move forward with its initial request for information.