Mobility Data and Canadian Privacy Law Explained

On November 22, 2022, Citizen Lab published an analysis and recommendations pertaining to the collection of de-identified mobility data and its use under the socially beneficial and legitimate interest exemptions in Canadian privacy law. In this explainer, we discuss the report and accompanying recommendations with Amanda Cutinha and Christopher Parsons, the report’s authors.

What are the key findings of this report?

In the report, we investigate the federal government’s collection of mobility data, assess its legality under the existing and proposed commercial privacy regimes, and propose recommendations for reforming draft Bill C-27 that would address many of the issues in the governance of mobility data.

The federal government obtained de-identified and aggregated mobility data from Telus and BlueDot beginning as early as March 2020, but this only came to the public’s attention in December 2021. The Standing Committee on Access to Information, Privacy and Ethics (ETHI) investigated this data collection and ultimately raised concerns about the federal government’s inadequate consultation with the Office of the Privacy Commissioner, the failure of the government to verify consent had been provided to collect or disclose the mobility information, the broad purposes for data collection, and the unclear timeline for the government’s retention of data.

When we assessed the lawfulness of the collection of mobility data, we found that BlueDot and Telus likely complied with the current private sector privacy legislation, the Personal Information Protection and Electronic Documents Act (PIPEDA). Specifically, the de-identified information likely did not constitute personal information within the meaning of PIPEDA. This, however, led us to spotlight deficiencies in current privacy legislation. These included:

  • inadequate governance of de-identified data
  • an absence of appropriate transparency and accountability principles
  • a failure to adequately account for harmful impacts of data sharing
  • a neglect of commitments to Indigenous data sovereignty principles
  • insufficient enforcement mechanisms

We found that these deficiencies remain in the Consumer Privacy Protection Act (CPPA). Most pertinently, the proposed legislation contains significant exceptions to knowledge and consent where the purposes of data sharing are deemed either socially beneficial or within a corporation’s legitimate interests. The result is that individuals’ mobility information may be collected or used without knowledge or consent in the service of legitimate business interests, or disclosed to third parties, including the federal government, for purposes deemed socially beneficial.

We make 19 corrective recommendations to the CPPA that would alleviate many of the thematic issues facing PIPEDA and, by extension, the CPPA. However, even were these amendments adopted, the legislation would still need to be significantly rethought to protect individual and collective privacy rights.

What are the key privacy issues with regards to collection of mobility data during the COVID-19 pandemic?

We outline a number of privacy issues surrounding the collection of data during the COVID-19 pandemic.

First, there has been a lack of transparency concerning the collection, use, or disclosure of de-identified mobility data between private sector organizations and the federal government. Though the pandemic required timely and urgent responses, communications from the government were often muddled and did not clearly address whether the government was collecting mobility data. This lack of transparency can fuel distrust amongst members of the Canadian public who already doubt that the federal government respects their privacy rights.

Second, the federal privacy commissioner was not adequately involved in assessing the government’s collection of mobility information. In the case of the disclosure between Telus, BlueDot, and the federal government, the Privacy Commissioner was not engaged. Consequently, the Commissioner could not review the privacy practices linked to the activity in order to confirm the adequacy of de-identification or to ensure consent was obtained where necessary under law.

Third, while the government asserted it established requirements to protect Canadians’ privacy when entering into contracts with Telus and BlueDot, these requirements were not made public or discussed in greater detail.

Fourth, the stated purposes for the collection of data were very broad. They would allow for, in theory, the provision of information or policy advice to relevant provincial and municipal governments to target enforcement actions towards communities with higher-than-average mobility scores. This could have led to enforcement activities being applied to racialized neighborhoods where residents more regularly traveled significant distances for work. The prospect of disproportionate enforcement actions raises equity concerns.

Fifth, the absence of transparency was not limited to the purposes for data collection but continued through to retention timelines. The collection of data was to continue until the end of the pandemic, raising questions as to who decides when the pandemic is ‘over’.

Overall, these issues highlight deficiencies in the existing framework governing private-public data sharing: an absence of governance for de-identified data; a lack of transparency requirements in the sharing of data; inadequate protections to prevent function creep and long retention timelines; and the absence of requirements to consider the equity implications of information sharing with government agencies.

What are the potential negative consequences of collecting and sharing COVID-19 pandemic mobility data with the intention of being ‘socially beneficial’?

Individual privacy rights are at risk when data sharing can occur for socially beneficial purposes while the individuals whose data is shared are neither aware of the sharing nor have consented to it. Socially beneficial purposes can mean different things to differently-situated people: what may be perceived as socially beneficial to one group may not be to another.

To give one example, consider the context of abortion-care services. One government might analyze de-identified data to assess how far people must travel to obtain abortion-care services and, subsequently, recognize that more services are required. Other governments could use the same de-identified mobility data and come to the opposite conclusion and selectively adopt policies to impair access to such services.

Moreover, the sharing of data for socially beneficial purposes without knowledge or consent may be interpreted as inherently paternalistic. Though the federal government is tasked with making policy that benefits the lives of its citizens, sharing data without knowledge and consent can undermine the data sovereignty of individuals in society. This problem is further pronounced for Indigenous peoples, whose sovereignty has been historically undermined.

Is current privacy legislation adequate in protecting individuals’ privacy interests?

No. We argue that current commercial privacy legislation fails to adequately protect individuals’ privacy interests for the following reasons:

  1. PIPEDA fails to adequately protect the privacy interests at stake with de-identified and aggregated data despite risks that are associated with re-identification.
  2. PIPEDA lacks requirements that individuals be informed of how their data is de-identified or used for secondary purposes.
  3. PIPEDA does not enable individuals or communities to substantively prevent harmful impacts of data sharing with the government.
  4. PIPEDA lacks sufficient checks and balances to ensure that meaningful consent is obtained to collect, use, or disclose de-identified data.
  5. PIPEDA does not account for Indigenous data sovereignty, nor for the Indigenous sovereignty principles in the United Nations Declaration on the Rights of Indigenous Peoples, which Canada has adopted.
  6. PIPEDA generally lacks sufficient enforcement mechanisms.

Why does the collection of de-identified data matter?

De-identified data runs the risk of being re-identified, especially with the rapid evolution of machine learning technologies, the breadth of publicly and commercially available datasets, and regularly evolving statistical methods for analyzing data. Where sensitive information is de-identified, and thus not subject to the same privacy protections as identifiable personal information, the consequences of re-identification are magnified.
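To illustrate the linkage risk described above, here is a minimal hypothetical sketch (not the report’s methodology, and the data is invented): stripping names from mobility traces does not prevent re-identification, because a handful of place–time points can act as a fingerprint when matched against an auxiliary dataset that carries identities.

```python
# "De-identified" mobility traces: pseudonym -> set of (cell_tower, hour) points.
# All names, towers, and hours below are fabricated for illustration.
deidentified = {
    "u1": {("tower_A", 8), ("tower_B", 12), ("tower_A", 18)},
    "u2": {("tower_A", 8), ("tower_C", 12), ("tower_D", 18)},
    "u3": {("tower_E", 9), ("tower_B", 12), ("tower_E", 17)},
}

# Auxiliary knowledge an attacker might hold (e.g. public check-ins):
# a named person observed at two specific places and times.
auxiliary = {"Alice": {("tower_A", 8), ("tower_C", 12)}}

def reidentify(aux_points, traces):
    """Return the pseudonyms whose traces contain every auxiliary point."""
    return [pid for pid, points in traces.items() if aux_points <= points]

matches = reidentify(auxiliary["Alice"], deidentified)
# Two observed points single out one pseudonym, linking "Alice" to "u2".
```

The sketch assumes exact matching for simplicity; real linkage attacks tolerate noise and still succeed with surprisingly few points, which is why de-identification alone is a weak safeguard.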

Would implementing your recommendations solve issues with privacy law?

We wrote this report, in part, to provide practical solutions to gaps in draft privacy legislation. Our recommendations were drafted in light of this practical aim.

However, as we ultimately conclude, our recommendations are not a panacea: even if all of the changes were implemented, they would not ameliorate all of the issues with the CPPA. Adequately protecting individual privacy rights would require grounding the legislation in a human rights-centric approach to privacy protection.

Which recommendations are the most important?

In drafting recommendations, we sought to ameliorate existing deficiencies in current privacy law. The recommendations of greatest concern relate to the exemptions to knowledge and consent for “socially beneficial” purposes and the “legitimate interests” of organizations.

The sharing of de-identified mobility data between the private and public sector would be authorized under the socially beneficial purpose exemption to knowledge and consent under the draft CPPA. While socially beneficial activities can have positive characteristics, determining what constitutes a beneficial activity can be political. There is a risk that what is socially beneficial for some is not for others. The failure to narrow this exception may allow for information sharing that disproportionately intrudes on the privacy or equity rights of some individuals. We offer numerous recommendations intended to reduce the risks associated with potentially socially beneficial uses of data while, at the same time, not asserting that such sharing should be barred in its entirety.

While the socially beneficial purposes clause opens the door to sharing de-identified information with third parties, such as government agencies, the legitimate interest exception enables private organizations to determine for themselves whether the benefits of collecting or using personal information outweigh the adverse effects of doing so. While the information cannot be used to influence an individual’s behavior or decisions, it could be used to create datasets that facilitate business or policy developments. The Privacy Commissioner could investigate organizations that rely on the exception, but would first need to know that they were collecting or using information under it; only then could the Commissioner request an organization’s records. The effect is that unless the Privacy Commissioner is zealously engaged in asking private organizations whether they are collecting or using personal information under the legitimate interest exception, private organizations will principally act as judge and jury of whether their collection qualifies. We argue that organizations should be required to be up front with the Commissioner about their use of this exception, while also better empowering individuals to control how private organizations collect and use their personal information.