On April 22, the Citizen Lab published recommendations for Bill C-11, proposed updates to Canadian federal commercial privacy legislation. In this explainer, we discuss those recommendations with Christopher Parsons, the post’s lead author.
What is Bill C-11 and what does it cover?
Bill C-11, the Digital Charter Implementation Act, 2020, and in particular the Consumer Privacy Protection Act (CPPA) contained within it, would significantly reshape Canada’s federal commercial privacy requirements. The legislation has been designed to advance consumer interests rather than being grounded in human rights principles, and it has been broadly critiqued by experts on the basis that it would not adequately address issues of meaningful consent, de-identification, or data mobility.
Beyond those critiques, we argue that the legislation would also fail to make organizations behave more transparently. Nor would it meaningfully enhance the current limited rules that empower individuals to access and correct the personal information organizations hold about them. The proposed legislation also fails to ensure that whistleblowers who come to the Privacy Commissioner would be adequately protected from retribution.
All this having been said, if passed as drafted, C-11 would better empower the Privacy Commissioner and would create a tribunal responsible for enforcing the Commissioner’s decisions. The Tribunal could impose monetary penalties where appropriate.
When people think of a bill concerning consumer privacy protection, they likely think of how commercial entities fit in, not human rights. How is this bill a natural entry point for discussing human rights?
The Privacy Commissioner of Canada has stated that, “[p]rivacy has long been recognized as a fundamental right – most recently in the United Nations Secretary General’s blueprint for human rights.” When we’re talking about how our personal information is collected, processed, and disclosed, we’re really talking about how other parties can affect or modify our life situations.
Collected data can potentially be used to affect the kinds of job ads you receive, used by landlords to determine if they want to rent you an apartment, or used to present services to you at different prices based on the lifestyle information that organizations have associated with you. In each of these cases, your privacy rights are directly engaged as they pertain to broader human rights.
By narrowing the legislation to focus on consumer rights, the Government of Canada hasn’t just missed an opportunity to discuss human rights. It has failed to recognize that the legislation it is advancing engages, first and foremost, with activities that are protected by basic human rights.
What are the most pressing challenges with the current bill?
Other Canadian experts have raised warnings that, as drafted, Bill C-11 would not adequately address issues of meaningful consent, de-identification, or data mobility.
We agree with these critiques, and separately argue that organizations will be insufficiently compelled to explain how they collect, process, and disclose personal information unless the organizational transparency rules are significantly enhanced. Similarly, as the legislation is drafted, Canadians and residents of Canada will be largely unable to obtain copies of the information organizations hold about them, or to receive full explanations of how private organizations have used that information. Perhaps worse still, whistleblowers may be inadequately protected, with the result that employees who believe something improper is occurring may be disincentivized from bringing their concerns to the Office of the Privacy Commissioner of Canada.
What recommendations do you make for increased organizational transparency?
We’ve proposed a few fixes. First, organizations should be required to be specific about how they collect, process, and disclose personal information and not be allowed to give generic explanations of how such information is used. Our research has shown, time and again, that organizations do not clearly explain how they collect, process, or disclose personal information under the current regime; the proposed legislation will do little to nothing to change the current abysmal state of affairs.
Second, organizations should be required to publish annual transparency reports as well as their law enforcement guidelines, and the Privacy Commissioner should be empowered to specify the templates that should be used and to compel classes of organizations to produce these kinds of reports. The goal here is twofold. In publishing transparency reports, the aim is to ensure that organizations publicly communicate how often they are requested, or compelled, by government agencies to disclose information that the organizations collect or process. In publishing law enforcement guidelines, the goal is to explain how organizations actually receive and process requests from government agencies, principally to assess the appropriateness of how organizations are interacting with government agencies and to correct situations where organizations’ guidelines run counter to Canadian law.
Why should organizations publish who has access to personal information?
You can’t know how your information is being used if you don’t know who is using it or who has access to it. For this reason, organizations should be required to publish whom they disclose information to, or receive it from, and the specific reasons for the disclosure or reception, as well as the subsequent use of the personal information. Individuals should have the right to decide that they do not want a given organization, or set of organizations, to obtain their information or to process or disclose it, and they can only make those preferences known if organizations publish details about their information handling practices in the first place.
Why is it important that the legislation’s rules around algorithmic predictions be reformed?
As drafted, Bill C-11 doesn’t provide individuals with a right to object to how an automated decision system is used, nor require that a member of the organization employing the automated decision system be involved in assessing automated decisions that may have legal consequences for the affected person. Nor does C-11 require that such decisions ultimately be reviewed by a member of the organization employing the automated system. We think the government should significantly redraft these parts of the legislation to better empower individuals and hold organizations accountable for their use of algorithmic decision making systems.
Changes should include, at a minimum, that individuals can opt-out of an automated decision system being applied to them. In any situation where an organization can’t comply with this, they should (at a minimum) be required to notify the Privacy Commissioner of this failure and the rationale behind it. Further, whenever an automated decision system could produce legal effects concerning the individual, or significantly affect them, then any decisions made by the organization should not be made based solely on an automated decision system. And, finally, whenever an individual disputes the accuracy or result of a decision reached using an automated decision system, they should be able to have the decision reviewed by a member of the organization who possesses knowledge of how the automated decision system operates.
In aggregate, these recommendations would ensure that automated decision systems, or algorithms, aren’t given free rein to potentially discriminate against Canadians and residents of Canada. The changes would empower individuals to have automated decisions re-assessed by humans who have expertise, which would mean that organizations couldn’t rely on tools they don’t understand, and would ensure that no algorithm could be spun up that would unilaterally have legal implications for an individual.
How would your recommendations empower whistleblowers?
The legislation does include protections for whistleblowers, but those protections only activate once the Privacy Commissioner agrees they should apply. Our concern is that, until the Commissioner makes that decision, there is a risk that a whistleblower’s identity might be revealed without protection and, as a result, some people won’t blow the whistle. Our suggested fixes are pretty minor and straightforward: we think that the Commissioner should be required to keep the whistleblower’s identity confidential where the Commissioner assesses that the individual has provided information that they believe in good faith is in the public interest to disclose to the Commissioner.
Alternatively, the legislation might be amended to provide assurance to whistleblowers by limiting the Privacy Commissioner’s discretion to alert organizations that the whistle has been blown about them. In this case, the Commissioner could be barred from disclosing the identity of the person who is blowing the whistle unless the affected organization can establish reasonable grounds to believe that the individual is making fraudulent assertions.