
Planet Netsweeper: Executive Summary

This is part one of a four-part report on the global proliferation of Netsweeper

This report describes our investigation into the global proliferation of Internet filtering systems manufactured by the Canadian company Netsweeper, Inc. After mapping installations in countries worldwide, we focus on ten country cases in which we verify that Netsweeper systems are being used to censor the Internet for subscribers of consumer Internet Service Providers, and in which human rights and corporate social responsibility questions are acute.

Key Findings

  • Using a combination of publicly available IP scanning, network measurement data, and other technical tests, we identified Netsweeper installations designed to filter Internet content operational on networks in 30 countries
  • We then used other data points associated with these installations, including in-country measurements, to narrow our list to those installations that appear to be filtering content for national-level, consumer-facing ISPs in ten countries of interest: Afghanistan, Bahrain, India, Kuwait, Pakistan, Qatar, Somalia, Sudan, UAE, and Yemen
  • We found that Netsweeper technology is being used to block access in these ten countries to a wide range of digital content protected by international legal frameworks, including religious content in Bahrain, political campaigns in the United Arab Emirates, and media websites in Yemen
  • We identified a pattern of miscategorization and/or overblocking involving the use of Netsweeper’s systems that may have serious human rights implications, including the blocking of Google searches for keywords related to LGBTQ identities and the blocking of non-pornographic websites in various countries on the basis of an apparent miscategorization of these sites as ‘Pornography’
  • We raise issues with the nature of the categories delimited by Netsweeper for the purpose of filtering, including the existence of an ‘Alternative Lifestyles’ category, which appears to have as one of its principal purposes the blocking of non-pornographic LGBTQ content, including that offered by civil rights and advocacy organizations, HIV/AIDS prevention organizations, and LGBTQ media and cultural groups. We also note that Netsweeper can be configured to block access to websites from entire specified countries
  • The international deployment of this Canadian-made filtering technology raises a number of human rights, corporate social responsibility, and public policy concerns and questions. These questions include whether and to what degree Netsweeper undertakes due diligence with respect to sales of its technology to jurisdictions with problematic rights records, and whether the Canadian government should be assisting Netsweeper, financially or otherwise, when its systems are used in a manner that negatively impacts internationally-recognized human rights

Part One: Summary

Internet filtering technologies play a critical role in shaping access to information online. Whether we are connecting to the Internet from our homes, coffee shops, libraries, or places of work, software that inspects, manages, and/or blocks our communications has become commonplace. When used at the level of large, consumer-facing Internet Service Providers (ISPs), Internet filtering technologies can have significant human rights impacts. A growing number of governments employ Internet filtering systems at this scale in order to undertake national-level censorship of the Internet. Filtered content ranges from pornography, hate speech, and speech promoting or inciting violence, to political opposition websites, news websites, websites affiliated with various religions, and everything in between.

The growing responsibility of network operators to filter content, either within private enterprises or on public networks, has given rise to a large and lucrative market. One industry report estimated the value of the web content filtering market at $3.8 billion USD by 2022. While network operators can manually configure their infrastructure to block specific websites or applications, the task can be time-consuming, complicated, and ineffective. Internet filtering companies provide professional services to ISPs and other clients to take care of this responsibility. Typically, Internet filtering companies dynamically categorize Internet resources and then let their clients choose pre-selected content categories or services that they wish to block. Customers can also add their own custom lists of sites to the content that is filtered or blocked. In the hands of authoritarian regimes, such professional services can limit the ability of citizens to communicate freely and help impose opaque and unaccountable controls on the public sphere.

This report presents our latest research into the Internet filtering company Netsweeper, Inc. Netsweeper is a privately-owned technology company based in Waterloo, Ontario, Canada. The company has branch offices in India, the Netherlands, the United Arab Emirates, and the United Kingdom, and distributors in Australia, the Middle East, South America, and the United States. As part of our ongoing research into Internet censorship practices and the filtering technologies that support them, Citizen Lab has issued several prior reports on Netsweeper, in which we identified installations on public networks in Bahrain, Pakistan, Qatar, Somalia, the United Arab Emirates, and Yemen. Citizen Lab has developed a distinct fingerprint for Netsweeper installations over the course of this research, allowing us to identify such installations with high confidence. Additionally, Netsweeper is of particular research interest given that it is a Canadian company, encouraged by the Canadian government and society to “reflect Canadian values” in its operations.

For this report, we used network measurement methods to scan the entire Internet and map Netsweeper installations. We identified 30 countries in which Netsweeper installations were present and, of those, focused on ten countries that raise systemic human rights concerns: Afghanistan, Bahrain, India, Kuwait, Pakistan, Qatar, Somalia, Sudan, UAE, and Yemen. (Our full data set can be accessed here.)

Several objectives guided our research. First, we wanted to develop and refine network measurement methods that allow us to verify Internet filtering service installations, such as those sold by Netsweeper. Citizen Lab has used these methods for many years as part of our research into Internet censorship and surveillance, and there is a growing scholarly community employing these research methods. One contribution we make in this report is to show how data collected from outside (i.e., through remote scans and publicly available datasets) and inside a country (i.e., principally through tests that make use of the OONI probe system) can be combined to verify Netsweeper installations and their behaviors. Our search for Netsweeper installations included scanning every one of the billions of IP (Internet Protocol) addresses on the Internet to identify responses from those addresses that match a signature we developed for Netsweeper.
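
To make the approach concrete, the sketch below shows the general shape of such a scan and signature check for a single address: send a simple HTTP request and look for distinctive markers in the response. The marker string and test address here are placeholders rather than the actual Netsweeper fingerprint, which we do not reproduce, and real Internet-wide scans rely on purpose-built scanning tools and datasets (such as those provided by Censys and Shodan) rather than a loop like this.

    # Illustrative sketch only: probe a candidate IP address and check whether
    # the HTTP response matches a (placeholder) device signature. The real
    # Netsweeper fingerprint used in our research is not reproduced here.
    import socket

    SIGNATURE_MARKERS = [b"example-filter-banner"]  # hypothetical marker, not the real fingerprint

    def probe(ip, port=80, timeout=5):
        """Send a minimal HTTP request to ip:port and return the raw response bytes."""
        request = (
            "GET / HTTP/1.1\r\n"
            "Host: example.com\r\n"
            "Connection: close\r\n\r\n"
        ).encode()
        with socket.create_connection((ip, port), timeout=timeout) as sock:
            sock.sendall(request)
            chunks = []
            while True:
                data = sock.recv(4096)
                if not data:
                    break
                chunks.append(data)
        return b"".join(chunks)

    def matches_signature(response):
        """Return True if any placeholder marker appears in the response."""
        return any(marker in response for marker in SIGNATURE_MARKERS)

    if __name__ == "__main__":
        candidate = "192.0.2.1"  # TEST-NET placeholder address
        try:
            print(matches_signature(probe(candidate)))
        except OSError as exc:
            print(f"no response from {candidate}: {exc}")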

Second, we wanted to raise awareness about Internet censorship practices, and the technologies that support them, so that negative human rights impacts can be identified and mitigated. Generally speaking, corporate social responsibility (CSR) practices among companies in the digital security space are immature, and Netsweeper in particular has published or communicated very little to suggest the company has implemented CSR measures. Yet business enterprises like Netsweeper have responsibilities under international human rights law to respect human rights. Such responsibilities include undertaking due diligence to identify, prevent, and mitigate any impacts their operations have on human rights; being publicly transparent about those measures; and providing for remedial action if negative impacts are identified. Netsweeper has provided little information about any such measures, systems, or policies. Meanwhile, our research has verified that Netsweeper installations are used in several countries to implement Internet censorship in ways that undermine internationally-recognized human rights.

The Government of Canada also has important obligations under international human rights law to protect human rights and require Canadian businesses to engage in due diligence to avoid causing or contributing to negative human rights impacts. The Government also has a duty to provide effective remedies for human rights victims. Canada has taken a strong public stance in support of human rights in the digital environment, yet at the same time Canadian government entities have assisted Netsweeper in developing its international trade presence and export sales. Such assistance has occurred despite the human rights implications of Netsweeper’s business activities abroad. We offer concrete recommendations to the Canadian Government on how to better meet its obligations around these issues.

The major sections of the report are as follows:

Section 1: Methodology & Technical Findings

Section 1 details the research questions that informed our study, our network measurement methodology, and our technical findings.

Section 2: Country Case Studies

Section 2 focuses on ten countries with problematic human rights records and/or particular security or public accountability challenges in which we identified Netsweeper installations on large public-facing ISPs.

Section 3: Discussion & Conclusions

Section 3 examines some of the legal, regulatory, corporate social responsibility, and other public policy issues raised by our principal findings. We focus on the responsibilities of Netsweeper and the obligations of the Canadian government to protect human rights, and then suggest measures that stakeholders could take to mitigate the negative human rights impacts associated with Internet filtering technology.

Part Two: Background

How does Internet filtering work? What are middleboxes?

A network administrator tasked with restricting access to Internet resources has many different options available, each with its own strengths and weaknesses. One of the simplest ways to block access to a website is to tamper with domain name system (DNS) resolution so that queries for the site’s name return an IP address that serves no content, or one that returns a “blockpage” (e.g., a page saying “this website is blocked”). Users can often circumvent this technique by changing the DNS settings on their devices to use a different resolver.
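
As a rough illustration of how such DNS tampering can be observed, the sketch below (using the dnspython library) compares the answers returned by the network’s default resolver with those from a public resolver. The domain and resolver used here are arbitrary examples, and differing answers can also arise for benign reasons such as content delivery networks, so a mismatch is only a starting point for further testing.

    # Minimal sketch (requires dnspython >= 2.0): compare A records for a domain
    # as returned by the system's default resolver and by a public resolver.
    # A single odd address from the default resolver can indicate DNS-based
    # blocking, though CDNs and geo-DNS also cause benign differences.
    import dns.resolver

    def lookup(domain, nameserver=None):
        resolver = dns.resolver.Resolver()
        if nameserver:
            resolver.nameservers = [nameserver]
        try:
            return sorted(rr.to_text() for rr in resolver.resolve(domain, "A"))
        except Exception as exc:  # NXDOMAIN, timeout, etc.
            return [f"error: {exc}"]

    if __name__ == "__main__":
        domain = "example.com"
        local = lookup(domain)              # the network's default resolver
        public = lookup(domain, "8.8.8.8")  # a public resolver as a reference point
        print("default resolver:", local)
        print("public resolver: ", public)
        if local != public:
            print("answers differ; possible DNS tampering (or benign CDN variation)")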

Another approach an administrator can use to filter access to Internet resources is to block the IP address of a website, for example by using a null route. This technique is imperfect because the site may share its IP address with many other (unrelated) websites, so blocking that address can have the unintended consequence of blocking those sites as well. Furthermore, a website blocked by this technique can circumvent the block by changing its IP address, or by moving behind a content delivery service such as Cloudflare, whose IP addresses are difficult for governments to block because they also serve content for many other widely used websites.
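
A correspondingly simple check for IP-level blocking, sketched below under the assumption that the correct address is already known from a trusted source, is to attempt a direct TCP connection: repeated timeouts to an address that is reachable from other vantage points are consistent with a null route, although ordinary outages produce the same symptom. The address in the example is a placeholder.

    # Illustrative sketch: once the correct IP address for a site is known
    # (e.g., from a trusted resolver), attempt a direct TCP connection.
    # Consistent failures to an address that works from other vantage points
    # can suggest IP-level blocking, such as a null route.
    import socket

    def tcp_reachable(ip, port=443, timeout=5):
        try:
            with socket.create_connection((ip, port), timeout=timeout):
                return True
        except OSError:
            return False

    if __name__ == "__main__":
        print(tcp_reachable("192.0.2.1"))  # TEST-NET placeholder address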

DNS blocking and IP address blocking can typically be conducted without adding additional hardware or software to a network, and both are relatively easy to circumvent. More sophisticated techniques are available if an administrator purchases and installs a middlebox on their network. A middlebox is a specialty network device, appliance, or software that inspects network traffic and performs some action upon traffic that matches certain characteristics, such as throttling, dropping, or redirecting data traffic being sent to, or received from, sources that are being filtered or censored.

A middlebox is normally installed in between ISP subscribers and the outside Internet. A middlebox may employ a deep packet inspection (DPI) technique to attempt to classify traffic belonging to certain encrypted apps or features (e.g., virtual private networks [VPNs] or voice-over-Internet-protocol [VoIP] applications) by examining various properties of packet flows. Thus, DPI techniques can be used to block services like WhatsApp voice calling while allowing unrestricted access to WhatsApp text messages. Many companies sell DPI-enabled middleboxes for a variety of “network management” purposes, including website caching, blocking viruses and spam, and enforcing usage quotas. A middlebox might also be purpose-built, as Netsweeper’s product is, to filter web traffic to designated URLs.
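
Purely as a conceptual illustration, and not as a description of Netsweeper’s implementation, the sketch below shows the kind of decision logic a URL-filtering middlebox might apply to an intercepted, unencrypted HTTP request: extract the host and path, compare the resulting URL against a blocklist, and either let the traffic pass or answer with a blockpage. The domain names and blockpage text are hypothetical, and real products operate inline on packet flows with far more protocol handling.

    # Conceptual sketch only (not Netsweeper's implementation): the decision
    # logic a URL-filtering middlebox might apply to a plaintext HTTP request.

    BLOCKED_URL_PREFIXES = {"blocked.example/news", "blocked.example/forum"}  # hypothetical list

    BLOCKPAGE = (
        "HTTP/1.1 403 Forbidden\r\nContent-Type: text/html\r\n\r\n"
        "<html><body>This website is blocked.</body></html>"
    )

    def decide(raw_request: bytes) -> str:
        """Return 'pass' to forward the request unchanged, or a blockpage response."""
        head = raw_request.decode("latin-1", errors="replace")
        request_line, *header_lines = head.split("\r\n")
        parts = request_line.split(" ")
        path = parts[1] if len(parts) > 1 else "/"
        host = ""
        for line in header_lines:
            if line.lower().startswith("host:"):
                host = line.split(":", 1)[1].strip()
                break
        url = f"{host}{path}"
        if any(url.startswith(prefix) for prefix in BLOCKED_URL_PREFIXES):
            return BLOCKPAGE
        return "pass"

    if __name__ == "__main__":
        req = b"GET /news HTTP/1.1\r\nHost: blocked.example\r\n\r\n"
        print(decide(req)[:40])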

Circumventing middlebox-based blocking can sometimes be challenging. In theory, a VPN or other circumvention application can bypass middlebox censorship, although DPI middleboxes can detect and block many such applications. In past reports, Citizen Lab has investigated the role played by DPI middlebox products from two other companies, Blue Coat and Sandvine, in censorship and surveillance.

About Netsweeper, Inc.

Netsweeper, Inc. develops an Internet content filtering product, also called Netsweeper, which is used by telecommunications companies, educational institutions, and governments. The company’s promotional material describes the product as a means of protecting against malicious or inappropriate content, meeting compliance and regulatory requirements, and protecting sensitive information.

How Netsweeper’s Internet filtering systems work

Netsweeper differentiates its product from other filtering tools on the market based on its “real-time web content categorization” technology. Given the highly dynamic nature of the Internet, manually maintaining lists of categorized web content is impractical. The company instead uses automated scanning and categorization techniques to maintain a large database of websites, each assigned to a category based on its contents. A network administrator need only select a given content category, such as ‘Gambling’ or ‘Hate Speech’, and all content assigned to that category will be blocked. Creating this database of websites and keeping its categorization current is a substantial undertaking: the company claims it has categorized over 10 billion URLs and that it categorizes 22 million new URLs each day.
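
The category-selection model described above can be illustrated with a short sketch (below). The domain-to-category mappings and custom list are hypothetical placeholders rather than Netsweeper data, but the decision flow follows the description: look up the category of the requested domain, then block if that category has been selected or the specific URL has been manually listed.

    # Minimal sketch of category-based policy as described above; the database
    # contents here are illustrative placeholders, not Netsweeper data.

    CATEGORY_DB = {               # hypothetical domain -> category lookup
        "poker.example": "Gambling",
        "news.example": "General News",
    }

    BLOCKED_CATEGORIES = {"Gambling", "Hate Speech"}   # administrator's selection
    CUSTOM_DENY_LIST = {"forum.example/thread/123"}    # manually added URLs

    def is_blocked(url: str) -> bool:
        domain = url.split("/", 1)[0]
        if url in CUSTOM_DENY_LIST:
            return True
        category = CATEGORY_DB.get(domain, "New URL")  # uncategorized sites await categorization
        return category in BLOCKED_CATEGORIES

    if __name__ == "__main__":
        for u in ["poker.example/tables", "news.example/today", "forum.example/thread/123"]:
            print(u, "->", "blocked" if is_blocked(u) else "allowed")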

FIGURE 1. The Netsweeper Filtering Process

Netsweeper’s content categories cover a wide range of web content, providing censors with an easy and automated mechanism to bulk-filter entire content categories. ISPs and telecom operators can choose which of these categories they want to block, and can also add their own categories and URLs manually. The comprehensiveness of the content categories suggests how pervasive Internet filtering can be. It also shows how a commercial company can aid national-level Internet censorship not only by providing technology but also by defining, through automated categorization, the parameters of permissible content retrieval and thus access to information.

Netsweeper’s predefined content categories include:

Abortions, Adult Image, Advertising, Adware, Alcohol, Alternative Lifestyles, Arts and Culture, Classifieds, Criminal Skills, Culinary, Directory, Education, Educational Games, Entertainment, Environment, Extreme, Gambling, Games, General, General News, Hate Speech, Host is an IP, Humour, Images, Infected Hosts, Intimate Apparel, Intranet Servers, Investing, Job Search, Journals and Blogs, Legal, Malformed URL, Match Making, Matrimonial, Medication, Network Timeout, Network Unavailable, New URL, No Text, Nudity, Occult, Parked, Pay to Surf, Peer to Peer, Phishing, Phone Cards, Political, Pornography, Portals, Profanity, Real Estate, Redirector Page, Religion, Remote Access Tools, Safe Search, Sales, Search Engines, Search Keywords, Self Help, Sex Education, Social Networking, Sports, Streaming Media, Substance Abuse, Technology, Tobacco, Travel, Under Construction, Viruses, Weapons, Web Chat, Web E-Mail, Web Proxy, Web Storage

Part of our research in this report is intended to enumerate content category choices, censored content, and any other network behaviour on large consumer-facing ISPs in a particular country where we have identified Netsweeper installations. It is important to note that Internet content filtering is dynamic and variable, and that it changes whenever a network administrator updates the local installation. Our tests do not provide exhaustive lists of censored content but, instead, provide representative samples reflecting a snapshot in time coinciding with our testing periods.

Our data collection and testing can reveal whether particular content categories are chosen, as well as whether URLs are added to a custom list and whether those choices are undertaken transparently or not (i.e., undertaken with some clear notification to users). In some instances, when a request is made for censored content, a blockpage is returned to the user that explains the reason why the content is blocked. In other cases, however, the user experiences a “time-out,” which may give the mistaken impression that something is wrong with the connection or that the content is no longer available. Internet censorship is most insidious when it involves the latter approach, because users cannot ascertain why information is inaccessible.
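
In the spirit of the in-country measurements referenced above, the sketch below classifies the result of fetching a single test URL as a blockpage, a timeout, or an apparently normal response. The blockpage marker is a placeholder, and actual measurements (e.g., OONI web connectivity tests) additionally compare against control fetches from unfiltered networks before drawing any conclusion about censorship.

    # Sketch only: classify the outcome of fetching a test URL from inside a
    # network. BLOCKPAGE_MARKER is a hypothetical placeholder; real measurements
    # use vetted blockpage fingerprints and external control measurements.
    import socket
    import urllib.error
    import urllib.request

    BLOCKPAGE_MARKER = b"this website is blocked"  # hypothetical marker

    def classify(url, timeout=10):
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                body = resp.read(65536)
        except (socket.timeout, TimeoutError):
            return "timeout (possible silent blocking or network failure)"
        except urllib.error.URLError as exc:
            if isinstance(exc.reason, (socket.timeout, TimeoutError)):
                return "timeout (possible silent blocking or network failure)"
            return f"error: {exc.reason}"
        if BLOCKPAGE_MARKER in body.lower():
            return "blockpage returned"
        return "apparently accessible"

    if __name__ == "__main__":
        print(classify("http://example.com/"))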

Prior Citizen Lab research on Netsweeper

Citizen Lab began research into the use of Netsweeper technology in 2011. That year, as part of the OpenNet Initiative project, we published a report that documented the use of Netsweeper technology to filter content on consumer-facing ISPs: “West Censoring East: The Use of Western Technologies by Middle East Censors, 2010-2011.” The report documented the use of Netsweeper installations to censor content on three regional ISPs: Qtel (Qatar), du (UAE), and YemenNet (Yemen). The Yemen case was particularly notable because, prior to using Netsweeper, YemenNet had used Websense filtering software; following the publication of our report, Websense discontinued service to YemenNet for violating its policies against government-mandated censorship.

In June 2013, Citizen Lab published “O Pakistan, We Stand on Guard for Thee.” That report described the use of Netsweeper technology to filter websites relating to human rights, sensitive religious topics, and independent media on Pakistan’s largest ISP, PTCL.

In February 2014, we published “Internet Filtering in a Failed State: The Case of Netsweeper in Somalia,” which documented the presence of Netsweeper technology on the networks of three Somalia-based ISPs. The use of filtering technology in Somalia, a country with a history of contested authority, under the influence of a radical insurgency, and considered one of the world’s ‘failed states’, raised significant human rights concerns.

In October 2015, we published “Information Controls During Military Operations,” which analysed information controls during the Yemen armed conflict. This report found that Netsweeper installations were being used on the networks of state-run YemenNet, the country’s largest ISP, to filter critical political content, independent media websites, and all URLs belonging to the Israeli (.il) top-level domain. This censorship occurred at a time when YemenNet was under the control of the Houthis, an armed rebel group that had taken over the Yemeni capital in September 2014.

In September 2016, we published the report “Tender Confirmed, Rights at Risk: Verifying Netsweeper in Bahrain,” which documented the use of Netsweeper technology on nine Bahrain-based ISPs. The Netsweeper installations appeared to have been activated several months after the release of a public tender by Bahrain’s Telecommunications Regulatory Authority that indicated Netsweeper had won a bid to provide a “national website filtering solution.” Testing on the ISP Batelco showed that the Netsweeper installation was being used to filter content relating to human rights, political opposition websites, Shia websites, local and regional news sources, and content critical of religion. The report noted that the use of Netsweeper technology to filter protected speech in Bahrain was particularly problematic given the country’s ongoing political crisis and record of human rights abuses against oppositional political figures and human rights activists.

Communications with Netsweeper

As a standard part of our research process for most of these reports, we sent Netsweeper a letter that described our findings, presented a series of questions regarding the use of Netsweeper technology in these countries, and committed to publishing the company’s response in full alongside our research report. Netsweeper did not respond to any of our letters. However, in January 2016, following the publication of our 2015 report on the use of its technology in Yemen, the company filed a defamation suit with the Ontario Superior Court of Justice against Citizen Lab director Professor Ronald Deibert and the University of Toronto, seeking $3,500,000 in general and aggravated damages. Netsweeper discontinued its claim, in its entirety, in April 2016.

Prior to the publication of this report, Citizen Lab sent a letter to Netsweeper on 10 April 2018. The letter notified the company of our intention to publish a report and described our key findings. It also offered to “publish any response you would like to provide to this letter in its entirety alongside that report.” On 12 April 2018, Netsweeper CEO Perry Roach replied by email acknowledging receipt and indicating a response would be forthcoming.

On 23 April 2018, Netsweeper responded through counsel with a document titled, “Media Release: Netsweeper responds to media enquiries regarding international operations,” sent to Citizen Lab and individual journalists. While Netsweeper stated that it “welcomes the opportunity to clarify the conduct of its operations,” the media release did not address any of the questions Citizen Lab posed to Netsweeper. Rather, it asserted that Citizen Lab’s questions did not sufficiently meet Netsweeper standards to merit answers:

“Netsweeper has always and remains fully compliant with Canadian law and in those countries where it has ongoing concerns. We appreciate receiving analysis and questions that meet professional tests of sound technological understanding and balanced interpretation.

“It is our view the information and questions provided to Netsweeper fail adequately to meet those tests.”

At the same time, however, the media release appeared to acknowledge that Netsweeper does face corporate social responsibility dilemmas inherent to the provision of Internet filtering products:

“Netsweeper cannot prevent an end-user from manually overriding its software. This a dilemma shared by every major developer of IT solutions including globally renowned corporations that make the internet work. Our firm’s technology and its applications are fully disclosed in the public realm. Even the most elementary review of our posted material shows that Netsweeper’s design does not include any organic functionality to limit the online content Mr. Diebert [sic] highlights.”

Netsweeper’s acknowledgement that IT companies face a dilemma is a step in the right direction and advances the conversation on corporate social responsibility. However, the company provided no further detail within the media release to explore the exact nature of this dilemma. For example, it did not address issues concerning the conduct of human rights due diligence to limit sales that would present significant human rights risks in the first place; the establishment of rights-oriented policies or procedures (which other companies within this market have adopted — see Section 3.3); or the existence of the ‘Alternative Lifestyles’ and ‘Countries’ filtering categories, which do appear to represent “organic functionality to limit the online content” as highlighted by Citizen Lab. Puzzlingly, this statement also seems to suggest that the company views the censorship effects noted by Citizen Lab as resulting from misuse of its technology, given the characterization of the end-user deployment as “manually overriding its software,” rather than operating the technology as designed.

 

Acknowledgements

Thanks to Elizabeth Gross, Gabrielle Lim, and Bahr Abdul Razzak for research assistance; Masashi Crete-Nishihata, Christopher Parsons, Jeffrey Knockel, and Geoffrey Alexander for peer review; Andrew Hilts and Miles Kenyon for website, layout, and communications support; and to the entire team at OONI and ICLab. Thanks to Censys and Shodan for providing access to their data. Financial support for Citizen Lab’s research on information controls is provided by the John D. and Catherine T. MacArthur Foundation, the Ford Foundation, Open Society Foundations, Oak Foundation, and the Sigrid Rausing Trust.

Appendix

Supporting Data

Media Mentions

CBC, CBC (2), Indian Express, Indian Express (2), Radio-Canada, Vice, The Record, Bloomberg, Wired, News Laundry, Scroll, BGR, U of T News, Epoch Times, Daily O, CyberWire, Goo, Wolne Media, First Post, La Presse, Data News, Sludge Feed, Dutch IT Channel