Key Findings
- The Chinese government has approved pilot testing of a social credit system that draws on citizens’ personal data to assign unofficial credit scores, which carry benefits and penalties administered by private companies and government bureaus.
- The five dimensions of credit score data are “users’ credit history, behavioural habits, ability to pay off debts, personal information, and social networks.”
- The algorithms that derive the credit scores are trade secrets, which prevents the kinds of testing that could determine how they work.
- Government blacklists of debtors are being shared with private and public credit scoring services to prohibit individuals from making certain transactions.
This post, the second in a series on mobile payment systems, examines a Chinese mobile payment app feature increasingly covered in foreign media: pilot testing of what may one day become an official nationwide social credit system, replacing the country’s traditional, analog credit rating system. The most popular publicly available social credit scores are currently issued through Sesame Credit, a company established by Alibaba subsidiary Ant Financial. Western media reporting on the social credit system describes a science fiction-like phenomenon in which citizens are assigned scores based on their online behaviour, financial information, and government records. Disclosures by Ant Financial representatives have suggested that late night web browsing, hours spent playing video games, and specific items purchased can all lower one’s score. Whether or not these claims are true, Sesame Credit’s scoring mechanisms are still in an experimental phase and are therefore subject to change. Our exploration of the potential security, privacy, and other issues of such a system is meant to raise questions that can inform discussions about how it will evolve.
As early as 2003, the Chinese government expressed interest in creating a comprehensive means of assigning citizens credit scores that would improve upon the country’s existing credit rating system. The State Council recently approved the “Outline of Regulations for Building a Social Credit System (2014-2020),” and at present eight companies have been granted the central bank’s permission to conduct pilot testing of their own social credit systems, akin to the Fair, Isaac and Company (FICO) consumer credit ratings. Part of what makes such an endeavor unique in China is the vagueness, and in some cases complete absence, of regulations governing big data collection for credit scoring, potential third party uses of the resulting scores, and privacy protections for the user data that factors into credit score calculations. The new cybersecurity law, released on November 7, 2016, contains language regarding privacy protections, yet will not come into force until June 2017.
This article concerns Sesame Credit’s social credit score service, which Alipay users can currently opt into within the app. Sesame Credit’s scores are the most popular example of how a national Chinese social credit system might someday operate, and they provide a sense of what considerations Chinese companies, citizens, and the state think should go into such a system. The service is one component of a broader Chinese government push to harness big data as a resource for social control. Credit scores are broken down into five dimensions: “users’ credit history, behavioural habits, ability to pay off debts, personal information, and social networks.” Brief descriptions of each of these categories are included within the Sesame Credit feature of Alipay, yet they shed no light on which particular actions would alter one’s score.
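To make the structure of such a score more concrete, the sketch below shows one purely hypothetical way a composite score could be assembled from the five quoted dimensions. The weights, the per-dimension inputs, and the `composite_score` helper are illustrative assumptions on our part, not Sesame Credit’s actual, trade-secret algorithm; only the dimension names and the widely reported 350–950 score range are drawn from public reporting.

```python
# Purely illustrative sketch: a weighted composite over the five reported
# dimensions. Weights and inputs are assumptions, NOT Sesame Credit's
# actual (trade-secret) algorithm.

HYPOTHETICAL_WEIGHTS = {
    "credit_history": 0.35,
    "behavioural_habits": 0.25,
    "ability_to_pay": 0.20,
    "personal_information": 0.15,
    "social_networks": 0.05,
}

SCORE_MIN, SCORE_MAX = 350, 950  # score range widely reported for Sesame Credit


def composite_score(dimensions: dict[str, float]) -> int:
    """Map per-dimension values in [0, 1] onto a single score.

    `dimensions` must supply a value for every key in HYPOTHETICAL_WEIGHTS.
    """
    weighted = sum(
        HYPOTHETICAL_WEIGHTS[name] * max(0.0, min(1.0, value))
        for name, value in dimensions.items()
    )
    return round(SCORE_MIN + weighted * (SCORE_MAX - SCORE_MIN))


if __name__ == "__main__":
    example_user = {
        "credit_history": 0.8,       # e.g., bills paid on time
        "behavioural_habits": 0.6,   # opaque: purchases, hours online, etc.
        "ability_to_pay": 0.7,
        "personal_information": 0.9,
        "social_networks": 0.5,      # friends' records reportedly factor in
    }
    print(composite_score(example_user))  # prints 788 for these made-up inputs
```

Even in this toy form, the opacity problem is visible: a user sees only the final number, never the weights or the inputs that produced each dimension’s value.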
According to The Paper, “behavioural preferences” refer to actions such as timely bill payment, and “personal information” includes age, sex, occupation, and educational history. More troubling are the “concepts of stability” Sesame Credit allegedly takes into consideration, which company representatives have alluded to without clearly defining. These appear to be highly subjective markers of one’s financial and social standing that feed into the credit scoring mechanism. The same article from The Paper posits that “if your address changes often, your creditworthiness will correspondingly drop. Additionally, your friends’ credit records will also influence your Sesame Points.” The room for misrepresentation, as well as discrimination, based on such factors is broad, and the behaviour-changing incentives that underlie them are also concerning. Ant Financial’s chief credit data scientist Yu Wujie has said that “If you regularly donate to charity, your credit score will be higher, but it won’t tell you how many payments you need to make every month… but [development] in this direction [is undertaken with] the hope that everyone will donate.” Moreover, Ant Financial’s technology director, Li Yingyun, has stated that “Someone who plays video games for 10 hours a day, for example, would be considered an idle person, and someone who frequently buys diapers would be considered as probably a parent, who on balance is more likely to have a sense of responsibility.” These statements have fueled the condemnatory Western media coverage of social credit thus far. It should be noted, however, that the only way to know with certainty whether the factors company representatives cite actually raise or lower scores would be to deconstruct the algorithms that determine them, and those algorithms have been labeled trade secrets.
Without the ability to conduct static or dynamic testing of the social credit algorithms, what can we infer about how they operate? Deng Yiming, Sesame Credit’s director of business development, has said that only 30 to 40 percent of the data used to generate credit scores comes from Alibaba-owned companies and services. Aside from the data gathered through users’ activity within the Alipay app, Sesame Credit computes individual scores with data collected from Alibaba’s e-commerce websites, from ride-sharing apps and restaurants connected to Alipay, and from government bureaus involved in law, education, and commerce. One of the most clear-cut examples of such a data exchange is Sesame Credit’s tie to the Supreme People’s Court, which has reportedly shared its blacklist of debtors and others who have violated court verdicts with the company in order to block these people from making so-called luxury purchases on Taobao and Tmall. This arrangement prompts the question of whether Sesame Credit shares data in the opposite direction, possibly providing its partnered government bureaus with users’ transactional information. These collaborations tend to be overshadowed by those in which Sesame Credit works with the private sector to provide benefits to high score holders.
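The court blacklist arrangement is, in effect, a policy check applied at transaction time. The minimal sketch below illustrates how such a gate might work; the blacklist contents, the `is_purchase_allowed` function, and the “luxury” category labels are hypothetical illustrations, not a description of Alibaba’s or the court’s actual systems.

```python
# Illustrative sketch of a blacklist gate at checkout. All names, data
# structures, and values here are hypothetical placeholders.

# A blacklist of buyer identifiers, as might be supplied by a court or
# other government partner.
court_blacklist: set[str] = {"EXAMPLE-ID-0001"}  # placeholder identifier

# Hypothetical "luxury" categories that blacklisted buyers cannot purchase.
LUXURY_CATEGORIES = {"first_class_air_travel", "high_end_hotel", "jewelry"}


def is_purchase_allowed(buyer_id: str, category: str) -> bool:
    """Block so-called luxury purchases for blacklisted buyers."""
    if buyer_id in court_blacklist and category in LUXURY_CATEGORIES:
        return False
    return True


print(is_purchase_allowed("EXAMPLE-ID-0001", "jewelry"))    # False: blocked
print(is_purchase_allowed("EXAMPLE-ID-0001", "groceries"))  # True: allowed
```

The sketch also makes the reverse-flow question concrete: the same integration point that lets a court list reach the platform could, technically, carry users’ transaction data back to the partnered bureaus.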
At present, the privileges of a high social credit score can include booking a rental car or a hotel room without a deposit, taking out loans at lower interest rates, having online purchases shipped to try out before paying for them, and obtaining visas to Luxembourg and Singapore through an expedited process. Other countries that have expressed interest in participating in this accelerated visa issuance include Japan, South Korea, Sri Lanka, and the United Kingdom. There has been discussion of future uses of high credit scores extending to matchmaking and job recruitment, which underscores the extent to which biased inputs could create new social barriers.
Chinese government portrayals of the social credit system cast it as a means of restoring trust among citizens and in local businesses, such as restaurants that will be monitored for compliance with health regulations. Another framing argument for the creation of a social credit system is that it will enable people traditionally unable to accrue credit, such as students, blue-collar workers, and small business owners, to do so. Yet so far it appears that the advantages of having a high score (which, it has been speculated, few users have in fact attained) favor those who are already in a better financial position to make lavish and frequent expenditures. One criticism has been that Sesame Credit raises scores for people who conduct more Alipay transactions, which the company has firmly denied. The examples of higher scores for purchases associated with parenthood and lower scores for people who change home addresses often further suggest that a lack of context can lead to misrepresentations of people’s actions, which can carry disproportionate consequences.
There is a danger in presenting big data initiatives as taking in “objective” measures of user data, partly because when an indicator is inaccurate, it becomes difficult to dispute. One example from a recent Wall Street Journal article involves a woman who used her son’s subway pass and was fined for an act that, to an algorithm, would appear to have been theft. This minor incident raises concerns about the decontextualized way in which social credit scoring may be designed to operate. What, if any, legal remedy is available for far more consequential mistakes under a social credit system?
The current experiments in this field raise many questions about user protections. Regarding user privacy, there has been little discussion of the implications of making individuals’ scores viewable to others. Alipay offers a game in which friends can guess each other’s credit scores and reveal them to one another, for example. Beyond other users, the external companies that provide special services to high credit score holders, and the app’s partnered government bureaus, with which third parties will Sesame Credit share data? The centralization of the data collection mechanisms for social credit makes dang’an-like dossiers of individuals’ online and offline lives all the more portable, as several media reports have noted. What security procedures are in place to protect these repositories of valuable personal information?
In an irony that has plagued many a surveillance apparatus, the spread of a social credit system and its associated sensors, QR codes, and other trace-reading tools can create new security concerns separate from those it allegedly aims to reduce through near-ubiquitous monitoring of behavior. These new threats involve the ways in which credit score data can be forged, and the ends toward which fake credit scores may be used. The expedited security check at the Beijing airport for Sesame Credit users with high enough scores who are traveling on domestic flights is one example of a situation where a falsified high score could enable someone to bypass more rigorous screening, a scenario that a sufficiently skilled and determined actor could turn into a threat to national security. The more widely the social credit system is used and the greater the range of rewards it provides to high scorers, the more incentives there will be to figure out how to game it. Articles and threads about how to raise one’s credit score are already on the rise, and it is still unclear how easy it may be to hack into a specific account and falsify information.
As detailed in a previous post, Alibaba has experienced data leaks by company employees as well as external hacks. It is imperative to ask what precautions the company is taking, given that the rapid buildup of a social credit system will likely make its corporate stewards targets for data theft and extortion. Moreover, as a host of data-collecting Internet of Things (IoT) devices are sold across China, the granular capability to track citizens’ every move and feed that information into social credit scores will only grow stronger, requiring work on the part of IoT manufacturers to address these issues. One development with the potential to make social credit companies more transparent is the National Development and Reform Commission’s announcement that credit evaluation services will be opened up to foreign investment, which may at a minimum provide more information about how enterprise social credit scoring (for restaurants and local businesses) is conducted.
The final security and censorship-related concern we offer for consideration is the possibility that access to social credit and mobile finance could be blocked to penalize citizens for, as merely one example, acts of protest. If both of these services become indispensable to daily life, government-sanctioned freezing of users’ accounts could do major harm to civil liberties. As the rollout of the system progresses, it is important to keep in mind that although the algorithms determining credit scores may quickly change in ways that users and researchers cannot perceive, the power of the state and the companies shaping the social credit system’s development is far more entrenched, and it is ultimately these actors who are responsible for the real-world outcomes the system may produce.