The darker side of human data
What are the implications of a near future in which humans will be rated on their behaviour and relationships?
In 2014, the State Council of China announced the development of a digital ‘citizen rating system’ to determine the “trustworthiness” of each citizen, company, and local government, with the intent of building social trust and curbing China’s rampant counterfeiting and fraud.
To Westerners it may sound like an Orwellian dystopia. For the Chinese, however, it will soon be a reality, with participation compulsory by 2020. The idea is already being implemented by eight private Chinese companies, one of which is the financial arm of Alibaba, China’s largest online shopping platform.
Although the form of the government system is uncertain, cues can be taken from Alibaba’s system, Sesame Credit (SC). Users are given a score based on five factors: credit history, fulfilment capacity, personal information, behaviour and preferences, and interpersonal relationships. The first three may sound familiar: they are similar to a credit score, and are based on financial information and basic personal details.
The last two are factors currently unaccounted for in Western big-data systems. They indicate that what you – or your friends – buy online or say on social media can and will affect your “trustworthiness rating”.
“Your personal score will have tangible consequences in reality”
SC scores do not take content into account: using Alibaba’s services, or having friends who use SC, will increase your score, but what is actually bought or said has no effect. It resembles a loyalty scheme more than a social credit system. The implication, however, is that it could consider these factors, and that the governmental system will.
Data is drawn from a variety of sources, and here China’s corporate giants are worlds apart from their Western counterparts. SC alone draws information from all of Alibaba’s branches, from shopping to taxi services to Alipay, China’s ubiquitous online payment platform. Alipay is used for everything from bills to bus tickets, so Sesame Credit always knows where you are and what you buy. Big data is simply a fact of Chinese life.
The government has access to all of this information, on top of data from other mega-corporations, thanks to its long-standing practice of demanding users’ personal data from private companies.
Your personal score will have tangible consequences in reality. Those with high SC scores can book hotels without a deposit, or even get fast-tracked for visas. The future government system would even incorporate punishment: a 2016 document says that low scores could result in ineligibility for certain jobs, lower internet speeds, and being barred from certain restaurants.
China already severely punishes dissenters: 6.3 million people who defied court orders are on a public blacklist that bans them from buying plane or train tickets and limits their purchases of luxury goods. Internet access and social media accounts are often suspended for expressing dissent.
“Amplified by their big-data rating system, the potential for absolute control is real”
Amplified by a state-run big-data rating system, the potential for absolute totalitarian control is real. Technological tampering is another threat: if the system’s security is ever breached, the damage done by hackers could ruin lives.
Despite this, the system is being welcomed by many Chinese citizens. China lacks the widespread credit-score systems that are well established elsewhere, making it difficult to obtain loans. Fraud and counterfeiting are daily grievances. Many Chinese are simply sick of the lack of trust in their society.
Furthermore, some argue that the new system offers more governmental transparency. China rewards loyal citizens, too: hundreds of thousands of honours and titles are handed out every year, along with other benefits. The criteria for these, however, are decided arbitrarily by officials; a rating system would at least rely on a supposedly predictable algorithm with defined criteria.
Both points have their counterarguments. To the first: the new system undermines trust in other ways. There will be rewards for informing on friends or family, and because any dissenter drags down the scores of those around them, people are encouraged to ditch ‘risky’ friendships.
The issue with the second is that a human can apply context to a situation, whereas an algorithm cannot. Under an automatic digital rating system, missing a bill because you were in hospital and missing a bill out of unreliability count as exactly the same thing. The technological limitations of such a system also matter: black-market score manipulation, whether by buying falsified personal details or by using VPNs to hide online activity, would corrupt the data, and the data and analysis on which a score is based could themselves be flawed. Nobody knows whether the proposed system is really feasible.
The fact that the Chinese government wish to try regardless is telling. They have many options to consider as remedies for social mistrust, an issue originally caused by the government’s intolerance of any external institution that might enforce accountability. Rather than allow for independent watchdogs and increased transparency, they have instead opted for greater control. As Foreign Policy puts it, the government is “ensuring that the only monitor will, once and forever, be itself.”