Do the Right Thing and Leave Judgment to Algorithms

by Muhammad Aurangzeb Ahmad

In Islamic theology it is stated that for each human being God has appointed two angels (Kiraman Katibin) who record the good and the bad deeds that a person commits over the course of a lifetime. Regardless of one's belief or disbelief in this theology, a world where our deeds are recorded is in our near future. Instead of angels there will be algorithms processing our deeds, and it won't be God doing the judging but rather corporations and governments. Welcome to the strange world of scoring citizens. This phenomenon is not something out of a science fiction dystopia; some governments have already laid the groundwork to make it a reality, the most ambitious among them being China. The Chinese government has already instituted a plan where data from a person's credit history, publicly available information and, most importantly, their online activities will be aggregated to form the basis of a social scoring system.

Credit scoring systems like FICO, VantageScore and CE Score have been around for a while. Such systems were initially meant as just another aid in helping companies make financial decisions about their customers. However, these credit scores have evolved into definitive authorities on the financial reliability of a person, to the extent that human involvement in decision making has become minimal. The same fate may befall social scoring systems, with one difference: anything that you post on online social networks like Facebook or microblogging websites like Twitter, along with your search and browsing behavior on Google (or their Chinese equivalents RenRen, Sina Weibo and Baidu, respectively), is being recorded and can potentially be fed into a social scoring model. As an example of how things can go wrong, let's consider the case of the most populous country in the world, China. There the government has mandated that the social scoring system will become mandatory by 2020. The Chinese government has also blocked access to non-Chinese social networks, which leaves just two companies, Alibaba and Tencent, to run virtually all the social networks in the country. This makes it all the more intriguing that the Social Credit Scoring system in China is being built with the help of these two companies. To this end the Chinese government has given the green light to eight companies to run their own pilots of citizen scoring systems.

Compared to other scoring systems, many social scoring systems take into account not only a person's activities but also those of their friends: if your friend has a negative score or is considered a troublemaker by the government, then your own social score will be negatively impacted as well. The Chinese social scoring system does not take this into account right now, but some observers have started ringing alarm bells that this is where it will end up. And just like a credit score, a social score is a number that you can ignore only at your own peril. A low social score may result in you not getting a promotion, not getting hired for that government job, or even for that private sector job, as private companies may not want to look bad by hiring too many employees with bad social scores. The government may even take things up a notch by scoring companies based on their workforce and hiring practices. Once the system is set up, the authorities may not even need to do much policing, because the algorithms will be doing the policing for them. This state of affairs has rightfully been called gamified authoritarianism. The government can push society in a particular direction by soft coercion. Consider the following quote from Li Yingyun, director of Alibaba's pilot social scoring system, as reported by the BBC: "Someone who plays video games for 10 hours a day, for example, would be considered an idle person, and someone who frequently buys diapers would be considered as probably a parent, who on balance is more likely to have a sense of responsibility." It is not hard to imagine where such a system can go next.
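To make the friend-contagion mechanism concrete, here is a minimal sketch of how such a score might propagate. This is purely illustrative: the function name, the weighting scheme and all the numbers are invented for this example, not taken from any actual social scoring system.

```python
# Hypothetical illustration of friend-influenced scoring: a person's
# score is blended with the average score of their friends.
# The 0.3 friend weight and all scores below are invented.

def adjusted_score(own_score, friend_scores, friend_weight=0.3):
    """Blend one's own score with the mean of one's friends' scores."""
    if not friend_scores:
        return own_score
    friend_avg = sum(friend_scores) / len(friend_scores)
    return (1 - friend_weight) * own_score + friend_weight * friend_avg

# A person with a good score whose friends score poorly is dragged down,
# regardless of their own behavior.
print(adjusted_score(800, [400, 450, 500]))  # prints 695.0
```

Under a scheme like this, the rational response is exactly the self-policing the essay describes: pruning low-scoring friends improves your own number even if nothing about your own conduct changes.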

Jay Stanley of the ACLU has rightly observed that something like the Chinese social scoring system is unlikely to come to realization in the US. That said, governments are not the only entities one should be worried about. If the government does not create such a system, then large corporations may be tempted to replicate the Chinese social scoring systems. They have not only the resources but also the motivation to do so. This scenario is not as far-fetched as it may seem at first, as many organizations already have internal scoring systems where they score their customers not just on their purchasing activities but also on their browsing behavior and their engagement with social media.

Retail and shopping are not the only things that will be affected by social scoring systems. It is not inconceivable that matchmaking services like Match.com, eHarmony and Tinder will start using online social cues for marriage and dating. This is another area where China is ahead of the curve: Chinese dating apps are already incorporating social scores. After all, would you really want to date someone who can negatively impact your social score? Imagine a version of Tinder where you can filter people by their social scores. Just as you would not want to hang out with the wrong people who can lower your score, you also would not want to date someone who comes from a lower stratum of social scores. A social scoring system may even penalize a person for choosing to spend their life with the 'wrong' kind of person. The long-term effects of such a system could be self-policing as well as stratification along social scoring lines. If at this point you suspect that this is starting to look like the contours of a digital caste system being solidified, you are not imagining things.

One disclaimer that I should add here is that the version of the social scoring system described here is the more extreme version of what the Western media have reported the Chinese government has in store. Details coming out of China about this system keep changing, and hence this assessment takes the various versions into account. The version that is currently being rolled out is limited to a credit scoring system based on online and offline financial activities. However, some analysts observe that the Chinese government may install the fuller system in the near future and that this is just a warm-up. There has already been some backlash against aspects of this system, so the Chinese government has scaled back its plans to some extent.

Then there is the dreaded fallacy of mistaking correlation for causation, which people can easily fall prey to. While one may argue that a low social score does not mean someone is a bad person, this is not how large swaths of the population are going to look at the matter. It does not help that the majority of the populace is not very adept at even basic statistics. Thus if you have a low social score, people may start thinking that something is wrong with you, even though the low score may be because you are the victim of adverse circumstances. The problem of the visibility of social scores is even more severe than that of financial scores. If anyone can simply query your social score, then you may end up making and breaking friendships based on an invisible social marker. Imagine, in the near future, wearing a VR device and walking down the street where you can see everyone's social scores. Consequently, not only would your interactions with the government and impersonal organizations change, but your day-to-day interactions with other people may be impacted as well.

With the proliferation of social scoring systems used by governments and large corporations, a new profession may also emerge in the near future: reputation management for social credit scores. Organizations may crawl the internet, analyze one's social media usage, scrutinize one's social circle and so on, to make recommendations about what actions to take and what friendships to maintain, or even break, based on how these might help a person improve their social score. Especially if querying social scores becomes as easy as asking a VR device, gold digging may not be limited to financial gain in the near future. We should also think about the psychological toll it might take on people if they have to force themselves to create the illusion of a perfect life to get that perfect social score. We have already seen glimpses of this phenomenon on Facebook and Instagram. It is not that hard, it seems, to create a fake life on social networks.

As with many autonomous systems run by algorithms, the proponents of social scoring systems may claim that since the human element is being taken out, these systems are likely to be unbiased. Researchers like Frank Pasquale have observed, "There is nothing unbiased about scoring systems." Another important issue is the collection of data and the interpretation of that data. The former does not take into account the circumstances that lead people to make certain life decisions, and the latter may run into the problem of applying the same cultural yardstick across different cultures and ethnicities. We already have computing systems whose risk scores result in African Americans receiving longer sentences than their Caucasian counterparts for the same crime. The problem is not that the programmers who designed these systems are consciously biased or racist, but rather that unconscious bias and the selectivity of data get incorporated into the system without its creators realizing it.
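The way a formally "neutral" rule inherits the bias of its training data can be shown with a toy example. Everything here is invented for illustration: the groups, the numbers and the flagging rule are hypothetical, not drawn from any real risk-assessment system.

```python
# Hypothetical illustration: a scoring rule learned from biased
# historical data reproduces the bias. All data below is invented.

# Historical records: (group, prior_arrests, reoffended).
# Suppose group B was policed more heavily, inflating its arrest counts
# even for people whose underlying behavior matched group A's.
history = [
    ("A", 1, False), ("A", 1, False), ("A", 2, True),
    ("B", 3, False), ("B", 4, False), ("B", 5, True),
]

# A seemingly neutral rule derived from the data: flag anyone with
# more prior arrests than the historical average.
avg_arrests = sum(record[1] for record in history) / len(history)

def risk_flag(prior_arrests):
    """Flag a person as high risk if their arrest count exceeds the average."""
    return prior_arrests > avg_arrests

# Two people with comparable behavior but different policing histories
# get different outcomes; the rule never mentions group, yet it
# penalizes the over-policed group.
print(risk_flag(2))  # typical group A record -> False
print(risk_flag(4))  # typical group B record -> True
```

The rule contains no explicit reference to any group, which is precisely why its designers could sincerely believe it is unbiased; the bias lives in the data, not in the code.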

If the use of social credit scores can make people keep others in line without direct government coercion, the result would be the perfect form of the Panopticon. Not only are guards not needed in this version of the Panopticon, but all the prisoners have to put up a smiley face as well. It is the stuff of dreams for authoritarian regimes. As with many other technologies of control, theocracies may jump at the thought of keeping tabs on what their citizens do. Just imagine what a technologically advanced North Korea or Saudi Arabia would look like with a social scoring system in place. Imagine a North Korea with listening devices everywhere, recording and analyzing everything that one says in real time. It would be impossible for dissidents to verbalize their opposition to the regime to one another, let alone mobilize and get their voices heard by the rest of the world; this would truly be, as the late Christopher Hitchens put it, a celestial North Korea. Imagine a theocratic regime that keeps tabs on the religiosity of its populace and scores them based on their compliance with a particular creed and set of behaviors.

Given the technological and consequent social developments of our age, social scoring systems may be inevitable. If this is the case, then we should try to steer them towards greater transparency, privacy and fairness. Unlike with financial scoring metrics, one should be able to query the system as to why one has a low social score. If the government or some company is giving you a negative score because of your participation in a demonstration, then aren't they infringing on your constitutional rights by penalizing you? If so, you should have a say in rectifying your score. This may of course result in an unusually high number of lawsuits against social scoring systems, which is enough of a reason not to have such systems in the first place. On the flip side, advocates of such systems could use this premise to argue that such systems should not be transparent. As with large-scale big data systems that collect personal data and offer services, the question comes down to finding the right balance between privacy and transparency. It may be that such a balance does not exist for social scoring systems.

This makes the era of Big Data quite different from previous information revolutions. One does not have to wait till the end of the world for judgment to be pronounced. The algorithms of our own making are judging us, and since they are created in our own image, they are likely biased. This may not be exactly what Jesus had in mind when he said "Judge not, lest ye be judged," but algorithms that judge us are already here and will increasingly be part of the social fabric.