From Nudge to Hypernudge: Big Data and Human Autonomy

by Fabio Tollon

We produce data all the time. This is not something new. Whenever a human being performs an action in the presence of another, there is a sense in which some new data is created. We learn more about people as we spend more time with them. We can observe them and form models in our minds about why they do what they do, and the possible reasons they might have for doing so. With this data we might even gather new information about that person. Information, simply, is processed data, fit for use. With this information we might even start to predict their behaviour. On an interpersonal level this is hardly problematic. I might learn over time that my roommate really enjoys tea in the afternoon. Based on this data, I can predict that at three o’clock he will want tea, and I can make it for him. This satisfies his preferences and lets me off the hook for not doing the dishes.

The fact that we produce data, and can use it for our own purposes, is therefore not a novel or necessarily controversial claim. Digital technologies (such as Facebook, Google, etc.), however, complicate the simplistic model outlined above. These technologies are capable of tracking and storing our behaviour (to varying degrees of precision, though they are getting much better) and of using this data to influence our decisions. “Big Data” refers to this constellation of properties: it is the process of taking massive amounts of data and using computational power to extract meaningful patterns. Significantly, what differentiates Big Data from traditional data analysis is that the patterns extracted would have remained opaque without the resources provided by computationally powerful systems. Big Data could therefore present a serious challenge to human decision-making. If the patterns extracted from the data we are producing are used in malicious ways, the result could be a diminished capacity for us to exercise our individual autonomy. But how might such data be used to influence our behaviour at all? To get a handle on this, we first need to understand the common cognitive biases and heuristics that we as humans display in a variety of informational contexts. Read more »
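To make the idea of pattern extraction a little more concrete, here is a minimal, purely illustrative Python sketch in the spirit of the tea example above: it mines an invented behavioural log for a recurring pattern (the hour at which a user tends to act) and turns that pattern into a targeted prompt. The event data, names, and the "schedule a promotion" step are all assumptions made up for this sketch, not a description of how any real platform works.

```python
# A minimal, hypothetical sketch of mining a behavioural log for a pattern
# and turning it into a targeted prompt ("nudge"). All data and names are
# invented; real platforms work at vastly larger scale with richer features.
from collections import Counter
from datetime import datetime

# Invented event log: (user_id, action, timestamp)
events = [
    ("alice", "bought_tea", datetime(2023, 5, 1, 15, 2)),
    ("alice", "bought_tea", datetime(2023, 5, 2, 15, 10)),
    ("alice", "bought_coffee", datetime(2023, 5, 3, 8, 45)),
    ("alice", "bought_tea", datetime(2023, 5, 4, 14, 58)),
]

def peak_hour(log, user, action):
    """Return the hour of day at which `user` most often performs `action`."""
    hours = Counter(ts.hour for u, a, ts in log if u == user and a == action)
    return hours.most_common(1)[0][0] if hours else None

hour = peak_hour(events, "alice", "bought_tea")
if hour is not None:
    # The extracted pattern becomes the trigger for a personalised nudge.
    print(f"Schedule a tea promotion for alice around {hour}:00")
```

Scaled up from one person's tea habit to millions of users and thousands of features, this is the sort of inference the essay has in mind.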

Do the Right Thing and leave Judgment to Algorithms

by Muhammad Aurangzeb Ahmad

In Islamic theology it is stated that for each human being God has appointed two angels (Kiraman Katibin) who record the good and the bad deeds that a person commits over the course of a lifetime. Regardless of one’s belief or disbelief in this theology, a world where our deeds are recorded is in our near future. Instead of angels there will be algorithms processing our deeds, and it won't be God doing the judging but rather corporations and governments. Welcome to the strange world of scoring citizens. This phenomenon is not something out of a science fiction dystopia; some governments have already laid the groundwork to make it a reality, the most ambitious among them being China. The Chinese government has already instituted a plan whereby data from a person’s credit history, publicly available information and, most importantly, their online activities will be aggregated to form the basis of a social scoring system.

Credit scoring systems like FICO, VantageScore, and CE Score have been around for a while. Such systems were initially meant as just another aid in helping companies make financial decisions about their customers. However, these credit scores have evolved into definitive authorities on the financial liability of a person, to the extent that human involvement in decision making has become minimal. The same fate may befall social scoring systems, with the difference that anything you post on online social networks like Facebook or microblogging websites like Twitter, and your search and browsing behaviors on Google (or their Chinese equivalents RenRen, Sina Weibo and Baidu, respectively), is being recorded and can potentially be fed into a social scoring model. As an example of how things can go wrong, let's consider the case of the most populous country in the world: China. There the government has mandated that a social scoring system will become compulsory by 2020. The Chinese government has also blocked access to non-Chinese social networks, which leaves just two companies, Alibaba and Tencent, running virtually all the social networks in the country. This makes it all the more intriguing that China's Social Credit Scoring system is being built with the help of these two companies. To this end, the Chinese government has given eight companies the green light to run their own pilots of citizen scoring systems.

Read more »
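For readers curious about what "feeding data into a scoring model" can look like in its very simplest form, the toy sketch below combines a few invented behavioural features into a single number using a weighted sum, rescaled to the 300-850 band that FICO-style credit scores occupy. The features, weights, and scaling are all assumptions chosen purely for illustration; real credit and social scoring systems are proprietary and far more complex.

```python
# A toy, purely illustrative scoring model: a weighted sum of invented
# behavioural features, rescaled to the 300-850 range used by FICO-style
# credit scores. The features, weights, and scaling are assumptions made
# up for this sketch; they do not describe any real credit or social
# scoring system.

FEATURE_WEIGHTS = {
    "on_time_payment_rate": 0.5,   # fraction of bills paid on time (0-1)
    "credit_utilisation": -0.3,    # fraction of available credit used (0-1)
    "flagged_posts_rate": -0.2,    # fraction of posts flagged online (0-1)
}

def score(profile):
    """Map a feature profile (values in [0, 1]) to a 300-850 score."""
    raw = sum(FEATURE_WEIGHTS[name] * profile.get(name, 0.0)
              for name in FEATURE_WEIGHTS)
    # raw ranges from -0.5 to 0.5; rescale it onto the 300-850 band.
    return round(300 + (raw + 0.5) * 550)

print(score({"on_time_payment_rate": 0.95,
             "credit_utilisation": 0.30,
             "flagged_posts_rate": 0.02}))   # prints a score of roughly 785
```

Even in this toy form, the essential worry is visible: the choice of inputs and weights lies entirely with whoever builds the model, while the person being scored typically sees neither.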