Permeable: The Monetization of Human Fragility

by Muhammad Aurangzeb Ahmad

“Amongst Men” by Haroon Gunn-Salie

There is a particular moment of weakness that most of us recognize without needing to name it. You are lying in bed late at night. The light from the phone is brighter than anything else in the room. The day has thinned your patience as well as your judgment. You click on posts and stories that you would otherwise have ignored. You may even be tempted to buy what you do not need. In other words, your guard is down. You read for a few seconds and then doomscroll to the next story, and then the next. You feel like going to sleep, but then you lie to yourself: one more scroll. Welcome to the world of the attention economy, a system in which human focus is treated as a scarce resource to be captured, measured, and monetized by digital platforms. The attention economy began with a relatively simple goal: capture and hold the gaze. Over time, that ambition evolved beyond targeting what you like to targeting when you are least able to resist. Timing, not just taste, has become the new frontier for these algorithms. While it is true that these platforms were informed by decades of research on human psychology, they do not need to understand us in any deep, human sense. What they do need to do is predict when you are tired, lonely, bored, anxious, or depleted. It is precisely in these states of vulnerability that it is easiest to influence people and steal their attention.

Behavioral science has long recognized that cognition does not operate in isolation from what else is going on in one’s life. Decision fatigue is real. Cognitive depletion makes us more suggestible. Our skepticism varies with sleep, stress, and emotional load. What has changed is that our devices now emit a steady exhaust of behavioral signals that correlate with these states. The phone does not know that you are lonely. It does not need to. It only needs to register that your scrolling pattern has slowed, that you are lingering on certain kinds of content, that the hour is late, that your interactions look different from your baseline. Correlation at scale does not need understanding. The appeal to human vulnerability is not a rhetorical device; it is anchored in behavioral research. Read more »
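The baseline comparison described above can be sketched in a few lines. This is a hypothetical illustration, not any real platform's method: the feature names, numbers, and deviation threshold are all invented. The idea is simply that a session whose signals sit far from a user's own historical averages gets flagged, no understanding required.

```python
# Hypothetical sketch: flag a "depleted" session by comparing current
# behavioral signals against a user's baseline. All feature names,
# values, and the threshold are invented for illustration.

from statistics import mean, stdev

def depletion_score(baseline_sessions, current, threshold=2.0):
    """Count signals deviating from baseline by more than `threshold` std devs."""
    flags = 0
    for feature in current:
        history = [s[feature] for s in baseline_sessions]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(current[feature] - mu) / sigma > threshold:
            flags += 1
    return flags

# Typical daytime sessions for one user: scroll speed, dwell time, hour.
baseline = [
    {"scroll_speed": 3.1, "dwell_seconds": 4.0, "hour": 14},
    {"scroll_speed": 2.9, "dwell_seconds": 5.0, "hour": 13},
    {"scroll_speed": 3.3, "dwell_seconds": 4.5, "hour": 15},
    {"scroll_speed": 3.0, "dwell_seconds": 4.2, "hour": 12},
]

# A late-night session: slower scrolling, longer lingering, late hour.
late_night = {"scroll_speed": 1.2, "dwell_seconds": 11.0, "hour": 1}

print(depletion_score(baseline, late_night))  # all three signals deviate: 3
```

Real systems are far more elaborate, but the structural point stands: no model of loneliness is needed, only a statistical notion of "different from usual."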

Monday, April 4, 2022

The Shame Machine: Author Cathy O’Neil Interviewed by Danielle Spencer

by Danielle Spencer

The Shame Machine

Cathy O’Neil’s The Shame Machine: Who Profits in the New Age of Humiliation (Crown) was released on March 22, 2022. O’Neil is the author of the bestselling Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (Crown 2016) which won the Euler Book Prize and was longlisted for the National Book Award. She received her PhD in mathematics from Harvard and has worked in finance, tech, and academia. She launched the Lede Program for data journalism at Columbia University and recently founded ORCAA, an algorithmic auditing company. O’Neil is a regular contributor to Bloomberg Opinion.

Danielle Spencer: Can you speak a bit about your background and what led you to write this book?

Cathy O’Neil: I’m a mathematician and a child of two mathematicians. Very nerd-centered childhood, where science was the religion of the household. They were otherwise atheists. I became a data scientist at some point, also a hedge fund analyst.

Weapons of Math Destruction

And then I started trying to warn people about the dangers of algorithms when we trust them blindly. I wrote a book called Weapons of Math Destruction, and in doing so I interviewed a series of teachers and principals who were being tested by this new-fangled algorithm called the value-added model for teachers. And it was high stakes. They were being denied tenure or even fired based on low scores, but nobody could explain their scores. Or shall I say, when I asked them, “Did you ask for an explanation of the score you got?” They often said, “Well, I asked, but they told me it was math and I wouldn’t understand it.”

That was the first moment I thought, “Oh my God, shame is so powerful.” That was math shame, evidently, because it wouldn’t have worked on me. [laughs] I’m a mathematician. You’re not going to shame me on math. If you tell me I wouldn’t understand something because it’s math, I’d say, “Dude, buster, if you can’t explain it to me, that’s your problem—not mine.” I would just be bulletproof to math-shaming. Read more »

Monday, July 18, 2016

Algocracy: Outsourcing Governance to Algorithms

by Muhammad Aurangzeb Ahmad

Algorithm

In the late 17th century Gottfried Leibniz conceived of a machine that could be used to settle arguments: instead of arguing, people would simply say “let us calculate.” On closer inspection this idea bears an uncanny resemblance to settling disputes by delegating the decisions to algorithms. This is no longer the realm of science fiction, as algorithms not only already make decisions on our behalf but also make biased decisions on our behalf. Welcome to the world of Algocracy, a system of governance based on rule by algorithms.

The problem of Algocracy was brought to the fore recently when reporters from ProPublica conducted an investigative analysis of prisoner-scoring software and determined that it was biased against Black people. Consider two people with the same criminal record, one Black and one white: COMPAS, a commercial tool employed by law-enforcement agencies, would assign the Black person a higher risk score. This results in tougher convictions and longer sentences for Black people. ProPublica found a large number of cases in which a white defendant with a lower risk score went on to commit more crimes while a Black defendant with a higher score did not reoffend. Even Eric Holder weighed in on this debate, cautioning that such scoring systems may bias the system against certain minority groups. One implication is that algorithms already have much say in how our society is run. Given the proliferation of big data, the role of algorithmic governance is only going to get bigger, not smaller. We are already living under an Algocracy; it’s just not evenly distributed yet.
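The kind of disparity ProPublica documented can be made concrete with a toy calculation. This is a minimal sketch, not ProPublica's actual methodology or data: the records below are invented, and the only thing it shows is the metric at the heart of that debate, the false positive rate, i.e., how often people who never reoffended were nonetheless labeled high-risk, broken out by group.

```python
# Minimal sketch of a false-positive-rate disparity check, in the spirit
# of the ProPublica COMPAS analysis. The records below are invented.

def false_positive_rate(records):
    """Share of non-reoffenders who were nonetheless scored high-risk."""
    non_reoffenders = [r for r in records if not r["reoffended"]]
    flagged = [r for r in non_reoffenders if r["high_risk"]]
    return len(flagged) / len(non_reoffenders)

records = [
    {"group": "black", "high_risk": True,  "reoffended": False},
    {"group": "black", "high_risk": True,  "reoffended": False},
    {"group": "black", "high_risk": False, "reoffended": False},
    {"group": "black", "high_risk": True,  "reoffended": True},
    {"group": "white", "high_risk": False, "reoffended": False},
    {"group": "white", "high_risk": True,  "reoffended": False},
    {"group": "white", "high_risk": False, "reoffended": False},
    {"group": "white", "high_risk": False, "reoffended": True},
]

for group in ("black", "white"):
    subset = [r for r in records if r["group"] == group]
    print(group, round(false_positive_rate(subset), 2))
```

In this invented data the Black group's false positive rate is twice the white group's, which is the shape of the asymmetry the investigation reported: equal innocence, unequal scores.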

Where does the allure of Algocracy come from? What Algocracy offers us is an “opportunity” to absolve ourselves of moral responsibility by outsourcing it to machines, a point raised multiple times by the philosopher Evan Selinger.

Read more »