by Fabio Tollon

I take it as relatively uncontroversial that you, dear reader, experience emotions. There are times when you feel sad, happy, relieved, overjoyed, pessimistic, or hopeful. Often it is difficult to know exactly which emotion we are feeling at a particular point in time, but, for the most part, we can be fairly confident that we are experiencing some kind of emotion. Now we might ask: how do you know that others are experiencing emotions? Well, straightforwardly enough, they could tell you. But, more often than not, we read their body language, tone, and overall behaviour in order to figure out what might be going on inside their heads. So what is stopping a machine from doing all of these things? Can a robot have emotions? I’m not really convinced that this question makes sense, given the kinds of things that robots are. However, I have the sense that whether or not robots can really have emotions is independent of whether we will treat them as if they have emotions. The metaphysics, then, seems to be a bit messy, so I’m going to do something naughty and bracket it. Let’s take the as if seriously, and consider social robots.
Taking this pragmatic approach means we don’t need a refined theory of what emotions are, or of whether agents “really” have them. Instead, we can ask how likely it is that humans will attribute emotions or agency to robots. It turns out we do this all the time! Human beings seem to have a natural propensity to attribute consciousness and agency (phenomena that are often closely linked to the ability to have emotions) to entities that look and behave as if they have those properties. This tendency seems to be a product of our pattern-tracking abilities: if things behave in a certain way, we put them in a certain category, and this helps us keep track of and make sense of the world around us.
While this kind of strategy makes little sense if we are trying to explain and understand the inner workings of a system, it makes a great deal of sense if all we are interested in is trying to predict how an entity might behave or respond. Consider the famous case of bomb-defusing robots, which are modelled on stick insects.


The first full moon I saw after the procedure looked as if it might burst, like a balloon with too much helium. It was just above the horizon, fat and dark yellow, moving slowly upward to the firmament, where it would later appear smaller and take on a whiter shade of pale. I could distinguish its tranquil seas, the old familiar terrain coinciding with a long-abandoned memory.
Even before the bandage came off, the implant’s ID card seemed to confirm it: I am a camera—with a new Zeiss lens made in Jena. Jena is back in my life.


In the mid-1990s, along with journal editing, I did another job in Berkeley which was even more arduous, but also in some ways quite exciting and instructive. I was invited by the campus academic senate to serve for three years on a high-powered committee that decided on all appointments, promotions, salaries, and merit pay increases for all Berkeley faculty (then roughly 2,000 in number). This committee is called the Budget Committee in Berkeley; technically it advises the Chancellor, but the latter took our advice in 99% of cases. In the less than 1% of cases when the Chancellor did not follow our advice, the rule was that the Chancellor was obliged to meet us in a special session of the committee and explain why he or she would not do so (most often this involved some legal issue), and we had a chance to rebut their arguments.
In his book 
Sughra Raza. This Moment … late June 2022.
I know 



