Crude American Petulance

by Mark Harvey

I’m not sure what Americans were like in the 18th and 19th centuries, but they must have been a lot tougher, less whiny, less self-important, and, paradoxically, more exceptional without thinking they were exceptional than Americans of today.

Even Americans born well into the 20th century had a stoic quality and a modest sense of their own importance that seems to have been washed out of our culture. Not many WWII veterans are left, but the ones I’ve met spoke about their battles at Normandy or in the South Pacific as just something that needed to get done. The people I knew who lived through the Great Depression said it was tough but made light of their own hardships.

Much of our citizenry today resembles loud spoiled children, whining and whingeing at every inconvenience and trotting out opinions on the most complex matters—with zero formal training or any in-depth research. I fear that to other countries we look like one of those screaming toddlers having a fit in a very public place. Sort of an international cringe.

The recent meltdown over gas prices is a prime example of American petulance. Does the increase in gas prices affect the average American family? Yes, it does. Are the fits over gas prices in proportion to the issue, and do most Americans understand how gas prices are set? No and no. Read more »

As Darwinian As Apple Pie

by Mike Bendzela

Herbert W. Dow in the orchard in Standish, Maine, circa 1920.

Over thirty years ago, my then-partner-now-spouse, Don, began planting heritage apple trees on the small farm where we are tenants, in an attempt to partially restore the historical orchard of Herbert W. Dow, traditional Maine farmer and cider-imbiber.

Herbert’s original, handwritten map of the apple trees he grew out back was still in a desk in the house when Don moved here in the 1970s; so, we have been able to research and replant some of the old varieties Herbert grew. We decided we would tend the trees “organically,” or something equivalent, because, you know, toxic chemicals and all that, and we would sell rare apples to local markets.

This organic ideal did not last long, however. Little did we know what nature had in store for us.

Swiss botanist Augustin Pyramus de Candolle, whose view profoundly influenced Charles Darwin, declared that “all nature is at war, one organism with another.”

The greater choke the smaller, the longest livers replace those which last for a shorter period, the more prolific gradually make themselves masters of the ground.

The belief that having “dominion” over nature exempts the husbandman from this Darwinian fray is romanticized claptrap. Raising any crop requires a farmer to tend simultaneously to the needs of a hospital and slaughterhouse. What follows is a compendium of just a few of the horrors baked into your apple pie. Read more »

A Sky without Monarchs? Thoughts on the IUCN Endangered Species Listing

by David Greer

“If mankind were to disappear, the world would regenerate back to the rich state of equilibrium that existed 10,000 years ago. If insects were to vanish, the environment would collapse into chaos.” —E. O. Wilson

“[Rachel] Carson may have won a battle, but not the war.” —Dave Goulson, Silent Earth

Monarchs alight on every available surface once roused by the sun. Photo by David Greer

January 31, 2019

Though it’s close to noon high on a mountain about fifty miles west of Mexico City, the oyamel firs alongside the trail in the Monarch Butterfly Biosphere Reserve are covered in shadow, and something else besides. Their branches appear to sag beneath the weight of dark unmoving masses like swarms of bees multiplied many times over, so thick on the trees that the needles of the firs are entirely invisible.

As the winter sun approaches its zenith, its first rays strike the firs and warmth starts to creep into the clearings between the trees. Myriad wings, orange and black, begin to tremble. Soon afterwards, a few random butterflies disengage from the swarms and lift lazily into the air. Within minutes, the sky between the treetops is speckled with butterflies idly twisting and turning as if stretching from a long night’s sleep. Their numbers quickly swell, the fluttering of wings now loud as the patter of gently falling rain, an incongruous sensation to experience on a trail beneath a cloudless sky. Read more »

Acting Machines

by Fabio Tollon

Fritzchens Fritz / Better Images of AI / GPU shot etched 1 / CC-BY 4.0

Machines can do lots of things. Robotic arms can help make our cars, autonomous cars can drive us around, and robotic vacuums can clean our floors. In all of these cases it seems natural to think that these machines are doing something. Of course, a ‘doing’ is a kind of happening: when something is done, usually something happens, namely, an event. Brushing my teeth, going for a walk, and turning on the light are all things that I do, and when I do them, something happens (events). We might think the same thing about robotic arms, autonomous vehicles, and robotic vacuum cleaners. All these systems seem to be doing something, which then leads to an event occurring. However, in the case of humans, we often think of what we do in terms of agency: when we perform an action, things are not just happening (in a passive sense). Rather, we are acting, we are exercising our agency, we are agents. Can machines be agents? Is there something like artificial agency? Well, as with most things in philosophy, it depends.

Agency, in its human form, is usually about our mental states. It therefore seems natural to think that in order for something or other to be an agent, it should at least in principle have something like mental states (in the form of, for example, beliefs and desires). More than this, in order for an action to be properly attributable to an agent we might insist that the action they perform be caused by their mental states. Thus, we might say that for an entity to be considered an agent it should be possible to explain their behaviour by referring to their mental states. Read more »

Who Burnt Sienna?

by Carol A Westbrook

Bison from the walls of the cave of Lascaux

We live in an artificially colored world, filled with added color in our homes, our clothing, our toys, our hair and even our food! We take this plethora of colors for granted. By comparison, the natural world is bland and almost monotone, except for small patches of brightly colored flowers or birds.

Ever since prehistoric times, man has had the urge to add color to his surroundings by expressing himself in paintings. Imagine a caveman picking up a charred piece of wood from his fire and drawing a picture on the cave wall. Later he added colors made from colored clays and stones that he picked up near the cave…and so painting began. Some of the oldest of these ancient paintings were located deep in caves so carefully hidden that they lay undisturbed for millennia. Among the most famous are the paintings in the caves of Lascaux, in southwestern France, depicting over 2,000 figures of animals and men, as shown in this detail of a bison colored with red ochres.

The Lascaux cave art was created over 17,000 years ago, using paint made from local sources.

The colored clays used to paint the cave walls were iron-containing rocks and clays known as ochres. Ochres are ubiquitous, as iron is the fourth-most abundant element in the Earth’s crust. The colors of ochre stones vary from muted yellow to browns and reds; no doubt the caveman artist tried burning his colored stones and found that he could create even more colors. This is due to the dehydration and oxidation of the iron compounds within the stone, deepening the color of the iron oxide. The caveman’s paint colors, or his palette, included yellow ochre, red ochre, raw umber, raw sienna and burnt sienna. Burnt sienna? No, this is not the name of a color derived from burning the medieval town of Siena, but rather from burning a brown earth found near Siena, which when burnt gives an even darker brown. Read more »

Approaching The Language of Thought by Other Routes, Take 1: The Infraverbal Case

by David J. Lobina

In my series on Language and Thought, I defended a number of ideas; to wit, that there is such a thing as a language of thought, a conceptual representational system in which most of human cognition is carried out (or more centrally, the process of belief-fixation, or thinking tout court, where different kinds of information – visual, aural, information from memory or other held beliefs – are gathered and combined into new beliefs and thoughts); that the language of thought (LoT) is not a natural language (that is, it is not, for instance, English); that cross-linguistic differences do not result in different thinking schemes (that is, different LoTs); that inner speech (the writer’s interior monologue) is not a case of thinking per se; and that the best way to study the LoT is through the study of natural language, though from different viewpoints.

In the following two posts, I shall change tack a little bit and consider an alternative approach (as well as some alternative conclusions), at least on some rather specific cases (see endnote 1). As pointed out in the already mentioned series on the LoT, the study of human thought can be approached from many different perspectives; central among these is to study the “form” or “shape” thought has through an appropriately focused analysis of language and linguistic structure – my take – a position that raises the big issue of how language and thought relate at all.

And in considering the relationship between language and thought, it is often supposed that the rather sophisticated cognitive abilities of preverbal infants and infraverbal animal species demonstrate that some form of thought is possible without language. That is, that thought does not depend upon having a natural language, given that many non-linguistic abilities are presumably underlain by a medium other than a natural language. Read more »

Excerpt from a Work-in-Progress, Part Two

by Andrea Scrima

This past spring, I found myself sitting, masked, at a wooden desk among a scattering of scientific researchers at the Museo Galileo in Florence. Next to me was a thick reference book on the history of astronomical instruments and a smaller work on the sundials and other measuring devices built into the churches of Florence to mark the cyclical turning points of cosmic time. The gnomon of Santa Maria del Fiore, for instance, consisted of a bronzina, a small hole set into the lantern ninety meters above that acted as a camera oscura and projected an image of the sun onto the cathedral floor far below. At noon on the day of the solstice, the solar disc superimposed itself perfectly onto a round marble slab, not quite a yard in diameter, situated along the inlaid meridian. I studied the explanations of astronomical quadrants and astrolabes and the armilla equinoziale, the armillary sphere of Santa Maria Novella, made up of two conjoined iron rings mounted on the façade that told the time of day and year based on the position of their elliptical shadow, when all at once it occurred to me that I’d wanted to write about something else altogether, about a person I occasionally encountered, a phantom living somewhere inside me: the young woman who’d decided not to leave, not to move to Berlin after all, to rip up the letter of acceptance to the art academy she received all those years ago and to stay put, in New York. Alive somewhere, in some other iteration of being, was a parallel existence in an alternative universe, one of the infinite spheres of possibility in which I’d decided differently and become a different woman.

Not long before this, a friend in Graz had told me that she’d been born on American soil and so, theoretically at least, was an American citizen. She’d never lived there, however, and this was her ghost, her own parallel existence. In July of 1950, her parents had sailed from Bremerhaven to New York on the United States Army Transport W.G. Haan, a ship of displaced persons that had been reacquired by the Navy and enlisted in the Military Sea Transportation Service. Their intention was to emigrate; they’d applied for their visas, all their papers were in order, and yet they were refused entry and caught in limbo for more than a year before being sent back to Europe. My friend was born in this limbo, on Ellis Island. Read more »

Charaiveti: Journey From India To The Two Cambridges And Berkeley And Beyond, Part 55

by Pranab Bardhan

All of the articles in this series can be found here.

Among other things, the London School of Economics is associated in my mind with bringing me into contact with one of the most remarkable persons I have ever met in my life, and someone who has been a dear friend over the nearly four decades since then. This is Jean Drèze.

I think it was in the mid-1980s that Nick Stern at LSE introduced me to Jean. I have known Nick since he was a student of Jim Mirrlees at Oxford. Once, when I was teaching in Delhi, Nick and my Cambridge classmate Christopher Bliss (both of them teaching at Oxford at that time) came to ask me for suggestions about an Indian village they might pick, which they then wanted to study intensively. I remember telling them to choose a village that had been surveyed before, so that they would have some benchmark information, and directing them to the Agro-Economic Research Center of the Delhi School of Economics, which over many decades had carried out village surveys in different parts of north India. They finally chose a village, Palanpur, in western UP about 200 kilometers from Delhi, which had been surveyed by the Center. Over the last 50 years they and their team have studied this village intensively and repeatedly, which is a unique achievement at the interface of development economics and economic anthropology. Read more »

Monday, July 25, 2022

Culture and Freedom

by Martin Butler

Although by no means the only ones, two models of human beings and their relation to society are prominent in modern social and political thought. At first glance they seem incompatible, but I want to sketch them out and start to establish how they might plausibly be made to fit together.

The first one I’ll call the cultural model. It is based on the truism that human beings are the products of the particular traditions and histories of their society, and is a relativistic view since it allows for no trans-cultural standards against which cultures can be compared. The key features of this model are its historical nature and the fact that it is essentially social, in that individuals can only be fully understood with reference to the culture in which they were raised. All cultures have a history and evolve over time, and cultures only exist in groups of individuals; you can’t have your own personal culture.

To fully understand a culture it’s necessary to get to know the specific patterns of life within it, and these can’t be captured by abstractions or generalisations. You might, for example, say that culture X is Christian but that doesn’t actually tell you very much about how life is lived in that culture. This does not mean that human beings are completely malleable since we can accept that human nature exerts limits on what it’s possible for a person to be, and of course individuals within a culture can be very different. However, according to this model, culture leaves a major and indelible mark. There will, of course, be huge variation, but I think it’s important to note that typically gender and gender roles, bonds of kinship, religious belief (or lack of it) and rituals of various kinds are pivotal in shaping the identity of most cultures. The stereotypical image of a traditional culture is ethnically homogeneous with a well-established shared way of life, strong bonds of kinship and well-defined gender roles, all of which are supported by a set of widely held beliefs (usually religious) along with the general acceptance of a particular historical narrative.

The second model is the enlightenment model. The ideas that form the core of this model have been around for a long time but they found a particularly clear expression in European philosophy of the 18th century. This model gives centre stage to reason and free-will, and sees individuals and their rights as the starting point and building blocks of society, rather than the products of society. Read more »

Clever Cogs: Ants, AI, And The Slippery Idea Of Intelligence

by Jochen Szangolies

Figure 1: The Porphyrian Tree. Detail of a fresco at the Kloster Schussenried. Image credit: modified from Franz Georg Hermann, Public domain, via Wikimedia Commons.

The arbor porphyriana is a scholastic system of classification in which each individual or species is categorized by means of a sequence of differentiations, going from the most general to the specific. Based on the categories of Aristotle, it was introduced by the 3rd century CE logician Porphyry and became a huge influence on the development of medieval scholastic logic. Using its system of differentiae, humans may be classified as ‘substance, corporeal, living, sentient, rational’. Here, the lattermost term is the most specific—the most characteristic of the species. Therefore, rationality—intelligence—is the mark of the human.

However, when we encounter ‘intelligence’ in the news these days, chances are that it is used not as a quintessentially human quality, but in the context of computation—reporting on the latest spectacle of artificial intelligence, with GPT-3 writing scholarly articles about itself or DALL·E 2 producing close-to-realistic images from verbal descriptions. While this sort of headline has become familiar lately, a new word has risen in prominence at the top of articles in the relevant publications: the otherwise innocuous modifier ‘general’. Gato, a model developed by DeepMind, is, we’re told, a ‘generalist’ agent, capable of performing more than 600 distinct tasks. Indeed, according to Nando de Freitas, team lead at DeepMind, ‘the game is over’, with merely the question of scale separating current models from truly general intelligence.

There are several interrelated issues emerging from this trend. A minor one is the devaluation of intelligence as the mark of the human: just as Diogenes’ plucked chicken deflates Plato’s ‘featherless biped’, tomorrow’s AI models might force us to rethink our self-image as ‘rational animals’. But then, arguably, Twitter already accomplishes that.

Slightly more worrying is a cognitive bias in which we take the lower branches of Porphyry’s tree to entail the higher ones. Read more »

Domesticated Warfare, Continued

by Mike O’Brien

Well, it’s been two months since my last column, and I assume that most of my readers are still alive, so it’s time for a second consideration of the “war analogy” regarding our treatment of non-human animals.

I mentioned in May that Tom Regan, celebrated animal rights philosopher and activist, expressed some misgivings about the aptness and usefulness of this analogy, which compares the killing and maiming of humans in warfare to the killing and maiming (but usually killing) of animals in our economic status quo.

In several essays and interviews, Regan compared the war analogy to two other analogies employed by critics of animal agricultural industries, those of slavery and of the Holocaust. He notes that, unlike for the slave owner, it is in the hog farmer’s interest to kill his captives. This is almost correct, but not quite. In both cases, the killing that results from exploitation is incidental to the goal of extracting labour, on the one hand, and meat, on the other. It would be better for the slave owner to work his captives harshly enough to kill them, but somehow have them survive to work another day. Brazilian slave owners, having access to a steady supply of new slaves from Africa, worked their captives to death at a far higher rate than those in the antebellum United States, leading Darwin to curse the country as one of the cruellest on Earth. The economically fine-tuned system of American chattel slavery is not paradigmatic of the practice across history; this is an example of how analogies between practices as widespread and varied as war, slavery and meat-eating require some nuance, specifying which instance of practice X mirrors which instance of practice Y. Read more »

Fable of the Faggot Children

by Michael Abraham

Imagine a boy, and then call him Oliver. His eyes are olive, and this is why you will call him Oliver. Oliver is not his real name, but this is no matter. None of the names in a fable are real names. In a fable, characters are named things like Fox and Hare; they have names for reasons, to tell us what they are. Hence, we will name the boy Oliver because of his bright, olive eyes—his eyes which betray so much of the intensity of him. 

Oliver has sparks inside him. Sometimes, Oliver thinks that he has these sparks inside him because he is a faggot; he believes, now and again, that faggots know more about the world than other people. He believes in what the scholar Jack Babuscio argues in a 1978 essay titled “Camp and the Gay Sensibility,” namely in “the gay sensibility as a creative energy reflecting a consciousness that is different from the mainstream; a heightened awareness of certain human complications of feeling that springs from the fact of social oppression; in short, a perception of the world which is coloured, shaped, directed, and defined by the fact of one’s gayness.” It is very certain that Oliver’s gayness is a spark inside of him, but Oliver has another spark inside of him, a wildness, a sparkling desire for the wideness and the depth of human experience. These twin sparks will inform and shape all of Oliver’s life, and, one day, that other spark will drive him mad. But, right now, Oliver is only sixteen, and the faggot spark is much brighter than the madness spark. Right now, Oliver is desperate for someone to love, for the overwhelm of another boy’s nakedness. In Autobiography of Red, Anne Carson writes that “Up against another human being one’s own procedures take on definition,” and though Oliver has not read Anne Carson yet, and though Oliver has limited experience with being up against another human being (a couple blowjobs, a couple kisses), Oliver intuits this already; intrinsically, Oliver knows that Oliver will not know himself until he is loved by another. It is in the midst of knowing this, in the midst of being quite troubled over it, that Oliver encounters Ezra.  Read more »

Beyond Being Informed

by Rebecca Baumgartner

Photo by Adrian Swancar on Unsplash

In T.H. White’s masterpiece The Once and Future King, Merlyn’s recommendation for “see[ing] the world around you devastated by evil lunatics” is to learn something:

“There is only one thing for it then – to learn. Learn why the world wags and what wags it. That is the only thing the mind can never exhaust, never alienate, never be tortured by, never fear or distrust, and never dream of regretting.”

This summer, as I’ve watched my own country being devastated by evil lunatics, I’ve tried to take this advice to heart. Under Merlyn’s tutelage, the future King Arthur learned about his world by taking the form of various animals and seeing how they lived. We have to make do with a different form of sorcery – one that, rather than educating us about others, merely shows us a reflection of ourselves fractally repeated; one that keeps us sad and angry and has proven to be far less helpful in dealing with evil lunatics than one would hope: the internet.

When the Supreme Court released its now-infamous round of rulings towards the end of June – in a whirlwind week that felt like the answer to the question, “What if America were a monarchy?” – I obviously turned to the internet to find out what I could do.

But, as I quickly discovered, it’s hard to formulate a series of Google search terms that could possibly lead to anything helpful right now. The fact that I tried to do so anyway is a telling demonstration of how our capacity for action is mediated and diluted by the internet.  Read more »

The Age of Exclamation

by Ada Bronowski

More than an age of anxiety or anger, of indignation or self-righteousness, we live in an age of the constant high. We need not consume drugs or get inebriated to be met on a daily basis with sights and behaviours to which the only reasonable reaction is an alternation between ‘wow!’ and ‘oh my god…!’ From the still-not-yet-former British prime minister saluting the oldest of the modern parliaments with a deep-felt ‘Hasta la vista baby!’, to newspaper frontpages, to Twitter feeds, the most pervasive mode of communication in contemporary society is the exclamation. Social media experts and communication analysts of every stripe routinely provide us with statistics demonstrating the normalisation of the exclamation point as a sine qua non of everyday communication. It is the lack of one which legitimately arouses concern. The use of a full stop at the end of a sentence tends rather to indicate that something is not quite right. Whether you write a text saying: ‘I’m waiting.’, or whether you write: ‘I’m waiting!’, the recipient knows to be worried they have annoyed you in the former case, or that all is fine in the latter case, though without then assuming any particular degree of excitement. Whether someone writes in an email or text: ‘That’s perfect.’ or ‘That’s perfect!’, you sense in the former case that something is in fact not quite perfect, whereas in the latter case, you give no thought to whether anything is perfect or not. The exclamation point, rather than emphasise excitement, functions as a neutraliser, shifting the focus of a sentence away from its actual content. Read more »