Stories of Continuity

by Usha Alexander

[This is the sixteenth in a series of essays, On Climate Truth and Fiction, in which I raise questions about environmental distress, the human experience, and storytelling. All the articles in this series can be read here.]

Image of figural detail from Gwion Gwion rock art

Was it inevitable, this ongoing anthropogenic, global mass-extinction? Do mass destruction, carelessness, and hubris characterize the only way human societies know how to be in the world? It may seem true today but we know that it wasn’t always so. Early human societies in Africa—and many later ones around the world—lived without destroying their environments for long millennia. We tend to write off the vast period before modern humans left Africa as a time when “nothing much was happening” in the human story. But a great deal was actually happening: people explored, discovered, invented, and made decisions about how to live, what to eat, how to relate to each other; they observed and learned from the intricate and changing life around them. From this they fashioned sense and meaning, creative mythologies, art and humor, social institutions and traditions, tools and systems of knowledge. Yet it’s almost as though, if people aren’t busily depleting or destroying their local environments, we regard them as doing nothing.

Of course, it is a normal dynamic of evolution that one species occasionally will outcompete another and drive it toward extinction. But for most of their time on Earth, human beings were not having an outsized, annihilating effect on the life around them; rather, they were co-evolving with it. That evolution was at least as much cultural as it was a matter of flesh. What people consumed, how they obtained it, and how quickly they used it up have always played a role in the stability and evolution of any ecosystem of which humans were a part. How fast human populations grew, how they sheltered, what they trampled and destroyed or nourished and promoted—all of this affected their environment. Naturally, their interactions with the environment were always driven by what they wanted. And what people want is intimately driven by what they believe, as in the stories they tell themselves. One thing we can safely presume: those who understood their own social continuity as contingent upon that of their environment weren’t telling the same stories we tell today. Read more »



An Idea of Home

by Danielle Spencer

Mothers, Fathers, and Others, by Siri Hustvedt - book cover

The theme of home—as a topic, question—is woven throughout Siri Hustvedt’s excellent new essay collection, Mothers, Fathers, and Others. In the first essay, “Tillie,” about her grandmother, Hustvedt also recalls her grandfather, Lars Hustvedt, who was born and lived in the United States, and first traveled to his own father’s home in Voss, Norway, when he was seventy years old. “Family lore has it that he knew ‘every stone’ on the family farm, Hustveit, by heart,” Hustvedt writes. “My grandfather’s father must have been homesick, and that homesickness and the stories that accompanied the feeling must have made his son homesick for a home that wasn’t home but rather an idea of home.”

Homesick for a home that wasn’t home but rather an idea of home. Yet home is always an idea of home, even when it is indeed a home we have experienced. Lars’ memory evinces philosopher Derek Parfit’s discussion of “q-memory”—a memory of an experience which the subject didn’t experience. Rachael’s implanted “memories” in Blade Runner are q-memories (and perhaps Deckard’s are as well, at least in the Director’s Cut). According to Parfit, the notion that I am the person who experienced my memory is an assumption I make simply because I presume that I don’t have q-memories. Thus

on our definition every memory is also a q-memory. Memories are, simply, q-memories of one’s own experiences. Since this is so, we could afford now to drop the concept of memory and use in its place the wider concept q-memory. If we did, we should describe the relation between an experience and what we now call a “memory” of this experience in a way which does not presuppose that they are had by the same person.

As André Aciman puts it, “things always seem, they never are.” In his essay “Arbitrage,” he describes his own experience of being, in Hustvedt’s words, homesick for a home that wasn’t home but rather an idea of home—or perhaps, more so in his case, homesick for missing home. Read more »

The King of Pop, The Emperor’s New Clothes, and Modern Propaganda

by Akim Reinhardt

The Changing Face of Michael Jackson - Spinditty

Over the course of more than a decade, Michael Jackson transformed from a handsome young man with typical African American features into a ghostly apparition of a human being. Some of the changes were casual and common, such as straightening his hair. Others were the product of sophisticated surgical and medical procedures; his skin became several shades paler, and his face underwent major reconstruction.

As stark as the changes were, perhaps even more jarring were Jackson’s public denials. His transformation was so severe and empirical that it was as plain as, well, the nose on his face. Yet he insisted on playing out some modern-day telling of “The Emperor’s New Clothes,” either minimizing or steadfastly denying all of it. In order to explain away the changes or claim that they had never even happened, Jackson repeatedly offered up alternate versions of reality that ranged from the plausible-but-highly-unlikely to the utterly ludicrous. He blamed the skin bleaching on treatments for vitiligo, a rare skin disorder. He denied altogether the radical changes to his facial structure, claiming his cheekbones had always been that way because his family had “Indian blood.”

It was equal parts bizarre and sad. But in some ways, perhaps the most disturbing aspect of it all was those among Jackson’s loyal fans who swallowed his story whole. Despite the irrefutability of it all, they refuted it. They parroted his narratives in lockstep, repeating his claims and avidly defending the King of Pop from any questions to the contrary.

Today we face a similar situation. But it’s not about a pop star’s face lift. Ludicrous denials of reality and bizarre make-believe counter-narratives are now central to discourses about politics and the politicized pandemic. Read more »

Blood On The Snow Tonight

by Rafaël Newman

Toronto, 2022 (Photo: Mahalia Newman)

One afternoon in the 1980s, when I was at grad school at a university in the northeastern United States, I went for coffee with a slightly senior colleague. A boisterous, opinionated, well-liked Brooklyn native, she was renowned (or notorious, depending on one’s philologico-political position) for applying the latest “French theory” to ancient poetry, for her general sensitivity to the dernier cri emanating from Paris by way of New Haven, and for her reputed personal allegiance to the same polyvalent libertinage attributed to some of the most celebrated authors of classical antiquity.

At a certain point in our conversation, as I expatiated on my own aspirations in the burgeoning world of destabilized narrative and fluid identity, she leaned in close to confide in me, half conspiratorially, half shame-facedly: “Don’t tell anyone, but I’m not a lesbian.”

Her secret was safe with me; but I was shocked.

I daresay that the men with whom I made music almost thirty years later, in a band known as The NewMen, would have had a similar reaction had they heard of my publication of a text bearing on our work together: if they were thus to have their half-spoken suspicions confirmed, that the person who had been passing himself off as an emanation of postmodern subculture, and a disaffected dropout from the groves of academe, was in fact a mole; not the real thing at all, not a flesh-and-blood proponent of the idiom but a desiccated reflector upon it; not a creator but a critic; not the guarantor of folk authenticity via his full bodily and spiritual presence but a neurotic fetishizer of generic typologies—in short, a eunuch in the harem, rather than the sultan himself. Read more »

Managing Constitutional Hardball

by Varun Gauri

How should you respond to bullying? Common wisdom says fight back, or at least stand up for yourself, or else bullies — connoisseurs and lovers of power — will keep humiliating you, like Lucy snatching the football when Charlie Brown tried a kick. On the other hand, if you can’t risk a fight with the bully, or if you could win over influential bystanders, maybe you should simply go about your business, not fight back, go high when they go low.

That is the situation Democrats are in. At least since the mid 1990s, Republicans have been breaking long-accepted political norms, vilifying the opposition, and playing constitutional hardball. Gingrich Republicans bent House rules and impeached Bill Clinton. State legislators adopted extreme gerrymandering. More recently, Senate Republicans grabbed Supreme Court seats. Although Democrats have occasionally played hardball, as well, the polarization and norm-breaking have largely been asymmetric. For instance, while high-level elected Republicans subscribed to birtherism (the notion Obama wasn’t born in the United States and was an illegitimate president), there is no counterpart movement among elected Democrats. 

It is true that Democrats have fought back and played a bit of hardball themselves, now and again, but these efforts have been intermittent and half-hearted — breaking the filibuster for Supreme Court nominees but maintaining it otherwise, challenging the personal integrity of Supreme Court nominees but deferring once Justices serve on the bench. Why the partial response? Read more »

Utopian Promises And Dystopian Futures: Totalitarianism, Counter-Hegemony, And The Limits Of Democratic Education

by Eric J. Weiner

In an ever-changing, incomprehensible world the masses had reached the point where they would, at the same time, believe everything and nothing, think that everything was possible and that nothing was true… The totalitarian mass leaders based their propaganda on the correct psychological assumption that…one could make people believe the most fantastic statements one day, and trust that if the next day they were given irrefutable proof of their falsehood, they would take refuge in cynicism; instead of deserting the leaders who had lied to them, they would protest that they had known all along that the statement was a lie and would admire the leaders for their superior tactical cleverness. –Hannah Arendt

In this oft-reprinted quote from Hannah Arendt’s seminal work The Origins of Totalitarianism, many 21st-century readers, particularly those engaged in pro-democracy movements in the United States and abroad, see Donald Trump and the emergent totalitarian formation of Trumpism sewn piecemeal onto the template that she constructed whole cloth from Hitler and Stalin’s political regimes. Although Trump hasn’t yet matched the political power, penchant for violence, or historical significance of Hitler or Stalin, he has made clear his disdain for democracy and exhibits a desire and willingness to use his power and violence to undo its institutional structures. For readers who are encouraged by the emergence of Trumpism and excited by its promise to make America great again (a promise that includes inciting nationalistic pride, putting America’s interests ahead of global concerns, policing public school curricula for progressive ideological biases, packing the courts with sympathetic ideologues, and using banal procedural rules to derail the spirit of democratic negotiation and compromise), her work may provide a cautionary tale regarding the potential implications of delivering on those promises.

For its critics, Trumpism’s thrust is transparently dystopian, its trajectory violent and oppressive, its appeal to national greatness cynical, and its outcome always tragic and bloody. They have also consistently underestimated Trump’s capacity for violence and his ability, like Vladimir Putin, to “leverage inscrutability.” But for Trumpists, the ethos of totalitarianism and the future it promises are not oppressive or frightening, but empowering and liberating. Read more »

Beyond our needs: on spatial analogies for death

by Jeroen Bouterse

Image source: https://www.flickr.com/photos/dou_ble_you/7454555444

“Everyone feels it’s an unbearable thought, to be limited in time – but what if you were spatially unlimited, would that not be […] as desolate as immortality? By Zeus, no one is ever depressed because they do not physically coincide with the universe; I at least have never heard a philosopher or poet about it.”

I was reminded of this thought, expressed by a character in a novella by the Dutch writer Harry Mulisch,[1] when I read Tim Sommers’s reflections about death here on 3QuarksDaily. Sommers suggests a similar symmetry between time and space: building on the idea that we can think of ourselves as four-dimensional creatures, he wonders why we should care more about being temporally limited than about being spatially limited.

Sommers presents his case, not as a clinching argument that demonstrates that death ought to be nothing to us, but as a therapeutic move that could make it lose some of its sting, changing our perspective on it by showing it to be like something we already accept. I think that is a fair strategy, although in the end I find neither Mulisch’s (character’s) nor Sommers’s version of this symmetry argument convincing.

Mulisch’s point concerns the absurdity or inhumanity of the infinite. His analogy in this respect seems to be about our extension in space and time: the character voicing it suggests we would like to exist in more time, and we might even believe we would like to exist in all time. However, we do not have that same desire to encompass the entire universe spatially and this should give us pause about wanting to encompass it temporally. In Mulisch’s version, then, the spatial counterpart to living longer is taking up more space. What we value is being large, temporally speaking, and the question is why we do not accept the symmetry with space. Read more »

Living a purposeful life in the metaverse

by Sarah Firisen

My name is Sarah Firisen, and I’m 5ft 2 inches tall and work in software sales. But I’m also, or used to be, Bianca Zanetti, a 5ft 9, size 0 (which I’m also not) fashion designer and proprietor of a chain of stores, Fashion by B. No, I’m not bipolar. Bianca Zanetti is my Second Life avatar.

My now ex-husband and I were very into Second Life around 14 years ago. Quickly, we realized an essential truth about the virtual world: much like the real world, it gets boring very fast without some kind of purpose. Yes, there was the initial amusement of learning how Second Life worked. We learned how to change our clothes, body shape, and hair. We learned how to control our avatars so we could fly. That was fun. We met a few people and made some friends. I’ve written before about the very real friendships that I made in my virtual life. But after a few weeks, we needed more. My ex had always been very interested in ancient Rome. He found an ancient Roman sim (a themed digital plot of land), ROMA, and became involved with the community. He bought a toga and eventually became a senator. A real-life archaeologist created the sim, and it attracted a large group of ancient history buffs. They enthusiastically took on role-playing from the senate to gladiators to high priestesses at one of the temples. When I checked into ROMA for this piece, it seemed that the sim still exists and has an active community.

U.S. dollars could be traded for the in-world Linden currency, and Second Life had a vibrant economy. Virtual commerce was wide-ranging, from new hairstyles to clothes to waterslides. But if you were so inclined, everything and anything in Second Life was buildable by users. Manipulating basic shapes and adding colors and textures, it was possible to create anything that an avatar could desire. A scripting language could be attached to the objects in the same way that JavaScript can be attached to HTML elements. Back then, I was a software developer and became fascinated with building and scripting. But I needed a purpose to my building. So I started making clothes for myself and then started selling them. To do that, I required stores in which to sell my clothes and scripts to enable people to buy my outfits. Before I knew it, Fashion by B was a serious hobby. Read more »

Roll over Beethoven: Where’d classical music go?

by Bill Benzon

About a month ago Tyler Cowen posed the following question at Marginal Revolution, a blog he runs along with his colleague Alex Tabarrok: Why has classical music declined? If you do a general web search on that question you’ll see that it’s a popular topic. The ensuing discussion has had 210 remarks so far. That’s a lot, especially when you consider that Marginal Revolution centers on economics and closely allied social sciences, though Cowen does comment on the arts as well. Some responses are longish, somewhat detailed, and knowledgeable. Most are relatively brief. On the whole the quality of the discussion is high, but scattered, which is to be expected on the web.

Cowen posed the question in response to a request from one of his readers, Rahul, who had asked:

In general perception, why are there no achievements in classical music that rival a Mozart, Bach, Beethoven etc. that were created in say the last 50 years?

Cowen offered several observations of his own. Here’s the first:

The advent of musical recording favored musical forms that allow for the direct communication of personality. Mozart is mediated by sheet music, but the Rolling Stones are on record and the radio and now streaming. You actually get “Mick Jagger,” and most listeners prefer this to a bunch of quarter notes. So a lot of energy left the forms of music that are communicated through more abstract means, such as musical notation, and leapt into personality-specific musics.

Yikes! From Mozart to the Rolling Stones, that’s quite a lot of musical territory – one reason, perhaps, that the discussion was scattered.

In this piece I treat the discussion as a collection of dots. I draw lines between some of them and color in some of the shapes that emerge. Read more »

On the Road: Crunch Time

by Bill Murray

Kyiv, Ukraine

Ukraine is surrounded by 100,000-plus miserable, freezing, foot-stamping Russian soldiers who are Chekhov’s gun on the table in Act One of our new post-Cold War epic. We’ve moved from “surely he wouldn’t?” to “he’s really going to, isn’t he?” It’s the moment when Wile E. Coyote has run off the cliff but not yet begun to fall.

Two years ago Covid crowded out every thing but the most immediate, every body but family. Shocked by the viral invader’s audacity, we scrambled around in a new, unfamiliar world. Everything was frightening. We had precious little time to reflect. 

Now comes the malign intent of a real-life invader. Unlike Covid, Ukraine isn’t exactly appearing out of nowhere. Russia has been moving toward military aggression for months. The US president has had time to commit high profile gaffes about any U.S. response. Russian landing craft have moved clear around Europe from the Baltic Sea to threaten Ukraine in the Black Sea. We’ve had ample opportunity to reflect.

So far the west has performed a pretty nifty feat – defying physics. Specifically Newton’s third law, the one that says for every action there is an equal and opposite reaction. Only now, at last, comes a grudging rumble from the big American reaction machine. Read more »

Charaiveti: Journey From India To The Two Cambridges And Berkeley And Beyond, Part 29

by Pranab Bardhan

All of the articles in this series can be found here.

The Naxalite phase in Bengal was a short, tragic chapter in politics, but in Bengal’s cultural-emotional life its implications were deeper, and reflected in its literature (and films)—most poignantly yet forcefully captured by the writer Mahasweta Devi, one of Bengal’s most powerful political novelists. Again and again in the 20th century some Bengali youth have been fascinated by the romanticism of revolutionary violence–as was the case in the early decades in the freedom struggle against the British (I have earlier mentioned my maternal uncle caught in its vortex), then again in the 1940s when the sharecroppers’ movement (called tebhaga) was soon followed by a period of communist insurgency in 1948-50, and then in the Naxalite movement of the late 1960s and early 1970s.

In the early literature Tagore often engaged with this theme (something already familiar in 19th-century Russian literary imagination). By temperament and political judgment he was opposed to revolutionary violence and the unthinking passions associated with it, and yet he had a soft corner for the young people involved. This theme is dominant, for example, in his novel Char Adhyay (‘Four Chapters’), and in its preface he writes about his once-close friend Brahmabandhab Upadhyay, who, parting company with Tagore, joined the revolutionary movement. In this preface Tagore recalls the brief touching moment one evening when Upadhyay, after some years and by then a disillusioned man, came back to see him. In much of the profuse literature generated by the Naxalite period, while the repressive state is in the background, there is a pining over the wastage of the lives of so many idealistic youths for a brave social-justice cause–a cause that was in my judgment an insufficiently thought-out one. Read more »

Monday, January 24, 2022

On Academic Nastiness

by Scott F. Aikin and Robert B. Talisse

Academic journal publishing employs a system of anonymous peer review. Work is submitted anonymously to a journal, which then arranges for it to be reviewed by other experts in the field, who also remain anonymous. The reviewers compose a report that itemizes the submission’s merits and flaws, eventually recommending publication, rejection, or revision-and-resubmission. The reports are shared with the submitting academic, along with a final judgment about whether the work will be published.

Every academic has stories about how this process can go haywire. Many of these stories have to do with that one reviewer, the one who seemed hell-bent on not only misunderstanding but willfully resisting the point of an essay, the one who wrote an off-the-rails, and just nasty, rebuke of the submission. The anonymous peer review process at academic journals, it seems, encourages this kind of behavior. Not only does the reviewer not know who the author is, but the author will not know who the reviewer is. And all the intuitions shared about how anonymity on the internet produces trolls bear on temptations too many reviewers give in to.

Most journal reviewing, in the humanities at least, is done without compensation. It is a service to the profession, added on to one’s teaching, university service, and research responsibilities. And it shows up out of the blue, with a short invitation from a journal editor and maybe an abstract. It’s often onerous, and too often simply annoying. In the climate of publish or perish, many essays go out to the journals before they are ready, and in fields with fast-moving controversies, they must or else be untimely. So reviewers are faced with essays that are additions to their already heavy workloads that could have used more time. And the inclination to take one’s frustrations out on the author is just too great. Add to all of this the simple but pathological delight of punching those who cannot defend themselves or hit back. We have been on the helpless receiving end of such pummeling. Many times. Read more »

Insectophilia

by Ashutosh Jogalekar

An emerald green scarab beetle of the kind I used to collect (Image: Arizona Public Media)

From the age of eleven to the age of fifteen or so, my consummate interest in life was collecting insects and studying their behavior. In the single-minded pursuit of this activity I chose to ignore every ignominy, ranging from being chased by stray dogs and irate neighbors to enduring taunts hurled by my peers and disciplinary action meted out by teachers. Suffice it to say that I would have been the last boy to be asked on a date. The best thing was that none of this mattered in the least.

I don’t remember how it began, but I do know how it progressed. I vaguely recall a book, one of those craft books that taught kids how to build terrariums and enclosures. What I do remember well is that once the hobby took hold of my mind, it changed the way I saw the world. A new universe opened up. What might look ordinary to others – a patch of dusty brush by the side of a busy highway, the outskirts of a field where everyone else except me was playing soccer, and most notably, the hill close to our house which was a venue for vigorous workouts and hikes by seniors trying to stay fit – now teemed with insect life for me. That is what science does to your mind; it hijacks it, making you see things which everyone sees but notice things that very few do. Read more »

Do androids dream of mathematics?

by Jonathan Kujawa

Image from [0]
In a research project with Brian Boe and Dan Nakano at the University of Georgia some years ago we computed the following list of numbers [1]:

2, 3, 6.

There is one of these numbers for each pair of numbers m and n. The 2 is from when m and n both equal 1, the 3 was when m equals 2 and n equals 1, and the 6 was from when m and n both equal 2. The 6 took several days and was about the limit of what we could compute by hand.

By analogy from similar calculations in other situations, it was our belief that there should be a reasonable formula that could compute these numbers just given m and n. But that was faith and intuition, not science — and definitely not math! Despite the Law of Small Numbers, mathematicians are firm believers that coincidences rarely happen. Given data, there should be an understandable order and pattern. Certainly 2000+ years of mathematics support this optimism.

In my work with Brian and Dan, we ended up proving that the above list of numbers is given by the formula (where r is the smaller of m and n):

(m + n)r – r² + r.

If you would like to see the details, see formula (1.1.2) in this paper. We never had more than those three numbers for data, but through a combination of theoretical considerations and plain ol’ guessing, we were able to figure out the pattern.
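The formula is simple enough to check against the three hand-computed values. Here is a minimal sketch in Python (the function name is ours, chosen just for this illustration):

```python
def closed_form(m: int, n: int) -> int:
    """Evaluate (m + n)*r - r^2 + r, where r = min(m, n) --
    the pattern conjectured from the three data points."""
    r = min(m, n)
    return (m + n) * r - r**2 + r

# The three numbers computed by hand: (1,1), (2,1), and (2,2).
print(closed_form(1, 1), closed_form(2, 1), closed_form(2, 2))  # → 2 3 6
```

Three data points are, of course, far too few to pin down a formula on their own; as the text notes, it was the theoretical considerations that made this particular pattern plausible.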

In trying to solve other research questions you sometimes have the opposite problem. You can generate lots of data, even huge amounts of data, but it is hard to see the forest for the trees. Read more »

The enduring charms of anthropomorphism

by Brooks Riley

Not long ago, having steeled myself for the read-through of yet another dry but informative assessment of the body’s immune response to Covid-19 and her variant offspring, I was pleasantly surprised to find myself being dragged into a barbaric tale of murder and mayhem, full of gory details and dire strategies.

This was not a thriller, or the reenactment of a famous battle, but rather as entertaining an article about Covid-19 as one could hope to find in these dark days, couched in the rhetoric of anthropomorphism. Katherine J. Wu, a staff writer at The Atlantic, as part of her ongoing coverage of the pandemic, casts the lowly T-cell in the role of ruthless mercenary on a murderous rampage through the body on behalf of the immune system, investing him (her, they or it?) with intent to kill all viral interlopers, which is exactly what a T-cell should be doing.

Just listen to this: “When these immunological assassins happen upon a cell that’s been hijacked by a virus, their first instinct is to butcher. The killer T punches holes in the compromised cell and pumps in toxins to destroy it from the inside out. The cell shrinks and collapses; its perforated surface erupts in bubbles and boils, which slough away until little is left but fragmentary mush. The cell dies spectacularly, horrifically—but so, too, do the virus particles inside, and the killer T moves on, eager to murder again.”

Has science writing ever produced such a graphic description of a biological killing spree? Conversely, what crime writer would endow his heroes with such unflinching maleficent intent? It’s a stunning piece of writing, but it also serves a hidden purpose: The reader will not forget this diabolical sequence—or the functions of a T-cell. How a T-cell attacks a virus will burn forever in the imagination, along with other memorable entertainments. Read more »

On the Celebrity Sentence

by Nicola Sayers

We tell ourselves stories in order to live. 

I don’t know where I first came across this sentence. I was in my early twenties, so it can’t have been on Instagram, although I’ve since seen it there so many times that this is now how it appears in my mind’s eye: boxy black print, hovering in mid-square. My young notebooks are less polished: in those the sentence is scribbled over and over in messy, heartfelt handwriting, a kind of incantation to writerly promise. But there, too, it stands alone. Surrounded by white space, free-floating, as though it does its best work all by itself. 

But it wasn’t, of course, written to stand alone. It is the first sentence in Joan Didion’s iconic 1979 essay The White Album, an essay which goes on to examine exactly the moments when the stories we tell ourselves no longer work or, worse, when no stories present themselves to us at all, when we can’t make sense of any of it. It is an essay about California in the 1960s and, not unrelatedly, about her own mental health struggles (as is often the case in Didion’s writing, her state of mind is not examined as its own particular thing, but taken instead as a clue to the state of the world). It is an essay about disorder, about fragmentation, about falling apart. If, in 1976, Didion stated in ‘Why I Write’ that ‘I write entirely to find out what I’m thinking, what I’m looking at, what I see and what it means’, in her 1979 essay (parts of which had been previously published) she concludes that – at least when it comes to the events of the sixties, and her experiences of them – ‘writing has not yet helped me to see what it means.’

I did not know any of this, though, when I first came across that sentence. In truth, I didn’t even know who Joan Didion was. It was a number of years before I would come to read the essay to which the sentence belongs, and I confess that I was initially disappointed. Read more »