Amit Chaudhuri: Why I Write Novels

Amit Chaudhuri in n + 1:

The title of this talk seems to suggest that I know the answer to the “why,” and that I’m about to share it with you. I began writing my first novel in 1986, in what I elected to be my gap year: so, if I’ve been trying my hand at fiction for about thirty-four years now, I should definitely have some idea why I write novels. The truth is that the title has a misleading sound. It should have been, “Why Do I Write Novels?”, with the emphasis on the “do”: because I’ve grown increasingly, rather than less, puzzled by this part of my existence—a part that, to those who know my work from afar, may even seem definitive of my existence.

Of course, in order for me to be confident of that title, “Why I Write Novels,” I have to assume that the reader knows enough of my fiction to want to learn of its backstory and provenance. I’m not making such an assumption. What I’m hoping is that the spectacle of a person who’s published seven novels over three decades without knowing exactly why he’s chosen that genre to write in will be a matter of curiosity to others.

People have pointed out to me from the start that I have been writing about my life. I have been at pains to point out to them that I’m interested in “life,” not “my life,” and that there’s a subtle difference between my understanding of the first and the second.

More here.

Sean Carroll’s Mindscape Podcast: Erich Jarvis on Language, Birds, and People

Sean Carroll in Preposterous Universe:

Many characteristics go into making human beings special — brain size, opposable thumbs, etc. Surely one of the most important is language, and in particular the ability to learn new sounds and use them for communication. Many other species communicate through sound, but only a very few — humans, elephants, bats, cetaceans, and a handful of bird species — learn new sounds in order to do so. Erich Jarvis has been shedding enormous light on the process of vocal learning, by studying birds and comparing them to humans. He argues that there is a particular mental circuit in the brains of parrots (for example) responsible for vocal learning, and that it corresponds to similar circuits in the human brain. This has implications for the development of intelligence and other important human characteristics.

More here.

The Dangerous Idolatry of Christian Trumpism

David French in The French Press:

This is a grievous and dangerous time for American Christianity. The frenzy and the fury of the post-election period has laid bare the sheer idolatry and fanaticism of Christian Trumpism.

A significant segment of the Christian public has fallen for conspiracy theories, has mixed nationalism with the Christian gospel, has substituted a bizarre mysticism for reason and evidence, and rages in fear and anger against their political opponents—all in the name of preserving Donald Trump’s power.

As I type this newsletter, I am following along with a D.C. event called the Jericho March. Eric Metaxas, a prominent Christian radio host, former featured speaker at the National Prayer Breakfast, and the best-selling author of Bonhoeffer, is the master of ceremonies; former National Security Adviser Michael Flynn is a featured speaker. The event also includes a flyover from Marine One, the president’s helicopter.

More here.

Tuesday Poem

Racists

Vas en Afrique! Back to Africa! The butcher we used to patronize in the
….. Rue Cadet market,
beside himself, shrieked at a black man in an argument the rest of the
….. import of which I missed
but that made me anyway for three years walk an extra street to a shop
….. of definitely lower quality
until I convinced myself that probably I’d misunderstood that other thing
….. and could come back.
Today another black man stopped, asking something that again I didn’t
….. catch, and the butcher,
who at the moment was unloading his rotisserie, slipping the chickens
….. off their heavy spit,
and he answered—how get this right?—casually but accurately brandished
….. the still-hot metal,
so the other, whatever he was there for, had subtly to lean away a little,
….. so as not to flinch.

by C.K. Williams
from Selected Poems
Noonday Press, 1994

John le Carré, Dead at 89, Defined the Modern Spy Novel

Ted Scheinman in Smithsonian:

In 1947, a 16-year-old David Cornwell left the British boarding school system where he’d spent many unhappy years and ended up in Switzerland, where he studied German at the University of Bern—and caught the attention of British intelligence. As the restless child of an estranged mother and a con-man father, and a precocious student of modern languages to boot, the young wayfarer was a natural recruitment target for the security services, which scooped him up in the late 1940s to be “a teenaged errand boy of British Intelligence,” as he put it in his 2016 memoir, The Pigeon Tunnel. Over the next 15 years, those little errands would continue and grow, furnishing Cornwell with the material that would fill the whopping 25 spy novels he wrote under the pen name John le Carré. It would be true to say that he was the finest spy novelist of all time, but in fact he was one of the greatest novelists of the last century. In a blow to his millions of readers, le Carré died of pneumonia on Sunday, at the age of 89.

“I spend a lot of odd moments these days wondering what my life would have looked like if I hadn’t bolted from my public school, or if I had bolted in a different direction,” le Carré wrote in his memoir. “It strikes me now that everything that happened later in life was the consequence of that one impulsive adolescent decision to get out of England by the fastest available route and embrace the German muse as a substitute mother.”

During his parentless, wandering days in Switzerland and Germany, and indeed throughout his life, German was more than a mere second language to le Carré. He was fond of quoting the axiom, often attributed to Charlemagne, that “To possess another language is to possess another soul.”

More here.

Facebook Is a Doomsday Machine

Adrienne LaFrance in The Atlantic:

The doomsday machine was never supposed to exist. It was meant to be a thought experiment that went like this: Imagine a device built with the sole purpose of destroying all human life. Now suppose that machine is buried deep underground, but connected to a computer, which is in turn hooked up to sensors in cities and towns across the United States.

The sensors are designed to sniff out signs of the impending apocalypse—not to prevent the end of the world, but to complete it. If radiation levels suggest nuclear explosions in, say, three American cities simultaneously, the sensors notify the Doomsday Machine, which is programmed to detonate several nuclear warheads in response. At that point, there is no going back. The fission chain reaction that produces an atomic explosion is initiated enough times over to extinguish all life on Earth. There is a terrible flash of light, a great booming sound, then a sustained roar. We have a word for the scale of destruction that the Doomsday Machine would unleash: megadeath.

Nobody is pining for megadeath. But megadeath is not the only thing that makes the Doomsday Machine petrifying. The real terror is in its autonomy, this idea that it would be programmed to detect a series of environmental inputs, then to act, without human interference. “There is no chance of human intervention, control, and final decision,” wrote the military strategist Herman Kahn in his 1960 book, On Thermonuclear War, which laid out the hypothetical for a Doomsday Machine. The concept was to render nuclear war unwinnable, and therefore unthinkable.

Kahn concluded that automating the extinction of all life on Earth would be immoral. Even an infinitesimal risk of error is too great to justify the Doomsday Machine’s existence. “And even if we give up the computer and make the Doomsday Machine reliably controllable by decision makers,” Kahn wrote, “it is still not controllable enough.” No machine should be that powerful by itself—but no one person should be either.

More here.

On Philip Metres’s Poetry of War and Reconciliation

Karthik Purushothaman at The Baffler:

For much of his career, Metres has focused on American wars in the Arab world. In Shrapnel Maps, his new collection of poems from Copper Canyon Press, he shifts his terrain to Palestine-Israel. Drawing on disparate sources, including 1948 memorabilia, maps and texts from centuries earlier, and testimonies of refugees, activists, and suicide bombers, Metres orchestrates a grand conversation of voices and perspectives across three nations. The book is broken into ten sections, resembling a binder containing a war correspondent’s notes. At a climactic moment, Metres turns “shrapnel” into a verb, referring to a “shrapneled map.” The phrase evokes the image of metal invading the body politic, calling to mind these lines from the Iraq veteran writer Kevin Powers: as if “war is just us / making little pieces of metal / pass through each other.”

More here.

William Gaddis’s American Pessimism

Dustin Illingworth at The Point:

William Gaddis’s first novel, The Recognitions (1955), was initially famous for its inaccessibility. More talked about than read, the book perplexed critics with its seemingly endless allusions and erudite tangents. Despite this initial reception, however, the novel was eventually recognized as a major achievement, whose formal complexity signaled postwar fiction’s evolution beyond its vestigial modernism. Gaddis’s second novel, J R (1975), won the National Book Award, an honor he would receive again almost twenty years later, for A Frolic of His Own (1994). By the time he died, in 1998, his influence on American fiction had become pervasive, extending from the encyclopedic systems novels of the Seventies and Eighties to the more recent excesses of hysterical realism. Among the great postmodernists, only Pynchon, himself a Gaddis acolyte, comes close to exerting the same planetary attraction.

More here.

Boris Johnson’s Peculiar Game of Chicken Is About To End

by Thomas R. Wells

Rumple Johnson negotiates

Prime Minister Boris Johnson has exactly one strategy in his EU trade negotiations: threatening to drive Britain into a no-deal wall unless he gets what he wants. In other words, Johnson has been approaching this extraordinarily important matter of national interest as a peculiar version of the game of chicken. This explains much of his bizarre behaviour over the last 18 months, such as his antagonistic attitude, stubbornness, time-wasting, and even (part of) his buffoonery. Nevertheless, to be explained is not to be justified. Not only will the strategy fail, as it did before when Johnson used it in the Withdrawal Agreement negotiations. It has also foreclosed any hope for a substantive trade deal that could have fulfilled the positive aspirations of Brexiteers.

In a traditional game of chicken, two teenagers drive their cars directly towards each other at high speed, threatening comprehensive mutual destruction. The one who swerves away first is the loser, but still better off than if they had crashed. It is a game that rewards the most reckless player, and so it is no wonder that a famously irresponsible politician like Johnson would select it. He is willing to make huge bets with Britain’s national interest because he seems to see politics as just another game, and one that he gets to play with other people’s money rather than his own. The way Johnson decided to support the EU referendum back in 2016 illustrates this. He calculated that supporting Brexit gave him his best chance of winning the Prime Ministership – naturally the public interest was irrelevant to his considerations. Not unrelated to Johnson’s lack of interest in taking politics seriously, the other reason he relies so heavily on gambling as a method of government is that he is famously lazy and incompetent (in stark contrast to his Churchillian self-image). Thus, Johnson decided to play the trade negotiations as a game of chicken both because he knows that EU politicians and institutions do take the responsibility of politics seriously and will be unable to match his recklessness, and also because he knows that neither he nor the witless Brexit loyalists he has filled his government with would be capable of conducting a normal negotiation process. Read more »

Plumbing the Depths

by Raji Jayaraman

Audio version

Ancestral temple of the author’s mother.

When you say you have an ancestral temple, it sounds fancy. To be fair, some are. My mother’s ancestral temple, the Vaitheeswaran Koil, is a vast complex with five towers and hall after cavernous hall housing both worshippers and elephants. The temple is dedicated to Shiva in his incarnation as a healer. It is perched on the banks of the Cauvery river, and a dip in any one of its eighteen water tanks is said to cure all ailments. Dedicated to Mars, it is one of only nine ancient Tamil temples devoted to the planets. (Before you get excited about the prescience of the nine, I feel obliged to inform you that two of them are the sun and the moon, and another two are “Rahu” and “Ketu”, who reside somewhere between the sun and moon, causing eclipses.) It has gold-plated pillars, emerald-encrusted deities, and inscriptions dating back to at least the Chola period in the twelfth century. All very impressive.

I would bask in the reflected glory except that I don’t stem from a matrilineal religious tradition, so my ancestral temple is said to be my father’s. Our parents’ biannual visits to India always included a temple tour with a mandatory visit to the ancestral temple. The itinerary never included Vaitheeswaran Koil, and looking back, I can see why. My ancestral temple was in a village called Tholacheri, and I use the term “village” loosely. Tholacheri contained four or five mud-walled, thatched-roof structures surrounding three sides of a viscous pond. On the fourth side lay the temple compound.

The compound’s architectural ensemble comprised the temple itself, consisting of a small anteroom and alcove; a hut that served as the priest’s residence; and four or five giant-sized painted terracotta statues of Ayyanar deities. These statues were armed with machetes. They had bulbous eyes, protruding tongues, and outsized fangs. To my adult eyes they look almost comical, but as children they struck the fear of God in us, which I suppose was the whole point. Read more »

Neuroscience Shouldn’t Divorce Perception and Reality

by Joseph Shieber

There is a spate of popularizations of neuroscience promoting the idea that “reality isn’t something you perceive, it’s something you create in your mind”, that “everything we perceive is a hallucination created by the brain”, or — as one Scientific American article put it — “It is a fact of neuroscience that everything we experience is a figment of our imagination.”

These popularizations are unfortunate, because they run together at least two claims that we should distinguish from each other. The first claim is that perception isn’t merely a process in which the brain passively receives impressions from the world. Rather, it’s a process in which the brain is an active participant, testing predictions about the world against the inputs that it receives. The second claim is that, if the brain is an active participant in perception, then perception “creates reality in your mind” or is a “hallucination”.

Now, when neuroscientists make the first sort of claim, they’re speaking on the basis of their expertise as scientists. Indeed, the past thirty years have seen the rise of views in psychology and neuroscience that understand the brain as, in the words of the noted philosopher and cognitive scientist Andy Clark, a “prediction engine”.

Here’s how Lisa Feldman Barrett puts this idea in her book, How Emotions Are Made: The Secret Life of the Brain:

The discovery of simulation in the late 1990s ushered in a new era in psychology and neuroscience. What we see, hear, touch, taste, and smell are largely simulations of the world, not reactions to it. … Simulations are your brain’s guesses of what’s happening in the world. In every waking moment, you’re faced with ambiguous, noisy information from your eyes, ears, nose, and other sensory organs. Your brain uses your past experiences to construct a hypothesis—the simulation—and compares it to the cacophony arriving from your senses. In this manner, simulation lets your brain impose meaning on the noise, selecting what’s relevant and ignoring the rest. (Feldman Barrett, Chap 2)

Of course, it’s important not to downplay the significance of this development in our understanding of perception. Traditionally, philosophers drew a stark distinction between perception and inference. On this traditional view, perception is the primary route by which we acquire new information, whereas inference can only transmit information that we have already acquired. Read more »

The Dolphin and the Wasp: Rules, Reflections, and Representations

by Jochen Szangolies

Fig. 1: William Blake’s Urizen as the architect of the world.

In the beginning there was nothing, which exploded.

At least, that’s how the current state of knowledge is summarized by the great Terry Pratchett in Lords and Ladies. As far as cosmogony goes, it certainly has the virtue of succinctness. It also poses—by virtue of summarily ignoring—what William James called the ‘darkest question’ in all philosophy: the question of being, of how it is that there should be anything at all—rather than nothing.

Different cultures, at different times, have found different ways to deal with this question. Broadly speaking, there are mythological, philosophical, and scientific attempts at dealing with the puzzling fact that the world, against all odds, just is, right there. Gods have been invoked, wresting the world from sheer nothingness by force of will; necessary beings, whose nonexistence would be a contradiction, have been posited; the quantum vacuum, uncertainly fluctuating around a mean value of nothing, has been appealed to.

A repeat motif, echoing throughout mythologies separated by centuries and continents, is that of the split: that whatever progenitor of the cosmos there might have been—chaos, the void, some primordial entity—was, in whatever way, split apart to give birth to the world. In the Enūma Eliš, the Babylonian creation myth, Tiamat and Apsu existed ‘co-mingled together’, in an ‘unnamed’ state, and Marduk eventually divides Tiamat’s body, creating heaven and earth. In the Daoist tradition, the Dao first exists, featureless yet complete, before giving birth to unity, then duality, and ultimately, ‘the myriad creatures’. And of course, according to Christian belief, the world starts out void and without form (tohu wa-bohu), before God divides light from darkness.

In such myths, the creation of the world is a process of differentiation—an initial formless unity is rendered into distinct parts. This can be thought of in informational terms: information, famously, is ‘any difference that makes a difference’—thus, if creation is an act of differentiation, it is an act of bringing information into being.

In the first entry to this series, I described human thought as governed by two distinct processes: the fast, automatic, frequent, emotional, stereotypic, unconscious, neural network-like System 1, exemplified in the polymorphous octopus, and the slow, effortful, infrequent, logical, calculating, conscious, step-by-step System 2, as portrayed by the hard-shelled lobster with its grasping claws.

Conceiving of human thought in this way is, at first blush, an affront: it suggests that our highly prized reason is, in the final analysis, not the sole sovereign of the mental realm, but that it shares its dominion with an altogether darker figure, an obscure éminence grise who, we might suspect, rules from behind the scenes. But it holds great explanatory power, and in the present installment, we will see how it may shed light on James’ darkest question, by dividing nothing into something—and something else.

But first, we need a better understanding of System 2, its origin, and its characteristics. Read more »

The Theory of the Leisure Class: A Peculiar Book

by Emrys Westacott

Thorstein Veblen’s The Theory of the Leisure Class is a famous, influential, and rather peculiar book. Veblen (1857–1929) was a progressive-minded scholar who wrote about economics, social institutions, and culture. The Theory of the Leisure Class, which appeared in 1899, was the first of ten books that he published during his lifetime. It is the original source of the expression “conspicuous consumption”; it was once required reading on many graduate syllabi, and parts of it are still regularly anthologized.

The central argument of the book can be briefly summarized. In the earliest human communities, pretty much everyone contributed to securing the means of life. In this situation there was no steep or rigid social hierarchy. Cooperative traits were socially valuable and therefore generally respected, and individual property was not important. At some point, however, certain members of the group created a situation in which they didn’t need to beaver away at mundane tasks like tending crops or making clothes. These individuals would typically be the biggest, strongest, boldest, most competitive types–for example, men who were good at hunting, which made them also good at fighting against other hostile groups. Social hierarchies emerged. Women and the less able-bodied men drudged away to produce the means of life; men who possessed the necessary traits engaged only in activities such as hunting and fighting. These men became the “leisure class.”

The leisure class came to include others who did no genuinely productive work, for instance, priests and administrators. This elite came to look down on productive work, which was performed by the majority who had to do it out of necessity. Not having to do such work thus came to be a distinguishing mark of a person’s higher social standing. In smaller societies it was fairly easy to flaunt this privilege in the form of “conspicuous leisure”, but in larger, more complex societies, one exhibited one’s status by showing off one’s wealth through conspicuous consumption. Read more »

On Academic Titles, Perception, and Respect

by Robyn Repko Waller

Image by Pexels from Pixabay

Academic titles aren’t everything. But they signpost what might not otherwise be socially salient: that I, and others like me, are present here as members of this academic community.

Earlier this week the Wall Street Journal published a now widely criticized op-ed piece, imploring Dr. Jill Biden to drop her academic title from her public persona before her tenure as First Lady. Many outlets have condemned the misogynistic tone of the piece, which refers to Dr. Biden as “kiddo,” disparages her dissertation as sounding “unpromising,” and encourages her instead to focus, as if the two were mutually exclusive, on the excitement of living in the White House. Even the author’s former employer has distanced itself from his views.

Now, the author, Mr. Epstein, does provide an argument of sorts for his views, backed with anecdotal evidence. He cites his decades-long career as a university lecturer and editor of a scholarly publication, a career he has advanced without a master’s or PhD. Sometimes, he notes, students have addressed him as ‘Dr.’ He reports that he has an honorary doctorate, but he speaks poorly of the kind of individuals who typically have such honorary degrees bestowed upon them in contemporary times — wealthy donors and entertainers. Rather, he quips, “no one should call himself ‘Dr.’ unless he has delivered a child.” (An apropos male doctor reference in a misogynist piece.) Charitably (against all initial recoiling), we may read this as the view that the mere possession of a doctorate, at least of the honorary variety, does little to track the merit or quality of work or worth of the individual. Using the title, then, presumably, does not signify what it seems.

Of course, we may well agree that plenty of meritorious work, academic or otherwise, has been produced by those without a doctorate or even a bachelor’s degree. To think otherwise is to slide into the growing elitism about the “uneducated.” I was a first-generation (undergraduate) college student. Having grown up in a proud working-class family, whose character — community-mindedness, warmth, ingenuity, courage — and work ethic were second to none, I also recoil at this growing elitism. Plus I can understand the frustration about celebrity and money trumping desert for academic accolades. Few scholars, if any, have gotten nearly as much attention as Nicole Polizzi, aka Snooki, did when delivering a lecture to a university (although I’m sure it was an interesting address). More seriously, there is a severe socioeconomic barrier to entry into college in the US and elsewhere. Read more »

Writing the Virus: A New Anthology

by Andrea Scrima

An anthology I’ve edited with David Winner, titled Writing the Virus, has just been published by Outpost19 Books (San Francisco). Its authors—among them Joan Juliet Buck, Rebecca Chace, Edie Meidav, Caille Millner, Uche Nduka, Mui Poopoksakul, Roxana Robinson, Jon Roemer, Joseph Salvatore, Liesl Schillinger, Andrea Scrima, Clifford Thompson, Saskia Vogel, Matthew Vollmer, and David Dario Winner—explore the experience of lockdown, quarantine, social distancing, and the politicization of the virus from a wide variety of perspectives. The majority of the texts were written exclusively for the online literary magazine StatORec, and a keen sense of urgency prevails throughout, an understanding that the authors are chronicling something, responding to something that is changing them and the social fabric all around them.

The range of this anthology is broad: there’s a haunting story that explores the psychological dimensions of an anti-Asian hate crime with a curiously absent culprit; hallucinatory prose that gropes its way through a labyrinth of internalized fear as human encounters are measured in terms of physical distance; a piece on the uncomfortable barriers of ethnicity, civic cooperation, and racism as experienced by someone going out for what is no longer an ordinary run; and a jazz pianist who listens to what’s behind the eerie silence of the virus’s global spread. Read more »