Poem by Jim Culleny

The writer died while mixing with the rebels, these are
natural accidents of war . . . —Spanish Dictator Francisco Franco

The country has to toughen up … part of the problem is nobody
wants to hurt each other anymore, right? —US President Donald Trump

The Last Days of Federico García Lorca

Federico in pajamas and blazer died at night
wearing the sudden-death clothes of a poet killed
because there’s nothing more dangerous to despots
than an artist who tells the day’s truth simply
because some force within insists.

Accepting death for being one’s self
is life’s condition of being one’s self
because to speak is to be.

This condition applies to all in all times
because nothing ever changes the insistence of love
& witness under any sky or sun.

Although the atmosphere of eras & place swings from
heaven to hell on a dime before the head-count has time
to blink, and because the intractable who paint “Guernica”
or write “Canto Libre” or “Satanic Verses”
(artists who dare) could well end with bullet-through-skull
because, to a despot, silence is golden (long-lived or brief)
because despots know that painters and poets,
sculptors and dancers will always speak
from momentary possession
because they’ve found the straightway
to the brainsoul of human-kind,
the place despots only enter
by means of fear & blood
which always mocks
the divine

Jim Culleny, 3/7/19 rev-2/22/2025

“head-count” meaning “the people,” as during the period of the Roman Empire.

Federico García Lorca


Sunday, February 23, 2025

Four Memorable Fancies

by Nils Peterson

A Memorable Fancy I

On the last day of the year, I think about the very first day.

One early morning a Minnesota friend turned his iPhone towards his Minnesota window and we saw snow and a grove of slim, bare trees. He’d been singing, so music was in the air, and looking at the beautiful scene I remembered the song, “Morning has broken like the first morning,” and found myself wondering if this was what the first morning looked like.

We think of Eden as summer, everything in bloom, everything perfect and perpetual. A naked Adam and Eve parading around comfortably in their skin suits, with navel or without depending on the artist. But suppose the first morning was like this one in Minnesota, though the trees, unlike the ones outside my friend’s window, would not have lost their leaves – they would not yet have gotten them. Our hibernating friends, bears and moles say, would be created asleep in their caves or little hollows beneath the new trees. They’d soon awaken for the first time – and the seeds and tubers would begin to stir to their unfolding, to the finding out of their size, their shapes, their colors – what they’ll be when they grow up – fruit, flower, vegetable – the creation a child of time, not a creature of eternity.

Adam and Eve came wholly finished later. They entered time without growing into it. Maybe that was their trouble, our trouble, that separation. Also, God told them it’s better not knowing, indeed, ordered them not to know. Perhaps He/She was thinking ahead to Thomas Gray’s line, “Where ignorance is bliss, ’Tis folly to be wise,” but we chose knowing, we chose folly, marvelous folly, and have learned much, but we have not yet chosen wisdom. Read more »

Book Plate: Ed Simon Imagines Europe

by Ed Simon

Alternating with my close reading column, every even-numbered month will feature some of the novels that I’ve most recently read, including upcoming titles.

I’m a sucker for a certain type of European novel, or if not actually European, something that trades in all of those connotations of that continent, of that word. Specifically central European and eastern European settings, perhaps because of some deep ancestral affinity for that borderland between the Occident and the Orient, a place of beets, caraway seeds, and sour cream, of gnarled primeval forests and grey rivers, of craggy ominous mountains or lonely sunflower-covered steppes, of massive brutalist apartment blocks and picturesque little medieval hamlets of onion-domed churches and red-tiled roofs. For that reason, this past year I’ve continually drifted either towards novels by writers originally from the Balkans, Poland, or Russia, or towards American imaginings of that broad inchoate land bordered by the Adriatic and the Bosphorus, the Black Sea and the Baltic, along the banks of the Danube or the Volga.

Daniel Mason’s 2018 novel The Winter Soldier was a particular type of eastern European story, an epic war account from the perspective of introverted Austro-Hungarian medical student Lucius, a scion of Viennese society from noble Polish stock who is unfortunately sent as a medic to the homeland of his forefathers on the eve of the Brusilov Offensive. My introduction to Mason was this past summer, when I read North Woods, his brilliant, polyvocal, magical realist, slightly gothic, maximalist account of American history from the colonial era through to the near future, all focalized through a single western Massachusetts house in the woods, a character in its own right. The Winter Soldier, in an enviable display of Mason’s tremendous talent, is a profoundly different book.

Effectively a realist novel in the vein of a Boris Pasternak more than the Thomas Pynchon on display in North Woods, Mason’s earlier attempt is a novel of the Great War, with accounts of charging Cossacks and rationing in Vienna, of railroad stations filled with fleeing refugees and of cruel Hussar officers. There is, of course, a love story (and a mystery) as well, Lucius inevitably falling for the nun who works alongside him as a nurse, but it is heartbreakingly depicted, with sentimentality but no schmaltz. Beyond that, however, Mason has written an indelibly affecting account of medicine, as Lucius is forced to develop from a shaky student in the distant cosmopolitan capital into a frontline emergency physician treating soldiers whose minds and bodies have been blown apart. Read more »

Friday, February 21, 2025

Weird Politics and Cosmic Horror

by Christopher Hall

Cosmic horror’s fundamental lesson is that the world is not what it looks like. This thought is given particularly sharp expression in John Langan’s The Fisherman:

‘When I look at things – when I look at people – I think, None of it’s real. It’s all just a mask, like those papier-mâché masks we made for one of our school plays when I was a kid…All a mask…and the million-dollar question is, What’s underneath the mask? If I could break through the mask, if I could make a fist and punch a hole in it…what would I find? Just flesh? Or would there be something more…Maybe whoever, or whatever, is running the show isn’t so nice. Maybe he’s evil, or mad, or bored, disinterested. Maybe we’ve got everything completely wrong, everything, and if we could look through the mask, what we’d see would destroy us.

The speaker here is in grief after his entire family was killed in a traffic accident, and there is a sense that only such large dislocations can jar us out of a sense of the reality of the world around us. There is another sense, however, in which this dislocation is a fundamental condition of modernity. A person in the Middle Ages could stand on a still, firm platform and watch the universe revolve around her. It was obvious the platform was solid and still – she wasn’t moving, was she? – and from that fact many other conclusions could proceed. (This is, of course, a vast over-simplification of the medieval worldview, which, for one thing, very much did believe in non-terrestrial realities. But it remains the case that for a large part of human history the route from perception to conclusion was reasonably short.) Now, not only must we accept that we are, in fact, travelling at tremendous speeds in various directions relative to other objects, but we also do so through space that is curved, through time that slows down the faster we go, and, thanks to quantum mechanics, upon a platform where “solidity” does not mean what we expect. The winners of the Nobel Prize for Physics in 2022 won “for experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science.” The actual meaning of this is beyond most people – it certainly is beyond me – but the net result is that now one may approach the oracle Google, ask whether the universe is “locally real,” and receive the answer, “No.”

When Kamala Harris, in August of 2024, began calling Trump and his base “weird,” it resonated first and foremost through the intricate codes of behaviour MAGA has indulged itself in. “Let’s go Brandon,” wearing diapers and massive ear-bandages, and the bizarre religious fetishism for a man who is in no way Christ-like, all contributed to the idea that Trump’s supporters had become, as was and is commonly said, “disconnected from reality.” Much of this may, in fact, be derived from the online world where a good deal of Trump’s support originates; to be strange is the simplest method by which to weed out the normies. This sort of political coherence is hard to come by; even so, Trump’s response was, as is common with him, reflective in the sense that he merely threw the insult back: “They’re the weird ones. Nobody’s ever called me weird. I’m a lot of things, but weird I’m not.” “I think we’re the opposite of weird, they’re weird.” And, an increasingly long time ago, at the moment of the Sixties counterculture that gave rise to the modern Democratic party, they were weird.

So MAGA is weird, and their cultural and political opponents are weird. Are they weird in the same way? Read more »

Pocket Knife Envy

by Azadeh Amirsadri

In her poem Tavalodi Digar (Another Birth), the brave Iranian poet Foroogh Farrokhzad writes:

There is an alley
which my heart has stolen
from the streets of my childhood.

There is a village which my heart has stolen from the summers of my childhood. Every summer, my grandparents went there to escape the heat of Tehran and check on their land and property. The village is in the desert between two major ancient cities known for their hand-knit carpets and handcrafted metalwork. The village was home to five landowners who came in the summer, and about ten or so local families who lived there year-round, worked the land, and cared for their animals, mainly sheep and goats. Surrounded by mountains, this tiny dot of green in the dusty and dry landscape of the desert is where I went every summer of my childhood and stayed until my parents came to pick me and my sisters up before school started. The village was without running water or electricity and had the most fantastic night sky where you could see the Milky Way.

The village owners were all siblings and second cousins of my grandmother, who had received the land and house as a dowry from her father. The primary crops were almonds, wheat, walnuts, and fruit. After the harvest, the owners took their share of the crop every year, and the rest was distributed among the workers who toiled on the land. This was a feudal system where the crop was not distributed equally, although my grandmother always distributed hers equally. Because water was scarce, an underground water system emptied into a large pool area, which was the source of many water ownership fights between the workers and, indirectly, between the landowners. Each day, whoever’s turn it was to water their land had to allow the water to flow to their patch of land by creating small dams around the path. Sometimes, someone would ‘accidentally’ siphon some water into their fruit or vegetable patch and hope no one would notice. Read more »

Thursday, February 20, 2025

Is AI Changing the Character of War, Its Nature, or Neither of the Two?

by O. Del Fabbro

In recent public debates it has been argued that the implementation of Artificial Intelligence in weapons systems is changing the nature of war, or the character of war, or both. In what follows, my intention is to clarify these two concepts, the nature of war and the character of war. I will show that AI is a powerful technology, but that it is currently changing neither the character nor the nature of war.

Nature of War, Character of War

In order to make sense of the difference between the nature of war and the character of war, it is worthwhile to go back to the philosophy of war of the Prussian commander Carl von Clausewitz, who systematically introduced this distinction.

Let’s start with the easier one. When referring to the character of war, one speaks of the accidental and concrete conflicts that emerge in the history of mankind and that we usually point at when we talk about wars: World Wars I and II, the Napoleonic Wars, the Thirty Years’ War, the Peloponnesian War, the ongoing Israel-Hamas war and the Ukraine-Russia war. The character of war is contingent, concrete and historical.

The nature of war is theoretically more complex. Clausewitz also calls the nature of war the spirit (Geist) or concept (Begriff) of war. The nature of war is war’s essence. That is, the nature of war exists on a conceptual and abstract level; it is not war’s manifestation in reality. Clausewitz highlights three major aspects or principles of the nature of war. First, war is a duel between two parties. Clausewitz uses the image of two wrestlers trying to subdue each other by forcing their will upon one another. War is thus the physical coercion of the opponent, or his destruction. War is violent, it is filled with hatred and animosity, it is a blind natural drive. War is fought by a people or a state. Second, war is politics by other means, that is, war is not a self-sufficient system, isolated from other realms of reality. War is an instrument of politics. Third, war is like a game (actually a game of cards), that is, war is about chance and probability. War is guided by commanders, who need talent and courage in order to subdue the enemy. All in all, these three pillars of war are what Clausewitz calls the trinity of the nature of war.

It is absolutely crucial to understand that the character and the nature of war have a dialectical relationship, that is, they influence each other. That’s why one cannot talk about the character of war without mentioning the nature of war, and vice versa. In this sense, the distinction is also a heuristic tool. It helps us to understand, for example, whether there has indeed been a change in the character or the nature of war. Read more »

The Natural Selection of Books

by Christopher Horner

Nothing odd will do long. Tristram Shandy did not last – Dr. Johnson.

Where are the authors of yesterday? And where will today’s be tomorrow? Look for some principle of sorting, some logic to the winnowing process that consigns this to the bin and that to perpetual presence. You won’t find it. I wish the reliable answer were ‘quality’, but it doesn’t seem so. Nor is it popularity: plenty of best sellers are consigned to oblivion. Is there a kind of ‘natural selection’ going on? Is it just luck?

Some writers are strongly identified with a decade or so, are popular, wildly so in some cases, and then completely fade. Others survive and are still read, though sometimes only via one book – the rest of their output goes into the dark. Getting on a school or university reading list can help, or having a film version, but even that isn’t always enough. When I was a boy certain writers were ubiquitous, but they seem very dated now: Nevil Shute, John Wyndham, Paul Gallico, Lawrence Durrell. They were all set texts in their time. Does anyone think they’ll be revived? Steinbeck, though, is regularly set for students, and lives on. Not in all of his books, though: only about half a dozen of his 33 books are often read. But this is more than enough for immortality: it’s hard to imagine The Grapes of Wrath, for instance, ever going out of print: it is too clearly a very great novel for that. Or so it seems to us.

Orwell has surely been safe for ages – through just two famous books, neither of which is Keep the Aspidistra Flying. His essays seem alive too.  Ideology plays a role here: he was saying things in Animal Farm and 1984 that influential people wanted disseminated. You couldn’t get through school in Britain without being made to read him. I persist in thinking him overrated. Will he fade without the Cold War? There’s no sign of it yet.

This is all very hit or miss. Dr Johnson was famously wrong about Laurence Sterne. Yet can we imagine the novels of John Fowles, once the big thing in the 70s, lasting – can we see The Magus being read in 2525, or even next year? Even The French Lieutenant’s Woman seems irretrievable. But stranger things have happened. Read more »

Wednesday, February 19, 2025

An ingenious new treatment for schizophrenia

by Ashutosh Jogalekar

Molecular structures of xanomeline and trospium chloride. Note the positive charge (indicated by a +) on trospium chloride that restricts its actions to outside the brain.

Drugs for mental illness are notoriously hard to develop. Human biology is complex, and the brain is even more complicated. We now have a good understanding of the basic mechanisms of neurotransmission, but the drugs we have for treating disorders like depression, anxiety and psychosis are often “spray and pray” approaches, either targeting the wrong mechanisms of dysfunction or targeting too many or too few. Antidepressants often stop working. Anti-anxiety medication can do little more than sedate. And many antipsychotic drugs have prohibitive side effects.

Nevertheless, there are rare cases when genuine breakthroughs occur in the field. Thorazine famously emptied out the cruel mental asylums of the 1950s and 1960s. L-DOPA provided genuine benefits for Parkinson’s patients. And there is no denying that the new generation of antidepressants works at least occasionally for a subset of patients. Last year one such potential breakthrough seemed to fly under the radar of breathless news dominated by politics and social issues. If its promise holds up, it could herald a new kind of treatment for schizophrenia.

As is well known, schizophrenia is a serious form of psychosis that is characterized by disordered thinking, hallucinations and impaired speech and expression. The disease profoundly impacts the quality of life of afflicted individuals, including their ability to sustain social relationships and pursue professional goals. In severe cases, as made infamous by the case of Michael Laudor, even high-functioning schizophrenics can become a fatal threat to themselves or others. Estimates of the prevalence of schizophrenia in the United States range from 0.25% to 0.64%.

All drugs work by blocking or enhancing the function of proteins such as receptors. Receptors in the brain include those that regulate the function of neurotransmitters like dopamine, serotonin and norepinephrine. Neuropharmacological drug discovery starts with identifying these receptors and then discovering molecules that selectively inhibit or activate them. For instance, most antidepressants are selective serotonin reuptake inhibitors (SSRIs), which increase the concentration of serotonin by blocking its reabsorption, thereby improving a sense of well-being. Read more »

The Paradox of Happiness

by Priya Malhotra

When I think of New York City, the first image that rises to the surface isn’t its vaunted skyline, those defiant towers scraping at the heavens. It isn’t the classical grandeur of the Metropolitan Museum where civilizations whisper through marble and canvas, nor the razzle-dazzle of Broadway where melodies unfurl amidst a fever of lights and applause. No, of all the things I could remember, the image that lingers most is one of angst—dense, unrelenting and amorphous, like yellowing seepage on the walls of an old house, eating it from the inside out.

The city I left after 25 years reminds me of the insidious decay in Charlotte Perkins Gilman’s “The Yellow Wallpaper”—that creeping, spreading rash of dread and despair. It’s there in the hunched shoulders of a Wall Street banker blindsided by an unexpected layoff; in the downcast eyes of a writer crushed by his hundredth rejection letter; in the quiet sighs of a woman realizing her partner may never be the man of her dreams; in the blank stare of a man who can neither stand being with anyone nor being alone. In New York, to thrive is to strive, and anything less than relentless striving feels like failure. More isn’t just a goal; it’s a religion. The hunger for more—a leaner body, a loftier career, a flawless relationship—is the battle cry of a city that worships ambition. But the consequence of exalting “more” is a constant, gnawing sense of lack.

In New York, not having the job you dreamed of becomes intolerable. Not achieving the extraordinary marriage you envisioned becomes unbearable. Not being able to confide your deepest fears to the mother you thought you were close to becomes insufferable. Life, in New York, is measured not by abundance but by longing, not by what is present but by what is desperately desired.

I know I’m generalizing, and this doesn’t apply to everyone. But there’s a dominant cultural strain, a restless pulse in the city that beats with this energy of craving.

Meanwhile, in New Delhi, the capital city of India to which I’ve just returned, I’ve been startled to find a different rhythm altogether – slower, steadier, and far from the edge of a precipice. Here, the streets hum with chaos, the air is thick with dust and petrol, and the disparities between wealth and poverty gape wide. And yet, amidst this, I see people who seem—dare I say it?—happier. Their circumstances, when measured against any global standard of “quality of life,” are objectively harsher than those of the stressed and striving New Yorkers I left behind. But their faces, their words, their mannerisms suggest something else entirely.

Curious, I decided to dig deeper. I asked around ten people from lower-middle-income backgrounds in Delhi to rate their life satisfaction on a scale of 1 to 10. The responses astonished me. Read more »

Tuesday, February 18, 2025

The Usefulness of Useless Knowledge

by Jonathan Kujawa

The Ant and the Grasshopper (from Wikipedia)

Among our many flaws, humanity prefers to be shortsighted. We tend to set our priorities by what we see right in front of us. This makes some sense. After all, there is no point in worrying about storing food for the winter when facing down a saber-toothed tiger. But eventually winter arrives.

There is a reason Aesop told us the fable of the Ant and the Grasshopper. Children want to stay up late, eat candy for every meal, and skip anything that seems like work. As we grow up, we hopefully learn the value of investing in our future.

But just planning for the next winter is not what got us to where we are today. Hobbes said the natural life of a human was “solitary, poore, nasty, brutish, and short”. We may still sometimes be nasty brutes to each other, but most of us have long lives filled with health, wealth, and ease.

Peak aerospace on December 17, 1903.

The progress of the last century is astonishing. When my grandfather was a young boy, the Wright brothers flew at Kitty Hawk. When my parents were children, polio was a real worry [1]. When I was young, international phone calls were a dollar per minute, shopping options were limited to whatever the local stores carried, and the sum total of knowledge on a subject was contained in the encyclopedias at the library. Now we have transcontinental flights, a polio vaccine, and the internet.

How did we get from there to here? I can tell you what we didn’t do. We didn’t just solve today’s problems. We also invested in solving future problems, even problems we didn’t know existed. Even when we made a maximum effort to solve a hard problem of the day (the atomic bomb, the moon landing, the Covid-19 vaccine), we depended on the prior work of people who were interested in developing our fundamental knowledge with no immediate application in mind.

There will always be a tension between the problems of today and an investment in the future. We have pressing needs, and it can be hard to trust that research driven by curiosity is worth the cost. Especially when the research sounds nonsensical or when its utility is impossible to imagine. And it’s true that much of that research will never come to anything.

But when it pays off, it pays off big. Non-Euclidean geometry was an idle amusement until it was key to understanding spacetime and, in turn, gave us space travel and GPS. The numerology of elliptic curves was an esoteric glass bead game until it became the ubiquitous key ingredient of modern cryptography. The theory of linear algebra was developed with no idea that it would be essential to Google Search and the current AI explosion. All of mathematics could be funded on a tiny sliver of the profits generated from the discoveries of a century ago [2].

Hyperbolic geometry [3].
I’ve written similar words before at 3QD. Why am I beating this drum again? Read more »

Imagining, for Grown-ups: On Hunger

by Lei Wang

[This is part of a series on bringing magic to the everyday through imagination.]

Someone once told me the trick to fasting: take long walks. That way, your body believes you are at least on the search for food and temporarily forgets its hunger. When you’re in the mode of actually solving the problem, the problem tends to go away, as opposed to endless rumination about the problem. Then again, the person who told me this was the kind of person who could do things merely because she knew they were good for you. This was another one of those common sense yet counterintuitive things I often fail to put into practice, along the lines of how using energy somehow begets energy, while sleeping all day makes you sleepier.

I rarely fast, but I often find myself hungry and slightly irritated—I am one of those people—and wonder if I can take the trick a step further, using my imagination. Hunger already lends itself to animal metaphors—“I could eat an elephant,” “I’m a hungry hippo”—so why not take it all the way? On the sidewalk nearing noon, I become a large savannah cat chasing my wildebeest. At my desk, a squirrel savoring each nibble of a stolen cashew. Or a crocodile lying in wait by the marsh grass/microwave. I must, in fact, focus and stay still to get my lunch, and this calms me down. Hunger starts to feel less like a problem and more like a game. Bringing meaning to things: this is what humans do, isn’t it? And we can just make up the meanings.

Of course, one can also just have a granola bar. I’m not even concerned so much about true physiological hunger—this being a sphere of excess—but the psychological kind. I am interested in dealing with our evolutionary wants, our programmed desires, with creative and unpunishing ways to choose otherwise. In other words: how do we not act like dogs when it comes to the food instinct, even if (let’s face it) we probably all long for the life of a very good middle-class dog? Read more »

Monday, February 17, 2025

What is Law?

by Tim Sommers

John Austin was cursed with famous friends, among them Jeremy Bentham, Thomas Carlyle, James Mill and Mill’s son John Stuart, whom Austin tutored in the law. Cursed because, while they were all impressed by his intellect and predicted he would go far, he did not. His nervous and depressive disposition, combined with his ill-health, led to his failure as a lawyer, an academic, and a government official. In 1832, Austin wrote The Province of Jurisprudence Determined, which almost no one read and which promptly went out of print. Almost thirty years after his death, his widow published a second edition. This time, everybody read it.

Austin is considered the first positivist. Positivism is so-called because the law, on this account, is a “posit.” That is, all law is human-made, separate from morality, and identifiable as law by the details of how it came about – and (most importantly) the fact that the source of law is habitually obeyed. Positivism aspires to be an empirical approach to the law. So, Austin says laws are rules, but, empirically, they are also a species of command.

Specifically, a law is a command made to a subject, or political inferior, by a sovereign, or political superior, who is habitually obeyed and who can back the command up with a credible threat of punishment or sanction. No law without sanction. If I offer money to whoever finds my dog, even if I am the sovereign, it’s not a law.

There are problems with this approach. First of all, it seems to apply best to criminal law – and only with retrofitting to other kinds of law. As my Constitutional law professor, Paul Gowder, used to say, despite what people think, “The law does not, primarily, tell people what they can’t do. It tells them how to do what it is that they want to do. Get married. Open a business. Drive a car. Make a will.”

Secondly, in post-monarchical society, who exactly is the sovereign? Austin himself had difficulty. He was forced to describe the British “sovereign” of the time, awkwardly, as the combination of the King, the House of Lords, and all the electors of the House of Commons.

Finally, as Hart emphasized, it’s not clear, on this account, that we can make a principled distinction between the commands of the sovereign and the commands of a criminal with a gun. Read more »

Will I Still Get to Give My Elevator Pitch for the Rule of Law?

by Ken MacVey

After many years as a practicing lawyer, I remain proud of what I do. Putting aside lawyer jokes, stale references to ambulance chasing and analogies with other professions that charge by the hour, I have enjoyed doing what lawyers do and I am unapologetic about it.

Sometimes, to clients, colleagues, and classes that I teach, I give my elevator pitch on why law is essential to, in fact constitutive of, American society. It varies but goes something like this –

“We are all aware of the visible physical infrastructure we use every day that we take for granted but can’t do without. The water reservoirs, pumps and pipelines we rely on to have a glass of water or to take a shower. The electrical plants and transmission lines that light and power our homes and businesses. The sewer systems that protect our public health. The roads and freeways that can take us anywhere in the country and deliver our food and medicine and every other item of commerce we depend on.

“But there is another infrastructure you don’t see – the invisible infrastructure of law. Without law, the physical infrastructure we take for granted would not exist as we know it. For roads or sewers to be constructed and run, contracts must be entered into, funds procured by loans and bonds, and health and safety standards set by law. In fact, there is not a single thing in our lives that you care about that law does not touch. Your name and identity – a matter of law. Your home and personal property – a matter of law. Same with the money in your bank account; protection from thugs and crooks; the right to speak or vote. Your citizenship and the country you live in, the United States of America, are creatures of law. Law is what ties all of us together.”

“Law as our invisible infrastructure” is my elevator pitch as to how and why law is fundamental to what we do and who we are. And just as engineers and workers attend to one kind of infrastructure, judges and lawyers attend to another.

But now I am wondering, will I still be able to give this pitch? In the infancy of the second Trump administration, we are witnessing one of the gravest internal threats to the rule of law in American history. Read more »

Poem by Jim Culleny

“Like a bird on a wire,
like a drunk in a midnight choir
I have tried, in my way, to be free. . .”
……………………….. Leonard Cohen

Birds on Wires

Birds on a wire

The universe is synchronous,
its beauties overlap,
tunes are made of birds on wires,
Leonard sang that song for us

Nature plays its songs for us
riffs are spun of days,
melodies of suns and moons,
of particles and waves,
each to each is brought to us

All her songs belong to us
each is ours to keep,
ruthless ones of hearts on fire,
ones sublime and deep,
ones both right and wrong for us

Jim Culleny
4/27/14

Birds on a wire by Leonard Cohen


Sunday, February 16, 2025

Into the Abyss with Benjamin Robert Haydon

by Steve Szilagyi

Haydon paints himself as Marcus Curtius. Very cool in its weird way.

June 1846 was, at the time, the hottest month ever recorded in London. For 22 sweltering days, temperatures soared to between 85 and 105 degrees Fahrenheit. The city’s literary luminaries—Elizabeth Barrett, Robert Browning, Alfred Lord Tennyson, and Thomas Carlyle among them—mopped their brows and grumbled about the oppressive heat like common mortals. Meanwhile, down in Piccadilly, sweating crowds lined up by the thousands outside Egyptian Hall. They had come to see P.T. Barnum’s latest sensation, the celebrated little person Tom Thumb. Barnum’s show was the talk of the town—that is, until June 22, when the heat and gossip took a backseat to shocking news. Benjamin Robert Haydon, a painter, writer, and lecturer known to them all, had been found dead in his studio, the victim of his own tragic hand.

Haydon was an extraordinary figure—brilliant, ambitious, and doomed. His life was a tale of grand aspirations, bad luck, and worse decisions. Stricken by a mysterious illness at age six, he suffered bouts of blindness for much of his life. Nonetheless, he pursued the vocation of art with fanatical zeal. Unlike many of his contemporaries who earned comfortable livings painting portraits or landscapes, Haydon devoted himself to “historical painting,” creating enormous canvases that depicted grand scenes from history and the Bible. These were not modest works; 10 by 15 feet was a typical size. But despite his herculean efforts, he rarely sold these massive paintings. The problem? They were too big to hang anywhere—and, by general consensus, they weren’t very good.

Yet Haydon’s story endures because of his remarkable personality and relentless pursuit of a hopeless dream. He inspired at least two excellent modern books: Paul O’Keeffe’s magisterial biography, A Genius for Failure, and Althea Hayter’s A Sultry Month: Scenes of London Literary Life in 1846. Haydon fascinates not because he was a great painter—he wasn’t—but because of his peculiar idealism, boundless energy, and talent for making all the wrong enemies. Read more »

Keep Your Eyes on the Donut, Not the Hole

by Mindy Clegg

This lovely portrait is by John Dervishi and was found on DeviantArt under a Creative Commons license.

“Well,” Lynch said, “imagine if you did find a book of riddles, and you could start unraveling them, but they were really complicated. Mysteries would become apparent and thrill you. We all find this book of riddles and it’s just what’s going on. And you can figure them out. The problem is, you figure them out inside yourself, and even if you told somebody, they wouldn’t believe you or understand it in the same way you do.” (from Variety by Chris Morris)

The director, musician, and artist David Lynch recently left us. I count myself as a fan. Twin Peaks, which aired when I was a young teen, was a revelation. Even today, few shows or films lay bare the utter weirdness and contradictions of American society in quite the same way. Lynch took the “normal” and showed its ugly face to the world. The mystery of Laura Palmer pulled everyone in, but it was the examination of small town life in America that appealed to many. His stories were full of contradictions, of how being normal can sometimes be a cover for bad intentions, while the social misfits are often the least morally compromised. As a kid who never got the knack of “fitting in,” I found it easy to see these contradictions, and so the elevation of the small town misfits resonated. Celebrities die all the time, and people who never met them often express sadness. We don’t know these people, but many of us feel like we do. But there are some who represent a stratum of American society that often gets marginalized or stigmatized. Those deaths feel like real losses in a way that others do not. Lynch was one of those people, representing those who choose a life on the margins, despite being of the privileged class (white, middle class, etc.). Lynch shared an important corrective to American exceptionalism: dark American weirdness.

Lynch offered up an important counter-narrative to the often celebratory version of Americana found in films and TV. Think of the way Steven Spielberg often invokes the imagery of small town America in his works. Lynch also loved the all-American small town, but he was drawn into the contradictions between the polite veneer and the often brutal reality. Rather than tales of typical Americans overcoming obstacles, he revealed how often aiming for that ideal and failing left people battered, bruised, even dead. In Twin Peaks, Laura Palmer was a homecoming queen from a well-respected, influential, middle-class family in the Pacific Northwest. But her body washed up on the banks of a rushing river, wrapped in a plastic sheet. What began as a tense “whodunit” proved to be an examination of the false veneer of small towns. The most aware and morally upright characters were not the city leaders, the “normal” men and women of Twin Peaks. Rather, many of them were participating in the exploitation of teenage girls or at least ignoring that exploitation. Read more »

Where Did All the Fiction in Fiction Go?

by Daniel Shotkin

The 2025 Oscar nominees for Best Picture

As a senior in high school, I’ve spent four consecutive years in English classes, during which my teachers have hammered home the idea that fiction follows a set progression—Chaucer, Shakespeare, the Enlightenment, Romantics, Transcendentalism, Victorian literature, realism, existentialism, modernism, and, today, postmodernism. If each past period has been clearly defined—Romantics love the sublime, Modernists reject tradition—then the period we’re in now feels a lot less certain. Today’s stories are diverse, both in genre and style. Still, there’s one trend in modern fiction that has caught my untrained eye: a lack of creativity.

Take a look at the Oscar nominees for Best Picture of 2025, and you’ll find a wide variety of stories: a black comedy about a New York stripper marrying a Russian mafia heir, a drama about a Hungarian architect emigrating to the United States, and a Bob Dylan biopic. Despite the diversity in characters, settings, stories, and genres, there’s still one aspect the Academy has failed to account for—fiction.

Of the ten nominees, only three are original works. A Complete Unknown, The Brutalist, I’m Still Here, and Nickel Boys are, for all intents and purposes, biopics. Conclave, Dune: Part Two, and Wicked are adaptations of existing works. The only fully original films recognized by the Academy are Anora, Emilia Pérez, and The Substance. While it’s not my place to dictate what happens behind the scenes of film nominations, this disregard for imaginative fiction isn’t unique to filmmaking.

Flip through an issue of The New Yorker from the past five years, and you’ll find op-eds, long-reads about niche subjects, and, almost always, a short story. Fiction has been central to The New Yorker since its inception, yet, for a magazine with such literary weight, too much of the fiction featured is, to put it mildly, dull. Read more »