Read, Write, Upload, Repeat: An Interview with Steve Donoghue

by Eric Bies

There was a time when a Google search for the name returned images of and information about a world-class jockey, an Englishman born the same year Mark Twain published The Adventures of Huckleberry Finn. Lately, the results of the same query tend toward the man of our time, the subject of this interview. Call it a correction: Steve Donoghue the Boston book critic, Steve Donoghue the editor, Steve Donoghue the YouTuber.

His bylines regularly straddle Books columns at venues large (The Christian Science Monitor, The Washington Post, The National) and small (Big Canoe News, The Bedford Times Press). He is a co-founder of Open Letters Review (where his annual end-of-year Best and Worst Books wrap-ups shouldn’t be missed). His work has been selected to appear at this very site. He reads faster and probably writes faster than you and me combined, and the proof is in the literary pudding: literally thousands of book reviews, articles, and essays to his name. “Prodigious industry” does not begin to tell the story; it’s his unconventional YouTube presence that registers a note of head-scratching astonishment.

For starters, not a single one of his videos has come close to going viral. His second most popular upload, “The Only Sure-Fire Way to Deal with Book-Mildew”—a parody of book restoration guides, instructing anxious owners of moldy tomes to shower the offending objects with water, then throw them away—boasts just 18,000 views. (And that’s an outlier: the typical Donoghue upload tends to clock in at around 850.) His subscriber count, by most measures modest, weighs in at 13,600. And yet stacked against this figure is the rather remarkable channel-wide tally of 5.7 million video views. Rain or shine, that number climbs at a steady rate of 20,000 new views per week. Read more »

Hitchhiker’s at 42

by Steven Gimbel and Gwydion Suilebhan

This year marks the 42nd anniversary of the American release of The Hitchhiker’s Guide to the Galaxy. Douglas Adams’ “five-book trilogy,” of which Hitchhiker’s was the first installment, led readers through a melancholy universe in which bureaucracy is the ultimate source of evil and shallow, self-serving incompetents are the galaxy’s greatest villains. The best-selling series helped shape the worldview of Generation X, capturing the nihilistic cynicism of the Thatcher/Reagan 1980s.

Adams’ seminal work isn’t run-of-the-mill genre fiction in which heroes engage in a good-versus-evil battle, triumphing through bravery and cunning. (That’s the stuff you find in the Harry Potter series, the Millennial Generation’s literary equivalent.) In Hitchhiker’s, our hapless main character, Arthur Dent, is stuck in a world in which his good intentions fail miserably, wasted on an impenetrable system too big to succeed. His traveling companion Marvin, a robot with a brain the size of a planet, sees the truth, and the truth causes incurable depression.

To document the broader cultural impact of Hitchhiker’s, we’ve asked a number of public figures in science, the arts, the humanities, and government to reflect on how the book changed their own understanding of life, the universe, and everything. Read more »

Monday, November 21, 2022

The Gendered Ape, Essay 9: Are Alpha Males Of Any Use?

Editor’s Note: Frans de Waal’s new book, Different: Gender Through the Eyes of a Primatologist, has generated some controversy and misunderstanding. He will address these issues in a series of short essays which will be published at 3QD and can all be seen in one place here. More comments on these essays can also be seen at Frans de Waal’s Facebook page.

by Frans de Waal

The typical chimpanzee alpha male is calculating, assertive, and sometimes violent in order to keep his position vis-à-vis male rivals. But he is also protective and generous towards others, keeping order and protecting the underdog. If he is good at guaranteeing group harmony, he becomes quite popular, loved even.

In my study of power among chimpanzees, I was inspired by Niccolò Machiavelli’s “The Prince,” a book from half a millennium ago, which famously declared that “it’s better to be feared than loved if you cannot be both.”

But let’s not forget that for a leader it’s better to be respected than feared. Respect is easily combined with love. Fear is, in fact, for the untalented ruler, the one who needs to beat everyone over the head to get them to do what he wants. It’s for bullies and despots.

Nonetheless, visit any business-book section and you’ll find a plethora of how-to books on alpha males that perpetuate the notion that they thrive on fear. Here are two recent titles:

  • Dominic Mann (2017). “How to Be an Alpha Male, Dominate in Both the Boardroom and Bedroom, and Live the Life of a Complete Badass.”
  • Jack Landry (2015). “Alpha Male Bible: Become Legendary, A Lion Amongst Sheep.”

These books glorify the alpha concept, borrowed from wolf and primate research, with little mention of the skills that set a good alpha apart, such as generosity, impartiality, and shielding of the underdog. We’re presented with a cardboard version of leadership.

I find this all the more galling given the role my book “Chimpanzee Politics” (1982) has played in the alpha concept’s popularity. My book drew the attention of U.S. Speaker of the House Newt Gingrich, who put it on the reading list of Members of Congress. Since then, the “alpha male” label has gained currency in Washington, DC, for politicians who dominate and intimidate. Read more »

Monday Poem

In Books

when words make love sentences are born
the world’s heft is altered by the weight of nouns;
a pause of hyphens and commas, like the space between breaths 
tells the rhythm of what’s new and what’s been;
dead stops of periods spell the end of what a breath holds;
adjectives, like the blood blush of infants, color clauses;
articles wrap things in skin; pronouns,
unlike the particular names of beings, sometimes
identify the generalities of their forms by inclusion,
by saying, “We,” suggesting that mine and thine are one;
verbs are the darting eyes of life, the spastic gestures of infants,
the random smiles that pass in their faces suddenly uncalled for,
and of course the cautious steps of the old reaching for footholds
that once came naturally without thought,
too soon after the preface, amid
hints of epilogues
.
Jim Culleny
12/1/16

Give me monotony!

by Charlie Huenemann

“Monotonizing existence, so that it won’t be monotonous. Making daily life anodyne, so that the littlest thing will amuse.” —Bernardo Soares (Fernando Pessoa), The Book of Disquiet, translated by Richard Zenith, section 171

Senhor Soares goes on to explain that in his job as assistant bookkeeper in the city of Lisbon, when he finds himself “between two ledger entries,” he has visions of escaping, visiting the grand promenades of impossible parks, meeting resplendent kings, and traveling over non-existent landscapes. He doesn’t mind his monotonous job, so long as he has the occasional moment to indulge in his daydreams. And the value for him in these daydreams is that they are not real. If they were real, they would not belong to him. They would belong to others as public resources, and not reside in his own private realm. And what is more, if they were real, then what would he have left to dream? Far better, he thinks, “to have Vasques my boss than the kings of my dreams.” It’s more than that he doesn’t mind his monotonous job. On the contrary: the more monotonous his existence, the better his dreams.

This is, of course, mere escapism from the crappy life he’s stuck with. His attempt to justify his monotonous existence by saying that it allows for better daydreams is as see-through as an 8-dollar verification program. He’s just coating his own unremarkable existence in cheap veneer. Soares, one might judge, should have the courage to make his life really better, to find something worth doing, worth taking pride in, and something of some value to others. He should dare to live dangerously. Maybe he could start a book club. There’s nothing wrong with daydreams, okay, but they should serve only as an occasion for a busy person to “recharge” and then return with greater focus to an active, productive life.

But as one gets older and realizes that most of life’s good stuff is contained between two ledger entries, one sees that if it weren’t for dreams, for stories and for art, for inventing personas and writing books through their hands and eyes, life would be insufferable. This is because our brains are too big. We are overpowered for the tasks modern life assigns us, and if we narrowed our focus to just what’s actually before us, we would find ourselves on the road with Estragon and Vladimir, surveying a bleak Beckettian stage, haunted by a vague sense that wasn’t there supposed to be something more, someone showing up who would make a difference? Read more »

The Enduring Allure of Jerry Fodor

by David J. Lobina

‘It should be by now common knowledge that the “cognitive revolution” that gripped the fields of psychology and philosophy in the 1950s and 60s originated in Cambridge, Massachusetts, where a particular intellectual milieu was then forming around Noam Chomsky.’ Or so I started an article of mine a few years ago (it was never published, though a butchered version ended up here).

Fine, I continued, Chomsky might have been primus inter pares in this sphere, but what about the philosopher Jerry Fodor, perhaps second-best among this group of cognitive scientists, as they were to be known from then on? What about Fodor indeed. In the rest of the piece, I tried to explain why Fodor’s contributions may be more enduring in the long run. That was on the occasion of Fodor having been honoured with a festschrift of sorts at the time, a book that contains a piece of my own, in fact.

I want to take a slightly different approach here – a more personal one, in a way – as I would like to make a point born from the perception that Fodor is often too easily (or too quickly) dismissed. This might sound rather counterintuitive to some, and it is certainly not the case that Fodor’s ideas haven’t received plenty of attention ever since he started publishing in the 1960s or so, including outside academia, most notably in the London Review of Books (Darwin was said to have got something wrong once). No-one would deny he has been a central figure in cognitive science. But it is also true that the attitudes of some scholars towards Fodor’s work, and towards him in fact, have sometimes been rather cavalier. Read more »

Privacy as a Common Good in the Age of Big Data

by Josie Roux and Fabio Tollon

Do we need to rethink the role (or conception) of privacy in a highly digitised world? The widespread collection of online user data has generated substantial interest in the various ways in which our right to privacy has been violated. Additionally, worries about our privacy being undermined are also linked to the coercive or manipulative power that digital technologies have over our lives. The concern, then, is that the widespread gathering and use of massive amounts of private information by Big Data barons might undermine individual autonomy. Moreover, if we consider that citizen autonomy is a crucial element of democracy, it becomes clear that the privacy invasions of widespread data collection pose a problem that goes beyond their effect on individual users.

Here we would like to suggest that this situation demands that we reassess the way we value privacy in liberal democracies. Traditionally, privacy has been valued as an individual good; it is valued instrumentally for the individual goods it protects, such as intimacy, creativity, self-expression, and personhood. In general, privacy is viewed as a right afforded to individuals that protects them from incursions from society. However, if we value privacy for its essential role in the protection of democracy, then it becomes clear that privacy is not only important for individuals but for society as a whole, and is not just an individual good but a common good. Read more »

Musings on Exile, Immigrants, Pre-Unification Berlin, Trauma, Naturalization, and a Native Tongue

by Andrea Scrima

I moved to Berlin in 1984, but have rarely written about my experiences living in a foreign country; now that I think about it, it occurs to me that I lived here as though in exile those first few years, or rather as though I’d been banished, as though it hadn’t been my own free will to leave New York. It’s difficult to speak of the time before the Wall fell without falling into cliché—difficult to talk about the perception non-Germans had of the city, for decades, because in spite of the fascination Berlin inspired, it was steeped in the memory of industrialized murder and lingering fear and provoked a loathing that was, for some, quite visceral. Most of my earliest friends were foreigners, like myself; our fathers had served in World War II and were uncomfortable that their children had wound up in former enemy territory, but my Israeli and other Jewish friends had done the unthinkable: they’d moved to the land that had nearly extinguished them, learned to speak in the harsh consonants of the dreaded language, and betrayed their family and its unspeakable sufferings, or so their parents claimed. We were drawn to the stark reality of a walled-in, heavily guarded political enclave, long before the reunited German capital became an international magnet for start-ups and so-called creatives. We were the generation that had to justify itself for being here. It was hard not to be haunted by the city’s past, not to wonder how much of the human insanity that had taken place here was somehow embedded in the soil—or if place is a thing entirely indifferent to us, the Earth entirely indifferent to the blood spilled on its battlegrounds. Read more »

Reclaiming the American Narrative

by Mark Harvey

“It is certain, in any case, that ignorance, allied with power, is the most ferocious enemy justice can have.” —James Baldwin

The election a couple of weeks ago came as a relief to many of us. It was not a feeling of happily getting back on track again but rather a sense of relief that we hadn’t entirely lost our democracy to shrill lunatics intent on building a bargain-bin version of American fascism. The Republican Party today is unrecognizable even to rock-ribbed Republicans. When someone from the Cheney family threatens to leave the party for its cowardice and extremism, you know you’re dealing with a party that has completely lost its way.

A Republican used to be someone like Dwight Eisenhower, a moderate who worked well with the opposing party, even meeting weekly with their leadership in the Senate and House. Eisenhower expanded social security benefits and, against the more right-wing elements of his party, appointed Earl Warren to be the Chief Justice of the Supreme Court. Warren, you’ll remember, wrote the majority opinions in Brown v Board of Education, Miranda v Arizona, and Loving v Virginia. If Dwight Eisenhower were alive today, he would be branded a RINO and a communist by his own party. I suspect he would register as unaffiliated. Read more »

As Goes Ohio

by Mike Bendzela

Look on my works, ye Mighty, and despair!
Nothing beside remains.
—From “Ozymandias,” Percy Bysshe Shelley

The railroad crossing in the old oil town of Dowling, Ohio, along which my great grandmother, Blanche Thompson, picked blossoms for her homemade dandelion wine.

Prologue

An investigation into the livelihoods of two great-great grandfathers, both oilfield workers in Ohio, has of necessity become a study in the nature of forgetting.

I have sought one thing–my ancestral grandfathers’ involvement in the history of oil production in Northwest Ohio–only to have it slip through my fingers. In the process I have found something else, a great grandmother both besotted and besieged by the men in her life, someone whom I can scarcely look away from. With the help of my brother’s research and my mother’s endless stories, I will try to draw Blanche Thompson’s tale out of the dust of an extinct oil town.

Part One: The Oil Pumpers

The seas come and go, mountains come and go, lands come and go, and so on. What nature builds up, nature takes away. . . . [1]

While researching the various, mysterious, entangled threads of our family’s history, my brother Ben found a document–an utterly banal one, a census report–that became a rabbit hole down which my imagination would disappear. Read more »

Fear of Flying

by Deanna Kreisel (Doctor Waffle Blog)

It’s not about dying, really—it’s about knowing you’re about to die. Not in the abstract way that we haphazardly confront our own mortality as we reach middle age and contemplate getting old. And not even in the way (I imagine) that someone with a terminal diagnosis might think about death—sooner than expected and no longer theoretical. It’s much more immediate than that.

Whenever I teach logical reasoning to my students, I start with a classic syllogism to illustrate deduction: All humans are mortal; Socrates is a human; therefore Socrates is mortal. For an example of inductive reasoning I ask them to think about the major premise of the syllogism: All humans are mortal. How do we know this statement is true? The only reason we assume that anyone currently alive is mortal (including ourselves) is that a very large number of people have died before us. We have no proof.

But if you’re in an airplane hurtling toward the earth, my guess is that such airy sophistries fly right up to the ceiling along with the beverage carts. Suddenly an incurable cancer diagnosis might seem kind of warm and cozy in comparison: you would have some time to get more used to the idea, say your goodbyes, rewrite your will to indulge your current spites. People in a crashing airplane might just have time to clutch an arm or an armrest, gabble a hasty prayer, perhaps make a quick phone call to leave an unerasable message on a loved one’s voicemail.

And that’s what I’m really afraid of: those 60-to-600 seconds. Read more »

The Mysterious Origin of Corn

by Carol A Westbrook

Modern corn (maize)

The new research technician walked into my lab at the University of Chicago, and I introduced her to my research group.

“I enjoyed the walk from home to the lab,” she added. “Everyone in Hyde Park is so friendly! Why, just today I stopped to talk to a gardener. He proceeded to tell me about the corn plants he was cultivating. He showed me how he pollinates the plants by hand, and he began to discuss the complicated genetics of corn. Honestly, Hyde Park is a very impressive place to live. Even the gardeners are highly educated!”

Everyone laughed. “Looks like you came across George Beadle,” someone said. “He’s a Nobel laureate and former president of the University. He’s now retired and doing the research that he always wanted to do: determining the origin of the corn plant using genetics. He likes nothing more than to discuss his theories with anyone who walks by, and he spends his days working in his beloved corn fields.”

This is a true story. George Beadle won the Nobel prize in 1958 for his genetic work on the mold Neurospora, which led to the “one gene-one enzyme” theory, a true breakthrough in understanding the function of DNA. But Nobel prize or no Nobel prize, Beadle’s real passion was the work he had started as a graduate student at Cornell University: solving the mystery of corn’s origin. Read more »

Monday, November 14, 2022

The Gendered Ape, Essay 8: Every Mammal Owns A Clitoris!

Editor’s Note: Frans de Waal’s new book, Different: Gender Through the Eyes of a Primatologist, has generated some controversy and misunderstanding. He will address these issues in a series of short essays which will be published at 3QD and can all be seen in one place here. More comments on these essays can also be seen at Frans de Waal’s Facebook page.

by Frans de Waal

Bonobos have sex in all positions and all partner combinations. Here an adult female carries another, who clings to her, for GG (genito-genital) rubbing. Note the role of eye contact, which is actively sought and maintained during sex, and the facial expressions.

Sigmund Freud – a man with little anatomical expertise and no vagina – invented the vaginal orgasm. Considering it superior to the clitoral orgasm, he dismissed the latter as something for children. Women who enjoyed it were stuck at an infantile stage, ripe for psychiatric treatment.

It was Freud’s way of saying that female orgasm without male penetration doesn’t count.

Anatomists, however, have been unable to find the nerve endings in the muscular vaginal wall required for pleasure, while the clitoris has an abundance. Even though the clitoris is a marvel of engineering, there was a time when evolutionary biologists dismissed it as a by-product that wasn’t any more functional than the male nipple. Stephen Jay Gould declared the clitoris a “glorious accident.”

The medical community, too, still acts as if this little organ hardly matters even though the clitoris has a higher density of sensory points and nerve endings than the penis. Read more »

Beware, Proceed with Caution

by Martin Butler

Going with the evidence is one of the defining principles of the modern mind. Science leads the way on this, but the principle has been applied more generally. Thus, enlightened public policy should be based on research and statistics rather than emotion, prejudice or blind tradition. After all, it’s only rational to base our decisions on the observable evidence, whether in our individual lives or more generally. And yet I would argue that evidence, ironically, indicates that in some respects at least we are far too wedded to this principle.

How is it that the dawning of the age of reason, which saw science and technology become preeminent in western culture, coincided with the industrial revolution, which is turning out to be disastrous for the natural world and, quite possibly, humanity too? Our rationalism seems to have created something which is, in retrospect at least, deeply irrational.

When the industrial revolution was moving through the gears from the end of the 18th century and throughout the 19th century, there was little clear evidence of the environmental disasters awaiting us. Of course, many voices revolted against the tide, but these voices were almost exclusively based on emotion and romanticism rather than hard scientific evidence. Why shouldn’t we do something if there is no evidence of harm? And offending our sensibilities does not constitute harm. The colossal productive power unleashed by the industrial revolution promised many benefits, and the dark satanic mills were just the price we had to pay for wealth and progress. No matter how ugly the change might appear at first, breaking with the past was surely just the inevitable tide of history. Looking back, however, it seems the romantics, the traditionalists and even the Luddites had a point.

So, where and why did it all go wrong? Science only finds something if it is looking for it, and it never just looks for evidence. It frames research questions in a particular way and uses particular methods to investigate. All this rests on assumptions which are not themselves questioned. Read more »

Monday Poem

Narragansett Evening Walk to Base Library

Two young men greeted a new crew member on a ship’s quarterdeck 60 years ago and, in a matter of weeks, by simple challenge, introduced this then-18-year-old, who’d never really read a book through, to the lives that can be found in them. —Thank you, Anthony Gaeta and Edmund Budde, for your life-altering input.

bay to my right (my rite of road and sea:
I hold to its shoulder, I sail, I walk the line)

the bay moved as I moved, but in retrograde
as if the way I moved had something to do
with the way the black bay moved, how it tracked,
how it perfectly matched my pace, but
slipping behind, opposed, relative
(Albert would have a formula or two
to spin about this if he were here)
behind too, over shoulder, my steel grey ship at pier,
transfigured in cloud of cool white light,
a spray from lamps on tall poles ashore
and aboard from lamps on mast and yards
among needles of antennae which gleamed
above its raked stack in electric cloud enmeshed
in photon aura, its edges feathered into night,
enveloped as it lay upon the shimmering skin of bay

from here, she’s as still as the thought from which she came:
upheld steel on water arrayed in light, heavy as weight,
sheer as a bubble, line of pier behind etched clean,
keen as a horizon knife

library ahead, behind
a ship at night

the bay to my right (as I said) slid dark
at the confluence of all nights,
the lights of low barracks and high offices
of the base ahead all aimed west, skipped off bay
each of its trillion tribulations jittering at lightspeed
fractured by bay’s breeze-moiled black surface
in splintered sight

ahead the books I aimed to read,
books I’d come to love since Tony & Ed
in the generosity of their own fresh enlightenment
had teamed to bring new tools to this greenhorn’s
stymied brain to spring its self-locked latch
to let some fresh air in crisp as this breeze
blowing ‘cross the bay from where to everywhere,
troubling Narragansett from then
to me here now

Jim Culleny
12/16/19

Hyperintelligence: Art, AI, and the Limits of Cognition

by Jochen Szangolies

Deep Blue, at the Computer History Museum in California. Image Credit: James the photographer, CC BY 2.0, via Wikimedia Commons

On May 11, 1997, the chess computer Deep Blue dealt then-world chess champion Garry Kasparov a decisive defeat, marking the first time a computer system defeated the top human chess player in a tournament setting. Shortly afterwards, AI chess superiority firmly established, humanity abandoned the game of chess as having now become pointless. Nowadays, with chess engines on regular home PCs easily outsmarting the best humans ever to play the game, chess has been relegated to a mere historical curiosity and an obscure benchmark for computational supremacy over feeble human minds.

Except, of course, that’s not what happened. Human interest in chess has not appreciably waned, despite having had to cede the top spot to silicon-based number-crunchers (and the alleged introduction of novel backdoors to cheating). This echoes a pattern clearly visible throughout the history of technological development: faster modes of transportation—by car, or even on horseback—have not eliminated human competitive racing; the fact that great cranes effortlessly raise tonnes of weight does not keep us from competitively lifting mere hundreds of kilos; the invention of photography has not kept humans from drawing realistic likenesses.

Why, then, worry about AI art? What we value, it seems, is not performance as such, but specifically human performance. We are interested in humans racing or playing each other, even in the face of superior non-human agencies. Should we not expect the same pattern to continue: AI creates art equal to or exceeding that of its human progenitors, to nobody’s great interest? Read more »