A New Biography of Goethe

Dwight Garner at the New York Times:

At what point does an aside become a tangent, a tangent a digression, a digression a meander, a meander a ramble, a ramble a circumlocution, a circumlocution an excursus and an excursus a cul-de-sac? The reader has time to consider such matters while reading A.N. Wilson’s elastic-waisted but hardly unintelligent new biography, “Goethe: His Faustian Life.”

Wilson is a prolific English biographer (of Darwin, Tolstoy, Milton and Queen Victoria, among others) and novelist whose books are usually worth attending to. Especially recommended is his bittersweet memoir “Confessions: A Life of Failed Promises,” from 2022. Two of his daughters, the classicist Emily Wilson and the food writer Bee Wilson, are inimitable writers as well. Does the world need another biography of the German polymath Johann Wolfgang von Goethe (1749-1832), the author of the novels “The Sorrows of Young Werther” and “Elective Affinities” as well as “Faust,” his masterpiece, a tragic play in two parts?

More here.

Enjoying the content on 3QD? Help keep us going by donating now.

Netflix’s One Hundred Years of Solitude Does the Impossible

Imogen West-Knights in Slate:

One Hundred Years of Solitude has a near-mythical status for me that no other book does. Aged about 14, bored one day during the summer holidays, I found the Picador 1978 paperback edition on my parents’ bookshelf. I opened it on a whim, and read one of the most iconic first sentences in existence: “Many years later, as he faced the firing squad, Colonel Aureliano Buendía was to remember that distant afternoon when his father took him to discover ice.” I immediately sat down on the sofa and read for a further three hours. I date my life as a reader of literature to that afternoon, to that first sentence which I still know by heart. I have since reread it only once, 10 years later, because I wanted to wait until I had forgotten what happens. I’ll read it again as soon as the details have once more faded from my memory, and I can’t wait.

If you’ve not read it (and I appreciate that this is one of the most famous books in the world, but just in case), Gabriel García Márquez’s 1967 novel follows six generations of a sprawling family in the fictional Colombian town of Macondo. I read it before I knew what magical realism was, the genre García Márquez became a figurehead for, and it blew my head off. How could this be? It was so compellingly strange. Along the way, babies are born with pig tails, a single trail of blood makes its way all the way across town to announce someone’s death, a rainstorm lasts almost five years, someone literally ascends to heaven, ghosts and spirits abound.

More here.

What Does It Mean to Be In the ‘Post-Genomic’ Age?

C. Brandon Ogbunu in Undark Magazine:

As 2025 approaches, we can expect the silver anniversary announcements on the completion of a draft of the human genome to be on their way. Many of the people who were involved are still alive and well known. Because of this, we will likely hear reflections from an ensemble cast of characters associated with the 2000 announcement, and those whose more contemporary work is linked to the study of genomes: J. Craig Venter, Francis Collins, Jennifer Doudna, and others. We have entered what can be called a “post-genomic” age, where the biological sciences build on our understanding, developed over the past quarter-century, moving us towards the next generation of discoveries in various subfields of biology.

What work is “post” doing in “post-genomic?” One dictionary definition offers that “post” can be used as “a prefix, meaning ‘behind,’ ‘after,’ ‘later,’ ‘subsequent to,’ ‘posterior to.’” Its use in “post-genomic” does not indicate a world without genomics, but rather a scientific world where we take genomics for granted and it is no longer the bottleneck in understanding biological systems at the molecular level.

More here.

Thursday, December 26, 2024

How Money Problems and a Bad Manager Tore the Beatles Apart

Allan Kozinn and Adrian Sinclair at Literary Hub:

“Looking back on it, we did used to say, it’s like a divorce,” Paul McCartney reflected on the Beatles’ breakup, now a marathon heading into its fifth year. “It really was like that, but four fellas trying to divorce instead of a man and a woman. And then you get four sets of lawyers instead of just two. All of that kind of stuff was not making life easy at all.” At the moment, the lawyers were not the problem.

As Paul, Linda and their three daughters—Heather (11), Mary (4), and Stella (2)—were enjoying some downtime at High Park in Campbeltown between recording sessions in Stockport, the attorneys representing each of the former Beatles convened in New York on Monday, February 11, 1974—the tenth anniversary of the Beatles’ American debut concert in Washington, D.C.—to hammer out an agreement dissolving the Beatles’ partnership.

This had been Paul’s goal since early 1970, shortly after John Lennon announced to his bandmates that he was leaving the group. It was John, in fact, who first used the word divorce, likening his split from the Beatles—and the liberation he felt in declaring it—to his divorce from his first wife, Cynthia.

McCartney had not wanted to divorce the Beatles. He wanted to divorce Allen Klein, the brash New York manager whom John brought in to manage the Beatles and their company, Apple, which was losing money alarmingly by early 1969.

More here.

How Hallucinatory A.I. Helps Science Dream Up Big Breakthroughs

William J. Broad in the New York Times:

Artificial intelligence often gets criticized because it makes up information that appears to be factual, known as hallucinations. The plausible fakes have roiled not only chatbot sessions but lawsuits and medical records. For a time last year, a patently false claim from a new Google chatbot helped drive down the company’s market value by an estimated $100 billion.

In the universe of science, however, innovators are finding that A.I. hallucinations can be remarkably useful. The smart machines, it turns out, are dreaming up riots of unrealities that help scientists track cancer, design drugs, invent medical devices, uncover weather phenomena and even win the Nobel Prize.

“The public thinks it’s all bad,” said Amy McGovern, a computer scientist who directs a federal A.I. institute. “But it’s actually giving scientists new ideas. It’s giving them the chance to explore ideas they might not have thought about otherwise.”

More here.

Kafka’s Screwball Tragedy: Investigations of a Philosophical Dog

Aaron Schuster at the MIT Press Reader:

Written toward the end of Franz Kafka’s life, “Investigations of a Dog” is one of the lesser-known and most enigmatic works in the author’s oeuvre. Kafka didn’t give the story a title, writing it in the autumn of 1922 but leaving it unpublished and unfinished. It was published posthumously in 1931 in a collection edited by his friend and biographer Max Brod, who named it Forschungen eines Hundes — which could also be translated as “Researches of a Dog,” to give it a more academic ring.

The name Kafka is popularly associated with the horrors of a grotesquely impenetrable legal system, but there is another aspect to his work, which concerns knowledge. “Investigations of a Dog” presents a brilliant and sometimes hilarious parody of the world of knowledge production, what the French psychoanalyst Jacques Lacan called “the university discourse.” And the contemporary academy might easily be qualified as Kafkaesque, with its nonsensical rankings and evaluations, market-driven imperatives, and exploding administrative ranks.

More here.

Ancient Women Philosophers

Emily Hulme at Notre Dame Philosophical Reviews:

This is an important book. It gathers the fruit of recent research on women in ancient philosophy, and across twelve chapters (all very well-written) offers the reader food for thought on a huge range of topics. We meet scientists and Cyrenaics, Epicurean sex workers and neo-Platonist mathematicians. Two chapters take us outside the standard story of Mediterranean antiquity, and we learn much from these perspective-shifting chapters. For example, Ban Zhao stands out as what, in the Greek tradition, we’d call a phronima: a woman who uses her practical wisdom to write philosophical texts even against the backdrop of difficult cultural headwinds that would seem to make a woman in philosophy an impossibility. While readers might be tempted to skip to the chapters that serve their current research interests, I would strongly encourage taking a look at the whole book: it is more than a sum of its parts, because the recurring themes look different precisely when they’re understood to be recurring.

The introduction will be required reading for anyone working on this topic, and gives judicious coverage of the relevant issues (e.g., why does it matter that these figures are women? Are they doing philosophy in a particularly “womanly” way? How should we manage studying women who are more often the subject of men’s writings than authors themselves?).

More here.

Figures of the Fool: From the Middle Ages to the Romantics

Colin Jones at Literary Review:

We tend to view the figure of the medieval and Renaissance fool or jester through 19th-century filters – to think of Victor Hugo’s Quasimodo and Verdi’s Rigoletto. Perhaps the show’s most striking achievement is to transcend these anachronistic and wistful habits and to reveal how much more complicated – and fascinating – the world of the fool has been.

The exhibition’s organisers, Elisabeth Antoine-König and Pierre-Yves Le Pogam, have chosen not to approach the subject through the anachronistic prism of mental illness. As they observe, the category of ‘natural fool’ in earlier times was often held to include the mentally or physically impaired (it could encompass cultural outsiders as well). They place the exhibition’s emphasis on ‘artificial fools’ – those who assumed a fool’s identity. Playing the fool might be done with passionate sincerity – by the likes of St Francis of Assisi, self-declaredly ‘God’s fool’ – or with sardonic insouciance and comic and sometimes malevolent intent.

Court jesters, masters of the art of playing the fool, are at the heart of the show.

More here.

The Rise of Post-Literate History

Matthew Walther in Compact:

The English historian J.A. Froude was famously gloomy about the ultimate prospects for his chosen branch of literature. “To be entirely just in our estimate of other ages is not difficult,” he said. “It is impossible.” Froude’s words came to mind the other day when I encountered Tucker Carlson’s interview with the podcaster Darryl Cooper, whose opinions about World War II may politely be described as “controversial.”

No summary could do justice to the parade of oversimplification, decontextualized pseudo-astonishment, one-sided gotcha-ism, casuistry, and moral lassitude on display in the conversation. But moral preening shouldn’t be our response to what Cooper is doing when he calls Churchill “the chief villain of the Second World War” or blames his alleged aggression on the influence of unnamed “financiers.” Outrage will only feed Cooper’s self-conception as a Promethean figure, carrying his benighted listeners out of the darkness to which they have been consigned into the pure light of historical knowledge.

The Cooper imbroglio is symptomatic of a larger problem: the epistemic gulf between the current consensus—however broadly defined—of practicing historians on any given subject and the attitudes of the ordinary person of general education. This holds true, as far as I can tell, across all subject areas.

More here.

The Largest Whole-genome Sequencing Study in Cancer

Danielle Gerhard in The Scientist:

There is no single genetic blueprint for cancer. Instead, each individual cancer draws on a collection of acquired mutations that endow the cells with a selective advantage and superior immune evasion and proliferation tactics. Thanks to next-generation sequencing technologies, many patients diagnosed with a particular cancer can discover whether their tumors harbor specific mutations that render them more susceptible to particular therapies. However, targeted approaches fail to capture the full suite of alterations and biomarkers nestled in the complex genetic architecture of a patient’s tumor, potentially obscuring the best available treatment plan for an individual patient.

In a study published in Nature Medicine, researchers developed a bioinformatics pipeline for integrating whole-genome sequencing (WGS) data from 13,880 tumors with matching patient clinical data.1 The large-scale study revealed somatic and germline DNA mutations that affect prognosis, highlighting the potential influence of comprehensive cancer genomics on patient outcomes.

More here.

Wednesday, December 25, 2024

The Ultimate Best Books of 2024 List

Emily Temple at Literary Hub:

I now present to you the Ultimate List, otherwise known as the List of Lists—in which I read all the Best Of lists and count which books are recommended most.

In all the time I’ve done this, I’ve never seen a book run away with this list like Percival Everett’s James has this year: it was recommended a full dozen more times than the next most popular books. (It also won the National Book Award and the Kirkus Award, and was a finalist for the Booker—and with good reason.)

Overall, this year, I processed 69 lists from 39 outlets, which collectively recommended more than 1,200 individual books (RIP my spreadsheet). As always, these are probably not all the lists, but all universes have to end sometime. Anyway, 90 of those books made it onto 5 or more lists, and I have collated these for you here, in descending order of frequency.

33 lists:

Percival Everett, James

21 lists:

Kaveh Akbar, Martyr!
Miranda July, All Fours

20 lists:

Sally Rooney, Intermezzo

More here.

An AI system has reached human level on a test for ‘general intelligence’. Here’s what that means

Michael Timothy Bennett & Elija Perrier in The Conversation:

On December 20, OpenAI’s o3 system scored 85% on the ARC-AGI benchmark, well above the previous AI best score of 55% and on par with the average human score. It also scored well on a very difficult mathematics test.

Creating artificial general intelligence, or AGI, is the stated goal of all the major AI research labs. At first glance, OpenAI appears to have at least made a significant step towards this goal.

While scepticism remains, many AI researchers and developers feel something just changed. For many, the prospect of AGI now seems more real, urgent and closer than anticipated. Are they right?

To understand what the o3 result means, you need to understand what the ARC-AGI test is all about. In technical terms, it’s a test of an AI system’s “sample efficiency” in adapting to something new – how many examples of a novel situation the system needs to see to figure out how it works.

More here.

James Wynbrandt’s Excruciating History of Dentistry

Chelsea Follett at Human Progress:

The writer P. J. O’Rourke famously quipped, “When you think of the good old days, think one word: dentistry.” So let us take his advice. James Wynbrandt’s The Excruciating History of Dentistry: Toothsome Tales & Oral Oddities from Babylon to Braces provides plenty to chew on. As the New York Times Book Review put it, “Wynbrandt has clearly done his homework.”

Our ancestors’ teeth were in an appalling state. As Wynbrandt points out while quoting the Old Testament, calling a woman’s teeth white as sheep and noting that none were missing once counted as high praise worthy of a love poem. After all, healthy teeth were far rarer in the past than today. The first mass-produced bristle toothbrush did not appear until around 1780 in England during that country’s industrialization. Our preindustrial forebears had only a primitive understanding of what was causing their teeth to rot, fall out, and constantly ache.

More here.

What Makes Fish Fast?

George Lauder in Harvard Magazine:

Sharks present an interesting case study because unlike other fast swimmers such as tuna or swordfish, which have smooth skin surfaces, shark skin is rough—covered in teeth-like structures called denticles. Although ichthyologists have known for decades that denticles likely hold the key to a shark’s ability to quickly and efficiently move through the water, how they contribute to speed remained a mystery.

Lauder’s laboratory ran a series of experiments last summer that used small pieces of shark skin to explore how they interact with water. Samples were placed in tanks that move water over the skin at a known rate. The researchers added particles to the water so they could see with microscopic imaging systems how it flowed over the denticles. When they analyzed the resulting data to calculate the friction at the interface between skin and water, they found that the water flow created small fluid vortices that reduced drag. Lauder compares the phenomenon to the dimples in a golf ball, which enable golfers to hit the ball about twice as far as a completely smooth ball. “You would think that you would want to be as smooth as possible to be most effective at moving through a dense fluid like water,” he says, “but actually you don’t want to be as smooth as possible. That roughness really matters for efficiency.” Lauder and his colleagues are now using their findings to “print” artificial denticles with properties similar to shark skin that could one day coat the surface of underwater robots to help make them more efficient swimmers.

More here.

Everything to Remember About Squid Game Before Watching Season 2

Kayti Burt in Time Magazine:

It’s been three long years since Squid Game first premiered on Netflix to become the streamer’s most-watched TV series ever. The Korean-language drama was an unexpected cultural phenomenon, drawing one in four Americans into its tale of deadly competition. That being said, you would be forgiven for not remembering all of what happened in Season 1 ahead of the second season, which begins streaming on Dec. 26. For those who are a bit fuzzy on where we left protagonist Gi-hun (Lee Jung-jae), who the Front Man (Lee Byung-hun) is, and the basic rules of the titular game, here is a recap with everything you need to know heading into Season 2.

What to remember about the protagonist Gi-hun, aka Player 456

While Squid Game is an ensemble drama, the story thus far has centered around Seong Gi-hun, aka Player 456. When we meet Gi-hun in Season 1, he is an affable deadbeat dad who has to borrow money from his elderly mom to take his daughter out for her birthday. While he struggles with a gambling addiction and the debt he has accumulated as a result, he is a good guy who tries his best to be there for his mom, daughter, and friends—but he often fails. When Gi-hun is approached by The Recruiter (The Trunk’s Gong Yoo) to take part in a game with the chance to win billions of won, he sees it as an opportunity to finally set his life right.

More here.

What’s Our Age Again?

The Editors at n+1:

When exactly did we stop hearing the word postmodernism? Fredric Jameson’s death this fall at the mighty age of 90 left us wondering. (The question reemerged a few weeks later with the reelection of Donald Trump, who for all his ominous contemporaneity is trailed by a permanent miasma of the tacky, made-for-TV ’80s.) The subtitle of Jameson’s Postmodernism, or, The Cultural Logic of Late Capitalism has enjoyed an enduring popularity into the 21st century, with academic treatises on this or that form-and-content dialectic recycling the formula like they’re trying to solve the 1980s landfill crisis.* But the headline term now has the status of relic, an outgrown holdover from a wavier time. Art critics we know report avoiding postmodern as a descriptor unless writing in the past tense; academics abandoned it years ago somewhere on the battlefields of the theory wars. In literature and film, one encounters rather less metafiction and upcycled kitsch than might be expected in our AI- and IP-laden times. One recent exception, Francis Ford Coppola’s brashly pomo production Megalopolis, confirmed our suspicion that only architects still embrace the P-word: architects, and a smattering of far-right YouTubers, who decry “postmodernism” as a gateway drug to low-IQ wokeness, immorality, and (only sort of wrongly) Marxism.

More here.
