Casey Cep in The New Yorker:
Who was the monkiest monk of them all? One candidate is Simeon Stylites, who lived alone atop a pillar near Aleppo for at least thirty-five years. Another is Macarius of Alexandria, who pursued his spiritual disciplines for twenty days straight without sleeping. He was perhaps outdone by Caluppa, who never stopped praying, even when snakes filled his cave, slithering under his feet and falling from the ceiling. And then there’s Pachomius, who not only managed to maintain his focus on God while living with other monks but also ignored the demons that paraded about his room like soldiers, rattled his walls like an earthquake, and then, in a last-ditch effort to distract him, turned into scantily clad women. Not that women were only distractions. They, too, could have formidable attention spans—like the virgin Sarah, who lived next to a river for sixty years without ever looking at it.
These all-stars of attention are just a few of the monks who populate Jamie Kreiner’s new book, “The Wandering Mind: What Medieval Monks Tell Us About Distraction” (Liveright). More specifically, they are the exceptions: most of their brethren, like most of us, were terrible at paying attention.
Jennifer Szalai in The New York Times:
Among the resources that have been plundered by modern technology, the ruins of our attention have commanded a lot of attention. We can’t focus anymore. Getting any “deep work” done requires formidable willpower or a broken modem. Reading has degenerated into skimming and scrolling. The only real way out is to adopt a meditation practice and cultivate a monkish existence. But in fact, a life of prayer and seclusion has never meant a life without distraction. As Jamie Kreiner puts it in her new book, “The Wandering Mind,” the monks of late antiquity and the early Middle Ages (around A.D. 300 to 900) struggled mightily with attention. Connecting one’s mind to God was no easy task. The goal was “clearsighted calm above the chaos,” Kreiner writes. John of Dalyatha, an eighth-century monk who lived in what is now northern Iraq, lamented in a letter to his brother, “All I do is eat, sleep, drink and be negligent.”
Kreiner, a historian at the University of Georgia, organizes the book around the various sources of distraction that a Christian monk had to face, from “the world” to the smaller “community,” all the way down to “memory” and the “mind.” Abandoning the familiar and profane was only the first step in what would turn out to be an unrelenting process — though as Kreiner explains, many monks continued to reside at home, committing themselves to lives of renunciation and prayer. For the monks who did leave, there were any number of possibilities beyond the confines of a monastery, which could pose its own distractions. Caves and deserts were obvious alternatives. Macedonius “the Pit” was partial to holes in the ground. Frange dwelled in a pharaoh’s tomb. Simeon, a “stylite,” lived on top of a pillar.
Bill Gates in his own blog:
The United States has made remarkable progress over the last two years toward a future where every home is powered by clean energy. Thanks in part to historic federal investments, we’re on a path to use more clean electricity sources than ever before—including wind, solar, nuclear, and geothermal energy—which would reduce household costs, cut pollution, and diversify our energy supply so we’re not dependent on any one thing.
But to take advantage of this opportunity, we need to first bring our grid into the 21st century. (This is an issue in other places around the world, too, but I’m going to focus on the U.S. here.) The way we move electricity around in this country just isn’t designed to meet modern energy needs.
Elizabeth Gibney in Nature:
The Moon doesn’t currently have an independent time. Each lunar mission uses its own timescale that is linked, through its handlers on Earth, to coordinated universal time, or UTC — the standard against which the planet’s clocks are set. But this method is relatively imprecise and spacecraft exploring the Moon don’t synchronize the time with each other. The approach works when the Moon hosts a handful of independent missions, but it will be a problem when there are multiple craft working together. Space agencies will also want to track them using satellite navigation, which relies on precise timing signals.
It’s not obvious what form a universal lunar time would take.
Katrina Goldstone at the Dublin Review of Books:
When Joseph Roth died in penury in Paris in 1939, he left little behind. No trace of his collection of penknives, watches, walking canes and clothes bought for him by Stefan Zweig. The knives had been gathered to protect himself from both imagined and very real enemies. Soon after war’s end, his cousin visited one of his translators, Blanche Gidon, and was presented with “an old worn out coupe case”. Within it was a treasure trove — some manuscripts, never published in his lifetime, books and letters. Throughout the long dark night of Nazi occupation Gidon had kept it hidden under the bed of the concierge. There have been other custodians of Roth’s reputation along the way: Hermann Kesten, a friend, and Roth’s translator, Michael Hofmann. Yet his literary significance was often ignored. Roth had been an early and vocal critic of Hitlerism. His masterpiece, The Radetzky March, had been among the first books committed for incendiary destruction when the Nazis came to power. Yet, as this magisterial biography by Kieron Pim shows, the phrase “man of many contradictions” is scarcely fit for purpose when trying to grapple with the complex contrarian Moses Joseph Roth. If Joseph Roth hadn’t been born, he’d have been invented, as a picaresque character in a novel probably by an impoverished disillusioned Mitteleuropa writer fleeing from Nazi Germany for his life.
Quinn Moreland at Pitchfork:
The most ambitious moment on Rat Saw God arrives via the eight-and-a-half-minute opus “Bull Believer.” While Hartzman prefers to keep the specifics private, it’s a song about death and desperation told through images of blood, roadside monuments, and a bull with one foot in the grave.
Hartzman knew that she wanted the track to conclude with a guttural outcry, but she was also a little frightened by what might emerge from her body when she opened her mouth. “I was too self-conscious to try in practice,” she explains. “I imagined driving into the middle of nowhere and trying to scream, but even then I was worried someone would hear me and be worried.” At the studio, while her bandmates played Tetris downstairs, Hartzman channeled the agony of the past until it erupted out of her in a torrent of screams, nervous laughter, and a single phrase pulled from Mortal Kombat: “Finish him!”
Kaiser and Brainard in Science:
Physiologist Alejandro Caicedo of the University of Miami Miller School of Medicine is preparing a grant proposal to the U.S. National Institutes of Health (NIH). He is feeling unusually stressed because of a new requirement that takes effect today. Along with his research idea, to study why islet cells in the pancreas stop making insulin in people with diabetes, he will be required to submit a plan for managing the data the project produces and sharing them in public repositories.
For his lab, that’s a daunting task. Unlike neuroscience or genomics, Caicedo’s field has no common platforms or standards for storing and sharing the kinds of data his lab generates, such as videos of pancreatic islet cells responding to a glucose stimulus. The “humongous” raw imaging files are currently stored in an on-campus database, notes Julia Panzer, a postdoctoral researcher in the lab. To protect patient privacy, the database is secured and not designed to provide access to outsiders. Sharing the data will mean uploading them somewhere else.
Pranab Bardhan in Project Syndicate:
I think the century will probably not belong to China or India – or any country, for that matter. Chinese achievements in the last few decades have been phenomenal, but it is now experiencing a palpable – and expected – slowdown. And while international financial media have been hyping the arrival of “India’s moment,” a cold look at the facts suggests that such assessments are premature at best.
A key reason for this is demographic. Yes, India has most likely already surpassed China as the world’s largest country by population, and its youth bulge is substantial. But far from delivering a “demographic dividend,” India’s relatively young population may turn out to be a liability, as the country struggles to generate sufficient productive employment.
From Delancey Place:
“Marcel Duchamp (1887-1968), the younger brother of the sculptor Duchamp-Villon, was perhaps the most stimulating intellectual to be concerned with the visual arts in the twentieth century — ironic, witty and penetrating. He was also a born anarchist. Like his brother, he began (after some exploratory years in various current styles) with a dynamic Futurist version of Cubism, of which his painting Nude Descending a Staircase, No. 2 is the best known example. It caused a scandal at the famous Armory Show of modern art in New York in 1913. Duchamp’s ready-mades are everyday manufactured objects converted into works of art simply by the artist’s act of choosing them. Duchamp did nothing to them except present them for contemplation as ‘art’. They represent in many ways the most iconoclastic gesture that any artist has ever made — a gesture of total rejection and revolt against accepted artistic canons. For by reducing the creative act simply to one of choice ‘ready-mades’ discredit the ‘work of art’ and the taste, skill, craftsmanship — not to mention any higher artistic values — that it traditionally embodies. Duchamp insisted again and again that his ‘choice of these ready-mades was never dictated by an aesthetic delectation. The choice was based on a reaction of visual indifference, with at the same time a total absence of good or bad taste, in fact a complete anaesthesia.’”
Main Street: Tilton, New Hampshire
I waited in the car while he
went into the small old-fashioned grocery
for a wedge of cheddar.
Late summer, Friday afternoon.
A mother and child walked past
trading mock blows
with paper bags full of — what —
maybe new clothes for school.
They turned the corner by the Laundromat,
and finally even the heel
of the girl’s rubber flip-flop
passed from sight.
Across the street a blue pickup, noisy,
with some kind of homemade wooden
scaffolding in the bed pulled
close to the curb. A man got out
and entered the bank . . .
                                   A woman sat
in the cab, dabbing her face
with tissue. She might have been weeping,
but it was hot and still,
and maybe she wasn’t weeping at all.
Through time and space we came
to Main Street—three days before
Labor Day, 1984, 4:47 in the afternoon;
and then that moment passed, displaced
by others equally equivocal.
by Jane Kenyon
Graywolf Press, 1997
Ted Gioia at The Honest Broker:
Sigrid Undset was an unlikely literary star. Modernist themes were in the ascendancy in those days, but she wanted to write medieval romances. So at night, after work, she researched her subject, studying the sagas, old ballads, and chronicles of the Middle Ages.
You might say she lived in the past. Or in a fantasy world of her own creation. But Undset wanted to deal with these kinds of tales in a brutally realistic manner. She started writing stories where, in her own words, “everything that seems romantic from here—murder, violent episodes, etc. becomes ordinary—comes to life.” Her first novel was rejected by publishers. But her second book got noticed—and caused a scandal. The opening line announced: “I have been unfaithful to my husband.”
More here. (via The Book Haven)
Do we really need copyediting? I don’t mean the basic clean-up that reverses typos, reinstates skipped words, and otherwise ensures that spelling and punctuation marks are as an author intends. Such copyediting makes an unintentionally “messy” manuscript easier to read, sure.
But the argument that texts ought to read “easily” slips too readily into justification for insisting a text working outside dominant Englishes better reflect the English of a dominant-culture reader—the kind of reader who might mirror the majority of those at the helm of the publishing industry, but not the kind of reader who reflects a potential readership (or writership) at large.
Alexandra Alter and Elizabeth A. Harris in the New York Times:
In “Victory City,” a new novel by Salman Rushdie, a gifted storyteller and poet creates a new civilization through the sheer power of her imagination. Blessed by a goddess, she lives nearly 240 years, long enough to witness the rise and fall of her empire in southern India, but her lasting legacy is an epic poem.
“All that remains is this city of words,” the poet, Pampa Kampana, writes at the end of her epic, which she buries in a pot as a message for future generations. “Words are the only victors.”
Framed as the text of a rediscovered medieval Sanskrit epic, “Victory City” is about mythmaking, storytelling and the enduring power of language. It is also a triumphant return to the literary stage for Rushdie, who had withdrawn from public life for months, recovering from a brutal stabbing while onstage during a cultural event in New York last year.
Leon Vlieger at The Inquisitive Biologist:
The wolves reintroduced to Yellowstone National Park in 1995 are some of the best-studied mammals on the planet. Biological technician and park ranger Rick McIntyre has spent over two decades scrutinising their daily lives, venturing into the park every single day. Where his previous books focused on three notable alpha males, it is ultimately the females that call the shots and make the decisions with lasting consequences. This book is a long overdue recognition of the female wolf and continues this multigenerational saga.
If wolf 21, the subject of the second book, was the most famous male wolf in Yellowstone, then his granddaughter 06 (named after her year of birth, 2006) can safely be called the most famous female wolf. This fourth book picks up where the third book ended, covering the period 2009–2015. It tells 06’s life story, her untimely death, and the fate of one of her daughters.
Magdalena Pacholska at Cambridge University Press:
Military artificial intelligence (AI)-enabled technology might still be in the relatively fledgling stages but the debate on how to regulate its use is already in full swing. Much of the discussion revolves around autonomous weapons systems (AWS) and the ‘responsibility gap’ they would ostensibly produce. This contribution argues that while some military AI technologies may indeed cause a range of conceptual hurdles in the realm of individual responsibility, they do not raise any unique issues under the law of state responsibility. The following analysis considers the latter regime and maps out crucial junctions in applying it to potential violations of the cornerstone of international humanitarian law (IHL) – the principle of distinction – resulting from the use of AI-enabled military technologies. It reveals that any challenges in ascribing responsibility in cases involving AWS would not be caused by the incorporation of AI, but stem from pre-existing systemic shortcomings of IHL and the unclear reverberations of mistakes thereunder. The article reiterates that state responsibility for the effects of AWS deployment is always retained through the commander’s ultimate responsibility to authorise weapon deployment in accordance with IHL. It is proposed, however, that should the so-called fully autonomous weapon systems – that is, machine learning-based lethal systems that are capable of changing their own rules of operation beyond a predetermined framework – ever be fielded, it might be fairer to attribute their conduct to the fielding state, by conceptualising them as state agents, and treat them akin to state organs.
Zeynep Tufekci in The New York Times:
Any close follower of the British media should not have been surprised that after Prince Harry fell in love with Meghan Markle, the biracial American actress, years of vitriolic, even racist coverage followed. Whipping up hatred and spreading lies — including on issues far more consequential than a royal romance — is a specialty of Britain’s atrocious but politically influential tabloids.
People like me, uninterested in celebrities, shouldn’t dismiss the brouhaha around Harry’s memoir as mere celebrity tittle-tattle. He has made credible, even documented claims that his own family refused to stand up against their ugly, sustained attacks against Meghan. In other words, it appears that Britain’s most revered institution, funded by tens of millions in taxpayer funds annually, plays ball with one of its most revolting institutions.
At the very least, it seems clear by now where some senior members of the royal family position themselves in all this.
Heidi Ledford in Nature:
Crystal Mackall remembers her scepticism the first time she heard a talk about a way to engineer T cells to recognize and kill cancer. Sitting in the audience at a 1996 meeting in Germany, the paediatric oncologist turned to the person next to her and said: “No way. That’s too crazy.”
Today, things are different. “I’ve been humbled,” says Mackall, who now works at Stanford University in California developing such cells to treat brain tumours. The US Food and Drug Administration approved the first modified T cells, called chimeric antigen receptor (CAR)-T cells, to treat a form of leukaemia in 2017. The treatments have become game changers for several cancers. Five similar products have been approved, and more than 20,000 people have received them. A field once driven by a handful of dogged researchers now boasts hundreds of laboratory groups in academia and industry. More than 500 clinical trials are under way, and other approaches are gearing up to jump from lab to clinic as researchers race to refine T-cell designs and extend their capabilities. “This field is going to go way beyond cancer in the years to come,” Mackall predicts.