Thursday Poem

Mr. Palomar’s Wave

A long time ago, I went with my aunt to hear
Italo Calvino at the 92nd Street Y—so close
to where she lived then we could walk there.

My aunt loved all things Italian, especially literature.
She was the Italianate aunt, who happened to come
from Cleveland. To me, Mr. Calvino looked very old.

He read the story about Mr. Palomar, who tries
to isolate a single wave, though of course the sea
keeps moving and changing, and he read in his own

English, which sounded more like Italian—that is,
he described Mr. Palomar’s impossible task
in a sing-song voice that made the feat sound

quite possible, the way a child might, if that
child were an Italian child. On an Italian beach.
Determined to isolate an Italian wave.

Or so it seemed to me. Charmed by Mr. Palomar,
I understood nothing of his fury to isolate what
would not be fixed, nor how his effort to trick time

wasn’t a choice, but a fervent belief he might
ambush the world’s complexities, no longer be
this tense, neurasthenic man, unsure of everything.

If my aunt, who was starting to grow old
knew how little of this I absorbed, she was kind
enough to say nothing, knowing that when

I re-encountered Mr. Palomar and his wave,
twenty or forty years on, Calvino’s story would
be a hook to the heart, and that there was

no rush for such knowledge—none at all.
There was her hand on mine (the frail wrist
with her Venetian gold snake swallowing

its tail) signaling time to head for the exits
and find a cab before the rest of the crowd
got out, because this was Manhattan

in the early eighties, and it was late
of an evening, when she still lived
so close we could have walked back.

by Julie Bruck
from Plume Poetry



The Myth of Oscar Wilde’s Martyrdom

Clare Bucknell in The New Yorker:

Oscar Wilde was in the dock when he observed himself becoming two people. It was a Saturday in May, 1895, the final day of his trial for “gross indecency,” and the solicitor general, Frank Lockwood, was in the midst of a closing address for the prosecution. His catalogue of accusations, shot through with moral disgust, struck Wilde as an “appalling denunciation”—“like a thing out of Tacitus, like a passage in Dante,” as he wrote two years later. He was “sickened with horror” at what he heard. But the sensation was short-lived: “Suddenly it occurred to me, How splendid it would be, if I was saying all this about myself. I saw then at once that what is said of a man is nothing. The point is, who says it.” At the critical moment, he was able to transform the drama in his imagination by taking both roles, replacing the real Lockwood with an alternative Wilde, one who could control the courtroom and its narrative.

Martyrs don’t usually admit to feeling “sickened” by accounts of their own behavior, and any ambiguities or contradictions in their personalities tend to be glossed over by their hagiographers. Among Wilde’s modern biographers, faced with a subject whose life has been flattened out for exemplary purposes by various communities (gay, Irish, Catholic, socialist), it’s axiomatic to acknowledge his multidimensionality, his slipperiness. “Oscar Wilde lived more lives than one, and no single biography can ever compass his rich and extraordinary life,” Neil McKenna tells us at the beginning of “The Secret Life of Oscar Wilde” (2005), before choosing just one of those lives to tell—Wilde’s sexual and emotional history. Biographers who do aim to “compass” the whole story, as Hesketh Pearson (1946), H. Montgomery Hyde (1975), Richard Ellmann (1988), and now Matthew Sturgis have sought to do, are obliged not only to recognize the many Wildes but to do something about them.

More here.

New brain-inspired chips could provide the smarts for autonomous robots and self-driving cars

Robert Service in Science:

[Photo: Mike Davies, director of Intel’s Neuromorphic Computing Lab, at the Intel Jones Farm Conference Center in Hillsboro, Oregon, August 9, 2021.]

HILLSBORO, OREGON—Though he catches flak for it, Garrett Kenyon, a physicist at Los Alamos National Laboratory, calls artificial intelligence (AI) “overhyped.” The algorithms that underlie everything from Alexa’s voice recognition to credit card fraud detection typically owe their skills to deep learning, in which the software learns to perform specific tasks by churning through vast databases of examples. These programs, Kenyon points out, don’t organize and process information the way human brains do, and they fall short when it comes to the versatile smarts needed for fully autonomous robots, for example. “We have a lot of fabulous devices out there that are incredibly useful,” Kenyon says. “But I would not call any of that particularly intelligent.”

Kenyon and many others see hope for smarter computers in an upstart technology called neuromorphic computing. In place of standard computing architecture, which processes information linearly, neuromorphic chips emulate the way our brains process information, with myriad digital neurons working in parallel to send electrical impulses, or spikes, to networks of other neurons. Each silicon neuron fires when it receives enough spikes, passing along its excitation to other neurons, and the system learns by reinforcing connections that fire regularly while paring away those that don’t. The approach excels at spotting patterns in large amounts of noisy data, which can speed learning. Because information processing takes place throughout the network of neurons, neuromorphic chips also require far less shuttling of data between memory and processing circuits, boosting speed and energy efficiency.
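
To make the spiking mechanism concrete, here is a minimal sketch of the behavior the article describes. This is an illustrative toy in plain Python/NumPy, not the programming model of Loihi or any real neuromorphic chip; the network size, threshold, and learning constants are all assumptions chosen only to show the mechanics: neurons integrate incoming spikes, fire and reset when they cross a threshold, and synapses that participate in firing are reinforced while idle ones slowly fade.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 32             # neurons in the toy network (illustrative)
THRESHOLD = 1.0    # membrane potential a neuron needs to fire
LEAK = 0.9         # fraction of stored potential retained each step
LEARN = 0.05       # reinforcement for synapses that helped cause a spike
DECAY = 0.999      # slow fade that pares away idle connections

weights = rng.uniform(0.0, 0.2, size=(N, N))   # weights[i, j]: synapse i -> j
potential = np.zeros(N)                        # membrane potentials
spikes = np.zeros(N, dtype=bool)               # which neurons fired last step

for step in range(500):
    # Sparse random input stands in for sensor data arriving as spikes.
    driven = rng.random(N) < 0.05

    # Integrate: leak a little stored charge, then accumulate the weighted
    # spikes sent by neurons that fired last step, plus the external drive.
    potential = LEAK * potential + weights.T @ spikes.astype(float) + 0.5 * driven

    # Fire: neurons crossing the threshold emit a spike and reset.
    fired = potential >= THRESHOLD
    potential[fired] = 0.0

    # Learn: strengthen synapses from neurons that spiked to neurons that
    # then fired (a crude Hebbian stand-in for on-chip plasticity), and let
    # every connection decay slightly so unused ones are eventually pruned.
    weights += LEARN * np.outer(spikes, fired).astype(float)
    weights *= DECAY

    spikes = fired
```

The point of the toy is that computation is event-driven: work happens where spikes arrive and learning happens where spikes coincide, which is the intuition behind the speed and energy-efficiency claims made for neuromorphic hardware.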

Neuromorphic computing isn’t new. Yet progress has been slow, with chipmakers reluctant to invest in the technology without a proven market, and algorithm developers struggling to write software for an entirely new computer architecture. But the field appears to be maturing as the capabilities of the chips increase, which has attracted a growing community of software developers.

More here.

Thursday, October 7, 2021

The Inescapable Dilemma of Infectious Disease

Kyle Harper in the Boston Review:

The control of infectious disease is one of the unambiguously great accomplishments of our species. Through a succession of overlapping and mutually reinforcing innovations at several scales—from public health reforms and the so-called hygiene revolution, to chemical controls and biomedical interventions like antibiotics, vaccines, and improvements to patient care—humans have learned to make the environments we inhabit unfit for microbes that cause us harm. This transformation has prevented immeasurable bodily pain and allowed billions of humans the chance to reach their full potential. It has relieved countless parents from the anguish of burying their children. It has remade our basic assumptions about life and death. Scholars have found plenty of candidates for what made us “modern” (railroads, telephones, science, Shakespeare), but the control of our microbial adversaries is as compelling as any of them. The mastery of microbes is so elemental and so intimately bound up with the other features of modernity—economic growth, mass education, the empowerment of women—that it is hard to imagine a counterfactual path to the modern world in which we lack a basic level of control over our germs. Modernity and pestilence are mutually exclusive; the COVID-19 pandemic only underscores their incompatibility.

But to grasp the full significance of the history of infectious disease, we need more than ever to understand this recent success through the lens of ecology and evolution.

More here.

Sean Carroll’s Mindscape Podcast: Chiara Marletto on Constructor Theory, Physics, and Possibility

Sean Carroll in Preposterous Universe:

Traditional physics works within the “Laplacian paradigm”: you give me the state of the universe (or some closed system), some equations of motion, then I use those equations to evolve the system through time. Constructor theory proposes an alternative paradigm: to think of physical systems in terms of counterfactuals — the set of rules governing what can and cannot happen. Originally proposed by David Deutsch, constructor theory has been developed by today’s guest, Chiara Marletto, and others. It might shed new light on quantum gravity and fundamental physics, as well as having applications to higher-level processes of thermodynamics and biology.
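
To make the contrast concrete, here is a minimal sketch of the Laplacian paradigm described above. The system and all numbers are illustrative assumptions (a unit-mass harmonic oscillator; the spring constant, step size, and initial state are arbitrary choices, not anything from the episode): specify a state and equations of motion, and the rest is mechanical evolution through time.

```python
# State of the closed system: position and velocity of a unit-mass oscillator.
k, dt = 1.0, 0.01      # spring constant and time step (arbitrary choices)
x, v = 1.0, 0.0        # initial conditions: all the paradigm needs as input

# Equations of motion: dx/dt = v, dv/dt = -k * x.
# Evolve with semi-implicit Euler steps: update velocity, then position.
for _ in range(int(10 / dt)):
    v -= k * x * dt
    x += v * dt

print(f"state after 10 time units: x = {x:.3f}, v = {v:.3f}")
```

Constructor theory inverts the question: instead of tracking one trajectory from given initial conditions, it asks which transformations of a system are possible and which are impossible.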

More here.

Teaching Poetry in the Palestinian Apocalypse

George Abraham in Guernica:

I gave my first lecture, at my first academic job, behind a wall of plexiglass, speaking to an awkwardly spaced out group of masked students who had maybe already given up – and honestly, who could blame them? I walked in sweating and late because my building’s social distancing protocol required me to run up five floors and down two to get to my third floor classroom. Leaning into the mic, I opened with the joke: “Welcome to apocalyptic poetry!”

My students chuckled nervously. Maybe the joke was that it was day one of the fall semester, and who really wanted to be in a required advanced poetic form class? Or maybe it was my way of cutting the tension of our gathering, united by the sole purpose of discussing poetry in a time that, back then, felt newly apocalyptic to some.

Soon, apocalypse became a tired punchline. Languishing through mere existence, I did what any young Palestinian instructor of literature would likely do: I returned to Audre Lorde, who reminds us “poetry is not a luxury,” and June Jordan, who gives us models for writing against and despite the state.

More here.

The chimaera challenge: Animals that have human organs could save the lives of people

Liam Drew in Nature:

It’s 2036, and you have kidney failure. Until recently, this condition meant months or years of gruelling dialysis, while you hoped that a suitable donor would emerge to provide you with replacement kidneys. Today, thanks to a new technology, you’re going to grow your own. A technician collects a small sample of your blood or skin and takes it to a laboratory. There, the cells it contains are separated out, cultured and treated with various drugs. The procedure transforms the cells into induced pluripotent stem (iPS) cells, which, like the cells of an early embryo, are capable of generating any of the body’s tissues. Next, the technician selects a pig embryo that has been engineered to lack a gene required to grow kidneys, and injects your iPS cells into it. This embryo is implanted in a surrogate sow, where it develops into a young pig that has two kidneys consisting of your human cells. Eventually, these kidneys are transplanted into your body, massively extending your life expectancy.

This hypothetical pig is a chimaera: an animal composed of cells derived from more than one fertilized egg. The name comes from the part-lion, part-goat, part-serpent chimaera of ancient mythology. And, right now, a small number of scientists are working to make the above kidney scenario a reality.

More here.

What Squid Game’s fantasies and harsh realities reveal about Korea

Aja Romano in Vox:

By any measure, Netflix’s Squid Game is a runaway hit. The Korean drama-slash-horror series about a battle royale conducted via children’s playground games — think Red Light, Green Light or tug of war but with a lot more blood — debuted on September 17 and became an instant sensation, rocketing to the top of Netflix’s most-viewed releases and generating memes across social media. After barely three weeks on the platform, Squid Game has not only become the most popular Korean drama in Netflix’s history, but it’s on track to surpass Bridgerton as the most popular show in Netflix history.

Squid Game’s success is such a fantastic payoff for Netflix’s decision to invest $500 million in Korean entertainment in 2021 that it is causing the company’s stock to boom. That might be somewhat ironic given that Squid Game is all about socioeconomic divides, the exploitation of the poor by the rich, and the desperation of Korea’s financially destitute class of laid-off workers.

Creator Hwang Dong-hyuk came up with the idea for the show after years spent reading manga and manhwa (Japanese and Korean comics, respectively) with similar themes, including the influential horror satire Battle Royale, which kicked off the contemporary trend of ensemble casts battling each other to the death in elaborate high-stakes gaming arenas. He paired these concerns with the Korean entertainment industry’s ongoing interest in the socioeconomic plight of a growing number of downwardly mobile workers, once solidly middle class, who’ve found themselves forced into lower-paying jobs due to Korea’s changing economy and decreasing reliance on industry.

More here.

Thursday Poem

How Do You Do?

All hands are out in the street today,
straining against the leashes of forearms.
Little concerned with us, they leap
to greet each other, tangle and clasp,
a subtle suction, like a kiss,
then off again in a friendly game
of overlord and underdog
we only understand in part.

Sometime later, folded in prayer,
or contemplation, right says to left,
if anything should happen to me
you’ll know, won’t you, what to do?

and left says to right, you’ve always kept me
friendless and illiterate.

We really ought to get them to shake,
but it’s not clear they fit that way.

by Jeff Dolven
from The Hopkins Review, Spring 2009

“Titane,” Reviewed

Richard Brody at The New Yorker:

The curse of genre is that it encourages filmmakers to downplay causes in the interest of effects. In the best genre movies, the quantity and power of these effects serve as sufficient compensation for the thinned-out drama. “Titane,” the new film by Julia Ducournau, is a genre film, a twist on horror with a twist on family—like Ducournau’s first feature, “Raw.” But “Titane” is far stronger, far wilder, far stranger. The radical fantasy of its premise—a woman gets impregnated by a car—wrenches the ensuing family drama out of the realm of the ordinary and into one of speculative fantasy and imaginative wonder that demands a suspension of disbelief—which becomes the movie’s very subject.

The film’s protagonist, Alexia (Agathe Rousselle), has an affinity for cars that amounts to a sort of destiny. As a child (played by Adèle Guigue), Alexia is sitting in the back seat of a car driven by her father (Bertrand Bonello, himself a notable director), who’s got music on the radio.

more here.

Chatting With Maggie Nelson

Ross Simonini and Maggie Nelson at The Believer:

BLVR: It’s interesting that you make the distinction between art and not-art, because your writing doesn’t seem to make that distinction.

MN: I do think it’s all part of one flow. And to me, how personal it is and what form it takes on the page are just a Bob Creeley, “form follows content” kind of a thing. But at the same time, I’m alert to what different genres can do. And I think if someone calls Bluets a novel, I’m like, OK, that’s fine. To me, it’s within the realm of experimental speculative nonfiction. Someone could say The Art of Cruelty is a series of essays, but it was not conceived as a series of essays. It was conceived as an ongoing thought that had episodic rings of action. On Freedom is weird because it’s four long chapters, which are each, like, seventy-five pages in manuscript. This was not a particularly elegant form to me. There was no real experimenting with the accretion of fragments, like I’ve done before. And then that became a kind of formal question to me, like: How can things this long hang together? I like to come up with subtitles, like “a reckoning” or “a murder” or things that kind of name something about the form. And I always thought of these sections of On Freedom as long songs. That was my idea about them. Songs can be quite long and still hang together, and they are less boring than chapters.

more here.

Wednesday, October 6, 2021

Water: A Biography

Giulio Boccaletti in Orion:

The fundamental struggle with water has never really abated since it first began on the shores of the Persian Gulf. The multiple transitions, from nomadism to sedentism, from hunting and foraging to domesticated agriculture, from small rural communities to a productive, specialized, urbanized society, were severe disruptions. But while individuals would have lived through them as gradual, incremental transformations, over the course of Homo sapiens’ existence, they amounted to shocking events. From the moment Homo sapiens, late in its history, decided to stay in one place, surrounded by a changing environment, it began to wrestle with water, an agent capable of destruction and life-giving gifts.

The reason the early story of water and society matters is that it has left deep cultural traces guiding and inspiring human adaptation ever since. For example, given the experience of early Chinese communities, it is not surprising that water myths are abundant in that culture, and have captured the role of the water landscape in Chinese identity. One Chinese myth tells of how the world formed from the body of a giant, whose blood and veins turned into water and rivers. In another, the Jade Emperor, Lord of Heaven, entrusted four great dragons to bring rain to the people. Their names were Long, Yellow, Pearl, and Black. After they disobeyed him, he entrapped them in mountains, so the dragons turned themselves into rivers, becoming the Yangtze, Yellow, Pearl, and Amur Rivers, the great historical sources of water for agriculture. These are the cultural traces of the great East Asian monsoon.

More here.

Two Stray Notes On “Moby-Dick”

William Logan at The New Criterion:

The harsh criticism of Melville’s period may be almost as disconcerting to a modern reader as the shiftiness of publishers. Those who think critics should never criticize may find the treatment of his masterpiece, however acceptable by nineteenth-century standards, not edifying but shocking. The early critical reactions did not prevent Moby-Dick from being recognized as the Great American Novel, though that took most of a century—and neither did the positive reviews, nor the bits of possibly paid-for puffery, create a best-seller. The British edition never sold out. Though Moby-Dick did go through three more American printings over the next twenty years, the later ones were small and overall sales poor. Melville’s nine novels were published in an astonishing eleven years, the first seven in seven. After the last, The Confidence-Man (1857), an act of genius exceeded only by the tale of the whale, Melville abandoned fiction and fled to poetry, for which he possessed almost no gift. For two decades he was forced to make his living as a New York City customs-house inspector.

more here.

The Pattern And Decoration Movement

Lynne Cooke at Artforum:

IN AN APPRECIATIVE 2016 REVIEW of new work by Valerie Jaudon, critic David Frankel noted that the Pattern and Decoration movement, of which Jaudon was a prominent member, had long been held in disrepute. “In the early ’80s,” Frankel wrote, “I remember a colleague at Artforum at the time saying it could never be taken seriously in the magazine.” In retrospect, what makes this dismissal so striking is that, in the mid-’70s, Artforum contributed significantly to P&D’s emergence into the spotlight, publishing key texts by its advocates along with numerous reviews of its shows. Amy Goldin’s “Patterns, Grids, and Painting” (1975) and Jeff Perrone’s “Approaching the Decorative” (1976) were among the early touchstones for P&D’s heterogeneous cohort, riled by the unmitigated critical support for diverse ascetic and masculinist tendencies pervasive in the painting of the moment. However, by the mid-’80s, eclipsed by newer developments—the Pictures generation, neo-geo, et al.—P&D was increasingly coming under fire for positions now considered controversial: for the purported essentialism of its versions of second-wave feminism, for a naive advocacy that masked acts of Orientalizing and primitivizing, for cultural imperialism. More fundamental “problems” largely went unnoted, including a lack of the kind of conceptual depth expected of cutting-edge practices: In their commitment to the decorative, P&D artists prioritized surface over subject matter, the former serving primarily as a vehicle for sensuous effects. Not least, the art world’s entrenched sexism fostered the occasion for its denizens to belittle and sideline a movement renowned for the dominant role played by women in its genesis and trajectory.

more here.

Anil Seth Finds Consciousness in Life’s Push Against Entropy

Dan Falk in Quanta:

Anil Seth wants to understand how minds work. As a neuroscientist at the University of Sussex in England, Seth has seen firsthand how neurons do what they do — but he knows that the puzzle of consciousness spills over from neuroscience into other branches of science, and even into philosophy.

As he puts it near the start of his new book, Being You: A New Science of Consciousness (available October 19): “Somehow, within each of our brains, the combined activity of billions of neurons, each one a tiny biological machine, is giving rise to a conscious experience. And not just any conscious experience, your conscious experience, right here, right now. How does this happen? Why do we experience life in the first person?”

This puzzle — the mystery of how inanimate matter arranges itself into living beings with self-aware minds and a rich inner life — is what the philosopher David Chalmers called the “hard problem” of consciousness. But the way Seth sees it, Chalmers was overly pessimistic. Yes, it’s a challenge — but we’ve been chipping away at it steadily over the years.

More here.

On the Push and Pull of Muslim Cultural Identity

Omar Mouallem in Literary Hub:

The first time I learned I was Muslim was in preschool.

During an excursion to a pizzeria, which is what passed for a field trip in my hometown of High Prairie, Alberta, I consumed a few morsels of ham. My mom arrived with the other parents to pick me up from class, and I began to sing the praises of Hawaiian pizza. She cut me off with a gasp. “You’re Muslim,” she said loudly for my teacher to hear. “Muslims don’t eat pork.”

Abstaining from pork could be the first law of the Five Pillars of Western Islam. Unlike the actual pillars (pray daily, pay alms, fast through Ramadan, pilgrimage to Mecca, and declare Muhammad as a messenger of the one true God), they are defined by what you don’t do: eat pork, celebrate Christmas, drink alcohol, gamble, and date. It’s safe to say that I’ll probably never complete the first set of pillars. But the second I dutifully observed until mid-adolescence. Then I pushed them over, one by one, over the course of ten years. The first to go up was the last to fall.

Which brings me to the second time I learned I was Muslim, a few years ago, at age 30.

More here.