Ali Smith, et alia, at The Guardian:
I heard John Berger speaking at the end of 2015 in London at the British Library. Someone in the audience talked about A Seventh Man, his 1975 book about mass migrancy in which he says: “To try to understand the experience of another it is necessary to dismantle the world as seen from one’s own place within it and to reassemble it as seen from his.”
The questioner asked what Berger thought about the huge movement of people across the world. He put his head in his hands and sat and thought; he didn’t say anything at all for what felt like a long time, a thinking space that cancelled any notion of soundbite. When he answered, what he spoke about ostensibly seemed off on a tangent. He said: “I have been thinking about the storyteller’s responsibility to be hospitable.”
As he went on, it became clear how revolutionary, hopeful and astute his thinking was. The act of hospitality, he suggested, is ancient and contemporary and at the core of every story we’ve ever told or listened to about ourselves – deny it, and you deny all human worth. He talked about the art act’s deep relationship with this, and with inclusion. Then he gave us a definition of fascism: one set of human beings believing it has the right to cordon off and decide about another set of human beings.
Sergio Pitol at The Quarterly Conversation:
Our century seems to take pleasure in repeating cyclically that strange comedy of errors that stirs between certain authors and an unreceptive public. The cases of Robert Musil, Hermann Broch, Malcolm Lowry, Joseph Roth are examples of writers who have needed an upheaval in literary taste, which happened twenty-five or thirty years after their death, in order for the magnitude of works like The Man Without Qualities, The Sleepwalkers, Under the Volcano, The Radetzky March, At Swim-Two-Birds, and The Third Policeman to be added to the list of those fundamental novels of our time that have been rediscovered belatedly.
The bizarre name of this novel, At Swim-Two-Birds, comes from the name of a village that lies on the banks of the River Shannon, which is the Anglicized form of a long-ago place mentioned in medieval Irish lyric, which in Gaelic sounds like Snám-da-en.
At Swim-Two-Birds entails a dizzying transit between every register of Irish literature, and is a book that contains at least three other books: one, about the relationship between the novelist and his characters, an erratic convivence between the demiurge and his creatures, who end up rebelling against him who gave them life; another, about the old medieval legend of King Sweeney, whom God cursed with madness and—as if that were not enough!—with immortality, for having attempted to kill a pious cleric, and who in those old Gaelic songs appears transformed into a pathetic old bird that leaps from tree to tree; and a third, which registers at a level that could be called realist, composed of the familial vicissitudes of a young man who attempts to write a novel, his initiation into alcohol, his day-to-day conflicts.
Thomas Curwen at the LA Times:
In the spring of 1952, poet Philip Levine worked at the Chevrolet Gear and Axle plant in Detroit, Mich. He was 24 and had been writing poems for nearly 10 years.
Some get their calling early, but being a young poet is not easy. He had clocked hours in an ice factory, a bottling corporation, on the railroad — but none as boring and hazardous as in the gear and axle plant, where he pulled automotive parts from the forge and hung them on conveyors that whisked them elsewhere in the factory.
Years later, well committed to a life of “poverty and poetry,” he could still feel “almost without hatred that old sense of utter weariness that descended each night from my neck to my shoulders, and then down my arms to my wrists and hands.”
Over his lifetime, Levine, who died at 87 in 2015, never lost that muscle memory, his ars poetica. He carried it with him whenever he “lifted a pencil to write,” the memory of that hellish world — with its clanging steel, poisoned rats and freezing winds rushing through broken windows — of so many exploited lives.
Gary Ferguson in Orion Magazine:
I grew up in an age of industrial hauteur, a woozy time of contrivance and contraption that, despite enormous benefits, was by the mid-1950s sick with bravado. My brother and I, along with millions of fellow baby boomers, took our first bike rides and hoisted our first kites in a world stained by poisons, from nuclear fallout in the Rockies to DDT on the Great Plains. In New York City alone, three separate smog events between 1953 and 1966 killed more than six hundred people. Meanwhile the Bureau of Reclamation was stumping hard to shove dams across many of the last wild rivers of the West, including a dogged yet ultimately unsuccessful attempt to plug Colorado’s Green River, in the heart of Dinosaur National Monument. Urban waterways were fouled beyond recognition, most notably the Cuyahoga in Ohio, so dirty with oil and solvents that in 1969 it caught fire. Public forests throughout the Pacific Northwest and California suffered from massive clearcutting, including the destruction of nearly all the giant sequoias on private land. In 1964 there were still government-sponsored bounties on a wide variety of “bad” animals, from mountain lions to coyotes, wolves to weasels, hawks to owls. And in 1969, when I was thirteen, a hundred thousand barrels of crude oil spilled off the coast of Santa Barbara, wiping out thousands of birds and sea lions and elephant seals.
The environmental movement that arose in response to these disasters was hardly populated by Luddites. The very emblem of the movement was a photograph called Earthrise, taken by Apollo 8 astronaut William Anders—the product of one of the most complicated technological accomplishments in human history. Indeed the boomers deliberately linked science and technology to the goals of clean air and water and sustainable agriculture. They succeeded because they were good at speaking scientific truth to power rather than pining for an Arcadian past. In 1967, about 20 percent of Americans listed the environment as a top priority. Just fifteen months later—in large part thanks to rallies and protests and organizing by young environmentalists—that number had swelled to 80 percent.
Laurel Thatcher Ulrich in Harvard Magazine:
Every object tells a story, and most objects tell many stories. Some can help us transcend boundaries between people, cultures, and academic disciplines to discover crosscurrents in history. Allow me to make that argument by examining a common object, an “orphaned” sewing machine. Several years ago, my colleague Ivan Gaskell and I decided it would be interesting to have students look at one of the landmark inventions of the nineteenth century—a sewing machine. The first sewing machines were patented about 1845. By 1900 they were as common as a cell phone might be today—and just as much a model of innovation and social transformation. When we couldn’t find a sewing machine in any of Harvard’s museums, I called a curator who had been cataloguing Harvard’s so-called “ephemeral collections,” things kept in offices, dormitories, or classroom buildings. She said Harvard did not have a sewing machine, but she did and she would be happy to let me use it.
…According to one scholar, the Singer sewing machine emerged from a collaboration “between a mechanical genius, Isaac Merritt Singer (Image 4), and a lawyer, Edward Clark.” Singer may or may not have been a mechanical genius, but he was a genius at keeping his own name in view: it appears at least five times on our sewing machine (Image 5). It is an appealing name: a sewing machine may not sing, but it certainly hums. Lawyer Edward Clark (Image 6) played another role. As historian Nira Wickramasinghe explains, “The early history of the company…can be read as a maze of patent grabbing by a number of inventors….and subsequent litigation. No owner of a single patent could make a sewing machine without infringing on patents of others. Clark was instrumental in the creation of the Albany patent pool, where the holders of these key patents agreed to forgo litigation and to license their technology to one another.” Some sense of the importance of patents is seen on our sewing machine, which lists those from the 1880s to 1892.
Steven Weinberg in the New York Review of Books:
The development of quantum mechanics in the first decades of the twentieth century came as a shock to many physicists. Today, despite the great successes of quantum mechanics, arguments continue about its meaning, and its future.
The first shock came as a challenge to the clear categories to which physicists by 1900 had become accustomed. There were particles—atoms, and then electrons and atomic nuclei—and there were fields—conditions of space that pervade regions in which electric, magnetic, and gravitational forces are exerted. Light waves were clearly recognized as self-sustaining oscillations of electric and magnetic fields. But in order to understand the light emitted by heated bodies, Albert Einstein in 1905 found it necessary to describe light waves as streams of massless particles, later called photons.
Then in the 1920s, according to theories of Louis de Broglie and Erwin Schrödinger, it appeared that electrons, which had always been recognized as particles, under some circumstances behaved as waves. In order to account for the energies of the stable states of atoms, physicists had to give up the notion that electrons in atoms are little Newtonian planets in orbit around the atomic nucleus. Electrons in atoms are better described as waves, fitting around the nucleus like sound waves fitting into an organ pipe. The world’s categories had become all muddled.
Shuja Haider in Jacobin:
If you had read in early 2016 about a National Policy Institute conference on the theme of “Identity Politics,” you might have assumed it was an innocent gathering of progressives. If you had attended, you would have been in for an unpleasant surprise. The National Policy Institute is an organization of white nationalists, overseen by neo-Nazi media darling Richard Spencer.
Spencer, who popularized the now common euphemism “alt-right,” is fond of describing his platform as “identity politics for white people.” He takes pains to correct those who refer to him as a white supremacist, insisting that he is merely a “nationalist,” or a “traditionalist,” or, better yet, an “identitarian.” He wants to bring about what he calls a “white ethno-state,” a place where the population is determined by heritability. In a knowing inversion of social justice vocabulary, he describes it as “a safe space for Europeans.”
Spencer has an advanced degree in humanities, spent time in the famously left-wing graduate program at Duke University, and wrote an antisemitic interpretation of Theodor Adorno’s music criticism for his master’s thesis. His political mentor, Paul Gottfried, was a student of Herbert Marcuse. Spencer is clearly intimately acquainted with both academic left philosophy and campus social justice activism.
It was only a matter of time before the right identified liberal and leftist strategies that they themselves could adopt, as a conservative Christian Duke freshman portended in 2015. Amid widespread debate over trigger warnings, he refused to read Alison Bechdel’s graphic novel Fun Home, a memoir that included depictions of lesbian sex.
Wendy Lesser at Threepenny Review:
Dance is the least predictable of art forms. Even when done by a choreographer you love, with superb music and excellent dancers, a new piece may disappoint—and, conversely, something about which you had low expectations may delight and move you. Last spring I saw, in quick succession, two relatively new evening-length ballets on either side of the Atlantic. Christopher Wheeldon’s The Winter’s Tale had been around since 2014, in performances by both the Royal Ballet and the National Ballet of Canada, but I only caught up with it in May at Covent Garden. Alexei Ratmansky’s version of The Golden Cockerel (a work which has an old Russian history, going back to Fokine and earlier) opened in Copenhagen in 2012 but did not have its American premiere until this past June in New York. In each case, the performances I saw surprised me.
Wheeldon is not one of my favorite choreographers, and I’ve especially disliked the way he’s sometimes used women in his work, which did not bode well for Hermione, Paulina, and Perdita. Worse yet, this Winter’s Tale had a newly commissioned score by an unknown-to-me Englishman, Joby Talbot—a worrisome fact, given my long history of attending British Shakespeare productions burdened by trashily new music. Moreover, the lead dancer in the May performance, Edward Watson, had recently been criticized by a major Anglo-American dance critic for, among other things, having red hair and pale skin: flaws that I forgive, since I have them both myself, but still… So I was nervous about the Covent Garden evening. I warned my non-ballet-fan companions (dragged along at great expense) that the only thing in the dance’s favor was that a San Francisco friend had raved about the Toronto performance she had seen, saying it was one of her favorite new ballets ever.
Andrew Scull at the Times Literary Supplement:
On November 19, 1948, the two most enthusiastic and prolific lobotomists in the Western world faced off against each other in the operating theatre at the Institute of Living in Hartford, Connecticut. They performed before an audience of more than two dozen neurosurgeons, neurologists and psychiatrists. Each had developed a different technique for mutilating the brains of the patients they operated on, and each man had his turn on the stage.
William Beecher Scoville, Professor of Neurosurgery at Yale, went first. His patient was conscious. The administration of a local anaesthetic allowed the surgeon to slice through the scalp and peel down the skin from the patient’s forehead, exposing her skull. Quick work with a drill opened two holes, one over each eye. Now Scoville could see her frontal lobes. He levered each side up with a flat blade so that he could perform what he called “orbital undercutting”. What followed was not quite cutting: instead Scoville inserted a suction catheter – a small electrical vacuum cleaner – and sucked out a portion of the patient’s frontal lobes.
The patient was wheeled out and a replacement was secured to the operating table. Walter Freeman was next, a Professor of Neurology from George Washington University. He had no surgical training and no Connecticut medical licence, so he was operating illegally – not that this seemed to bother anyone present. Freeman sought an assembly-line approach so that lobotomy could be performed quickly and easily. His technique allowed him on occasion to perform twenty and more operations in a single day. He proceeded to use shocks from an Electro-Convulsive Therapy machine to render his female patient unconscious, and then inserted an ice pick beneath the eyelid until the point rested on the thin bony structure in the orbit.
I saw her, pegging out her web
thin as a pressed flower in the bleaching light.
From the bushes a few small insects
clicked like opening seed-pods. I knew some
would be trussed up by her and gone next morning.
She was so beautiful spinning her web
above the marigolds the sun had made
more apricot, more amber; any bee
lost from its solar flight could be gathered
back to the anther, and threaded onto the flower
like a jewel.
She hung in the shadows
as the sun burnt low on the horizon
mirrored by the round garden bed. Small petals
moved as one flame, as one perfectly-lit hoop.
I watched her work, produce her known world,
a pattern, her way to traverse
a little portion of the sky;
a simple cosmography, a web drawn
by the smallest nib. And out of my own world
mapped from smallness, the source
of sorrow pricked, I could see
I saw the same dance in the sky,
the pattern like a match-box puzzle,
tiny balls stuck in a grid until shaken
so much, all the orbits were in place.
Above the bright marigolds
of that quick year, the hour-long day,
she taught me to love the smallest transit,
that the coldest star has planetesimal beauty.
I watched her above the low flowers
tracing her world, making it one perfect drawing.
by Judith Beveridge
from The Domesticity of Giraffes
Black Lightning Press, 1987
Rozina Ali in The New Yorker:
A couple of years ago, when Coldplay’s Chris Martin was going through a divorce from the actress Gwyneth Paltrow and feeling down, a friend gave him a book to lift his spirits. It was a collection of poetry by Jalaluddin Rumi, the thirteenth-century Persian poet, translated by Coleman Barks. “It kind of changed my life,” Martin said later, in an interview. A track from Coldplay’s most recent album features Barks reciting one of the poems: “This being human is a guest house / Every morning a new arrival / A joy, a depression, a meanness, / some momentary awareness comes / as an unexpected visitor.”
Rumi has helped the spiritual journeys of other celebrities—Madonna, Tilda Swinton—some of whom similarly incorporated his work into theirs. Aphorisms attributed to Rumi circulate daily on social media, offering motivation. “If you are irritated by every rub, how will you ever get polished,” one of them goes. Or, “Every moment I shape my destiny with a chisel. I am a carpenter of my own soul.” Barks’s translations, in particular, are shared widely on the Internet; they are also the ones that line American bookstore shelves and are recited at weddings. Rumi is often described as the best-selling poet in the United States. He is typically referred to as a mystic, a saint, a Sufi, an enlightened man. Curiously, however, although he was a lifelong scholar of the Koran and Islam, he is less frequently described as a Muslim.
Mark Strauss in National Geographic:
Every day, thousands of enigmatic objects in space produce bursts of radio waves that flash for just a few milliseconds yet are capable of generating as much energy as 500 million suns.
Astronomers didn’t even know these fast radio bursts existed until a decade ago. In the years since, they’ve been scouring the cosmos, hoping that by pinpointing their locations, they might be able to figure out what—or perhaps who—is producing them.
Today, a team of astronomers announced that they have finally found their quarry. Relying on a global network of powerful telescopes, they managed to capture a fast radio burst that is broadcasting from a dwarf galaxy some three billion light-years away.
The discovery—described in multiple papers in Nature and the Astrophysical Journal Letters—could have profound implications. It may provide astronomers with a new window into the early universe, while also offering vital clues to a mystery that continues to challenge our perceptions of the cosmos.
[Thanks to Farrukh Azfar.]
Ken Roth in Foreign Policy:
As Donald Trump prepares to take office, many fear a new hostility to human rights on the part of the United States. From his divisive rhetoric about minorities to his embrace of autocrats abroad, there is plenty to worry about.
Trump presents a stark contrast with President Barack Obama, whose tone was strikingly different. In a 2011 speech at the State Department, for example, Obama said U.S. support for universal rights “is not a secondary interest” but a “top priority that must be translated into concrete actions, and supported by all of the diplomatic, economic and strategic tools at [the U.S. government’s] disposal.” During his eight years in office, his administration did sometimes live up to that rhetoric, and it never stooped to the kind of open disdain of human rights concerns that is feared from Trump.
But the truth is, a careful review of Obama’s major human rights decisions shows a mixed record. In fact, he has often treated human rights as a secondary interest — nice to support when the cost was not too high, but nothing like a top priority he championed.
James Hamblin in The Atlantic:
Speaking of packing entire books into one paragraph: Large-scale animal agriculture has become a primary driver of climate change. We are eating and producing much more meat than ever before. The human population is on pace to hit 10 billion by the middle of the century; that’s 10 times as many people as there were in 1800. When we find a way to grow delicious red meat in petri dishes, then we can discuss exactly how much is healthy to eat. For now, the only way forward for our species seems to be to consider meat as something closer to a delicacy.
Of all the “probiotics” on the market, one of the few with actual evidence that it serves our microbes well is plant fiber. Fiber is the carbohydrate that humans can’t digest, yet we’ve long known that people who eat high-fiber diets tend to be healthier. Among multiple studies with similar results, one with 40,000 subjects found that a high-fiber diet came with a 40-percent lower than average risk of heart disease. Fiber also seems to protect against metabolic syndrome. One of the mechanisms behind these benefits appears to be that fiber essentially feeds the microbes in our guts, encouraging diverse populations. Those microbes are implicated in a vast array of illnesses and wellbeing. A diet heavy on meat and dairy is necessarily lower on fiber. In that light, the idea of “Paleo-veganism” is an interesting one. Loosely defined, it could mean eating minimally processed, plant-heavy diets. If a flaw in veganism is that some people think they can drink juice and eat white bread all day and be healthy, that might be sustainable for the planet but not good for you. Paleo-veganism (again, loosely defined lest we descend into madness trying to discern the plant varieties this would include) might work as a rule of thumb that generally keeps us focused on the sorts of foods that promote health.
Susan Schneider in KurzweilAI:
Humans are probably not the greatest intelligences in the universe. Earth is a relatively young planet and the oldest civilizations could be billions of years older than us. But even on Earth, Homo sapiens may not be the most intelligent species for that much longer. The world Go, chess, and Jeopardy champions are now all AIs. AI is projected to outmode many human professions within the next few decades. And given the rapid pace of its development, AI may soon advance to artificial general intelligence—intelligence that, like human intelligence, can combine insights from different topic areas and display flexibility and common sense. From there it is a short leap to superintelligent AI, which is smarter than humans in every respect, even those that now seem firmly in the human domain, such as scientific reasoning and social skills. Each of us alive today may be one of the last rungs on the evolutionary ladder that leads from the first living cell to synthetic intelligence.
What we are only beginning to realize is that these two forms of superhuman intelligence—alien and artificial—may not be so distinct. The technological developments we are witnessing today may have all happened before, elsewhere in the universe. The transition from biological to synthetic intelligence may be a general pattern, instantiated over and over, throughout the cosmos. The universe’s greatest intelligences may be postbiological, having grown out of civilizations that were once biological. (This is a view I share with Paul Davies, Steven Dick, Martin Rees, and Seth Shostak, among others.) To judge from the human experience—the only example we have—the transition from biological to postbiological may take only a few hundred years. I prefer the term “postbiological” to “artificial” because the contrast between biological and synthetic is not very sharp. Consider a biological mind that achieves superintelligence through purely biological enhancements, such as nanotechnologically enhanced neural minicolumns. This creature would be postbiological, although perhaps many wouldn’t call it an “AI.” Or consider a computronium that is built out of purely biological materials, like the Cylon Raider in the reimagined Battlestar Galactica TV series.
John Waters at Lapham's Quarterly:
If you play that game where you pick your porn-star moniker by using the name of the street of your first childhood home as your first name and your real middle name as your last, I’d be Clark Samuels. John S. Waters Jr., 1401 Clark Avenue, Lutherville, Maryland. My first house. My first arena. My launching pad to creative filth.
Does anyone ever forget the first room you were allowed to make your own? As a kid, I was lucky enough to have my own bedroom far enough away from my parents or brother and sisters that spying on me was nearly impossible. All I ever wanted to be was the rock-and-roll king. When I first saw Elvis on The Ed Sullivan Show in 1957, twitching and moaning “Heartbreak Hotel,” I was almost eleven but immediately knew I was gay. Then the decorator gene kicked in. Down from my walls came the family’s tasteful Audubon prints and up went glossy head shots of the Everly Brothers, Tab Hunter, and the Platters (one of whom sported a pencil mustache). I nagged my mom and dad into buying me a reel-to-reel tape recorder so I could be an early pirate and tape all the hits off the radio without having to wait to buy them. Then I’d play these rockabilly and rhythm-and-blues numbers over and over as I danced around my room lip-synching, gyrating, and talking to myself. Finally I had a think tank.
J. Hoberman at Artforum:
WITH HIS NEW FILM, Neruda, Chile’s master of the political gothic, Pablo Larraín, exhumes a sacred monster: namely, his nation’s 1971 Nobel Laureate, the poet Pablo Neruda. Hardly a biopic, Neruda focuses on a brief, if dramatic, period in its subject’s life—the fifteen months from January 1948 through March 1949 during which the poet, an elected senator and an outspoken member of the banned Chilean Communist Party, went underground, finally escaping over the Andes to Argentina.
Neruda devotes only a dozen pages to the topic in his memoirs, half of them concerning the exciting last stage of his getaway. The writing is routinely self-aggrandizing: “Even the stones of Chile” knew his voice, he brags, only partially in jest. The movie, written by Guillermo Calderón, is wryly admiring of Neruda’s imperturbable chutzpah.
Embodied with dour, deadpan magnificence by Luis Gnecco (who played a leftist organizer in No, the third installment of Larraín’s anti-Pinochet trilogy), Neruda is introduced running a gauntlet of flashbulbs as he enters the senatorial washroom to denounce the nation’s sellout president, Gabriel González Videla. The tumult continues as leftists party in masquerade. Neruda, dressed in a burnoose as Lawrence of Arabia, thrills comrades and admirers with his poet’s voice, as his indulgent second wife (charmingly portrayed by the Argentinean actress Mercedes Morán) calls it, sonorously reciting, not for the last time, the youthful love poem that begins, “Tonight I can write the saddest lines. . . .”
Juliet Barker at Literary Review:
English medieval embroidery might seem an odd subject for a book, but this is no ordinary volume. Published to accompany a major exhibition at the V&A (which runs until 5 February 2017), it is not only a catalogue and scholarly monograph but also a visual feast, with magnificent colour plates on virtually every page, bursting at the seams with titbits of fascinating information. It’s the sort of book that makes you want to hug yourself with glee: revelatory and as exquisitely produced as the medieval embroidery it celebrates.
The stereotypical embroiderer, medieval or otherwise, is always female and usually noble-born. Charles Henry Hartshorne, writing in 1845, summed it up neatly in one of the earliest studies on the subject: embroidery ‘served to occupy the leisure of the English gentlewoman when there were but few other modes in which her talents could be employed’; immured within a castle chamber or convent, ‘the needle alone supplied an unceasing source of amusement’. What English Medieval Embroidery demonstrates is that embroidery in England was first and foremost a business, employing both male and female workers whose professional skill was renowned throughout Europe. In the 13th and 14th centuries their work was in demand everywhere from Scandinavia to Portugal and from Riga to Patras, with the Church hierarchy and the papal curia, in particular, proving insatiable in their appetite for what was known outside England as opus anglicanum.