Thursday Poem

Diatribe Against the Dead

The dead are selfish:
they make us cry and don't care,
they stay quiet in the most inconvenient places,
they refuse to walk, we have to carry them
on our backs to the tomb
as if they were children. What a burden!
Unusually rigid, their faces,
accuse us of something, or warn us;
they are the bad conscience, the bad example,
they are the worst things in our lives always, always.
The bad thing about the dead
is that there is no way you can kill them.
Their constant destructive labor
is for that reason incalculable.
Insensitive, indifferent, obstinate, cold,
with their insolence and their silence
they don't realize what they undo.

by Ángel González
from The Vintage Book of Contemporary World Poetry
Vintage Books, 1996

The human nose knows more than we think

Andrea Marks in Scientific American:

The smell of coffee may urge you out of bed in the morning, and the perfume of blooming lilacs in the spring is divine. But you do not see police officers with their noses to the ground, following the trail of an escaped criminal into the woods. Humans do not use smell the way other mammals do, and that contributes to our reputation for being lousy sniffers compared with dogs and other animals. But it turns out the human sense of smell is better than we think. In a review paper published in Science last week, neuroscientist John McGann of Rutgers University analyzed the state of human olfaction research, comparing recent and older studies to make the argument that our smelling abilities are comparable with those of our fellow mammals. McGann traces the origins of the idea that humans have a poor sense of smell to a single 19th-century scientist, comparative anatomist Paul Broca. Broca, known for discovering Broca’s area—the part of the brain responsible for speech production—noted that humans had larger frontal lobes than those of other animals, and that we possessed language and complex cognitive skills our fellow creatures lacked. Because our brains’ olfactory bulbs were smaller than those of other mammals and we did not display behavior motivated by smell, Broca extrapolated that these brain areas shrank over evolutionary time as humans relied more on complex thought than on primal senses for survival. He never conducted sensory studies to confirm his theory, however, and the reputation stuck.

Scientists built on that tenuous foundation over the years, McGann says. Geneticists saw supporting evidence for humans’ limited olfactory abilities because we have a smaller fraction and number of functioning olfactory genes—but again this was not well tested. The idea that color vision took the evolutionary pressure off olfaction was later debunked when no link was found between that evolutionary development and smell loss. In addition, the size of olfactory bulbs, both in absolute terms and in proportion to the brain, does not relate directly to smelling power as scientists once thought. Now that more sensory tests are being done, the results are mixed. Experiments conducted in previous decades have found humans are just as sensitive as dogs and mice to the aroma of bananas. Furthermore, a 2013 study found humans were more sensitive than mice to two urine odor components whereas mice could better detect four other sulfur-containing urine and fecal-gland odors tested. A 2017 study also revealed humans were more sensitive than mice to the smell of mammal blood.

More here.

Wednesday, May 24, 2017

MICHAEL CHABON ROAMS THE WEST BANK WITH SAM BAHOUR

Michael Chabon in Literary Hub:

The tallest man in Ramallah offered to give us a tour of his cage. We would not even have to leave our table at Rukab’s Ice Cream, on Rukab Street; all he needed to do was reach into his pocket.

At nearly two meters—six foot four—Sam Bahour might well have been the tallest man in the whole West Bank, but his cage was constructed so ingeniously that it could fit into a leather billfold.

“Now, what do I mean, ‘my cage’?” He spoke with emphatic patience, like a remedial math instructor, a man well practiced in keeping his cool. With his large, dignified head, hairless on top and heavy at the jawline, with his deep-set dark eyes and the note of restraint that often crept into his voice, Sam had something that reminded me of Edgar Kennedy in the old Hal Roach comedies, the master of the slow burn. “Sam,” he said, pretending to be us, his visitors, we innocents abroad, “what is this cage you’re talking about? We saw the checkpoints. We saw the separation barrier. Is that what you mean by cage?”

Some of us laughed; he had us down. What did we know about cages? When we finished our ice cream—a gaudy, sticky business in Ramallah, where the recipe is an Ottoman vestige, intensely colored and thickened with tree gum—we would pile back into our hired bus and return to the liberty we had not earned and were free to squander.

“Yes, that’s part of what I mean,” he said, answering the question he had posed on our behalf. “But there is more than that.”

More here.

A Grand New Theory of Life’s Evolution on Earth

Sarah Zhang in The Atlantic:

The modern world gives us such ready access to nachos and ice cream that it’s easy to forget: Human bodies require a ridiculous and—for most of Earth’s history—improbable amount of energy to stay alive.

Consider a human dropped into primordial soup 3.8 billion years ago, when life first began. They would have nothing to eat. Earth then had no plants, no animals, no oxygen even. Good luck scrounging up 1600 calories a day drinking pond- or sea water. So how did we get sources of concentrated energy (i.e. food) growing on trees and lumbering through grass? How did we end up with a planet that can support billions of energy-hungry, big-brained, warm-blooded, upright-walking humans?

In “The Energy Expansions of Evolution,” an extraordinary new essay in Nature Ecology and Evolution, Olivia Judson sets out a theory of successive energy revolutions that purports to explain how our planet came to have such a diversity of environments that support such a rich array of life, from the cyanobacteria to daisies to humans.

More here.

AC Grayling: What happens when our military machines are not only unmanned but autonomous?

AC Grayling in Prospect:

War, then, has changed in dramatic respects, technologically and, consequentially, in character too. But in other fundamental respects it is as it ever was: people killing other people. As Theodor Adorno said, thinking of the development of the spear into the guided missile: “We humans have grown cleverer over time, but not wiser.” Every step of this evolution has raised its own ethical questions, but the next twist in the long story of war could very well be autonomous machines killing people—something that could well necessitate a more profound rethink than any that has been required before.

As well as posing their own particular ethical problems, past advances in military technology have—very often—inspired attempts at an ethical solution too. The 1868 Declaration of St Petersburg outlawed newly invented bullets that split apart inside a victim. The 1899 Hague Conference outlawed aerial bombardment, even before heavier-than-air flight had become possible—it had in mind the throwing of grenades from balloons. After the First World War, chemical weapons were outlawed, and following the Second World War much energy was devoted to attempts at banning or limiting the spread of nuclear weapons. When Bashar al-Assad gassed his own people in Syria, President Donald Trump enforced the world’s red line with an airstrike.

So, just as the continuing evolution of the technology of combat is nothing new, nor is the attempt to regulate its grim advance. But such attempts to limit the threatened harm have often proved to be futile. For throughout history, it is technology that has made the chief difference between winning and losing in war—the spear and the atom bomb both represent deadly inventiveness prompted by emergency and danger. Whoever has possessed the superior technology has tended to prevail, which—if it then falls to the victors to enforce the rules—points to some obvious dilemmas and difficulties.

More here.

Why whales grew to such monster sizes

Elizabeth Pennisi in Science:

Weighing in at 200,000 kilograms and stretching the length of a basketball court, the blue whale is the biggest animal that’s ever lived. Now, scientists have figured out why they and other baleen whales got so huge. “It’s a cool study,” says Jakob Vinther, an evolutionary paleobiologist at the University of Bristol in the United Kingdom. “I’m going to send it to my students.” Biologists have long debated why some whales became the world’s biggest animals. Some have proposed that because water bears the animal’s weight, whales can move around more easily and gulp in enough food to sustain big appetites. Others have suggested that whales got big to fend off giant sharks and other megapredators. Researchers have also argued about when these animals got so huge. In 2010, Graham Slater, an evolutionary biologist currently at the University of Chicago in Illinois, argued that cetaceans—a term that includes whales and dolphins—split into different-sized groups very early in their history, perhaps 30 million years ago. Dolphins remained the shrimps of the cetacean world, filter-feeding baleen whales became the giants, and predatory beaked whales stayed in the middle size-wise, with the descendants in those three groups sticking within those early established size ranges.

However, Nicholas Pyenson, a whale expert at the Smithsonian Institution’s National Museum of Natural History in Washington, D.C., was skeptical. So a few years ago, the two decided to tap the museum’s vast cetacean fossil collection to settle the dispute. Pyenson had already surveyed living whale proportions and determined that the size of the whale correlated with the width of its cheek bones. So Pyenson measured or obtained these data from skulls of 63 extinct whale species and of 13 modern species and plotted them on a timeline that showed the whale family tree. The data showed that whales didn’t get really big early on, as Slater had suggested. Nor did they gradually get big over time. Instead they became moderately large and stayed that way until about 4.5 million years ago, Slater, Pyenson, and Jeremy Goldbogen at Stanford University in Palo Alto, California, report today in the Proceedings of the Royal Society B. Then baleen whales went “from relatively big to ginormous,” Slater says. Blue whales today are 30 meters long, whereas until 4.5 million years ago the biggest whales were 10 meters long.

More here.

Roger Ailes: The Man Who Destroyed Objectivity

Neal Gabler in billmoyers.com:

Fox News creator and former chief Roger Ailes, who died at 77 last week from complications after a fall in his Florida home, may have been the most significant political figure of the last 35 years — which isn’t necessarily a compliment to those of us who believe media mavens shouldn’t also be political operatives. Ailes clearly thought differently. He simultaneously changed the contours both of American politics and American media by melding them, and in doing so changed the contours of fact and objectivity as they once were understood before the era of post-fact.

It seems a lot to put on one man, but Roger Ailes destroyed the idea of media objectivity in the name of media objectivity, the way a phony evangelist might destroy virtue in the name of virtue. Things have never been the same since. Ailes was a political acolyte of Richard Nixon, and Nixon was a media acolyte of Ailes. It was a perfect and powerful alliance — two outcasts seeking retribution. Nixon’s great contribution to American politics was to take his personal umbrage, a lifetime of slights, and nationalize it. Like so many among the aggrieved, he aspired to be an insider and was tormented by not being admitted to their ranks. In anger, he took them on, especially those he regarded as haughty elites — accused spy Alger Hiss, who was the personification of the Ivy-educated aristocrat and on whose takedown Nixon built his career; John Kennedy, who was everything Nixon wanted to be and wasn’t; and the entire liberal establishment, which would denigrate and denounce him as he rose through the political ranks. He became the avatar for every person who had suffered the same disdain and abuse, and he turned Republicanism into a therapeutic movement of social vengeance.

More here.

Looking at Giorgio de Chirico

Heather Ewing at The Brooklyn Rail:

In 1969 a young artist in Turin named Giulio Paolini took as his personal motto the Latin inscription—itself a quotation from Nietzsche—at the foot of an early Giorgio de Chirico self-portrait: Et quid amabo nisi quod ænigma est [And What Shall I Love If Not the Enigma]. He made the phrase into his own business card and transformed it into a public manifesto by placing it on an enormous banner hung across the main piazza in Como. This was his contribution to Campo Urbano, the public art intervention staged that year by Luciano Caramel in collaboration with Ugo Mulas and Bruno Munari, which invited artists out of their studios and galleries to engage directly with the urban environment, the spaces of daily life. For Paolini, it was the beginning of a decades-long fascination with de Chirico’s oeuvre, which Paolini has referenced, cited, and interrogated in his conceptual practice—artwork that is now the subject of the fourth season at the Center for Italian Modern Art (CIMA), which places paintings spanning much of de Chirico’s career together with works by Paolini from the 1960s to today.

This phrase “And What Shall I Love If Not the Enigma” was a touchstone as well for Philip Guston—Dore Ashton said he quoted it all his life; and it was a prompt for Sylvia Plath, too, who wrote several poems inspired by de Chirico paintings, as did Mark Strand, John Ashbery, and others (Ashbery also translated parts of de Chirico’s surrealist novel Hebdomeros). I love that Louise Bourgeois and her husband, the art historian Robert Goldwater, together dedicated themselves to translating some of de Chirico’s writings. De Chirico’s work has beguiled and bedeviled a surprising number of artists and writers.

more here.

TALES OF PORT-AU-PRINCE

Edwidge Danticat at Literary Hub:

The republic of Port-au-Prince, as it is often called, is a city of survivors. Even those who would like to see the country decentralized or have the capital moved elsewhere talk about creating another Port-au-Prince, a different one for sure, but an improved version of the old one. Still, Port-au-Prince is also a heartbreaking city. It is a city where a restaurant that charges over 20 American dollars for a steak might stand inches from some place where others are starving. It is a city where the dead can lie in a morgue for weeks as the family clamors for money to pay for the burial.

It is also a city where paintings line avenue walls, where street graffiti curses or praises politicians, depending on who has paid for them. It is a city of so much traffic that it has become a city of back roads, short cuts that rattle your body through hills, and knolls that at first don’t seem passable. It’s a city of moto taxis, which are better fitted for such roads. It is also a city of cell phones, where conversations often end abruptly because someone’s prepaid cards have run out of minutes. It is a city, as one of Haiti’s most famous novelists, Gary Victor, has written, where people who might run toward bullets will flee the rain, because the rain can reconfigure roads in an instant and can take more lives in a few minutes than a gun.

more here.

Germaine de Staël. A Political Portrait

Hugh Gough at the Dublin Review of Books:

Madame de Staël has attracted the attention of historians and biographers alike, not only because of her flamboyant lifestyle and unwavering sense of self-importance but also because of her sheer energy and impressive literary output. Her life had all the characteristics of a romantic novel, launched as she was into the privileged world of Parisian salons before revelling in the turmoil of the French Revolution, spending years of exile in the family château near Geneva, and travelling through central and eastern Europe before finally returning to France in the early years of the Bourbon restoration. Her interests straddled the Enlightenment and early Romanticism and she was one of the few French-language authors of the late eighteenth century to immerse herself in both German and Italian culture. She met Goethe and Schiller in Weimar, was a close friend of the philosopher and founder of the University of Berlin, Wilhelm von Humboldt, and had her children educated by the romanticist and orientalist Auguste Wilhelm von Schlegel. Her novels Delphine (1802) and Corinne (1807) have enjoyed lasting success and De l’Allemagne, published between 1810 and 1813, was a highly original analysis of the literature, history and philosophy of a culture that her nemesis, Napoleon Bonaparte, briefly dominated but never understood.

Born into wealth and privilege, she now has a society dedicated to her memory, which publishes her complete works, encourages research and hosts annual meetings in the château in Coppet that has been the home of the Necker family since the eighteenth century. Her political ideas are less well known than her novels, but she was an intelligent and reflective commentator on the twists and turns of events in France until Napoleon finally sent her into exile.

more here.

Tuesday, May 23, 2017

How to be a stoic

Skye Cleary in 3:AM Magazine:

3AM: I understand that your book grew out of a New York Times opinion piece by the same name: How to Be a Stoic? Why did you decide to write the book? Or was it more about riding the wave of Fate?

Massimo Pigliucci: In a sense, it was about Fate. But in another sense, it was a very deliberate project. Fate entered into it because The New York Times article went viral, and I immediately started getting calls from a number of publishers, enquiring into whether I intended to write a book. Initially, I didn’t. But then I considered the possibility more carefully. After all, I had started a blog (howtobeastoic.org) with the express purpose of sharing my progress in studying and practicing Stoicism with others, and I am convinced that Stoicism as a philosophy of life can be useful to people. So, a book was indeed the next logical step.

3AM: What are the key differences between ancient Stoicism and your new Stoicism? Why did it need updating?

Massimo Pigliucci: Stoicism is an ancient Greco-Roman philosophy, originating around 300 BCE in Athens. It’s only slightly younger than its Eastern counterpart, Buddhism. But while Buddhism went through two and a half millennia of evolution, Stoicism was interrupted by the rise of Christianity in the West. A lot of things have happened in both philosophy and science in the 18 centuries since there were formal Stoic schools, so some updating is in order.

More here.

the story of the Great Regression

Wolfgang Streeck at the New Left Review:

By the end of the 1980s at the latest, neoliberalism had become the pensée unique of both the centre left and the centre right. The old political controversies were regarded as obsolete. Attention now focused on the ‘reforms’ needed to increase national ‘competitiveness’, and these reforms were everywhere the same. They included more flexible labour markets, improved ‘incentives’ (positive at the upper end of the income distribution and negative at the bottom end), privatization and marketization both as weapons in the competition for location and cost reduction, and as a test of moral endurance. Distributional conflict was replaced by a technocratic search for the economically necessary and uniquely possible; institutions, policies and ways of life were all to be adapted to this end. It follows that all this was accompanied by the attrition of political parties—their retreat into the machinery of the state as ‘cartel parties’ [4] —with falling membership and declining electoral participation, disproportionately so at the lower end of the social scale. Beginning in the 1980s this was accompanied by a meltdown of trade-union organization, together with a dramatic decline in strike activity worldwide—altogether, in other words, a demobilization along the broadest possible front of the entire post-war machinery of democratic participation and redistribution. It all took place slowly, but at an increasing pace, developing with growing confidence into the normal state of affairs.

As a process of institutional and political regression the neoliberal revolution inaugurated a new age of post-factual politics. [5] This had become necessary because neoliberal globalization was far from actually delivering the prosperity for all that it had promised. [6] The inflation of the 1970s and the unemployment that accompanied its harsh elimination were followed by a rise in government debt in the 1980s and the restoration of public finances by ‘reforms’ of the welfare state in the 1990s. These in turn were followed, as compensation, by opening up generous opportunities for private households to access credit and get indebted. Simultaneously, growth rates declined, although or because inequality and aggregate debt kept increasing. Instead of trickle-down there was the most vulgar sort of trickle-up: growing income inequality between individuals, families, regions and, in the Eurozone, nations.

more here.

On ‘The Real People of Joyce’s Ulysses’ by Vivien Igoe

Dominic Green at The New Criterion:

The relation of Ulysses to literary realism is one question, its relation to reality another. By the “realness” of Ulysses, we usually mean Joyce’s representation of the inner lives of Leopold Bloom, Molly Bloom, and Stephen Dedalus, from the micturant scent of grilled kidneys in the morning to the affirmations with which Molly ends Bloom’s day. No writer in English since Sterne had unpicked the layers of language and consciousness so carefully; perhaps only Henry James had woven them together with as sharp an eye for detail. Yet our focus on Joyce’s method reflects more than his self-conscious technique and sophistication. It also reflects the distances between the novel’s conception and its composition, and between its composition and its reception.

Joyce wrote Ulysses between 1914 and 1921, in self-exile from Ireland. Sylvia Beach of Shakespeare & Co. published Ulysses in Paris on February 2, 1922, Joyce’s fortieth birthday. Notoriously, the subsequent journey of Ulysses to acceptance in the English-speaking world took longer than the original Ulysses’s return from Troy. Censorship controversies on both sides of the Atlantic turned Ulysses into one of those smutty books whose function is to register the tidemarks of artistic license. Meanwhile, the cultural distance between Dublin and the literary metropoles of Paris, London, and New York grew.

more here.

The pros and cons of the digitized Whitman and his “lost” novels

James McWilliams at the Paris Review:

Still, it’s hard not to feel perplexed about Walt’s reputation as technology and scholarly fortitude converge to hone in on his secret work. When I read The Life and Adventures of Jack Engle, it seemed obvious why Whitman had published it anonymously. The novel is essentially a formulaic blend of period-piece tropes and Horatio Alger moralizing. In terms of a literary contribution, it adds nothing. It was the bland stuff that newspapers paid for, payments that Whitman needed to underwrite poetry that would transform poetry. (Leaves of Grass was self-published on July 4, 1855.) Whitman edited his life as if it were a poem. As much as he would have preferred to burn the work he didn’t want others to see—as did his self-censorious contemporaries Melville, Hawthorne, and Dickinson—he had to publish it and trust that newsprint would hold his secrets. For more than 150 years, it did—a good run.

Literary scholars and historians exist in part to demythologize the past; it’s our job. As much as I wish there were a decent argument to prevent Turpin and others from using technology to knock the myth off of Whitman, there’s not. But, since it’s out there now, read the unearthed Whitman novel yourself. You might find that the satisfaction of knowing the full truth about Whitman is rapidly ephemeral.

more here.

On Not Letting Bastards Grind You Down

Yung In Chae in Avidly:

You may remember the “Romans Go Home” scene from Monty Python’s Life of Brian. In the dead of the night, a centurion catches Brian painting the (grammatically incorrect) Latin slogan Romanes eunt domus on the walls of Pontius Pilate’s house. Revolutionary graffiti should spell the end of Brian—except, it doesn’t, because the centurion is angry about neither the medium nor the message but the grammar, and he makes Brian write the correct version, Romani ite domum, one hundred times as punishment. By sunrise, anti-Roman propaganda covers the palace. In 2017, The Handmaid’s Tale painted another Latin phrase, nolite te bastardes carborundorum, on the walls of the Internet, and once again the centurion became the butt of the joke by not getting it. “Aha,” at least one “think piece”/centurion said, “the famous phrase from Margaret Atwood’s The Handmaid’s Tale is grammatically incorrect!” He then explained that while nolite and te are okay, bastardes is a bastardized (ha) word, and carborundorum comes from an older, gerundive-like name for silicon carbide (carborundum). This, he concluded, reveals a greater truth about the book and Hulu show, one so obvious that he doesn’t question what it is. You don’t have to be a classicist, or know Latin, in order to figure out that nolite te bastardes carborundorum is fake. You can just read The Handmaid’s Tale, which makes no pretension to its authenticity. Roughly two hundred pages in, the Commander informs Offred, who has discovered the phrase carved into her cupboard, “That’s not real Latin […] That’s just a joke.” Atwood herself told Elisabeth Moss the same thing in Time.

As for why it’s fake, history supplies the facts, or more accurately, fun facts. Does The Handmaid’s Tale reveal anything this time? “It’s sort of hard to explain why it’s funny unless you know Latin,” the centurion—I mean, Commander—says. Oh. Maybe he wanted to add, fun fact never means fun for everyone. After all, the Commander’s revelation horrifies Offred: “It can’t only be a joke. Have I risked this, made a grab at knowledge, for a mere joke?” She had been reciting nolite te bastardes carborundorum as a prayer, hoping for meaning. In the end, how real or fake the Latin is doesn’t matter for the meaning. Upon hearing the translation—“don’t let the bastards grind you down”—Offred understands why her Handmaid predecessor wrote the phrase, and realizes that the Commander had secretly met with her, too, because where else would she have learned it? The person who fails to understand or realize anything is, for all his Latin, the Commander.

More here.

In ‘Enormous Success,’ Scientists Tie 52 Genes to Human Intelligence

Carl Zimmer in The New York Times:

It’s still not clear what in the brain accounts for intelligence. Neuroscientists have compared the brains of people with high and low test scores for clues, and they’ve found a few. Brain size explains a small part of the variation, for example, although there are plenty of people with small brains who score higher than others with bigger brains. Other studies hint that intelligence has something to do with how efficiently a brain can send signals from one region to another. Danielle Posthuma, a geneticist at Vrije University Amsterdam and senior author of the new paper, first became interested in the study of intelligence in the 1990s. “I’ve always been intrigued by how it works,” she said. “Is it a matter of connections in the brain, or neurotransmitters that aren’t sufficient?” Dr. Posthuma wanted to find the genes that influence intelligence. She started by studying identical twins who share the same DNA. Identical twins tended to have more similar intelligence test scores than fraternal twins, she and her colleagues found. Hundreds of other studies have come to the same conclusion, showing a clear genetic influence on intelligence. But that doesn’t mean that intelligence is determined by genes alone. Our environment exerts its own effects, only some of which scientists understand well. Lead in drinking water, for instance, can drag down test scores. In places where food doesn’t contain iodine, giving supplements to children can raise scores. Advances in DNA sequencing technology raised the possibility that researchers could find individual genes underlying differences in intelligence test scores. Some candidates were identified in small populations, but their effects did not reappear in studies on larger groups. So scientists turned to what’s now called the genome-wide association study: They sequence bits of genetic material scattered across the DNA of many unrelated people, then look to see whether people who share a particular condition — say, a high intelligence test score — also share the same genetic marker.

In 2014, Dr. Posthuma was part of a large-scale study of over 150,000 people that revealed 108 genes linked to schizophrenia. But she and her colleagues had less luck with intelligence, which has proved a hard nut to crack for a few reasons. Standard intelligence tests can take a long time to complete, making it hard to gather results on huge numbers of people. Scientists can try combining smaller studies, but they often have to merge different tests together, potentially masking the effects of genes. As a result, the first generation of genome-wide association studies on intelligence failed to find any genes. Later studies managed to turn up promising results, but when researchers turned to other groups of people, the effect of the genes again disappeared. But in the past couple of years, larger studies relying on new statistical methods finally have produced compelling evidence that particular genes really are involved in shaping human intelligence.

More here.

Monday, May 22, 2017