His Subject, Himself: Claudia Roth Pierpont’s ‘Roth Unbound’

Martin Amis in The New York Times:

American anti-Semitism, which was running high throughout the 1930s, steadily increased after the onset of war. During the entire period, polls showed, well over a third of the populace stood ready to back discriminatory laws. Nor was this a mere offshoot of the general xenophobia spawned by isolationism. Every synagogue in Washington Heights was desecrated (and some were smeared with swastikas); in Boston, beatings, wreckings and defilements had become near-daily occurrences by 1942. The disgraceful fever, which ruled out all but a trickle of immigration and so cost countless lives, reached its historic apogee in 1944 — by which time the Holocaust was more or less complete. And what of the media? News of the killings emerged in May/June 1942: a verified report with a figure of 700,000 already dead. The Boston Globe gave the story the three-column headline “Mass Murders of Jews in Poland Pass 700,000 Mark,” and tucked it away at the foot of Page 12. The New York Times quoted the report’s verdict — “probably the greatest mass slaughter in history” — but gave it only two inches. We may venture to say that such reticence is slightly surprising, given that the historiography of the events outlined above now runs to many tens of thousands of volumes.

Philip Roth would use this soiled and feckless backdrop in “The Plot Against America” (2004), his 26th book; but anti-Semitism and its corollary, anti-anti-Semitism, wholly dominated the publication of his first, “Goodbye, Columbus and Five Short Stories” (1959). “What is being done to silence this man?” asked a rabbi. “Medieval Jews would have known what to do with him.” Roth’s cheerful debut, some thought, shored up the same “conceptions . . . as ultimately led to the murder of six million in our time.” So he wasn’t only contending with a “rational” paranoia; he was also ensnared in the ongoing anguish of comprehension and absorption, as the sheer size of the trauma inched into consciousness. After a hate-filled public meeting at Yeshiva University in New York in 1962, Roth solemnly swore (over a pastrami sandwich) that he would “never write about Jews again.”

It was a hollow vow.

More here.

Friday, October 18, 2013

Restoring F. P. Ramsey


David Papineau reviews Margaret Paul's Frank Ramsey (1903-1930): A Sister’s Memoir, in the TLS:

F. P. Ramsey has some claim to be the greatest philosopher of the twentieth century. In Cambridge in the 1920s, he singlehandedly forged a range of ideas that have since come to define the philosophical landscape. Contemporary debates about truth, meaning, knowledge, logic and the structure of scientific theories all take off from positions first defined by Ramsey. Equally importantly, he figured out the principles governing subjective probability, and so opened the way to decision theory, game theory and much work in the foundations of economics. His fertile mind could not help bubbling over into other subjects. An incidental theorem he proved in a logic paper initiated the branch of mathematics known as Ramsey theory, while two articles in the Economic Journal pioneered the mathematical analysis of taxation and saving.

Ramsey died from hepatitis at the age of twenty-six in 1930. For some geniuses, an early death accelerates the route to canonization. But for Ramsey it had the opposite effect. Ramsey’s death coincided with Ludwig Wittgenstein’s return to Cambridge after his reclusive years in the Austrian Alps. The cult surrounding Wittgenstein quickly caught fire, and for the next fifty years dominated philosophy throughout the English-speaking world. By the time it subsided, Ramsey had somehow been relegated to a minor role in history, a footnote to an archaic Cambridge of Russell, Keynes and the Bloomsbury set.

More here.

The Essence of the Japanese Mind: Haruki Murakami and the Nobel Prize


Amanda Lewis in the LA Review of Books:

BRITISH ODDSMAKERS Ladbrokes gave 64-year-old Japanese novelist Haruki Murakami a 3-1 chance of winning the 2013 Nobel Prize in Literature. Last year, they had it at 8-1. He didn’t get it this year, losing to Alice Munro, but he has a good shot in 2014 or 2015. If he wins, he’ll be the third Japanese writer to receive the prize, but he is nothing like the two men who came before him.
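For readers unfamiliar with bookmakers' fractional odds, converting them to an implied probability is simple arithmetic. Here is a minimal sketch (a generic illustration with a function name of my own, not anything from the article):

```python
def implied_probability(numerator, denominator=1):
    # Fractional odds of N-M pay out N for every M staked, so the
    # bookmaker's implied probability of the outcome is M / (N + M).
    return denominator / (numerator + denominator)

print(implied_probability(3))  # 3-1 odds -> 0.25
print(implied_probability(8))  # 8-1 odds -> about 0.11
```

By this reckoning, Ladbrokes' 3-1 odds amount to an implied 25 percent chance of a Murakami win, up from roughly 11 percent at the previous year's 8-1.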

The first — Yasunari Kawabata, in 1968 — was cited by the Nobel Committee “for his narrative mastery, which with great sensibility expresses the essence of the Japanese mind.” The second — Kenzaburō Ōe, in 1994 — once wrote that, “The role of literature, insofar as man is obviously a historical being, is to create a model of a contemporary age which encompasses past and future, a model of the people living in that age as well.”

Murakami, however, does not write to capture “the essence of the Japanese mind,” and his characters are not meant to be “a model” of any era or group.

“I fully believe it is the novelist's job to keep trying to clarify the uniqueness of each individual soul by writing stories,” he said when accepting the Jerusalem Prize in 2009, rejecting the conventional notion of Japan as not only racially homogenous but also somehow intellectually and emotionally the same from mind to mind, from prefecture to prefecture.

In fact, from his debut in 1979 until the early 1990s, Murakami wrote bestselling Japanese novels that were almost aggressively un-Japanese. And the Tokyo literati, particularly Ōe, hated him. Hated that his magical tales dripped with brand names and references to American pop culture. Hated that he wrote out his first novel in English before rewriting it in his native language. Hated that he never wrote about war.

More here.

A Philosophy of Tickling


Aaron Schuster in Cabinet magazine:

Aristotle famously defined man as the rational animal (zoon echon logon), and as the political animal (zoon politikon). But there are also passages in his work that indicate another less remarked upon, though no less profound, definition. In Parts of Animals, he writes: “When people are tickled, they quickly burst into laughter, and this is because the motion quickly penetrates to this part, and even though it is only gently warmed, still it produces a movement (independently of the will) in the intelligence which is recognizable. The fact that human beings only are susceptible to tickling is due (1) to the fineness of their skin and (2) to their being the only creatures that laugh.” Perhaps this notion of the “ticklish animal” was further elaborated in the second book of the Poetics, the lost treatise on comedy; indeed, the relationship between ticklish laughter and comic laughter remains an open question. Should tickling be investigated under the heading of comedy or of touch? Touch, Aristotle argues, is the most primary sense, and human beings are uniquely privileged in possessing the sharpest sense of touch thanks to the delicate nature of their skin. Though other animals have more advanced smell or hearing, “man’s sense of touch … excels that of all other animals in fineness.” We might view tickling as a side effect of the hyper-sensitivity of human touch. Our peculiar vulnerability to tickling is the price to be paid for more sophisticated and discriminating access to the world.

Why do people laugh when tickled? Why can’t you tickle yourself? Why are certain parts of the body more ticklish than others? Why do some people enjoy tickling and others not? And what is tickling, after all? In fact, it is not so easy to pin down the phenomenon, as a number of scientific studies attest. Here’s one attempt at a precise description: “Tickle may thus be finally defined as an intensely vivid complex of unsteady, ill-localized and ill-analyzed sensation, with attention distributed over the immediate sensory contents and the concomitant sensations reflexly evoked.”

More here.

Why Scandinavian Prisons Are Superior


Doran Larson in The Atlantic:

It’s a postcard-perfect day on Suomenlinna Island, in Helsinki’s South Harbor. Warm for the first week of June, day trippers mix with Russian, Dutch, and Chinese tourists sporting sun shades and carrying cones of pink ice cream.

“Is this the prison?” asks a 40-something American woman wearing cargo pants and a floral sleeveless blouse.

Linda, my guide and translator, pauses beside me between the posts of an open picket fence. After six years of teaching as a volunteer inside American prisons, I’ve come from the private college where I work to investigate the Scandinavian reputation for humane prisons. It’s the end of my twelfth prison tour, and I consider the semantics of the question: If you can’t tell whether you’re in a prison, can it be a prison? I’ve never considered this in so many words. Yet I find that I know the answer, having felt it inside a prison cell in Denmark: There is no punishment so effective as punishment that nowhere announces the intention to punish. Linda is an intern working on a degree in public policy. Young and thoroughly practical, she smiles and says to the tourists, “Yes, you are here.”

The Americans look shocked and afraid. The father glances at his wife. The wife cocks her head back, as though she’s ventured too far. The son—fit as my lacrosse-playing students—takes a step in reverse, just outside the gate, and says to his mother: “I told you.” (Linda clearly wonders what she’s said to cause such a reaction.)

Then the son adds, his voice cracking on a nervous attempt at sarcasm: “It’s sure reassuring to know we’re being protected from criminals.”

More here.

Physics: What We Do and Don’t Know


Steven Weinberg in the NYRB:

The modern era of scientific cosmology began forty-eight years ago, with the accidental discovery of the cosmic microwave background radiation. So much for the steady state cosmology—there was an early universe. This microwave radiation has been under intensive study since the mid-1960s, both from unmanned satellites in orbit and from large ground-based radio telescopes. Its present temperature is now known to be 2.725 degrees Centigrade above absolute zero. When this datum is used in calculations of the formation of the nuclei of atoms in the first three minutes after the start of the big bang, the predicted present abundance of light elements (isotopes of hydrogen, helium, and lithium) comes out pretty much in agreement with observation. (Heavier elements are known to be produced in stars.)

More important than the measurement of the precise value of the temperature is the discovery in 1992 that the temperature of the microwave radiation is not the same throughout the sky. There are small ripples in the temperature, fluctuations of about one part in a hundred thousand. This was not entirely a surprise. There would have to have been some such ripples, caused by small lumps in the matter of the early universe that are needed to serve as seeds for the later gravitational condensation of matter into galaxies.

These lumps and ripples are due to chaotic sound waves in the matter of the early universe. As long as the temperature of the universe remained higher than about 3,000 degrees, the electrons in this hot matter remained free, continually scattering radiation, so that the compression and rarefaction in the sound waves produced a corresponding variation in the intensity of radiation. We cannot directly see into this era, because the interaction of radiation and free electrons made the universe opaque, but when the universe cooled to 3,000 degrees the free electrons became locked in hydrogen atoms, and the universe became transparent. The radiation present at that time has survived, cooled by the subsequent expansion of the universe, but still bearing the imprint of the sound waves that filled the universe before it became transparent.

More here.

Why we need fairytales: On Oscar Wilde

Jeanette Winterson in The Guardian:

The work Wilde is remembered for was written over a period of less than 10 years. The Happy Prince and Other Tales was published in 1888. That volume marks the beginning of Wilde's true creativity. He had lectured extensively in the US – but that would not have won him any lasting legacy, any more than his journalism or his poems. He had published a great many poems, but Wilde was a bad poet – he rarely found the right words and he was old-fashioned. Read him next to Emily Dickinson, Walt Whitman or WB Yeats, and you will see for yourself. We don't read his poetry now – it is dated and dead; too much Arcady and Hellenic Hours. The early plays suffer from the same verbal excess. Wilde at his worst wrote in purple. At his best he is dazzling.

The birth of his children seems to have regenerated Wilde as a writer. The tedious Hellenism vanished. The purple-isms faded. There are still overwritten images – Dawn's grey fingers clutching at the stars – and he never gives up his fondness for a biblical moment, usually appearing as precious stones or pronouns (thee and thy), but his style did change. The writing became freer and sharper, and also more self-reflective, without being self-absorbed.

Academic criticism of Wilde's work has too often dismissed these fairy stories as a minor bit of sentimentalism scribbled off by an iconoclast during a temporary bout of babymania. But since JK Rowling's Harry Potter and Philip Pullman's His Dark Materials, children's literature has been repositioned as central, not peripheral, shifting what children read, what we write about what children read, and what we read as adults. At last we seem to understand that imagination is ageless. Wilde's children's stories are splendid. In addition, it seems to me that they should be revisited as a defining part of his creative process.

More here.

Cats see the world in an entirely different light

Tia Ghose in Live Science:

Cats' fondness for pouncing on feet and feathery toys may be rooted in their hunting instinct, but it also has a lot to do with their unique vision. And, as it turns out, scientists know a lot about what cats see. Now, a new set of images, by artist Nickolay Lamm, tries to capture the differences between cat vision and human vision. Whereas humans are able to see more vibrant colors during the day, their feline companions have the edge when it comes to peripheral vision and night vision.

Cats have a wider field of view — about 200 degrees, compared with humans' 180-degree view. Cats also have a greater range of peripheral vision, all the better to spot that mouse (or toy) wriggling in the corner. Cats are crepuscular, meaning they are active at dawn and dusk. That may be why they need such good night vision. Their eyes have six to eight times more rod cells, which are more sensitive to low light, than humans do. In addition, cats' elliptical eye shape and larger corneas and tapetum, a layer of tissue that may reflect light back to the retina, help gather more light as well. The tapetum may also shift the wavelengths of light that cats see, making prey or other objects silhouetted against a night sky more prominent, Kerry Ketring, a veterinarian with the All Animal Eye Clinic in Whitehall, Mich., wrote in an email. Their extra rod cells also allow cats to sense motion in the dark much better than their human companions can.

More here.

Friday Poem

Queen Anne's Lace

Her body is not so white as
anemone petals nor so smooth – nor
so remote a thing. It is a field
of the wild carrot taking
the field by force; the grass
does not rise above it.
Here is no question of whiteness,
white as can be, with a purple mole
at the center of each flower.
Each flower is a hand's span
of her whiteness. Wherever
her hand has lain there is
a tiny purple blemish. Each part
is a blossom under his touch
to which the fibers of her being
stem one by one, each to its end,
until the whole field is a
white desire, empty, a single stem,
a cluster, flower by flower,
a pious wish to whiteness gone over –
or nothing.

by William Carlos Williams
from The Collected Poems: Volume 1 1909-1939
New Directions Books, 1986

Study reveals brain ‘takes out the trash’ while we sleep

From EurekAlert:

In findings that give fresh meaning to the old adage that a good night's sleep clears the mind, a new study shows that a recently discovered system that flushes waste from the brain is primarily active during sleep. This revelation could transform scientists' understanding of the biological purpose of sleep and point to new ways to treat neurological disorders.

“This study shows that the brain has different functional states when asleep and when awake,” said Maiken Nedergaard, M.D., D.M.Sc., co-director of the University of Rochester Medical Center (URMC) Center for Translational Neuromedicine and lead author of the article. “In fact, the restorative nature of sleep appears to be the result of the active clearance of the by-products of neural activity that accumulate during wakefulness.”

The study, which was published today in the journal Science, reveals that the brain's unique method of waste removal – dubbed the glymphatic system – is highly active during sleep, clearing away toxins responsible for Alzheimer's disease and other neurological disorders. Furthermore, the researchers found that during sleep the brain's cells reduce in size, allowing waste to be removed more effectively.

The purpose of sleep is a question that has captivated both philosophers and scientists since the time of the ancient Greeks. When considered from a practical standpoint, sleep is a puzzling biological state. Practically every species of animal from the fruit fly to the right whale is known to sleep in some fashion. But this period of dormancy has significant drawbacks, particularly when predators lurk about. This has led to the observation that if sleep does not perform a vital biological function then it is perhaps one of evolution's biggest mistakes.

While recent findings have shown that sleep can help store and consolidate memories, these benefits do not appear to outweigh the accompanying vulnerability, leading scientists to speculate that there must be a more essential function to the sleep-wake cycle.

The new findings hinge on the discovery last year by Nedergaard and her colleagues of a previously unknown system of waste removal that is unique to the brain.

More here.

The Education of Abraham Cahan (and Seth Lipsky)

Ezra Glinter in Forward:

On the morning of August 24, 1929, an Arab mob attacked the Jewish population of Hebron. Homes were pillaged, synagogues were desecrated, and scores of people were murdered or maimed. In the final tally, some 67 Jews were killed.

A few days later, Abraham Cahan, then at the height of his power as editor of the Yiddish-language Forverts, editorialized on the massacre. Although his words stood in contrast to those of the Communist newspaper Frayhayt, which considered the attack a revolt against British and Zionist imperialism, they expressed a nuanced point of view. The underlying cause of the massacre, he wrote, was a universal failing — a “dark chauvinism” that was at “the root of all wars, of all misfortunes.” But he also compared the tragedy to a “Third Destruction,” invoking the sacking of the temples in Jerusalem and placing the event in a history of specifically Jewish suffering. National identity could be the cause of conflict, he realized, but also its target.

Cahan had not always been so sensitive to Jewish trauma; 48 years earlier, when pogroms broke out in Ukraine after the assassination of Czar Alexander II, he was downright indifferent. “Even though the pogrom brought dread into the heart of every Jew, I must admit that the members of my group were not disturbed by it,” he later wrote in his autobiography, “The Education of Abraham Cahan.” “We regarded ourselves as human beings, not as Jews. There was only one remedy for the world’s ills, and that was socialism.”

More here.

Thursday, October 17, 2013

Bleak House

Charles Simic in the New York Review of Books:

Some of us notice them, while others don’t seem to, even though there are 46.5 million of them according to the latest census and they are everywhere if one cares to look. A tall man in his late fifties, whose portrait might have once hung over the boss’s desk in some company office, packing grocery bags in a supermarket with grim efficiency; a meek-looking old couple in a drug store waiting their turn at the cash register with a bottle of generic ibuprofen and a box of tissues, who, upon learning the price for each put the tissues aside and pay with small change for the painkiller; a handsome, middle-aged father, unshaven and looking unkempt, waiting with his small son for a school bus outside a modest home in the suburbs; the tired and resigned look of fast food workers and store clerks in a mall, some of them young, but many of them middle-aged and even older, most of them being paid minimum wage for their work and needing an additional job, food stamps, or some other form of government assistance to support their families; a soup kitchen in New York with people who could be one’s relatives waiting patiently in line.

Anyone who averts his eyes from the hopeless lives many of our fellow citizens lead and tells himself and others that these men and women only have themselves to blame, is either a fool or a soulless bastard.

More here.

Key to Ants’ Evolution May Have Started With a Wasp

Carl Zimmer in the New York Times:

One factor in the spectacular success of ants is their social life. They live in large colonies in which they divide the labor of finding food, rearing their young and defending their nests. Their societies are so complex that some scientists have studied ants as a way to understand the factors behind our own evolution into a social species.

It’s thus no surprise that many biologists — Dr. Ward among them — have long wondered how ants evolved. In the journal Current Biology, Dr. Ward and colleagues at the University of California, Davis, and the American Museum of Natural History, have now published an evolutionary tree of ants and their closest relatives that may provide the answer.

The authors conclude that the ancestors of ants were wasps. Not just any wasps, though: the closest relatives of ants turn out to include mud dauber wasps, which make pipe-shaped nests on the walls of buildings.

More here.

The Larkin Amis friendship

Neil Powell at the Times Literary Supplement:

The best recipe for a successful literary friendship, as perhaps for any other sort, is a solid base of common ingredients spiced with touches of absolute difference: English literature’s most celebrated double acts – Wordsworth and Coleridge, Auden and Isherwood, Amis and Larkin – are all like that. Each pair has a background of broadly compatible class and education, shared interests and cultural tastes; yet as writers they diverge in ways that both sustain and endanger their relationships. Kingsley Amis had an explicit, if ironic, sense of the footsteps in which he and Philip Larkin were following: “Well with you as the Auden and me as the Isherwood de nos jours, ‘our society’ is not doing so bad”, he told his friend in October 1957.

That date – on the face of things quite a late one for a pair who had met at Oxford sixteen years earlier – is in itself significant. Until the mid-1950s, it hadn’t been at all certain who was to be the Auden and who the Isherwood. That became clear not so much with the publication as with the critical reception of Amis’s Lucky Jim in 1954 and Larkin’s The Less Deceived in 1955.

More here.

The American Way of Death

Bess Lovejoy at Lapham's Quarterly:

When The American Way of Death was published in 1963, a typical funeral in the United States involved thorough embalming, a spackling of cosmetics, and hundreds of dollars of flowers heaped upon an open casket. The whole affair was likely to cost over a thousand dollars, at a time when the median annual income in the country was just $5,600. A funeral was often one of the largest single expenditures a family made, after a house and car, yet morticians justified the hefty price tag by claiming the ceremony offered almost magical curative powers. They placed particularly great stock in providing the bereaved with a reassuring last image of the deceased, known in the industry as a “memory picture.” In the trade literature of the time, this image was seen as a balm to grief, a means of facing up to the finality of death without countenancing the disquieting signs of decomposition. The cost—Americans spent $1.6 billion on funerals in 1960—was also said to be a natural result of consumer desires, particularly the much trumpeted postwar American desire to have the best of everything.

More here.

In Antarctica, maps are a little different

Sara Wheeler at More Intelligent Life:

Captain Scott discovered the 34-mile Taylor Glacier on his first expedition, the one that sailed south in 1901. Named after Griffith Taylor, one of Scott's geologists, the glacier lies at the head of an arid valley created by the advances and retreats of glaciers through the Transantarctic Mountains. These dry valleys, partially free of ice for about 4m years, are dotted with saltwater basins—you can see some of them on the map—and they form one of the most extreme deserts in the world. NASA tested robotic probes there before dispatching them on interplanetary missions. One of the engineers told me, “This is as close to Mars as we can get.”

My crew at Lake Bonney, funded by the National Science Foundation, were melting holes in the 12-foot lid of ice that covered the lake and lowering sediment traps to the bottom. It was complex, fraught and expensive work, and their shifts often extended to 30 hours straight. Down south, it’s hard to keep an ice hole open for three months. The team waged a constant battle against the Big Freeze. Over supper (usually pasta with freeze-dried vegetable sauce), the guys—yup, all guys—talked about the organic carbon sloshing around at the bottom of the lake and the ribboned crystals trapped in the ice cover, and asked each other questions about the microbial life going about its business in the soupy, nitrate-rich water.

More here.

Thursday Poem

World Greater Than We Make

—ruin: total destruction or disintegration
rendering something formless, useless, or
valueless. -American Heritage Dictionary

1.
The ruin I'm thinking of spilled down the entire hillside,
blocks of stone like dice shaken in the cup of the sky.
A place that invites whispering or song. Darkly inviting
as the slaughterhouse collapsed beside a lonely road.
Crazed china, the lightning-bolt symmetry of cracks.
A ruin is the bones of a thing slowly exposed by time,
wind, water. The fin of an old Caddy rusting in a circle
of trees on a high ridge. The ribs of a boat
rocking on sand. Colors only waiting can paint.
All gashes are old gashes. No blood.

2.
I wander Manaus, city of perpetual ruin,
where the walls grew new layers all by themselves,
where a man sits writing a novel neither of us knows
I will translate eight years later. I am alone
and inexplicably happy. I lean back off the curb
into traffic to fill my eyes with the grand,
decayed facade, spindly papaya trees
at attention in second story windows.
I bend to the gutter for a chip of tile, twirl it
under my chin like a buttercup: it says
I love this accident.

3.
The dictionary is dead (and it was a good one): the ruin
is spilling still. The farthest thing from worthless,
it is a sadness made beautiful over time—or is it beauty
mad sad? A silent clash of meanings we can
picnic in. We say a lot about ourselves
with the buildings we build, but their ruins speak
with other than human voices. See how ruins gladly
join earth and sky, how the natural world bends down, creeps in,
to meet them. More fools we, to preserve or construct
the stone and wood we borrow, instead of simply watching
as gorgeous chaos slowly gathers back its own.
by Ellen Doré Watson
from We Live in Bodies
Alice James Books, 1997

Kanye West Knows You Think He Sounded Nuts on Kimmel

Cord Jefferson in Gawker on plausibly deniable racism:

I think one of the most damaging effects America's omnipresent racism has on a person's psyche isn't the brief pang of hurt that comes from being called a slur, or seeing a picture of Barack Obama portrayed as a chimpanzee. Those things are common and old-fashioned, and when they happen I tend to feel more sad than angry, because I'm seeing someone who engages with the world like a wall instead of a human being. Rather, I think what's far more corrosive and insidious, the thing that lingers in the back of my mind the most, is the framework of plausible deniability built up around racism, and how insane that plausible deniability can make a person feel when wielded. How unsure of oneself. How worried that you might be overreacting, oversensitive, irrational.

Read more here.