The Bourne Identity: Randolph Bourne

Andrew J. Bacevich at The Baffler:

A hundred years ago, Randolph Bourne was a hot property—an intellectual wunderkind who was taking the American intellectual scene by storm. Bourne was the complete package: brilliant, charismatic, filled with social energy, and exquisitely attuned to the moment. Bourne’s essays appeared in leading periodicals like The Atlantic, The Dial, and The New Republic back when magazines set the American political and cultural agenda. Admirers considered him a visionary, an exponent of a humane new cosmopolitanism. True freedom and real democracy, he believed and exemplified, implied a spirit of tolerance, generosity, and creativity consummated in what he called “the beloved community.”

Barely two years after writing these words, Bourne became persona non grata. His offense involved not personal scandal—no violence, fraud, embezzlement, or sexual shenanigans—but something much, much worse: when the climate of opinion abruptly shifted, he refused to follow. They zigged, he zagged. While other members of the New York intelligentsia were swooning at the prospect of waging a war to end all wars that would make the world safe for democracy, Bourne dared to dissent. For this, they shut him out of virtually all the journals in which he had been publishing, and all respectable outlets generally.

more here.



Oliver Sacks on Mendeleev’s Garden

Oliver Sacks at The American Scholar:

The periodic table was incredibly beautiful, the most beautiful thing I had ever seen. I could never adequately analyze what I meant here by beauty—simplicity? coherence? rhythm? inevitability? Or perhaps it was the symmetry, the comprehensiveness of every element firmly locked into its place, with no gaps, no exceptions, everything implying everything else.

I was disturbed when one enormously erudite chemist, J. W. Mellor, whose vast treatise on inorganic chemistry I had started dipping into, spoke of the periodic table as “superficial” and “illusory,” no truer, no more fundamental than any other ad hoc classification. This threw me into a brief panic, made it imperative for me to see if the idea of periodicity was supported in any ways beyond chemical character and valency.

Exploring this took me away from my lab, took me to a new book that immediately became my bible, the CRC Handbook of Chemistry and Physics, a thick, almost cubical book of nearly three thousand pages, containing tables of every imaginable physical and chemical property, many of which, obsessively, I learned by heart.

I learned the densities, melting points, boiling points, refractive indices, solubilities, and crystalline forms of all the elements and hundreds of their compounds. I became consumed with graphing these, plotting atomic weights against every physical property I could think of.

more here.

Here’s What Actually Gets Terrorists To Tell The Truth — And It’s Not Torture

Peter Aldhous in BuzzFeed:

Rather than focusing on stress, the new interrogation research program has concentrated on interviewing techniques that help people remember details about events — and make it harder for liars to keep their story together.

Central to this approach is the “cognitive interview,” developed by Ronald Fisher, a psychologist at Florida International University in Miami. Rather than being asked a series of questions, suspects may be told to close their eyes and recall what happened at a key meeting, or draw a sketch of the room in which it took place. They are encouraged to go over events repeatedly and offer details whether or not they seem important.

In one test, Fisher’s team asked seasoned instructors at the Federal Law Enforcement Training Center in Glynco, Georgia, to get their colleagues to recall the details of meetings held to plan field exercises. Those who used a cognitive interview, rather than the standard approach of asking direct questions, extracted 80% more information.

This approach can also separate liars from truth-tellers. When recalling their experiences in a cognitive interview, people who are telling the truth give longer and more detailed answers.

More here.

NAOMI KLEIN’S CALL TO ARMS

Cornelia Parker in More Intelligent Life:

I’ve never met Klein, though I would very much like to—but I went to see her speak about her latest book in London. I was shocked by how young she still is, and obviously she’s beautiful, but mostly she is brave: she speaks her mind, articulately and powerfully. There are lots of people that wouldn’t like her to say the things she says, but she says them anyway, and that’s what I admire. The first book I read of hers was “No Logo”, and then “The Shock Doctrine”, which was amazing, and now “This Changes Everything”, which I think is much, much needed. Somebody needed to write an intelligent book about climate change and its politics, because politics is the main reason we are all so blindly riding the boat over the waterfall.

Klein takes the position that custodianship of the planet is at odds with capitalism. The so-called free world is actually run by corporations, and it’s not in their financial interest for us all to get alarmed about our future. (And this though they’ve got children.) What Klein has done is to flag up the huge amount of disinformation that stops us taking the action we need to take. Every politician should be duty-bound to act on climate change, but they don’t because there are too many corporate hands in their jar, as it were. Klein says that China may be able to act because it isn’t a democracy yet; its government can say “you can have only one child”, and everybody has to jump. China could lead a new economic world order because it will be able to make big changes fast. We can’t, because capitalism keeps interrupting. So somehow we’ve got to have a huge paradigm shift, and that’s what Klein is trying to tell us. It’s a call to arms.

More here.

Lack of sleep puts you at higher risk for colds

Hanae Armitage in Science:

Moms and sleep researchers alike have stressed the importance of solid shuteye for years, especially when it comes to fighting off the common cold. Their stance is a sensible one—skimping on sleep weakens the body’s natural defense system, leaving it more vulnerable to viruses. But the connection relied largely on self-reported, subjective surveys—until now. For the first time, a team of scientists reports that they have locked down the link experimentally, showing that sleep-deprived individuals are more than four times more likely to catch a cold than those who are well-rested. “It’s very nice to see an experiment looking at sleep as an important regulator for specific antiviral immune responses,” says Michael Irwin, a psychoneuroimmunologist at University of California (UC), Los Angeles, who is not involved with the study. “In this particular case, there’s a hard clinical outcome showing [sleep deprivation] and susceptibility to the common cold.”

In a carefully controlled two-part experiment, scientists began by collecting nightly sleep data on 164 healthy individuals for 1 week. Participants were asked to record the times at which they went to bed and woke up. They also wore small watchlike devices that use a technique called wrist actigraphy to monitor movement (much like a Fitbit tracks activity) while they slept. Aric Prather, lead author of the study and a sleep researcher at UC San Francisco, says that he and his colleagues associate the wrist actigraphy data with being awake—if during a reported sleep period, the wrist band records movement, they take that as an indication of wakefulness, and subtract the time spent moving from the hours asleep. Then came part two: the cold infections. Scientists quarantined participants in a hotel and gave them nose drops containing rhinovirus—the virus responsible for the common cold. They then closed off the hotel floor for 5 days, letting the hosts’ immune system do the rest. To ensure the most accurate results, researchers drew participants’ blood before the viral exposure to test for levels of rhinovirus antibody, a defensive agent in the immune system that recognizes and attacks rhinovirus. If they found high, preexisting levels of the protective protein, they removed the participant from the study so that prior immunity would not bias the infection rates of the group.

More here.

Wednesday Poem

Poem
by Rachel Zucker
The other day Matt Rohrer said, the next time you feel yourself
going dark in a poem, just don’t, and see what happens.
That was when Matt, Deborah Landau, Catherine Barnett, and I
were chatting, on our way to somewhere and something else.
In her office, a few minutes earlier, Deborah had asked, are you
happy? And I said, um, yes, actually, and Deborah: well, I’m not—
all I do is work and work. And the phone rang every thirty seconds
and between calls Deborah said, I asked Catherine
if she was happy and Catherine said, life isn’t about happiness
it’s about helping other people. I shrugged, not knowing how
to respond to such a fine idea. So, what makes you happy?
Deborah asked, in an accusatory way,
and I said, I guess, the baby, really, because he makes me stop
working? And Deborah looked sad
and just then her husband called and Deborah said, Mark, I’ve got
Rachel Zucker here, she’s happy,
I’ll have to call you back. And then we left her office and went
downstairs to the salon where a few weeks before
we’d read poems for the Not for Mothers Only anthology and I
especially liked Julie Carr’s poem about crying while driving while listening to
the radio report news of the war while her kids fought in the back
seat while she remembered her mother crying while driving, listening to
news about the war. There were a lot of poems that night
about crying, about the war, about fighting, about rage, anger, and work. Afterward
Katy Lederer came up to me and said, “I don’t believe in happiness”—
you’re such a bitch for using that line, now no one else can.
Deborah and I walked through that now-sedated space which felt
smaller and shabby without Anne Waldman and all those women and poems and suddenly
there was Catherine in a splash of sunlight at the foot of a flight
of stairs talking to Matt Rohrer on his way to a room or rooms I’ve never seen.
And that’s when Deborah told Matt that I was happy and that Catherine
thought life wasn’t about happiness and Deborah laughed a little and flipped
her hair (she is quite glamorous) and said, but Matt, are you happy? Well,
Matt said he had a bit of a cold but otherwise was and that’s when he said,
next time you feel yourself going dark in a poem, just don’t, and see
what happens. And then, because it was Julian’s sixth birthday, Deborah went
to bring him cupcakes at school and Catherine and I went to talk to
graduate students who teach poetry to children in hospitals and shelters and other
unhappy places and Matt went up the stairs to the room or rooms
I’ve never seen. That was last week and now I’m here, in bed, turning toward something I haven’t felt
for a long while. A few minutes ago I held our baby up to the bright
window and sang the song I always sing before he takes his nap. He whined and struggled
the way toddlers do, wanting to move on to something else, something
next, and his infancy is almost over. He is crying himself to sleep now and I will not say
how full of sorrow I feel, but will turn instead to that day, only a week
ago, when I was the happiest poet in the room, including Matt Rohrer.


Tuesday, September 1, 2015

Ian McEwan: when I was a monster

Ian McEwan in The Guardian:

In 1970, when I was 22, I moved to Norwich and lodged in a small, pleasant room on the edge of the city. I had come to do an MA in English at the University of East Anglia, but my overriding purpose was to write fiction. At the end of my first week, with all arrangements made, I sat down at a card table by the end of my bed one evening and told myself that I would work continuously through the night until I had completed an entire short story. I had no notes, only a scrap, a dreamy notion of what sort of story this would be.

Within an hour, a strange voice was talking to me from the page. I let it speak. I worked on into the night, filled with a romantic sense of myself, the writer heroically driven by a compelling idea, pushing on towards dawn as the city slept. I finished around 6 o’clock.

The story was called “Conversation with a Cupboard Man”, one of a handful I wrote that year that went into my first book, First Love, Last Rites, published in 1975. Its narrator was a man who didn’t want to grow up – a strange choice for me because I considered myself that year to have finally reached adult independence. Being in Norwich was the first major decision I’d taken in my life without reference to or advice from anyone else. I wanted a fresh start after undergraduate life. I regarded myself as a full-time committed writer. An MA was what I could do in my spare time. An academic grant would support me.

Other strange voices, other weird or wretched characters, surfaced in that year to haunt or infest my fiction. Violent, sexually perverse, lonely, they were remote from the life I was living in Norwich at the time. I was meeting many new friends, falling in love, keenly reading contemporary American fiction, hiking the North Norfolk coast, had taken a hallucinogenic drug in the countryside and been amazed – and yet whenever I returned to my notebook or typewriter, a savage, dark impulse took hold of me.

More here.

How Reliable Are Psychology Studies?

Ed Yong in The Atlantic:

No one is entirely clear on how Brian Nosek pulled it off, including Nosek himself. Over the last three years, the psychologist from the University of Virginia persuaded some 270 of his peers to channel their free time into repeating 100 published psychological experiments to see if they could get the same results a second time around. There would be no glory, no empirical eurekas, no breaking of fresh ground. Instead, this initiative—the Reproducibility Project—would be the first big systematic attempt to answer a question that has been vexing psychologists for years, if not decades: What proportion of results in their field are reliable?

A few signs hinted that the reliable proportion might be unnervingly small. Psychology has recently been rocked by several high-profile controversies, including the publication of studies that documented impossible effects like precognition, failures to replicate the results of classic textbook experiments, and some prominent cases of outright fraud.

More here.

Fuck Nuance

Kieran Healy, Associate Professor of Sociology at Duke University, at his own website:

Nuance is not a virtue of good sociological theory. Sociologists typically use it as a term of praise, and almost without exception when nuance is mentioned it is because someone is asking for more of it. I shall argue that, for the problems facing Sociology at present, demanding more nuance typically obstructs the development of theory that is intellectually interesting, empirically generative, or practically successful.

As alleged virtues go, nuance is supercially attractive. Isn’t the mark of a good thinker the ability to see subtle dišerences in kind or gracefully shade the meaning terms? Shouldn’t we cultivate the ability to insinuate overtones of meaning in our concepts? Further, isn’t nuance especially appropriate to the di›cult problems we study? I am sure that, like mine, your research problems are complex, rich, and multi-faceted. (Why would you study them if they were simple, thin, and one-dimensional?) When faced with problems like that, a cultivated capacity for nuance might seem to režect both the di›culty of the topic and the sophistication of the researcher approaching it. I am sure that, like me, you are a sophisticated thinker. When sophisticated people like us face this rich and complex world, how can nuance not be the wisest approach?

It would be foolish, not to say barely comprehensible, for me to try to argue against the idea of nuance in general. That would be like arguing against the idea of yellow, or the concept of ostriches. It does not make much sense, in any case, to think of nuance as something that has a distinctive role all of its own in theory, or as something that we can add to or take away from theory just as we please.

More here.

Stephen Colbert on Making The Late Show His Own

Joel Lovell in GQ:

It was early July, about nine weeks before the debut of The Late Show with Stephen Colbert, and we were sitting in his temporary office above a BMW dealership on the far west side of Manhattan. He looked very tired, and he was apologizing (unnecessarily) for rambling on in a way that was maybe a little uncomfortably overemotional. “I didn't leave the studio until 2 A.M. last night,” he said. “Didn't get to bed until three, and I've been traveling and just got here—.”

He'd been up late doing a strange stunt the night before, stepping in unannounced as host of Only in Monroe, a local public-access program in Monroe, Michigan, about forty miles south of Detroit. There was all sorts of pressure on their first show, he said. “First show! First show! Well, fuck the first show. There's going to be 202 this year—how do you do a first one? So I just wanted to go do a show someplace. And now we've done it.”

The idea was to do Only in Monroe more or less as it always is—same production values, same set and graphics and crew—just a ton more jokes. His first guests were the show's regular hosts, Michelle Bowman and (former Miss America) Kaye Lani Rae Rafko Wilson. (Colbert on-air: “I'm not sure how many people that is.”) He did Monroe news and the Monroe calendar, and about twenty minutes in, he brought out his next guest, “a local Michigander who is making a name for himself in the competitive world of music, Marshall Mathers.”

More here.

Ralph Waldo Emerson’s American poetry

Dan Chiasson at The New Yorker:

The listlessness of Emerson’s poetry is surprising, given the veneration he expressed for the art. Some of his best prose is devoted to lobbying for the special advantages of poetry. These works are thrilling because they are written in thrilling sentences. This does not necessarily imply that Emerson’s poetry will be thrilling, though he must have intended his large claims for poetry to be tested on his own work. Like many of his essays, “The Poet” was printed with an original short lyric as its epigraph. The mediocrity of these poem-epigraphs is often emphasized by the essays’ attempts to honor them as superior forms of expression. It makes for a strangely rigged contest between turbocharged prose and the rickshaw verse it ostensibly reveres. Emerson’s “poet”—a “complete man,” a “man without impediment,” a “sayer” and “namer,” like Adam—would not have printed the lacklustre verses appended to “The Poet,” which venerate “Olympian bards” and “divine ideas” with rhymes as bouncy as a Super Ball.

In “Merlin I,” written, like “The Poet,” in the eighteen-forties, Emerson plays the unwinnable game of arguing in metre against metre and in rhyme against rhyme:

Thy trivial harp will never please

Or fill my craving ear;

Its chords should ring as blows the breeze,

Free, peremptory, clear.

No jingling serenader’s art,

Nor tinkle of piano strings,

Can make the wild blood start

In its mystic springs.

more here.

on ‘Naked at Lunch: The Adventures of a Reluctant Nudist’

Michael Bywater at Literary Review:

The American edition of Naked at Lunch has the title in big upper-case letters, printed as though they were cutouts, windows onto the scene behind, showing a man on a slatted chair that appears to be on a ship. You can see the sea, and the kind of light you get at sea, which has inspired artists for…

You're right. This is avoidance behaviour so I'll just say it: the guy is NAKED, okay? NUDE. Plump, middle-aged, bald, grey goatee, specs and he's in the effing nude. On his lap is a MacBook Air. It's resting on his willy. His todger, for God's sake, wang, doodle, schlong, his PENIS is TOUCHING his COMPUTER. This must be the author on the nudist cruise, with 1,865 other naked people, and I really hope the ID tag around his neck doesn't say 'Access All Areas'.

So that's the naked author, with his whacker and his Mac, and this is his book about nudists and what they're like and what the hell they think they're doing. So, not unreasonably, the book is categorised as social science. In the USA.

But not here. Here in Britain, there's no nude author. The cover is whimsical, cartoony: there are little pink blobby people, sunbeds, a swimming pool and a very tanned woman with a poodle and a tent. And here in Britain, the category is travel writing.

more here.

Two books confront the challenges of growing up black in America

Gene Seymour at Bookforum:

Fine. Let’s start with “Negro,” or, if one prefers, “negro.” Even with this word’s present-day, often lower-case status, there are African Americans for whom “Negro” is a trigger word for outrage or affront. Some want the word excised altogether—which, at least to this African American, displays amnesia toward (or, worse, disrespect for) our collective history. Between the years 1900 and 1970 (give or take), “Negro” defined a people in transition through two world wars, a cultural renaissance, and a social and political movement that changed everything around it. Those who defined themselves as “Negro” flew airplanes to battle fascism, made their own movies, established baseball franchises, and used their hard-won education in law, the arts, and science to pull their people ahead with them, transforming a nation that otherwise refused to see them as they were, when it chose to see them at all. Where that other “N-word” demeaned and distorted (and still does, no matter who uses it), “Negro” dignified and elevated. After the ’60s had run their course, Negroes collectively agreed to shift to “black” because the other was no longer considered sufficient, or useful. It was outdated, perhaps. But an insult? Our grandparents and great-grandparents might beg to differ, no matter what they chose to call themselves.

I am, in short, riding the same train as Margo Jefferson, who may be even more bullish on the matter than I am, certainly more lyrical: “I still find ‘Negro’ a word of wonders, glorious and terrible. . . . A tonal-language word whose meaning shifts as setting and context shift, as history twists, lurches, advances, and stagnates. As capital letters appear to enhance its dignity; as other nomenclatures arise to challenge its primacy.”

more here.

Why can’t we stop for death?

John Gray in New Statesman:

When he was entering what he knew would be the final stage of his terminal illness, Bob Monkhouse used to joke that the terrible thing about dying was how stiff it left you feeling the next day. There is something pleasantly cavalier in the comedian’s quip. Why make a tragedy of something that will happen to us all? Perhaps we’d be wiser if we didn’t think of death at all, but instead – as the philosopher Spinoza recommended – only of life. But that kind of wisdom seems to be beyond our capacity. The human preoccupation with death is pervasive and universal, and every society offers remedies for the anxiety that the fact of mortality evokes. Religions have their afterlives, while secular faiths offer continuity with some larger entity – nations, political projects, the human species, a process of cosmic evolution – to stave off the painful certainty of oblivion. In their own lives, human beings struggle to create an image of themselves that they can project into the world. Careers and families prolong the sense of self beyond the grave. Acts of exceptional heroism and death-defying extreme sports serve a similar impulse. By leaving a mark, we can feel we are not just fleeting individuals who will soon be dead and then forgotten.

Against this background, it might seem that the whole of human culture is an exercise in death denial. This is the message of Stephen Cave’s thoughtful and beautifully clear Immortality: the Quest to Live For Ever and How It Drives Civilisation (2012). A more vividly personal but no less compelling study of our denial of death is presented in Caitlin Doughty’s Smoke Gets in Your Eyes: and Other Lessons from the Crematorium (2015), in which the author uses her experience of working at a Californian funeral parlour to show how contemporary mortuary practice – removing the corpse as quickly as possible, then prettifying it so that it almost seems alive – serves to expel the fact of death from our lives.

More here.

Psychology Is Not in Crisis

Lisa Feldman Barrett in The New York Times:

An initiative called the Reproducibility Project at the University of Virginia recently reran 100 psychology experiments and found that over 60 percent of them failed to replicate — that is, their findings did not hold up the second time around. The results, published last week in Science, have generated alarm (and in some cases, confirmed suspicions) that the field of psychology is in poor shape. But the failure to replicate is not a cause for alarm; in fact, it is a normal part of how science works. Suppose you have two well-designed, carefully run studies, A and B, that investigate the same phenomenon. They perform what appear to be identical experiments, and yet they reach opposite conclusions. Study A produces the predicted phenomenon, whereas Study B does not. We have a failure to replicate. Does this mean that the phenomenon in question is necessarily illusory? Absolutely not. If the studies were well designed and executed, it is more likely that the phenomenon from Study A is true only under certain conditions. The scientist’s job now is to figure out what those conditions are, in order to form new and better hypotheses to test.

A number of years ago, for example, scientists conducted an experiment on fruit flies that appeared to identify the gene responsible for curly wings. The results looked solid in the tidy confines of the lab, but out in the messy reality of nature, where temperatures and humidity varied widely, the gene turned out not to reliably have this effect. In a simplistic sense, the experiment “failed to replicate.” But in a grander sense, as the evolutionary biologist Richard Lewontin has noted, “failures” like this helped teach biologists that a single gene produces different characteristics and behaviors, depending on the context.

More here.

Monday, August 31, 2015

Sunday, August 30, 2015

Restoring Henry Kissinger

Michael O'Donnell in Washington Monthly:

In 1940 the young Henry Kissinger, caught in a love quadrangle, drafted a letter to the object of his affections. Her name was Edith. He and his friends Oppus and Kurt admired her attractiveness and had feelings for her, the letter said. But a “solicitude for your welfare” is what prompted him to write—“to caution you against a too rash involvement into a friendship with any one of us.”

I want to caution you against Kurt because of his wickedness, his utter disregard of any moral standards, while he is pursuing his ambitions, and against a friendship with Oppus, because of his desire to dominate you ideologically and monopolize you physically. This does not mean that a friendship with Oppus is impossible, I would only advise you not to become too fascinated by him.

Kissinger disclaimed any selfish motive for writing, loftily quoted from Washington’s farewell address, and regretted with some bitterness Edith’s failure to read or comment on the two school book reports he had sent her. Would she please return them for his files?

It is unfair to judge a man’s character by a jealous letter that he drafted (and did not send) at age sixteen. Yet here, to a remarkable extent, is the future nuclear strategist, national security advisor, and secretary of state. The reference to Edith’s attractiveness bespeaks the charm and flattery for which Kissinger would become famous. Secrecy and deceit are present also: he went behind his friends’ backs and coyly advised against a relationship with “any one of us,” which of course really meant the other guys. By trashing his buddies in order to get a girl, Kissinger displayed ruthlessness. The letter is written in what Christopher Hitchens memorably described as Kissinger’s “dank obfuscatory prose,” which relies on clinical-sounding phrases like “dominate you ideologically.” And, of course, the letter betrays vanity. How could anyone fail to be dazzled by his book reports!

More here.