ken russell (1927-2011)


It is the commonplace fate of British cinema’s more visionary talents to end their careers marginalised and even mocked. This was certainly what happened to Ken Russell, who has died aged 84. In his latter years, with his shock of white hair and his red face, the director cut a cantankerous and slightly buffoonish figure. He asked for money for interviews. His greatest work wasn’t much in circulation. Those who knew him from such lesser efforts as The Fall of the Louse of Usher (2002), his eccentric and low-budget Edgar Allan Poe adaptation, or for his Cliff Richard and Sarah Brightman videos, were probably baffled that he had such a glowing reputation. The director’s son Alex Verney-Elliott said his father had died in hospital after a series of strokes. Russell’s widow, Elize, said she was “devastated” by her husband’s death, which had been “completely unexpected”. Even in his pomp, he had always been a figure of considerable controversy. He was so often called the “enfant terrible” of British film that no one paid as much attention to his craftsmanship as they should have done.

more from Geoffrey Macnab at The Independent here.

Tuesday Poem

Vocation

Each cold October morning he went out
into the Gate Field and walked up and down,
like the horse-drawn seed-drill quartering every inch
to make sure the harvest was kept constant,
reading his Office, every sentence
of the forty pages for the day. In the evening,
as the colder darkness fell with the crows’
harsh calling, he sat alone in the back
benches of the unheated chapel, hour
after hour, staring for inspiration
at the golden, unresponsive tabernacle.

by Bernard O'Donoghue
Publisher: PIW, © 2011

Aging stem cells may explain higher prevalence of leukemia, infections among elderly

From PhysOrg:

Human stem cells aren't immune to the aging process, according to scientists at the Stanford University School of Medicine. The researchers studied hematopoietic stem cells, which create the cells that comprise the blood and immune system. Understanding when and how these stem cells begin to falter as the years pass may explain why some diseases, such as acute myeloid leukemia, increase in prevalence with age, and also why elderly people tend to be more vulnerable to infections such as colds and the flu.

“We know that immune system function seems to decline with increasing age,” said Wendy Pang, MD. “This is the first study comparing the function and gene expression profiles of young and old purified, human hematopoietic stem cells, and it tells us that these clinical changes can be traced back to stem cell function.” Specifically, the researchers found that hematopoietic stem cells from healthy people over age 65 make fewer lymphocytes — cells responsible for mounting an immune response to viruses and bacteria — than stem cells from healthy people between ages 20 and 35. (The cells were isolated from bone marrow samples.) Instead, elderly hematopoietic stem cells, or HSCs, have a tendency to be biased in their production of another type of white blood cell called a myeloid cell. This bias may explain why older people are more likely than younger people to develop myeloid malignancies.

More here.

Human Nature’s Pathologist

Carl Zimmer in The New York Times:

Dr. Pinker has focused much of his research on language on a seemingly innocuous fluke: irregular verbs. While we can generate most verb tenses according to a few rules, we also hold onto a few arbitrary ones. Instead of simply turning “speak” into “speaked,” for example, we say “spoke.” As a young professor at the Massachusetts Institute of Technology, he pored over transcripts of children’s speech, looking for telling patterns in the mistakes they made as they mastered verbs. Out of this research, he proposed that our brains contain two separate systems that contribute to language. One combines elements of language to build up meaning; the other is like a mental dictionary we keep in our memory.

This research helped to convince Dr. Pinker that language has deep biological roots. Some linguists argued that language simply emerged as a byproduct of an increasingly sophisticated brain, but he rejected that idea. “Language is so woven into what makes humans human,” he said, “that it struck me as inconceivable that it was just an accident.” Instead, he concluded that language was an adaptation produced by natural selection. Language evolved like the eye or the hand, thanks to the way it improved reproductive success. In 1990 he published a paper called “Natural Language and Natural Selection,” with his student Paul Bloom, now at Yale. The paper was hugely influential. It also became the seed of his breakthrough book, “The Language Instinct,” which quickly became a best seller and later won a place on a list in the journal American Scientist of the top 100 science books of the 20th century.

More here.

Monday, November 28, 2011

Sunday, November 27, 2011

Inverting the Turing Test

Stuart Shieber in American Scientist:

In his book The Most Human Human, Brian Christian extrapolates from his experiences at the 2009 Loebner Prize competition, a competition among chatbots (computer programs that engage in conversation with people) to see which is “most human.” In doing so, he demonstrates once again that the human being may be the only animal that overinterprets.

You may not have heard of the Loebner competition, and for good reason. The annual event was inspired by the Turing test, proposed by Alan Turing in his seminal 1950 paper “Computing Machinery and Intelligence” as a method for determining in principle whether a computer possesses thought. Turing meant his test as a thought experiment to address a particular philosophical question, namely, how to define a sufficient condition for properly attributing intelligence, the capacity of thinking, to a computer. He proposed that a blind controlled test of verbal indistinguishability could serve that purpose. If a computer program were indistinguishable from people in a kind of open-ended typewritten back-and-forth, the program would have passed the test and, in Turing’s view, would merit attribution of thinking.

The Loebner competition picks up on this idea; it charges a set of judges to engage in conversation with the chatbot entrants and several human confederates, and to determine which are the humans and which the computers. At the end, a prize is awarded to the “most human” chatbot—that is, the chatbot that is most highly ranked as human in paired tests against the human confederates. “Each year, the artificial intelligence (AI) community convenes for the field’s most anticipated and controversial annual event,” Christian says. Well, not so much. The AI community pretty much ignores this sideshow. It’s the chatbot community that has taken up the Loebner competition.

More here.

Offense, irony, comedy, and who knows what else

Jed Perl in The New Republic:

They are selling postcards of Hitler in the gift shop at the Guggenheim Museum. To be precise, they are selling photographic reproductions of a work entitled Him, a polyester portrayal of the Führer that is one of the works by Maurizio Cattelan in his retrospective at the museum. I can imagine being outraged or at least troubled by the postcards in the gift shop, except that by the time I saw them I had already been bombarded by this exhibition in which nearly all of Cattelan’s oversized neo-Dadaist baubles have been hung from the ceiling of Frank Lloyd Wright’s rotunda. Cattelan’s Hitler doll—like his Picasso doll, his bicycle, his dinosaur, and the rest of the 128 items in this stupefyingly sophomoric show—is engineered for offense, irony, comedy, or who knows what else. Those who are bothered by the Hitler postcards in the gift shop are naturally going to be dismissed as insufficiently hip. The same goes for those who are disturbed by the sight of one of the world’s greatest public spaces once again turned over to an art world charlatan as his personal playpen. My own feeling is that the postcards, however misbegotten, are speech we accept, although not necessarily embrace, in a society we prize for its openness. What is really disquieting is the event that has occasioned these postcards. “Maurizio Cattelan: All”—that’s the title of the show—amounts to hate speech directed at the sponsoring institution.

More here.

Sunday Poem

Mapping the Interior

Imagine that you had a dishcloth
Bigger than the one mothers put on the bread
To slow its cooling, that you could spread
Over the whole kitchen floor to bring up its face
As clearly as the features on the cake.

You’d have a print you could lift up
To the light and examine for individual traces
Of people who came to swap yarns, and sit on
Sugan chairs that bit into the bare floor, leaving
Unique signatures on concrete that creased
Over time into a map you could look at and

Imagine what those amateur cartographers
Were thinking when their eyes fell, in the silence
Between the stories, that was broken only by
The sound of the fire and whatever it was that
Was calling in the night outside.

by Eugene O'Connell
from One Clear Call
Bradshaw Books, Cork, © 2003

Storms

Mitch Dobrowner in Lensculture:

Landscape photographers count ourselves lucky to be in the right place at the right time if a storm system is moving through — but I wanted to actively pursue these events. Since storms are a process (not a thing) I needed a guide. I soon connected with Roger Hill (regarded as the most experienced storm-chaser in the world); he introduced me to Tornado Alley and the Great Plains of the United States. In July 2009 Roger and I tracked a severe weather system for nine hours — from its formation outside of Sturgis, South Dakota, through Badlands National Park and into Valentine, Nebraska. Eventually we stopped in a field outside of Valentine, and there we stood in awe of the towering supercell (a thunderstorm with a deep rotating updraft) which was building with intake wind gusts of 60mph. It was like standing next to a 65,000-foot-high vacuum cleaner. It was unlike anything I had seen before in my life; the formation of the supercell had an ominous presence and power that I had never witnessed or experienced before. I remember turning to Roger, who was standing next to me, and saying, 'what the ****… you have to be kidding me'. It was only the second day of my “experiment” in shooting storms, but I knew without a doubt that this experiment would become an important project to me.

Words are inadequate to describe the experience of photographing this immense power and beauty. And the most exciting part is with each trip I really don’t know what to expect. But now I see these storms as living, breathing things.

More here. [Editor's note: You'll discover a lot more detail when you look at these images in our high-resolution slide show.]

The Era of Small and Many

Bill McKibben in Orion Magazine:

Earlier this year, my state’s governor asked if I’d give an after-lunch speech to some of his cabinet and other top officials who were in the middle of a retreat. It’s a useful discipline for writers and theorists to have to summarize books in half an hour, and to compete with excellent local ice cream. No use telling these guys how the world should be at some distant future moment when they’ll no longer be in office—instead, can you isolate themes broad enough to be of use to people working on subjects from food to energy to health care to banking to culture, and yet specific enough to help them choose among the options that politics daily throws up? Can you figure out a principle that might undergird a hundred different policies? Or another way to say it: can you figure out which way history wants to head (since no politician can really fight the current) and suggest how we might surf that wave?

Here’s my answer: we’re moving, if we’re lucky, from the world of few and big to the world of small and many. We’ll either head there purposefully or we’ll be dragged kicking, but we’ve reached one of those moments when tides reverse. Take agriculture. For 150 years the number of farms in America has inexorably declined. In my state—the most rural in the nation—the number of dairies fell from 11,000 at the end of World War II to 998 this summer. And of course the farms that remained grew ever larger—factory farms, we called them, growing commodity food. Here in Vermont most of the remaining dairies are big, but not big enough to compete with the behemoths in California or Arizona; they operate so close to the margin that they can’t afford to hire local workers and instead import illegal migrants from Mexico. But last year the USDA reported that the number of farms in America had actually increased for the first time in a century and a half. The most defining American demographic trend—the shift that had taken us from a nation of 50 percent farmers to less than 1 percent—had bottomed out and reversed. Farms are on the increase—small farms, mostly growing food for their neighbors.

More here.

Saturday, November 26, 2011

Lynn Margulis 1938-2011

John Brockman in Edge:

Biologist Lynn Margulis died on November 22nd. She stood out from her colleagues in that she would have extended evolutionary studies nearly four billion years back in time. Her major work was in cell evolution, in which the great event was the appearance of the eukaryotic, or nucleated, cell — the cell upon which all larger life-forms are based. Nearly forty-five years ago, she argued for its symbiotic origin: that it arose by associations of different kinds of bacteria. Her ideas were generally either ignored or ridiculed when she first proposed them; symbiosis in cell evolution is now considered one of the great scientific breakthroughs.

Margulis was also a champion of the Gaia hypothesis, an idea developed in the 1970s by the freelance British atmospheric chemist James E. Lovelock. The Gaia hypothesis states that the atmosphere and surface sediments of the planet Earth form a self-regulating physiological system — Earth's surface is alive. The strong version of the hypothesis, which has been widely criticized by the biological establishment, holds that the earth itself is a self-regulating organism; Margulis subscribed to a weaker version, seeing the planet as an integrated self-regulating ecosystem. She was criticized for succumbing to what George Williams called the “God-is-good” syndrome, as evidenced by her adoption of metaphors of symbiosis in nature. She was, in turn, an outspoken critic of mainstream evolutionary biologists for what she saw as a failure to adequately consider the importance of chemistry and microbiology in evolution. I first met her in the late 1980s and in 1994 interviewed her for my book The Third Culture: Beyond the Scientific Revolution (1995). Below, in remembrance, please see her chapter, “Gaia is a Tough Bitch”. One of the compelling features of The Third Culture was that I invited each of the participants to comment about the others. In this regard, the end of the following chapter has comments on Margulis and her work by Daniel C. Dennett, the late George C. Williams, W. Daniel Hillis, Lee Smolin, Marvin Minsky, Richard Dawkins, and the late Francisco Varela. Interesting stuff.

More here.

Jeffrey Eugenides talks about ‘The Marriage Plot’ and pokes fun at literary theorists

From The Christian Science Monitor:

Jeffrey Eugenides published his first novel at 33 after he was fired from his position as executive secretary at the Academy of American Poets. The reason he lost his job? He was spending too much time at work honing the manuscript of his debut novel “The Virgin Suicides” (1993). “Middlesex,” his second novel, earned Eugenides a Pulitzer Prize for fiction in 2003. “The Marriage Plot,” just released in October, is Eugenides' third novel, which took him nine years to complete. The story centers around three college students – Madeleine, Leonard, and Mitchell – all of whom graduate from Brown University in 1982. The book is a postmodernist take on the original marriage plot within the Victorian novel. A lot of the time, it is also a novel about other novels, in which the characters spend their time discussing Derrida, Tolstoy, Austen, and Hemingway. The second half of the book moves away from literary theory, and in some colorful scenes set in Paris, Calcutta, and New York, Eugenides explores the difficulties of dealing with mental illness, failed romance, and one man’s battle with his faith in religion. And of course Eugenides also returns to his central source of inspiration: the coming-of-age story. Eugenides recently spoke to the Monitor about the extent of free will, why semiotics is needlessly convoluted, and how reading James Joyce nearly made him choose a career in religion over a career in writing.

Your new novel moves more towards realism than your previous work. Why the change in style?

I’ve always considered myself a realist at heart. I’ve never written a book that violated physical principles. My books often have an atmosphere of the fantastic or the surreal, but actually nothing happens in them that couldn’t happen in reality, so I don’t know if this book is that much of a departure in terms of realism.

More here.

Saturday Poem

Arriving Shortly

When amma came
to New York City,
she wore unfashionably cut
salwar kurtas,
mostly in beige,
so as to blend in,
her body
a puzzle that was missing a piece –
the many sarees
she had left behind:
that peacock blue
Kanjeevaram,
that nondescript nylon in which she had raised
and survived me,
the stiff chikan saree
that had once held her up at work.

When amma came to
New York City,
an Indian friend
who swore by black
and leather,
remarked in a stage whisper,

“This is New York, you know –
not Madras.
Does she realise?”

Ten years later,
transiting through L.A. airport
I find amma
all over again
in the uncles and aunties
who shuffle past the Air India counter
in their uneasily worn, unisex Bata sneakers,
suddenly brown in a white space,
louder than ever in their linguistic unease
as they look for quarters and payphones.
I catch the edge of amma’s saree
sticking out
like a malnourished fox’s tail
from underneath
some other woman’s sweater
meant really for Madras’ gentle Decembers.

by K. Srilata
from Arriving Shortly
Publisher: Writers Workshop, Kolkata, © 2011

Look, I Made a Hat


It might be that the stage musical is now pretty well over as a form. Certainly, the gloomy parade of ‘juke-box’ musicals through the West End doesn’t give one much hope for the future. It is difficult to pick out a worst offender, but the Ben Elton We Will Rock You, confected from the Queen catalogue, is as bad as any. Its premise, of taking the work of a curious-looking, homosexual, Parsi, excessive genius like Freddie Mercury and turning it into an idiotic story about two clean-cut stage-school kids Putting the Show on Right Now says something truly terrible about the musical: it says that it can only deal with conventional views of conventional subjects. The demonstration of just how untrue that really is comes with the collected works of Stephen Sondheim, who is surely the greatest figure in the entire history of the stage musical. In his long career, he has not hesitated to address difficult subjects. It’s certainly true that other classics in the genre have dealt with some serious issues — race relations in Show Boat, the Anschluss in The Sound of Music, even trade union movements in The Pajama Game and urban prostitution in Sweet Charity. When Sondheim takes on themes of colonial exploitation (Pacific Overtures), political assassinations (Assassins) or Freudian psychological depths (pretty well the whole oeuvre), he is not stepping outside the previously established limits of the form.

more from Philip Hensher at The Spectator here.

thinking through ows


Protests do not write policy. And something as loosely formed as the OWS action shouldn’t be drafting white papers. What protests can do most effectively is to alter the common sense understanding of what is right and wrong. In this case, the OWS action makes other sufferers of debt and disenfranchisement feel that their problems are political—not a symptom of personal shortcomings, and not just the unfortunate side effect of a passing miscalculation by the Peter Orszags of the world. The real “goal” of OWS is to rally together everyone who is willing to say to Washington, “American democracy cannot bear this inequality.” This movement may prove to be adept at waging ideological war against the disastrous free-marketeers, occupying the airwaves as well as the streets—but it will indeed fall to others to write legislation and to organize economic priorities in debt-wracked communities. The OWS protests should operate in concert with such efforts (OWSers have assisted foreclosure resistance in Queens, for instance), and should put up new forms of protests that keep the public’s eyes on the culprits. Bank occupations have already begun. Major campaigns are now successfully exhorting citizens to move their savings and checking accounts from big banks to local credit unions. The black box of high finance has finally been pried open and exposed for the unregulated machine of destruction that it is, and the alternatives being proposed in the tumult of Occupy Wall Street sound pretty smart to me.

more from Sarah Leonard at Bookforum here.

agatha


Agatha Christie was not cozy. She earned the title the Queen of Crime the old-fashioned way — by killing off a lot of people. Although never graphic or gratuitous, she was breathtakingly ruthless. Children, old folks, newlyweds, starlets, ballerinas — no one is safe in a Christie tale. In “Hallowe’en Party,” she drowns a young girl in a tub set up for bobbing apples and, many chapters later, sends Poirot in at the very last minute to prevent a grisly infanticide. In “The ABC Murders,” she sets up one of the first detective-taunting serial killers. The signature country home aside, Christie’s literary world was far from homogeneous. Her plots, like her life, were international, threading through urban and pastoral, gentry and working class, dipping occasionally into the truly psychotic or even supernatural. Christie murders were committed for all the Big Reasons — love, money, ambition, fear, revenge — and they were committed by men, women, children and, in one case, the narrator. Some of her books are truly great — “Death on the Nile,” “And Then There Were None,” “The Secret Adversary,” “Murder on the Orient Express,” “Curtain” to name a few — and some are not. But even the worst of them (“The Blue Train,” “The Big Four”) bear the hallmarks of a master craftsman. Perhaps not on her best day, but the failures make us appreciate the successes, and the woman behind them, that much more.

more from Mary McNamara at the LA Times here.

The Logic of Deceit and Self-Deception

Drew DeSilver in The Seattle Times:

Back in 1982, Air Florida Flight 90 was attempting to take off from Washington, D.C., in a blinding snowstorm. Though the co-pilot was concerned the plane's wings hadn't been thoroughly de-iced and his instrument panel wasn't displaying the correct airspeed, the pilot dismissed his concerns until seconds before the plane crashed into the Potomac River, killing all but five aboard.

The crash, as cockpit voice recordings later showed, was primarily the result of the pilot's overconfidence leading him to ignore or minimize a whole series of warning signs that his more observant, but less assertive, colleague had pointed out to him. It's one of the most dramatic illustrations of the costs of self-deception in Robert Trivers' new book, “The Folly of Fools: The Logic of Deceit and Self-Deception in Human Life” (Basic Books, 397 pp., $28).

Trivers, an evolutionary biologist who teaches at Rutgers, starts by asking one of those questions that seems obvious once someone else asks it: Why should our brains — whose job, after all, is to make sense of everything we see, hear, touch, taste and smell — be so prone to self-deception? Natural selection would seem to work against creatures who persistently fail to see the world as it is, yet self-deception seems to be deeply embedded in our psyches.

Trivers' answer, which he first advanced in 1976 and has been elaborating since, is that we deceive ourselves the better to deceive others. If we can convince ourselves that we are stronger, smarter, more skillful, more ethical or better drivers than others, we're a long way toward convincing other people too.

More here.