Meet The 2014 Winners Of The MacArthur ‘Genius Grants’

From NPR:

One is becoming as well-known for her autobiographical work as she is for her test for what movies meet a gender-balance baseline. Another directed one of the best-reviewed and most surreal documentaries of the past decade and has a follow-up on this year's film-festival circuit. Another has been leading the fight for gay-marriage rights since 2004 in Massachusetts.

Alongside cartoonist Alison Bechdel, The Act of Killing director Joshua Oppenheimer and attorney Mary Bonauto, other 2014 MacArthur Award winners are exploring the subtleties of race via psychology and poetry, using math to model the human brain or define the limits of prime numbers, or providing physical, home and job security to some of the country's most at-risk populations.

More here.

Man Down

Anand Giridharadas in The New York Times:

There are places in America where life is so cheap and fate so brutal that, if they belonged to another country, America might bomb that country to “liberate” them. This book is a mesmeric account of such a place — a ghetto near Newark — that asks the consummate American question: Is it possible to reinvent yourself, to sculpture your own destiny? “The Short and Tragic Life of Robert Peace” seeks answers in the true story of two men, reared in the same mostly black, mostly luckless neighborhood, whose trajectories spectacularly diverge. One man is Shawn, born to a sweet-talking, drug-pushing father named Skeet, who tries to keep his son from books, fearing they will make him too soft for a hard world. Instead, Skeet teaches Shawn how to fight, intimidate, know everyone on avenues where it’s lethal not to. When Skeet is imprisoned for killing two women, Shawn inherits his friends. He becomes a dealer, too, eventually sleeping in his car, wearing a Kevlar vest. The other man is Rob, son of a feistily aspirational mother, who, while toiling in kitchens, wishes for her child the escape she never had. She borrows books from the local library to read to her small son, and later buys him the first volume of an encyclopedia, getting additional ones, letter by letter, when she can afford them. She navigates their bleak world to find institutions and people who will help him. A Benedictine school rescues Rob. A bank executive offers to pay all his college expenses. Yale accepts him. He majors in molecular biophysics and biochemistry, and works in a cancer and infectious disease laboratory.

What makes this book so devastating is that these two men, Rob and Shawn, are really one: Robert DeShaun Peace, who went from a New Jersey ghetto to Yale to wherever men go after dying face down, knees bent, in a drug-related murder.

More here.

Saturday Poem

We Are Here

we are here
slaving for sovereignty by selling freedom
into the captivity of patriotism.
we are here silent, brainwashed
we are here
poor, frightened, and angry
wondering who is the next torture victim or petrol-
bomb casualty.
we are here, clutching at a fragile economy
a disintegrating social system.

we are here feasting on propaganda
starving,
while poets sing praise litanies
to dictators.
we are here
queuing for basic commodities
chasing sky-rocketing prices
doing business in an unstable environment
we are here where the dollar is extinct
and millionaires are homeless

mother, what happened to the breadbasket of Africa?
sister, what happened to Africa’s paradise?
brother, what happened to the sunshine city
and that of Kings?

we are here honouring the zhing-zhong products flooding the
market while home industry gathers dust in
derelict shelves.
well, we are here,
grounded,
wondering where, when and how
we lost our bearings.
.

by Cosmas Mairosi
from Poetry International Web, 2007

Friday, September 19, 2014

Ig Nobels: British researchers take coveted science humour prize

Ian Sample in The Guardian:

The nation can hold its head up high. Once again, researchers in Britain have been honoured with that most coveted of scientific awards, the Ig Nobel prize.

Not to be confused with the more prestigious – and lucrative – prizes doled out from Stockholm next month, the Ig Nobels are awarded for science that makes people laugh and then makes them think.

The winners this year received their awards at a ceremony at Harvard University, where a stern eight-year-old girl was on hand to enforce the strict 60-second limit on acceptance speeches. The ceremony is organised by the science humour magazine, Annals of Improbable Research.

Speaking at the event was Rob Rhinehart, creator of the all-in-one food, Soylent, and Dr Yoshiro Nakamatsu, a prolific inventor with more than 3,000 patents who won an Ig Nobel in 2005 for photographing every meal he ate in the previous 34 years.

Holding the flag for Britain, though only figuratively because the flight to Boston cost too much, was Amy Jones, who shared the Ig Nobel prize for psychology. Her work with Minna Lyons at Liverpool Hope University revealed that people who habitually stayed up late were, on average, more self-admiring, manipulative and psychopathic.

“To be honest, I hadn't heard of the awards before,” Jones told the Guardian. “It's absolutely overwhelming. No one could be more surprised than me.”

People who display the traits often do very well in life, having desirable jobs and more sexual partners, she said. “Successful psychopaths are going to end up in all the high end jobs, in charge of companies, making millions. The unsuccessful psychopaths are the ones that end up in jail.”

More here.

The Writer and the Valet: The Story of Dr. Zhivago

Frances Stonor Saunders in the LRB:

In his youth Pasternak looked, Marina Tsvetaeva said, ‘like an Arab and his horse’. In older age, he looked the same. Sinewy and tanned from long walks and tending his orchard, at 66 he was still an intensely physical presence. This was the woodsman-poet who was waiting by the garden gate to greet his friend Isaiah Berlin, 19 years younger, bespectacled and pudgy, his indoor skin betraying the rigours of the Senior Common Room and the international diplomatic circuit.

‘The Foreigner Visiting Pasternak at His Dacha’ is its own subgenre of intellectual history. Its principal theme is the excitement of discovering a lost generation who, like ‘the victims of shipwreck on a desert island’, have been ‘cut off for decades from civilisation’ (Berlin). The foreigner, moved by his role as witness to an impossible reality, records every detail of the encounter: the welcome (Pasternak’s handshake is ‘firm’, his smile ‘exuberant’); the walk (oh, that ‘cool’ pine forest, and look, some dusty peasants); the conversation, with Pasternak holding forth ‘as if Goethe and Shakespeare were his contemporaries’; the meal, at which his wife, ‘dark, plump and inconspicuous’ (and often unnamed), makes a sour appearance; the arrival of other members of the Peredelkino colony, the dead undead; the toasts, invoking spiritual companions – Tolstoy, Chekhov, Scriabin, Rachmaninov. And finally the farewell at the gate, at which Pasternak disappears back into the dacha and re-emerges with sheaves of typescript. These are given to the visitor (‘the guest from the future’, as Anna Akhmatova put it), who is now tasked with the sacred and thrillingly immortalising responsibility of carrying Pasternak’s writings out of this place where the clock has stopped and into the world beyond.

Berlin’s reports of his meetings with Pasternak, which cover two periods spanning a decade, conform to the conventions of the genre (not surprising, as he largely invented it) but his published account of his visit of 18 August 1956 is curiously short on colour, and there is no mention of his bride, Aline, who accompanied him, or of Pasternak’s wife, Zinaida. We learn only that the two men convened in a lengthy conversation, which must have vibrated amid the pine trees like some strange antiphon. Pasternak, Berlin once observed, ‘spoke slowly in a low tenor monotone, with a continuous even sound, something between a humming and a drone’; Berlin’s voice was variously described as ‘a low, rapid rumble’, ‘a melting Russian river’, the ‘bubble and rattle’ of a ‘samovar on the boil’. At some point, Pasternak took Berlin into his study, where he thrust a thick envelope into Berlin’s hands and said: ‘My book, it is all there. It is my last word. Please read it.’

More here.

Philosophy of Captivity

Richard Marshall interviews Lori Gruen in 3:AM magazine:

3:AM: A recent book of yours looks at ethics and animals. You begin by looking at the position of human exceptionalism, something that goes back to at least Aristotle. What is the position, and is it a kind of default position for those who just don’t think we should think about animals ethically?

LG: Human exceptionalism is a prejudice that not only sees humans as different from other animals but that also sees humans as better than other animals. Of course humans are unique in a variety of ways, although those differences are often articulated based on naïve views about other animals. In Ethics and Animals, I explore some of the claims that have been made to differentiate humans from other animals (that we are the only beings that use tools or that use language or that have a theory of mind) and show that they do not establish that humans are unique in the ways postulated. But I also discuss the ways that other animals are indeed different from us and different from each other. These differences are important for understanding them and for promoting, or at least not negatively impacting, their well being.

Human exceptionalism also underlies skepticism about including other animals in the sphere of moral concern. It is related to two other views that are discussed more often in the literature about moral considerability – speciesism and anthropocentrism. Speciesism is the view that I only owe moral consideration to members of my own species. Although this view is usually thought to be focused on humans, it seems consistent with the view that only Vulcans matter to members of that species, or only orangutans matter to that species. Anthropocentrism is the view that humans are at the center of everything and that everything is understood through our human interpretive lenses. Of course we humans experience everything as humans, so in some sense humans are necessarily the center of our own perceptions, but that doesn’t mean we are unable to try to understand or care about non-humans. There is a sense in which we are inevitable anthropocentrists, but we needn’t be human exceptionalists.

Human exceptionalism sees humans as the only beings worthy of moral concern. Normative exceptionalist arguments generally fail in one of two ways—they pick out a supposedly unique characteristic or property upon which moral worth is supposed to supervene, but it turns out that either not all humans have that property or that humans aren’t the only ones that have it.

More here.

Friday Poem

Deeper

Often at night, sometimes
out in the snow or going into the music, the hunch says,
“Deeper.”
I don't know what it means.
Just, “Push it. Go further. Go deeper.”
And when they come talking at me I get
antsy at times, but mostly I stay put and it keeps saying,
“Deeper. This is not it. You must go deeper.”
There is danger in this, also
beautiful fingers and I believe it can issue in
gestures of concord; but I
cannot control it, all I know is one thing—
“Deeper. You must go further. You must go deeper.”
.

by Dennis Lee
from Canadian Poetry Online

Was There an ‘Early Modern’ Period in Indian Philosophy?

Justin E.H. Smith at berfrois:

If philosophy questions everything, surely it must also question the periodization of its own history. Professional historians themselves tend to agree that the imposition of periods on the past – premodern, Renaissance, early modern, and so on – is always to some degree arbitrary, even if it is also impossible to imagine how we could describe the past without any periodization at all. The bounding off of temporal regions in this way is made all the more problematic if we wish to consider the past from a global perspective, rather than simply focusing on a single region, since the rationale for periodization in one place might not apply in another. However artificial the notion of the ‘medieval’ period is, we may nonetheless say with certainty that this notion is more usefully applied to Europe than to, say, South America: there is nothing ‘medieval’ about the 10th century in Peru (nor, strictly speaking, is there any meaningful sense in which Peruvians can be said to have experienced the 10th century). There is also nothing medieval about what we often call ‘medieval Islamic philosophy’. Whether or not we may see the period between the 8th and the 12th centuries as a ‘Golden Age’, a term that implies a subsequent decline, it is in any case a mistake to see the period of flourishing of ibn Rushd in Iberia, or of ibn Sina in Central Asia, as a relative void between antiquity and modernity. It was certainly not experienced by the people who lived it as ‘between two ages’, and nor, within the context of Islamic history, is there any interesting sense in which this period was a transitional one.

more here.

Impressionism Into Modernism in America

Natasha Geiling at The Smithsonian Magazine:

To be considered a serious artist in late-19th-century America, you had to have studied in a European academic workshop, testing your brushstrokes among the masters of the continent. But art is nothing if not transformation, and almost as soon as American artists embraced the European traditions, they rebelled against them. Taking a cue from the French Impressionists who made their debut in their own private exhibition in 1874, these Americans grappled for a style that reflected the new realities of the post-war industrial American city.

It is this journey—from the European tradition of impressionism to the avant-garde movement of Modernism—that will be on display at the Smithsonian Affiliate Peoria Riverfront Museum from September 26 through January 11, 2015. Featuring works spanning from the 1880s to 1950s, the exhibition “Impressionism Into Modernism: A Paradigm Shift in American Art” covers the Industrial Revolution, two world wars and a depression—all of which shaped the way American artists worked. “I felt that it would be interesting and appropriate to use American impressionism as a jumping off point as the story of the process of American artists embracing change,” says Kristan McKinsey, the show's curator. “It's a time where American artists are moving away from academic art traditions and looking to create an art that was original and not derivative of European art.”

more here.

oliver sacks loves libraries

Oliver Sacks at Threepenny Review:

When I was a child, my favorite room at home was the library, a large oak-paneled room with all four walls covered by bookcases—and a solid table for writing and studying in the middle. It was here that my father had his special library, as a Hebrew scholar; here too were all of Ibsen’s plays—my parents had originally met in a medical students’ Ibsen society; here, on a single shelf, were the young poets of my father’s generation, many killed in the Great War; and here, on the lower shelves so I could easily reach them, were the adventure and history books belonging to my three older brothers. It was here that I found The Jungle Book; I identified deeply with Mowgli, and used his adventures as a taking-off point for my own fantasies.

My mother had her favorite books in a separate bookcase in the lounge—Dickens, Trollope, and Thackeray, Bernard Shaw’s plays in pale green bindings, as well as an entire set of Kipling bound in soft morocco. There was a beautiful three-volume set of Shakespeare’s works, a gilt-edged Milton, and other books, mostly poetry, that my mother had got as school prizes.

Medical books were kept in a special locked cabinet in my parents’ surgery (but the key was in the door, so it was easy to unlock).

more here.

OUR DELHI BREAD

Saskya Jain in MoreIntelligentLife:

I don’t remember thinking of running away when I asked Ram Singh, our house help, to get my small grey suitcase from the storeroom. We were living in a government flat surrounded by a big garden in the centre of New Delhi. I was five or six years old, and it was the first of many long summer holidays. My classmates had all fled from the heat—abroad, mostly. The school fees sapped my parents’ income and, with both of them working full-time, the only prospect of travel was accompanying my father to meetings in nearby Jaipur. So began what turned into a ritual of sorts. Every day I would arrange a varying selection of belongings in the empty stomach of my suitcase—only to unpack them all a little while later.

To fill our own empty stomachs, my family relied on Ram Singh’s limited repertoire of roti, sabzi, dal and chawal—unleavened flatbread, fried or curried vegetables, lentils and rice. My brother and I often ate by ourselves, and we knew that, of the four, only the roti lent itself to mid-meal entertainment. It could be torn in half if a ship’s hold needed to be loaded up with potato bricks, okra beams or chickpea crates. It could be attached to each ear, to make a pair of giant earrings such as we had seen dangling from certain aunties’ rubbery lobes. With just one bite, a solo roti could become Krishna’s lethal chakra, which he’d spin around his finger on Sunday-morning episodes of “The Mahabharata” before using it to slice off his enemies’ heads. But despite our best efforts, lunch rarely brought us more than 15 minutes closer to the end of the holidays. I was often in our garden, watching aeroplane trails wrinkle the clear blue sky. The promise of discovery wrapped in the idea of travel appealed to me. I started telling my parents that I had lived in America in my previous life, before I was born into our family. They encouraged me to tell them stories of my prenatal adventures; it took me some time to figure out that their queries were motivated by something other than a keen interest in geography.

More here.

Artificial Sweeteners May Disrupt Body’s Blood Sugar Controls

Kenneth Chang in The New York Times:

Artificial sweeteners may disrupt the body’s ability to regulate blood sugar, causing metabolic changes that can be a precursor to diabetes, researchers are reporting. That is “the very same condition that we often aim to prevent” by consuming sweeteners instead of sugar, said Dr. Eran Elinav, an immunologist at the Weizmann Institute of Science in Israel, at a news conference to discuss the findings. The scientists performed a multitude of experiments, mostly on mice, to back up their assertion that the sweeteners alter the microbiome, the population of bacteria that is in the digestive system. The different mix of microbes, the researchers contend, changes the metabolism of glucose, causing levels to rise higher after eating and to decline more slowly than they otherwise would.

The findings by Dr. Elinav and his collaborators in Israel, including Eran Segal, a professor of computer science and applied mathematics at Weizmann, are being published Wednesday by the journal Nature. Cathryn R. Nagler, a professor of pathology at the University of Chicago who was not involved with the research but did write an accompanying commentary in Nature, called the results “very compelling.” She noted that many conditions, including obesity and diabetes, had been linked to changes in the microbiome. “What the study suggests,” she said, “is we should step back and reassess our extensive use of artificial sweeteners.”

More here.

Thursday, September 18, 2014

How to Predict the Unpredictable

Steven Poole in The Guardian:

“Prediction is very difficult,” the great Danish physicist Niels Bohr was fond of saying, “especially if it's about the future.” This book doesn't in fact claim to teach you how to predict what is really unpredictable – such as the weather in a month's time, or the next turn of the roulette wheel. It might more honestly, but less seductively, have been titled How to Predict the Sort-of Predictable Behaviour of People Who Are Trying to Act Randomly. This is actually much more interesting than the bland paradox of the given title. When they want to act unpredictably, it turns out, people deviate from true randomness in ways that can be recognised. According to Poundstone's vivid account, this was first rigorously demonstrated by a family of “outguessing machines” created by mathematicians and engineers at Bell Labs in the 1950s.

The outguessing machines played a very simple game. Every round, both machine and human player pick one of two choices: heads or tails, left or right. It is decided beforehand that if the choices match, one player scores a point, whereas if they are different, the other player scores. What happens is that, over dozens of rounds, humans fall into unconscious patterns that a computer can recognise, and therefore anticipate. In this way, with only 16 bits of memory (16 ones or zeroes), a machine by the information theorist Claude Shannon was able to beat all comers. To call this “outsmarting” the humans is perhaps a bit of a stretch, but it is what Poundstone means by the term when he goes on to apply it to different areas. Indeed there are a surprising number of areas where a similar kind of “outguessing” strategy can be fruitful. Rock, Paper, Scissors is a random game, but because most people deviate from true randomness, it is possible to have a strategy. (“A player who loses is more likely to switch to a different throw the next time,” Poundstone explains as an example.) In tennis, too, most players alternate their directions of serve too regularly, so Poundstone recommends using a wristwatch or heart-rate monitor to properly randomise them.

More here.
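
For readers curious how such an "outguessing" machine might work, here is a minimal, purely illustrative Python sketch in the spirit of the Bell Labs devices Poundstone describes, not a reconstruction of Shannon's actual design: it keeps frequency counts of what an imperfectly random opponent did after each two-call context and guesses the most common continuation. The opponent model, the two-call context, and all parameters are assumptions made for the example.

```python
import random
from collections import defaultdict


def play_outguessing_game(rounds=200, seed=0):
    """Tiny matching-pennies 'outguessing' machine (illustrative only).

    The machine predicts the opponent's next call (0 or 1) by counting what
    followed each two-call context in the past, then guessing the most common
    continuation. It scores +1 when it matches the opponent's call, -1 otherwise.
    """
    rng = random.Random(seed)
    counts = defaultdict(lambda: [0, 0])  # context -> [times 0 followed, times 1 followed]
    history = []                          # opponent's past calls
    score = 0

    def imperfectly_random_opponent():
        # Hypothetical stand-in for a human trying to be random: after two
        # identical calls it switches 70% of the time, more often than chance.
        if len(history) >= 2 and history[-1] == history[-2]:
            return 1 - history[-1] if rng.random() < 0.7 else history[-1]
        return rng.randint(0, 1)

    for _ in range(rounds):
        context = tuple(history[-2:])
        if len(context) == 2 and sum(counts[context]) > 0:
            guess = 0 if counts[context][0] >= counts[context][1] else 1
        else:
            guess = rng.randint(0, 1)  # not enough data yet: guess at random

        call = imperfectly_random_opponent()
        score += 1 if guess == call else -1
        if len(context) == 2:
            counts[context][call] += 1
        history.append(call)

    return score


if __name__ == "__main__":
    # A positive score means the machine matched the opponent more often than not.
    print(play_outguessing_game())
```

Because the simulated opponent switches after a pair of identical calls more often than chance would allow, the counts soon pick up that habit, which is exactly the sort of deviation from true randomness the excerpt says these machines exploited.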

References, Please

Tim Parks in the New York Review of Books:

In the age of the Internet, do we really need footnotes to reference quotations we have made in the text? For a book to be taken seriously, does it have to take us right to the yellowing page of some crumbling edition guarded in the depths of an austere library, if the material could equally well be found through a Google search? Has an element of fetishism perhaps crept into what was once a necessary academic practice?

I have just spent three days preparing the text references for a work of literary criticism for Oxford University Press. There were about two hundred quotations spread over 180 pages, the sources being a mix of well-known nineteenth- and twentieth-century novels, very much in the canon, some less celebrated novels, a smattering of critical texts, and a few recent works of psychology. Long-established practice demands that for each book I provide the author’s name, or the editor’s name in the case of a collection of letters or essays, the translator’s name where appropriate, the publisher, the city of publication, the date of publication, and the page number. All kinds of other hassles can creep in, when a book has more than one volume for example, or when quoting from an essay within a collection of essays, perhaps with more than one editor, more than one translator, more than one author. Since the publisher had asked me to apply the ideas I develop in the book to at least one of my own novels there are even three quotations to be referenced from Cara Massimina, a noir I wrote way back in the 1980s.

More here.

Americans are not the world

Joe Henrich and his colleagues are shaking the foundations of psychology and economics—and hoping to change the way social scientists think about human behavior and culture.

Interesting article from last year by Ethan Watters in Pacific Standard:

In the summer of 1995, a young graduate student in anthropology at UCLA named Joe Henrich traveled to Peru to carry out some fieldwork among the Machiguenga, an indigenous people who live north of Machu Picchu in the Amazon basin. The Machiguenga had traditionally been horticulturalists who lived in single-family, thatch-roofed houses in small hamlets composed of clusters of extended families. For sustenance, they relied on local game and produce from small-scale farming. They shared with their kin but rarely traded with outside groups.

While the setting was fairly typical for an anthropologist, Henrich’s research was not. Rather than practice traditional ethnography, he decided to run a behavioral experiment that had been developed by economists. Henrich used a “game”—along the lines of the famous prisoner’s dilemma—to see whether isolated cultures shared with the West the same basic instinct for fairness. In doing so, Henrich expected to confirm one of the foundational assumptions underlying such experiments, and indeed underpinning the entire fields of economics and psychology: that humans all share the same cognitive machinery—the same evolved rational and psychological hardwiring.

The test that Henrich introduced to the Machiguenga was called the ultimatum game.

More here. [Thanks to Yohan John.]
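
The excerpt names the ultimatum game without spelling out its rules. In its standard form, one player proposes how to split a fixed stake and the other either accepts, in which case both keep the proposed shares, or rejects, in which case both get nothing. The Python sketch below illustrates only that payoff structure; the threshold acceptance rule and the stake size are assumptions for the example, not Henrich's actual protocol.

```python
def ultimatum_round(offer, stake=100, min_acceptable=30):
    """One round of the standard ultimatum game (illustrative payoff rule).

    A proposer offers `offer` units out of `stake` to a responder. The
    responder here follows a simple threshold rule, an assumption for
    illustration rather than anything reported from Henrich's fieldwork:
    accept any offer of at least `min_acceptable`, otherwise reject, in
    which case both players get nothing.
    Returns (proposer_payoff, responder_payoff).
    """
    if not 0 <= offer <= stake:
        raise ValueError("offer must be between 0 and the stake")
    if offer >= min_acceptable:
        return stake - offer, offer  # accepted: both keep their shares
    return 0, 0                      # rejected: punishing a lowball costs the responder too


if __name__ == "__main__":
    print(ultimatum_round(50))  # even split: (50, 50)
    print(ultimatum_round(10))  # lowball offer is rejected: (0, 0)
```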

laurie lee: a walker-writer of genius

Ronald Blythe at the Times Literary Supplement:

How he would have hated it – to be remembered as a centenarian. The youth with the violin and the spate of fresh words. He is best remembered as a walker-writer of genius. Like nearly everyone of his generation Laurie Lee had little option but to tread life out. First to the elementary school, its limitations forgiven, then the walks to sex, to love, to battle. These would keep him young in his own eyes until he died. It was not a bad way to deal with the years, to walk them out. To be the lad with the fiddle on life’s highway. But his lasting music was in the way he wrote in a carefully crafted hand which couldn’t tell prose from poetry.

Born in Slad, Gloucestershire in 1914, a village that lay in shade and caught little of the sun, he simply walked out of it one midsummer morning. It was not an original thing to do at the time, vagrancy being common. Patrick Leigh Fermor was eighteen when he set out for Greece. But their circumstances were vastly different. Lee’s first stop was a London building site. Neither would have heard about John Clare’s walk from Epping to Peterborough with bleeding feet. Theirs had an enchantment about it. However, there were dark companions, a whole army of them pushing prams piled with bedding, men in khaki greatcoats, women in headscarves, tramps, loners who trod from workhouse to workhouse. And cyclists free as birds but with few useful destinations. The bike had destroyed the parish boundary and its old demands. Shorts, Aertex shirts and Penguin books in the saddlebag, but not for Laurie Lee. He “took to the road” in 1934. The fields on either side of the road were cultivated but behind them lay an agricultural depression as bad if not worse than the workless backstreets of towns.

more here.

the heartbreaking story of the last passenger pigeon

Stefany Anne Golberg at The Smart Set:

In the last years of her life, Martha began to lose her feathers. Sol Stephan, General Manager of the Cincinnati Zoo, where Martha spent most of her years, began collecting the feathers in a cigar box without much idea of what he would do with them. Martha lived a sedentary life at the zoo. Her cage was 18 feet by 20 feet — she had never known what it was to fly free. When Martha’s last friend George (who was also named for a Washington) died in 1910, Martha became a celebrity. She watched the people passing by, alone in her enclosure, and they watched her. Martha ate her cooked liver and eggs, and her cracked corn, and sat. On the outside of her cage, Stephan placed a sign announcing Martha as the Last of the Passenger Pigeons. Visitors couldn’t believe that Martha really was the last. They would throw sand inside the cage to make her walk around.

Martha died on a September afternoon in 1914, one hundred years ago. Her elderly body was sent to the Cincinnati Ice Company and frozen in a 300-pound block of ice. They put the frozen Martha on a train to the Smithsonian, where she could be mounted and stuffed. Martha was displayed at the Smithsonian between the 1920s and 1950s. For a while, she sat next to an unnamed male passenger pigeon that had been shot in 1873. Later, she was displayed alone.

more here.

Scottish nationalism and British nationalism aren’t the same

Billy Bragg in The Guardian:

For me, the most frustrating aspect of the debate on Scottish independence has been the failure of the English left to recognise that there is more than one type of nationalism. People who can explain in minute detail the many forms of socialism on offer at any demo or conference seem incapable of differentiating when it comes to nationalists.

Confronted by someone recently who claimed to believe that there was no difference between the Scottish National party and the British National party, I can’t help wondering if this is wilful – like the Daily Mail’s insistence that anyone who wants to see a fairer society must be a Stalinist.

In the past months, I have found myself arguing with comrades who don’t understand how someone who wrote new lyrics to The Internationale can possibly be in favour of an independent Scotland. You’re betraying the working class of Britain, they tell me. What about international solidarity?

It baffles me as to why they should feel that voting against the Westminster status quo is an act of class betrayal. People who marched for CND in the 1980s are now telling me I am wrong to support a decision that may force the UK to give up its nuclear weapons.

It seems to be a very English viewpoint.

In Scotland, Wales and Ireland nationalism is the name given to the campaign for self-determination. James Connolly gave his life for the nationalist cause; John MacLean, perhaps the greatest leftwinger that Scotland has produced, was in favour of independence and campaigned for a Scottish parliament.

Both recognised that the British state was highly resistant to reform, and that the interests of working people were best served by breaking with the United Kingdom.

More here.