the universal becomes personal


Essays sneak up on us. They are — or often feel — accidental: the record of a writer wrestling with an idea, an observation, a slice of experience, of a writer figuring it out. They have a conditional quality, as if they could go in any direction, offering impressions more than conclusive points of view. As Tom Bissell notes at the beginning of “Magic Hours: Essays on Creators and Creation”: “When I am asked … for advice on how to get started as a nonfiction writer, I tell them to start small and look around.” There it is, the idea exactly: Keep your eyes open and follow where the language leads. In “Magic Hours” (Believer Books: 304 pp., $14), Bissell introduces a video game actress and a collective of disgruntled writers called the Underground Literary Alliance. He considers the career of Werner Herzog and a Hollywood cult filmmaker named Tommy Wiseau. On some level, of course, he is always writing about himself. “The overwhelming majority of a writer’s time,” he points out, “is spent wondering why this world is not as vivid as he or she once — agonizingly, deludedly — believed. To write is to fail, more or less, constantly.”

more from David L. Ulin at the LA Times here.

happy Bloomsday


2012 is a special year for these Joyceans. The 71st since Joyce’s death, it is the first — across the EU at any rate — in which his work may be shared freely among them, without needing permission from his estate for public readings, performances, or re-interpretations. This is no small matter: since inheriting the estate in 1982, Joyce’s grandson Stephen Joyce has gained a reputation as the most controlling literary executor in history. Copyright’s recent history is exclusively one of concessions granted to copyright holders — who are increasingly multinational corporations such as TimeWarner and EMI and wealthy estates — whether extensions in the scope of copyright’s protection or the length of its term, or streamlined mechanisms for its enforcement. The public are the losers in these deals, yet they have rarely been moved to join academics and archivists in their persistent protest that we are moving ever further from the view of copyright established by the framers of the Constitution: a necessary, temporary evil to promote the health of a knowledge base that is ultimately shared. Indeed, our journey towards a corporate vision of perpetual knowledge assets exploited for profit seems unstoppable.

more from Becky Hogge at Atlantic Wire here.

Five Years Without Richard Rorty

Santiago Zabala in Al Jazeera:

Despite Rorty's international success, his criticism was regarded as a betrayal by most of his colleagues, and in the eighties he left the philosophy department and began teaching in English departments.

But Rorty's main subversive act was not publicly opposing Bush or distancing himself from the dominant philosophy position of his time but rather suggesting that philosophers ought to stop “worrying about truth”, “contributing to knowledge”, or “getting things right”. While these suggestions might seem the first step of a relativist, sceptic, or even nihilist philosopher, Rorty was none of these. He was a pragmatist interested in fusing together different philosophers such as William James, Friedrich Nietzsche, and Thomas Kuhn in order to transform the discipline into a looser activity where progress would be measured in relation not to non-human realities (such as truth, God, or foundational human nature) but rather to historical contingencies that formed our present. These, as he explained, could be the family we grew up with, the society around us, or the language we feel most comfortable in.

But why did Rorty propose this transformation? Principally because of our almost reverent use of the term “rationality”, that is, how we “rationally” claim superiority for certain philosophies, politics, or religions. The problem with this claim is that it presupposes a demonstration from premises that are apparently acceptable to all human beings regardless of their cultural, national, or historical location. As we well know, these contingencies differ every time, and it is impossible to unify them. This is why Rorty argues that he does

“not see that we do anything called 'appealing to truth'. We appeal to the statements of the tortured, the records in the archives, the monuments of the past, the slides under the microscope, the images in the lens of the telescope, and so on, but not to 'truth'. Insistence on the existence or the importance of truth seems to me empty, at least by comparison to insistence on the need of freedom.”

Slavoj Žižek: ‘Humanity is OK, but 99% of people are boring idiots’

From The Guardian:

If you have read all of Žižek's work, you are doing better than me. Born in 1949, the Slovenian philosopher and cultural critic grew up under Tito in the former Yugoslavia, where suspicions of dissidence consigned him to academic backwaters. He came to western attention in 1989 with his first book written in English, The Sublime Object of Ideology, a re-reading of Žižek's great hero Hegel through the perspective of another hero, the psychoanalyst Jacques Lacan. Since then there have been titles such as Living in the End Times, along with films – The Pervert's Guide To Cinema – and more articles than I can count. By the standards of cultural theory, Žižek sits at the more accessible end of the spectrum – but to give you an idea of where that still leaves him, here's a typical quote from a book called Žižek: A Guide for the Perplexed, intended to render him more comprehensible: “Žižek finds the place for Lacan in Hegel by seeing the Real as the correlate of the self-division and self-doubling within phenomena.” At the risk of upsetting Žižek's fanatical global following, I would say that a lot of his work is impenetrable. But he writes with exhilarating ambition and his central thesis offers a perspective even his critics would have to concede is thought-provoking. In essence, he argues that nothing is ever what it appears, and contradiction is encoded in almost everything. Most of what we think of as radical or subversive – or even simply ethical – doesn't actually change anything. “Like when you buy an organic apple, you're doing it for ideological reasons, it makes you feel good: 'I'm doing something for Mother Earth,' and so on. But in what sense are we engaged? It's a false engagement. Paradoxically, we do these things to avoid really doing things. It makes you feel good. You recycle, you send £5 a month to some Somali orphan, and you did your duty.” But really, we've been tricked into operating safety valves that allow the status quo to survive unchallenged? “Yes, exactly.” The obsession of western liberals with identity politics only distracts from class struggle, and while Žižek doesn't defend any version of communism ever seen in practice, he remains what he calls a “complicated Marxist” with revolutionary ideals.

To his critics, as one memorably put it, he is the Borat of philosophy, churning out ever more outrageous statements for scandalous effect. “The problem with Hitler was that he was not violent enough,” for example, or “I am not human. I am a monster.” Some dismiss him as a silly controversialist; others fear him as an agitator for neo-Marxist totalitarianism. But since the financial crisis he has been elevated to the status of a global-recession celebrity, drawing crowds of adoring followers who revere him as an intellectual genius. His popularity is just the sort of paradox Žižek delights in, because if it were down to him, he says, he would rather not talk to anyone.

More here.

Did Neandertals Paint Early Cave Art?

From Science:

The basic questions about early European cave art—who made it and whether they developed artistic talent swiftly or slowly—were thought by many researchers to have been settled long ago: Modern humans made the paintings, crafting brilliant artworks almost as soon as they entered Europe from Africa. Now dating experts working in Spain, using a technique relatively new to archaeology, have pushed dates for the earliest cave art back some 4000 years to at least 41,000 years ago, raising the possibility that the artists were Neandertals rather than modern humans. And a few researchers say that the study argues for the slow development of artistic skill over tens of thousands of years.

Figuring out the age of cave art is fraught with difficulties. Radiocarbon dating has long been the method of choice, but it is restricted to organic materials such as bone and charcoal. When such materials are lying on a cave floor near art on the cave wall, archaeologists have to make many assumptions before concluding that they are contemporary. Questions have even arisen in cases like the superb renditions of horses, rhinos, and other animals in France's Grotte Chauvet, the cave where researchers have directly radiocarbon dated artworks executed in charcoal to 37,000 years ago. Other archaeologists have argued that artists could have entered Chauvet much later and picked up charcoal that had been lying around for thousands of years.

More here.

Friday Poem

The Alibi

Whenever I come to visit you
only the time that’s intervened
from one visit to the next has changed.
As for the rest, as always
from my eyes runs a river
your engraved name blurred
– godfather to the little hyphen
between the two dates
so people won’t think the length
of your life died unbaptised.
Next I clean the flowers’
withered droppings adding
some red earth where black had been laid
and finally I change the glass in the oil-lamp
for another a clean one I bring.

As soon as I get home
I diligently wash the dirty one
disinfecting it with chlorine
and the caustic foam of disgust I emit
as I shake vigorously.
Always with gloves and keeping my body
well away from the tiny basin
so the dead water won’t splash me.
With strong aversion’s wire wool I scour
the ingrained grease on the glass’ rim
and on the palate of the doused flame
while rage crushes the illicit stroll
of a snail, trespasser
in the neighbouring stillness.

I rinse it then rinse with scalding fury
a boiling effort to bring the glass to its prime
its happy normal use
for quenching thirst.
And at last it becomes crystal clear
how hypochondriacal my wish is not to die.

dearest – look at it this way:
when wasn’t love afraid of death?
by Kiki Dimoula
translation: David Connolly
from A minute's licence
publisher: Poetry Greece, Corfu, 2000

Gilgamesh: An Epic Obsession

I have had this fascination with the Epic of Gilgamesh for a long time. I love the fact that perhaps the oldest story known to mankind is about friendship. Theodore Ziolkowski in Berfrois:

I have long been interested in the reasons for the fascination with figures and works from antiquity among twentieth-century writers, artists, musicians, and their publics. By this I do not mean the scholarly interest in antiquity that motivates classicists, archaeologists, art historians and others moved by a professional commitment to gain further understanding of past cultures. What intrigues me, rather, are the insights into our own contemporary culture that the popular reception of antiquity provides: a topic that I have pursued in several earlier books. In Virgil and the Moderns (1993), for instance, I found that it was the sense of crisis produced by World War I and intensified by the chaotic social and political upheavals of the 1920s and 1930s that sent many writers and their readers back to the past in search of patterns of order and stability, as well as models of personal ethics — precisely the qualities that they thought to find in Virgil’s life and works.

During those same years other writers, in search of transformative experiences through which to reconstruct their lives, turned to Ovid’s Metamorphoses; still others, during and after the Second World War, looked to the poet’s life for consolation in their own destiny as exiles (Ovid and the Moderns, 2005). During the 1930s, in turn, it was the horrors of fascism and the sense of defenseless oppression that attracted writers and many painters to the Cretan legend of the Minotaur, recently publicized by the excavations of Sir Arthur Evans in Knossos (Minos and the Moderns, 2008), for example, in the Paris cultural journal Minotaure (1933-1939).

It was a similar curiosity that moved me to seek an understanding of the reasons underlying the conspicuous recent obsession with Gilgamesh, which differed from the classical legends in several important senses.

whitman’s brain, frankenstein


In the 1931 movie Frankenstein, the doctor’s hunchback assistant Fritz raids the medical school’s lab to retrieve a brain for the monster. Whoops! He drops the jar that has the good brain and takes the bad brain instead – the brain of a demented murderer. Who would have guessed that there is some factual basis for this set piece? Walt Whitman’s brain may have been in the back of someone’s mind as the scenario was written, though it cannot be proved for certain. You see, Walt Whitman’s postmortem brain was put into some sort of a jam jar, and somebody dropped it, and it shattered. The brain, not the jar … or rather, probably, both. Or neither. Actually, it’s not certain the brain ever made it into a jar, or was dropped while it was in a sort of rubber sack.

more from Cynthia Haven at The Book Haven here.

the way things go


The most common interpretation is that “The Way Things Go” is a celebration of the extraordinariness of the ordinary, a celebration of everydayness, of the way things go. But it’s also about the way we make things go — for better or worse — whether we are creating a work of art or falling down a flight of stairs and landing on a cat or even standing still (“The Artist as Prime Mover” is what Arthur C. Danto called his essay on Fischli/Weiss). The forward motion of life, life as a chain reaction of cause and effect is predictable; it is a cliché. Whatever we do or don’t do, there will be cause and there will be effect. But as David Weiss once said in an interview, “There is something right about clichés.” In other words, there is truth in the predictable. We can predictably learn to make things happen in a way that is predictably successful. Through repetition and work we can finally get the paint can to knock over the ladder. And yet, how that happens, the way it happens, is somewhat more mysterious. “I’ve always found that astonishing…” David Weiss told Frieze, “the way people always laugh when the next thing falls over.”

more from Stefany Anne Golberg at The Smart Set here.

Sylvia Plath’s Sketches


Those of us who weren’t able to visit the exhibition of Sylvia Plath’s drawings on view at London’s Mayor Gallery in November may take some comfort in The Telegraph’s comprehensive slide show of the poet’s pen-and-ink work. The delicacy and precision of her execution will come as no surprise to fans of Plath’s writing; her mastery of the medium may. Do look at the whole gallery, but below, find just a few demonstrating the range of her subjects.

more from Sadie Stein at Paris Review here.

Write as short as you can/ In order/ Of what matters


It’s possible to have a clear attitude toward Twitter if you’re not on it. Few things could appear much worse, to the lurker, glimpser, or guesser, than this scrolling suicide note of Western civilization. Never more than 140 characters at a time? Looks like the human attention span crumbling like a Roman aqueduct. The endless favoriting and retweeting of other people’s tweets? Sounds like a digital circle jerk. Birds were born to make the repetitive, pleasant, meaningless sounds called twittering. Wasn’t the whole thing about us featherless bipeds that we could give connected intelligible sounds a cumulative sense? The signed-up user is apt to have more mixed feelings. At its best, Twitter delights and instructs. Somebody, often somebody you wouldn’t expect, condenses the World-Spirit into a great joke, epigram, or aperçu. What oft was thought but ne’er so well expressed, you think, and favorite the tweet. Or: So funny, and you retweet. Pretty nice, also, when the ricocheting retweets say that the witty one is you!

more from The Editors at n+1 here.

Footnote Fairy Tale

Adam Kotsko reviews Francis Spufford’s Red Plenty in The New Inquiry:

Francis Spufford’s Red Plenty, like Infinite Jest, is a book you need to read with two bookmarks: one for the main text, one for the endnotes. It is also like Infinite Jest insofar as it shows us an alternative future that seems uncannily like our own. The difference is that while David Foster Wallace constructed a purely fictional near-future, Spufford presents us with an alternative timeline that failed to materialize: the triumph of Soviet central planning over Western capitalism. In this respect, Red Plenty also bears striking similarities with Wallace’s unfinished posthumous novel The Pale King, which featured a great variety of characters but was finally about a system, namely, the IRS. Yet the real topic isn’t so much the actual existing Soviet system, but a reformed system based on advanced theories of cybernetics, which was proposed but, again, failed to materialize.

Thus it’s fitting that Spufford describes his novel as a fairy tale. Much of its material is the stuff of fantasy: a visitor from a far-away land (America) bearing magical talismans (appliances), or a city magically popping up in the wilderness of Siberia so that academics can live in luxury. What differentiates Red Plenty is that the author provides much more documentation of the parallels between the fantasy world and the real one than is typical. In this fairy tale with footnotes, readers can flip back to find out that a relatively frank conversation was of course much more frank than was likely in the Soviet Union—Spufford had to exaggerate it because the real change in the frankness level would have been undetectable by outsiders. Similarly, we regularly learn that the time lag between events has been foreshortened for dramatic effect.

The most surprising endnotes, however, are the ones that verify that something really is true. Moscow really did have fast food before the United States, as we learn in the notes to Khrushchev’s fictional musings on the wonders of the American hamburger. What’s more, in the early 1960s Khrushchev really did believe the USSR would overtake the West by 1980, and a good portion of the Soviet population believed along with him. (The pervasive cynicism one now associates with Actual Existing Socialism didn’t really take hold until the Brezhnev years.) And perhaps most importantly, individual apparatchiks really did sit down and write a comprehensive economic plan for the entire Soviet Union—and then laboriously edit it by hand when something inevitably went wrong.

Making a Costume Drama out of a Crisis

Jenny Diski on Downton Abbey, in the LRB:

Alistair Cooke can be seen in an old TV clip, thanks to the bottomless well that is YouTube, carefully cross-legged, wearing a blazer, a discreetly silver-striped black tie on a pearl grey shirt, and what can only be called slacks. He sits on a high-backed black leather and polished mahogany library chair. Behind him to his right, hung on flocked wallpaper, is an ornately framed landscape painting winking ‘old master’, on the other side an overarching potted palm, between them a window hung with heavy, draped velvet curtains, and beneath his elegantly shod feet (the lasts of which must surely have been made and stored by Lobb’s) a fine oriental carpet. The whole set trembles with the weight of Vicwardian Britishness. ‘Good evening,’ he says in his immaculately trimmed mid-Atlantic accent, so reassuring that you wonder if perhaps he is going to sound the nuclear alert. He is very nearly the perfect benevolent-English-gentleman-in-America; nevertheless his calm, almost lazy intonation reminds me of the underlying menace in the same phrase when used by that other official (though, like Cooke, miscategorised) quintessential English gent, Alfred Hitchcock. Good evening.

Beginning in 1974, for the benefit of American television viewers, Alistair Cooke introduced every episode of the five original series of Upstairs, Downstairs. It wasn’t offered as slush or soapy escapism, but as classy and educational, shown in the Masterpiece Theatre strand of the Public Broadcasting Service (as were those other great English classics Morse, Prime Suspect and Midsomer Murders). Upstairs, Downstairs, Cooke explains at the beginning, ‘follows the life of a London family through the reign of Edward VII. He was the big bearded monarch who had a notorious appetite for bed and bawd, but nevertheless was known as Edward the Peacemaker.’ Nearly forty years later, in January 2012, Emily Nussbaum reviewed the first US showing of the second series of Downton Abbey in the New Yorker: ‘To let us know that we’re safely in the Masterpiece zone, Laura Linney, clad in a black cocktail dress, introduces each episode with a tense grin, as if welcoming us to a PBS fundraiser, which I suppose she is.’ In fact, she was explicating such complexities as ‘entail’ and ‘primogeniture’ to an American audience who had already been deemed to have too short an attention span to cope with the show, which had two hours cut from its British running time.

How to Get Into and Out of an Economic Crisis…

Andri Snær Magnason on Iceland's collapse and recovery, in Eurozine:

When the global recession hit, Iceland suddenly made world news. The headlines were breathtaking. Collapse, national bankruptcy and demonstrations on the streets. In Vanity Fair you could read descriptions of burning Range Rover jeeps and people stocking up on groceries. Actually, neither story was true. One man was suspected of having set his luxury car on fire and Iceland continued to export fish. The collapse, however, was a fact. Iceland was one of the worst casualties of the global financial crisis. The Icelandic stock market, which at its peak had reached 9000 points, stood at 14 points after the collapse. This rollercoaster ride was as dramatic as the statistics.

In the five years between 2002 and 2008, Icelandic banks had gone from serving a small local market to operating as large international corporations. They grew tenfold, swelling to twelve times the Icelandic gross national product. Their sights were set even higher: every single bank possessed drawings of new headquarters that were supposed to be ten times larger than those they had in 2007.

The banks employed a young generation full of confidence – they knew their way around the business districts of London, New York, Tokyo and Shanghai. They were highly educated with postgraduate degrees from Harvard, MIT and LSE. They spoke more languages than their parents and knew how to put together complicated financial transactions, forward contracts, derivatives and all those things ordinary people don't understand. This was the most highly educated generation in Icelandic history and the banks had an insatiable need for educated manpower. They became a black hole for talent, sucking in the best people.

Patricia Churchland on What Neuroscience Tells Us About Morality

Over at Rationally Speaking, a discussion between Patricia Churchland and Massimo Pigliucci:

The Rationally Speaking podcast is proud to feature another certified genius: Patricia Churchland, a philosopher well known for her contributions to neurophilosophy and the philosophy of mind, was a professor at the University of California San Diego from 1984 to 2010, and won the MacArthur Genius Grant in 1991. In this episode, she, Massimo, and Julia discuss what philosophy has to say about neuroscience, what neuroscience has to say about philosophy, and what both of them have to say about morality.

The Adolescent Brain

Sarah-Jayne Blakemore over at Edge:

The reason I became interested in the adolescent brain is twofold. Firstly, we know that most adult mental disorder has its onset at some point during the teenage years, so if you look at disorders like anxiety disorders, depression, addictions, eating disorders, almost all of them will have their onset some time during the teenage years.

Schizophrenia, as you might know, is a very horrific psychiatric condition that's characterized by delusions, like being paranoid and thinking that people are out to get you, and hallucinations like imagining that people are talking to you inside your head, hearing voices. That has its onset at the end of adolescence, normally in the early 20s, on average. So that's one reason why I think it's really important to study the adolescent brain. The hypothesis is that something is going wrong in normal brain development to trigger these psychiatric and psychological disorders.

The second reason why adolescence is an interesting period of life to study is that, unlike in most other periods of life, the leading cause of death in adolescence is accidents; the second is suicide. The accidents are caused, generally, by risk taking. So we know that teenagers take more risks than either children or adults. The question is, why? Why is adolescence associated with phenomena like increased risk taking, especially when adolescents are with their peers? Peers become really influential in adolescence. Adolescents are driven towards impressing their peers, trying to seek approval of their peers, and becoming more and more independent from their parents. Social cognition, the social brain, seems to change during the period of adolescence, and that's something that particularly interests me.

And finally, self-awareness; awareness of one's self, and consciousness of one's self. We all know, if you remember what it's like being a teenager, that feeling of heightened self-consciousness that seems to happen in early adolescence where you become easily embarrassed by things like your parents, or social situations where you're not seen as cool, and that kind of thing.

The War of Lies

Uri Avnery on the 30th anniversary of the Israeli invasion of Lebanon, in Gush Shalom:

Almost all wars are based on lies. Lies are considered legitimate instruments of war. Lebanon War I (as it was later called) was a glorious example.

From beginning to end (if it has ended yet) it was a war of deceit and deception, falsehoods and fabrications.

THE LIES started with the official name: “Operation Peace in Galilee”.

If one asks Israelis now, 99.99% of them will say with all sincerity: “We had no choice. They launched katyushas at the Galilee from Lebanon every day. We had to stop them.” TV anchormen and anchorwomen, as well as former cabinet ministers, have been repeating this throughout the week. Quite sincerely. Even people who were already adults at the time.

The simple fact is that for 11 months before the war, not a single shot was fired across the Israeli-Lebanese border. A cease-fire was in force and the Palestinians on the other side of the border kept it scrupulously. To everybody’s surprise, Yasser Arafat succeeded in imposing it on all the radical Palestinian factions, too.

At the end of May, Defense Minister Ariel Sharon met with Secretary of State Alexander Haig in Washington DC. He asked for American agreement to invade Lebanon. Haig said that the US could not allow it, unless there were a clear and internationally recognized provocation.

And lo and behold, the provocation was provided at once. Abu Nidal, the anti-Arafat and anti-PLO master terrorist, sent his own cousin to assassinate the Israeli ambassador in London, who was grievously wounded.

In retaliation, Israel bombed Beirut and the Palestinians fired back, as expected. The Prime Minister, Menachem Begin, allowed Sharon to invade Lebanese territory up to 40 km, “to put the Galilee settlements out of reach of the katyushas.”

When one of the intelligence chiefs told Begin at the cabinet meeting that Abu Nidal’s organization was not a member of the PLO, Begin famously answered: “They are all PLO”.

Bonds for Well-Being: A (Protein) Social Network

From Harvard Magazine:

Just about everything the body does depends on the interactions of proteins—the molecules encoded by genes that serve as the primary workers in cells. Without thousands of coordinating proteins, cells wouldn’t function properly; even subtle problems in these interactions can lead to disease.

Spyros Artavanis-Tsakonas, professor of cell biology at Harvard Medical School (HMS), believes that to better grasp what can go wrong with proteins, scientists need to understand how these molecules function together (not just in isolation) in healthy cells. In the October 28 issue of Cell, his team published a large-scale map that tracks the interactions of thousands of proteins in fruit flies (Drosophila melanogaster). Since then, the researchers have continued to expand the map and delve into these connections in more detail. The map was created through a painstaking process that Artavanis-Tsakonas compares to fishing. The scientists first randomly generated thousands of distinct proteins to serve as “bait,” and introduced these proteins into Drosophila cells. When they removed the baits, they could see which proteins had adhered to them, thanks to the application of a highly precise technique, mass spectrometry, carried out by HMS professor of cell biology Steven Gygi. The result: a vast “social network” of proteins.

More here.

Genetic sequence could solve mystery of why bonobos are more peaceful than other chimpanzees

From Nature:

When the Congo River in central Africa formed, a group of apes was forever stranded on its southern banks. Two million years later, the descendants of these apes — the bonobos — have developed distinct social patterns. Unlike their chimpanzee relatives on the northern shore, they shun violent male dominance and instead forge bonds through food-sharing, play and casual sex. An 18-year-old female named Ulindi has now become the first bonobo (Pan paniscus) to have its genome sequenced. Scientists hope that the information gleaned will explain the stark behavioural differences between bonobos and common chimpanzees (Pan troglodytes) and help to identify the genetic changes that set humans apart from other apes.

Distant relatives

Humans, chimps and bonobos all share a common ancestor that lived about 6 million years ago in Africa, when the human lineage splintered off. By the time that our Homo erectus ancestors were roaming the African savannah 2 million to 1.5 million years ago, populations of the common ancestor of chimpanzees and bonobos had been separated by the Congo River. Little, and probably no, interbreeding has occurred since then, says Kay Prüfer, a bioinformatician at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, who led the sequencing study. Comparisons of the bonobo genome and sequences of chimps from various populations showed that chimps living just across the Congo River were no more closely related to bonobos than were populations living as far away as Côte d’Ivoire. That implies that the separation was quick and permanent, says Prüfer.

Easy-going apes

Once the ancestors of bonobos had been separated from those of chimpanzees, they may have found themselves in a very different ecological world. North of the Congo River, the ranges of chimpanzees and gorillas overlap, so those animals compete for food. But no gorillas live south of the river, so bonobos face much less food competition, says Victoria Wobber, a comparative psychologist at Harvard University in Cambridge, Massachusetts, who has worked with bonobos including Ulindi. In the absence of competition, the ancestors of bonobos may have been free to forage a wider range of foods in large groups, and share the spoils freely. “When food is more consistently available, a lot of the aggression you see in chimps, you don’t need anymore,” says Wobber. Bonobos also treat sex as casually as a handshake, earning them the nickname ‘hippie chimps’. The sex is often non-procreative and can occur between pairs of the same sex. Chimps tend to have sex only when females are in estrus. “Instead of resolving their disputes aggressively, maybe they’re resolving them with sex,” says Wobber. And whereas chimpanzee groups are dominated by hyper-aggressive males, bonobo groups are less hierarchical and are often headed by females. Genetic discrepancies between chimps and bonobos must be involved in these behavioural differences, says Prüfer.

More here.

the saving power of form?


And to think about Picasso’s imagination is to find ourselves right back with Roger Fry and Virginia Woolf at the National Gallery, because there is no modern artist who has struggled more mightily than Picasso to reconcile the rival claims of sentiment and design. The impossible conflict of his years with Françoise Gilot was that try as he might, he could never find a structure compelling enough to crystallize her youthful beauty. Even his finest portraits of her—Richardson speaks of the Femme-fleur—feel programmatic, at least compared to his earlier responses to Fernande Olivier, Marie-Thérèse Walter, Dora Maar, or, later, to Jacqueline Roque. Perhaps the nearly endless transformations he wreaked on several lithographic studies of Françoise—the Gagosian show contains a large number of these states—suggest his unease, his inability to find a pictorial language to express his ardent emotions. Two of the finest works in this exhibition have little or nothing to do with the demands of portraiture. The 1953 painted wood assemblage, Femme portant un enfant, some five-and-a-half feet high, abstracts the mother-and-child relationship through the metaphor of a sculpture that is like something a child might dream up with building blocks. And the great 1950 Vallauris landscape, Paysage d’hiver, turns away from the affective dilemmas of portraiture entirely. Picasso had not been so fully committed to the art of landscape since his studies of the Spanish village of Horta, done in the early Cubist years.

more from Jed Perl at TNR here.