My candle burns at both ends

Raza Rumi in Himal Southasian:

It is not a coincidence that the earliest novels of the Subcontinent dealt with the intense and memorable characters of ‘nautch girls’. Essentially a colonial construct, a nautch girl referred to the popular entertainer, a belle who would sing, dance and, when required, also provide the services of a sex worker. The accounts of these marginalised women from the ‘dishonourable’ profession are nuanced, concurrently representing the duality of exploitation and empowerment.
Long before feminist discourse explored and located the intricacies of sex workers’ lives and work, male novelists during the 18th and 19th centuries were portraying the strong characters of women in the oldest profession. Stereotypes of the hapless and suffering prostitute rarely find mention in texts from that time. One early novel, written in Urdu, is Mirza Muhammad Hadi Ruswa’s Umrao Jan Ada. While the Lucknow-based poet Ruswa is said to have persuaded Umrao to reveal her life history, many critics have surmised that the narrative was authored by Umrao herself. The tone and candour of the story suggest that Umrao played a significant role in drafting this semi-documentary piece.

Umrao’s woes originated in a typical patriarchal mould. As a young girl, she was kidnapped by a hooligan and sold to a Lucknow kotha (a high-culture space also operating as a brothel) managed by Khanum Jan. This act was the hooligan’s way of seeking revenge against Umrao’s father, who had testified against him. At the kotha, an erudite, elderly maulvi transformed Umrao into a civilised poet-cum-entertainer, educating her in the arts and culture. It is within this space that she seeks knowledge and acquires the confidence to handle a predominantly male world. Thus, the tale of exploitation turns into a narrative of self-discovery.

An archetypal courtesan steeped in Avadhi high culture and manners, Umrao Jan Ada comes across as a voice far ahead of her times. In her frank conversations with Ruswa, Umrao explains how a sex worker’s only friend is money. The realisation that a dancing girl would be a fool to jeopardise her livelihood by giving her love to a man was a clear expression of her empowerment. The plain rejection of wifehood in Umrao Jan’s worldview was directly rooted in the decision not to trade independence for an institutionalised relationship, despite the respectability that such an association might offer. The empowerment of Umrao is in many ways linked to her profession. In an age where women were completely dependent on men for financial and social sustenance, sex work emerged as a safety valve for her existence. And Umrao remains contemptuous of courtesans who leave their position of power and independence, and subject themselves to the whims of respectable men who may or may not reciprocate by according them social respect.

More here.

Video Game Helps Solve Protein Structures

From Science:

People playing a simple video game can match, and even surpass, the efforts of a powerful supercomputer to solve a fiendishly difficult biological problem, according to the results of an unusual face-off. The game isn't Pac-Man or Doom, but one called FoldIt that pushes people to use their intuition to predict the three-dimensional (3D) structure of a protein. When it comes to solving protein structures, scientists usually turn to x-ray crystallography, in which x-rays shining through a protein crystal reveal the location of atoms. But the technology is expensive and slow and doesn't work for all proteins. What scientists would love is a method for accurately predicting the structure of any protein, while knowing nothing more than the sequence of its amino acids. That's no small task, considering that even a moderately sized protein can theoretically fold into more possible shapes than there are particles in the universe.

To get around that problem, computer programs focus on which shapes require the least amount of energy—and thus which ones the protein is most likely to adopt. But these programs must rely on intense computing to make any headway. One of the most powerful, Rosetta@home, was created by David Baker, a molecular biologist at the University of Washington (UW), Seattle. The program distributes its calculations to thousands of home computers around the world, automatically sending the results back to Baker's lab. (It runs on the same “distributed computing” architecture as the SETI@home search for alien life.) The entire network is capable of nearly 100 trillion calculations per second, dwarfing most supercomputers. Two years ago, Baker wondered whether humans might help Rosetta@home do better. Although the program is impressively good at solving the first 95% of the folding of a protein, putting the correct finishing touches on a molecule often stumps it. People complained to Baker by e-mail that it was frustrating to watch the program flail around on their computer screens when the necessary final tweaks were sometimes obvious to the human eye.
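Rosetta's actual force fields and search moves are far more sophisticated than anything shown here, but the core idea the article describes — sampling conformations and keeping those that lower an energy function — can be sketched in a few lines. The chain model and energy terms below are invented purely for illustration:

```python
import math
import random

def energy(conformation):
    """Toy energy for a 2D chain built from cumulative turn angles.
    Real force fields model bonds, electrostatics, solvation, etc.;
    here, close contacts cost energy and mild compactness is rewarded."""
    x = y = theta = 0.0
    positions = [(0.0, 0.0)]
    for turn in conformation:
        theta += turn
        x += math.cos(theta)
        y += math.sin(theta)
        positions.append((x, y))
    e = 0.0
    for i in range(len(positions)):
        for j in range(i + 2, len(positions)):
            d = math.dist(positions[i], positions[j])
            e += 1.0 / (d + 0.1)   # steric 'clashes' cost energy
            if d < 4.0:
                e -= 0.5           # mild reward for compactness
    return e

def fold(n_residues=10, steps=2000, seed=0):
    """Greedy Monte Carlo search: propose a random angle change,
    keep it only if it lowers the energy."""
    rng = random.Random(seed)
    conf = [rng.uniform(-math.pi, math.pi) for _ in range(n_residues - 1)]
    best_e = energy(conf)
    for _ in range(steps):
        trial = conf[:]
        trial[rng.randrange(len(trial))] = rng.uniform(-math.pi, math.pi)
        e = energy(trial)
        if e < best_e:
            conf, best_e = trial, e
    return conf, best_e
```

A greedy search like this gets stuck in local minima — which is roughly the failure mode the article says human FoldIt players are good at rescuing the computer from.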

More here.

you are and aren’t free


In an influential article in the Annual Review of Neuroscience, Joshua Gold of the University of Pennsylvania and Michael Shadlen of the University of Washington sum up experiments aimed at discovering the neural basis of decision-making. In one set of experiments, researchers attached sensors to the parts of monkeys’ brains responsible for visual pattern recognition. The monkeys were then taught to respond to a cue by choosing to look at one of two patterns. Computers reading the sensors were able to register the decision a fraction of a second before the monkeys’ eyes turned to the pattern. As the monkeys were not deliberating, but rather reacting to visual stimuli, researchers were able to plausibly claim that the computer could successfully predict the monkeys’ reaction. In other words, the computer was reading the monkeys’ minds and knew before they did what their decision would be. The implications are immediate. If researchers can in theory predict what human beings will decide before they themselves know it, what is left of the notion of human freedom? How can we say that humans are free in any meaningful way if others can know what their decisions will be before they themselves make them?
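The kind of readout described above can be caricatured with a toy evidence-accumulation ("drift-diffusion") model: noisy evidence drifts toward one of two choices, and the sign of the accumulator often settles well before the decision threshold is crossed — which is roughly why an early readout can anticipate the overt choice. All parameters below are invented for illustration:

```python
import random

def simulate_trial(bias, n_steps=200, threshold=20.0, seed=None):
    """Toy drift-diffusion trial. 'bias' is the drift toward choice A.
    Returns (final_choice, step_of_choice, early_readout), where the
    early readout is taken when the accumulator first passes half the
    threshold -- before the choice is behaviourally committed."""
    rng = random.Random(seed)
    evidence = 0.0
    early_readout = None
    for step in range(n_steps):
        evidence += bias + rng.gauss(0, 1)
        if early_readout is None and abs(evidence) > threshold / 2:
            early_readout = ("A" if evidence > 0 else "B", step)
        if abs(evidence) > threshold:
            return ("A" if evidence > 0 else "B"), step, early_readout
    return None, n_steps, early_readout  # no commitment within the trial
```

By construction the early readout precedes the committed choice; on most trials with a strong bias it also matches it, though noise can flip the accumulator after the early read — the model predicts, it does not guarantee.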

more from William Egginton at The Opinionator here.

THE GOD THAT FAILED


Kafka died in 1924, twenty years before the start of the Cold War, but he understood the absurdities of life under totalitarian rule better than many of the protagonists in the conflict. The accuracy of Kafka’s insight was admitted even by the Marxist literary critic Georg Lukács, an abject Stalinist who denigrated the Czech writer for deviating from the tenets of ‘progressive humanism’. Appointed Minister of Culture in Imre Nagy’s government during the Hungarian Revolution, Lukács was arrested by the Soviets and transported to a castle in Romania, where he and the other prisoners lived at times as visiting dignitaries and at others as criminals. After some days of this treatment, David Caute tells us, Lukács commented, ‘So Kafka was a realist after all!’ Politics and the Novel During the Cold War is the continuation and completion of several decades of writing and thinking about the role of intellectuals in the grand political conflicts of the twentieth century. First published in 1973, Caute’s The Fellow-Travellers: Intellectual Friends of Communism remains the most enduring work on the subject. Formidably learned, it is full of absurdist vignettes. Describing the visit to China of the ninety-year-old Hewlett Johnson (the Red Dean of Canterbury), which Chou En Lai had facilitated by providing a special plane complete with bed and oxygen mask, Caute writes: ‘This hugely tall figure with the puffs of white hair billowing over his ears stood on stage with the minute, masked members of the Peking Opera, clapping with them, smiling, clapping …’.

more from John Gray at Literary Review here.

the war against Thanatos


I have more than once in my time woken up feeling like death. But nothing prepared me for the early morning last June when I came to consciousness feeling as if I were actually shackled to my own corpse. The whole cave of my chest and thorax seemed to have been hollowed out and then refilled with slow-drying cement. I could faintly hear myself breathe but could not manage to inflate my lungs. My heart was beating either much too much or much too little. Any movement, however slight, required forethought and planning. It took strenuous effort for me to cross the room of my New York hotel and summon the emergency services. They arrived with great dispatch and behaved with immense courtesy and professionalism. I had the time to wonder why they needed so many boots and helmets and so much heavy backup equipment, but now that I view the scene in retrospect I see it as a very gentle and firm deportation, taking me from the country of the well across the stark frontier that marks off the land of malady. Within a few hours, having had to do quite a lot of emergency work on my heart and my lungs, the physicians at this sad border post had shown me a few other postcards from the interior and told me that my immediate next stop would have to be with an oncologist. Some kind of shadow was throwing itself across the negatives.

more from Hitch at Vanity Fair here.

Sit and spin


What makes things meaningful? Is it the mere indication that something meaningful is, in fact, present? Is it the attention we then invest in it? Is it our capacity to subsequently rationalize that experiential phenomenon into a communicable verbal analog: to describe it in words? Is it in the act of communication? Such questions have been inherent to the act of artmaking since prehistory but began breaking surface in the 20th century, nowhere more elegantly than in the work of L.A. painter/photographer John Baldessari, whose retrospective exhibit “Pure Beauty” is up at LACMA through September 12. Having debuted at London’s Tate Modern last year, it will move to NYC’s Metropolitan Museum later in the fall. A perfect example of Baldessari’s eloquence on these philosophically pointed matters is his series of Commissioned Paintings from 1969, a group of identically formatted canvases, each with a centered, more-or-less photorealistically rendered image of a finger indicating a feature in the environment — often a smudge or stain on a surface — with a caption below by a professional sign painter, reading “A Painting by Patrick X. Nidorf O.S.A.” or “A Painting by Anita Storck.”

more from Doug Harvey at the LA Weekly here.

the futurity man


“Self-made” simply isn’t a strong enough term for H. G. Wells, as Michael Sherborne’s authoritative new Life makes very clear. His father was an unsuccessful shopkeeper in Bromley, Kent; his mother a lady’s maid who had to return to service as the family got gradually poorer. Lack of money meant that Bertie’s formal education was delayed until a few months before his eighth birthday and ended soon after his thirteenth: the next year he was expected to teach other children, some bigger than him, at a National School in Wookey. Lessons consisted of “whatever occurred” to the teenager, punctuated by hand-to-hand combat, as Wells recalled in his autobiography: “I fought my class, hit them about viciously and had altogether a lot of trouble with them”. Wells was put to several apprenticeships and seemed fated to replicate his father’s life in trade, either as a draper or a chemist. But he was not prepared to leave his change of fortune to luck or accident. There was a “game against life” which he was determined to win, and as his lowly hero Mr Polly comes to realize in the novel, “If the world does not please you, you can change it”.

more from Claire Harman at the TLS here.

Wednesday, August 4, 2010

Mullican, Mullican

Jessica Slaven in Paper Monument:

Matt Mullican bears the consummate art-insider’s pedigree: his father, Lee Mullican, is a well-respected painter; he attended the California Institute of the Arts in the 70s and studied with John Baldessari; he participated in Documenta IX and X and was shown in the 2008 Whitney Biennial; he has excellent gallery representation and an impressive critical bibliography; he even has an artwork commissioned in black granite in the 50th Street subway station in New York City. In the summer of 2005, he was invited to be a Visiting Artist at the Skowhegan School of Painting and Sculpture in Skowhegan, Maine, where he lectured on his life’s work—from his earliest student experiments, through his seminal sculptures and installations of the 80s, up to new drawings and video-taped performances from “Learning from That Person’s Work,” an exhibition he’d just mounted at the Ludwig Museum in Cologne.

I was there. His lecture turned out to be the surprise controversy of the summer at Skowhegan, where our cohort was rumored to have been the most professional and well behaved in recent memory. Before you cock an eyebrow, remember that 2005 was perhaps the historical apex of the American MFA system, which excelled at producing refined and competent young artists ready to meet and greet the contemporary art world. Back then, it paid to be affable and business-like; it was a time to be on your insider-y best behavior.

Charming and gracious in person, Matt Mullican has been performing and lecturing for nearly thirty years, and by now it’s often difficult to distinguish the two activities.

The “Life” of Theodor W. Adorno: One Last Genius

J.M. Woolsey reviews Detlev Claussen's Theodor W. Adorno: One Last Genius, in Politics and Culture:

In his writings dedicated to collective memory, Maurice Halbwachs argued that family memory operates as a “physiognomy” in which our remembrance of family members and relations consist of a condensation or “summation of an entire period—the idea of a type of life” (60). At the level of both content and method, Halbwachs’s observations about the condensed and physiognomic nature of family memory as “the idea of a type of life” speak directly to Detlev Claussen’s Theodor W. Adorno: One Last Genius. Anyone familiar with Adorno will surely be aware of his infamous claim in Minima Moralia that “Wrong life cannot be lived rightly” (39). This statement—usually interpreted as “survivor’s guilt,” or a declaration about the impossibility of escaping total enmeshment in the exploitative exchange relations of capitalism—has become, inter alia, the equivalent of a physiognomic sound bite, metonymically branding Adorno. It is perhaps Claussen’s most valuable biographical insight that this negation of an idea of a type of life should be understood in relation to Adorno’s own memory of family life and his membership in the extended family of his intellectual friendships.

With the publication of One Last Genius, Claussen, a former student of Adorno who is now a professor of social theory and culture at the Leibniz University of Hanover, has made an important and valuable contribution to the recent literature dedicated to exploring the relation between the dialectician’s life experiences and his unique articulation of critical theory. By combining a close, if unbalanced, reading of some of Adorno’s central texts with an astute attention to letters and other private and public testimony written by his intellectual contemporaries, Claussen has produced a work similar in scope (but not depth) to Rolf Wiggershaus’s magisterial account of the Institute of Social Research: The Frankfurt School: Its History, Theories and Political Significance (1995). However, anyone who reads One Last Genius looking for a definitive statement about Adorno or “negative dialectics” will be sorely disappointed. In fact, throughout much of the text, Adorno is a shadowy figure in the background of a story focused on his intimate circle of friends; just another face in the crowd. Furthermore, Claussen’s stylistic approach is often so repetitious and circular that it is bound to frustrate even the most interested and sympathetic reader. This problem can partially be attributed to the fact that each chapter in the book is designed to stand on its own and thus inevitably covers the same material as others, often in the same context and to make the same point.

An Excerpt from ‘Ernest Gellner: An Intellectual Biography’

From John Hall's new book, in the WSJ:

When Ernest Gellner died in December 1995, the flags of the University of Cambridge, where he had taught from 1984 to 1992, were set at half mast. This reflected the status he had achieved in the last years of his life, as a public intellectual able to comment on a very wide range of issues. It did not mean, however, that his views had lost their bite. If Gellner's name had been made during the scandal surrounding his early attack on Oxford linguistic philosophy, his late essays – not least his attack on Isaiah Berlin as a 'Savile Row postmodernist' – were capable of causing just as much outrage. Still, many felt affection for Gellner, with whose voice they had become familiar, and to whom they often turned for guidance and insight. All the same, very few people knew what to make of him. He was hard to pin down. For two decades he had the curious title of Professor of Sociology with special reference to philosophy at the London School of Economics and Political Science (LSE) – held, it should be noted, in two different departments: first Sociology, then Philosophy, Logic and Scientific Method – before taking up the William Wyse Professorship of Social Anthropology at the University of Cambridge. He had separate reputations as scholar of Islam, theorist of nationalism, philosopher of history, and historian of ideas. He ended his career in Prague, the city in which he had grown up as a boy, though in his final years he was most interested in developments in Russia. His status as public intellectual rested on this background, that of a multilingual polymath, a modern philosophe. He was sometimes cited as one of the last great thinkers from Central Europe whose Jewish background meant a direct experience of the twentieth century's horrors.

It is possible to hint at what follows by noting the very particular way in which Gellner fits into this last category. The contours of his formative experiences are clear, and were pungently expressed by Gellner himself when discussing the work of Hannah Arendt. The rise of nationalist sentiment at the end of the nineteenth century created a dilemma for Jews, especially those who had experienced the Enlightenment and an end to anti-Jewish discrimination by the state. Gellner insisted that the return to cultural roots was always an illusion, a piece of pure romanticism he neatly illustrated by noting sardonically that 'it was the great ladies at the Budapest Opera who really went to town in peasant dresses, or dresses claimed to be such'. Illusion or no, the Jews felt the pull of belonging just as much as others did – perhaps even more. But the romantic call to belong affected the minority Jewish community and the demographic majority in two very different ways.

Emotions help animals to make choices

From PhysOrg:

To understand how animals experience the world and how they should be treated, people need to better understand their emotional lives. A new review of animal emotion suggests that, as in humans, emotions may tell animals about how dangerous or opportunity-laden their world is, and guide the choices that they make.

An animal living in a world where it is regularly threatened by predators will develop a negative emotion or 'mood', such as anxiety, whereas one in an environment with plenty of opportunities to acquire resources for survival will be in a more positive mood state. The researchers argue that these emotional states not only reflect the animal's experiences, they also help it decide how to make choices, especially in ambiguous situations, which could have good or bad outcomes. An animal in a negative state will benefit from adopting a safety-first, 'pessimistic' response to an ambiguous event — for example interpreting a rustle in the grass as signalling a predator – while an animal in a positive state will benefit from a more 'optimistic' response, interpreting it as signalling prey.
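The "pessimistic" versus "optimistic" response described above can be thought of as an expected-value calculation in which mood supplies the prior over what an ambiguous cue means. A minimal sketch, with an invented mood-to-prior mapping and made-up payoffs:

```python
def respond(mood):
    """Toy judgement-bias rule for an ambiguous rustle in the grass.
    mood is in [-1, 1]: negative = pessimistic prior (predator likely),
    positive = optimistic prior (prey likely). Numbers are illustrative."""
    p_predator = 0.5 - 0.4 * mood        # mood shifts the prior on 'predator'
    cost_if_predator = 10.0              # approaching a predator is very costly
    value_if_prey = 2.0                  # catching prey is a modest gain
    expected_value_of_approach = (
        (1 - p_predator) * value_if_prey - p_predator * cost_if_predator
    )
    return "approach" if expected_value_of_approach > 0 else "flee"
```

With a strongly negative mood the rule flees the ambiguous cue (the safety-first response); with a strongly positive mood it approaches. The asymmetry between the large cost of meeting a predator and the modest value of prey is what makes pessimism pay in a threatening environment.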

More here.

Jilted salamanders lash out

From Nature:

Amphibian Casanovas beware: the ladies aren't likely to take infidelity lying down. Male salamanders returning home after a night of disloyalty can expect a beating, a new study reveals. The finding has surprised behavioural researchers: most instances of infidelity punishment place females on the receiving end of the abuse. Female red-backed salamanders are no different, but they don't just take it — they dish it out too, finds behavioural ecologist Ethan Prosen of the University of Louisiana in Lafayette, who led the research.

“This is the only species I know of where the male is intimidating the female and the female returns the favour,” says Prosen. No one is sure how common infidelity is among salamanders, but male red-backed salamanders are known to be aggressive toward female partners that have visited other males. As males and females of this species are evenly matched in size, Prosen wondered why the females were putting up with this violent treatment.

More here.

Wednesday Poem

Snowbanks North of the House

Those great sweeps of snow that stop suddenly six feet from the house …
Thoughts that go so far.
The boy gets out of high school and reads no more books;
the son stops calling home.
The mother puts down her rolling pin and makes no more bread.
And the wife looks at her husband one night at a party, and loves him no more.
The energy leaves the wine, and the minister falls leaving the church.
It will not come closer
the one inside moves back, and the hands touch nothing, and are safe.

The father grieves for his son, and will not leave the room where the coffin stands.
He turns away from his wife, and she sleeps alone.

And the sea lifts and falls all night, the moon goes on through the unattached heavens alone.

The toe of the shoe pivots
in the dust …
And the man in the black coat turns, and goes back down the hill.
No one knows why he came, or why he turned away, and did not climb the hill.

by Robert Bly
from The Man in the Black Coat Turns

derivatives


In “Babylon Revisited,” his 1931 short story about the aftermath of the 1929 Wall Street crash, F. Scott Fitzgerald makes the point that such collapses are slips in morality as much as financial failures. Charlie Wales, the story’s emotionally fragile hero, returns to Paris in a desperate effort to regain custody of his nine-year-old daughter. “I heard that you lost a lot in the crash,” says the Ritz bartender. Implying his moral lapses, Charlie replies that yes, he did, “but I lost everything I wanted in the boom.” In fact, upper-middle-class people like Charlie hesitated during the first months of the market’s run-up—until early in 1928. That was when they joined the gambling frenzy, and that was when, as John Kenneth Galbraith wrote in The Great Crash, 1929, the “mass escape into make-believe, so much a part of the true speculative orgy, started in earnest.” Eight decades later, stock-market investors like Charlie had no role in bringing on or profiting from the 2008 financial crisis. This time they stood on the sideline as major financial institutions engaged in a speculative orgy. Guided by no moral compass, the most sophisticated financial players in the world were betting big with one another about interest rates, commodity prices, and whether companies or governments would default.

more from William J. Quirk at the American Scholar here.

the new great game


In the 19th Century the British Empire and Tsarist Russia competed for hegemony in Central Asia. London fought to slow down Moscow’s expansionism, fearing that, piece by piece, Russia would reach the borders of India, which was Britain’s most prized possession. Russia instead worked to restrict British influence, in a region perceived both as its own backyard and as a buffer zone. The comings and goings of spies and wheeler-dealers, ruthless traders and officers, the ups and downs of plots and intrigue, double-crossing and diplomatic discourtesies reported between the Caspian and Kabul during the 19th Century were catalogued under one single heading, the Great Game. The phrase, it is said, was coined by Arthur Conolly, an English secret agent serving with the East India Company. The novelist Rudyard Kipling took possession of the saying, bringing it to the attention of the public. Times change, as do situations, empires die and imperial democracies are born, but Central Asia, this vast portion of the world bordered on the west by the Caspian, on the east by China, on the north by Russia and on the south by Pakistan, Afghanistan and Iran, continues to be the theatre of significant manoeuvring. It is no coincidence that geopolitical analysts call it the New Great Game.

more from Matteo Tacconi at Reset here.

the listeners


“You do not interest me. No man can say these words to another without committing a cruelty and offending against justice,” writes philosopher Simone Weil. To turn a deaf ear is an offence not only to the ignored person but also to thinking, justice and ethics. Coleridge’s Ancient Mariner is cursed because no one will listen to his story. The Italian chemist-turned-writer Primo Levi was preoccupied with this fable because of his fear that on returning from Auschwitz people like him would be either ignored or simply disbelieved. Regardless, listening gets a very mixed press amongst critics and intellectuals. There is a suspicion of “wistful optimism” or the quasi-religious appeal to “hold hands” and play priest at the confessional. These qualms miss the centrality of listening to a radical humanism which recognises that dialogue is not merely about consensus or agreement but engagement and criticism. This is something that Primo Levi understood.

more from Les Back at Eurozine here.

james franco is interesting


Not so long ago, James Franco’s life and career were fairly normal. He grew up in Palo Alto, California, where his parents had met as Stanford students. Young James was, at his father’s urging, a math whiz—he even got an internship at Lockheed Martin. As a teenager, he rebelled, got in trouble with the law (drinking, shoplifting, graffiti), and eventually migrated toward the arts. His hero was Faulkner. He fell in love with acting when he played the lead in a couple of dark and heavy high-school plays. After freshman year, he dropped out of UCLA, very much against his parents’ wishes, to try to make a career of it. He was good, lucky, and driven, and within a couple of years, he got his first big break: Judd Apatow cast him in what would become the cult TV series Freaks and Geeks. When the series was canceled after just a season, Franco landed the lead in the TNT biopic James Dean. He played the part with a slumping intensity that seemed like a reasonable replication of the real thing—or at least much closer than anyone had a right to expect from a TNT biopic—and the performance won a Golden Globe. Soon after, he was cast as Robert De Niro’s drug-addicted son in the film City by the Sea. That same year, he entered mainstream consciousness as Peter Parker’s best friend in Spider-Man. Franco had become, in other words, a working Hollywood actor. An unusual actor—he overprepared for minor roles, read Dostoyevsky and Proust between takes, and occasionally drove colleagues crazy with his intensity—but still identifiably an actor, with an actor’s career. As he climbed toward leading-man status, however, Franco had a crisis of faith. He found himself cast in a string of mediocre films—Annapolis, Flyboys, Tristan + Isolde—most of which bombed. He felt like he was funneling all his effort into glossy, big-budget entertainment over which he had no control, and of which he wasn’t proud.

more from Sam Anderson at New York Magazine here.

Tuesday, August 3, 2010

The Honor of Exile

Norman Manea in Project Syndicate:

The Romanian sculptor Brancusi once said that when the artist is no longer a child, he is dead. I still don't know how much of an artist I have become, but I grasp what Brancusi was saying. I can grasp – even at my age – my childish enduring self. Writing is a childish profession, even when it becomes excessively serious, as children often are.

My long road of immaturity began more than half a century ago. It was July 1945, a few months after I returned from a concentration camp called Transnistria. I lived that paradisiacal summer in a little Moldavian town, overwhelmed by the miraculous banality of a normal, secure environment. The particular afternoon was perfect, sunny and quiet, the room's semi-obscurity hospitable. I was alone in the universe, listening to a voice that was and wasn't mine. My partner was a book of Romanian fairy tales with a hard green cover, which I had been given a few days before when I turned the solemn age of 9.

That is when the wonder of words, the magic of literature started for me. Illness and therapy began at the same time. Soon, too soon, I wanted myself to be part of that family of word wizards, those secret relatives of mine. It was a way of searching for “something else'' beyond the triviality of everyday life, and also of searching for my real self among the many individuals who inhabited me.

What Darwin Got Wrong

John Dupre reviews Fodor and Piattelli-Palmarini's What Darwin Got Wrong, in The Philosopher's Magazine:

Neo-Darwinism is, very roughly, the claim that natural selection is by far the most important explanation of biological form, the particular characteristics of particular kinds of organism. It usually includes a commitment to gradualism (the idea that evolution occurs in small steps), and often involves attributing central importance to genes as the units that natural selection selects, or at any rate as the objective measure of evolutionary change. Versions have been prominently defended in recent years by such authors as Richard Dawkins, Daniel Dennett and Jerry Coyne.

Neo-Darwinism is, however, a perspective under ever-growing pressure, not (or not only) from the antiscientific assaults of the religious, but from the advancement of science. The decline of this intellectual monolith is generally to be welcomed, not least because it may be expected to bring down with it some of its less appetising academic fellow travellers, most notably Evolutionary Psychology. At the same time those contributing to the demise of neo-Darwinism must be aware of the risk, especially in the United States, that they will provide succour for fundamentalist Creationists and aficionados of so-called Intelligent Design.

Fodor and Piattelli-Palmarini’s (henceforth FPP) book is intended as a contribution to the critical task just mentioned, and they are well aware of the potential hazards. Sadly, however, the book is an almost tragic failure: it is unlikely to be taken seriously as a contribution to the dismantling of neo-Darwinism and it has been, and will continue to be, picked up by the fundamentalist enemies of science.

The first half of the book does a decent job of summarising the recent scientific insights responsible for the growing difficulties facing neo-Darwinism. Neo-Darwinism, by virtue of its emphasis on natural selection, sees evolution as driven from outside, by the environment. Central among the difficulties that FPP emphasise are crucial respects in which evolution is constrained, or even driven, by internal features of the organism. This realisation has been promoted by evolutionary developmental biology (“evo-devo”), which has also highlighted the unacceptable black-boxing of development in mainstream evolutionary theory, a concomitant of the exclusive focus on external determinants of change. Also crucial has been a gradual move away from excessively atomistic views of organisms and an appreciation of the necessity of treating them as integrated wholes, illustrated by the impossibility of analysing the genome into a unique set of discrete elements, “genes”. And equally important has been the disclosure of the complexity of the relations between genomes and phenotypes.

While much material is presented that does indeed reveal the dire straits in which neo-Darwinism finds itself, the overall argument is generally elusive. I speculate that this is because there are two quite different conclusions in the offing.