The best of the year’s lives and letters

From The Guardian:

What makes us curious to read about someone’s life? Hype is not always enough. Much of the noise surrounding the launch of Salman Rushdie’s memoir Joseph Anton (Cape, £25) was the sound of backfiring, after his American literary agent made any lorry driver, bookseller or newspaper editor liable to $250,000 damages should a copy of the book (which champions freedom of speech) fall into the public’s hands before publication day. Once free to buy it for themselves, only some 12,000 readers responded. And yet, under the blanket coverage, a fine book was struggling to get out – overlong maybe, but also funny and painful.

Of course, it helps if the memoirist has a good enemy. Edna O’Brien’s Country Girl (Faber, £20) charts her escape from a succession of foes just as unyielding as the Ayatollah. First, the Virgin Mary – O’Brien was briefly in a convent; then her birthplace, Ireland, where she dug herself a “pestiferous, vile, slime-ridden pool of transgression”; then marriage, to a jealous rival who snarls: “You can write and I can never forgive you.” The early sections smack of the gilt-edged pages of O’Brien’s religious treasury. Words run away with her as she elopes with lover after hectic lover, and enters “the carnivore world of celebrity” on the arm of actors such as Richard Burton, whose diaries (The Richard Burton Diaries, Yale, £25, edited by Chris Williams) curiously omit their “mesmerised” encounters. But her account of bringing up two boys is extraordinarily affecting, and her understanding of the mystery of writing seems spot on: “It comes out of afflictions, out of the gouged times, when the heart is cut open.”

More here.

Sorry, vegans: Eating meat and cooking food is how humans got their big brains

Christopher Wanjek in the Washington Post:

Vegetarian, vegan and raw diets can be healthful, probably far more healthful than the typical American diet. But to call these diets “natural” for humans is a bit of a stretch in terms of evolution, according to two recent studies.

Eating meat and cooking food made us human, the studies suggest, enabling the brains of our prehuman ancestors to grow dramatically over a few million years.

Although this isn’t the first such assertion from archaeologists and evolutionary biologists, the new studies demonstrate that it would have been biologically implausible for humans to evolve such a large brain on a raw, vegan diet and that meat-eating was a crucial element of human evolution at least a million years before the dawn of humankind.

At the core of this research is the understanding that the modern human brain consumes 20 percent of the body’s energy at rest, twice that of other primates. Meat and cooked foods were needed to provide the necessary calorie boost to feed a growing brain.

One study, published last month in the Proceedings of the National Academy of Sciences, examined the brain size of several primates. For the most part, larger bodies have larger brains across species. Yet humans have exceptionally large, neuron-rich brains for our body size, while gorillas — three times as massive as humans — have smaller brains with one-third the neurons. Why?

More here.

Two-State Solution on the Line

Gro Harlem Brundtland and Jimmy Carter in the New York Times:

In the current political climate, it is highly unlikely that bilateral talks between Israel and the Palestinians can restart. Action is needed that will alter the current dynamic. As Elders, we believe that the Palestinian statehood bid at the United Nations is such a moment.

On Nov. 29, U.N. member states will be asked to vote on a resolution to grant “non-member observer state status” to Palestine, a significant upgrade from its current “observer entity” status. We urge all member states to vote in favor.

In going to the General Assembly, Palestinian President Mahmoud Abbas is not carrying out a provocative act. Nor is he undermining trust and distracting from the pursuit of peace, as his critics have said.

This is a vote for human rights and the rule of law. It is completely consistent with decades of commitment by the United States, Europe and the rest of the world to peace in the Middle East based on the creation of a viable and contiguous Palestinian state existing side by side with Israel. It is a lawful, peaceful, diplomatic act in line with past U.N. resolutions and international law.

More here. [Photo of Carter and Brundtland in East Jerusalem from here.]

Thursday Poem

Let me tell you about my marvelous god

Let me tell you about my marvelous god, how he hides in the hexagons
of the bees, how the drought that wrings its leather hands
above the world is of his making, as well as the rain in the quiet minutes
that leave only thoughts of rain.
An atom is working and working, an atom is working in deepest
night, then bursting like the farthest star; it is far
smaller than a pinprick, far smaller than a zero and it has no
will, no will toward us.
This is why the heart has paced and paced,
will pace and pace across the field where yarrow
was and now is dust. A leaf catches
in a bone. The burrow’s shut by a tumbled clod
and the roots, upturned, are hot to the touch.
How my god is a feathered and whirling thing; you will singe your arm
when you pluck him from the air,
when you pluck him from that sky
where grieving swirls, and you will burn again
throwing him back.

by Susan Stewart
from Columbarium
(University of Chicago Press, 2003)

Past 5,000 years prolific for changes to human genome

From Nature:

The human genome has been busy over the past 5,000 years. Human populations have grown exponentially, and new genetic mutations arise with each generation. Humans now have a vast abundance of rare genetic variants in the protein-encoding sections of the genome [1, 2]. A study published today in Nature [3] now helps to clarify when many of those rare variants arose. Researchers used deep sequencing to locate and date more than one million single-nucleotide variants — locations where a single letter of the DNA sequence differs between individuals — in the genomes of 6,500 African and European Americans. Their findings confirm early work by Akey [1] suggesting that the majority of variants, including potentially harmful ones, were picked up during the past 5,000–10,000 years. Researchers also saw the genetic stamp of the diverging migratory history of the two groups.

The large sample size — 4,298 North Americans of European descent and 2,217 African Americans — has enabled the researchers to mine down into the human genome, says study co-author Josh Akey, a genomics expert at the University of Washington in Seattle. He adds that the researchers now have “a way to look at recent human history in a way that we couldn’t before.” Akey and his colleagues were able to dig out genetic variants occurring in less than 0.1% of the sample population — a resolution that is a full order of magnitude finer than that achieved in previous studies, says Alon Keinan, a statistical geneticist at Cornell University in Ithaca, New York, who was not involved with the study. Of 1.15 million single-nucleotide variants found among more than 15,000 protein-encoding genes, 73% arose in the past 5,000 years, the researchers report. In all, 164,688 of the variants — roughly 14% — were potentially harmful, and of those, 86% arose in the past 5,000 years. “There’s so many of [variants] that exist that some of them have to contribute to disease,” says Akey.
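The quoted proportions hang together arithmetically. As a quick illustrative check (the figures are the article’s; the code is just arithmetic, and the variable names are my own):

```python
# Figures quoted in the Nature news piece: 1.15 million single-nucleotide
# variants in total, 164,688 of them classed as potentially harmful.
total_variants = 1_150_000
harmful_variants = 164_688

# Share of all variants that are potentially harmful.
harmful_share = harmful_variants / total_variants * 100
print(f"{harmful_share:.1f}%")  # 14.3% — matching the quoted "roughly 14%"

# Of the harmful variants, 86% are reported to have arisen in the
# past 5,000 years.
recent_harmful = int(harmful_variants * 0.86)
print(recent_harmful)  # 141631 — roughly 142,000 recent harmful variants
```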

More here.

Wednesday, November 28, 2012

Can a Jellyfish Unlock the Secret of Immortality?

Nathaniel Rich in the New York Times:

After more than 4,000 years — almost since the dawn of recorded time, when Utnapishtim told Gilgamesh that the secret to immortality lay in a coral found on the ocean floor — man finally discovered eternal life in 1988. He found it, in fact, on the ocean floor. The discovery was made unwittingly by Christian Sommer, a German marine-biology student in his early 20s. He was spending the summer in Rapallo, a small city on the Italian Riviera, where exactly one century earlier Friedrich Nietzsche conceived “Thus Spoke Zarathustra”: “Everything goes, everything comes back; eternally rolls the wheel of being. Everything dies, everything blossoms again. . . .”

Sommer was conducting research on hydrozoans, small invertebrates that, depending on their stage in the life cycle, resemble either a jellyfish or a soft coral. Every morning, Sommer went snorkeling in the turquoise water off the cliffs of Portofino. He scanned the ocean floor for hydrozoans, gathering them with plankton nets. Among the hundreds of organisms he collected was a tiny, relatively obscure species known to biologists as Turritopsis dohrnii. Today it is more commonly known as the immortal jellyfish.

Sommer kept his hydrozoans in petri dishes and observed their reproduction habits. After several days he noticed that his Turritopsis dohrnii was behaving in a very peculiar manner, for which he could hypothesize no earthly explanation. Plainly speaking, it refused to die.

More here. [Thanks to Steve Chasan.]

JACK GILBERT, 1925-2012

Alice Quinn in The New Yorker:

In California recently I introduced the poet Kay Ryan by admitting to the audience that I took an awfully long time to catch up to her work while I was poetry editor at The New Yorker. Kay had been sending in poems through the late eighties and early nineties, and I always responded respectfully, asking for another chance, politely declining the poems she’d sent. Eventually, she chided me, and I asked to see all the poems from her forthcoming book that had not been published. She sent them—a packet of ten or so, including poems I’d seen and sent back to her, and we accepted several, publishing three in one issue. Sometimes it takes the first big yes to unlock someone’s work. We read all the poems differently after we’ve fallen in love and been changed by that single poem.

Since Jack Gilbert died, on November 11th, friends have written to me of their passion for his work and of the big string of his poems that ran in The New Yorker in 2004 and 2005. I was late catching up to his work, too, in spite of my exposure to it. I was at Knopf when Gordon Lish published Jack’s second book of poetry, “Monolithos,” in 1982. At the time I didn’t peer into the book deeply enough to be captivated by the poems as I later decisively would be. Perhaps I was a little jealous— Gordon had obviously landed a big fish. It wasn’t until I made my way to The New Yorker and to the poetry desk that I took the measure of Jack’s immense singularity and had the opportunity to hold his work aloft myself.

More here.

No octaves. Ever.


But living in Rome for a year in the 70s changed me forever. Giacinto Scelsi (“the Charles Ives of Italy”) was there back then (I used his Steinway piano to compose there). He would sit at his dual Ondiolas in his apartment overlooking the Palatine Hill, recording primordial drones and spacey improvisations onto reel-to-reel tapes; Frederic Rzewski lived there, too — his variations on The People United worker song mixed Lisztian virtuosity with leftist political edginess, and made a big splash everywhere. Back then I used to trek up to see the gentle Goffredo Petrassi in his elegant apartment for lessons and lunch. I sat with Luciano Berio in Lukas Foss’ studio at the Villa Aurelia, as he lamented, “life is too short to have to sit through the mindless repetitions of those minimalists.”

more from Robert Beaser at The Opinionator here.

do we make reality?


Holt’s guiding premise here comes from the great 17th-century philosopher Baruch Spinoza: “Of all the possible resolutions to the mystery of existence,” Holt writes, “perhaps the most exhilarating would be the discovery that, contrary to all appearances, the world is causa sui: the cause of itself.” For Spinoza, all mental and physical existents were temporally modified expressions of a single substance, an infinite substance that he called God or Nature. Albert Einstein embraced Spinoza’s idea that the world was divine and self-causing, as more recently have other “metaphysically inclined physicists” like Sir Roger Penrose and the late John Archibald Wheeler. Such scientists go even further, proposing that human consciousness has a critical role in the world’s self-creation. “Although we seem to be a negligible part of the cosmos,” Holt writes in summary of these ideas, “it is our consciousness that gives reality to it as a whole.”

more from Jay Tolson at The American Scholar here.

One, Two or Three States


While a one-man one-vote scenario in a single bi-national state would no doubt look good on paper, and possibly even help heal some of the territorial, psychological, and legal wounds dividing the two communities, the lack of a clear pathway for achieving this goal and the seemingly insurmountable obstacles found in its way are such that this idea is often dismissed as utopian or a simple thought experiment at best. Israel’s categorical refusal to even consider a one-state paradigm, given that this would imply an end to the Jewish demographic majority, coupled with the great uncertainty and high risk of violence that would follow an eventual disbanding of the PA or an Israeli physical re-occupation of the West Bank are also matters worth considering. The Gaza Strip, ruled by Hamas as a separate entity from the West Bank since 2007, is another question mark hovering over the one-state paradigm, and the continued division between the Palestinian factions of Hamas and Fatah further complicates not only the one-state but also the two-state formula given that in practice what has been developing on the ground is a three-state reality. Finally, many critics of the one-state framework point to the fact that nationalism and a quest for sovereignty continue to be the major driving forces underpinning the conflict and further that there is no functioning example of a bi-national state in the Middle East. On the contrary, the worrying example of Lebanon’s ethno-religious conflicts is often cited as a case in point.

more from Andrea Dessì at Reset here.

Invisible Borders: Mohsin Hamid’s Moth Smoke

Emily St. John Mandel in The Millions:

Hamid is best-known for his second novel, The Reluctant Fundamentalist. Moth Smoke, recently re-released by Riverhead, was his first. The plotting is masterful, especially for a first novel; Hamid’s shifts in perspective are effective, and while we know from the courtroom scene at the beginning that a child will be killed, we don’t know which one, which makes the appearance of every child in these pages an event fraught with peril. Moth Smoke was published to considerable critical acclaim in 2000, which is to say after Pakistan tested its first nuclear weapons, and the arms race between Pakistan and India forms the jittery backdrop to Daru’s long goodbye.

The book is suffused through and through by a terrible uncertainty, a sense of ground shifting beneath one’s feet. Hamid uses Daru’s descent from middle-class respectability to desperation as a lens through which to examine the corruption and the complexities of late-’90s Pakistani society. It’s not an especially flattering portrait, but a major strength of the book is Hamid’s refusal to take an easy moralistic stance. Daru is a victim of circumstance, but he’s also capable of cruelty, poor decisions, and hypocrisy. He feels victimized by the monied classes, but mistreats the boy who keeps house for him. He carries a strain of entitlement; he’s insulted by the suggestion that he sell cars for a living. Through his education and through his friendship with Ozi he’s been allowed a glimpse at an entirely different Pakistan, the Lahore of the monied elite, and his rage is fueled by the impossibility of crossing the border from his Pakistan to Ozi’s. Ozi’s Pakistan is a fortress. Money is the only way in.

Ozi is amoral, but on the other hand, as he notes in a chapter told from his perspective, he manages not to slap his housekeeper and he doesn’t sleep with his best friend’s wife, which is more than can be said for Daru. Yes, Ozi acknowledges, the rich in Pakistan use their money to create their own reality, separate from the rest of the struggling country, but what choice do they have?

More here.

Wednesday Poem

Keats

When Keats, at last beyond the curtain
of love’s distraction, lay dying in his room
on the Piazza di Spagna, the melody of the Bernini
Fountain “filling him like flowers,”
he held his breath like a coin, looked out
into the moonlight and thought he saw snow.
He did not suppose it was fever or the body’s
weakness turning the mind. He thought, “England!”
and there he was, secretly, for the rest
of his improvidently short life: up to his neck
in sleigh bells and the impossibly English cries
of street vendors, perfect
and affectionate as his soul.
For days the snow and statuary sang him so far
beyond regret that if now you walk rancorless
and alone there, in the piazza, the white shadow
of his last words to Severn, “Don’t be frightened,”
may enter you.

by Christopher Howell
from Light’s Ladder
University of Washington Press, 2004

Alexandre Dumas

From Harvard Magazine:

He was the son of a black slave and a renegade French aristocrat, born in Saint-Domingue (now Haiti) when the island was the center of the world sugar trade. The boy’s uncle was a rich, hard-working planter who dealt sugar and slaves out of a little cove on the north coast called Monte Cristo—but his father, Antoine, neither rich nor hard-working, was the eldest son. In 1775, Antoine sailed to France to claim the family inheritance, pawning his black son into slavery to buy passage. Only after securing his title and inheritance did he send for the boy, who arrived on French soil late in 1776, listed in the ship’s records as “slave Alexandre.” At 16, he moved with his father, now a marquis, to Paris, where he was educated in classical philosophy, equestrianism, and swordsmanship. But at 24, he decided to set off on his own: joining the dragoons at the lowest rank, he was stationed in a remote garrison town where he specialized in fighting duels. The year was 1786. When the French Revolution erupted three years later, the cause of liberty, equality, and fraternity gave him his chance. As a German-Austrian army marched on Paris in 1792 to reimpose the monarchy, he made a name for himself by capturing a large enemy patrol without firing a shot. He got his first officer’s commission at the head of a band of fellow black swordsmen, revolutionaries called the Legion of Americans, or simply la Légion Noire. In the meantime, he had met his true love, an innkeeper’s daughter, while riding in to rescue her town from brigands.

If all this sounds a bit like the plot of a nineteenth-century novel, that’s because the life of Thomas-Alexandre Davy de la Pailleterie—who took his slave mother’s surname when he enlisted, becoming simply “Alexandre (Alex) Dumas”—inspired some of the most popular novels ever written. His son, the Dumas we all know, echoed the dizzying rise and tragic downfall of his own father in The Three Musketeers and The Count of Monte Cristo. Known for acts of reckless daring in and out of battle, Alex Dumas was every bit as gallant and extraordinary as D’Artagnan and his comrades rolled into one. But it was his betrayal and imprisonment in a dungeon on the coast of Naples, poisoned to the point of death by faceless enemies, that inspired his son’s most powerful story.

More here.

The History of Boredom

From Smithsonian:

“Boredom” first became a word in 1852, with the publication of Charles Dickens’ convoluted (and sometimes boring) serial, Bleak House; as an emotional state, it obviously dates back a lot further. Roman philosopher Seneca talks about boredom as a kind of nausea, while Greek historian Plutarch notes that Pyrrhus (he of the “Pyrrhic victory”) became desperately bored in his retirement. Dr. Peter Toohey, a Classics professor at the University of Calgary, traced the path of being bored in 2011 in Boredom: A Lively History. Among the stories he uncovered was one from the 2nd century AD in which one Roman official was memorialized with a public inscription for rescuing an entire town from boredom (the Latin taedia), though exactly how is lost to the ages. And the vast amount of ancient graffiti on Roman walls is a testament to the fact that teenagers in every era deface property when they have nothing else to do.

In Christian tradition, chronic boredom was “acedia”, a sin that’s sort of a proto-sloth. The “noonday demon”, as one of its early chroniclers called it, refers to a state of being simultaneously listless and restless and was often ascribed to monks and other people who led cloistered lives. By the Renaissance, it had morphed from a demon-induced sin into melancholia, a depression brought on by too aggressive study of maths and sciences; later, it was the French ennui. In the 18th century, boredom became a punitive tool, although the Quakers who built the first “penitentiary” probably didn’t see it that way. In 1790, they constructed a prison in Philadelphia in which inmates were kept in isolation at all hours of the day. The idea was that the silence would help them to seek forgiveness from God. In reality, it just drove them insane.

More here.

Tuesday, November 27, 2012

The Neurofeminist


Richard Marshall interviews Anne Jaap Jacobson, in 3:AM Magazine:

3:AM: …In your paper ‘Dennett’s Dangerous Ideas: Elements of a critique of Cognitivism’ you begin with the blunt statement, ‘Human beings do sometimes believe false generalisations about themselves… we have, or may have, false beliefs about our psychology.’ You give a thorough tour of the geography of this terrain. A key thinker for you, in that he helps set the terms of the discussion, is Stephen Stich who claims that we may well not have beliefs and desires. Could you say something about how someone might think such a thing?

AJJ: We talk about beliefs and desires a lot, and it seems bizarre to say that that is all wrong. One sensible thing one might mean, though, is that beliefs and desires are not the natural kinds that we need for a deeper understanding of how the mind works. I do not think, as eliminativists are inclined to do, that we can simply give up such ways of talking. Our moral and prudential language is tied to our psychological ascriptions. I have tentatively distinguished between synthetic and analytic approaches, where the synthetic approach attempts to capture the phenomenology. The analytic approach looks more to the scientific explanations in a possibly quite different language. I think we can expect there to be corrections in either direction.

3:AM: OK, so then you argue that there is a mistake at the very heart of recent philosophy of mind. You think this is a mistake inherited from eighteenth century thought and has been transplanted into the work of philosophers such as Dennett and Fodor. So first can you tell us about this error?

AJJ: Let me explain how my views have become enlarged, though not really changed. I thought for some time that I simply could not do philosophy of mind. I first encountered it in the taxonomies of Gilbert Ryle and Tony Kenny. Ryle was in fact my over-all BPhil supervisor at Oxford. Rather to my horror, I realised that, as it were, I was failing to grasp what it was to go on in the same way. It was completely eluding me. At that time, I was doing a lot of work on causation, and so I reckoned that I’d better stick to metaphysics and epistemology, which seemed to me more tractable. I now think that one problem was that mental terms were being assumed to have some unity that I now believe they do not.

Infinite Proofs: The Effects of Mathematics on David Foster Wallace


Kyle McCarthy reviews David Foster Wallace's Everything and More, in the LA Review of Books:

TO THE EXTENT THAT HE WAS AT HOME anywhere, David Foster Wallace was at home in the world of math. As an undergraduate, he studied modal logic; Everything and More, his book on infinity, explained Georg Cantor’s work on set theory to a general audience, and Infinite Jest includes a two-page footnote that uses the Mean Value Theorem to determine the distribution of megatonnage among players in a nuclear fallout game.

But Wallace didn’t just talk about math. He structured his work with it. In a 1996 Bookworm interview with Michael Silverblatt, Wallace explained that he modeled Infinite Jest after a Sierpinski Gasket, a type of fractal in which a triangle is infinitely subdivided into smaller triangles using the midpoint of its borders. Pressed by Silverblatt on why he chose such a formation, Wallace elaborated: “Its chaos is more on the surface; its bones are its beauty.”

Now, many people agree that Infinite Jest is a singular novel, sui generis, akin perhaps only to Moby-Dick in its originality, but the qualities that earn the book that praise — its grotesque hyperrealism, exuberant asides, and melding of academese and slang, its spikes and spurts of kindness and abjection — seem to have nothing to do with Wallace’s experimental use of fractals. Wallace’s genius lies in his guts, his encyclopedic imagination, his eyes and ears, but not, it appears, in his tricks with advanced math. And yet perhaps the fact that the casual reader remains oblivious to the Sierpinski Gasket is proof of its success. Traditional narrative structures — the Fichtean curve, Aristotle’s rising action — are designed to keep us engaged and organized, yet remain invisible; a well-placed climax pops and hooks, even if we don’t notice its strategic placement. And as an organizing principle, the fractal has an intuitive logic: the best novels already have a fractionalized quality — each chapter, and indeed every paragraph and sentence, reproduces in miniature its central conflict and arc. Wallace’s comment to Silverblatt made me wonder if fractals, or some other mathematical pattern, might generate order from everyday experience without the ordinary contrivances of plot.
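The midpoint subdivision described above is simple to state in code. This is a generic sketch of the gasket construction itself, not anything drawn from Wallace’s text; the function and variable names are mine:

```python
def subdivide(tri, depth):
    """Recursively split a triangle into the three corner triangles
    formed by the midpoints of its sides -- the step that, iterated,
    produces a Sierpinski gasket (the middle triangle is discarded)."""
    if depth == 0:
        return [tri]
    (ax, ay), (bx, by), (cx, cy) = tri
    ab = ((ax + bx) / 2, (ay + by) / 2)  # midpoint of side A-B
    bc = ((bx + cx) / 2, (by + cy) / 2)  # midpoint of side B-C
    ca = ((cx + ax) / 2, (cy + ay) / 2)  # midpoint of side C-A
    corners = [((ax, ay), ab, ca), (ab, (bx, by), bc), (ca, bc, (cx, cy))]
    result = []
    for t in corners:
        result.extend(subdivide(t, depth - 1))
    return result

# Each level keeps 3 corner triangles per triangle: 1, 3, 9, 27, ...
gasket = subdivide(((0.0, 0.0), (1.0, 0.0), (0.5, 0.866)), 3)
print(len(gasket))  # 27
```

The analogy to the novel is the self-similarity: every one of the 27 small triangles is a scaled copy of the original, just as each chapter is said to reproduce the whole in miniature.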

Rorty and the Democratic Power of the Novel


Christopher J. Voparil in Eurozine:

The context of Rorty's embrace of the novel within the development of his own thought is instructive for understanding his view of the distinctive power of this genre over others, including philosophy. As he fleshed out the political consequences of his sweeping philosophical critique in Philosophy and the Mirror of Nature, in the 1990s social and political concerns came to the forefront of his work in an unprecedented way. A central preoccupation of Rorty's was “how we treat people whom we think not worth understanding” – that is, those people who “are not viewed as possible conversational partners”. The thesis I will argue is that Rorty's turn to the novel is part of an effort to bring excluded voices into what he called, in the final section of Mirror, the “conversation of mankind”.

The primary thrust of Philosophy and the Mirror of Nature advocates a fundamental shift away from a conception of knowledge as accuracy of representation and towards an understanding of knowledge as conversation and social practice. The idea that conversation is “the ultimate context within which knowledge is to be understood” leads Rorty to a preoccupation with “conversation with strangers”, understood as those who fall outside our “sense of community based on the imagined possibility of conversation”. If, as Rorty claimed, “the community is the source of epistemic authority”, and, building on Wilfrid Sellars, “we can only come under epistemic rules when we have entered the community where the game governed by these rules is played,” then we attribute knowledge to beings “on the basis of their potential membership in this community”. To illustrate this point, Rorty gives the example of how we are more likely to get sentimental about “babies and the more attractive sorts of animal” as having feelings than, say, “flounders and spiders”. Likewise, we are more likely to care about koalas than pigs, he tells us, even though pigs rate higher on the intelligence scale, because “pigs don't writhe in quite the right humanoid way, and the pig's face is the wrong shape for the facial expressions which go with ordinary conversation”.

Derrida: A Biography


Terry Eagleton reviews Benoît Peeters’s new book, in The Guardian:

In May 1992, the dons of Cambridge University filed into their parliament to vote on whether to award an honorary degree to the French philosopher Jacques Derrida, founder of so-called deconstruction. Despite a deftly managed smear campaign by the opposition, Derrida's supporters carried the day. It would be interesting to know how many of those who tried to block him in the name of rigorous scholarship had read a single book of his, or even a couple of articles.

The truth is that they did not need to. The word was abroad that this purveyor of fashionable French gobbledegook was a charlatan and a nihilist, a man who believed that anything could mean anything and that there was nothing in the world but writing. He was a corrupter of youth who had to be stopped in his tracks. As a teenager, Derrida had fantasised with some of his friends about blowing up their school with some explosives they had acquired. There were those in Cambridge who thought he was planning to do the same to western civilisation. He did, however, have an unlikely sympathiser. When the Duke of Edinburgh, chancellor of Cambridge University, presented Derrida with his degree in the year in which Charles and Diana separated, he murmured to him that deconstruction had begun to affect his own family too.

Deconstruction holds that nothing is ever entirely itself. There is a certain otherness lurking within every assured identity. It seizes on the out-of-place element in a system, and uses it to show how the system is never quite as stable as it imagines. There is something within any structure that is part of it but also escapes its logic. It comes as no surprise that the author of these ideas was a Sephardic Jew from colonial Algeria, half in and half out of French society. If his language was French, he could also speak the patois of working-class Arabs. He would later return to his home country as a conscript in the French army, a classic instance of divided identity.