“Happy families are all alike”, claimed Tolstoy, while “every unhappy family is unhappy in its own way.” The same could be said of individuals. Happiness, a sense of well-being, involves a feeling of rightness with the world, of belonging in one’s own skin, while unhappiness and dysfunction have their own infinite variety. The mind’s response to emotional pain is ever inventive. Self-destruction is a creative business. In many cases it turns out to be a life’s work, as those who give their true confessions to the artist Gillian Wearing attest.
In I’m OK – You’re OK (1969), Thomas A. Harris’s popularisation of Eric Berne’s post-Freudian model of transactional analysis, the relationships between the internal adult, parent and child are explored so that the maladaptations embedded in old childhood scripts can be confronted, and the individual freed of inappropriate emotions that are not a true reflection of the here-and-now. Because people decide their own stories and destinies, it is argued, attitudes can be changed. That is the ideal, anyway. Yet many of those who chose to answer a small ad placed in Time Out in 1994 – which read: ‘Confess all on video. Don’t worry, you will be in disguise. Intrigued? Call Gillian’ – may have felt that they had little choice when it came to addictive, sad or compulsive behaviour.
It was this act that set in motion the artist Gillian Wearing’s work with strangers. Whilst she explores cultural notions of production versus the finished work, such technical niceties are much less interesting than the stories that her sitters have to tell and the apparent compulsion that they have to share their pain, on record, with whoever happens to be listening. Wearing first began to use masks, along with joke-shop wigs and false beards, in this 1994 video in which variously disguised figures speak straight into the camera. Confess All on Video… consists of ten voices edited into a continuous 30-minute piece. There is an array of confessions, from the admission of a first visit to a brothel to an incredibly sad narrative from a nervous man disguised as George Bush who tells of an incestuous relationship with his siblings that has quite literally ruined his life. Protected by their anonymity and free of any judgmental response, the participants are remarkably candid. This seems to connect back to the use of masks in ancient Greek drama. The mask, then, was a significant element in the worship of Dionysus and is known to have been used since the time of Aeschylus by members of the chorus, who were there to help the audience know what a character was thinking. Illustrations from the 5th century BC display helmet-like masks, covering the entire face and head of the actors, with holes for the eyes and a small aperture for the mouth, as well as an integrated wig. It is interesting to note that these ancient paintings never show actual masks on the actors in performance; they are mostly shown being handled by the actors before or after a performance, emphasising the liminal space between the audience and the stage, between myth and reality. The mask melted into the face, allowing the actor to vanish into a role.
Research suggests that the mask served as a resonator for the head, enhancing vocal acoustics and altering their quality, leading to an increased energy and presence that allowed for the more complete metamorphosis of the actor into his character. Many of these aspects remain true in Gillian Wearing’s work.
An empty downtown, with boarded-up shops and desolate sidewalks, is truly a sad sight to behold. It is also symptomatic of much larger forces, namely the flight from the urban core into the suburbs that wound up decimating the vitality of American cities during the second half of the 20th century. Last month I argued that the urban form of the skywalk was a partial and misguided response to reviving the emptied-out downtowns of American cities. In most instances these structures, which sought to connect buildings without touching the street, were a prolonged, painful failure, because they further segregated street life and did not succeed in drawing people back into that urban core, at least in a way that could be considered dynamic and responsive to the larger needs of the urban fabric. In a sense, much was expected of skywalks, but in fact they were little more than a Band-Aid, and served only to exacerbate the problem through the fundamentally anti-social tendencies that underlie their design and use.
And yet, like any other urban form, skywalks are agnostic – what determines their success is not just their design and implementation, but also the problem that they seek to address. It is perhaps more accurate to say that skywalks, along with many other forms of intervention in the urban built environment, reveal the question that designers have posed themselves, believing that that question, whatever it might be, is in fact the correct and most pressing one. So, in the case of American cities, skywalks were employed to revive downtowns, and, generally speaking, failed. Other cities around the world have enlisted skywalks not because there is too little density, but because there is too much. Does this new context increase the possibility of success? In order to understand what a difference a difference makes, we first need to consider the forces that shaped cities in the West, and what the difference might be between this phenomenon and that of the global urban South.
The narrative describing the development of American cities can be retold as a narrative of excessive space. When energy and labour are cheap, economic logic drives growth outward; it is always easier to build on virgin ground than to reorganize an existing built environment. This is especially true when urban areas are not bounded by geographic obstacles such as water or mountains – a condition true of most Midwestern cities and not a few coastal ones.
Fussell had written a guide to poetic form and an equally fine critical life of Samuel Johnson when, in 1975, he broke out as an intellectual celebrity with The Great War and Modern Memory, which won the National Book Award and National Book Critics Circle Award. The Great War tells the story of the destruction of the 19th century —of its class system and its faith in progress; really, of any way of living predicated on a stable system of value —by World War I. Out of the mass experience of pointless death, a new way of speaking and writing, devoid of euphemism, arose, a plain style we associate with Hemingway (“Abstract words such as glory, honor, courage, or hallow were obscene beside the concrete names of villages, the number of roads, the names of rivers, the number of regiments and dates”) but in England may just as easily evoke Siegfried Sassoon, Wilfred Owen, Robert Graves, and Edmund Blunden —writers who saw action in the Great Fuck-Up, as infantrymen soon called it, writers who, as a result of firsthand acquaintance with the trenches, sought a way of making literature without any recourse to elevated literary diction.
The Great War chronicles the loss of the old rhetoric, of high pieties, of sacrifice and roseate dawns, in favor of “blood, terror, agony, madness, shit, cruelty, murder, sell-out, pain and hoax,” as Fussell lists it at one point; the sound of “ominous gunfire heard across water.” Fussell himself fought in World War II, and himself wrote in a candid style. “I am saying,” he concludes one chapter in The Great War, as if replying to a margin note from a junior editor, “that there seems to be one dominating form of modern understanding; that it is essentially ironic; and that it originates largely in the application of mind and memory to the events of the Great War.”
Richard Fortey has spent most of his life looking at fossils, the imprints of the skeletons of the very thoroughly dead. Here he sets out — like a more deeply thoughtful David Attenborough, without the cameras — to describe the distinguished groups of organisms that are still recognizable and thriving after millions and millions of years. The horseshoe crabs, velvet worms and other venerable creatures he encounters are Earth’s true conservatives. “We’ve devised a system that works very well for our niche,” they would tell us. “No big changes necessary. Maybe just a tweak at the molecular level.” As Fortey says, “to look at a living horseshoe crab is to see a portrait of a distant ancestor repainted by time, but with many of its features still unchanged.”
Fortey’s dozen or so subjects have survived the many cataclysms the planet has thrown at them over the past 450 million years. As if repeated earthquakes, volcanic eruptions and ice sheets weren’t enough, there were two mass-extinction events. The best known was the disaster 65 million years ago that led to the downfall of the dinosaurs. We’re less familiar with the more devastating earlier extinction — about 251 million years ago — that erased 90 percent of life from the sea and almost as large a percentage of the little things struggling on land. The horseshoe crab made it through; its fossil remains date from 450 million years ago.
Somewhere then, perhaps at the bottom of a poisoned sea, with tsunamis rolling above, some organisms stayed alive, including something we would recognize as the horseshoe crab if it clambered up onto the beach. It’s astonishing to consider that the lucky few — arthropods, snails, clams, jellyfish, worms and a few small four-legged creatures on land — that survived the worst extinction gave rise to everything that followed, including us.
Of all the indignities visited on the writer’s life these days, none is more undignified than the story or pitch meeting, a ritual to which every writer, from the gazillion-dollar screenwriter to the lowly essayist, will sooner or later submit. “So tell us the story,” the suits say after a few minutes of banter and schmooze, and the writer gulps and jumps in. “Well, uh, it’s sort of, like—it’s sort of a fish-out-of-water story…” and then, as one pale incident succeeds the next, the tycoons emit a slow burn of polite disbelief and boredom, ending with a forced smile and a we’ll-get-back-to-you. Sometime. Soon…
And yet something interesting, even encouraging, is revealed in this ritual, all its humiliations aside. Stories, more even than stars or spectacle, are still the currency of life, or at least of commercial entertainment, and look likely to last longer than the euro. There’s no escaping stories, or the pressures to tell them. And so the pathetic story-pitcher turns to pop science—to Jonathan Gottschall’s new book, “The Storytelling Animal,” for instance—for some scientific, or at least speculative, ideas about what makes stories work and why we like them. Gottschall’s encouraging thesis is that human beings are natural storytellers—that they can’t help telling stories, and that they turn things that aren’t really stories into stories because they like narratives so much. Everything—faith, science, love—needs a story for people to find it plausible. No story, no sale.
We all know the adage that dogs are man’s best friend. And we’ve all heard heartwarming stories about dogs who save their owners—waking them during a fire or summoning help after an accident. Anyone who has ever loved a dog knows the amazing, almost inexpressible warmth of a dog’s companionship and devotion. But it just might be that dogs have done much, much more than that for humankind. They may have saved not only individuals but also our whole species, by “domesticating” us while we domesticated them.
One of the classic conundrums in paleoanthropology is why Neandertals went extinct while modern humans survived in the same habitat at the same time. (The phrase “modern humans,” in this context, refers to humans who were anatomically—if not behaviorally—indistinguishable from ourselves.) The two species overlapped in Europe and the Middle East between 45,000 and 35,000 years ago; at the end of that period, Neandertals were in steep decline and modern humans were thriving. What happened?
A stunning study that illuminates this decisive period was recently published in Science by Paul Mellars and Jennifer French of Cambridge University. They argue, based on a meta-analysis of 164 archaeological sites that date to the period when modern humans and Neandertals overlapped in the Dordogne region of southwest France, that the modern-human population grew so rapidly that it overwhelmed Neandertals with its sheer numbers.
This past October, at an Occupy encampment in Cleveland, Ohio, “suspicious males with walkie-talkies around their necks” and “scarves or towels around their heads” were heard grumbling at the protesters' unwillingness to act violently. At meetings a few months later, one of them, a 26-year-old with a black Mohawk known as “Cyco,” explained to his anarchist colleagues how “you can make plastic explosives with bleach,” and the group of five men fantasized about what they might blow up. Cyco suggested a small bridge. One of the others thought they’d have a better chance of not hurting people if they blew up a cargo ship. A third, however, argued for a big bridge – “Gotta slow the traffic that's going to make them money” – and won. He then led them to a connection who sold them C-4 explosives for $450. Then, the night before the May Day Occupy protests, they allegedly put the plan into motion – and just as the would-be terrorists fiddled with the detonator they hoped would blow to smithereens a scenic bridge in Ohio’s Cuyahoga Valley National Park traversed by 13,610 vehicles every day, the FBI swooped in to arrest them.
Right in the nick of time, just like in the movies. The authorities couldn’t have more effectively made the Occupy movement look like a danger to the republic if they had scripted it. Maybe that's because, more or less, they did.
Morning traffic makes excuses at every changing light.
Now the whole world is jealous of Shanghai, humming
And hawing, purring and growling. One heart beats faster
As it runs the red, a double-decker bus screams with rage:
A spluttering of outrage, a fit of blue pique. There is only
One life, you fool, the conductor says, as he pummels
The green bonnet of something smaller than mankind.
This morning the whole world should be one Shanghai,
Quick-witted, light-footed, like the handsome young novelists
Having cocktails at the Glamour Bar, beautiful as Shanghai
That turns its proud head to foreigners, not caring a damn
As the fish rise in the canal seine-net, as the scholar turns
To split an atom. I turn into the Nanking Road and find
A sheet of luminescent Venetian glass, colours of agave
And opal, texture of oriental cloth upon the window of morning,
Upon the face of my Lord Byron. Traffic like this is worth it –
Worth it to have so many humans in the nest of life together.
Though these are mostly the swollen lights and embarrassed tails
Of saloon and estate Fords, of aluminium grills, forgotten marques,
Stray Bentleys and garaged ZV plates. A mist of life,
An apron of colour, a human luminescence, levitates over
The asphalt of Shanghai. Here, the sin of the world is bleached
With business. It is history that the traffic circles round,
History that pulls the night-grill from its earth-lock and begins
To trade. Such morning traffic: here in Shanghai it teaches me
improvisations of perfect form, the beginnings of Yeatsean
indigo, a shadow disturbed, the colour of beautiful.
B. F. Skinner’s notorious theory of behavior modification was denounced by critics 50 years ago as a fascist, manipulative vehicle for government control. But Skinner’s ideas are making an unlikely comeback today, powered by smartphone apps that are transforming us into thinner, richer, all-around-better versions of ourselves. The only thing we have to give up? Free will.
David H. Freedman in The Atlantic:
Most of us know someone who lost weight years ago and has kept it off, and we all see celebrities who claim to have slimmed down for good using plain old diet and exercise, from Bill Clinton to Drew Carey to Jennifer Hudson. But we keep hearing that the vast majority of us—98 percent is a figure that gets thrown about—can’t expect to do the same.
Alcoholics don’t seem to face such dismal prospects, thanks to Alcoholics Anonymous and similar multistep programs, which are widely regarded as effective treatments. With obesity, we’re apparently at a loss for a clear answer. Fads like the Atkins diet slowly fade in popularity after dieters watch the weight return. We’re left with the impression that the techniques needed to permanently lose weight don’t exist, or apply to only a tiny percentage of the population, who must be freaks of willpower or the beneficiaries of exotic genes. Scientists and journalists have lined up in recent years to pronounce the diet-and-exercise regimen a nearly lost cause—a view argued in no fewer than three cover stories and another major article in The New York Times Magazine over the past 10 years, and in a cover story in this magazine two years ago.
All of which is odd, because weight-loss experts have been in fairly strong agreement for some time that a particular type of diet-and-exercise program can produce modest, long-term weight loss for most people. But this program tends to be based in clinics operated by relatively high-priced professionals, and requires a significant time commitment from participants—it would be as if the only way to get treated for alcoholism were to check into the Betty Ford Center. The problem is not that we don’t know of a weight-control approach that works; it’s that what works has historically been expensive and inconvenient.
LEHRER: The title of The Self Illusion is literal. You argue that the self – this entity at the center of our personal universe – is actually just a story, a “constructed narrative.” Could you explain what you mean?
HOOD: The best stories make sense. They follow a logical path where one thing leads to another, and provide the most relevant details and signposts along the way so that you get a sense of continuity and cohesion. This is what writers refer to as the narrative arc – a beginning, a middle and an end. If a sequence of events does not follow a narrative, then it is incoherent and fragmented, and so does not have meaning. Our brains think in stories. The same is true for the self, and I use a distinction that William James drew between the self as “I” and “me.” Our consciousness of the self in the here and now is the “I,” and most of the time we experience this as being an integrated and coherent individual – a bit like the character in the story. The self we tell others about is autobiographical, the “me,” which again is a coherent account of who we think we are based on past experiences, current events and aspirations for the future.
The neuroscience supports the claim that the self is constructed. For example, Michael Gazzaniga demonstrated that split-brain patients presented with inconsistent visual information would readily confabulate an explanation to reconcile information unconsciously processed with information that was conscious. They would make up a story. Likewise, Oliver Sacks famously reported various patients who could confabulate accounts to make sense of their impairments. Ramachandran describes patients who are paralyzed but deny they have a problem. These are all extreme clinical cases, but the same is true of normal people. We can easily spot the inconsistencies in other people’s accounts of their self, but we are less able to spot our own, and when those inconsistencies are made apparent by the consequences of our actions, we make the excuse, “I wasn’t myself last night” or “It was the wine talking!” Well, wine doesn’t talk, and if you were not your self, then who were you, and who was being you?
Americans of all types — Democrats and Republicans, even some Good Progressives — are just livid that a Pakistani tribal court (reportedly in consultation with Pakistani officials) has imposed a 33-year prison sentence on Shakil Afridi, the Pakistani physician who secretly worked with the CIA to find Osama bin Laden on Pakistani soil. Their fury tracks the standard American media narrative: by punishing Dr. Afridi for the “crime” of helping the U.S. find bin Laden, Pakistan has revealed that it sympathizes with Al Qaeda and is hostile to the U.S. (NPR headline: “33 Years In Prison For Pakistani Doctor Who Aided Hunt For Bin Laden”; NYT headline: “Prison Term for Helping C.I.A. Find Bin Laden”). Except that’s a woefully incomplete narrative: incomplete to the point of being quite misleading.
What Dr. Afridi actually did was concoct a pretextual vaccination program, whereby Pakistani children would be injected with a single Hepatitis B vaccine, with the hope of gaining access to the Abbottabad house where the CIA believed bin Laden was located. The plan was that, under the ruse of vaccinating the children in that province, he would obtain DNA samples that could confirm the presence in the suspected house of the bin Laden family. But the vaccine program he was administering was fake: as Wired’s public health reporter Maryn McKenna detailed, “since only one of three doses was delivered, the vaccination was effectively useless.” An on-the-ground Guardian investigation documented that “while the vaccine doses themselves were genuine, the medical professionals involved were not following procedures. In an area called Nawa Sher, they did not return a month after the first dose to provide the required second batch. Instead, according to local officials and residents, the team moved on.”
That means that numerous Pakistani children who thought they were being vaccinated against Hepatitis B were in fact left exposed to the virus.
In the days when he was the hip young gunslinger of British fiction, the Martin Amis interview tended to follow a certain form. This would involve tyro journalists – Amis wannabes for the most part – joining their subject at the snooker table or on the tennis court, where the author would go through his famously competitive paces, presenting the journalist with the tricky dilemma of whether to throw the game and curry his favour, or beat him and risk his resentment. But at 62, time and Amis’s recent relocation to New York have put something of a damper on his sporting enthusiasms. The pub and snooker evenings were long ago sacrificed to family life. And he no longer plays tennis. 'It just got so tragic,’ he says with a sigh. 'I hated it so much – because I wasn’t winning. Isabel says, “Play 80-year-olds, you’ll win against them.” But that’s no good. I can still run – not as fast. My game was built on mobility; didn’t have any big shots or anything. A defensive lob was my big shot. But it’s more to do with reflexes. You shape to do it and you’re not there – you’re crowding it, and the ball’s out of reach, and it fills you with a weird sort of self-disgust. Solemn exasperation and self-disgust.’ Nowadays, he can’t even watch the Premier League because he is unable to operate the television. 'Pathetic!’ He gives a rueful shrug. 'The technology has moved so far beyond my competence.’
Amis relocated to New York some 18 months ago, and now lives in the Cobble Hill district of Brooklyn, in a handsome four-storey brownstone, with his wife, the writer Isabel Fonseca, and their two teenage daughters, Fernanda and Clio. It is tempting to read something into the move. One of the recurring themes of Amis’s pronouncements over the past few years has been a palpable disenchantment with England and English life: the 'skanky town’ malice of London’s literary world; his bald declaration to a French newspaper that he would 'prefer not to be English’; the sense that his homeland is a busted flush; the fact that his new book, Lionel Asbo, is a satire on the shallowness and vulgarity of celebrity-obsessed Britain. All of this may or may not be true, but it is not the reason he has decamped to America. Isabel, he says, is a New Yorker, and wanted to be closer to her mother and stepfather as they grew older.
In looking at In Our Time and A Moveable Feast, we've mainly focused on Hemingway as a young man: fit, vigorous and heading for the stratosphere. But as Mogger64 noted in his original nomination, it's significant that A Moveable Feast was “written at the end of his life”. It isn't quite the work of an old man. Hemingway never made it that far. But it's pretty much the last word from someone on the way out. It speaks as loudly of Hemingway at the end of his career as it does of the beginning. And that career was remarkable. He had done it all by 1956, when he was spurred into reminiscence following the rediscovery of some old Paris notebooks which had lain for many years in a trunk in the basement of the Ritz hotel. He'd won the Nobel prize. He'd won the Pulitzer prize. He'd sold hundreds of thousands of books. He'd inspired dozens of imitators. He'd become an adjective and a legend. His life outside writing was just as celebrated: the bullfight aficionado, the boxer, the big-game hunter, the fisherman, the friend of Spanish Republicans, the man who liberated Paris. Papa: the tall, handsome, heavyweight alpha male.
But by 1956 all that was heading into memory, if it had ever really existed.
The Pakistani writer Mohammed Hanif is living proof that you can sometimes tell the truth more easily with fiction than facts. Hanif is a journalist in one of the world's more dangerous places to be a journalist: Pakistan. He's also become one of the country's most prominent and provocative novelists. His book A Case of Exploding Mangoes told the tale of real-life Pakistani dictator Zia-ul-Haq, who died in a plane crash in 1988. Few believed it was an accident, and Hanif's novel delved into the conspiracies (and conspiracy theories).
Hanif joins NPR's Steve Inskeep to discuss the reception of A Case of Exploding Mangoes and his new novel, Our Lady of Alice Bhatti, the story of a poor hospital nurse in the city of Karachi.
On choosing to fictionalize Zia-ul-Haq's death
“Like all young reporters, I was like, this is going to be my big story, and I started working on it. After a few months, I realized that there was no way I was going to get to the bottom of it. There were layers and layers and layers of deception and cover-ups to cover the other cover-ups. Then it occurred to me that I would just make up my own facts. If no one was willing to tell me who did it, then as a fictional character, I'll raise my hand and say, 'Well, I did it,' and I'll write a book about it. And so, basically, it was a failed journalist's revenge.”
On people accepting his version of events
“The funny thing is, after the book came out, a lot of people — and some of them were heads of intelligence agencies — I've run into them at a party or at a social gathering, and they take me into a corner and say, 'Son, you've written a brilliant novel. Now tell me, who's your source?' I used to find it a bit scary at the beginning that, my God, these people are running my country and they actually believe all the lies that I've written.”
We don’t have a good name for it – “nature writing” is about the best one can come up with. But that label has an obvious flaw. Anyone who was around 50 years ago will recall that the terms of reference changed radically, almost overnight, in the 1960s. The word “nature” gave way in popular discourse to “environment” and “nature writing” mutated into varieties of “eco-criticism”. The difference was conceptual. “Nature”, drawing on primeval myth and Romantic literature, had traditionally been conceived of as something superhuman and invincible. As Wordsworth grandly described it:

A motion and a spirit, that impels
All thinking things, all objects of all thought,
And rolls through all things.

Humans were part of that everything. Our species was no more capable of “destroying” nature than jellyfish can reverse the course of the Gulf Stream.