How to Die

Atul Gawande argues that physicians should focus care on the good life—including its very end.

Sophia Rosenfeld in The Nation:

In the early 1990s, an upstate New York doctor became the medical director of a nursing home populated almost entirely by severely disabled elderly people. Unhappy about all the unhappiness he saw around him, the doctor launched an experiment. Shifting attention from “treatment” to “care,” he introduced plants in the living quarters, flowers and vegetables in the garden, and a veritable menagerie all around the property, including two dogs, four cats, and 100 parakeets. Eventually, he added an outdoor play area for the employees’ children. The results were surprising: greater contentedness in the home’s residents (measurable in part by a large decrease in the need for psychotropic drugs like Haldol), but also extended lives.

Atul Gawande believes in targeted fixes, especially small ones. His previous book, The Checklist Manifesto, detailed the outsize benefits of that favorite of highly organized people, the checklist. In Being Mortal, the writer-physician turns his attention to what happens when the elderly or infirm are granted a plant to look after, a chance to break an in-house rule, or even a sustained conversation about their future. His contention is that such little adjustments not only produce big payoffs for well-being, but also represent significant breakthroughs in terms of our thinking about questions of such daunting ethical and emotional magnitude that we generally avoid contemplating them at all. Questions like: What can we do to improve the existence of people in the final phase of life? How do we prepare others—and eventually ourselves—for the end?

More here.

a short history of american images

Lucy Ives at Triple Canopy:

We Go for the Union is an anonymous, undated nineteenth-century American painting. Titled, by curatorial fiat, after the political banner at its right, it depicts a group of men who are each, in varying ways, skilled in the manipulation of paint.

One man (of greatest height and most costly attire) holds a palette and narrow brush for applying precise touches. To this auteur’s right, a black man in stained overalls holds a bucket of paint at the ready. This paint has been applied across the background of a painting within the painting, a political sign featuring the likeness of George Washington. At far left, a strangely proportioned, muscular figure with pinched head and blurred features laboriously grinds blue pigment against a stone. An orange horizon emanates through the workshop windows, indicating, if not a clearly identifiable hour, then, perhaps, the sublime.

In this scene the labor of painting is divided into three, with only one worker accorded the status of artist: The giant, yellow-jacketed professional limns august features instantly recognizable as those of the union’s first president. The artist seems to work from memory, pausing in reserved admiration before the realistic image he apparently believes himself to have authored. Yet we know that this image is in fact a copy of Gilbert Stuart’s famous Athenaeum portrait, which Stuart himself never officially finished, instead copying it at least seventy-five times and selling each reproduction for $100.

more here.

A lethal nostalgia

Deborah Rudacille in Aeon:

Thousands of working-class communities around the country lament the shuttering of blast furnaces, coke ovens, mines and factories. This yearning for a vanishing industrial United States, a place in long, slow decline thanks to globalisation and technological change, has a name – smokestack nostalgia. It is a paradoxical phenomenon, considering the environmental damage and devastating health effects of many of the declining industries. Our forebears worked gruelling shifts in dangerous jobs, inhaling toxic fumes and particulates at work and at home. Many lived in neighbourhoods hemmed in by industries that pumped effluent into rivers, streams and creeks.

During the 1960s and ’70s, a fine red dust coated my home town near Sparrows Point. The local rivers and creeks, which fed into the Chesapeake Bay, became so contaminated with run-off that dead fish often littered the beaches. As the Sparrows Point monument testifies, many workers died gruesome deaths: burned, crushed, gassed, dismembered. Others experienced a slower, though no less painful, demise from diseases caused by exposure to asbestos, benzene and other toxins.

Few of the steelworkers I’ve known deny the negative aspects of living and working on the Point, including long-standing racial, class and gender discrimination. Still, they grieve the shuttering of the Sparrows Point works, which provided not just union jobs with generous benefits, but a sense of family and community, identity and self-worth. At a ceremony on 24 November 2014 honouring the legacy and history of Sparrows Point, in advance of the demolition of what was once the largest blast furnace in the western hemisphere, steelworkers described what the Point meant to them. ‘My heart will always be in this place. This is hallowed ground,’ said Michael Lewis, a third-generation steelworker and union officer. Troy Pritt, another steelworker, read a poem calling the steelworks ‘home’.

Read the full article here.

the crooked tower

Greg Siegel at Cabinet Magazine:

Galileo taught mathematics at the University of Pisa from 1589 to 1592, and sometime during this period he mounted a dramatic public demonstration of one of his more unorthodox notions. Clutching two lead spheres of different sizes and masses, he climbed the stairs of the campanile, the bell tower in the Piazza del Duomo, behind the cathedral. The young professor then proceeded—before an assembly of expectant onlookers, many of them faculty and students from the university—to drop the test objects simultaneously from the upper balcony. The plummeting orbs reached the ground together; with no temporal interval between their terrestrial impacts, a single resounding thump announced their coincident landing. Aristotelian physics, for ages the dominant paradigm, held that the velocities of free-falling bodies moving through the same medium vary in direct proportion to their weights. Galileo’s so-called Leaning Tower of Pisa Experiment conclusively disproved Aristotle’s doctrine of natural downward motion: heavier objects do not fall to earth faster than lighter objects, after all. In a veritable instant, the old certainties, all those dusty apriorisms of ancient and medieval inheritance, were upended. Science and knowledge had at last entered the modern era.
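
(An illustrative aside, not part of Siegel’s essay: the elementary kinematics behind the claim. Neglecting air resistance, a body dropped from height $h$ reaches the ground after

\[ t = \sqrt{\frac{2h}{g}}, \]

a time that depends only on the height and the gravitational acceleration $g$, not on the body’s mass; taking $h \approx 55$ m, roughly the height of the tower, and $g \approx 9.8$ m/s², both spheres would land after about 3.4 seconds.)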

more here.

The Armenian genocide: the journey from victim to survivor

Anoosh Chakelian at The New Statesman:

When I mention that I’m Armenian to new people I meet, I usually receive one of two reactions. One involves Kim Kardashian. The other is a vague awareness of something horrible that happened during the First World War. It’s particularly noticeable this year, as the world (including cousin Kim) has lingered a little longer than usual on the events of the Armenian genocide as it reaches its centenary.

Today is 24 April, a date that has resonated for me ever since I was born. Well, the Armenian pronunciation of it has, anyway (“Uhbril Ksan Chors”). Today marks 100 years since the Ottoman Turks rounded up hundreds of Armenian community leaders and intellectuals in Constantinople, and executed them.

This was the first phase of a genocide that lasted throughout the First World War. The ensuing century has perpetuated the pain with silence and denial.

The facts are already out there – pretty much every western journalist or historian who has written about this subject, and many Turkish ones at that, will give you a similar account. During death marches and massacres perpetrated by the Young Turk regime as the bloody conclusion of its “Turkification” programme, 1.5 million Armenians were killed out of an estimated population of 2.1 million.

more here.

Christian America is an invention: Big business, right-wing politics and the religious lie that still divides us

Kevin M. Kruse in Salon:

The idea of “one nation under God” is a modern one — and does not date back to the Founding Fathers.

When he ran for the White House, Texas governor George W. Bush took a similarly soft approach, though one that came from the right. A born-again Christian, he shared Bill Clinton’s ability to discuss his faith openly. When Republican primary candidates were asked to name their favorite philosopher in a 1999 debate, for instance, Bush immediately named Christ, “because He changed my heart.” Despite the centrality of faith in his own life, Bush assured voters that he would not implement the rigid agenda of the religious right. Borrowing a phrase from author Marvin Olasky, Bush called himself a “compassionate conservative” and said he would take a lighter approach to social issues including abortion and gay rights than culture warriors such as Pat Buchanan. But many on the right took issue with the phrase. For some, the “compassionate” qualifier implicitly condemned mainstream conservatism as heartless; for others, the phrase seemed an empty marketing gimmick. (As Republican speechwriter David Frum put it, “Love conservatism but hate arguing about abortion? Try our new compassionate conservatism—great ideological taste, now with less controversy.”) But the candidate backed his words with deeds, distancing himself from the ideologues in his party. In a single week in October 1999, for instance, Bush criticized House Republicans for “balancing the budget on the backs of the poor” and lamented that all too often “my party has painted an image of America slouching toward Gomorrah.”

In concrete terms, Bush’s “compassionate conservatism” constituted a promise to empower private religious and community organizations and thereby expand their role in the provision of social services. This “faith-based initiative” became the centerpiece of his campaign. In his address to the 2000 Republican National Convention, Bush heralded the work of Christian charities and called upon the nation to do what it could to support them. After his inauguration, Bush moved swiftly to make the proposal a reality. Indeed, the longest section of his 2001 inaugural address was an expansive reflection on the idea. “America, at its best, is compassionate,” he observed. “Church and charity, synagogue and mosque lend our communities their humanity, and they will have an honored place in our plans and in our laws.” Bush promoted the initiative at his first National Prayer Breakfast as well. But it was ill-fated. Hamstrung by a lack of clear direction during the administration’s first months, it was quickly overshadowed by a new emphasis on national security after the terrorist attacks of 9/11.

More here.

Why Almost Everything Dean Ornish Says about Nutrition Is Wrong

Melinda Wenner Moyer in Scientific American:

Last month, an op-ed in The New York Times argued that high-protein and high-fat diets are to blame for America’s ever-growing waistline and incidence of chronic disease. The author, Dean Ornish, founder of the nonprofit Preventive Medicine Research Institute, is no newcomer to these nutrition debates. For 37 years he has been touting the benefits of very low-fat, high-carbohydrate, vegetarian diets for preventing and reversing heart disease. But the research he cites to back up his op-ed claims is tenuous at best. Nutrition is complex, but there is little evidence our country’s worsening metabolic ills are the fault of protein or fat. If anything, our attempts to eat less fat in recent decades have made things worse. Ornish begins his piece with a misleading statistic. Despite being told to eat less fat, he says, Americans have been doing the opposite: They have “actually consumed 67 percent more added fat, 39 percent more sugar and 41 percent more meat in 2000 than they had in 1950 and 24.5 percent more calories than they had in 1970.” Yes, Americans have been eating more fat, sugar and meat, but we have also been eating more vegetables and fruits (pdf)—because we have been eating more of everything.

What’s more relevant to the discussion is this fact: During the time in which the prevalence of obesity in the U.S. nearly tripled, the percentage of calories Americans consumed from protein and fat actually dropped whereas the percentage of calories Americans ingested from carbohydrates—one of the nutrient groups Ornish says we should eat more of—increased. Could it be that our attempts to reduce fat have in fact been part of the problem? Some scientists think so. “I believe the low-fat message promoted the obesity epidemic,” says Lyn Steffen, a nutritional epidemiologist at the University of Minnesota School of Public Health. That’s in part because when we cut out fat, we began eating foods that were worse for us.

Ornish goes on to argue that protein and saturated fat increase the risk of mortality and chronic disease. As evidence for these causal claims, he cites a handful of observational studies. He should know better. These types of studies—which might report that people who eat a lot of animal protein tend to develop higher rates of disease—“only look at association, not causation,” explains Christopher Gardner, a nutrition scientist at the Stanford Prevention Research Center. They should not be used to make claims about cause and effect; doing so is considered by nutrition scientists to be “inappropriate” and “misleading.” The reason: People who eat a lot of animal protein often make other lifestyle choices that increase their disease risk, and although researchers try to make statistical adjustments to control for these “confounding variables,” as they’re called, it’s a very imperfect science. Other large observational studies have found that diets high in fat and protein are not associated with disease and may even protect against it. The point is, it’s possible to cherry-pick observational studies to support almost any nutritional argument.

More here.

Friday Poem

The Child Who Was Shot Dead

The child is not dead
the child raises his fists against his mother
who screams Africa screams the smell
of freedom and heather
in the locations of the heart under siege

The child raises his fists against his father
in the march of the generations
who scream Africa scream the smell
of justice and blood
in the streets of his armed pride

The child is not dead
neither at Langa nor at Nyanga
nor at Orlando nor at Sharpeville
nor at the police station in Philippi
where he lies with a bullet in his head

The child is the shadow of the soldiers
on guard with guns saracens and batons
the child is present at all meetings and legislations
the child peeps through the windows of houses and into the hearts of mothers
the child who just wanted to play in the sun at Nyanga is everywhere
the child who became a man treks through all of Africa
the child who became a giant travels through the whole world

Without a pass

by Ingrid Jonker
from Rook en Oker
publisher: Afrikaanse Pers, Johannesburg, 1963

Thursday, April 23, 2015

How the Truth About Palestine Won Netanyahu the Israeli Election

Omri Boehm in The Boston Review (Image: Wikimedia commons):

The universal deceit exposed by Netanyahu just before Election Day rested on two main lies. First, of course, his own lie, the 2009 “Bar Ilan Speech,” in which the Prime Minister, facing European and American pressure, pretended to renounce everything that he himself and the Likud Party had ever believed in, and publicly endorsed the two-state solution:

The truth is that in the area of our homeland, in the heart of our Jewish Homeland, now lives a large population of Palestinians. We do not want to rule over them. We do not want to run their lives. We do not want to force our flag and our culture on them. In my vision of peace, there are two free peoples living side by side in this small land, with good neighborly relations and mutual respect, each with its flag, anthem and government, with neither one threatening its neighbor's security and existence.

Non-experts may be unable to appreciate how dramatic this statement was. Ever since the 1930s, the ideological difference dividing Ben-Gurion’s mainstream Labor Zionism from Jabotinsky’s nationalist-revisionist alternative has turned on the question of the land’s partition. In 1947, Ben-Gurion enthusiastically supported the United Nations’ Partition Plan—encouraging the establishment of a two-state solution within today’s ’48 borders—while Jabotinsky’s successor as revisionist leader, Menachem Begin, fiercely objected.

True: some thirty years later, as Israel’s Prime Minister, Begin would hand over to Sadat the entire Sinai Peninsula. But this was a territorial compromise to the Arab Republic of Egypt, not a political-territorial concession to the Palestinian people, whose existence revisionists have always denied. Also true: Ariel Sharon, as Prime Minister, handed over occupied land to the Palestinian Authority in Gaza, and to that end evacuated thousands of settlers. But in order to do this, Sharon had to leave the Likud party, his natural home, and establish a new party, Kadima, with Labor leaders such as Shimon Peres.

So when Netanyahu stood in Bar-Ilan University and announced, as Israel’s Prime Minister and Likud leader, that the Palestinians deserve to get “their own flag, anthem and government,” he did something genuinely new in Zionist history. For die-hard revisionists such as Rubi Rivlin, Israel’s current friendly-looking president, Netanyahu’s two-state concession came as a shock. If the Likud Party kept relatively calm, it was because everybody knew that Netanyahu was lying.

More here.

The Function of Criticism at the Present Time

Virginia Jackson in the LA Review of Books:

LAUREN BERLANT is a critic’s critic, a feminist’s feminist, and a thinker’s friend. This is most simply true because of the number, depth, and influence of her abundant authored and co-authored and edited and co-edited books, her ever more numerous articles, essays, interviews, dialogues and monologues, and especially her proliferating collaborations; she always seems to be writing yet another book with yet another interesting someone else. Lots of people think with and because of Lauren Berlant. But academic “productivity” (that ubiquitous and ugly word, itself a symptom of the corporate manufacture of a crisis in the humanities) isn’t the most important reason that my first proposition — that Berlant is a critic’s critic — is just true. The reason that Lauren Berlant occupies this moment in critical theory so capaciously is that what she really always thinks about is genre.

Once upon a time, or so the story goes, the genre system was hierarchical and taxonomic (though not so fixed that at least as early as Aristotle’s Poetics it wasn’t open to debate), with “tragedies” clearly separated from “comedies,” for example. Later, in modernity (the novel is usually considered both the origin and result of this shift), genres became modes of recognition — complex forms instantiated in popular discourse, relying on what we could or would recognize collectively, in common — and so subject to historical change and cultural negotiation. Once genres became historical, the story continues, it then became the critic’s job to manage and translate those emerging forms of recognition for the benefit of readers who experienced them without knowing exactly what it was they were seeing and feeling.

Genre seems like an old-fashioned, belletristic frame to impose on Berlant’s political, cultural, and affective range. I may be wrong, but I’m betting that if I asked you to think of a queer theorist, Berlant is one of the first critics you would name, and if I asked you to think of a theorist of public culture or affect or performativity or media publics or marginal aesthetics or crisis, Berlant would be one of the first critics you would name, but if I asked you to think of a genre theorist, Berlant would not be the first critic to come to mind. No one would accuse Lauren Berlant of being a purely literary critic.

More here.

Scientists seek rare muon conversion that could signal new physics

Diana Kwon in Symmetry:

This weekend, members of the Mu2e collaboration dug their shovels into the ground of Fermilab's Muon Campus for the experiment that will search for the direct conversion of a muon into an electron in the hunt for new physics.

For decades, the Standard Model has stood as the best explanation of the subatomic world, describing the properties of the basic building blocks of matter and the forces that govern them. However, challenges remain, including that of unifying gravity with the other fundamental forces or explaining the matter-antimatter asymmetry that allows our universe to exist. Physicists have since developed new models, and detecting the direct conversion of a muon to an electron would provide evidence for many of these alternative theories.

“There's a real possibility that we'll see a signal because so many theories beyond the Standard Model naturally allow muon-to-electron conversion,” said Jim Miller, a co-spokesperson for Mu2e. “It'll also be exciting if we don't see anything, since it will greatly constrain the parameters of these models.”

Muons and electrons are two different flavors in the charged-lepton family. Muons are 200 times more massive than electrons and decay quickly into lighter particles, while electrons are stable and live forever. Most of the time, a muon decays into an electron and two neutrinos, but physicists have reason to believe that once in a blue moon, muons will convert directly into an electron without releasing any neutrinos. This is physics beyond the Standard Model.
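
(A notational aside, not from Kwon's article: in standard particle-physics notation, the ordinary decay and the neutrinoless conversion Mu2e will hunt for read

\[ \mu^- \to e^- + \bar{\nu}_e + \nu_\mu \qquad \text{versus} \qquad \mu^- + N \to e^- + N, \]

where $N$ is a nucleus that has captured the muon; only the second process changes lepton flavor with no neutrinos to carry the difference away.)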

Under the Standard Model, the muon-to-electron direct conversion happens too rarely to ever observe. In more sophisticated models, however, this occurs just frequently enough for an extremely sensitive machine to detect.

The Mu2e detector, when complete, will be the instrument to do this.

More here. [Thanks to Farrukh Azfar.]

Kiss goodbye to freedom

Raymond Geuss foresees a future of strict controls or war over resources. Matthew Reisz meets the radical philosopher and traces his intellectual development.

Matthew Reisz in The Times:

Raymond Geuss would like his fellow philosophers (and many of his fellow citizens) to think about politics in a radically different way. His latest book, Philosophy and Real Politics, is a quietly ferocious broadside against much received wisdom.

Among its targets are “the highly moralised tone in which some public diplomacy is conducted, at any rate in the English-speaking world, and … the popularity among political philosophers of the slogan 'Politics is applied ethics'.” What this slogan usually means in practice is starting off with abstract general principles or intuitions about fairness, justice, equality or rights, and then applying them to specific political situations. Such an approach, Geuss believes, is unlikely to tell us anything much about the real world of power struggles and messy compromises.

Some of his arguments have their roots in his earliest educational experiences. In a tribute to the philosopher Richard Rorty, Geuss offers an entertaining account of his Roman Catholic upbringing and how it had granted him “relative immunity to nationalism” – and in particular the “patently absurd” notion that there was something “special” about the United States.

For the Irish and Irish-American nuns who taught him from the ages of five to 12, only the Roman Catholic Church was truly universal and international. They knew that all the popes had been Italian, but that was for purely fortuitous reasons. “Only an Italian could stand to live in Rome: it was hot, noisy and overcrowded, and the people there ate spaghetti for dinner every day rather than proper food, ie potatoes, so it would be too great a sacrifice to expect someone who had not grown up in Italy to tolerate life there.”

More here.

How Fellini made his modernist masterpiece

Ian Thomson at The Spectator:

Federico Fellini’s La Dolce Vita was a box-office triumph in Italy in 1960. It made $1.5 million at the box office in three months — more than Gone With the Wind had. ‘It was the making of me,’ said Fellini. It was also the making of Marcello Mastroianni as the screen idol with a curiously impotent sex appeal. No other film captured so memorably the flashbulb glitz of Italy’s postwar ‘economic miracle’ and its consumer boom of Fiat 500s and Gaggia espresso machines. Unsurprisingly, the Vatican objected to the scene where Mastroianni makes love to the Swedish diva Anita Ekberg (who died earlier this year at the age of 83) in the waters of the Trevi fountain. Sixties Rome became a fantasy of the erotic ‘sweet life’ thanks in part to that scene.

After La Dolce Vita, Fellini found himself at a creative loss and hung a sign above his desk: ‘NOW WHAT?’ Sophia Loren’s movie-mogul husband, Carlo Ponti, persuaded him to contribute to Boccaccio ’70, a collection of lewd short films inspired by the medieval Decameron, but it was a critical failure. At the end of 1961, still in ‘creative limbo’, Fellini began work on 8½, his eighth and a half film; it was to be a signpost in his development as a magician-director.

more here.

Depictions of insanity through history

Andrew Scull at The Paris Review:

Modern psychiatry seems determined to rob madness of its meanings, insisting that its depredations can be reduced to biology and nothing but biology. One must doubt it. The social and cultural dimensions of mental disorders, so indispensable a part of the story of madness and civilization over the centuries, are unlikely to melt away, or to prove no more than an epiphenomenal feature of so universal a feature of human existence. Madness indeed has its meanings, elusive and evanescent as our attempts to capture them have been.

Western culture throughout its long and tangled history provides us with a rich array of images, a remarkable set of windows into both popular and latterly professional beliefs about insanity. The sacred books of the Judeo-Christian tradition are shot through with stories of madness caused by possession by devils or divine displeasure. From Saul, the first king of the Israelites (made mad by Yahweh for failing to carry out to the letter the Lord’s command to slay every man, woman, and child of the Amalekite tribe, and all their animals, too), to the man in the country of the Gadarenes “with an unclean spirit” (maddened, naked, and violent, whose demons Christ casts out and causes to enter a herd of swine, who forthwith rush over a cliff into the sea to drown), here are stories recited for centuries by believers, and often transformed into pictorial form. None proved more fascinating than the story of Nebuchadnezzar, the mighty king of Babylon, the man who captured Jerusalem and destroyed its Temple, carrying the Jews off into captivity, all apparently without incurring divine wrath.

more here.

The True Story of Rupert Brooke

Joanna Scutts at The New Yorker:

For a long time, the history of the First World War has been understood via the symbolic transition from Brooke to Wilfred Owen, from posh idiot nationalist to heroic witness. That simple narrative obscures the extent to which Owen worshipped Brooke in the early days and just how long Brooke remained the war’s most famous poet. Until the nineteen-sixties, when the “left-wing myths” about the war gained purchase, Brooke’s sonnets and his image still seemed to represent something true and comforting. In the mid-nineties, an anthology of the nation’s hundred favorite poems included three each by Brooke and Owen, although only one of Brooke’s, “The Soldier,” was a war poem. At the hundredth anniversary of his death, as more letters and lovers are revealed, Brooke is making more headlines as a famous playboy than as a poet or patriot.

It’s impossible to know what Brooke might have written if he had seen what the other war poets saw, or what he might have become if he’d survived his own golden age. In his small body of work, the war sonnets are anomalous—his verse is usually more playful, less po-faced.

more here.

Neuroscientists create the sensation of invisibility

From PhysOrg:

The power of invisibility has long fascinated man and inspired the works of many great authors and philosophers. In a study from Sweden's Karolinska Institutet, a team of neuroscientists now report a perceptual illusion of having an invisible body, and show that the feeling of invisibility changes our physical stress response in challenging social situations. The history of literature features many well-known narrations of invisibility and its effect on the human mind, such as the myth of Gyges' ring in Plato's dialogue The Republic and the science fiction novel The Invisible Man by H.G. Wells. Recent advances in materials science have shown that invisibility cloaking of large-scale objects, such as a human body, might be possible in the not-so-distant future; however, it remains unknown how invisibility would affect our brain and body perception.

In an article in the journal Scientific Reports, the researchers describe a perceptual illusion of having an invisible body. The experiment involves the participant standing up and wearing a set of head-mounted displays. The participant is then asked to look down at her body, but instead of her real body she sees empty space. To evoke the feeling of having an invisible body, the scientist touches the participant's body in various locations with a large paintbrush while, with another paintbrush held in the other hand, exactly imitating the movements in mid-air in full view of the participant.

“Within less than a minute, the majority of the participants started to transfer the sensation of touch to the portion of empty space where they saw the paintbrush move and experienced an invisible body in that position,” says Arvid Guterstam, lead author of the present study.

More here.

Assange: How ‘The Guardian’ Milked Edward Snowden’s Story

WikiLeaks founder Julian Assange investigates the book behind Snowden, Oliver Stone's forthcoming film starring Joseph Gordon-Levitt, Shailene Woodley, Nicolas Cage, Scott Eastwood and Zachary Quinto. According to leaked Sony emails, movie rights for the book were bought for $700,000.

Julian Assange in Newsweek:

Rtr4vr98

The Snowden Files: The Inside Story of the World's Most Wanted Man (Guardian/Faber & Faber, 2014) by Luke Harding is a hack job in the purest sense of the term. Pieced together from secondary sources and written with minimal additional research so as to be first to market, the book makes its thrifty origins hard to miss.

The Guardian is a curiously inward-looking beast. If any other institution tried to market its own experience of its own work nearly as persistently as The Guardian, it would surely be called out for institutional narcissism. But because The Guardian is an embarrassingly central institution within the moribund “left-of-center” wing of the U.K. establishment, everyone holds their tongue.

In recent years, we have seen The Guardian consult itself into cinematic history—in the Jason Bourne films and others—as a hip, ultra-modern, intensely British newspaper with a progressive edge, a charmingly befuddled giant of investigative journalism with a cast-iron spine.

The Snowden Files positions The Guardian as central to the Edward Snowden affair, elbowing out more significant players like Glenn Greenwald and Laura Poitras for Guardian stablemates, often with remarkably bad grace.

“Disputatious gay” Glenn Greenwald's distress at the U.K.'s detention of his husband, David Miranda, is described as “emotional” and “over-the-top.” My WikiLeaks colleague Sarah Harrison—who helped rescue Snowden from Hong Kong—is dismissed as a “would-be journalist.”

I am referred to as the “self-styled editor of WikiLeaks.” In other words, the editor of WikiLeaks. This is about as subtle as Harding's withering asides get. You could use this kind of thing on anyone.

Read the full piece here.