by Brooks Riley
Category: Recommended Reading
Sunday, May 25, 2014
How the Novel Made the Modern World
William Deresiewicz in The Atlantic:
Martin Amis once remarked, apropos of the idea of writing a book about America, that you might as well try to write one about people, or life. Or, he might have said, the English novel. Yet here we have the fruits of such an enterprise in all their cyclopedic, cyclopean glory: Michael Schmidt’s The Novel: A Biography—1,100 pages spanning nearly 30 dozen authors, starting with the pseudonymous Sir John Mandeville (he of the 14th-century Travels) and ending 45 brisk, brilliant, intimate, assured, and almost unflaggingly interesting chapters later with Amis himself.
Such an effort represents the labor of a lifetime, one would think. In fact, it is a kind of sequel to Lives of the Poets (1998), a comparably commodious compendium. Schmidt—who was born in Mexico, went to school in part in the United States, and has made his career in Britain—is himself a poet and novelist as well as an editor, publisher, anthologist, translator, and teacher. Given the fluidity with which he ranges across the canon (as well as quite a bit beyond it), one is tempted to say that he carries English literature inside his head as if it were a single poem, except that there are sections in The Novel on the major Continental influences, too—the French, the Russians, Cervantes, Kafka—so it isn’t only English. If anyone’s up for the job, it would seem to be him.
Still, 1,100 pages (and rather big ones, at that). I wasn’t sure I had the patience for it. Then I read this, in the second paragraph. Schmidt is telling us about the figures he’s enlisted as our guides along the way, novelist-critics like Henry James, Virginia Woolf, V. S. Pritchett, Gore Vidal, and many others:
They are like members of an eccentric family in an ancestral mansion … Some are full of respect, some reserved, others bend double with laughter; the rebellious and impatient slash the canvases, twist the cutlery, raise a toast, and throw the crystal in the grate. Their damage is another chapter in the story.
It wasn’t the notion that Schmidt was going to orchestrate the volume as a dialogue with and among these practitioners, though that was promising. It wasn’t the metaphor of the eccentric family per se, though that was interesting. It was the writing itself. The language was alive; the book would be alive as well. Take a breath, clear the week, turn off the WiFi, and throw yourself in.
More here.
Matt Taibbi discusses his new book, “The Divide,” and the disasters of inequality
Elias Isquith in AlterNet:
His relentless coverage of Wall Street malfeasance turned him into one of the most influential journalists of his generation, but in his new book, “The Divide: American Injustice in the Age of the Wealth Gap,” Matt Taibbi takes a close and dispiriting look at how inequality and government dysfunction have created a two-tiered justice system in which most Americans are guilty until proven innocent, while a select few operate with no accountability whatsoever.
Salon sat down last week with Taibbi for a wide-ranging chat that touched on his new book, the lingering effects of the financial crisis, how American elites operate with impunity and why, contrary to what many may think, he’s actually making a conservative argument for reform. The interview can be found below, and has been lightly edited for length and clarity.
So, what is “The Divide”?
The book is really just about why some people go to jail and why some people don’t go to jail, and “the divide” is the term I came up with to describe this phenomenon we have where there are essentially two different criminal justice systems, one that works one way for people who are either very rich or working within the confines of a giant systemically important institution, and then one that works in another way for people who are without means. And that’s what the book is about.
More here.
The story of the Gandhis’ biggest mistake, and how it still haunts Punjab
Hartosh Singh Bal in Caravan:
By the time the smoke cleared over the Darbar Sahib, hundreds of innocent bystanders had died. Bhindranwale lay murdered, and the Akal Takht, where he had set up his final defiance of Delhi, stood shattered. The operation was followed by the assassination of Indira Gandhi by her Sikh bodyguards, and the organised massacre of thousands of Sikhs by Hindu mobs, led mainly by Congress politicians. In Punjab, militancy against the Indian state reached levels unprecedented in the years before Bluestar; it took a decade for a semblance of peace to return.
Over the last thirty years, the debate over Bluestar has played out between two extreme points of view: that of radicals in Punjab and abroad, who dwell on the Congress’s role while overlooking Bhindranwale’s complicity, and that of people in the rest of India, who tend to focus on Bhindranwale with little sense of the Congress’s contribution to the tragedy. Many Indians may believe the events of that June can be consigned to the history books, but their memory remains alive in Punjab. Many Sikhs continue to view the operation, and the figure of Bhindranwale, in a markedly different light from the rest of the country. Without understanding how such distinct perspectives came to exist, it may be impossible to come to terms with the history of Bluestar.
More here.
Will Stem Cell Burgers Go Mainstream?
Lisa Winter in IFL Science:
Scientists are currently working on developing an alternative to conventional meat. No, this isn’t flavored soy-based tofurkey or anything like that; it’s actual meat. Instead of raising animals, researchers can use an animal’s stem cells to generate meat.
A new paper from Cor van der Weele and Johannes Tramper, both of Wageningen University in the Netherlands, explores the practical aspects of lab-grown meat and where the research stands now. The paper was published in Science & Society.
Laboratory meat admittedly doesn’t sound very enticing on the surface, but environmentalists, animal rights activists, and even NASA have been awaiting a commercially viable alternative to conventional meat using stem cells. The product is typically referred to as “schmeat” because it grows in sheets. Without an animal’s skeleton, the cells remain flat as they differentiate into muscle tissue.
The journey to lab meat started nearly 20 years ago when NASA was approved by the FDA to begin developing meat for use during long-term space missions. In 2008, PETA announced a prize of US$1 million to anyone who could create stem-cell derived chicken meat. The deadline of March 4, 2014 has passed without a winner awarded, but even without prize money, researchers are still hard at work.
Schmeat could also begin to make up for the large environmental drawbacks to raising livestock, as it takes a tremendous amount of food, water, and energy to raise and process all of that meat. Additionally, the methane produced in the gastrointestinal system of the livestock is adding considerably to greenhouse gas emissions. In vitro meat could reduce energy consumption by 45%, greenhouse gas emissions by 96%, and land use by 99%.
More here.
Computers, and computing, are broken
Quinn Norton in Medium:
Once upon a time, a friend of mine accidentally took over thousands of computers. He had found a vulnerability in a piece of software and started playing with it. In the process, he figured out how to get total administration access over a network. He put it in a script, and ran it to see what would happen, then went to bed for about four hours. Next morning on the way to work he checked on it, and discovered he was now lord and master of about 50,000 computers. After nearly vomiting in fear he killed the whole thing and deleted all the files associated with it. In the end he said he threw the hard drive into a bonfire. I can’t tell you who he is because he doesn’t want to go to Federal prison, which is what could have happened if he’d told anyone that could do anything about the bug he’d found. Did that bug get fixed? Probably eventually, but not by my friend. This story isn’t extraordinary at all. Spend much time in the hacker and security scene, you’ll hear stories like this and worse.
It’s hard to explain to regular people how much technology barely works, how much the infrastructure of our lives is held together by the IT equivalent of baling wire.
Computers, and computing, are broken.
More here.
The entire basis for Obama’s drone strategy may be wrong
Zack Beauchamp in Vox:
The basic premise of the Obama Administration's drone program is that decapitation, the killing of a terrorist organization's top leadership, works. Killing al-Qaeda's leadership should, in theory, limit the organization's ability to plot attacks on the US and its allies.
But what if that's not true? That's the core finding of a just-published study in the prestigious journal International Security. In it, Georgia Tech professor Jenna Jordan takes a look at the history of targeting terrorist leaders and draws lessons for the fight against al-Qaeda. According to Jordan, believing that targeted killing can actually weaken al-Qaeda means assuming al-Qaeda depends on a group of charismatic leaders. But that's wrong, and that mistaken assumption has led the Obama Administration to pursue a strategy centered on targeting al-Qaeda's leadership with drones when it'd really be better to cut down on targeted killings altogether.
More here.
Andy Warhol – The Complete Picture
ballade of suicide
clair de lune
Joseph Conrad’s Crystal Ball
Craig Lambert in Harvard Magazine:
Many call Rudyard Kipling the scribe of the British Empire, but novelist Joseph Conrad (1857-1924) may have best rendered its waning years and foreshadowed its demise. Around the turn of the last century, Conrad’s books portrayed terrorism in Europe, limned the reach of multinational corporations, and foresaw patterns of globalization that became clear only a hundred years later. The contemporary Colombian novelist Juan Gabriel Vásquez has described Conrad’s books as “crystal balls in which he sees the twentieth century,” says professor of history Maya Jasanoff. “Conrad observed the world around him from distinctive and diverse vantage points because of his own cosmopolitan and well-traveled background,” she continues. “Henry James wrote him a letter that said, ‘No-one has known—for intellectual use—the things you know, and you have, as the artist of the whole matter, an authority that no one has approached.’ James meant not only what Conrad had seen, but the depth of his insights. I would echo that.”
Born in Poland, Conrad spent 20 years of his adulthood as a merchant seaman on French, Belgian, and English ships, steaming to Africa, the Far East, and the Caribbean before settling down as an author in England. His grasp of the tensions and forces tearing apart the Victorian-Edwardian world is a counterweight, says Jasanoff, to the “widely held stereotype of the period as a golden age before everything got wrecked in the trenches of World War I. If you read what people were actually saying then, you get a strong sense of social and economic upheaval. World War I didn’t come out of a vacuum. Conrad’s novels suggest what it was like to be a person living in those times. Fiction can bring alive the subjective experience of the moment, which isn’t rendered by the kinds of documents historians usually look at.”
More here.
Saturday, May 24, 2014
Pakistan: Worse Than We Knew
Ahmed Rashid reviews Carlotta Gall's The Wrong Enemy: America in Afghanistan, 2001–2014 in The NYRB:
All the recent books I have seen on the Afghan wars have recounted how the Pakistani military backed the Taliban when they first emerged in 1993, but lost its influence by 2000. Then, after a brief respite following September 11, 2001, Pakistan’s military helped to resurrect the Taliban resistance to fight the Americans. My own three books on Afghanistan describe the actions of the Pakistani military as one factor in keeping the civil war going and contributing to the American failure to win decisively in Afghanistan.*
Now in The Wrong Enemy: America in Afghanistan, 2001–2014, Carlotta Gall, the New York Times reporter in Afghanistan and Pakistan for more than a decade, has gone one step further. She places the entire onus of the West’s failure in Afghanistan and the Taliban’s successes on the Pakistani military and the Taliban groups associated with it. Her book has aroused considerable controversy, not least in Pakistan. Its thesis is quite simple:
The [Afghan] war has been a tragedy costing untold thousands of lives and lasting far too long. The Afghans were never advocates of terrorism yet they bore the brunt of the punishment for 9/11. Pakistan, supposedly an ally, has proved to be perfidious, driving the violence in Afghanistan for its own cynical, hegemonic reasons. Pakistan’s generals and mullahs have done great harm to their own people as well as their Afghan neighbors and NATO allies. Pakistan, not Afghanistan, has been the true enemy.
Dogged, curious, and insistent on uncovering hidden facts, Gall has over the years been a nightmare for the American, Pakistani, and other foreign powers involved in Afghanistan, while her reporting has been welcomed by many Afghans. She quickly emerged as the leading Western reporter living in Kabul. She made her reputation by reporting on the terrible loss of innocent Afghan lives as American aircraft continued to bomb the Pashtun areas in southern Afghanistan even after the war of 2001 had ended. The bombing of civilians was said to be accidental, supposedly based on faulty intelligence; but it continued for years and helped the Taliban turn the population against the Americans.
More here.
the knausgaard phenomenon
Rivka Galchen at The New York Times:
An international best seller, “My Struggle” has been acclaimed, declaimed and compared to Proust. It is said that Norwegian companies have had to declare “Knausgaard-free” days — no reading, no discussion — so work can get done. All of which means whatever it means, but even a skeptical reader, after a few hundred of any of the volumes’ pages, will concede it is highly likely that “My Struggle” is a truly original and enduring and great work of literature. Yet it is an original and enduring and great work of literature that produces the sensation of reading something like an unedited transcript of one man’s somewhat but not all that remarkable life, written in language that is fairly often banal. (The final phrase of Book 3 is “lodged in my memory with a ring as true as perfect pitch.”)
And so a perhaps childish thought, akin to that of young Karl Ove, nags at a reader, especially if the reader reveres the book, as this and many other readers do: Is that really all there is to it? Seemingly indiscriminate amounts of detail about whatever it is that actually happens in real life (or close enough) and there you go, that’s a great book? It’s difficult to believe that literature has been replenished not by an obscure and patient pearl fisherman diving into deep waters and coming up with a blue face, but rather by a reasonably successful 40-something Norwegian guy with three (now four) kids and a pretty comfortable bourgeois life near Copenhagen whose work more resembles “diving” for pennies at the local water fountain.
more here.
Under the Influence: John Deakin
Anthony Quinn at The Guardian:
In the emollient climate of today's portrait photography John Deakin's work presents a bracing corrective. Deakin (1912-1972) photographed celebrities in his heyday, but he never cosseted or flattered them in the manner of a Mario Testino or an Annie Leibovitz. The faces of his sitters, caught in a curious hungover light, loom out at you, bemused, vulnerable, possibly guilty. He called them his “victims”, and no wonder. A portrait he took of himself in the early 1950s is revealing, his pinched features and beady gaze suggesting a spiv or a blackmailer out of a Patrick Hamilton novel. “An evil genius,” George Melly said of him, and “a vicious little drunk of such inventive malice that it's surprising he didn't choke on his own venom.”
The inventiveness, if not the malice, is available for inspection in Under the Influence, curator Robin Muir's latest dip into the Deakin archive, which accompanies an exhibition currently showing at the Photographers' Gallery in London. It is a timely book in one way, for it offers glimpses of a Soho – Deakin's stamping ground of the late 40s and 50s – before its tragic fall into respectability.
more here.
The empire of Alain de Botton
Sam Knight at the Financial Times:
So why do you infuriate so many people? I asked. We were back in the lecture theatre, alone. De Botton had just led a quick tour of the Rijksmuseum’s “Gallery of Honour”, followed by television cameras. Like many of his most ambitious projects, Art is Therapy has received some poisonous reviews. “De Botton’s evangelising and his huckster’s sincerity make him the least congenial gallery guide imaginable,” wrote the Guardian’s art critic Adrian Searle. Such hostility has stalked de Botton since his breakout hit How Proust Can Change Your Life was published in 1997. “This reviewer was, unfortunately, intensely irritated by many aspects of de Botton’s thesis, finding it superficial, often contrived and at times patronising,” wrote Teresa Waugh in the Spectator. That morning, a Dutch journalist had turned to me and said: “I suppose he sees himself as a modern Socrates, going around and annoying everybody.”
Early negative reviews of his work, by Proust professors and philosophy dons, devastated him, admitted de Botton. “It was very surprising and upsetting. Then my wife, who is very wise, said to me, ‘It’s obvious, this is a fight.’ This is a turf war, and the battle is about what culture should mean to us.”
more here.
THE USES OF DIFFICULTY
Ian Leslie in More Intelligent Life:
Jack White, the former frontman of the White Stripes and an influential figure among fellow musicians, likes to make things difficult for himself. He uses cheap guitars that won’t stay in shape or in tune. When performing, he positions his instruments in a way that is deliberately inconvenient, so that switching from guitar to organ mid-song involves a mad dash across the stage. Why? Because he’s on the run from what he describes as a disease that preys on every artist: “ease of use”. When making music gets too easy, says White, it becomes harder to make it sing.
It’s an odd thought. Why would anyone make their work more difficult than it already is? Yet we know that difficulty can pay unexpected dividends. In 1966, soon after the Beatles had finished work on “Rubber Soul”, Paul McCartney looked into the possibility of going to America to record their next album. The equipment in American studios was more advanced than anything in Britain, which had led the Beatles’ great rivals, the Rolling Stones, to make their latest album, “Aftermath”, in Los Angeles. McCartney found that EMI’s contractual clauses made it prohibitively expensive to follow suit, and the Beatles had to make do with the primitive technology of Abbey Road. Lucky for us.
Over the next two years they made their most groundbreaking work, turning the recording studio into a magical instrument of its own. Precisely because they were working with old-fashioned machines, George Martin and his team of engineers were forced to apply every ounce of their ingenuity to solve the problems posed to them by Lennon and McCartney. Songs like “Tomorrow Never Knows”, “Strawberry Fields Forever”, and “A Day in the Life” featured revolutionary aural effects that dazzled and mystified Martin’s American counterparts.
…Our brains respond better to difficulty than we imagine. In schools, teachers and pupils alike often assume that if a concept has been easy to learn, then the lesson has been successful. But numerous studies have now found that when classroom material is made harder to absorb, pupils retain more of it over the long term, and understand it on a deeper level. Robert Bjork, of the University of California, coined the phrase “desirable difficulties” to describe the counter-intuitive notion that learning should be made harder by, for instance, spacing sessions further apart so that students have to make more effort to recall what they learnt last time. Psychologists at Princeton found that students remembered reading material better when it was printed in an ugly font.
More here.
How did we get so busy?
Elizabeth Kolbert in The New Yorker:
In the winter of 1928, John Maynard Keynes composed a short essay that took the long view. It was titled “Economic Possibilities for Our Grandchildren,” and in it Keynes imagined what the world would look like a century hence. By 2028, he predicted, the “standard of life” in Europe and the United States would be so improved that no one would need to worry about making money. “Our grandchildren,” Keynes reckoned, would work about three hours a day, and even this reduced schedule would represent more labor than was actually necessary.
Keynes delivered an early version of “Economic Possibilities” as a lecture at a boys’ school in Hampshire. He was still at work revising and refining the essay when, in the fall of 1929, the stock market crashed. Some might have taken this as a bad sign; Keynes was undeterred. Though he quickly recognized the gravity of the situation—the crash, he wrote in early 1930, had produced a “slump which will take its place in history amongst the most acute ever experienced”—over the long run this would prove to be just a minor interruption in a much larger, more munificent trend. In the final version of “Economic Possibilities,” published in 1931, Keynes urged readers to look beyond this “temporary phase of maladjustment” and into the rosy beyond.
According to Keynes, the nineteenth century had unleashed such a torrent of technological innovation—“electricity, petrol, steel, rubber, cotton, the chemical industries, automatic machinery and the methods of mass production”—that further growth was inevitable. The size of the global economy, he forecast, would increase sevenfold in the following century, and this, in concert with ever greater “technical improvements,” would usher in the fifteen-hour week.
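As a rough aside (my gloss, not a figure from Kolbert's piece): if the economy grows sevenfold over a century at a constant annual rate $g$, then
$$(1+g)^{100} = 7 \quad\Longrightarrow\quad g = 7^{1/100} - 1 \approx 0.0197,$$
so Keynes's forecast amounts to assuming compound growth of roughly two percent a year, sustained for a hundred years.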
More here.
Fred Astaire and Gene Kelly: The Babbitt and the Bromide
What Really Happened in Chile: The CIA, the Coup Against Allende, and the Rise of Pinochet
Jack Devine in Foreign Affairs:
On September 9, 1973, I was eating lunch at Da Carla, an Italian restaurant in Santiago, Chile, when a colleague joined my table and whispered in my ear: “Call home immediately; it’s urgent.” At the time, I was serving as a clandestine CIA officer. Chile was my first overseas assignment, and for an eager young spymaster, it was a plum job. Rumors of a military coup against the socialist Chilean president, Salvador Allende, had been swirling for months. There had already been one attempt. Allende’s opponents were taking to the streets. Labor strikes and economic disarray made basic necessities difficult to find. Occasionally, bombs rocked the capital. The whole country seemed exhausted and tense. In other words, it was exactly the kind of place that every newly minted CIA operative wants to be.
I ducked out of the restaurant as discreetly as I could and headed to the CIA station to place a secure call to my wife. She was caring for our five young children, and it was our first time living abroad as a family, so she could have been calling about any number of things. But I had a hunch that her call was very important and related to my work, and it was.
“Your friend called from the airport,” my wife said. “He’s leaving the country. He told me to tell you, ‘The military has decided to move. It’s going to happen on September 11. The navy will lead it off.’”
This call from my “friend”—a businessman and former officer in the Chilean navy who was also a source for the CIA—was the first indication the agency’s station in Santiago had received that the Chilean military had set a coup in motion.
More here.
Rationalize that!
G. Randolph Mayes in The Dance of Reason:
If I could take one word back from the English language, change its common meaning without anyone noticing, it would be: rationalize.
Don't get me wrong. I think it's cool that English words evolve over time, even when it's due to error. (Hey, that's how evolution works, right?) I don't mind that it's now OK to say literally when you mean figuratively or nonplussed when you mean unperturbed. I don't even really care about the plundering of philosophical terms like begs the question, which now means something completely different in the vernacular (raises the question) than it does when we use it in informal logic (takes for granted the point at issue).
But rationalize? Come on. Certainly the obvious and intuitive meaning of rationalize is: to make more rational. But today the term has come to mean almost exclusively the opposite: to make something appear rational, when it is not. Dude, you know this is bullshit, you're just rationalizing.
Well, before I explain why I really find this meaning irritating, I have to admit that it isn't quite as perverse as I make out. The suffix 'ize' means 'to cause to be or be like'. And, of course, once you dance into the semantic cloud of similarity and appearance, it is a small step from 'be like' to 'seem like.' Still, while the word rational isn't the only term to suffer izing in this way (moralize, criminalize, glamorize), it's worth noting that the vast majority of words that end in 'ize' do not experience this reversal of meaning.
More here.
