Category: Recommended Reading
william gaddis in conversation
Sergiu Celibidache in Rehearsal with London Symphony Orchestra
There’s More Than Meets the Eye: A René Magritte Survey
Holland Cotter in The New York Times:
Oh, no, I thought when I heard that the Museum of Modern Art’s big fall show was a René Magritte survey. Dozens of undersung modernist painters, many of them women, on at least five continents, have never had a New York moment, and here we’re getting an artist we practically can’t avoid. The pipe; the giant eye; the choo-choo in the fireplace. As it turns out, “Magritte: The Mystery of the Ordinary, 1926-1938,” which opens at MoMA on Saturday, is good solid fun, because Magritte is solid and fun. There’s no mystery about why he’s so popular. His paint-by-numbers illustrational mode reads loud and clear from across a room — a good thing, as the exhibition galleries are sure to be jammed — and reproduces faultlessly, even on a cellphone screen. And he had ideas. He was a sophisticated trickster, a bourgeois gentilhomme with a geek inside, hacking into everyday life and planting little weirdness bugs: legs sprouting from shirt collars, rain falling upward, words having lives of their own. He was an attention-grabber with one gift, but a crucial one: for puzzle-making. You may not get, at first glance, what’s going on in his paintings, but you get that there’s something to get. So you look again. And again. Which is, of course, a marketer’s dream.
One thing’s for sure: We’re unlikely ever to see Magritte look better than he does in the MoMA show. Its organizers, Anne Umland, a curator of painting and drawing at the museum, and Danielle Johnson, a curatorial assistant, have zeroed in on a single — and I would say the only — consistently fresh and interesting decade in his long career, when he was inventing the artist he wanted to be and when his art was all over the place in a good way: witty, nasty, brilliant and bad at the same time.
More here. (Note: Saw the show. Loved it. Recommend highly.)
What are the rules of polygamy?
Julia Layton in How Stuff Works:
Plural marriage is as old as the Bible. Abraham and Jacob each had more than one wife. King David had six. King Solomon had 700 (not to mention 300 concubines). Solomon lost God's favor when he married women who did not give up idolatry, David when he sent a woman's husband to the front lines so he could marry her. Whether ancient or modern, polygamous or monogamous, marriage has rules. There may be ages and genders to consider. In early America, there were races to consider. Often, those considerations draw on religious beliefs: The Quran allows a man to take up to four wives. In Fundamentalist Mormonism, there is no set limit to the number of wives in one marriage. Joseph Smith, the Mormon Prophet who first delivered God's directive that Mormons practice plural marriage, ultimately took dozens of wives.
More here.
Sunday Poem
Dream Tales From the Barn
The white rooster is too new at this,
too newly glorious, victorious, to notice
how the west wind has flattened
its hand against the barnboards.
Yesterday's hero, named Choochoo,
still flaunting the sheen of his plumage
minus the prize green tail feathers, waiting
for his red crest to rise bravely
from the ashes of frostbite,
has taken it all in, flat wind and flimsy wood,
but he's too busy stewing about the hens
gone over to the other side, and the bald spot
on his back pecked larger by the day.
The flea-bitten brown cock who never stood a chance,
never sees the light of day, sits drilled into his lonely corner,
smugly aware of the wind's highly organized goings-on
cheering for it in his sad, airless heart, waiting
for the barn to cave in on a wild feathered frenzy,
waiting for the dust to settle, one chance in three.
by Ellen Doré Watson
from We Live in Bodies
Alice James Books, 1997
Rafiq Kathwari wins Kavanagh poetry award
From the Irish Times:
The winner of this year’s Patrick Kavanagh Poetry Award is Rafiq Kathwari, who lives in Omeath, Co Louth.
He graduated from the University of Kashmir in 1969 before studying at the New University in New York and Columbia University. Most of his working life has been spent with Ethan Allen, a large manufacturer and retailer of home furnishings based in the United States. He has also worked as a photojournalist. Mr Kathwari has published poems in print and online in the US, Ireland and Asia.
More here.
Saturday, September 28, 2013
Building India’s “Shock City of the Twentieth Century” from the Top Down
Samantha Christiansen in Berfrois reviews Howard Spodek's Ahmedabad: Shock City of Twentieth-Century India:
The field of South Asian urban history has a rich history of examining India’s major urban centers. Numerous astute studies of Delhi, Bombay (Mumbai), and Calcutta (Kolkata), for example, have contributed to our understanding of not only the rapid urbanization (and later suburbanization, as explored in the remarkable collection of essays that appeared in a recent special edition of Urban History [February 2012]) of the subcontinent, but the human and economic development that has shaped the region as well.[1] Yet while the field is rich, there are noticeable silences around relatively large swaths of the region. Cities outside of the Indian national border, such as Karachi or Dhaka, rest quietly in the periphery of the historiography; others within the border, such as Ahmedabad, while mentioned in virtually all historical discussions of the subcontinent (being the site of Mahatma Gandhi’s ashram after all), have received little focused attention. Howard Spodek’s Ahmedabad thus provides an important contribution to the field as both an examination of a place conspicuously underrepresented in the urban history of the region and as an excellent piece of urban history that not only greatly informs our understanding of South Asian development, but also has application to a number of cities globally.
Spodek presents a compelling sketch of the last hundred or so years in a city that has been called the “Manchester of India.” In Spodek’s presentation of the city, we see a microcosm of some of India’s major political, economic, and social trajectories: the rise of Gandhi and the independence movement, the drive for modernity and industrialization in postcolonial India, the collapse of the labor unions and the restructuring of the economy within new global markets, and struggles with communal violence and corruption. Spodek successfully balances his portrayal of a city shaped by a concentrated body of power elites within a larger global context, placing Ahmedabad at the center, but recognizing the external forces playing out in the process. In this way, as a case study in urban history, Ahmedabad is instructive both in content and method.
More here.
The History of Fear
Corey Robin discusses fear in political philosophy, over at his blog:
It was on April 5, 1588, the eve of the Spanish Armada’s invasion of Britain, that Thomas Hobbes was born. Rumors of war had been circulating throughout the English countryside for months. Learned theologians pored over the book of Revelation, convinced that Spain was the Antichrist and the end of days near. So widespread was the fear of the coming onslaught it may well have sent Hobbes’s mother into premature labor. “My mother was filled with such fear,” Hobbes would write, “that she bore twins, me and together with me fear.” It was a joke Hobbes and his admirers were fond of repeating: Fear and the author of Leviathan and Behemoth—Job-like titles meant to invoke, if not arouse, the terrors of political life—were born twins together.
It wasn’t exactly true. Though fear may have precipitated Hobbes’s birth, the emotion had long been a subject of enquiry. Everyone from Thucydides to Machiavelli had written about it, and Hobbes’s analysis was not quite as original as he claimed. But neither did he wholly exaggerate. Despite his debts to classical thinkers and to contemporaries like the Dutch philosopher Hugo Grotius, Hobbes did give fear special pride of place. While Thucydides and Machiavelli had identified fear as a political motivation, only Hobbes was willing to claim that “the original of great and lasting societies consisted not in mutual good will men had toward each other, but in the mutual fear they had of each other.”
But more than Hobbes’s insistence on fear’s centrality makes his account so pertinent for us, for Hobbes was attuned to a problem we associate with our postmodern age, but which is as old as modernity itself: How can a polity or society survive when its members disagree, often quite radically, about basic moral principles? When they disagree not only about the meaning of good and evil, but also about the ground upon which to make such distinctions?
More here and the second part here.
The Pantheon of Animals
Justin E. H. Smith in his own blog:
I’m waiting in line, embarrassed to be here by myself. I’ll be turning forty later this month, and here I am at the natural history museum, childless. The ticket lady is going to look at me funny. There is some kid behind me, four years old or so, speaking Swedish to his dad. He is wearing thick, round glasses made of blue plastic, and a colorful backpack with a cartoon image of a Cro-Magnon on it. His progenitor is getting a lecture about how birds are, in truth, dinosaurs. The kid is beaming with pride at his own knowledge of this. To my right is a statue, which, as with all statues, I have taken some time to notice. But when I do, I am startled. It is Emmanuel Frémiet’s 1895 masterpiece, Orang-Outang Strangling a Savage of Borneo, a work of horrible violence, and a congealing of sundry, transparent anxieties of the fin-de-siècle European man. The Swedish boy is now on to the difference between mammoths and mastodons.
I’m next in line. I’m at the Gallery of Comparative Anatomy, the ground floor of a three-storey building also housing the Gallery of Paleontology, both of which are part of the vast complex of galleries, greenhouses, and gardens at the Paris Muséum d’Histoire Naturelle, in the Jardin des Plantes on the Left Bank of the Seine. “Un billet,” I manage to say. “Plein tarif.” I shouldn’t really be here, I know. But it's the only place I really want to be, in this foreign, difficult city, at this puzzling stage of life. I am not a boy, but it is where I belong: among the many bones, whose collectors hoped to lay bare through them the very order of nature.
More here.
Why nearly every sport except long-distance running is fundamentally absurd
David Stipp in Slate:
At first glance the annual Man vs. Horse Marathon, set for June 9 in Wales, seems like a joke sport brought to us by the same brilliant minds behind dwarf tossing and gravy wrestling. It was, after all, the product of a pints-fueled debate in a Welsh pub, and for years its official starter was rock musician Screaming Lord Sutch, founder of the Official Monster Raving Loony Party. But the jokiness is misleading: When viewed through science’s clarifying lens, the funny marathon is one of the few sports that isn’t a joke.
Hear me out, sports fans—I'm a basketball nut myself, and so the joke is as much on me as anyone. To see where I’m coming from, you can’t do better than examining basketball’s most physically talented player, Michael Jordan. He was hailed as nearly repealing the law of gravity, and during his prime he made rival players look as if they were moving in slow motion. But Air Jordan wasn't in the same league as a house cat when it comes to leaping. Consider how casually young cats can jump up onto refrigerators. To match that, a man would have to do a standing jump right over the backboard. And a top-notch Frisbee dog corkscrewing through the air eight feet up to snag a whizzing disc makes Jordan look decidedly human when it comes to the fantastic quickness, agility, strength, and ballistic precision various animals are endowed with.
There's no denying it—our kind started substituting brains for brawn long ago, and it shows: We can't begin to compete with animals when it comes to the raw ingredients of athletic prowess. Yet being the absurdly self-enthralled species we are, we crowd into arenas and stadiums to marvel at our pathetic physical abilities as if they were something special. But there is one exception to our general paltriness: We're the right honorable kings and queens of the planet when it comes to long-distance running.
More here.
Fireman Saves Kitten
Winthrop Kellogg Edey: the man who collected time
Stefany Anne Golberg at The Smart Set:
You’ve heard the phrase, “a man out of step with his time.” We use it to talk about a man that should have existed in another era as, for instance, a man with Victorian sensibilities who happens to live in the present day. But we also use the phrase to talk about a man who exists outside of time altogether. Winthrop K. Edey was such a man. He was hyper-punctual but highly anachronistic. Untimely. The qualities were two sides of the same coin. “Mr. Edey… favored old-fashioned fountain pens over ballpoints and maintained his town house in such 19th century purity that it still has its original working gas jets, tapestries, stove, and marble-slab kitchen table,” said The New York Times. When Edey wanted to take a snapshot he “would lug a huge wooden turn-of-the-century view camera complete with tripod and 11-by-14-inch glass plates” out into the streets. It was a lifestyle of a man living just to the side of time. And the more punctual Edey made his life, the more he arranged time according to his individual whim, the less he was part of the ordinary world. He was like a monk except that a monk arranges his life around a schedule that he does not choose. Winthrop K. Edey’s time was solely his own — or, at least, he tried to make it so. Orson Welles once said that an artist is always out of step with time. This truth is both the beauty and melancholy of the artist. Winthrop K. Edey was an artist of time. He was thus a man destined to be not merely out of step with time, but dislocated in it.
more here.
jorge luis borges as professor
Morten Høi Jensen at The Quarterly Conversation:
A groundbreaking new volume published by New Directions, Professor Borges: A Course on English Literature, offers unprecedented insight into the writer’s lifelong relationship to the English language, as well as an affecting portrait of the Argentine master as lecturer. These twenty-five classes on English literature were recorded by a small group of students in 1966 and later edited by two leading Borges scholars, Martín Arias and Martín Hadis. They have now finally been rendered into English by the incomparable Katherine Silver. Naturally, “English literature” as defined by Borges is highly idiosyncratic and inescapably, well, Borgesian: the book opens with a study of the Norse and Anglo-Saxon inheritance and go on to deal with central figures of English literature proper—Samuel Johnson and Samuel Coleridge, William Wordsworth, and William Blake—before bottlenecking into character studies of Borges’s all-stars: Thomas Carlyle, Robert Browning, William Morris, and Robert Luis Stevenson.
Reading this book, one gathers that Borges’s initial fears of lecturing—he had to overcome both stammer and shyness—eventually gave way to genuine enthusiasm. Colloquial expressions—“Let’s dig into Beowulf”—help convey a sense of what it must have been like listening to him.
more here.
The case against the global novel
Pankaj Mishra in the Financial Times:
Between 1952 and 1957, Naguib Mahfouz did not write any novels or stories. This was not a case of writer’s block. Mahfouz, who had completed his masterwork, The Cairo Trilogy, in the early 1950s, later explained that he had hoped Egypt’s revolutionary regime would fulfil the aims of his realist novels, and focus public attention on social, economic and political ills. Disenchantment would drive him back to fiction, of a more symbolic and allegorical kind. In 1967, Israel’s crushing defeat of Egypt would force Mahfouz to stop again, and then resume with some explicitly political work.
In recent months, Ahdaf Soueif and Alaa al-Aswany, among other Egyptian authors, have been found on the barricades of Cairo. Such a close and perilous involvement of writers in national upheavals may surprise many contemporary readers in the west, who are accustomed to think of novelists as diffident explorers of the inner life – people very rarely persuaded to engage with public events. Literature today seems to emerge from an apolitical and borderless cosmopolis. Even the mildly adversarial idea of the “postcolonial” that emerged in the 1980s, when authors from Britain’s former colonial possessions appeared to be “writing back” to the imperial centre, has been blunted. The announcement this month that the Man Booker, a literary prize made distinctive by its Indian, South African, Irish, Scottish and Australian winners, will henceforth be open to American novels is one more sign of the steady erasure of national and historical specificity.
more here.
41 books sexist prof David Gilmour should read
Roxane Gay in Salon:
Canadian novelist and professor David Gilmour ran into some trouble when he told interviewer Emily Keeler, “I’m not interested in teaching books by women.” He went on to explain that he simply didn’t love women writers enough to teach them. The one woman writer who did meet with Gilmour’s approval, Virginia Woolf, was too sophisticated for his students. He preferred the prose of the manliest of men — Hemingway, Roth, Fitzgerald, Elmore Leonard, Chekhov. There’s an unforgettable bit about eating menstrual pads — what would we do without Philip Roth? It’s not nearly as silly as it is sad that Gilmour hasn’t allowed himself to love and respect contemporary women’s writing. It’s a shame he denies himself and his students the opportunity to appreciate a richer chronicling of the human experience than that provided by the most masculine (in his estimation) of prose writers.
If I were teaching such a course, I would ask students to read a selection of the books I’ve been thinking about lately — a list that is deliberately incomplete, and one that will be ever changing. The class would meet each week to discuss a theme like sexuality, the body, place and displacement, race, difference, violence, love and hate — and how and why modern writers approach these themes. At the end of the semester, I would ask students only one question. What does it mean to be human? Without offering them a diversity of voice, I cannot begin to imagine how they might answer that question. Here’s what they would read:
“Wide Sargasso Sea” by Jean Rhys
“Misery” by Stephen King
“Beloved” by Toni Morrison
“Disgrace” by J.M. Coetzee
“NW” by Zadie Smith
“A Fine Balance” by Rohinton Mistry
“Once Were Warriors” by Alan Duff
“Deliverance” by James Dickey
More here.
Are We Too Concerned That Characters Be ‘Likable’?
Mohsin Hamid in The New York Times:
I’ll confess — I read fiction to fall in love. That’s what’s kept me hooked all these years. Often, that love was for a character: in a presexual-crush way for Fern in “Charlotte’s Web”; in a best-buddies way for the heroes of “Astérix & Obélix”; in a sighing, “I wish there were more of her in this book” way for Jessica in “Dune” or Arwen in “The Lord of the Rings.” In fiction, as in my nonreading life, someone didn’t necessarily have to be likable to be lovable. Was Anna Karenina likable? Maybe not. Did part of me fall in love with her when I cracked open a secondhand hardcover of Tolstoy’s novel, purchased in a bookshop in Princeton, N.J., the day before I headed home to Pakistan for a hot, slow summer? Absolutely. What about Humbert Humbert? A pedophile. A snob. A dangerous madman. The main character of Nabokov’s “Lolita” wasn’t very likable. But that voice. Ah. That voice had me at “fire of my loins.”
So I discovered I could fall in love with a voice. And I could fall in love with form, with the dramatic monologue of Camus’s “Fall,” or, more recently, the first-person plural of Julie Otsuka’s “Buddha in the Attic,” or the restless, centerless perspective of Jennifer Egan’s “Visit From the Goon Squad.” And I’d always been able to fall in love with plot, with the story of a story. Is all this the same as saying I fall in love with writers through their writing? I don’t think so, even though I do use the term that way. I’ll say I love Morrison, I love Oates. Both are former teachers of mine, so they’re writers I’ve met off the page. But still, what I mean is I love their writing. Or something about their writing.
More here.
Friday, September 27, 2013
Making Juries Better: Some Ideas from Neuroeconomics
Virginia Hughes over at the National Geographic's Only Human:
We Americans love jury trials, in which an accused person is judged by a group of peers from the community. Every citizen, when called, must sit on a jury. For anyone who finds this civic duty a painful chore: Go watch 12 Angry Men, A Few Good Men, or any episode of Law & Order. You’ll feel all warm and fuzzy with the knowledge that, though juries don’t always make the right call, they’re our best hope for carrying out justice.
But…what if they aren’t? Juries are made of people. And people, as psychologists and social scientists have reported for decades, come into a decision with pre-existing biases. We tend to weigh evidence that confirms our bias more heavily than evidence that contradicts it.
Here’s a hypothetical (and pretty callous) example, which I plucked from one of those psych studies. Consider an elementary school teacher who is trying to suss out which of two new students, Mary and Bob, is smarter. The teacher may think of them as equally smart, at first. Then Mary gets a perfect score on a vocabulary quiz, say, leading the teacher to hypothesize that Mary is smarter. Sometime after that, Mary says something mildly clever. Objectively, that one utterance shouldn’t say much about Mary’s intelligence. But because of the earlier evidence from the quiz, the teacher is primed to see this new event in a more impressive light, bolstering the emerging theory that Mary is smarter than Bob. This goes on and on, until the teacher firmly believes in Mary’s genius.
Even more concerning than confirmation bias itself is the fact that the more bias we have, the more confident we are in our decision.
All of that research means, ironically, that if you start with a group of individuals who have differing beliefs, and present them all with the same evidence, they’re more likely to diverge, rather than converge, on a decision.
More here.
Counter-Counter-Revolution
David Runciman reviews Christian Caryl’s Strange Rebels: 1979 and the Birth of the 21st Century, in the LRB:
What was the most significant year of the 20th century? There are three plausible candidates. The first is 1917, the year of the Russian Revolution and America’s entry into the First World War, which set in train a century of superpower conflict. The second is 1918, the year that saw Russia’s exit from the war and the defeat of the German and Austro-Hungarian Empires, which set the stage for the triumph of democracy. The third is 1919, the year of the Weimar constitution and the Paris Peace Conference, which ensured that the triumph would be squandered. What this means is that it was the dénouement of the First World War that changed everything: a messy, sprawling, disorderly event that spilled out across all attempts to contain it. Its momentous qualities cannot be made to fit into the timeframe defined by a single year. History rarely can.
That is the problem with Christian Caryl’s fascinating and frustrating book, which identifies 1979 as the year that gave birth to the 21st century. Caryl builds his case around five overlapping stories, four about individuals and one about a country. The people are Thatcher, Deng Xiaoping, Ayatollah Khomeini and Pope John Paul II. The place is Afghanistan. The year 1979 mattered to all of them. It was the year Thatcher won her first general election. The year Deng embarked on the economic reforms that would transform China. The year the Iranian Revolution swept Khomeini to power. The year the new pope visited his Polish homeland, sparking vast public outpourings of support in defiance of the communist regime. The year Afghanistan was invaded by the Soviets. These were all momentous events. Caryl weaves them together into a single narrative that tags 1979 as the year that the myth of 20th-century secular progress started to unravel. What joins the different bits of the story together is that each one represents the revenge of two forces that the 20th century was supposed to have seen off, or at least got under control: markets and religion.
More here.
The Real Reason for the Fight over the Debt Limit
Mark Thoma in The Fiscal Times:
We have lost something important as a society as inequality has grown over the last several decades: our sense that we are all in this together. Social insurance is a way of sharing the risks that our economic system imposes upon us. As with other types of insurance, e.g. fire insurance, we all put our money into a common pool and the few of us unlucky enough to experience a “fire” – the loss of a job, health problems that wipe out retirement funds, disability, and so on – use the insurance to avoid financial disaster and rebuild as best we can.
But growing inequality has allowed one stratum of society to be largely free of these risks while the other is very much exposed to them. As that has happened, as one group in society has had fewer and fewer worries about paying for college education, has first-rate health insurance, ample funds for retirement, and little or no chance of losing a home and ending up on the street if a job suddenly disappears in a recession, support among the politically powerful elite for the risk sharing that makes social insurance work has declined.
Rising inequality and differential exposure to economic risk have caused one group to see themselves as the “makers” in society who provide for the rest and pay most of the bills, and the other group as “takers” who get all the benefits. The upper stratum wonders, “Why should we pay for social insurance when we get little or none of the benefits?” and this leads to an attack on these programs.
More here.