Peter Orszag in The New Republic:
In an 1814 letter to John Taylor, John Adams wrote that “there never was a democracy yet that did not commit suicide.” That may read today like an overstatement, but it is certainly true that our democracy finds itself facing a deep challenge: During my recent stint in the Obama administration as director of the Office of Management and Budget, it was clear to me that the country’s political polarization was growing worse—harming Washington’s ability to do the basic, necessary work of governing. If you need confirmation of this, look no further than the recent debt-limit debacle, which clearly showed that we are becoming two nations governed by a single Congress—and that paralyzing gridlock is the result.
So what to do? To solve the serious problems facing our country, we need to minimize the harm from legislative inertia by relying more on automatic policies and depoliticized commissions for certain policy decisions. In other words, radical as it sounds, we need to counter the gridlock of our political institutions by making them a bit less democratic.
I know that such ideas carry risks. And I have arrived at these proposals reluctantly: They come more from frustration than from inspiration. But we need to confront the fact that a polarized, gridlocked government is doing real harm to our country. And we have to find some way around it.
Robert Socolow in Bulletin of the Atomic Scientists:
Let's review the messages in our 2004 paper in Science. The paper assumes that the world wishes to act decisively and coherently to deal with climate change. It makes the case that “humanity already possesses the fundamental scientific, technical and industrial know-how to solve the carbon and climate problem for the next half-century.” This core message surprised many people, because our paper arrived at a time when the Bush administration was asserting that, unfortunately, the tools available were not suited for addressing climate change. Indeed, at a conference I attended at that time, Energy Secretary Spencer Abraham insisted that a discovery akin to the discovery of electricity was required.
Our focus on “the next half century” was novel; the favored horizon at the time was a full century — and still is. We argued that “the next fifty years is a sensible horizon from several perspectives. It is the length of a career, the lifetime of a power plant, and an interval whose technology is close enough to envision.”
In a widely reproduced Figure (see below) we identified a Stabilization Triangle, bounded by two 50-year paths. Along the upper path, the world ignores climate change for 50 years and the global emissions rate for greenhouse gases doubles. Along the lower path, with extremely hard work, the rate remains constant. We reported that starting along the flat emissions path in 2004 was consistent with “beating doubling,” i.e., capping the atmospheric carbon dioxide concentration at below twice its “pre-industrial” concentration (the concentration a few centuries ago).
James Salter in the New York Review of Books:
Ernest Hemingway, the second oldest of six children, was born in Oak Park, Illinois, in 1899 and lived until 1961, thus representing the first half of the twentieth century. He more than represented it, he embodied it. He was a national and international hero, and his life was mythic. Though none of his novels is set in his own country—they take place in France, Spain, Italy, or in the sea between Cuba and Key West—he is a quintessentially American writer and a fiercely moral one. His father, Clarence Hemingway, was a highly principled doctor, and his mother, Grace, was equally high-minded. They were religious, strict—they even forbade dancing.
From his father, who loved the natural world, Hemingway learned in childhood to fish and shoot, and a love of these things shaped his life along with a third thing, writing. Almost from the first there is his distinct voice. In his journal of a camping trip he took with a friend when he was sixteen years old, he wrote of trout fishing, “Great fun fighting them in the dark in the deep swift river.” His style was later said to have been influenced by Sherwood Anderson, Gertrude Stein, Ezra Pound, journalism, and the forced economy of transatlantic cables, but he had his own poetic gift and also the intense desire to give to the reader the full and true feeling of what happened, to make the reader feel it had happened to him. He pared things down.
Although I’ve yet to see sandwich-board men on the steps of the nation’s Capitol declaring that the end of the world is nigh, I expect that it won’t be long before the Department of Homeland Security advises the country’s Chinese restaurants to embed the alert in the fortune cookies. President Obama appears before the congregations of the Democratic faithful as a man of sorrows acquainted with grief, cherishing the wounds of the American body politic as if they were the stigmata of the murdered Christ. The daily newscasts update the approaches of weird storms, bring reports of missing forests and lost polar bears, number the dead and dying in Africa and the Middle East, gauge the level of America’s fast-disappearing wealth. Hollywood stages nostalgic remakes of the Book of Revelation; video games mount the battle of Armageddon on the bosom of the iPad. Nor does any week pass by without a word of warning from the oracles at the Council on Foreign Relations, Fox News, and the New York Times. Their peerings into the abyss of what to the Washington politicians are known as “the out years” never fail to discover a soon forthcoming catastrophe (default on the national debt, double-dip recession, global warming, nuclear proliferation, war in Iran) deserving the close attention of their fellow travelers aboard the bus to Kingdom Come. If the fear of the future is the story line that for the last ten years has made it easy to confuse the instruments of the American media with the trumpets of doom, the cloud of evil omens is not without a silver lining.
more from Lewis Lapham at Lapham’s Quarterly here.
Retelling a great myth is like performing a famous piece of music: between faithfulness to the familiar score and personal interpretation of it lie many risks and choices. Between the worldview of a Norse skald, or poet, and that of a writer ten or fifteen centuries later, the scope for risks and choices is immense. Ragnarök, A S Byatt’s contribution to the Canongate Myths series, is a brilliant, highly intelligent, fiercely personal rendition of the Scandinavian mythology. Its personal element has particular resonance for me because, like A S Byatt, I was a child during the Second World War. I, too, read the Norse myths, and like her I found they made sense of the strange world we were growing up in. But California was a long way from the north of England, and the versions of the story I knew were very different from hers. She read the translation of Wägner’s scholarly edition; I read Padraic Colum’s, written principally for younger readers. Colum gave the often incoherent material narrative shape, humanised its brutality to some extent, brought out its harsh humour, and told it in fine, clear prose. Byatt was dealing with something nearer the raw material. But we were both reading a story that moved inexorably through war towards doom.
more from Ursula K Le Guin at Literary Review here.
The reason this crisis keeps grinding ever deeper is because the euro itself is a machine for perpetual destruction. The currency is fundamentally warped and misaligned. It spans a 30pc gap in competitiveness between North and South. Intra-EMU current account deficits have become vast, chronic, and corrosive. Monetary Union is inherently poisonous. The countries in trouble no longer have the policy tools — interest rates, QE, liquidity, and exchange rates — to lift themselves out of debt-deflation. Just as they had few tools to prevent a catastrophic credit bubble during the boom. Their travails were caused in great part by negative real interest rates set by the ECB (irresponsibly) for German needs. Their fiscal deficits (and remember, Spain and Ireland ran big surpluses in the boom) have exploded because of the Great Recession itself — as they have in the UK, US, and Japan. Draconian fiscal tightening might be manageable for these countries if the Teutonic bloc is willing to offset the contraction in demand by cranking up their own stimulus, allowing the intra-EMU imbalances to close from both ends. But the Teutons instead cling to their pieties, and their morality tale. The result is the downward spiral that we can all see.
more from Ambrose Evans-Pritchard at The Telegraph here.
From The Boston Globe:
Somewhere in the world, every second of every day, people are being beaten, shot, and stabbed. The news is a litany of bombings and political assassinations, deadly riots and gang warfare. The lucky among us merely hear about it. Some days, when the body count is particularly high, it can be hard to stave off the sense that our species is more brutal and more bloodthirsty than at any other point in history.
Steven Pinker used to wince at the carnage like everybody else, and wonder how the human race had managed to lose its way so horribly. Then, in 1989, he stumbled upon something remarkable: a graph in a history book, compiled by a political scientist named Ted Robert Gurr, showing that the homicide rate in England had declined sharply since the 13th century. Pinker was astonished. The rate had fallen in some areas by as much as one hundredfold. Could it be true, he wondered, that humans had actually become less violent with time, as opposed to more? And if so, how had we done it? Pinker, now a psychology professor at Harvard, was then a rising star at MIT known primarily for his work on how the mind processes language and vision. In the years after his eye-opening encounter with the Gurr graph, his interest in broader questions about human nature and the brain would lead him to write a series of bestselling books, including “How the Mind Works” and “The Blank Slate,” which helped establish him as one of the most recognizable public intellectuals of the past 20 years.
Adam Rubin in Science:
I'm not good at meeting celebrities. I don't just mean the time I narrowly missed shaking hands with Judd Hirsch because my girlfriend had to use the bathroom, which still makes me a little angry every time I think about it. I mostly mean that the few times I've met famous people, my brain has locked up and I couldn't make normal conversation. Case in point: The summer before college, I met then-President Bill Clinton. He said, “Nice to meet you,” and I replied with the same. Then he asked where I was going to go to college, and instead of saying “Princeton,” I said, “Uh … I forget.” I think I even gave him a look that said, “Come on, help me think of the name of this place. Surely you've at least got a good guess — you're the president!” And that was the whole encounter. Which is why, when I learned that I would meet Nobel laureate Elizabeth Blackburn (physiology or medicine, 2009), I had nightmares that she'd ask, “How are you?” and I'd reply, “Balloons.”
So I familiarized myself with the work that earned her the Nobel Prize: her co-discovery of telomerase, the enzyme that elongates telomeres within cells. But when we finally met, Dr. Blackburn threw me a curveball. Before I could say any of the intelligent scientific things I'd rehearsed, she told me that she'd read my book, which immediately filled me with shame, because my book is a comical guide to stealing free food in graduate school. I wondered whether I had negatively influenced the future of science by stealing time from a brilliant researcher, time she could have otherwise used to cure cancer. I imagined having to apologize to the world's cancer patients: “Sorry, guys. I know you were hoping for a cure, but I distracted the scientist who was your best hope with jokes about muffins.” Dr. Blackburn enjoyed the book, or at least she was nice enough to lie about it. “You're a very good writer,” she said, and that's exactly when my brain locked up. Knowing, at a primal level, that I ought to answer a compliment with a compliment, I said, “Thanks. You're … very good … at, uh, learning about … telomeres.” Then, following an awkward silence during which I realized I'd possibly just made the stupidest comment Dr. Blackburn had ever heard, someone else called her away to another part of the room.
Somewhat on the wonk side but very, very interesting: Ashwin Parameswaran over at macroresilience (portrait of Knut Wicksell from Wikimedia Commons):
The conventional cure for insufficient aggregate demand and the one that has been preferred throughout the Great Moderation is monetary easing. The argument goes that lower real rates, higher inflation and higher asset prices will increase investment via Tobin’s Q and increase consumption via the wealth effect and reduction in rewards to savings, all bound together in the virtuous cycle of the multiplier. As I discussed in a previous post, QE2 and now Operation Twist are not as unconventional as they seem. They simply apply the logic of interest rate cuts to the entire yield curve rather than restricting central bank interventions to the short-end of the curve as was the norm during the Great Moderation.
But despite asset prices and corporate profits having rebounded significantly from their crisis lows and real rates now negative till the 10y tenor in the United States, a rebound in investment or consumption has not been forthcoming in the current recovery. This lack of responsiveness of aggregate demand to monetary policy is not as surprising as it first seems:
* The responsiveness of consumption to monetary policy is diminished when the consumer is as over-levered as he currently is. The “success” of monetary policy during the Great Moderation was primarily due to consumers’ ability to lever up to maintain consumption growth in the absence of any tangible real wage growth.
* The empirical support for the impact of real rates and asset prices on investment is inconclusive. Drawing on Keynes’ emphasis on the uncertain nature of investment decisions, Shackle was skeptical about the impact of lower interest rates in stimulating business investment. He observed that businessmen, when asked, rarely cited the level of interest rates as a critical determinant. In an uncertain environment, estimated profits “must greatly exceed the cost of borrowing if the investment in question is to be made”.
If the problem with reduced real rates was simply that they were likely to be ineffective, there could still be a case for pursuing monetary policy initiatives aimed at reducing real rates. One could argue that even a small positive effect is better than not trying anything. But this unfortunately is not the case.
Steve Almond in Salon:
So, like I said, I wanted to talk about this brief, intense period of time when — and I realize this is a memory, so it's totally subjective — but it felt like you really hated me.
Yeah. It was mostly in this metal shop class we took together.
I definitely remember taking that metal shop class in eighth grade. And I was thinking about it, since you sent that original email, and I do remember being in a relationship with someone where I was the bully or the dominant, because I remember feeling that. But I never would have put two and two together and thought it was you.
I had this sense of being totally frozen out. And it was clear, or it seemed clear to me, that you were calling the shots. You were the alpha of that group.
It's funny you would say that, because this was around the time that Billy Dempsey entered the picture —
Yeah, I remember Billy coming up to me at the lockers, I think you were there for this, and threatening to kick my ass.
I don't remember that, but it wouldn't surprise me. The thing is, we had this very tortured relationship where I spent the entire time trying to prove myself to him. Billy was athletically more gifted than me and he was fearless and willing to get into fights with anybody, whereas I always saw myself as an egghead nerd. So it's quite possible, I could easily see, if there was an opportunity for me to prove to Billy that I was his equal in terms of being the macho guy I would have grabbed at it.
From The Telegraph:
“The history of the Victorian age will never be written,” Lytton Strachey announced at the start of his waspish clutch of biographical sketches Eminent Victorians, not because of what has fallen between the cracks of the historical record, but because “we know too much about it”. The same is true of Dickens’s life, which has often been treated as the pivot around which the Victorian age revolved. From the spelling mistake on his birth certificate, to the neatly folded notes he left for his children if they used bad language, every document has been filleted for facts, every stray anecdote transformed into a revealing flash of personality. As with Shakespeare, his only serious rival for the title of the nation’s favourite author, the books, articles and blogs about him have multiplied to the extent that nobody can possibly read them all. Attempting then to write about him is like trying to cut up a blue whale with a penknife.
That doesn’t stop us trying. Next week sees the publication of my new biography Becoming Dickens, in which I investigate how in the space of five years an unknown reporter became the most famous novelist in the world. Within a few days it will be joined by Claire Tomalin’s cradle-to-grave Charles Dickens: a Life and Lucinda Hawksley’s more compact Charles Dickens, and later by Simon Callow’s book on Dickens’s love of the theatre. They will be followed by several documentaries, glossy BBC adaptations of Great Expectations and The Mystery of Edwin Drood, and a film about his lengthy secret affair with the actress Ellen Ternan. In 2012, his bicentenary year, Dickens’s face will be everywhere, his presence inescapable.
Abigail Zuger in The New York Times:
Take what must be the greatest cheap medical fix in all of history: the bar of soap. Soap never stops proving itself. As recently as 2005, a study from the slums of Karachi, Pakistan, showed that free bars of soap (and lessons in how to use them) cut rates of childhood killers like diarrhea and pneumonia by half. But you don’t find soap in American hospitals anymore, at least not in its classic solid rectangular form. A variety of expensive improvements have replaced it, all created in response to the various ways in which modern doctors and patients reflexively undermine good, inexpensive tools. First, we automatically capture these things for our own personal use: Bars of soap left in any public place are likely to disappear in short order. (That is why toilet paper rolls are generally locked into their little metal houses.) Second, we find fault with them. People will actually use the observation that bar soap is “dirty” as an excuse not to wash their hands. (Studies have shown that you will not pick up somebody else’s germs from a piece of soap, however dingy it may look.)
Alexander Downes in Boston Review:
Beyond the question of whether it is wise for the United States to seek regime change in yet another country while it continues to clean up the mess from the last two, the Libya adventure invites a reconsideration of the wisdom of regime change in general. Focusing on consequences, I will steer clear of issues of legality and moral justification. Rather, I ask what the historical record tells us about the capacity of externally imposed regime change to bring peace, stability, and democracy to target countries. Is the bloody aftermath of regime change in Afghanistan and Iraq the exception or the rule? Does regime change work?
The short answer is: rarely. The reasons for consistent failure are straightforward. Regime change often produces violence because it inevitably privileges some individuals or groups and alienates others. Intervening forces seek to install their preferred leadership but usually have little knowledge of the politics of the target country or of the backlash their preference is likely to engender. Moreover, interveners often lack the will or commitment to remain indefinitely in the face of violent resistance, which encourages opponents to keep fighting. Regime change generally fails to promote democracy because installing pliable dictators is in the intervener’s interest and because many target states lack the necessary preconditions for democracy.
Norman Birnbaum reviews Eric Miller's Hope in a Scattering Time: A Life of Christopher Lasch, in The Nation (photo from Wikimedia Commons):
Born in Omaha in 1932, the year Franklin Roosevelt was elected president, Christopher Lasch graduated from Harvard in 1954, during the Eisenhower era’s mood of anxious complacency, and from there went directly to Columbia to do graduate work in history. Lasch’s career as a historian began, as it would end forty years later with his death, with a search for the moral resources for the next New Deal. Lasch rejected the liberal history of Arthur Schlesinger Jr.—whose legitimation of the cold war he disliked, and whose view of the permanence of the New Deal’s achievements he found naïve. He learned much of modern social science as well as European political and social thought, and took psychoanalysis and theology seriously. He became one of the nation’s most prominent intellectuals, but he increasingly doubted the capacity of his colleagues to guide their fellow citizens. His first book, The American Liberals and the Russian Revolution, a critique of liberalism’s early capitulation to imperialism, sold a few hundred copies when it appeared in 1962. His next book was published three years later. Called The New Radicalism in America, 1889–1963: The Intellectual as a Social Type, it depicted intellectuals’ sometimes unintended subservience to power, and it made him famous. Lasch regarded his success in part as a burden, and throughout his life he would insist on the importance of his ties to family, friends, colleagues and students.
Cynicism, for Bierce, was not just an attitude; it was his life force. It’s ironic then that The Devil’s Dictionary is seen today primarily as a delightful little book of irreverent (if now anachronistic) witticisms. This is entirely Bierce’s fault. In life and in art, Bierce made it his prerogative to present himself as a Class A misanthropic know-it-all. Much of the real sensitivity and even anguish that produced The Devil’s Dictionary is obscured by an intentional ironic distance. By the time The Devil’s Dictionary was published, Bierce was 69. He had made a career as a curmudgeon, a writer with a big personality who always kept distance between himself and his public. He was famous for his motto “nothing matters” and was known as “Bitter Bierce.” Even his popular short stories, based on his experiences of the Civil War (see the classic “An Occurrence at Owl Creek Bridge”), were never autobiographical, never meant to bring readers closer to the man. He publicly attacked friends, employers, and of course, other writers. (Bierce had a literary run-in with Oscar Wilde once after the latter declared satire to be “as sterile as it is shameful, and as impotent as it is insolent.” Bierce responded in print with a torrent of insults, calling Wilde “a gawky gowk,” a “dunghill he-hen,” and the “littlest and looniest of a brotherhood of simpletons” who had “the divine effrontery to link his name with those of Swindburne, Rosetti and Morris.”) How could someone who addressed his book to “those…enlightened souls who prefer dry wines to sweet, sense to sentiment, wit to humor and clean English to slang” be taken all that seriously, especially by 21st-century readers? Today, The Devil’s Dictionary comes off as smart but smug. Who was Ambrose Bierce to pronounce such judgments on humanity?
more from Stefany Anne Golberg at The Smart Set here.
Most of what we associate with Victorian art is condensed into this picture: story-telling, social nuance, naturalism down to the last detail (“all the red-headed boys in Finchley” took turns sitting for the tousled ginger-haired child, for example, who appears as a mere fragment), humour (the cabbages dangling from the boat’s edge) balancing sentimentality, with the whole animated by vivid, piercing colour – the woman’s brilliant bonnet ribbon, fuchsia, crimson, mauve, magenta, fluttering across the picture; the deep maroon skeins of the deck rope – set against the grey wintry light and swell of a dull green sea. For the first time since this painting left Brown’s studio in 1855 it is shown here alongside a delicious preparatory oil sketch that reveals significant differences: the faces are finely featured, delicate as ivory and porcelain, rather than weather-beaten; textural details – an elaborate green and red embroidered shawl rather than the plain grey, for example – give a sumptuous surface sheen redolent of a Flemish miniature.
more from Jackie Wullschlager at the FT here.
And some time make the time to drive out west
Into County Clare, along the Flaggy Shore,
In September or October, when the wind
And the light are working off each other
So that the ocean on one side is wild
With foam and glitter, and inland among stones
The surface of a slate-grey lake is lit
By the earthed lightning of a flock of swans,
Their feathers roughed and ruffling, white on white,
Their fully grown headstrong-looking heads
Tucked or cresting or busy underwater.
Useless to think you'll park and capture it
More thoroughly. You are neither here nor there,
A hurry through which known and strange things pass
As big soft buffetings come at the car sideways
And catch the heart off guard and blow it open.
by Seamus Heaney
from The Spirit Level
publisher: Farrar, Straus and Giroux, 1996
Hals was an unusual artist in that, especially in the first half of his career, he was able to paint, with little or no coyness, people grinning, or being plain happy. The relative scarcity in the history of painting of people giggling or looking like they have just said or heard something tickling indicates how hard it must be for a painter to bring off such a thing. Hals’s images of laughter and mirth come across as being the underpinning of his approach. It is as if his work is based on a philosophical position, and he is saying, “We are alive, so how can we not be cheerful?” Far from all his people are effervescent. He was hardly a painter propounding a thesis. As Seymour Slive, our foremost authority on the artist, has suggested, Hals seems to have taken the key to each of his pictures from the nature of his encounter with the sitter. (Slive’s writings on Hals have the same warmth, directness, energy, and clarity that rise from the paintings.) The experience of the 1989 retrospective, which was largely Slive’s work and which can almost be recaptured in its catalog, where the reproductions are large and good, is that we are encountering a storehouse of subtle moods and expressions.
more from Sanford Schwartz at the NYRB here.