by Brooks Riley
Category: Recommended Reading
Sunday, June 12, 2016
How to Understand ISIS
Malise Ruthven in The New York Review of Books:
The extreme jihadists, of course, are now mainly drawn to the so-called caliphate of ISIS, also known as Daesh. While several books have already charted the rise of ISIS out of the chaos of the 2003 US-led invasion of Iraq, in ISIS: A History, Fawaz Gerges joins Lynch in explaining its provenance more specifically as a direct consequence of the sectarian feelings the invasion unleashed, for which America must bear responsibility:
By destroying state institutions and establishing a sectarian-based political system, the 2003 US-led invasion polarized the country along Sunni-Shia lines and set the stage for a fierce, prolonged struggle driven by identity politics. Anger against the United States was also fueled by the humiliating disbandment of the Iraqi army and the de-Baathification law, which was first introduced as a provision and then turned into a permanent article of the constitution.
In his well-researched and lucidly argued text, Gerges shows how the US de-Baathification program, combined with the growing authoritarianism and exclusion of Sunnis under Prime Minister Nuri al-Maliki, provided fertile conditions for the emergence of ISIS out of al-Qaeda, first under the brutal leadership of Abu Musab al-Zarqawi and then under his even more extreme successor, the self-styled caliph Abu Bakr al-Baghdadi. Al-Baghdadi is an evident fraud whose claim to legitimacy by virtue of descent from the Prophet’s tribe Gerges discredits on genealogical grounds.
De-Baathification, based on the American envoy Paul Bremer’s foolish analogy with the postwar denazification of Germany, had deprived the country of the officer class and administrative cadres that had ruled under Saddam Hussein, leaving the field to sectarian-based militias. As Gerges rightly observes, Baathism as practiced in Iraq and Syria was “less of a coherent ideology than a hizb al-Sulta, a ruling party that distributed rewards to stakeholders based on loyalty to the head of the party.” In view of the absence of ideological content, it was hardly surprising that disenfranchised former officers of Saddam Hussein’s army, facing exclusion from Maliki’s Shia-dominated government, should have migrated to the militant version of Sunnism Gerges calls Salafi-jihadism.
In analyzing ISIS’s success, Gerges points to the legacy of Paul Bremer: some 30 percent of the senior figures in ISIS’s military command are former army and police officers from the disbanded Iraqi security forces. It was the military expertise of these men that transformed the Sunni-based insurgent movement of al-Qaeda in Iraq into ISIS, “an effective fighting machine, combining urban guerilla warfare and conventional combat to deadly effect.”
More here.
Nothing Inorganic
Mark Noble in The LA Review of Books:
THERE IS A MOMENT in Henry David Thoreau’s Journal that has always bothered me. It’s the middle of August 1851, and Thoreau begins a desultory afternoon entry with regrets about the finitude of human perspectives. Long hikes require so much gear, we cannot migrate so easily as birds, we are not everywhere at home like bugs. So he concedes it’s often easier, and perhaps no less profitable, to just stay in and record events of the mind:
As travellers go round the world and report natural objects & phenomena—so faithfully let another stay at home and report the phenomena of his own life. Catalogue stars—those thoughts whose orbits are as rarely calculated as comets. It matters not whether they visit my mind or yours—whether the meteor falls in my field or in yours—only that it came from heaven.
If the mind is like the sky, then astronomy legitimates introspection. Mental landscapes compel attention just as natural landscapes do. But what authorizes this analogy also effaces the idea that one’s thoughts could be one’s own. Maybe some thoughts are as luminous as stars, but are they also as remote? Can Thoreau really mean that the exteriority of a thought, or even its celestial origin, so utterly trivializes the idea that thoughts belong to anyone in particular? In the very moment we’re granted permission to indulge the life of the mind, we’re also dispossessed of it. If you would presume to have your own thoughts, he seems to argue, then you should search the night sky in hopes of tracing their ancient patterns.
Few studies have illuminated both the challenges and the exhilarations of this dispossession as powerfully as Branka Arsić’s new book, Bird Relics: Grief and Vitalism in Thoreau, which reorients our understanding of Thoreau’s materialist vitalism. Arsić’s reading of both canonical texts and understudied fragments uncovers a radical philosophy of life — a vibrant ontology in which writing about what generates our experience also means blurring conventional distinctions between the realistic and the fantastic, animate bodies and inanimate ones, what it means to live and what it means to die.
More here.
‘In Praise of Forgetting,’ by David Rieff
Gary J. Bass in the New York Times Book Review:
“It was like the sound of rain, the sound of firebombs dropping,” Keiko Utsumi remembers. She is an elderly, dignified Japanese woman, retired as a nurse and a midwife, impeccably dressed in a beige linen blazer in the sweltering Tokyo summer heat. Late in World War II, during the spring of 1945, she was 16 years old, put to work at a military factory in the port city of Yokohama, just south of Tokyo. During one of the United States’ incendiary bombing raids, she recalls huddling in a bomb shelter all night, terrified, watching the inferno of wooden houses all around. When she emerged into a scorched wasteland the next morning, with the ground so hot it melted her shoes, she saw the dead: “They were all black, all burned.”
Seventy years after the end of the war, Utsumi met me in central Tokyo last August to tell her story. Remarkably, she had never discussed her terrible experiences with anyone. “When I was leaving the house this morning,” she said, “and told my son I’d be in an interview about the war, my son asked, ‘You were in the war?’ ”
This kind of stoic quietude may seem odd, even unhealthy, to Americans, accustomed to ventilating the most mundane experiences, with no incident too banal to be rehashed. But respect for such forbearance is at the heart of David Rieff’s insightful and humane new book.
More here.
THE MISTRUST OF SCIENCE
The following was delivered as the commencement address at the California Institute of Technology, on Friday, June 10th.
Atul Gawande in The New Yorker:
If this place has done its job—and I suspect it has—you’re all scientists now. Sorry, English and history graduates, even you are, too. Science is not a major or a career. It is a commitment to a systematic way of thinking, an allegiance to a way of building knowledge and explaining the universe through testing and factual observation. The thing is, that isn’t a normal way of thinking. It is unnatural and counterintuitive. It has to be learned. Scientific explanation stands in contrast to the wisdom of divinity and experience and common sense. Common sense once told us that the sun moves across the sky and that being out in the cold produced colds. But a scientific mind recognized that these intuitions were only hypotheses. They had to be tested.
When I came to college from my Ohio home town, the most intellectually unnerving thing I discovered was how wrong many of my assumptions were about how the world works—whether the natural or the human-made world. I looked to my professors and fellow-students to supply my replacement ideas. Then I returned home with some of those ideas and told my parents everything they’d got wrong (which they just loved). But, even then, I was just replacing one set of received beliefs for another. It took me a long time to recognize the particular mind-set that scientists have. The great physicist Edwin Hubble, speaking at Caltech’s commencement in 1938, said a scientist has “a healthy skepticism, suspended judgement, and disciplined imagination”—not only about other people’s ideas but also about his or her own. The scientist has an experimental mind, not a litigious one.
More here.
We tend to be cooperative—unless we think too much
Matthew Hutson in Nautilus:
Many people cheat on taxes—no mystery there. But many people don’t, even if they wouldn’t be caught—now, that’s weird. Or is it? Psychologists are deeply perplexed by human moral behavior, because it often doesn’t seem to make any logical sense. You might think that we should just be grateful for it. But if we could understand these seemingly irrational acts, perhaps we could encourage more of them. It’s not as though people haven’t been trying to fathom our moral instincts; it is one of the oldest concerns of philosophy and theology. But what distinguishes the project today is the sheer variety of academic disciplines it brings together: not just moral philosophy and psychology, but also biology, economics, mathematics, and computer science. They do not merely contemplate the rationale for moral beliefs, but study how morality operates in the real world, or fails to. David Rand of Yale University epitomizes the breadth of this science, ranging from abstract equations to large-scale societal interventions. “I’m a weird person,” he says, “who has a foot in each world, of model-making and of actual experiments and psychological theory building.”
In 2012 he and two similarly broad-minded Harvard professors, Martin Nowak and Joshua Greene, tackled a question that exercised the likes of Thomas Hobbes and Jean-Jacques Rousseau: Which is our default mode, selfishness or selflessness? Do we all have craven instincts we must restrain by force of will? Or are we basically good, even if we slip up sometimes? They collected data from 10 experiments, most of them using a standard economics scenario called a public-goods game. Groups of four people, either American college students or American adults participating online, were given some money. They were allowed to place some of it into a pool, which was then multiplied and distributed evenly. A participant could maximize his or her income by contributing nothing and just sharing in the gains, but people usually gave something. Despite the temptation to be selfish, most people showed selflessness.

This finding was old news, but Rand and his colleagues wanted to know how much deliberation went into such acts of generosity. So in two of the experiments, subjects were prodded to think intuitively or deliberately; in two others, half of the subjects were forced to make their decision under time pressure and half were not; and in the rest, subjects could go at their own pace and some naturally made their decisions faster than others. If your morning commute is any evidence, people in a hurry would be extra selfish. But the opposite was true: Those who responded quickly gave more. Conversely, when people took their time to deliberate or were encouraged to contemplate their choice, they gave less.
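The payoff logic of that public-goods game is simple enough to sketch in a few lines of Python. The snippet below is a minimal illustration, not the researchers' actual protocol: the endowment of 10, the multiplier of 2, and the function name are hypothetical choices made for the example. It shows why contributing nothing maximizes an individual's take even though everyone is better off under full cooperation.

def public_goods_payoffs(contributions, endowment=10, multiplier=2):
    """Final income for each player after one round of a public-goods game."""
    pool = sum(contributions) * multiplier  # pooled money is multiplied
    share = pool / len(contributions)       # and distributed evenly
    # Each player keeps whatever was not contributed, plus an equal share.
    return [endowment - c + share for c in contributions]

# Full cooperation: four players each contribute their whole endowment.
print(public_goods_payoffs([10, 10, 10, 10]))  # [20.0, 20.0, 20.0, 20.0]

# A lone free rider contributes nothing and does best individually.
print(public_goods_payoffs([0, 10, 10, 10]))   # [25.0, 15.0, 15.0, 15.0]

With these numbers, each dollar a player contributes returns only fifty cents to that player (a dollar doubled, then split four ways), so defection always pays individually. That is exactly the tension between self-interest and group benefit that the time-pressure manipulations were probing.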
More here.
Fleet Foxes – Mykonos
viktor korchnoi (1931 – 2016)
jerome bruner (1915 – 2016)
Sunday Poem
The Traveler
I
Among the quiet people of the frost,
I remember an Eskimo
walking one evening
on the road to Fairbanks.
II
A lamp full of shadows burned
on the table before us;
the light came as though from far off
through the yellow skin of a tent.
III
Thousands of years passed.
People were camped on the bank
of a river, drying fish
in the sun. Women bent over
stretched hides, scraping
in a kind of furry patience.
There were long hunts through
the wet autumn grass,
meat piled high in caches –
a red memory against whiteness.
IV
We were away for a long time.
The footsteps of a man walking alone
on the frozen road from Asia
crunched in the darkness
and were gone.
by John Haines
from The Owl in the Mask of the Dreamer
Graywolf Press, 1993
Saturday, June 11, 2016
Economics Struggles to Cope With Reality
Noah Smith in Bloomberg View:
There are basically four different activities that all go by the name of macroeconomics. But they actually have relatively little to do with each other. Understanding the differences between them is helpful for understanding why debates about the business cycle tend to be so confused.
The first is what I call “coffee-house macro,” and it’s what you hear in a lot of casual discussions. It often revolves around the ideas of dead sages — Friedrich Hayek, Hyman Minsky and John Maynard Keynes. It doesn’t involve formal models, but it does usually contain a hefty dose of political ideology.
The second is finance macro. This consists of private-sector economists and consultants who try to read the tea leaves on interest rates, unemployment, inflation and other indicators in order to predict the future of asset prices (usually bond prices). It mostly uses simple math, though advanced forecasting models are sometimes employed. It always includes a hefty dose of personal guesswork.
The third is academic macro. This traditionally involves professors making toy models of the economy — since the early ’80s, these have almost exclusively been DSGE models (if you must ask, DSGE stands for dynamic stochastic general equilibrium). Though academics soberly insist that the models describe the deep structure of the economy, based on the behavior of individual consumers and businesses, most people outside the discipline who take one look at these models immediately think they’re kind of a joke. They contain so many unrealistic assumptions that they probably have little chance of capturing reality. Their forecasting performance is abysmal. Some of their core elements are clearly broken. Any rigorous statistical tests tend to reject these models instantly, because they always include a hefty dose of fantasy.
The fourth type I call Fed macro. The Federal Reserve uses an eclectic approach, involving both data and models. Sometimes the models are of the DSGE type, sometimes not. Fed macro involves taking data from many different sources, instead of the few familiar numbers like unemployment and inflation, and analyzing the information in a bunch of different ways. And it inevitably contains a hefty dose of judgment, because the Fed is responsible for making policy.
How can there be four very different activities that all go by the same name, and all claim to study and understand the same phenomena?
More here.
Who Rules?
Richard Marshall interviews David Estlund in 3:AM Magazine:
3:AM: You’ve defended democracy from the attack that it is the rule of the know-nothings with what you call ‘epistemic proceduralism.’ Before we look at your defence, could you say something about the attack? On the face of it, it does seem mad that experts aren’t the people we go to to govern. After all, we wouldn’t want a non-expert dentist, so why not use the same approach to dealing with problems of government? What’s the problem with just ensuring that decisions being made are good decisions by handing power over to the experts?
DE: That’s exactly the question that motivated my work on democracy (as you know), and of course it’s the ancient challenge to democracy stemming from Plato. I didn’t find the modern idea very satisfying—that we could answer that challenge by pointing to a right of the people to rule themselves. That would have the advantage of explaining why the people get to rule even if they aren’t good at it. The right to rule oneself individually doesn’t seem (to us moderns) to depend on whether we’d be good at it, so this might seem like an extension. But the analogy with a (say, Millian) right to self-rule, interpreted as individual immunity from interference in self-regarding choices, is very weak. When you “rule” as a member of the democratic people you contribute to coercing others, not just yourself. That’s precisely the limit on the Millian idea of individual self-rule. So, I didn’t see that broad approach as an adequate answer to the ancient question: if your political decisions will affect (and even coerce) the prospects and choices of others, why should you get to do that even if you’re not good at it? Plato’s challenge is powerful.
So, we have to confront the possibility that ruling ought to be done by those who can actually do it well (though I reject it in the end). I find that students, at least, squirm at the very idea that some might be able to rule better than others, and yet they nod happily at the suggestion that some are much worse than others. So, since the stakes of political decision are so very high, why shouldn’t rule be by the much-less-bad? I came to think that an important key lies in the fact that, even if we agree that some would be better and some worse at ruling justly and well, we are very unlikely to agree on who is in which category. It would be one thing if all decent points of view did agree, but that’s just not plausible. The problem here is a moral one, not one about how to keep the dissidents in line. So, on one hand, it’s not plausible that the people simply have a right to collective self-rule even though their acts will momentously affect and interfere with others against their will. On the other hand, and here we push back against Plato, there is no strong reason to think that someone’s being correct about what should be done is enough to justify their having the power to impose it on others. What’s driving things, on this telling, is not a positive right of self-rule but some sort of right (hopefully defeasible!) not to be ruled, wisely or otherwise, by others. While the ancient puzzle is first raised by pointing to ignorance of the masses, it turns out that the moral problem might not mainly be about their ignorance. After all, there is still a problem for rule by the non-ignorant. So, at this point, an initial answer—err, question—to your question of why we shouldn’t be ruled by the experts is roughly: they might be correct, but what makes them boss?
More here.
Interview with Steven Strogatz
Elena Holodny in Business Insider:
Elena Holodny: What's interesting in chaos theory right now?
Steven Strogatz: I'm often very interested in whatever my students get interested in. I primarily think of myself as a teacher and a guide. I try to help them – especially my Ph.D. students – become the mathematicians they're trying to become. The answer often depends on what they want to do.
In broad terms, the question of how order emerges out of chaos. Even though we talk about it as “chaos theory,” I'm really more interested in the orderly side of nature than the chaotic side. And I love the idea that things can organize themselves. Whether those things are our system of morality or our universe or our bodies as we grow from a single cell to the people we eventually become. All this kind of unfolding of structure and organization all around us and inside of us, to me, is inspiring and baffling. I live for that kind of thing, to try to understand where these patterns come from.
More here.
‘The Big Picture,’ by Sean Carroll
Anthony Gottlieb in the New York Times:
The physical world is “largely illusory,” an editorial in The New York Times announced on Nov. 25, 1944. Wishful thinking on a depressing day? No. Had The Times gone mad? Not quite. It was endorsing the ideas of Sir Arthur Eddington, an eminent British astronomer and popularizer of science, who had just died.
Eddington began his best-known book, “The Nature of the Physical World,” by explaining that he had written it at two tables, sitting on two chairs and with two pens. The first table was the familiar kind: It was colored, substantial and relatively long-lasting. The second was what he called a “scientific table,” a colorless cloud of evanescent electric charges that is “mostly emptiness.” Likewise the two chairs and two pens. Only the scientific objects were really there, according to Eddington. Hence the idea that our familiar world is a deception on a grand scale.
Coming to terms with science is not getting any easier. Today’s popularizers face two challenges, both of which are admirably met by Sean Carroll, a theoretical physicist at the California Institute of Technology, in his new book, “The Big Picture: On the Origins of Life, Meaning, and the Universe Itself.” First, there is more to explain than ever before, as the sciences extend their embrace to a widening range of phenomena. Fortunately, Carroll is something of a polymath. His accounts of the latest thinking about microbiology or information theory are as adroit as his exploration of the links between entropy and time or his elucidation of Bayesian statistics.
The second challenge for today’s explainers is that the theories are getting weirder. Einstein used to worry that, according to quantum mechanics, God seems to be playing dice with the universe. Now it appears that he has put a stage magician in charge of the casino.
More here.
Oneida: From Free Love Utopia to the Well-Set Table
Lorraine Berry at The Guardian:
In 19th-century America, a number of utopian communities, oblivious to the defeatist etymology of the word utopia (Greek for “not” plus “place”, or “no-place”), were established, mostly throughout the north-east. Amos Bronson Alcott (father of Louisa May), Robert Owen and a group of transcendentalists all tried their hands at creating separate communities of peace and understanding. All of these efforts failed fairly quickly.
The exception was the Oneida community, founded by John Humphrey Noyes, in the Leatherstocking region of central New York state. The region had already surrendered its secrets to the young Joseph Smith when he discovered the gold books of the angel Moroni buried in a drumlin near Palmyra, and founded the Church of Jesus Christ of Latter-day Saints. Oneida, which is located 80 miles to the east, provided a home for Noyes’s nascent Society of Inquiry when its members fled from Putney, Vermont, in the 1840s after Noyes’s doctrine of “complex marriage” offended the local townspeople.
more here.
a sobering look at Palestinian life and resistance in the West Bank
The way to the spring…is blocked. At least that’s the case for the Palestinians of Nabi Saleh, a small village northwest of Ramallah. The expansion-minded residents of a nearby Jewish settlement, with the aid of the Israeli army that occupies the West Bank, have taken over the town’s water source, which Palestinian farmers depended on to irrigate their fields.
Ben Ehrenreich, an award-winning writer based in Los Angeles, discovered as much when he moved to the West Bank, which Israel captured from Jordan in a war with its Arab neighbors in 1967. Ehrenreich, who lived in that troubled land intermittently between 2011 and 2014 (in part, reporting for Harper’s and the New York Times Magazine), demonstrates that Nabi Saleh is no anomaly. “The Way to the Spring: Life and Death in Palestine” emerges as a sobering, iconoclastic “collection of stories about resistance, and about people who resist,” marred slightly by the author’s unwillingness to subject Palestinian militant activity, which has often included terrorism, to moral scrutiny.
“The spring is the face of the occupation,” Bassem Tamimi of Nabi Saleh tells the author. Every Friday, the villagers, joined by international and Israeli solidarity activists, march toward it in a regularized act of protest. “And every Friday Israeli soldiers beat them back with tear gas, stun grenades, and rubber-coated bullets,” observes the author. Afterward, groups of male youths situated some distance away hurl stones at the soldiers, who are generally beyond their reach.
more here.
‘A Walk in the Park,’ by Travis Elborough
Andrew Martin at The Financial Times:
Travis Elborough is the affectionate chronicler of faded Englishness. He has been described as “the hipster Bill Bryson”, and it is a mystery to some of us why he is not as well-known as Bryson. His books have so far covered the Routemaster bus, the long-playing record, and the sale of London Bridge (“the world’s largest antique”, as he put it) to a Texan millionaire.
His writing combines subtle drollery with a fantastical, Monty Python-ish strain. Early in his narrative, he takes an excursion to Versailles, an important site in the evolution of the park. Louis XIV, the Sun King — a “control freak in modern parlance” — arranged the planting around a central axis whose focal point was his own bedroom, and the gardens were micromanaged according to his whim: “The fountains were magnificent features. But there was only enough water to keep the ones closest to the palace at a constant spurt. The others were switched off and on as the king approached and departed, his movements closely monitored and signalled to staff, with flags waved and whistles blown in a complex system of field telegraph.”
more here.
Bill Clinton, Natasha Mundkur, Billy Crystal, and Rabbi Michael Lerner eulogize Muhammad Ali
The Indelible Stain of Donald Trump
Peter Wehner in The New York Times:
Peter Wehner (@Peter_Wehner), a senior fellow at the Ethics and Public Policy Center, served in the last three Republican administrations and is a contributing opinion writer.
Mr. Trump knows his target audience, which explains why, beginning the morning of the Indiana primary on May 3 (the day he became the de facto nominee), he has — among other in-the-gutter moments — implied that Senator Ted Cruz’s father was implicated in the assassination of President Kennedy; insinuated that Vince Foster, a friend of the Clintons who was White House deputy counsel, was murdered (five official investigations determined that Mr. Foster committed suicide); engaged in a racially tinged attack on Gonzalo Curiel, a district court judge presiding over a fraud lawsuit against Trump University; and expressed doubt that a Muslim judge could remain neutral in the case. This is conspiratorial craziness and rank racism — and all of it has happened after we were told Mr. Trump would raise his game.

The surprise is that so many Republicans are now expressing consternation at what Mr. Trump is doing. Has any recent presidential candidate ever advertised quite as openly as Mr. Trump the kind of vicious attacks he’d engage in? We were warned in neon lights what was coming. The idea that he will now engage in a “course correction” — that he will flip a switch and transform himself into a decent and dignified man — is laughable. Mr. Trump has repeatedly stated that he won’t change his approach. (“You win the pennant and now you’re in the World Series — you gonna change?”) In this one area, Republicans should take him at his word.
When a narcissist like Mr. Trump is victorious, as he was in the Republican primary, and when he has done it on his terms, he’s not going to listen to outside counsel from people who think they can change the patterns of a lifetime. Republicans have not changed Mr. Trump for the better; he has changed them for the worse. So here we are, with Republicans who lined up behind Mr. Trump now afraid of being led off a high cliff.

If the prospect of a November shellacking isn’t enough to unnerve these Republicans, there’s also this to factor in: What we are talking about is potential generational damage to the Republican Party. Consider this historical comparison: In 1956 the Republican nominee, Dwight D. Eisenhower, won nearly 40 percent of the black vote. In 1960, Richard Nixon won nearly a third. Yet in 1964, in large part because of his opposition to the Civil Rights Act, Barry Goldwater (who was no racist) won only 6 percent. More than a half-century later, that figure has remained low.

Mr. Trump — through his attacks against Hispanics that began the day he announced his candidacy — is doing with Hispanics today what Senator Goldwater did with black voters in the early 1960s. The less resistance there is to Mr. Trump now, the more political damage there will be later. The stain of Trump will last long after his campaign. His insults, cruelty and bigotry will sear themselves into the memory of Americans for a long time to come, especially those who are the targets of his invective.
Mr. Trump is what he is — a malicious, malignant figure on the American political landscape. But Republican primary voters, in selecting him to represent their party, and Republican leaders now rallying to his side, have made his moral offenses their own.
More here.
Saturday Poem
by Nils Peterson
from Walk to the Center Things
