The War with Radical Islam

Jeffrey D. Sachs in Project Syndicate:

French Prime Minister Manuel Valls was not speaking metaphorically when he said that France is at war with radical Islam. There is, indeed, a full-fledged war underway, and the heinous terrorist attacks in Paris were part of it. Yet, like most wars, this one is about more than religion, fanaticism, and ideology. It is also about geopolitics, and its ultimate solution lies in geopolitics as well.

Crimes like those in Paris, New York, London, and Madrid – attacks on countless cafes, malls, buses, trains, and nightclubs – affront our most basic human values, because they involve the deliberate murder of innocents and seek to spread fear throughout society. We are wont to declare them the work of lunatics and sociopaths, and we feel repulsed by the very idea that they may have an explanation beyond the insanity of their perpetrators.

Yet, in most cases, terrorism is not rooted in insanity. It is more often an act of war, albeit war by the weak rather than by organized states and their armies. Islamist terrorism is a reflection, indeed an extension, of today’s wars in the Middle East. And with the meddling of outside powers, those wars are becoming a single regional war – one that is continually morphing, expanding, and becoming increasingly violent.

From the jihadist perspective – the one that American or French Muslims, for example, may pick up in training camps in Afghanistan, Syria, and Yemen – daily life is ultra-violent.

More here. [Thanks to Syed Tasnim Raza.]

An Economics Lesson: Teaching for Disciplinary Understanding

S. Abu Rizvi in Education Week:

Fifteen years ago, my colleagues and I observed that most economics undergraduates we taught quickly lost a third to half of their knowledge. “A” students turned into “C” students in a matter of weeks, right after final exams. For those of us who wanted disciplinary understanding to be useful to students well after they left college, this and similar findings were sobering. They spurred us to revamp how and what we teach while keeping an eye on why: to prepare students to use their understanding of the disciplines in other times and places.

Let's begin where we want to end up, with an example of the successful and flexible use of disciplinary understanding. As we consider the activities of two professional economists, Atif Mian and Amir Sufi, we should keep in mind that the concepts they employ are taught in introductory economics classes.

Mian and Sufi's intervention arose from the Great Recession at the end of the last decade. Economic turmoil left many homeowners “underwater,” with homes worth less than what was owed on their mortgages. Federal debt relief was one policy under consideration. But Timothy Geithner, the Secretary of the Treasury at the time, claimed that the impact of relief on the economy would be tiny. By freeing overburdened homeowners to spend, even a large program of $700 billion “would have increased annual personal consumption by just 0.1 to 0.2 percent.” Mian and Sufi thought this figure was far too low. They used the concept of the marginal propensity to consume (MPC), “a very well-researched question,” to show that relief on this scale would have had an impact six to thirteen times higher than Geithner claimed. Their argument, made at the right time, could have carried the day against Geithner's position.
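
To see how the magnitudes line up, here is a rough back-of-the-envelope sketch of the MPC reasoning described above. The $700 billion program and the 0.1 to 0.2 percent claim come from the passage itself; the consumption base of roughly $10.5 trillion and the MPC of 0.18 are illustrative assumptions, not figures from the article.

    # A rough back-of-the-envelope check of the MPC argument above.
    # Assumed, not from the article: annual US personal consumption of roughly
    # $10.5 trillion around 2009-2010, and an illustrative MPC of 0.18 for
    # heavily indebted households.

    relief = 700e9                # hypothetical debt-relief program, in dollars
    annual_consumption = 10.5e12  # assumed annual US personal consumption, in dollars

    # Geithner's claim: the program would lift consumption by only 0.1 to 0.2 percent.
    for share in (0.001, 0.002):
        implied_mpc = share * annual_consumption / relief
        print(f"a {share:.1%} boost implies an MPC of about {implied_mpc:.3f}")

    # With a larger MPC for constrained, underwater households (0.18 here is an
    # illustrative guess, not Mian and Sufi's published estimate), the same
    # program produces a much bigger boost.
    assumed_mpc = 0.18
    boost = assumed_mpc * relief
    print(f"an MPC of {assumed_mpc} implies a ${boost / 1e9:.0f} billion boost, "
          f"about {boost / annual_consumption:.1%} of annual consumption")

Under these assumptions, Geithner's figures imply an MPC of only about 0.015 to 0.03, while an MPC of 0.18 yields a boost of roughly one percent of annual consumption, the sort of several-fold gap the passage describes.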

More here.

Radical Linguistics in an Age of Extinction

Ross Perlin in Dissent:

Every language has a complex grammar—an almost invisible glue between words that enables meaning-making—and new vocabulary can always be borrowed or coined. Some languages may specialize in melancholy, or seaweed, or atomic structure, or religious ritual; some grammars may glory in conjugating verbs while others bristle with syntactic invention. Hawaiian has just thirteen phonemes (meaningful sounds) while the Caucasian language Ubykh, extinct as of 1992, had eighty-four. “English” (with all its technical varieties) is said to be adding up to 8,500 words per year, more than many Australian aboriginal languages have to begin with. But these are surface inequalities—questions of personality.

Perceptions of linguistic superiority or inferiority are instead based on power, class, and social status. Historically, it was languages that were swept in with strong political, economic, or religious backing—Latin, Greek, Sanskrit, Hebrew, Arabic, Persian, and Chinese in the Eurasian core—that were held to be the oldest, the holiest, and the most perfect in structure, their “classical” status cemented by the received weight of canonical tradition. By the nineteenth century, the imperial nation-states of Europe were politely shunting them off to the museum and imposing their own equivalents: newly standardized “modern” languages like English and French. Johann Gottfried Herder’s Treatise on the Origin of Language (1772) inspired would-be nation-builders to document, restore, and develop their own neglected vernaculars. One by one, the nationalists of Central and Eastern Europe adopted Herder’s program, as has virtually every modern nation-state sooner or later: warding off imperial languages from without by establishing a dominant standardized language within, at the expense of minority languages and local varieties.

More here.

Saturday, January 17, 2015

The History Manifesto

A Roundtable on The History Manifesto: The Role of History and the Humanities in a Digital Age from Heyman Center/Society of Fellows on Vimeo.

From the introduction of The History Manifesto, “a call to arms to historians and everyone interested in the role of history in contemporary society. Leading historians David Armitage and Jo Guldi identify a recent shift back to longer-term narratives, following many decades of increasing specialization, which they argue is vital for the future of historical scholarship and how it is communicated.”

A spectre is haunting our time: the spectre of the short term.

We live in a moment of accelerating crisis that is characterised by the shortage of long-term thinking. Even as rising sea-levels threaten low-lying communities and coastal regions, the world’s cities stockpile waste, and human actions poison the oceans, earth, and groundwater for future generations. We face rising economic inequality within nations even as inequalities between countries abate while international hierarchies revert to conditions not seen since the late eighteenth century, when China last dominated the global economy. Where, we might ask, is safety, where is freedom? What place will our children call home? There is no public office of the long term that you can call for answers about who, if anyone, is preparing to respond to these epochal changes. Instead, almost every aspect of human life is plotted and judged, packaged and paid for, on time-scales of a few months or years. There are few opportunities to shake those projects loose from their short-term moorings. It can hardly seem worth while to raise questions of the long term at all.

In the age of the permanent campaign, politicians plan only as far as their next bid for election. They invoke children and grandchildren in public speeches, but electoral cycles of two to seven years determine which issues prevail. The result is less money for crumbling infrastructure and schools and more for any initiative that promises jobs right now. The same short horizons govern the way most corporate boards organise their futures. Quarterly cycles mean that executives have to show profit on a regular basis. Long-term investments in human resources disappear from the balance sheet, and so they are cut. International institutions, humanitarian bodies, and non-governmental organisations (NGOs) must follow the same logic and adapt their programmes to annual or at most triennial constraints. No one, it seems, from bureaucrats to board members, or voters and recipients of international aid, can escape the ever-present threat of short-termism.

There are individuals who buck the trend, of course. In 1998, the Californian cyber-utopian Stewart Brand created the Long Now Foundation to promote consciousness of broader spans of time. ‘Civilization is revving itself into a pathologically short attention span’, he wrote: ‘Some sort of balancing corrective to the short-sightedness is needed – some mechanism or myth that encourages the long view and the taking of long-term responsibility, where “the long term” is measured at least in centuries.’ Brand’s charismatic solution to the problem of short-termism is the Clock of the Long Now, a mechanism operating on a computational span of 10,000 years designed precisely to measure time in centuries, even millennia.

More here.

What SAT Critics Miss

Jeffrey Aaron Snyder reviews Lani Guinier's The Tyranny of Meritocracy: Democratizing Higher Education in America in Boston Review (Image: zaveqna):

“The world . . . provides us with more than one correct answer to most questions,” Guinier says, and nods to Bard College President Leon Botstein, who tells us that no professional “pursues her vocation” by choosing the “right” answer from “a set of prescribed alternatives that trivialize complexity and ambiguity.” Incisive points, to be sure, but there are alternatives to the multiple-choice format. Many standardized tests, for instance, now include “open-response” items, which require students to fashion their own answers rather than simply choosing the one “correct” answer from a ready-made list. In my view, however, the limitations of standardized testing with respect to prefabricated questions are far more important than the shortcomings associated with prefabricated answers. The ability to formulate a significant question is a hugely important skill, especially for college-level work, and one that no standardized test even attempts to measure. Standardized tests, then, too often reinforce the dreary lesson taught by many schools that it is the job of students to answer rather than to ask questions.

I never thought I would feel compelled to defend the integrity of the College Board or the number-crunchers at U.S. News and World Report, but a few corrections of the kind of fanciful exaggerations favored by anti-testing crusaders are in order. It has been over twenty years since the SAT ceased to be an acronym, but it seems the SAT will always be known as the Scholastic Aptitude Test in the popular imagination, forever associated with the attempt to measure native intellectual ability. Guinier only reinforces this common misconception, stating that the SAT “doesn’t even pretend to measure achievement.” But as the College Board website explains, the SAT “doesn’t test logic or abstract reasoning.” Rather, “it tests the skills you’re learning in school: reading, writing and math.” In other words, today’s SAT is meant to be an achievement rather than an aptitude test.

Guinier, like many critics of the SAT, is dismissive of the test’s predictive power, claiming that the correlation between SAT scores and first-year college grade-point average is “very, very slight.” In fact, most studies put the figure in the neighborhood of .45, which is a shade higher than the correlation between rates of smoking and the incidence of lung cancer. It is also only a tad lower than the correlation between cumulative high school GPA and first-year college GPA.

Finally, according to Guinier, the U.S. News annual college rankings “rely heavily on SAT scores for their calculations.” Admissions test scores actually account for just over 8 percent of a school’s ranking.

More here.

On Edgar Allan Poe

Marilynne Robinson in the NYRB (photo from Enoch Pratt Free Library, Baltimore):

In the last year of his life he wrote a prose poem, Eureka, which would have established this fact beyond doubt—if it had not been so full of intuitive insight that neither his contemporaries nor subsequent generations, at least until the late twentieth century, could make any sense of it. Its very brilliance made it an object of ridicule, an instance of affectation and delusion, and so it is regarded to this day among readers and critics who are not at all abreast of contemporary physics. Eureka describes the origins of the universe in a single particle, from which “radiated” the atoms of which all matter is made. Minute dissimilarities of size and distribution among these atoms meant that the effects of gravity caused them to accumulate as matter, forming the physical universe.

This by itself would be a startling anticipation of modern cosmology, if Poe had not also drawn striking conclusions from it, for example that space and “duration” are one thing, that there might be stars that emit no light, that there is a repulsive force that in some degree counteracts the force of gravity, that there could be any number of universes with different laws simultaneous with ours, that our universe might collapse to its original state and another universe erupt from the particle it would have become, that our present universe may be one in a series.

All this is perfectly sound as observation, hypothesis, or speculation by the lights of science in the twenty-first century. And of course Poe had neither evidence nor authority for any of it. It was the product, he said, of a kind of aesthetic reasoning—therefore, he insisted, a poem. He was absolutely sincere about the truth of the account he had made of cosmic origins, and he was ridiculed for his sincerity. Eureka is important because it indicates the scale and the seriousness of Poe’s thinking, and its remarkable integrity. It demonstrates his use of his aesthetic sense as a particularly rigorous method of inquiry.

More here.

Sins of the fathers: child sex abuse in the Catholic Church

Francis Beckett in New Humanist:

Betrayed: The English Catholic Church and the Sex Abuse Crisis (Biteback) by Richard Scorer.

The Devil’s Advocate: Child Abuse and the Men in Black (Devil’s Advocate Library) by Graham Wilmer.

Priestly sex abuse has done far more harm to the Catholic Church in the USA, Canada, Ireland and Australia than it has in Britain, which leads some British Catholics to the comforting conclusion that there is less of it here. But at least 61 Catholic priests have been convicted of sexual offences in the criminal courts of England and Wales since 1990, and there may well be more, for the church still has no single centralised record of known offenders. However, American courts award much higher sums in compensation to victims, which is why American dioceses have been ruined. And the English Catholic Church has been ruthless in its efforts to keep the lid on the scandal, to silence victims, and to protect priests who use young children for their own sexual gratification.

Over and over again, the princes of the church have silently and cynically moved a priest from one school or parish where he was discovered to be abusing children, to another where he was unknown and could find more children to abuse. Of course children were abused in many institutions, not just Catholic ones, but the fact, though Catholics refuse to face it, is that the church had a culture of abuse like no other organisation. If there was ever any doubt about that, two new books have dispelled it. Richard Scorer is a lawyer who has represented many victims of priestly sexual abuse. He has written Betrayed in clear, luminous prose, telling us only what he has heard and seen. He avoids conjecture, does not seem to be anti-Catholic and does not editorialise. The result is compulsive reading.

More here.

Selected Letters of Norman Mailer: a thrilling and revealing collection

Alexis Forss at The Guardian:

Seven years have passed since the death of Norman Mailer, and a campaign is being waged in his name on several fronts. The publication late last year of J Michael Lennon’s authorised biography asked us to contemplate what its title referred to as a double life. A series of Random House reissues shifts attention to the essays and novels. With the release of the selected letters the most congenial approach to Mailer is illuminated: one in which the works and days are understood as marching, like his “armies of the night”, in lockstep. If John Updike’s larger body of work somehow seems less of a vertiginous challenge than Mailer’s 44 books, it is because Updike’s chief legacy is his style: the profusion of opiate sentences that delivers us hit after euphoric hit. Mailer bequeathed us no style. What he wanted to do was to save our souls, and that was a battle to be fought in a variety of guises: General Marijuana, Aquarius, the Prisoner, and, of course, the Great Illeist – someone who refers to themselves in the third person – Norman Mailer himself. Selected Letters of Norman Mailer grants us access to the dressing room.

Lennon’s role as custodian of Mailer’s literary estate may seem redundant: after all, acting as his own curator was part of Mailer’s addiction to self‑dramatisation. Commencing with the taming and recontextualisation of the juvenilia and marginalia in 1959’s Advertisements for Myself, he was determined to frame his own works and set the parameters within which they were to be considered. However, his executor’s pruning proves indispensable: having given himself over to the fanatical labour of making a selection from more than 45,000 letters, Lennon presents us with 716 key missives, dating from 1940 to Mailer’s death in 2007.

more here.

‘A Voice Still Heard: Selected Essays of Irving Howe’

Franklin Foer at The New York Times:

“A Voice Still Heard” appears at a relatively inert moment in American intellectual life and, therefore, at just the right time. Though Howe’s reputation dissipated quickly after his death in 1993, he was an American Orwell: our most thrilling dissident, a socialist with conservative cultural sympathies, a scything polemicist capable of the most tender, patient literary explication. Unlike Orwell, Howe never went on great foreign adventures — there was no journalism in him. And his standing will never be buoyed by his novels, because there aren’t any. But Orwell and Howe shared a romantic vision of their chosen path in life, and it’s that marrow-deep commitment to heterodoxy that makes the current climate feel uninspired and careerist by contrast.

Howe grew up in what Paul Goodman called decent poverty in the immigrant Bronx, although his family couldn’t even afford to hold his bar mitzvah in his neighborhood synagogue. It was socialism, with its radiant dialectics and its universalist promise of shared bonds with a world beyond the ghetto, that excited his young mind. (Not that he was alone in this obsession, especially not in the outer boroughs in the thick of the Depression.) He gravitated to the figure of the warrior-essayist Leon Trotsky.

Socialism did plenty to distort his young mind.

more here.

‘When the Facts Change: Essays 1995-2010’, by Tony Judt

Mark Mazower at the Financial Times:

Tony Judt was a historian whose journalism includes some of the finest things he wrote. At the time of his death from motor neurone disease in 2010 at the age of 62, he was a fixture of the Manhattan intellectual scene and his regular platform in the New York Review of Books allowed him to excoriate the follies of politicians and pundits alike. He was no stranger to controversy. But the essays collected in When the Facts Change remind us that he was much more than a controversialist. Composed during the last 15 years of Judt’s life, they chart the gradual souring of hope across the west that took place once the cold war’s euphoric end disappeared from view, and the feel-good Clinton era gave way to George W Bush and the everlasting war on terror.

When he came to teach in New York, Judt was chiefly known for a series of scholarly works on French socialism. If one precondition for his emergence as a public intellectual was the Manhattan cultural scene, another was his own response to the fall of the Berlin Wall and its aftermath. Better and faster than anyone else, Judt realised that in order to rethink Europe’s future, it was necessary to rethink its past. In the early 1990s, he turned himself into a genuine Europeanist, forging links with scholars in central and eastern Europe, hosting conferences and seminars at which he presided with characteristically self-deprecating energy.

more here.

Saturday Poem

Gift

He said: here is my soul.
I did not want his soul
but I am a Southerner
and very polite.
I took it lightly
as it was offered. But did not
chain it down.
I loved it and tended
it. I would hand it back
as good as new.

He said: How dare you want
my soul! Give it back!
How greedy you are!
It is a trait
I had not noticed
before!

I said: But your soul
never left you. It was only
a heavy thought from
your childhood
passed to me for safekeeping.

But he never believed me.
Until the end
he called me possessive
and held his soul
so tightly
it shrank
to fit his hand.

by Alice Walker
from Her Blue Body Everything We Know
Harvest Books, 1991

Among the Disrupted: the state of culture in the digital age

Leon Wieseltier in The New York Times:

Amid the bacchanal of disruption, let us pause to honor the disrupted. The streets of American cities are haunted by the ghosts of bookstores and record stores, which have been destroyed by the greatest thugs in the history of the culture industry. Writers hover between a decent poverty and an indecent one; they are expected to render the fruits of their labors for little and even for nothing, and all the miracles of electronic dissemination somehow do not suffice for compensation, either of the fiscal or the spiritual kind. Everybody talks frantically about media, a second-order subject if ever there was one, as content disappears into “content.” What does the understanding of media contribute to the understanding of life? Journalistic institutions slowly transform themselves into silent sweatshops in which words cannot wait for thoughts, and first responses are promoted into best responses, and patience is a professional liability. As the frequency of expression grows, the force of expression diminishes: Digital expectations of alacrity and terseness confer the highest prestige upon the twittering cacophony of one-liners and promotional announcements. It was always the case that all things must pass, but this is ridiculous.

Meanwhile the discussion of culture is being steadily absorbed into the discussion of business. There are “metrics” for phenomena that cannot be metrically measured. Numerical values are assigned to things that cannot be captured by numbers. Economic concepts go rampaging through noneconomic realms: Economists are our experts on happiness! Where wisdom once was, quantification will now be. Quantification is the most overwhelming influence upon the contemporary American understanding of, well, everything. It is enabled by the idolatry of data, which has itself been enabled by the almost unimaginable data-generating capabilities of the new technology. The distinction between knowledge and information is a thing of the past, and there is no greater disgrace than to be a thing of the past. Beyond its impact upon culture, the new technology penetrates even deeper levels of identity and experience, to cognition and to consciousness. Such transformations embolden certain high priests in the church of tech to espouse the doctrine of “transhumanism” and to suggest, without any recollection of the bankruptcy of utopia, without any consideration of the cost to human dignity, that our computational ability will carry us magnificently beyond our humanity and “allow us to transcend these limitations of our biological bodies and brains. . . . There will be no distinction, post-Singularity, between human and machine.” (The author of that updated mechanistic nonsense is a director of engineering at Google.)

More here.

Friday, January 16, 2015

Culture: A scientific idea “ready for retirement”?

Alberto Acerbi over at the LSE's International Cognition and Culture Institute, with comments in the comment section by Pascal Boyer, Dan Sperber and others:

Every year the website edge.org asks its panel a general question on science and/or society. The 2014 question was: What scientific idea is ready for retirement? I have not (yet) read all the answers, but I was surprised to see that two of them, from Pascal Boyer and John Tooby, were one and the same: culture. One could take the answers as a provocation by two evolutionary psychology-minded scholars against mainstream cultural anthropology (which I'd subscribe to). However, knowing Boyer and Tooby's work, and since, when people ask me what my research is about, I tend to answer “human culture” or “cultural evolution”, I think I have to take this challenge quite seriously.

On one level, I agree completely with the answer: “culture” cannot be considered as an unproblematic explanation of any phenomenon. I was recently reflecting on the fact that, while I consider myself an atheist, I often find it unpleasant to hear – let alone pronounce – profanities. Rationally, I know that they are simply a series of sounds, but still I cannot avoid being annoyed. The imaginary naive anthropologist would say: of course, it is your culture! (I am Italian, and I received a then-standard Catholic education). But this is exactly what we want to explain: why is this specific “cultural stuff” (being bothered by profanities), and not others (say, going to church or praying), still present?

I think that every reader of this blog would agree that it is not useful to use culture as an explanation: we cannot explain X (my problematic relationship with profanities, the readiness to perceive interpersonal threats in the Southern USA, etc.) with “culture”. As Boyer writes in his answer, “that such processes could lead to roughly stable representations across large numbers of people is a wonderful, anti-entropic process that cries out for explanation”. However, I feel that this is a starting point. I would be interested in X as “cultural stuff”, and would then try to explain it. Boyer and Tooby do not seem to agree: “culture”, in their view, is not just mistakenly used as an explanation. It is not a scientific concept at all…

[Dan Sperber in the comments] Culture is a property. What property? Take all practices, artefacts, and mental states in a population over time that have some informational content. Ask how their contents are related. Well, they are all links in many causal chains, where, for an item, to be a link in such a chain is to owe some of its content to having been at least partly caused by previous links in the chain. So in a chain of perfect copying, as exists now on the internet, each new token of, say, a given youtube video owes its whole content to the token that has been copied in producing it. In most cases, however, in particular before the internet, items having informational content have a more complex informational aetiology, owing some of their content to one causal chain, some to another causal chain, and some to the more or less idiosyncratic process of their production in the mind or through the behaviour of one or several individuals. Take a very idiosyncratic item: someone’s original dream. Even that item owes quite a bit of its content to being a causal descendant of conversations, stories, images, and so on. It is cultural too. You get my point. The more an item gets its informational content from the causal chains in which it occurs, the more cultural it is. Note that to be 100% cultural in this way, it should typically owe all of its content to a single chain; otherwise the recombination itself is likely to involve some idiosyncratic construction.

More here.

Being Johnny Rotten

Wesley Stace at The Times Literary Supplement:

Lydon the narrator is endlessly self-contradictory – there is no use criticizing the book on the basis of this essential component of his character. He is also abrasive, immodest, given to outlandish claims, prone to speaking about himself in the third person (“poor old Johnny Rotten”), and either very funny or mesmerizingly humourless. He seems to will misunderstanding, purely so he can complain about it, and is equally happy to speak ill of the dead and the living in his eternal battle over the soul of the Pistols. Occasionally, as the book goes on, he picks a fight with himself just to pass the time.

The tone changes when he writes tenderly, and uxoriously, of his wife, Nora, and extended family. Libraries have been another kind of saviour and there are paeans to Dickens, Wilde, Ted Hughes, Muriel Spark and John Keats. Punk did not brush away the musical past as its publicists have claimed, and Lydon emphasizes the musical continuum, praising influential musical acts from Can and Hawkwind, through Kool & the Gang, to Duran Duran and Depeche Mode. Of the Edgar Broughton Band, he wisely notes: “I don’t expect the music would bear up too much today, but that isn’t the be-all-and-end-all”.

It’s perhaps fitting that the career of Lydon, who undoubtedly sees himself as a force of nature, took a left turn when he became the presenter of “extreme” nature programmes, including John Lydon’s Megabugs and John Lydon Goes Ape. During the filming of the latter, he finally met his match: “You can not train [gorillas], they will not have it . . . But then I’m untrainable too”.

more here.