In The Nation’s VideoNation, two videos on Pakistan: one on the country through the eyes of students at the liberal National College of Arts, the other through the eyes of students at Punjab University who are also members of the IJT, the student wing of the Jamaat-e-Islami.
The second part on Punjab University can be found here. (You can see a poster of Eqbal Ahmad, my late mentor and ardent, progressive champion of tolerance, democracy, and openness in Pakistan, in the background of one of the scenes with dissenters in Punjab University. The scene also has a heartening image of literature as a bulwark of human decency.)
Today’s find, via Mind Hacks, is an online archive at UCSD dedicated to the memory of the great Soviet neuropsychologist Alexander R. Luria. (Lots of the links are broken, though.)
Today Luria’s probably best known for the “neurographies” he wrote, like The Mind of a Mnemonist and The Man with a Shattered World, which inspired Oliver Sacks’s famous ventures in this line. But he actually made really important scientific contributions, which deserve to be remembered.
Luria began his career as a disciple of Lev Vygotsky, who had a fascinating pre-cognitive theory of how individuals acquire higher mental functions through a scaffolding provided by cultural traditions (especially language) and social interaction. Vygotskyism was an explicitly Marxist theory: it was supposed to be a scientific account of how thought arises from practice. While it is very hard to accept some of Vygotsky’s more extreme statements, there is I think a core of very real insight here, about both individual development and collective cognition, and one which moreover is fundamentally compatible with sound computational views of the mind.
To support the theory, Luria led an expedition to Uzbekistan which sought to document how the Soviet introduction of modern education and collective agriculture (!) was transforming the mentality of the natives. The resulting report — translated as Cognitive Development: Its Cultural and Social Foundations — is an astonishing mixture of fascinating experiments and conjectures, and equally fascinating displays of colonialist blindness. Most of Luria’s subjects were Uzbekistani peasants who’d been forced onto collective farms a few years earlier; a decade previously the whole province was the scene of the basmachi revolt, which was suppressed by the Red Army with the usual measures. It never crossed Luria’s mind, so far as I can tell, that a bunch of Russian academics, asking questions which clearly indicated that the Russians thought the Uzbeks were idiots, would meet with anything less than full and sincere cooperation.
From Wall Street to Washington, we’re constantly being told that the future can be forecast, that the world is knowable, and that risk can be measured and managed. Nassim Nicholas Taleb is having none of this. In his new book, The Black Swan, the finance guru and author of the surprise hit Fooled by Randomness argues that history is dominated not by the predictable but by the highly improbable — disruptive, unforeseeable events that Taleb calls Black Swans. The effects of wars, market crashes, and radical technological innovations are magnified precisely because they confound our expectations of the universe as an orderly place. In a world of Black Swans, the first step is understanding just how much we will never understand. — James Surowiecki
Wired: If Black Swans are the crucial determining events in history, why do we think we can predict anything at all?

Taleb: After they happen, in retrospect, we think that Black Swans were predictable. We think that if we can explain why something happened in the past, we can explain what will happen in the future.
Wired: But with better models and more computational power, won’t we get better at predicting Black Swans?

Taleb: We know from chaos theory that even if you had a perfect model of the world, you’d need infinite precision in order to predict future events. With sociopolitical or economic phenomena, we don’t have anything like that. And things are getting worse, not better, because the growing complexity of the world dwarfs any improvement in sophistication or computational power.
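Taleb’s point about precision can be illustrated with a toy system. The logistic map below is a standard textbook example of chaos, not anything from the interview: a minimal Python sketch in which a starting error of one part in a million swamps a “perfect model” within a few dozen iterations.

```python
# A toy illustration of sensitive dependence on initial conditions:
# the logistic map x -> r * x * (1 - x) at r = 4 is fully chaotic.
# Two trajectories, run under the *same* (perfect) model, start from
# points that differ by one part in a million.

def logistic(x, r=4.0):
    return r * x * (1.0 - x)

a, b = 0.400000, 0.400001  # nearly identical initial conditions
max_gap = 0.0
for step in range(60):
    a, b = logistic(a), logistic(b)
    max_gap = max(max_gap, abs(a - b))

# The microscopic input error grows to a macroscopic one: knowing the
# model exactly does not rescue a finite-precision measurement.
print(f"largest divergence over 60 steps: {max_gap:.3f}")
```

The exact numbers depend on floating-point details, but the qualitative outcome does not: the gap between the two trajectories grows roughly exponentially until it is as large as the system itself.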
In Harvard International Review, Alex De Waal on intervention:
However attractive it might be from a distance, actually providing physical protection for Darfurians with international troops is not feasible. And unfortunately, the clamor for UN troops has consumed most of the diplomatic energies of the United States and its allies over the last 18 months, diverting efforts from achieving a peace agreement that was within grasp a year ago but has now slipped away. And as a direct result, the existing AU troops have been left without funds, and sometimes without food or fuel, and above all without any effort to upgrade their numbers and capability.
Meanwhile, the focus on numbers, armor, and mandate obscured the fundamental question of the concept of operations. What are the troops there to do? Effective peace support is nine parts political work and community relations to one part force or the threat of force, but the Darfur debate has focused on force alone and not the politics of stability. Making Darfur the test case for the R2P has not helped the search for political solutions in Darfur. It unrealistically raised the hopes of the rebels and intensified the fears of the government. This illustrates the blind alley down which the concept of humanitarian intervention has led many idealistic, principled, and concerned people.
There is no such thing as humanitarian military intervention distinct from war or counterinsurgency. Intervention and occupation should not be confused with classic peacekeeping, which is difficult enough even with a ceasefire agreement and the consent of the parties. If we want an intervention to overthrow a tyranny, protect citizens from their own government, or deliver humanitarian aid during an ongoing conflict, we should be honest with ourselves – we are arguing for a just war. And if we wish to make this case, let us be clear that the war is political (and must be very smartly political to succeed); that military logic will dictate what happens (including probable escalation and various unpredictable factors); and that it will entail bloodshed including the killing of innocent people.
In sixteenth-century Paris, a popular form of entertainment was cat-burning, in which a cat was hoisted in a sling on a stage and slowly lowered into a fire. According to historian Norman Davies, “[T]he spectators, including kings and queens, shrieked with laughter as the animals, howling with pain, were singed, roasted, and finally carbonized.” Today, such sadism would be unthinkable in most of the world. This change in sensibilities is just one example of perhaps the most important and most underappreciated trend in the human saga: Violence has been in decline over long stretches of history, and today we are probably living in the most peaceful moment of our species’ time on earth.
In the decade of Darfur and Iraq, and shortly after the century of Stalin, Hitler, and Mao, the claim that violence has been diminishing may seem somewhere between hallucinatory and obscene. Yet recent studies that seek to quantify the historical ebb and flow of violence point to exactly that conclusion.
Fungus enthusiast Edward Gange amassed 52,000 sightings of mushrooms and toadstools during walks around Salisbury over a 50-year period. Analysis by his son Alan, published in the journal Science, shows some fungi have started to fruit twice a year. It is among the first studies to show a biological impact of warming in autumn. “My father was a stonemason, and his hobby was mycology,” recounted Alan Gange, an ecology professor at Royal Holloway, University of London. “For 50 years of his life, he went out and recorded the appearance of mushrooms and toadstools around Salisbury, and he also got his friends in the local natural history group to bring back samples they found when they were out walking.
“When he retired, he bought himself a computer, taught himself (the spreadsheet program) Excel, and typed in all these 52,000 records.” Now Mr Gange senior finds his enthusiasm and diligence rewarded as a named author on a paper in one of the two most eminent scientific journals in the world. “I’m on top of the world, I can’t quite believe it yet,” he told the BBC News website.
What kind of moral order does capitalism rest upon? Conversely, does the market give rise to a distinctive set of beliefs, habits, and social bonds? These questions are certainly as old as social science itself. In this review, we evaluate how today’s scholarship approaches the relationship between markets and the moral order. We begin with Albert Hirschman’s characterization of the three rival views of the market as civilizing, destructive or feeble in its effects on society. We review recent work at the intersection of sociology, economics and political economy and show that these views persist both as theories of market society and moral arguments about it. We then argue that a fourth view, which we call “moralized markets,” has become increasingly prominent in economic sociology. This work sees markets as cultural phenomena and moral projects in their own right, and seeks to study the mechanisms and techniques by which such projects are realized in practice.
On the day the Iranian government defused an international crisis by releasing 15 British sailors held captive since 23 March, a French newspaper revealed that Iran has also prevented a French scientist from leaving the country for more than 2 months. Sociologist Stéphane Dudoignon, of the National Centre for Scientific Research (CNRS) in Paris, was arrested on 30 January after taking photos of a religious procession in southeastern Iran. He was later released, but he has not received his passport and other documents and is stuck in Tehran.
The French Ministry of Foreign Affairs had kept Dudoignon’s detention under wraps, but confirmed it after it was reported today by the newspaper Le Monde. The French government says it has asked Iran to release Dudoignon, who also teaches at the Graduate School for Social Sciences Studies in Paris.
Painting can still do what the best traditional painting has always done: evoke, with telling emotion and exquisite precision, “the ancient hereditary ground” of “the human esthetic,” to use Wilson’s words. But the lesson of modernist painting is that it must enlist the medium in the service of the human esthetic if it is to continue to do so convincingly.
“Human nature exists, and it is both deep and highly structured” — “evidence accumulated to date leaves little room for doubt,” Wilson, the founder of sociobiology, reminds us. But the traditional means of emotional “transmission of [its] intricate details” — the “universals or near-universals [that] emerged in the evolution of culture,” and that reflect the “archetypes” or “inherited regularities of mental development that compose human nature” — no longer “communicate feeling” convincingly in the modern world.
Günter Grass belonged to that generation. He was an enthusiastic member of the Hitler Youth and probably internalized its maxims. In other words, his silence about being in the Waffen-SS was not due to a sense of guilt or implication in Nazi crimes. He was too young, his service too brief. Instead, he probably kept this secret as a kind of wellspring: an impetus for his creativity, a goad for the imagination, a source of diabolical energy that nurtured his books and drove him to write. The secret was his magic flute. This is also why now, near the end of his life, the secret can be disclosed. This is why his revelation does not come as a remorseful confession, but is embedded in the edifice of his achievement. He has taken a kind of vain retrospective glance at a distant errant youth that harbors an element of the strange and offensive.
Despite all rhetoric then, Grass’s revelation is not primarily about guilt. Here is a self-confident writer who, even after the revelation, had no compunctions about involving himself in an election campaign in Berlin, as if nothing had happened. That is comprehensible only by the logic of the secret, a logic that turns into a zest for confession: “I knew something you didn’t.”
Michèle Gerber Klein: You were a filmmaker before you were a photographer.
Robert Polidori: I used to work in avant-garde film, or what was known as structural film. It all started in 1969, when I was a freshman in college in Florida. Annette Michelson came and showed some films, including Wavelength by Michael Snow, and that changed my life. When I came to New York a couple of months later, she was kind enough to let me stay with her for a brief period. Through Annette I met Jonas Mekas and then I worked at Anthology Film Archives, even before they opened their first location at the Public Theater.
MGK: So how did you start making still photographs? RP: Well, my films were about the temporality between still and motion. But it really came about because I read a book called The Art of Memory by the late Frances Yates. This book was about ancient mnemonic systems, and rooms play a central role in “memory theaters.”
When American women won the right to vote in 1920, the logical question was, What next? Suffragists had the answer ready: full enjoyment of civil and domestic life for women, equal to that of men. But suffragists found out that what was next was not much. It would be decades before American women gained anything like gender equality in the home, in the workplace, and in higher education.
And they faced another unsettling fact: Flappers were next. To the dismay of early feminists, these unruly daughters of feminism were driven by an apolitical appetite for clothes, boys, and the outward signs of freedom. The image of the 1920s flapper endures to this day: the frank gaze, the kiss curls and cropped hair, the slender figure, the painted eyebrows and bright red lips. In that era, the “It Girl” was It. But the American It Girl was also the German neue Frauen, the Japanese moga, the Indian vamp, the Chinese modeng xiaojie, and the French garçonnes.
Iterations of the flapper around the world had in common an explicit eroticism and an uncommon power to challenge social conventions. In the interval between the world wars, her iconic image — with regional adjustments — appeared not just in the United States but on all five continents. The history of the modern girl is the fertile territory staked out by six feminist historians from the University of Washington, Seattle. Their Modern Girl Project, now in its ninth year, has opened a many-layered, transnational view of how culture and commodities flow around the globe.
President Ahmadinejad announced the release of the 15 British naval personnel like a card player flinging down his hand to scoop the pool. Iran had good cards and played them well. It made its point about defending its borders, dominated international television with pictures of its prisoners and their “confessions” and, when it perhaps judged that it had got as much as it could expect out of the confrontation, ended it with a flourish. Iran will project this as a victory (the medals given publicly to the officers who led the operation were an immediate example) against a country still viewed with suspicion in Iran because of its past interventions.
It also put out an indirect warning that any attack on its nuclear plants would be met with vigour. At the same time, the British government can argue that it managed to put enough pressure on Iran to force it to put an end to the confrontation without Britain having to make any formal statement, even of regret, at the incident.
Just as precious to Dr. Atul Gawande as his loupes — magnifying glasses he wears during surgery — is his iPod, which he carries with him into the operating room and plugs into a little speaker there. On a recent day, when he took out a gallbladder, two thyroids and what was supposed to be a parathyroid gland but maybe wasn’t, the playlist included David Bowie, Arcade Fire, Regina Spektor, Aimee Mann, Bruce Springsteen, Elvis Costello, the Decemberists and the Killers.
The music wasn’t turned up high, but it rocked sufficiently that the anesthesiologist bobbed his head, the O.R. nurse tapped her toe, and the member of the team in charge of all the clamps and retractors drummed his fingers on the instrument tray. “It all depends on who’s in the room,” Dr. Gawande said of his selections. “You can’t play anything hard-hitting if there’s anyone over 45.”
Dr. Gawande is 41, and it might be said that that’s a little old for the Killers. In every other respect, however, he is almost annoyingly admirable. He is tall, handsome, brilliant (a former Rhodes scholar and currently the recipient of a MacArthur “genius” grant); he has three children and a wife with musical interests so eclectic that when they pooled their vinyl record collections, his 800 and her 600, there were only 10 overlaps; he’s an accomplished surgeon and an equally accomplished writer, whose second book, “Better: A Surgeon’s Notes on Performance” (Metropolitan Books), comes out this week.
The shamelessness of Mugabe’s brutality—and his gloating pride in it—aroused the attention of the international press and diplomatic corps. But the story of Zimbabwe’s violent misrule and national degradation is not a new one. Mugabe, who is eighty-three, came to power in 1980 as a leader of the long and bloody liberation struggle against the white-supremacist regime of Ian Smith’s Rhodesia, and he has always used his hero’s mantle as cover for terrorizing his opponents, real and perceived. He has murdered thousands of his people and deprived the rest of meaningful freedom. In the process, he has transformed one of Africa’s most prosperous and promising countries into one of the poorest and weakest on earth.
Zimbabwe’s inflation rate is already more than seventeen hundred per cent, the highest in the world, and the International Monetary Fund warns that it could exceed five thousand per cent by year’s end. Unemployment is around eighty per cent, and the average income is less than a dollar a day. With chronic food shortages and no medical system left to speak of, life expectancy has plunged from sixty years, in 1990, to less than thirty-seven years (the shortest anywhere), while the infant-mortality rate has increased by more than fifty per cent. Not surprisingly, as many as three million Zimbabweans—a quarter of the population—have fled the country. Yet last week Mugabe’s information minister, Sikhanyiso Ndlovu, declared, “There is no crisis whatsoever in Zimbabwe.”
John Allen Paulos in his Who’s Counting column at ABC News:
Martin Kruskal, a renowned mathematician and physicist at Rutgers University, died in December 2006. Among his many accomplishments is an intriguing trick that almost anyone can appreciate.
I explain it here, and, prompted by April Fool’s Day, I also sketch a sort of biblical hoax based on it that I first proposed in my 1998 book “Once Upon a Number.”
Kruskal’s trick can be most easily explained in terms of a well-shuffled deck of cards with all the face cards removed. The deck has 1s (aces) through 10s only. Imagine two players, Hoaxer and Fool. Hoaxer asks Fool to pick a secret number between 1 and 10…
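The trick (known as the Kruskal count) is easy to simulate. The sketch below is my own illustration, not Paulos’s code: from any secret starting number, you repeatedly jump ahead by the value of the current card. With a 40-card face-card-free deck, the chains from different starting numbers usually coalesce on the same final “key card”, which is what lets Hoaxer appear to read Fool’s mind.

```python
import random

def kruskal_final(deck, start):
    """Start at the start-th card (1-indexed) and repeatedly jump
    ahead by the value of the current card; return the index of the
    last card reached before a jump would run off the end of the deck."""
    pos = start - 1
    while pos + deck[pos] < len(deck):
        pos += deck[pos]
    return pos

random.seed(0)
# Face cards removed: four copies each of ace (1) through 10.
deck = [v for v in range(1, 11) for _ in range(4)]
random.shuffle(deck)

# Whatever secret number between 1 and 10 Fool picks, the chain tends
# to end on the same key card; on most shuffles this set is a singleton.
finals = {kruskal_final(deck, s) for s in range(1, 11)}
print(sorted(finals))
```

Coalescence is probabilistic, not guaranteed: once two chains land on the same card they are locked together forever after, and with jumps averaging about five or six cards there are many chances for that to happen in forty cards, which is why the hoax works far more often than not.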
Michael Bywater reviews The Lucifer Effect: How Good People Turn Evil by Philip Zimbardo, in the Times of London:
Imagine a world in which doctors knew the underlying causes of many diseases and had a pretty good idea about most. They could cure many, alleviate more and were working on the rest.
But imagine, too, that in this world the media and politicians devoted their discourse to philtres and quackery. Scientific medicine, when mentioned at all, was presented as the preserve of bleeding-heart liberals, something that would never work. Unthinkable that we might live in such a world.
Now turn from medicine to human society. The social sciences (as important for the body politic as medicine is for the body physiological) are regularly passed over in favour of a monochrome absolutism as daft as any swivel-eyed fundamentalist babbling of the Devil.
Google “evil” – a word so empty that it should surely have withered away – and up come 136m hits in a third of a second. Tony Blair swore to confront evil wherever he found it. George W Bush would be lost without the word: his name is co-googled with it more than 2m times.
Both men – indeed all politicians and social commentators – should read this book by Philip Zimbardo, Professor Emeritus of Psychology at Stanford University. Zimbardo’s central thesis is that evil is not just about those who inflict it, but also about the situations and systems that promote it. Take the scandal of the American guards-turned-torturers at Abu Ghraib. The standard line on the case (backed up by the guards’ trials) is that a few rotten apples can taint the whole barrel. In other words, the way to prevent future Abu Ghraibs is simple: when giving men and women absolute power over others, we should screen them carefully for the job. The alternative is embarrassing: serious misconduct, wholly unacceptable, few rotten apples, let down the regiment, steps taken, won’t happen again, mmph, dealt with, move on.