Jaron Lanier at Edge.org:
It’s funny being an “old timer” in the world of the Internet. About six years ago, when I was 40 years old, a Stanford freshman said to me, “Wow Jaron Lanier—you’re still alive?” If there’s any use to sticking around for the long haul — as computers get so much more powerful that every couple of years our assumptions about what they can do have to be replaced — it might be in noticing patterns or principles that may not be so apparent to the latest hundred million kids who have just arrived online.
There’s one observation of mine, about a potential danger, that has caused quite a ruckus in the last half-year. I wrote about it initially in an essay called “Digital Maoism.”
Here’s the idea in a nutshell: Let’s start with an observation about the whole of human history, predating computers. People have often been willing to give up personal identity and join into a collective. Historically, that propensity has usually been very bad news. Collectives tend to be mean, to designate official enemies, to be violent, and to discourage creative, rigorous thought. Fascists, communists, religious cults, criminal “families” — there has been no end to the varieties of human collectives, but it seems to me that these examples have quite a lot in common. I wonder if some aspect of human nature evolved in the context of competing packs. We might be genetically wired to be vulnerable to the lure of the mob.
More here.
Josh Getlin in the Los Angeles Times:
It started off with bestselling author James Frey admitting his memoir, “A Million Little Pieces,” was in fact a work of fiction, and ended with celebrity publisher Judith Regan getting fired for allegedly making anti-Semitic comments after her proposed O.J. Simpson confessional book-TV deal got shot down.
In between came charges that 19-year-old Harvard novelist Kaavya Viswanathan had lifted passages from a rival chick-lit author, and hotly disputed allegations that Ian McEwan, one of the most respected names in modern literary fiction, may have been guilty of plagiarism.
More here.
John McWhorter in the New York Sun:
In the rush of the holiday season you may have missed that a white buffalo was born at a small zoo in Pennsylvania. Only one in 10 million buffalo is born white, and local Native Americans gave him a name in the Lenape language: kenahkihinen, which means “watch over us.”
They found that in a book, however. No one has actually spoken Lenape for a very long time. It was once the language of what is now known as the tristate area, but its speakers gradually switched to English, as happened to the vast majority of the hundreds of languages Native Americans once spoke in North America.
The death of languages is typically described in a rueful tone. There are a number of books treating the death of languages as a crisis equal to endangered species and global warming. However, I’m not sure it’s the crisis we are taught that it is.
There is a part of me, as a linguist, that does see something sad in the death of so many languages. It is happening faster than ever: It has been said that a hundred years from now 90% of the current 6,000 languages will be gone.
Each extinction means that a fascinating way of putting words together is no longer alive. In, for example, Inuktitut Eskimo, which, by the way, is not dying, “I should try not to become an alcoholic” is one word: Iminngernaveersaartunngortussaavunga.
More here.
From some initial editorials, many Kurds seem none too pleased with Saddam’s execution. Amin Matin in Kurdish Media:
Iraq’s highest court upheld Saddam Hussein’s death sentence for the killing of nearly 150 Shiite Arabs, paving the way for the former dictator to be hanged within 30 days. The execution order still needs to be approved by the office of the Iraqi president, Mr. Jalal Talabani.
Saddam is also on trial for crimes against humanity and genocide that he and his regime committed in Southern Kurdistan. These atrocities resulted in the killing of over 200,000 civilian Kurds and were part of a final solution code-named Anfal that also included the use of weapons of mass destruction such as chemical bombs. Executing Saddam before the current trial concludes will deny justice to Kurdish victims and strip Kurds of the possibility of securing justice for Anfal survivors. Proving the case for Kurdish genocide has enormous value for Iraqi Kurds and the Kurdish nation. There are still people in Iraq and the Arab world who deny the systematic genocide against the Kurds. Saddam’s trial for crimes against humanity and genocide committed against the Kurdish nation is a rare opportunity for Kurds to validate the depth and scope of those atrocities in an Iraqi court of law.
Some responses by Kurdish Media readers can be found here.
From The New York Times:
CRAWFORD, Tex., Dec. 29 — The capture of Saddam Hussein three years ago was a jubilant moment for the White House, hailed by President Bush in a televised address from the Cabinet Room. The execution of Mr. Hussein, though, seemed hardly to inspire the same sentiment.
Before the hanging was carried out in Baghdad, Mr. Bush went to sleep here at his ranch and was not roused when the news came. In a statement written in advance, the president said the execution would not end the violence in Iraq.
After Mr. Hussein was arrested Dec. 13, 2003, he gradually faded from view, save for his courtroom outbursts and writings from prison. The growing chaos and violence in Iraq has steadily overshadowed the torturous rule of Mr. Hussein, who for more than two decades held a unique place in the politics and psyche of the United States, a symbol of the manifestation of evil in the Middle East.
Now, what could have been a triumphal bookend to the American invasion of Iraq has instead been dampened by the grim reality of conditions on the ground there.
More here.
From The Washington Post:
“A leader is like a shepherd,” Nelson Mandela proclaimed more than a decade ago in his autobiography. “There are times when a leader must move out ahead of his flock, go off in a new direction, confident that he is leading his people in the right way.”
It’s an arrogant statement — could any other democratically elected politician get away with equating his constituents with sheep? — and yet supremely apt. For Mandela is arguably the greatest political leader of our time, the one person worthy of mention alongside FDR, Churchill and Gandhi. Mandela led the political and moral crusade for majority rule in South Africa against a white supremacist police state, risking his life, surrendering his personal freedom and his family’s well-being. He spent 27 years in prison only to emerge as a wise, dynamic and conciliatory figure binding black and white together as father of his nation and inspiration for the world.
The danger, of course, is that in extolling Mandela’s virtues, it’s all too easy to turn him into a saint — worshipped and untouchable and therefore of no practical value as a guide for our own behavior — and to lose track of the flawed, flesh-and-blood human being whom we can learn from and seek to emulate. As George Orwell once warned, “Saints should always be judged guilty until they are proved innocent.”
More here.
Friday, December 29, 2006
Jack Weatherford in the Los Angeles Times:
Genghis Khan recognized that victory came by conquering people, not land or cities. In contrast to the Americans in 2003, who sought to take the largest cities first in a campaign of shock and awe, the Mongols in 1258 took the smallest settlements first, gradually working toward the capital. Both the Mongols and the Americans used heavy bombardment to topple Baghdad, but whereas the Americans rushed into the capital in a triumphant victory celebration, the Mongols wisely decided not to enter the defeated — but still dangerous — city. They ordered the residents to evacuate, and then they sent in Christian and Muslim allies, who seethed with a variety of resentments against the caliph, to expunge any pockets of resistance and secure the capital. The Americans ended up as occupiers; the Mongols pulled strings, watching from camps in the countryside.
The Mongols also immediately executed the caliph and his sons on charges that they spent too much money on their palaces and not enough defending their nation. They killed most members of the court and administration. The Mongols took no prisoners and allowed no torture, but they executed swiftly and efficiently, including the soldiers of the defeated army who, they believed, would be a constant source of future problems if allowed to live. The first several months of a Mongol invasion were bloody, but once the takeover ended, the bloodshed ended.
By contrast, the American military campaign was quick, with comparatively few Iraqi (or coalition) casualties, but the bloodshed has continued for years. Constrained from decisively dispatching enemies of a new Iraq, the United States has allowed Iraqi terrorists to select who lives and who dies, including women and children, in a slow-motion massacre.
More here.
From The New York Times:
Laughter may be universal, but what provokes it is not. Even within a culture, humor can change drastically over a relatively short period. This truth is abundantly documented in “City of Laughter,” Vic Gatrell’s study of comic prints produced in London during the late 18th and early 19th centuries, a period he deems the golden age of satire.
The humor on display in the prints of James Gillray, Thomas Rowlandson, and George Cruikshank — the big three in Mr. Gatrell’s pantheon — was often coarse, bawdy, scatological and obscene. Private parts were on graphic display. Chamber pots and their contents stood front and center. Prostitutes cavorted with princes. Everything that the readers of Jane Austen regarded as private or shameful was shown in living color, on large, beautifully printed sheets hung in the windows of dealers for all London to see, and to laugh at.
More here.
Probably the central dispute about abstract art in the 20th century hinged on the ostensible spiritual content or impact of the work. Barnett Newman, for one, insisted that his paintings were “religious art which through symbols will catch the basic truth of life.” Others were profoundly superficial materialists like Frank Stella, who famously opined, “What you see is what you see.” While I have never found Newman’s paintings very convincing arguments, the same cannot be said for the work of Mark Rothko, whose shimmering veils of color can — under the right conditions — produce something resembling an out-of-body experience.
more from the LA Weekly here.
Mutual intoxications of art and money come and go. I’ve witnessed two previous booms and their respective busts: the Pop nineteen-sixties, which collapsed in the long recession of the seventies, and the neo-expressionist eighties, whose prosperity plummeted, anvil fashion, in 1989. In each instance, overnight sensations foundered and a generation of aspiring tyros was more or less extirpated. (They were out of style before the market revived.) But tough economic times nudge artists into ad-hoc communities and foster what-the-hell experimentation. The seventies gave rise to gritty conceptual maneuvers, supported by government and foundation grants, nonprofit institutions, and a few heroically, or masochistically, committed collectors. The nineties were dominated by festivalism: theatrical, often politically attitudinizing installations that were made to order for a spreading circuit of international shows and contemporary museums and Kunsthallen. I disliked the nineties. I knew what all the righteously posturing art was for, but not whom it was for. It invoked a mythical audience, whose supposed assumptions were supposedly challenged. I missed the erotic clarity of commerce—I give you this, you give me that—and was glad when creative spunk started leaching back into unashamedly pleasurable forms. Then came this art-industrial frenzy, which turns mere art lovers into gawking street urchins. Drat.
more from The New Yorker here.
From The Harvard Gazette:
As a young photojournalist in South Africa in the 1980s, Guy Tillim found that photography could be a way of bridging the racial gap that apartheid had imposed on his society. “A camera was the perfect tool to cross those boundaries, to see what was going on in my own country.” Working for both local and foreign media, Tillim produced a powerful body of work and won a number of important awards for his documentation of social conflict and inequality in the countries of Africa. He has exhibited his photos in more than a dozen countries and has published in numerous volumes and journals.
Tillim’s powerful images and his commitment to using photography as a way of exploring the human condition so impressed members of a search committee representing the Peabody Museum of Archaeology and Ethnology that they chose him as the first recipient of the Robert Gardner Fellowship in Photography.
More here.
Eric Berger in the Houston Chronicle (via Accidental Blogger):
A starter violin costs about $200. A finely crafted modern instrument can run as much as $20,000. But even that’s loose change when compared with a violin made three centuries ago by Antonio Stradivari.
His 600 or so surviving violins can cost upward of $3.5 million.
For more than a century, artists, craftsmen and scientists have sought the secret to the prized instruments’ distinct sound. Dozens have claimed to have solved the mystery, but none has been proved right.
Now, a Texas biochemist, Joseph Nagyvary, says he has scientific proof the long-sought secret is chemistry, not craftsmanship. Specifically, he says, Stradivari treated his violins with chemicals to protect them from wood-eating worms common in northern Italy. Unknowingly, Nagyvary says, the master craftsman gave his violins a chemical noise filter that provided a unique, pleasing sound.
More here. [Thanks to Ruchira Paul.]
William Henry Gates in Scientific American:
Imagine being present at the birth of a new industry. It is an industry based on groundbreaking new technologies, wherein a handful of well-established corporations sell highly specialized devices for business use and a fast-growing number of start-up companies produce innovative toys, gadgets for hobbyists and other interesting niche products. But it is also a highly fragmented industry with few common standards or platforms. Projects are complex, progress is slow, and practical applications are relatively rare. In fact, for all the excitement and promise, no one can say with any certainty when — or even if — this industry will achieve critical mass. If it does, though, it may well change the world.
Of course, the paragraph above could be a description of the computer industry during the mid-1970s, around the time that Paul Allen and I launched Microsoft. Back then, big, expensive mainframe computers ran the back-office operations for major companies, governmental departments and other institutions. Researchers at leading universities and industrial laboratories were creating the basic building blocks that would make the information age possible. Intel had just introduced the 8080 microprocessor, and Atari was selling the popular electronic game Pong. At homegrown computer clubs, enthusiasts struggled to figure out exactly what this new technology was good for.
But what I really have in mind is something much more contemporary: the emergence of the robotics industry, which is developing in much the same way that the computer business did 30 years ago.
More here.
Barack Obama in the New York Times:
It’s been almost ten years since I first ran for political office. I was thirty-five at the time, four years out of law school, recently married, and generally impatient with life. A seat in the Illinois legislature had opened up, and several friends suggested that I run, thinking that my work as a civil rights lawyer, and contacts from my days as a community organizer, would make me a viable candidate. After discussing it with my wife, I entered the race and proceeded to do what every first-time candidate does: I talked to anyone who would listen. I went to block club meetings and church socials, beauty shops and barbershops. If two guys were standing on a corner, I would cross the street to hand them campaign literature. And everywhere I went, I’d get some version of the same two questions.
“Where’d you get that funny name?”
And then: “You seem like a nice enough guy. Why do you want to go into something dirty and nasty like politics?”
I was familiar with the question, a variant on the questions asked of me years earlier, when I’d first arrived in Chicago to work in low-income neighborhoods. It signaled a cynicism not simply with politics but with the very notion of a public life, a cynicism that-at least in the South Side neighborhoods I sought to represent-had been nourished by a generation of broken promises.
More here.
Thursday, December 28, 2006
When, in the 1920s, the Italian Futurists had fantasies about concreting over the canals of Venice and turning them into roads, they were not just indulging in gratuitous vandalism but reacting against the accumulated weight of dead-city literature that the Symbolist and Decadent writers of the fin de siècle had generated. The fin-de-siècle cities ended in whimpers, but the Futurists wanted them to go with a bang. It needed a big, bloody war – violent death and the flattening of entire towns under mortar shells – to revise the way people thought about the deaths of cities and their human inhabitants. The exaggerated exultation of the Futurists and Vorticists about machine-age death and destruction can partly be traced to the glut of pallid degeneration narratives on which they would have been drip-fed: dead-city poems by Rainer Maria Rilke and Henri de Régnier and Gabriele D’Annunzio, wispy lyrical novels and countless atmospheric travelogues that revisited the same tropes and clichés of urban exhaustion and desuetude.
The central figure in the dead-city cult was the Belgian poet and novelist Georges Rodenbach, and the totemic city was Bruges, or, to give it its full fin-de-siècle name, Bruges-la-Morte, the title of Rodenbach’s novel of 1892.
more from the TLS here.
Faust avoided Gretchen’s question “Do you believe in God?” But what should someone say who refuses to avoid the question and yet isn’t naive? I believe that on the one hand the need to believe in God is not only a cultural, but also an anthropological phenomenon, founded in the structure of human existence. Today, however, people can’t give in to this need without fooling themselves. What we have here is a contradiction between need and feasibility. Seen logically, such contradictions are harmless, and relatively normal in human life.
Let me clarify this with an example. People – at least in general – have a need to go on living. That too is anthropologically founded. Yet this need stands in contradiction to reality: all individual life ceases to exist after a time. However the need to go on living is so deeply rooted that people in all cultures have attempted in one way or another, with or without religion, to construct a life after death.
more from Sign and Sight here.
Following up on Abbas’ post, in this video, Akeel Bilgrami discusses Gandhi’s nonviolence and how it stands in contrast to the moral psychology of liberalism and the Enlightenment.
Akeel Bilgrami argues that Gandhi, who was assassinated on Jan. 30, 1948, believed the adoption of moral principles generated criticism of others and eventually led to violence. In contrast with Western understanding, Bilgrami argues that Gandhi believed exemplary actions, not principles, are at the root of his philosophy on non-violence.
You’d think the French would have learned from the model of national champions — the Concorde, Minitel, etc. — but apparently not:
The war waged by French president Jacques Chirac against “Anglo-Saxon” cultural imperialism suffered a blow today when the Germans announced they were pulling out of a rival European search engine to Google.
Earlier this year Mr Chirac announced a series of ambitious technological projects designed to challenge the global dominance of the US. They included Quaero, a Franco-German search engine whose name is Latin for “I search”, but which was swiftly dubbed “Ask Chirac”.
Today German officials confirmed they were abandoning the €400m (£270m) project. Senior officials in Germany’s economics and technology ministry said they had decided to dump Quaero because they had been sceptical it would ever be able to challenge the might of Google and Yahoo!
Cooperation with France had “not been simple,” they said. Asked today what had gone wrong, a ministry spokeswoman told the Guardian: “There were disagreements. The French wanted a search engine. We wanted something else.”
Instead, Germany has now decided to launch its own national search engine, Theseus.
In Counterpunch, Alexander Cockburn makes the case:
If Ford had beaten back Carter’s challenge in 1976, the neo-con crusades of the mid to late Seventies would have been blunted by the mere fact of a Republican occupying the White House. Reagan, most likely, would have returned to his slumbers in California after his abortive challenge to Ford for the nomination in Kansas City in 1976.
Instead of a weak Southern Democratic conservative who acquiesced to almost every predation by the military-industrial complex, we would have had a Midwestern Republican, and thus a politician far less vulnerable to the promoters of the New Cold War.
Would Ford have rushed to fund the Contras and order their training by Argentinian torturers? Would he have sent the CIA on its most costly covert mission, the $3.5 billion intervention in Afghanistan? The nation would have been spared the disastrous counsels of Zbigniew Brzezinski.
Those who may challenge this assessment of Ford’s imperial instincts should listen to the commentators on CNN, belaboring the scarcely cold commander-in-chief for timidity and lack of zeal in prosecuting the Cold War. By his enemies shall we know him.
From Thrilling Wonder:
2. Bolivia’s “Road of Death”
North Yungas Road is hands-down the most dangerous road in the world for motorists. If the previous road is merely impassable, this one clearly endangers your life. It runs in the Bolivian Andes, 70 km from La Paz to Coroico, and plunges almost 3,600 meters in an orgy of extremely narrow hairpin curves and near-misses with 800-meter abysses. A fatal accident happens there every couple of weeks, and 100 to 200 people perish there every year. In 1995 the Inter-American Development Bank named the La Paz-to-Coroico route “the world’s most dangerous road.”
5. Most Dangerous Tourist Hiking Trail (China)
Not a car road, but the most hair-raising experience you can have on your own two legs. This is a heavy tourist-traffic area on Mt. Huashan, near Xian; this link explains more about the area:
More here.