G. Pascal Zachary asks whether calling Darfur a “genocide” makes it harder to stop the killing, in Salon.
[D]oes the conflict in Darfur, however bloody, qualify as genocide? Or does the application of the word “genocide” to Darfur make it harder to understand this conflict in its awful peculiarity? Is it possible that applying a generic label to Darfurian violence makes the task of stopping it harder? Or is questioning the label simply insensitive, implying that whatever has happened in Darfur isn’t horrible enough to justify a claim on the world’s conscience, and thus inviting inaction or even the dismissal of Darfur altogether?
These questions — and the paradoxical nature of the G-word — lie at the heart of a much-needed new book by Gerard Prunier, a scholar of African affairs. In “Darfur: The Ambiguous Genocide,” Prunier, a professor at the University of Paris, casts aside labels and lays bare the anatomy of the Darfur crisis, drawing on a mixture of history and journalism to produce the most important book of the year on any African subject. Clearly and concisely, he describes a complex civil war, where “Arabs” and “Africans” are often indistinguishable from one another to outsiders. Members of both groups can be dark-skinned, Muslim, poor and neglected. Indeed, this last characteristic of Darfurians, the extent of their neglect by Sudan’s central government, may be the most significant for understanding the roots of today’s conflict. (Although racism cannot be discounted; racial bias exists in Sudan with some people demonizing blacks and holding them as slaves.) Prunier emphasizes the legacy of Darfur’s isolation, which began under Britain, colonial ruler of Sudan until its independence in 1955. In 1916, the British incorporated Darfur, which had been an independent country for centuries, into colonial Sudan and then pathetically left it to crumble (as late as the 1930s there was not a single high school student in Darfur, and only four primary schools for younger kids).
Edward Castronova discusses whether the “Horde” in the enormously successful massively multiplayer online role-playing game is genuinely “evil” and whether choosing an evil avatar shows a moral failing.
To choose orc, it was said, does not carry with it any particular moral or ethical baggage. It was a matter of playstyles, tastes, personal interests.
Goodness, I could not disagree more. My view is that in a social game, these choices are laden with all kinds of implications for personal integrity. Avatar choice is fraught with broader meaning.
Two concrete examples of where the choice matters:
1. I am a father. A guild of colleagues chose to play Horde. I rolled an Undead. My son (age 3) was afraid of my character. He was afraid of the Undercity. And that’s just from the imagery – he would know nothing of what the Undead actually do in terms of kidnapping, imprisonment, and torture. He’s afraid, and he should be afraid, and as his father, my only defense in this frightening choice would have to be that I am just trying out evil, just getting to know it, just using evil instrumentally for some greater purpose. He obviously can’t grasp that now, but even if he could, these are the only possible justifications for me to inhabit such a wicked being. And my point is that the inhabitation would indeed require justification. If my undead warlock were an extension of myself, something I was pursuing for mere enjoyment, then it ought to be a troubling question for me, shouldn’t it? Why am I finding pleasure in expressing myself in a form that frightens 3-year-olds? My assertion is that this is a genuine and significant moral issue that everyone who chooses an avatar needs to think about. Morally compulsory.
In Wired, more reasons to worry about widespread government surveillance.
There are few, if any, studies demonstrating the effectiveness of mass surveillance. People with something to hide are adept at speaking in codes. Teenagers tell their parents they are “going to the movies” when they are going to drink beer. Attackers know to misspell the victim’s name, as journalist Daniel Pearl’s kidnappers and murderers did, to evade e-mail surveillance. Meanwhile, modern filtering technology can’t distinguish between breast cancer websites and pornographic ones.
Any search algorithm, whether public or not, is unlikely to be able to distinguish between innocent and criminal communications.
Even if the technology works, it fails. Even if a TMS were 99.9 percent accurate, it would produce a false positive one time in 1,000. Whether it’s facial recognition at the Super Bowl, or sifting through e-mail communications, TMS will inevitably produce an unacceptably high number of false positives. Hundreds of thousands of innocent people will not be allowed to board their planes, will have their houses searched, their bank accounts frozen — at least until the mistake can be cleared up. At best, a “hit” will require someone to look more closely at the information, and we’ll need more agents to do it than we currently have, or could have.
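The base-rate arithmetic behind this claim is easy to check for yourself. The sketch below uses illustrative numbers (a screened population of 300 million and 1,000 actual threats are assumptions, not figures from the article) to show why even a 99.9-percent-accurate screen buries real hits under false alarms:

```python
# Base-rate sketch: a "99.9 percent accurate" mass screen still
# overwhelms investigators with false alarms. Numbers are illustrative.

def screening_outcomes(population, true_threats, false_positive_rate, detection_rate):
    """Return (true hits, false alarms) for a mass-screening system."""
    innocents = population - true_threats
    false_alarms = innocents * false_positive_rate   # innocents wrongly flagged
    true_hits = true_threats * detection_rate        # actual threats caught
    return true_hits, false_alarms

# Assume 300 million people screened, 1,000 actual threats, a 0.1%
# false-positive rate, and (generously) perfect detection of real threats.
hits, alarms = screening_outcomes(300_000_000, 1_000, 0.001, 1.0)

print(f"True hits:    {hits:,.0f}")
print(f"False alarms: {alarms:,.0f}")    # roughly 300,000 innocent people flagged
print(f"Share of flags that are real: {hits / (hits + alarms):.2%}")
```

Under these assumptions, fewer than one flag in 300 points at a real threat, which is the article’s point: each of those hundreds of thousands of false alarms still costs an agent’s time to clear.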
At their best, photographs of artists can be totemic: They establish status within the tribe, produce value, dazzle with allure, and manufacture myth; as Barbara Kruger wrote in her 1988 essay “Picturing Greatness,” they “freeze moments, create prominence, and make history.” Sometimes these pictures take on talismanic lives of their own, becoming fetish objects, what philosopher Francis Bacon called “idols of the mind,” as with photos of Pollock painting or Warhol doing almost anything (or nothing). We’ve all been transfixed by Picasso in his underpants at the beach, Bacon in his grimy studio, de Kooning studying his paintings, Leon Golub’s huge head, Hockney’s Dutch boy grin, Kahlo’s unibrow, Schnabel in his pajamas, Mapplethorpe’s image of Louise Bourgeois holding a giant phallus, and his self-portrait as a faun. In our collective mind’s eye we see Beuys in his hat, Baselitz in his castle, and Basquiat in his designer suit; the young and beautiful Johns and Rauschenberg, the rakish Duchamp, and the ruddy Robert Smithson.
more from Salz at the Village Voice here.
Back then, Gilbert was Gilbert Proesch and hailed from a small village in the Dolomites, George was George Passmore from Plymouth. In the iconoclastic spirit of the times, they rejected sculpture and surnames, and decided to turn themselves into a living work of art. They have remained one ever since, their oddness congealing into a kind of signature, as instantly identifiable as the pictures they make. Initially, though, they called themselves ‘The Singing Sculpture’, and specialised in a deadpan version of ‘Underneath the Arches’, which they delivered standing still, side by side in their suits, their hands and faces painted bronze. A friend who saw them back then says they were ‘odd but ubiquitous’, and they have remained so ever since.
more from the Guardian Unlimited here.
From Nature:
Our ears could have started evolutionary life as a tube for breathing, say scientists, after examining the ancestral structure in a 370-million-year-old fossil fish. Evolutionary biologists are intrigued by how complicated sensory organs evolved from structures that may have had completely different uses in ancestral creatures. The ear is a relatively easy organ to study. Its evolving bones have been preserved as fossils, whereas the soft tissues of other specialized features, such as eyes and noses, have long decayed.
So Martin Brazeau and Per Ahlberg of Uppsala University in Sweden decided to take a close look at the ear-like features of an ancient, metre-long monster from the Latvian Natural History Museum in Riga. Panderichthys was a fish, but is thought to be closely related to the earliest four-limbed tetrapods that eventually climbed on to land and gave rise to modern vertebrates. The researchers examined Panderichthys and found that the bony structures in its head combine features of fish and tetrapods, capturing a snapshot of evolution in action.
More here.
From Scientific American:
Sights and sounds fill the world, presenting a panoply of possible foci for the brain. Yet most animals can home in on whatever sight most demands interest. Then the sounds associated with that sight–be it a loved one talking or a tasty meal skittering through the undergrowth–become all the clearer. This is attention, and new research shows how an owl’s brain establishes the state. It also provides tantalizing evidence that brains from across the animal kingdom work the same way.
Neurologists Daniel Winkowski and Eric Knudsen of Stanford University wired 12 owls with electrodes in the areas of their brains that process either visual or auditory input. Each region literally maps the world of sound or sight, determining whether it comes from up or down, left or right. Sending a small electrical charge into the owl’s visual brain region–the so-called arcopallial gaze fields–caused it to move its head and eyes in a particular direction. When a simultaneous audio stimulus matched that direction, the owl’s brain responded more strongly to that noise. It also blocked out competing noises from other directions. Owls are already extremely gifted at tuning in a particular sound, the authors note in their paper published in the current issue of Nature, but pairing a sound with a sight enhanced that ability even further.
More here.
Isobel Coleman in Foreign Affairs:
Although questions of implementation remain, the new Iraqi constitution makes Islam the law of the land. This need not mean trouble for Iraq’s women, however. Sharia is open to a wide range of interpretations, some quite egalitarian. If Washington still hopes for a liberal order in Iraq, it should start working with progressive Muslim scholars to advance women’s rights through religious channels.
More here.
Dan Baum in The New Yorker:
The official name of the program known as the Green Card Lottery in Peru—and in a hundred and seventy-six other countries—is the Diversity Visa Program. Of the more than two hundred visa types provided by the State Department, it is by far the oddest. While the vast majority of immigrant visas still go to people who suffer persecution or possess strictly prescribed qualifications—relatives already in the U.S., strategic skills, or great wealth—the only requirement for winning the Green Card Lottery, other than good fortune, is a high-school education or two years’ experience in one of three hundred and fifty-three career categories ranging from anthropologist to housepainter to poet-and-lyricist. Fifty thousand diversity visas are made available each year; almost six million people applied to the program in 2005. Its future, however, is uncertain. Last month, the House of Representatives passed a border enforcement and immigration bill that included an amendment to abolish the Green Card Lottery. The Senate will consider that bill later this year.
More here.
Stuart Silverstein and Peter Y. Hong in the Los Angeles Times:
A fledgling alumni group headed by a former campus Republican leader is offering students payments of up to $100 per class to provide information on instructors who are “abusive, one-sided or off-topic” in advocating political ideologies.
The year-old Bruin Alumni Assn. says its “Exposing UCLA’s Radical Professors” initiative takes aim at faculty “actively proselytizing their extreme views in the classroom, whether or not the commentary is relevant to the class topic.” Although the group says it is concerned about radical professors of any political stripe, it has named an initial “Dirty 30” of teachers it identifies with left-wing or liberal causes.
More here. [Thanks to Winfield J. Abbe.]
Brian Handwerk in National Geographic:
Grasses and other green growth may produce 10 to 30 percent of Earth’s annual methane output, a new study reports, making plants a surprising—and potentially significant—contributor to global warming.
Until the data were unveiled in this week’s Nature, scientists had believed that plant-related methane formed only in oxygen-free environments, such as bogs.
But a team of European researchers identified a large range of plants that release methane under normal growing conditions. The gas also seeps from dead plant material.
David Lowe is an atmospheric chemist with the National Institute of Water and Atmospheric Research in Wellington, New Zealand. He wrote a review article accompanying the study.
According to Lowe, “We now have the specter that new forests might increase greenhouse warming through methane emissions rather than decrease it by being sinks for carbon dioxide.”
“The identification of a new source should prompt a re-examination of the global methane budget.”
More here.
From Scientific American:
Last December was a special month for U.S. executions. North Carolina gave a lethal injection to Kenneth Boyd, making him the 1,000th person to be executed since the 1976 Supreme Court decision to allow the reinstatement of the death penalty. Soon thereafter, on December 13, California put to death Crip gang founder Stanley “Tookie” Williams. The U.S. remains the only developed Western nation to permit executions despite serious flaws in the system. No need for any pacifist proclivity or liberal leaning to see that–just look at the science.
First, there’s DNA evidence. Although it cannot prove guilt beyond all doubt–who can forget O.J. Simpson?–it can definitively prove innocence. The first DNA exoneration occurred in 1989, and since then many on death row have been set free because of it–the Death Penalty Information Center counts 122 exonerations since 1973. It showed that too many convictions resulted from sloppy or overzealous police work and prosecution, or incompetent defense attorneys. It helped convince then Republican governor George Ryan of Illinois in 2003 to declare the death penalty “arbitrary and capricious” and to commute the sentences of all 157 inmates on the state’s death row.
But DNA isn’t the only contribution from science to this issue.
More here.
Wednesday, January 18, 2006
William Saletan in Slate:
Move over, Mrs. Robinson. The new public enemy is the bespectacled babe who teaches our kids math in the classroom and sex in the parking lot. Dozens of female teachers have been caught with male students in recent years, and the airwaves are full of outrage that we’re letting them off the hook. On cable news, phrases like “double standard” and “slap on the wrist” are poured like pious gravy over photos of the pedagogue-pedophile-pet of the month. “Why is it when a man rapes a little girl, he goes to jail,” CNN’s Nancy Grace complains, “but when a woman rapes a boy, she had a breakdown?”
I hate to change the subject from sex back to math, but this frenzy—I’m trying hard not to call it hysteria—reeks of overexcitement. Sex offenses by women aren’t increasing. Female offenders are going to jail. And while their sentences are, on average, shorter than sentences given to male offenders, the main reason is that their crimes are objectively less vile. By ignoring this difference, we’re replacing the old double standard with a new one.
More here.
Shanaz Ramzi in Newsline:
Q: How is it that, after graduating in something as prosaic as business, you began to write novels?
A: I’ve been writing for a long time; in fact, I began years before I went to business school or got a job. Everybody is multi-faceted, but not everyone gets the opportunity to exercise their whims. I’ve been lucky enough to both have a job and write.
Q: Do you think your exposure to world politics has given you an insight into political realities, which come to the fore in your books?
A: I think it is not my career that has given me a political insight but the conversations I’ve grown up with, the focus on following world events in the news and also my exposure to living in the heart of Karachi in the eighties, even if for a limited time. It was like I had a finger on the pulse of everything – I lived on Guru Mandir, and would drive all over the Site area, Korangi, Orangi and Malir, absorbing everything I could. I would watch the cricket match in Khudadad Colony every Friday.
I thought I had a sense of how things were and what their logical trajectory was, and followed my instincts on that trajectory. For instance, I wrote that a dictator would take over the country in a military coup, and in the name of eradicating fundamentalism, he would get rid of all opposition. I mentioned that Pakistan would be the darling of the west – particularly the general – and would be the new emerging economy because of its coalition against fundamentalism. When my father read the book prior to its publication, he insisted that I delete this portion, as such a scenario of coups and martial law regimes was a thing of the past. Am I glad I didn’t listen to my dad on that occasion!
More here.
Dorothy Roberts reviews Postcolonial Melancholia by Paul Gilroy, in The Boston Review:
It was not so long ago that the biological meaning of race seemed to be on its way out: the Human Genome Project discredited the genetic basis of racial groupings just as social scientists were declaring that race is a social construction. Apparently, reports of the demise of race as a scientific concept were premature. In June 2005, the Food and Drug Administration approved the first race-specific pharmaceutical, BiDil, to treat heart failure in African-Americans. The drug is just one example of a burgeoning scientific and commercial interest in genetic racial differences. Some scientists have even claimed that clusters of genetic similarity correspond to antiquated racial classifications.
The renewed acceptance of inherent racial differences has gone hand in hand with intensified state surveillance of inner-city communities: racial profiling, mass incarceration, welfare restructuring, and the removal of children from families into foster care. As its lineage foreshadows, the biological definition of race provides a ready rationale for this disenfranchisement of black citizens and complements colorblind policies based on the claim that racism is no longer the cause of social inequality.
Given this alarming convergence, black intellectuals today face a critical question: how can we fight systemic racism without relying on the idea that biology divides human beings into races?
Paul Gilroy’s Postcolonial Melancholia is a deeply engaging exploration of this question.
More here.
Ker Than (of 3 Quarks Daily) in Space.com:
Astronomers have traced the source of mysterious high-energy X-rays and gamma rays in space to a little known star cluster in the Milky Way.
The cluster sits about 19,000 light-years away from Earth in the constellation Scutum. It contains about 20,000 stars, most of which are hot, young blue stars.
Astronomers had known about the star cluster before, but it was only recently that they confirmed the number of stars it held.
In the late 1990s, several observatories detected very high-energy X-rays and gamma rays coming from the region, but astronomers were uncertain about their source. It was thought the blasts might have originated from distant galaxies or from pulsating stars known as pulsars, which typically emit radio waves but can send out other wavelengths of energy as well.
A team of researchers is now proposing that the blasts are the result of massive stars known as red supergiants in the cluster that exploded cataclysmically as supernovae.
More here.
Clay Risen in The New Republic:
It was early 2003, and the newly created Department of Homeland Security was looking for someone to help oversee its vast computer network. The department soon found a candidate who appeared to be a perfect match: Laura Callahan. Not only had Callahan been working with federal IT systems since the mid-’80s, but she came with outstanding academic credentials: bachelor’s and master’s degrees in computer science, topped by a Ph.D. in computer information systems. In April 2003, Callahan was brought on as the department’s senior director in the office of the chief information officer, pulling down a six-figure salary.
But Callahan didn’t last long. A few weeks after her hiring, the Office of Personnel Management opened an investigation into her resumé following the publication of articles questioning her degrees’ provenance. It turned out that Callahan’s vaunted academic achievements were anything but–all three degrees had come from Hamilton University, a now-defunct degree mill operating out of a former Motel 6 in Evanston, Wyoming, that claimed religious affiliation. In June 2003, she was placed on administrative leave. By the time she resigned, in March 2004, a new picture of Callahan had emerged: not a skilled IT executive, but an unqualified hack.
More here.
Mike Peplow in Nature:
Conventional computers do their ‘thinking’ by shuttling electrons through arrangements of transistors called logic gates. But in order for those thoughts to be stored as computer memories, the electrical signals have to be translated by bulky components into magnetic fields on the metallic grains that cover your hard drive. This additional step takes up extra room in a computer.
What’s more, transistors get so hot that it is becoming increasingly difficult to pack more of them on to silicon chips without melting something important.
The solution? Space-saving, cool-headed, magnetic logic gates.
More here.
Over at the Valve, a book event:
[A] series of short essays and comments on Franco Moretti’s Graphs, Maps, Trees, an event similar to those past on Theory’s Empire and The Literary Wittgenstein. Several Valve regulars will contribute, and we also hope to have pieces from Cosma Shalizi and Scott McLemee. Anyone who has read or would like to read Moretti’s book and/or the essays in the NLR from which it is drawn and who has an idea for a guest-post for the event is welcome to contact me with a proposal. Before too long, we hope to be able to make PDFs of Moretti’s NLR articles available to interested readers for a limited time.
Franco Moretti is Professor of English and Comparative Literature at Stanford and also the author of Signs Taken for Wonders, The Way of the World, Modern Epic, and Atlas of the European Novel 1800-1900. Graphs, Maps, Trees is an ambitious work, seeking to “delineate a transformation in the study of literature” through “a shift from close reading of individual texts to the construction of abstract models.” These models come from quantitative history, geography, and evolutionary theory, areas which Moretti suggests have had little interaction with literary criticism, “but which have many things to teach us, and may change the way that we work.”
Explanation before interpretation, a materialist conception of form, and “a total indifference to the philosophizing that goes by the name of ‘Theory’ in literature departments,” which should be “forgotten, and replaced with the extraordinary array of conceptual constructions–theories, plural, and with a lower case ‘t’–developed by the natural and by the social sciences” are what Moretti proposes for a “more rational literary history.”
PAMUK The good years are over now. When I was publishing my first books, the previous generation of authors was fading away, so I was welcomed because I was a new author.
INTERVIEWER
When you say the previous generation, whom do you have in mind?
PAMUK
The authors who felt a social responsibility, authors who felt that literature serves morality and politics. They were flat realists, not experimental. Like authors in so many poor countries, they wasted their talent on trying to serve their nation. I did not want to be like them, because even in my youth I had enjoyed Faulkner, Virginia Woolf, Proust—I had never aspired to the social-realist model of Steinbeck and Gorky. The literature produced in the sixties and seventies was becoming outmoded, so I was welcomed as an author of the new generation.
After the mid-nineties, when my books began to sell in amounts that no one in Turkey had ever dreamed of, my honeymoon years with the Turkish press and intellectuals were over. From then on, critical reception was mostly a reaction to the publicity and sales, rather than the content of my books. Now, unfortunately, I am notorious for my political comments, most of which are picked up from international interviews and shamelessly manipulated by some Turkish nationalist journalists to make me look more radical and politically foolish than I really am.
more in the Paris Review here.