We have had neuroeconomics and neuroaesthetics recently, now the logical next step, neuromarketing: “Could brain-scanning technology provide an accurate way to assess the appeal of new products and the effectiveness of advertising?” Article here from The Economist.
Now that the Republican National Convention has ended, here are some statistics: there are 520,000 Republicans in New York City (and 2,700,000 Democrats), meaning they are roughly one in fourteen of us. The city comptroller has estimated that their convention cost the city’s economy about 309 million dollars, of which 281 million was economic losses caused by the closure of businesses and residents’ flight from the city. The remaining amount was government spending for convention security, which cost a total of 78 million, of which 50 million was federal monies. New Yorkers, who bear the highest tax burden in the nation, paid 28 million dollars of their tax money in order to secure members of a political party with a tiny local following by pre-emptively incarcerating… ourselves.
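The comptroller’s figures fit together arithmetically; a quick sanity check (all amounts in millions of dollars, taken directly from the estimates quoted above):

```python
# Sanity check of the convention-cost figures quoted above
# (all in millions of dollars; the comptroller's estimates, not new data).
economic_losses = 281        # business closures and residents' flight
security_total = 78          # total government spending on security
federal_share = 50           # portion paid by the federal government

# The city's taxpayers cover whatever security spending the feds did not.
local_share = security_total - federal_share

# The 309 million total = economic losses + the locally funded security bill.
total_cost_to_city = economic_losses + local_share

assert local_share == 28
assert total_cost_to_city == 309
```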
Thursday, September 2, 2004
Culture draws inspiration from so many places. In 1984, a landmark American movie was released, a big-budget epic about a tribe of desert warriors rising to defeat an imperial power which is characterized as both hubristic and sexually depraved, and whose economy depends on a natural resource found in their desert. This small band of committed fighters defeats the decadent empire without armies or military power, by following the directives of their charismatic leader, a young prophet whose mother and sister dress in black robes that cover all but their faces. The young man teaches his followers techniques for making explosives that they use to “smash our enemies’ bones, and rip out their organs.” He speaks primarily a prophetic, allegorical language: “We will kill and kill again”; “A storm is coming; our storm.” Do you remember this film? As should be obvious, it trades in and romanticizes elements of Islamic resistance movements, particularly in Afghanistan, by reading their struggles as akin to those of American revolutionaries, in both cases the enemy a greedy colonial power. Yet such a production would be impossible to mount today, since no one would accept the possibility of an epic simile likening ‘them’ to ‘us.’ How populist sympathies shift in twenty years, at each moment representing momentary affiliations as both the certainty of tradition and the ahistorical truth.
“Mr. Jones, who was one of Wright’s best-known pupils, created the Ozark style of architecture, a term he considered misleading because his work extends from coast to coast and can be seen in cypress swamps, cotton fields and crowded urban lots as well as on the hills of his native Arkansas… In 2000 Mr. Jones’s Thorncrown Chapel in the Arkansas Ozarks was voted the fourth-best building of the 20th century by the American Institute of Architects after Wright’s Fallingwater and the Chrysler and Seagram buildings in Manhattan.”
“Every election year, Jim Morris hits the trail as a Presidential hopeful—or two, or three. Morris, a rangy forty-seven-year-old from Massachusetts, is arguably the country’s leading political impressionist. He came to notice during the 1980 campaign, with his twinkly-eyed Ronald Reagan, and later provided the voices of all the political characters on the ‘Saturday Night Live’ cartoon ‘The X-Presidents.'”
Says Morris: “At the moment, I see John Kerry as two parts Herman Munster and one part Bill Walton—his build and facial structure and the cavities in his head, the nasal stuffiness. Then, there’s a bit of Hugh Grant in his smile; some Robert Stack raspiness in the voice; some Jim Nabors in the shoulders and face; some Bea Arthur in the face; and a hint of that Indian who cried from the highway about the litter. Oh, and a little bit of my dog, Tex.” More here in the New Yorker.
“What I’m now thinking — though it certainly needs further work — is basically that the point of there being a phenomenally rich subjective present is that it provides a new domain for selfhood. Gottlob Frege, the great logician of the early 20th century, made the obvious but crucial observation that a first-person subject has to be the subject of something. In which case we can ask, what kind of something is up to doing the job? What kind of thing is of sufficient metaphysical weight to supply the experiential substrate of a self — or, at any rate, a self worth having? And the answer I’d now suggest is: nothing less than phenomenal experience — phenomenal experience with its intrinsic depth and richness, with its qualities of seeming to be more than any physical thing could be.”
So says Nicholas Humphrey, in this interview at Edge.
Nicholas Humphrey is a research psychologist whose interests are wide ranging: He studied mountain gorillas with Dian Fossey in Rwanda; was the first to demonstrate the existence of “blindsight” after brain damage in monkeys; and is the only scientist ever to edit the literary journal Granta. Thirty years ago he breathed life into the newly developing field of evolutionary psychology with his theory about “the social function of intellect.” His more recent ideas concern the nature of phenomenal consciousness.
Unlike Daniel C. Dennett, who sees the role of philosophers as disabusing people of their “primitive” ideas about the nature of consciousness, Humphrey believes that we should take these primitive intuitions at face value. If people say that the problem is what it “feels like” to be conscious, then the problem is indeed to explain “feeling.” Humphrey and Dennett are a pair of bookends. Humphrey has been described as a “romantic scientist”, who believes in the heuristic value of stories that go beyond the limits of established facts. But he would probably not agree that there is a hard and fast line between facts and stories. “I’m me,” he says. “I’m living an embodied existence, in the thick moment of the conscious present. I’m trying to work out why.”
“Mervyn Jacobson, executive chairman of the small Australian biotech firm Genetic Technologies Ltd. (GTG), failed to create much of a stir in biotech circles when his company secured patents for the 97 percent of our genome known as “junk DNA.” Recently, however, researchers have come to realize that the non-coded DNA that was long considered extraneous may represent the next generation of genetic research, and many will likely be hearing from Jacobson.
Jacobson’s Melbourne-based company, whose numerous patents cover the analysis and mapping of junk DNA in all genes and all species, may actually own the rights to unraveling the etiology of diseases such as HIV, Alzheimer’s, and several forms of cancer—at least until his patents begin running out, in 2010. If, as looks increasingly likely, more researchers focus on junk DNA, Jacobson—whose company secures the majority of its revenue from licensing fees—will be very busy doing what he’s become notorious for: sending polite letters to biotech companies around the world reminding them that they owe him a lot of money.”
Read more here at Seed Magazine.
“In February 2003, astronomers involved in the search for extraterrestrial intelligence (SETI) pointed the massive radio telescope in Arecibo, Puerto Rico, at around 200 sections of the sky.
The same telescope had previously detected unexplained radio signals at least twice from each of these regions, and the astronomers were trying to reconfirm the findings. The team has now finished analysing the data, and all the signals seem to have disappeared. Except one, which has got stronger.
This radio signal, now seen on three separate occasions, is an enigma. It could be generated by a previously unknown astronomical phenomenon. Or it could be something much more mundane, maybe an artefact of the telescope itself.
But it also happens to be the best candidate yet for a contact by intelligent aliens in the nearly six-year history of the SETI@home project, which uses programs running as screensavers on millions of personal computers worldwide to sift through signals picked up by the Arecibo telescope.”
This article was posted yesterday at New Scientist.
In Chapter 17 of The Prince, Machiavelli writes:
“Returning to the question of being feared or loved, I come to the conclusion that, men loving according to their own will and fearing according to that of the prince, a wise prince should establish himself on that which is in his own control and not in that of others; he must endeavour only to avoid hatred, as is noted.”
This last part is generally omitted when people insist that it is better to be feared than to be loved. The danger for the United States is that the way it has fought the recent wars has excited hatred in much of the world, and hatred is a force that can be used to override fear.
Now Immanuel Wallerstein raises a related and more terrifying question for the United States.
“The CFR [Council on Foreign Relations] published a commentary on the poll by three of its Fellows – Lee Feinstein, James M. Lindsay, and Max Boot. Here is their analysis:
These disparities suggest something deeper than divisions over the Iraq war are at work. Bush supporters and Kerry supporters are taking sides in the longstanding debate over the relative importance of ‘hard’ versus ‘soft’ power. Will the U.S. be safer and more prosperous if it is feared, or if it is loved? Are America’s military strength, and the willingness to use it, what count most, or is America’s reputation abroad equally important?
I believe that this commentary is correct, but it evades an important analytic question, which seems to have escaped the attention of the three CFR Fellows, and probably of the large bulk of the American population.
Suppose the United States is neither feared nor loved? Is this credible? And if so, what are the implications of such a view of the U.S. by people elsewhere for war and peace, geopolitical realignments, and the U.S. view of itself in the decades to come?” (My emphasis. Read on, here.)
Martha Nussbaum, one of my favorite living political philosophers, has a new paper on what the capabilities approach on political justice implies on the international level.
“The capabilities approach is an outcome-oriented approach. It says that a world in which people have all the capabilities on the list is a minimally just and decent world. Domestically, it interprets the purpose of social cooperation as that of establishing principles and institutions that guarantee that all human beings have the capabilities on the list, or can effectively claim them if they do not. . . . In the international case, how should the approach proceed? . . . We think about human dignity and what it requires. My approach suggests that we ought to do this in an Aristotelian/Marxian way, thinking about the prerequisites for living a life that is fully human rather than subhuman, a life worthy of the dignity of the human being. We include in this idea the idea of the human being as a being with, in Marx’s phrase, “rich human need,” which includes the need to live cooperatively with others. We insist that a fundamental part of the good of each and every human being will be to cooperate together for the fulfillment of human needs and the realization of fully human lives. We now argue that this fully human life requires many things from the world: adequate nutrition, education of the faculties, protection of bodily integrity, liberty for speech and religious self-expression – and so forth. If this is so, then we all have entitlements based on justice to a minimum of each of these central goods.”
Those interested in her paper, “Beyond the Social Contract: Capabilities and Global Justice,” can find it here.
The story of the Swiss Muslim scholar Tariq Ramadan, who was to become Henry R. Luce professor of religion, conflict and peace building at Notre Dame before the Department of Homeland Security revoked his visa, has become a trope of the times. Some see in these events a story of how the new post-9/11 environment has been destroying an open society. Others see a sensible measure in circumstances of war. For an instance of the latter, see Daniel Pipes’ comments here. Scott Martens at Fistful of Euros has responses here and here. Ramadan himself has this to say in the op-ed pages of today’s New York Times.
Assuming rationality and egoism, it has always been difficult to explain why people punish violators of norms and agreements. Punishment has been considered a classic free rider problem. Everyone receives the benefits of punishing violators or defectors: norms are maintained, and the example deters future malfeasance by others. But the cost is borne entirely by the punisher, and it can be high. Those who carry out punishments effectively are engaging in an act of altruism. . . well, sort of. New research suggests that punishment may be its own reward.
“As the journal Science puts it, the study reveals what goes on in Dirty Harry’s head when ‘he succinctly informs a norm violator that he anticipates deriving satisfaction from inflicting altruistic punishment’. . .
The researchers determined that deciding to impose this penalty, an altruistic punishment, activated a brain region, the dorsal striatum, involved in experiencing enjoyment or satisfaction.” (Read on, here. And those with access to Science can read the full report of the study “Sweet Revenge?” in volume 305, pages 1246-1247)
It was bound to happen at the mass market level and offer the promise of new kinds of yearbooks, wedding albums, and vacation photos. This article explains. (click the “click here” in the right box to see the process.)
“New animation software can turn digital videos into smoothly animated cartoons.
Computer scientist Michael Cohen, of Microsoft Research in Redmond, Washington, honed the prototype on a video of his daughter, Lena. The software scans the film for prominent objects – such as Lena swinging on monkey bars – then turns that movement into a cartoon.”
Wednesday, September 1, 2004
“For too many Arab intellectuals, Saddam Hussein remains an admirable Antar. One Egyptian lawyer who has come forward to help represent Saddam at his upcoming trial has said on Iraqi television that to defend Saddam is to defend the honor and dignity of the Arabs, as if it were not possible to criticize the US occupation of Iraq while rejoicing in the overthrow of a butcher.”
Short opinion piece on Mohammed Barrada’s short story and the state of the Arab world, by Charles Paul Freund, here in the Daily Star.
“Scientists have proposed before that humans have a history of polygyny (our sperm, for example, looks like the sperm of polygynous apes and monkeys). But with these new DNA results, the Arizona researchers have made a powerful case that polygyny has been common for tens of thousands of years across the Old World. It’s possible that polygyny was an open institution for much of that time, or that secret trysts made it a reality that few would acknowledge. What’s much less possible is that monogamy has been the status quo for 50,000 years.
People are perfectly entitled to disagree over what sort of marriage is best for children or society. But if you want to bring nature or tradition into the argument, you’d better be sure you know what nature and tradition have to say on the subject.”
This is from Carl Zimmer’s blog here.
“When great science minds collide, the insults traded and the bile spilt have been both personal and scandalous. But all too often, the victor’s reputation is scrubbed clean by the passage of history. William Hartston rakes up some of the muck that has always been part and parcel of the nature of scientific practice, but that few of us know about.”
This is a series of five programs on squabbles in science, done by the BBC:
1. Isaac Newton and Gottfried Wilhelm Leibniz
2. Joseph Priestley and Antoine Laurent Lavoisier
3. Henry Thomas De La Beche and Roderick Impey Murchison
4. Trofim Denisovitch Lysenko and Nikolai Ivanovitch Vavilov
5. Arthur Stanley Eddington and Subrahmanyan Chandrasekhar
Read more and listen to the programs here.
“Have we gotten any smarter over the past quarter century, or have instant access to information and all those wonderful technological advances made us more confused and far crazier than we were in 1979? That’s the question that scientists, hackers and artists will debate this week at Ars Electronica, the world’s largest festival of technology and art. This year’s theme is ‘Timeshift — The World in 25 Years,’ in honor of the 25th anniversary of the festival, which will be held Sept. 2-7 in Linz, Austria. The many exhibitions, symposiums and events are all intended to identify the ideas that are likely to be the driving forces in art, technology and society over the next quarter century.”
More here from Wired.
Tuesday, August 31, 2004
“The central image in Art Spiegelman’s new book of comics is that of the north tower’s glowing skeletal form, incandescent and ghostly in the fleeting seconds before its collapse: a searing image, witnessed by the author himself, that sunny morning of Sept. 11, 2001. It is an image that conjures up the moment when history swerved from its expected course and time seemed to stop, and an image, too, that embodies the haunting aftermath of 9/11, the afterimage that’s been burned into our collective imagination… It is a testament to Art Spiegelman’s uncompromising vision that ‘In the Shadow of No Towers’ – his account of 9/11 and its aftermath – makes no effort to contain or domesticate the surreal awfulness of that day.”
Book review here by Michiko Kakutani in the New York Times.
Maybe this explains my temper: “Sometimes it can be worth judging by appearances: it seems that people with less symmetrical features are likely to be more aggressive. In a study of stressful telephone conversations, those with uneven faces and bodies were more prone to angry reactions.” More here from Nature.
Jordan Ellenberg reviews Everything and More: A Compact History of Infinity by David Foster Wallace here in Seed Magazine. If you want to get a substantive taste of the subtle intricacies of Cantorian mathematics, read this challenging but ultimately rewarding book. But you have to be willing to get used to David Foster Wallace’s eccentric and sometimes bratty prose.