Montesquieu


Montesquieu would make most everyone’s top-ten list of political philosophers, but he is not prominent in the ranks of natural philosophers. Following the lead of the American Founders, who referred to him as “the celebrated Montesquieu,” we associate his name with new discoveries and improvements in the science of politics rather than science proper. However, as a young man in his late twenties, decades before the publication of his masterwork, The Spirit of the Laws (1748), Montesquieu seems to have been interested in a variety of scientific questions.

The young nobleman was elected to the Academy of Bordeaux in 1716. In keeping with that body’s preference for scientific endeavors, Montesquieu shifted away from literary and political explorations. Although his first presentation to the Academy was a “Discourse on the politics of the Romans in religion,” his subsequent offerings owed more to Descartes than Machiavelli.

more from The New Atlantis here.

the end


The post-catastrophic novel began with Mary Shelley’s The Last Man (1826), in which a plague kills most of humanity and provokes incessant warfare. Plague remains the triggering calamity in much post-catastrophe fiction up through the Manhattan Project; even as late as George Stewart’s Earth Abides (1949), plague rather than nuclear war is the problem. But between the invention of James Watt’s coal-fired steam engine in 1784 and the start of the Cold War, the most haunting sci-fi visions were not visions of the end of the world. They were visions—in dystopian novels like We, Brave New World, and 1984—of the consolidation of technological civilization into a system of total social control. Zamyatin, Huxley, and Orwell did not imagine a time when the boots stamping on human faces could no longer be industrially manufactured, so that people would return to smashing one another’s faces the old-fashioned way, with stones. The bombing of Hiroshima revived this notion of a reduced, brutally simplified future; and from Nevil Shute’s On the Beach (1957) to Denis Johnson’s Fiskadoro (1985), through many novels in between, the idea of a future more primitive than the past ran alongside the idea of a future ever more technologically advanced.

more from n+1 here.

the suffering of keats


In July, 1820, John Keats published his third and final book, “Lamia, Isabella, The Eve of St. Agnes and Other Poems.” He had no reason to expect that it would be a success, with either the public or the critics: in his short career, the twenty-four-year-old poet had known nothing but rejection on both fronts. After his first book, “Poems,” appeared, in 1817, his publishers, the brothers Charles and James Ollier, refused to have anything more to do with him. In a letter to the poet’s brother George, they wrote, “We regret that your brother ever requested us to publish his book, or that our opinion of its talent should have led us to acquiesce in undertaking it.” They went on, “By far the greater number of persons who have purchased it from us have found fault with it in such plain terms, that we have in many cases offered to take the book back rather than be annoyed with the ridicule which has, time after time, been showered upon it.”

more from The New Yorker here.

greenaway’s last supper


With a glint of a dagger and a blaze of celestial light, Leonardo da Vinci’s The Last Supper burst into new life on Monday night after Peter Greenaway finally secured permission to reinvent the crumbling, 510-year-old masterpiece as a sound and light show.

In a remarkable coup for the British film director, the Italian authorities allowed Greenaway to wheel a battery of projectors, computers and speakers into the usually hushed and air-sealed refectory of Santa Maria delle Grazie, where the image of Christ telling the apostles one of them will betray him decorates an end wall. Inside, Greenaway unveiled a provocative vision of one of Christianity’s most sacred and fragile paintings, reimagined “for the laptop generation”.

more from The Guardian here.

Wednesday Poem

///
The City That Never Sleeps

Federico García Lorca

In the sky there is nobody asleep.  Nobody, nobody.

Nobody is asleep.

The creatures of the moon sniff and prowl about their cabins.

The living iguanas will come and bite the men who do not dream,

and the man who rushes out with his spirit broken will meet on the
            street corner

the unbelievable alligator quiet beneath the tender protest of the
            stars.


Nobody is asleep on earth.  Nobody, nobody.

Nobody is asleep.

In a graveyard far off there is a corpse

who has moaned for three years

because of a dry countryside on his knee;

and that boy they buried this morning cried so much

it was necessary to call out the dogs to keep him quiet.


Life is not a dream.  Careful!  Careful!  Careful!

We fall down the stairs in order to eat the moist earth

or we climb to the knife edge of the snow with the voices of the dead
            dahlias.

But forgetfulness does not exist, dreams do not exist;

flesh exists.  Kisses tie our mouths

in a thicket of new veins,

and whoever his pain pains will feel that pain forever

and whoever is afraid of death will carry it on his shoulders.


One day

the horses will live in the saloons

and the enraged ants

will throw themselves on the yellow skies that take refuge in the
            eyes of cows.


Another day

we will watch the preserved butterflies rise from the dead

and still walking through a country of gray sponges and silent boats

we will watch our ring flash and roses spring from our tongue.

Careful!  Be careful!  Be careful!

The men who still have marks of the claw and the thunderstorm,

and that boy who cries because he has never heard of the invention
            of the bridge,

or that dead man who possesses now only his head and a shoe,

we must carry them to the wall where the iguanas and the snakes
            are waiting,

where the bear’s teeth are waiting,
where the mummified hand of the boy is waiting,
and the hair of the camel stands on end with a violent blue shudder.


Nobody is sleeping in the sky.  Nobody, nobody.

Nobody is sleeping.

If someone does close his eyes,

a whip, boys, a whip!

Let there be a landscape of open eyes

and bitter wounds on fire.

No one is sleeping in this world. No one, no one.

I have said it before.

No one is sleeping.

But if someone grows too much moss on his temples during the
            night,

open the stage trapdoors so he can see in the moonlight

the lying goblets, and the poison, and the skull of the theaters.

Translation: Robert Bly

///

Revolutions per Minute

From Orion Magazine:

Sex before marriage. Bob and his boyfriend. Madame Speaker. Do those words make your hair stand on end or your eyes widen? Their flatness is the register of successful revolution. Many of the changes are so incremental that you adjust without realizing something has changed until suddenly one day you realize everything is different. I was reading something about food politics recently and thinking it was boring.

Then I realized that these were incredibly exciting ideas—about understanding where your food comes from and who grows it and what its impact on the planet and your body is. Fifteen or twenty years ago, hardly anyone thought about where coffee came from, or milk, or imagined fair-trade coffee. New terms like food miles, fairly new words like organic, sustainable, non-GMO, and reborn phenomena like farmers’ markets are all the result of what it’s fair to call the food revolution, and it has been so successful that ideas that were once startling and subversive have become familiar en route to becoming status quo. So my boredom was one register of victory.

More here.

THE END OF THEORY: Will the Data Deluge Make the Scientific Method Obsolete?

From Edge:

Sixty years ago, digital computers made information readable. Twenty years ago, the Internet made it reachable. Ten years ago, the first search engine crawlers made it a single database. Now Google and like-minded companies are sifting through the most measured age in history, treating this massive corpus as a laboratory of the human condition. They are the children of the Petabyte Age.

The Petabyte Age is different because more is different. Kilobytes were stored on floppy disks. Megabytes were stored on hard disks. Terabytes were stored in disk arrays. Petabytes are stored in the cloud. As we moved along that progression, we went from the folder analogy to the file cabinet analogy to the library analogy to — well, at petabytes we ran out of organizational analogies. According to Chris Anderson, we are at “the end of science,” that is, science as we know it: “The quest for knowledge used to begin with grand theories. Now it begins with massive amounts of data. Welcome to the Petabyte Age.”

More here.

Tuesday, July 1, 2008

wood and the text


IT is not unusual, post-Foucault, to observe the decline of God as a source of meaning in the West since the Enlightenment and the subsequent diminishment of the power of the Bible. Nor is it unusual to point out that this occurred side by side with the rise of the novel.

In fact, in The Art of the Novel (1985), Czech novelist Milan Kundera makes an even bolder claim on behalf of the novel, as not only an expression of the 17th century’s dawning humanism, but as one of its chief enabling technologies. By creating stories and characters that require the suspension of authorial judgment in order to come fully alive, Kundera argues, the novel is a form that is per se sceptical, and that makes us think beyond the boundaries of any religious dogma.

But in his reviews and two books of critical essays, The Broken Estate (1999) and The Irresponsible Self (2005), British literary critic James Wood makes an even more extravagant claim for the novel. Wood’s critical practice is based on the idea that fiction at a certain point took over the cachet and power of the sacred. For Wood, the best novelists combine the humane scepticism of the novel form with a quasi-religious drive to improve it.

At the bottom line of Wood’s writing is a conception of the novel that is almost kabbalistic.

more from The Australian here.

nas


To say that Nas’s new album is one of the most anticipated of the summer is true, but it misses the point—every Nas album is highly anticipated, because no rapper is held to as high a standard. When a Nas record is about to drop, hip-hop fans cross their fingers and wonder: Will he change hip-hop forever, again? Will the new record be as good as Illmatic?

Released in 1994, when Nas was 20, Illmatic was a prodigious debut. No other rap debut, and few rap albums at all, are as lauded. It’s hard to quantify the album’s achievement precisely, except to say that rapping is a craft, and Nas was the first to discover how to do it right. Rap has two components—beats and rhymes. On Illmatic, the beats were mostly good, sometimes great; and there had been virtuoso emcees before Nas who’d moved the stylized rhythms of early groups like N.W.A toward the conversational. But no one had ever sounded as natural as Nas. “One Love,” which takes the form of a monologue to an incarcerated friend, exploits poetic devices like enjambment so subtly that it works as prose. Every rapper who hopes to be taken seriously—from Kanye West to the Game—must grapple with Nas’s discovery.

more from New York Magazine here.

the pitiless dahlberg


He grew up in Missouri, the son of a lady barber. And in order to get a flavor of the man one must read lines like these, describing the orphanage where he spent childhood time:

They were a separate race of stunted children who were clad in famine. Swollen heads lay on top of ashy uniformed orphans. Some had oval or oblong skulls; others gigantic watery occiputs that resembled the Cynocephali described by Hesiod and Pliny. The palsied and the lame were cured in the pool of Bethesda, but who had enough human spittle to heal the orphans’ sore eyes and granulated lids.

Dahlberg talked and wrote like this. Unlike Charles Olson, whom he’d met at Harvard and whose work would always return to postwar European philosophy and American politics, the autodidactic Dahlberg had identified with the proletarian underground since the twenties—and with ancient texts; he went into seven years of withdrawal from writing to study these. His first book, Bottom Dogs, had an introduction by D.H. Lawrence. After his withdrawal, he renounced his former self, his politics, everyone he knew, almost all men who aspired to write, and his early works.

more from Poetry here.

darwin and wallace


In early 1858, on the island of Ternate in the Malay Archipelago, a young specimen collector was tracking the island’s elusive birds of paradise when he was struck by malaria. ‘Every day, during the cold and succeeding hot fits, I had to lie down during which time I had nothing to do but to think over any subjects then particularly interesting me,’ he later recalled.

Thoughts of money or women might have filled lesser heads. Alfred Russel Wallace was made of different stuff, however. He began thinking about disease and famine; about how they kept human populations in check; and about recent discoveries indicating that the earth’s age was vast. How, he wondered, might these waves of death, repeated over aeons, influence the make-up of different species?

Then the fever subsided – and inspiration struck. Fittest variations will survive longest and will eventually evolve into new species, he realised. Thus the theory of natural selection appeared, fever-like, in the mind of one of our greatest naturalists. Wallace wrote up his ideas and sent them to Charles Darwin, already a naturalist of some reputation. His paper arrived on 18 June, 1858 – 150 years ago last week – at Darwin’s estate in Downe, in Kent.

Darwin, in his own words, was ‘smashed’.

more from The Guardian here.

memory is home


PERHAPS THE most idiosyncratic characteristic of twentieth-century despotism was its obsession with historical revision. When considered against history’s many brutal tyrants, Hitler, Mussolini, Mao, and Stalin stand out as pathological rewriters of personal and state history. If, as Stalin said, a totalitarian regime imposed ideological consent through the engineering of human souls, then much of this effort was spent creating and enforcing elaborate counter-histories. “Day by day and almost minute by minute,” George Orwell wrote in 1984, “the past was brought up to date.”

What ails a polity, however, can also cure it, and the late twentieth century’s civil resistance to totalitarianism was directed not only against the state’s nefarious reach into the present but also into the past. Retrospection—in its refusal to participate in the present—became the ultimate technique of antipolitics. Soviet-bloc novelists Milan Kundera, Ivan Klíma, Josef Škvorecký, Czesław Miłosz, Danilo Kiš, and George Konrád not only resisted the reigning culture of kitsch; they became vast cataloguers of personal history, practitioners of what can be called the “semi-autobiography.” Their novels served as covert antihistories that, in their non-linearity and depoliticization of memory, refused to accept the pervasive oneness of state history.

more from Dissent here.

the ongoing islam debate at sign and sight

A common feature of Bruckner’s kind of polemics is the frequent use of the words “appeasement” and “collaborator”. This is rarely done innocently. The idea is to associate people who seek an accommodation with the majority of Muslims with Nazi collaborators. Unless he is simply being vicious, this can only mean that Bruckner sees the rise of Islamism as something on a par with the emergence of the Third Reich. If so, he is not alone. While seeing the dangers of Islamism, I regard this as too alarmist.

But here we get to the final Brucknerian sleight of hand, for after all his huffing and puffing about not giving an inch to the Muslims, about defending Ayaan Hirsi Ali against “the enemies of freedom,” such as myself, he suddenly concludes that “there is nothing that resembles the formidable peril of the Third Reich” and even that “the government of Mullahs in Tehran is a paper tiger.” Now it is us, the armchair philosophers, who are the panic-stricken alarmists, who have lost the courage to “defend Europe.” Now where have we heard that kind of thing before? The need to defend Europe against alien threats; the fatigued, self-doubting, weak-kneed intellectuals… but no, now I am descending to the level of Pascal Bruckner, the rebel king of the Left Bank.

more here.

dick


When an art form or genre once dismissed as kids’ stuff starts to get taken seriously by gatekeepers – by journals, for example, such as the one you are reading now – respect doesn’t come smoothly, or all at once. Often one artist gets lifted above the rest, his principal works exalted for qualities that other works of the same kind seem not to possess. Later on, the quondam genius looks, if no less talented, less solitary: first among equals, or maybe just first past the post. That is what happened to rock music in the late 1960s, when sophisticated critics decided, as Richard Poirier put it, to start ‘learning from the Beatles’. It is what happened to comics, too, in the early 1990s, when the Pulitzer Prize committee invented an award for Art Spiegelman’s Maus. And it has happened to science fiction, where the anointed author is Philip K. Dick.

When he died in 1982, Dick was a cult figure, admired unreservedly in the science fiction subculture, and in the American counterculture as a chronicler of psychedelia and fringe religion. By then he had published more than thirty novels, most of them as fleeting mass-market paperbacks, and well over a hundred short stories, most of them in SF magazines. By dying in March, Dick missed the May premiere of Ridley Scott’s Blade Runner, the first movie made from his work.

more from the LRB here.

byatt spins


We think of our lives – and of stories – as spun threads, extended and knitted or interwoven with others into the fabric of communities, or history, or texts. An intriguing exhibition at Compton Verney in Warwickshire, The Fabric of Myth, mixes ancient and modern – Penelope’s shroud, unpicked nightly, with enterprising tapestries made in a maximum security prison out of unravelled socks. In an essay in the accompanying catalogue, Kathryn Sullivan Kruger collects words that connect weaving with storytelling: text, texture and textile, the fabric of society, words for disintegration – fraying, frazzling, unravelling, woolgathering, loose ends. A storyteller or a listener can lose the thread. The word “clue”, Kruger tells us, derives from the Anglo-Saxon cliwen, meaning ball of yarn. The processes of cloth-making are knitted and knotted into our brains, though our houses no longer have spindles or looms.

more from The Guardian here.

The Worms Crawl In: Scientist infects himself

From The New York Times:

In 2004, David Pritchard applied a dressing to his arm that was crawling with pin-size hookworm larvae, like maggots on the surface of meat. He left the wrap on for several days to make sure that the squirming freeloaders would infiltrate his system. “The itch when they cross through your skin is indescribable,” he said. “My wife was a bit nervous about the whole thing.” Dr. Pritchard, an immunologist-biologist at the University of Nottingham, is no masochist. His self-infection was in the interest of science.

While carrying out field work in Papua New Guinea in the late 1980s, he noticed that Papuans infected with the Necator americanus hookworm, a parasite that lives in the human gut, did not suffer much from an assortment of autoimmune-related illnesses, including hay fever and asthma. Over the years, Dr. Pritchard has developed a theory to explain the phenomenon. “The allergic response evolved to help expel parasites, and we think the worms have found a way of switching off the immune system in order to survive,” he said. “That’s why infected people have fewer allergic symptoms.” To test his theory, and to see whether he can translate it into therapeutic pay dirt, Dr. Pritchard is recruiting clinical trial participants willing to be infected with 10 hookworms each in hopes of banishing their allergies and asthma.

Never one to sidestep his own experimental cures, Dr. Pritchard initially used himself as a subject to secure approval from the National Health Service ethics committee in Britain.

More here.

Monday, June 30, 2008

Sunday, June 29, 2008

In Defense of Advertising

Winston Fletcher in New Humanist:

Hostility to advertising among British intellectuals goes back a long way. In 1843 Thomas Carlyle dubbed it a “deafening blast of puffery,” and at the end of that century the Society for Controlling the Abuses of Public Advertising (SCAPA) included among its members such notables as William Morris, Rudyard Kipling, Holman Hunt, Arthur Quiller-Couch and Sir John Millais – as well as Sydney Courtauld and the Fry chocolate family. But even then the public did not follow their leaders. Five hundred copies of SCAPA’s polemical leaflet were printed; only 30 were sold.

Still, the critics kept up their fire. Many of the attacks were well-worn retreads. But in 1980 Professor Raymond Williams took the arguments a stage further. Williams – an influential Marxist academic, social commentator, critic and novelist – published an essay called Advertising: The Magic System. Far from being too materialistic, Williams argued, modern advertising is not materialistic enough, because the images with which advertisements surround goods deliberately distract attention from the goods’ material specifications: “If we were sensibly materialist we should find most advertising to be an insane irrelevance,” he averred. In the 19th century, he said, more or less accurately, advertising was generally factual and informative, except for fraudulent patent medicine and toiletry advertisements, which had already adopted the undesirable practices that later became commonplace. In other words, Williams was not attacking all advertising, just most present-day advertisements.

Why, he asked, do advertisements exploit “deep feelings of a personal and social kind”? His answer: because the concentration of economic power into ever larger units forces those units to make human beings consume more and more, in order for the units to stay operative. “The fundamental choice… set to us by modern industrial production, is between man as a consumer and man as a user.”

 

Quantum Mechanics and a Test of Realism

Joshua Roebke in Seed:

In May 2004 Markus Aspelmeyer met Anthony Leggett during a conference at the Outing Lodge in Minnesota. Leggett, who had won the Nobel Prize the year before, approached Aspelmeyer, who had recently become a research assistant to Anton Zeilinger, about testing an idea he had first had almost 30 years before.

In 1976 Leggett left Sussex on a teaching exchange to the University of Science and Technology in Kumasi, the second-largest city in Ghana. For the first time in many years, he had free time to really think, but the university’s library was woefully out of date. Leggett decided to work on an idea that didn’t require literature because few had thought about it since David Bohm: nonlocal hidden variables theories. He found a result, filed the paper in a drawer, and didn’t think about it again until the early 2000s.

Leggett doesn’t believe quantum mechanics is correct, and there are few places for a person of such disbelief to now turn. But Leggett decided to find out what believing in quantum mechanics might require. He worked out what would happen if one took the idea of nonlocality in quantum mechanics seriously, by allowing for just about any possible outside influences on a detector set to register polarizations of light. Any unknown event might change what is measured. The only assumption Leggett made was that a natural form of realism hold true; photons should have measurable polarizations that exist before they are measured. With this he laboriously derived a new set of hidden variables theorems and inequalities as Bell once had. But whereas Bell’s work could not distinguish between realism and locality, Leggett’s did. The two could be tested.
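For readers curious what "inequalities as Bell once had" refers to, here is a brief editorial aside, not from the Seed article: the standard CHSH form of Bell's inequality, which any local hidden-variable theory must satisfy and which quantum mechanics violates. Leggett's inequalities play the analogous role for a class of nonlocal realistic theories.

```latex
% CHSH form of Bell's inequality (background sketch, not from the article).
% E(a,b) is the measured correlation of photon polarizations for detector
% settings a and b. Local hidden-variable theories require |S| <= 2;
% quantum mechanics predicts violations up to 2*sqrt(2) (Tsirelson's bound).
S = E(a,b) - E(a,b') + E(a',b) + E(a',b'),
\qquad |S| \le 2 \ \text{(local realism)},
\qquad |S| \le 2\sqrt{2} \ \text{(quantum mechanics)}.
```

An experiment that measures |S| > 2 rules out local hidden variables; Leggett-type tests similarly rule out the nonlocal realistic models he considered.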

A History of My People…American Nerds

Kevin O’Kelly reviews Benjamin Nugent’s American Nerd: The Story of My People, in boston.com:

Of the myriad changes that occurred in American society in the late 20th century, perhaps none was so surprising and subtle as the shift toward partial acceptance – and even occasional celebration – of the American nerd.

From the late 19th century onward, it was more or less accepted that the ideal purpose of American education and parenting was to produce athletic, popular young men and women, the sort who end up in business, law, or politics. But sometime during the 1980s it began to be a lot harder to dismiss the awkward kids with thick glasses, obsessive interests, and no social skills. Sure, life was still rough for those kids, but they were learning they weren’t alone, thanks to TV shows like “Square Pegs” and movies like “Sixteen Candles.” As computers began to play a larger role in business, education, and life in general, the former class presidents were learning that the former class geeks held everyone’s future in their hands. Soon one nerd (Alan Greenspan) was running the economy, another nerd (Al Gore) was running for president, and two unbelievably rich nerds (Bill Gates and Steve Jobs) were changing the ways a lot of us lived and worked.

Blending social history, memoir, and reportage, recovering nerd Benjamin Nugent takes us on a tour of the world of “my people,” who they are, and how they came to be. As the 19th-century educational movement alluded to above became pervasive in the nation’s schools (a movement perhaps best summarized by Groton headmaster Endicott Peabody’s remark “I’m not sure I like boys who think too much”), it was all too obvious that there were plenty of young men who would never fit the mold. “American Nerd” is in large part the story of how these young men (and later women) found subcultures where they did fit in.