Concise Fiction

In Wired, a few science fiction and fantasy writers answer the challenge to come up with very short stories, six words or fewer!

We’ll be brief: Hemingway once wrote a story in just six words (“For sale: baby shoes, never worn.”) and is said to have called it his best work. So we asked sci-fi, fantasy, and horror writers from the realms of books, TV, movies, and games to take a shot themselves.

Dozens of our favorite auteurs put their words to paper, and five master graphic designers took them to the drawing board. Sure, Arthur C. Clarke refused to trim his (“God said, ‘Cancel Program GENESIS.’ The universe ceased to exist.”), but the rest are concise masterpieces.

Failed SAT. Lost scholarship. Invented rocket. - William Shatner

Computer, did we bring batteries? Computer? - Eileen Gunn

Vacuum collision. Orbits diverge. Farewell, love. - David Brin

Gown removed carelessly. Head, less so. - Joss Whedon

Automobile warranty expires. So does engine. - Stan Lee

Machine. Unexpectedly, I’d invented a time - Alan Moore

Longed for him. Got him. Shit. - Margaret Atwood



Milton Friedman, 1912-2006

In The New York Times:

Milton Friedman, the grandmaster of conservative economic theory in the postwar era and a prime force in the movement of nations toward lesser government and greater reliance on free markets and individual responsibility, died today. He was 94 years old.

A spokesman for the Milton and Rose D. Friedman Foundation confirmed his death.

Conservative and liberal colleagues alike viewed Mr. Friedman as one of the 20th century’s leading economic scholars, on a par with giants like John Maynard Keynes, Joseph A. Schumpeter and Paul Samuelson.

Flying the flag of economic conservatism, Mr. Friedman led the postwar challenge to the hallowed theories of Lord Keynes, the British economist who maintained that governments had a duty to help capitalistic economies through periods of recession and to prevent boom times from exploding into high inflation.

The Social Responsibility in Teaching Sociobiology

David P. Barash in the Chronicle of Higher Education:

Socrates was made to drink hemlock for having “corrupted the youth of Athens.” Is sociobiology or — as it is more commonly called these days — “evolutionary psychology” similarly corrupting? Although the study of evolution is, in my opinion, one of the most exciting and illuminating of all intellectual enterprises, there is at the same time, and not just in my opinion, something dark about the implications of natural selection for our own behavior.

More here.

George Plimpton: An American Man of Letters

From the Plimpton Project (via Yahoo! Picks):

George Plimpton didn’t just write, he threw himself into the ring with his subjects, bringing a dash of bravado to the occupation of journalist. Now, a group of Plimptophiles has crafted an online homage to the man who “attacked life” with such “gusto and grace.” The photos section cuts quickly to the way George went about things: Here he is listening to Muhammad Ali, slouching backstage at Caesar’s, or joking with Jonathan Winters. There he is, caught in full regalia, playing with the Bruins, the Boston Celtics, and the Detroit Lions. No fan ever had it so good. The “Arcana” section features a marvelous collection of his quotes. And we’re glad to know that an effort to erect a larger-than-life-size monument to the author-adventurer is under way. As for whether he should be depicted alongside a bicycle, dangling his boxing mitts, or astride a noble steed, we vote for all three. With gusto and grace.

More here.

BEYOND REDUCTIONISM

From Edge:

Reinventing The Sacred by Stuart A. Kauffman: Stuart A. Kauffman studies the origin of life and the origins of molecular organization. Thirty-five years ago, he developed the Kauffman models, which are random networks exhibiting a kind of self-organization that he terms “order for free.” He asks a question that goes beyond those asked by other evolutionary theorists: if selection is operating all the time, how do we build a theory that combines self-organization (order for free) and selection? The answer lies in a “new” biology:

“While it may sound as if ‘order for free’ is a serious challenge to Darwinian evolution, it’s not so much that I want to challenge Darwinism and say that Darwin was wrong. I don’t think he was wrong at all. I have no doubt that natural selection is an overriding, brilliant idea and a major force in evolution, but there are parts of it that Darwin couldn’t have gotten right. One is that if there is order for free — if you have complex systems with powerfully ordered properties — you have to ask a question that evolutionary theories have never asked: Granting that selection is operating all the time, how do we build a theory that combines self-organization of complex systems — that is, this order for free — and natural selection? There’s no body of theory in science that does this. There’s nothing in physics that does this, because there’s no natural selection in physics — there’s self organization. Biology hasn’t done it, because although we have a theory of selection, we’ve never married it to ideas of self-organization. One thing we have to do is broaden evolutionary theory to describe what happens when selection acts on systems that already have robust self-organizing properties. This body of theory simply does not exist.”

More here.

At Radcliffe, Yale scholar talks about Great Britain and ‘brown babies’

From The Harvard Gazette:

World War II, with its influx of multiracial colonial volunteers and billeted American troops, was the caldron that created Great Britain as a state in which race became an instrument of policy and a tool of cultural division. That’s the thesis brought to the Radcliffe Institute for Advanced Study on Nov. 2 by Hazel V. Carby, a Yale University scholar of race, gender, and literature. The war, she said, prompted the emergence of Britain “as a modern racialized state.”

As early as 1942, the British Colonial office worried “what the future population of the nation would look like” in the face of a sexual invasion by black soldiers. By 1947, orphans of mixed race probably numbered in the hundreds, but the numbers were regularly inflated. “The lonely piccaninny” became a staple of the popular press, said Carby. It was an image that hid deeper fears of British cultural identity, and anxiety over a disappearing empire. Showing one tabloid image, Carby said, “A British subject is what this piccaninny is not.” In the end, she said, it was this war-induced “homegrown composite racial consciousness … that gave the English national culture its character, its meaning, its substance, and its resonance.”

Homi Bhabha, the Anne F. Rothenberg Professor of the Humanities and director of the Humanities Center, introduced Carby, whose work he called a robust confrontation with “intellectual pieties and scholarly orthodoxies.”

More here.

Wednesday, November 15, 2006

Of Human Bondage

Anthony Lane in The New Yorker:

Who said this: “It is interesting for me to see this new Bond. Englishmen are so odd. They are like a nest of Chinese boxes. It takes a very long time to get to the center of them. When one gets there the result is unrewarding, but the process is instructive and entertaining.” The speaker is Mathis, a kindly French liaison officer in “Casino Royale,” Ian Fleming’s first James Bond novel, published in 1953. More than half a century later, we are back with “Casino Royale,” No. 21 in the roster of official Bond films, and we are back with Mathis. As played by Giancarlo Giannini, who was recently seen having his intestines removed in “Hannibal,” he is pouchy, affable, and dangerously wise, and his presence hints that this new adventure will not be an occasion for silliness: no calendar girls, no blundering boffins, no giants with dentures of steel. The same goes for hardware, with rockets and gadgets alike being trimmed to the minimum. It is true that Bond keeps a defibrillator in the glove compartment of his Aston Martin, but, given the cholesterol levels of the kind of people who drive Aston Martins, a heart-starter presumably comes standard, like a wheel jack. Whether Bond has a heart worth starting is another matter.

More here.

Darwin at the Zoo

Jonathan Weiner in Scientific American:

It was not until a year and a half after his voyage on board the Beagle that Charles Darwin first came face to face with an ape. He was standing by the giraffe house at the London Zoo on a warm day in late March of 1838. The zoo had just acquired an orangutan named Jenny. One of the keepers was teasing her–showing her an apple, refusing to hand it over. Poor Jenny “threw herself on her back, kicked & cried, precisely like a naughty child,” Darwin wrote in a letter to his sister.

In the secret notebooks that he kept after the voyage, Darwin was speculating about evolution from every angle, including the emotional, and he was fascinated by Jenny’s tantrum. What is it like to be an ape? Does an orangutan’s frustration feel a lot like ours? Might she cherish some sense of right and wrong? Will an ape despair because her keeper is breaking the rules–because he is just not playing fair?

More here.

Micro Images

From the Micro Images Blog:

Chalk Dust (2500 X)

For the longest time, I wanted to see what chalk dust looked like up close. It turns out you don’t see all that much more; even at 2500x, the dust is pretty fine. It is interesting to see that chalk dust is made up of two general sizes of particles, and that they tend to clump together a bit.


Stoma (1500 X)

This is one of the millions of holes found on a common leaf, used for gas exchange in plants. What you see here is the waxy outer layer of the leaf. Inside the hole you can also see the remains of two guard cells, which help open and close this tiny pore to regulate air and water exchange. The leaf was already dried out when I found it, so this level of preservation is surprising; I did not expect to find anything so detailed left.


Cobweb (450 X)

Ever wonder what that very fine cobweb looks like close up? Wonder no longer. I wonder if some spiders make curlier webs than others. In the upper left corner, you can see a thread of some other fiber, which shows just how thin the cobwebs are.


More here.

The Secret History of Mathematicians

Daniel S. Silver in American Scientist:

Historian George Sarton often said that science advances in darkness, invisible to the majority of people, who are more interested in battles and other noisier activities. In his 1957 book The Study of the History of Mathematics, Sarton went on to say that if the history of science is secret, then the history of mathematics is doubly so, “for the growth of mathematics is unknown not only to the general public, but even to scientific workers.”

Sarton’s words help us understand why few have ever heard of Arthur Cayley (1821-95) or James Joseph Sylvester (1814-97), two of the most profound and prolific mathematicians of the Victorian era. Cayley’s seminal investigations of matrix algebra, which constituted only a tiny portion of his 967 papers, were crucial for the development of linear algebra. The terms matrix, determinant and Jacobian, familiar to most science students, were invented by Sylvester, an enthusiastic poet who called himself the “mathematical Adam.”

It is not clear when Cayley and Sylvester first met, but by 1847 they were corresponding to share thoughts about mathematics.

More here.  [Photo shows Arthur Cayley.]

ecofiction


The echo of Eco still lures philosophers tempted by literary fame. True to their calling, aspirants find the notion occurs to them as a hypothetical.

Suppose, the wannabe star reflects, I combine the profundities of truth and meaning I handle with my left hand in seminars with the fast-paced narrative ratiocination I prize in mysteries (the books I actually consume instead of rereading philosophy texts assigned in those seminars). Then I soak it all in the sex, blood, and historical detail that attracts me as a run-of-the-mill cultural citizen.

Wouldn’t I rival the success of Umberto Eco himself, whose The Name of the Rose (1983), with its wonderfully deductive William of Baskerville and his terribly loyal sidekick Adso, conquered international best-seller lists in the 1980s and launched the Bologna professor of semiotics on a heady mass-market career?

more from The Chronicle Review here.

hitch on clive


The great Peter De Vries, when asked about the nature of his ambition, replied that he yearned for a mass audience that would be large enough for his elite audience to despise. In this latest volume of his tragicomic autobiography, Clive James admits twice to a similar aspiration. Meeting the dazzling Nicholas Tomalin and accidentally making a good impression on him with a piece of gaucherie about wine, he finds (or fancies) that Tomalin is describing him round town as “the boy from the bush who could quote Wittgenstein”. Looking back at the close of North Face of Soho, he rues his own tendency to fall for projects “that would duplicate the effects of the Italian Renaissance while helping to save the baby seals in the rain forest”. The first of these moments comes just as James has left the Footlights in Cambridge to launch himself in the metropolis, and the second occurs when he is back home in Cambridge trying to recuperate from the flopperoo that was the West End launch of his mock-epic poem about the grooming of Prince Charles. This, in other words, is about that weird transitional interlude “the 1970s”; a decade of “becoming” for many boomers. He faithfully notes that many people tried to warn him about the “Charles Charming” fiasco – Mark Boxer discreetly, James Fenton firmly, your humble servant rudely and coarsely – so here might be the place to state that the 70s in London would have been infinitely less amusing without the willingness of Clive James to take chances including – which is that most vertiginous of all risks – the danger of making himself look ridiculous.

more from the TLS here.

when in doubt, dot


“When in doubt, dot,” Jennifer Bartlett has said, and so, as though to keep herself from doubting — her creativity? herself? — she obsessively dots, creating vivid grids of color dots. Sometimes grids within grids, as in Random Sequence, Random Changing Space and Color Titles with Samples (the subtitles of a stunning series of Untitled works from 1969). A useful way of understanding Bartlett is via Lucretius’ De Rerum Natura, with its Democritean vision of nature as a system of atoms in motion. It was an enlightened scientific view intended to liberate people from animistic superstition, which sees every natural thing — a mountain, a tree, a stream — as inhabited by some deity who must be appeased and of whom one must be wary. But there’s one catch to the system: Venus starts it. She sets the invisible atoms in motion: They “swerve” — deviate ever so slightly from their neat paths — to converge, forming molecules of visible matter, because of the goddess’ power. Eros gets the cosmos going and keeps it moving, generates its complex togetherness. Eros keeps it from running down — becoming an entropic grid.

more from Artnet here.

the long journey of Doris lessing


It is as if some gauze or screen has been dissolved away from life, that was dulling it, and like Miranda you want to say, What a brave new world! You don’t remember feeling like this, because, younger, habit or the press of necessity prevented. You are taken, shaken, by moments when the improbability of our lives comes over you like a fever. Everything is remarkable, people, living, events present themselves to you with the immediacy of players in some barbarous and splendid drama that it seems we are part of. You have been given new eyes.

—Doris Lessing, Time Bites

Doris Lessing, who turned eighty-seven in October, is telling us what “old” feels like. Not a believer in “the golden age of youth,” she “shudders” at the very idea of living through her teens again, even her twenties. Since she left Africa for England more than half a century ago, a single mother and a high school dropout with a wardrobe full of avatars—angry young woman, mother superior, bad-news bear, bodhisattva—she has published an astonishing fifty-five books. Although Time Bites is her first collection of articles, lectures, book reviews, and broadcasts, The Story of General Dann and Mara’s Daughter, Griot and the Snow Dog is her twenty-fifth novel. Nor does the fact that she’s four inches shorter than she used to be make her a shrinking violet. “Old” is as nice as she gets in Time Bites. Her default mode is usually imperious, as if ex cathedra were the normal respiration of her intelligence.

more from the NY Review of Books here.

Outlets Are Out


From Science:

Imagine recharging your cell phone without plugging it in. Or powering your iPod while you walk around the house with it. Researchers at the Massachusetts Institute of Technology (MIT) have taken the first steps towards such wireless energy transfer by conceptualizing a way to transmit electricity over room-size distances. One day, they say, the technology could power whole households or even motor vehicles wirelessly.

The MIT team calls the concept a nonradiative electromagnetic field. It involves two simple ring-shaped devices made of copper. One, connected to a conventional power source, would generate magnetic fields similar to those that power electric motors. These fields would stretch outward a few meters and would only affect the receiving–or companion device–which would be outfitted with a second copper ring tuned to a specific frequency. Team leader Marin Soljačić says he began working on the concept because he wanted to find a better alternative to having to recharge his laptop computer and cell phone so frequently. He presented the team’s findings today at an American Institute of Physics forum in San Francisco, California.

More here.

Soy and fish protect from cancer

From Scientific American:

People who ate soy regularly as children have a lower risk of breast cancer, researchers reported on Tuesday. And men who eat fish several times a week have a lower risk of colon cancer, a second team of researchers told a meeting in Boston of the American Association for Cancer Research. The studies add to a growing body of evidence about the role of diet in cancer. Cancer experts now believe that up to two-thirds of all cancers come from lifestyle factors such as smoking, diet and lack of exercise.

Dr. Larissa Korde of the National Cancer Institute and colleagues at the University of Hawaii studied 597 Asian-American women with breast cancer and 966 women without the disease. The mothers of some of the women were also available to answer questions about what they fed their daughters as children. The women who ate the most soy-based foods such as tofu and miso when aged 5 to 11 reduced their risk of developing breast cancer by 58 percent, the researchers found. “Childhood soy intake was significantly associated with reduced breast cancer risk in our study, suggesting that the timing of soy intake may be especially critical,” Korde said. It is not clear how soy might prevent cancer, although compounds in soy called isoflavones have estrogen-like effects.

More here.

Tuesday, November 14, 2006

Religiosity and Social Health

In the Skeptic:

It is commonly held that religion makes people more just, compassionate, and moral, but a new study suggests that the data belie that assumption. In fact, at first glance it would seem, religion has the opposite effect. The extensive study, “Cross-National Correlations of Quantifiable Societal Health with Popular Religiosity and Secularism in the Prosperous Democracies,” published in the Journal of Religion and Society (http://moses.creighton.edu/JRS/2005/2005-11.html) examines statistics from eighteen of the most developed democratic nations. It reveals clear correlations between various indicators of social strife and religiosity, showing that whether religion causes social strife or not, it certainly does not prevent it.

The author of the study, Gregory S. Paul, writes that it is a “first, brief look at an important subject that has been almost entirely neglected by social scientists…not an attempt to present a definitive study that establishes cause versus effect between religiosity, secularism and societal health.” However, the study does show a direct correlation between religiosity and dysfunctionality, which if nothing else, disproves the widespread belief that religiosity is beneficial, that secularism is detrimental, and that widespread acceptance of evolution is harmful.

Paul begins by explaining how far his findings diverge from common assumptions. He even quotes Benjamin Franklin and Dostoevsky to show how old these common misconceptions are. Dostoevsky wrote, “if God does not exist, then everything is permissible.” Benjamin Franklin noted, “religion will be a powerful regulator of our actions, give us peace and tranquility within our minds, and render us benevolent, useful and beneficial to others.”

Gregory Paul’s article in The Journal of Religion and Society (there may be something of an ecological fallacy here):

In general, higher rates of belief in and worship of a creator correlate with higher rates of homicide, juvenile and early adult mortality, STD infection rates, teen pregnancy, and abortion in the prosperous democracies (Figures 1-9). The most theistic prosperous democracy, the U.S., is exceptional, but not in the manner Franklin predicted. The United States is almost always the most dysfunctional of the developed democracies, sometimes spectacularly so, and almost always scores poorly. The view of the U.S. as a “shining city on the hill” to the rest of the world is falsified when it comes to basic measures of societal health. Youth suicide is an exception to the general trend because there is not a significant relationship between it and religious or secular factors. No democracy is known to have combined strong religiosity and popular denial of evolution with high rates of societal health. Higher rates of non-theism and acceptance of human evolution usually correlate with lower rates of dysfunction, and the least theistic nations are usually the least dysfunctional. None of the strongly secularized, pro-evolution democracies is experiencing high levels of measurable dysfunction.

The Sepoys of the Great Mutiny of 1857: Precursors to Al Qaeda?

William Dalrymple, in his new book The Last Mughal, suggests that the Great Mutiny of 1857, the Indian uprising against the British, contained precursors to Al Qaeda. In Outlook India, Irfan Habib responds. (Via signandsight.com).

[The historian] Percival Spear and ‘Talmiz Khaldun’ were doubtless pioneers in English in trying to look at the Mutiny in Delhi from the eyes of the Delhi court, citizenry and the sepoys. The fact that the sepoys had to live and get the money out of the Delhi citizenry always created problems for a city under siege by an implacable enemy. This was a situation partly specific to Delhi. But even so the role of the mutineers in facing these difficulties has been well underlined by Prof Iqbal Husain, for example, in his essay on Bakht Khan.

The reference to Bakht Khan brings me to consider Dalrymple’s rather unfortunate assumption that the Wahabis and Muslim sepoys were somehow the precursors of Al Qaeda and the Taliban. This ignores the vital fact that religion in 1857 was the medium through which a growing resentment against the multiple inequities of the British rule was expressed. Ray brings this out fairly well. The Bengal Army sepoys throughout maintained a surprising inter-communal unity among them, a fact noted by Syed Ahmed Khan in his Asbab Baghawat-i Hind. He admitted that the Hindu and Muslim sepoys, having shed their blood together for their British masters for so long, were now so closely linked to each other in a common brotherhood that they could not but fight till the end once the uprising had begun. Such anti-colonial spirit suggests analogies as strong with Vietnam as with Iraq or Palestine. It would be too narrow to see it in a ‘jehad’ framework of our own creation.

Danto on Botero’s Abu Ghraib

In The Nation, Arthur Danto on Fernando Botero’s Abu Ghraib:

Though transparently modern, Botero’s style is admired mainly by those outside the art world. Inside the art world, critic Rosalind Krauss spoke for many of us when she dismissed Botero as “pathetic.”

When it was announced not long ago that Botero had made a series of paintings and drawings inspired by the notorious photographs showing Iraqi captives, naked, degraded, tortured and humiliated by American soldiers at Iraq’s Abu Ghraib prison, it was easy to feel skeptical–wouldn’t Botero’s signature style humorize and cheapen this horror? And it was hard to imagine that paintings by anyone could convey the horrors of Abu Ghraib as well as–much less better than–the photographs themselves. These ghastly images of violence and humiliation, circulated on the Internet, on television and in newspapers throughout the world, were hardly in need of artistic amplification. And if any artist was to re-enact this theater of cruelty, Botero did not seem cut out for the job.

As it turns out, his images of torture, now on view at the Marlborough Gallery in midtown Manhattan and compiled in the book Botero Abu Ghraib, are masterpieces of what I have called disturbatory art–art whose point and purpose is to make vivid and objective our most frightening subjective thoughts. Botero’s astonishing works make us realize this: We knew that Abu Ghraib’s prisoners were suffering, but we did not feel that suffering as ours. When the photographs were released, the moral indignation of the West was focused on the grinning soldiers, for whom this appalling spectacle was a form of entertainment. But the photographs did not bring us closer to the agonies of the victims.

Botero’s images, by contrast, establish a visceral sense of identification with the victims, whose suffering we are compelled to internalize and make vicariously our own. As Botero once remarked: “A painter can do things a photographer can’t do, because a painter can make the invisible visible.”

The Hatred of Paris Hilton

In City Journal, Kay Hymowitz asks a question that I also have asked on many an occasion: why are people so obsessed with the life of a celebrity they deeply hate, Paris Hilton?

Paris certainly knows how to show off her considerable evolutionary advantages to the camera, where it matters most these days; she adroitly tilts her perfectly styled head like that, angles her sweetheart chin just so, arches her long, lean back comme ça, and gives that sideways, heavy-lidded, come-hither look (now known as a Come Fuck Me) that has bewitched fans since the days of Silver Screen.

But the evolutionary theory of celebrity does not begin to explain Paris Hilton mania for one reason: people hate the woman. She must be the most powerful snark magnet in history…

[T]o check out the megabytes of commentary that follow Paris’s every embarrassing move is to be struck by a loathing that confutes the Darwinian explanation. Cries of “nonentity,” “rich white trash,” “no-talent,” “brainless hussy,” and “hotel heirhead” echo throughout cyberspace. Politically incorrect slurs like “tramp,” “tart,” “slut,” “skank,” and “skanktron” have suddenly become acceptable again, as long as Paris is their target…

[T]he reason for this bile goes even deeper than Grove’s accurate indictment. What drives Americans crazy about Paris is what has incensed Americans since before the Revolution: her haughty air of highborn privilege. She is our Marie Antoinette: “I’m the closest thing to American royalty,” Paris explained when she wrote to Prince Charles to ask for permission to use Westminster Abbey or Windsor Castle for her wedding to her soon-to-be ex-fiancé.