How Bible publishers went forth and multiplied

Greg Beato in Reason:

In the 16th century, when William Tyndale translated the Bible from Hebrew and Greek into English, thereby unlocking the Word of God to the common man, he was rewarded for his efforts by being burned at the stake. So were most of the copies of his translation. In colonial times, it was illegal to print Bibles in North America; only certain printers in England and Scotland were authorized to publish the holy book. During the Revolution those imports stopped, creating, according to The Centennial History of the American Bible Society, a “famine of Bibles.” So in 1782 the Philadelphia printer Robert Aitken printed 10,000 copies of America’s first complete English Bible. The book came with a congressional endorsement, but when the war ended, cheap imports resumed, domestic competition exploded, and thousands of copies of the Aitken Bible failed to sell. In 1791, he wrote a letter to Pennsylvania’s tax man stating that he’d lost $4,000 on the venture. Today America is characterized by Biblical obesity, not Biblical famine. A 2003 survey conducted by Zondervan, one of the nation’s largest Christian book publishers, found that the average U.S. household contains 3.9 Bibles, and U.S. consumers purchase approximately 20 million new Bibles annually. “Business analysts describe Bible publishing as a mature industry with little prospect for strong growth,” The Boston Globe reported in 1986, but year in and year out, the Bible remains the best-selling book in America.

The glut, in fact, is what creates the demand. Long before Web 2.0 billionaires decided that $0.00 was a price point consumers would find even more tempting than Eve’s apple, Bible societies had started distributing millions of copies for free or at little cost to establish brand awareness, build a user base, and make the formerly expensive, scarce, and highly regulated item a ubiquitous presence in the culture.

More here.

Stories from Pakistan about living upstairs, downstairs and everywhere in between

Michael Dirda in the Washington Post:

Because of Salman Rushdie, Arundhati Roy and Rohinton Mistry, to mention just a few of the most prominent authors, American readers have long been able to enjoy one terrific Indian novel after another. But Daniyal Mueenuddin's In Other Rooms, Other Wonders is likely to be the first widely read book by a Pakistani writer. Mueenuddin spent his early childhood in Pakistan, then lived in the United States — he attended Dartmouth and Yale — and has since returned to his father's homeland, where he and his wife now manage a farm in Khanpur. These connected stories show us what life is like for both the rich and the desperately poor in Mueenuddin's country, and the result is a kind of miniaturized Pakistani “human comedy.”

In the original Comédie humaine, Balzac had the ingenious notion of tying his various novels together by using recurrent characters. Eugène de Rastignac is the protagonist of Le Père Goriot but is subsequently glimpsed in passing or sometimes just referred to in several other books. In like fashion, Mueenuddin interlaces eight stories, while also linking them to the household of a wealthy and self-satisfied landowner named K.K. Harouni. In “Saleema,” for instance, Harouni's elderly valet, Rafik, falls into a heartbreaking affair with a young maidservant, and we remember this, with a catch in our throat, when in another story we see him bring in two glasses of whiskey on a silver tray.

More here. And here's a nice video of Daniyal Mueenuddin and Mohsin Hamid from the Asia Society:

[Thanks to Laura Claridge.]

The Decline of the Decline of Arabic Science

Austin Dacey in the Skeptical Inquirer:

Just as soon as anyone notes the dismal state of science in contemporary Muslim-majority countries, someone else with a little knowledge of history will observe that the Islamic world was once the center of the scientific world, and Arabic was once the lingua franca. From the eighth to the end of the fourteenth centuries, the most important work in the fields of mathematics, astronomy, optics, and medicine took place under Muslim rule.

Before Europe’s first university had opened in Bologna, the House of Wisdom in Baghdad was amassing a library that reportedly housed as many as four hundred thousand volumes. There, under the patronage of the Abbasid dynasty, Arabic-speaking scholars—including Persians, Christians, Jews, and others—translated Greek texts by authors such as Aristotle, Plato, Pythagoras, Euclid, Ptolemy, Hippocrates, and Galen, as well as material in Persian, Syriac, and Sanskrit. It was not until the twelfth and thirteenth centuries that this ancient learning came to Europe, primarily by way of Muslim Spain. As late as the seventeenth century, European colleges still relied on the Canon, a medical textbook by Avicenna, the Latinized name of the medieval physician and polymath Ibn Sina.

This Golden Age is rightly held up as one of the glories of Arabic-Islamic civilization. However, it only makes more pointed the question of how Arabic-language science (defined broadly as natural philosophy) came to be so rapidly and totally surpassed by European science.

More here.

Aleksandar Hemon in Conversation with Colum McCann

In the Believer:

COLUM McCANN: What are we doing here? Why aren’t we in a pub?

ALEKSANDAR HEMON: Because you live in the provinces, far away from everything.

CM: So, we’re here… to talk (as the bishop to the hooker). The next question is: why are we here? That, of course, is easy to answer. But, seriously, sometimes I wonder if we—I mean, we, us, as writers—have to increasingly justify ourselves, you know, like visual artists, whose primary mode of entry into their art seems to be the painstaking explanation of it. Forget the painting. There’s a whole business built up around it. The artists have to acquire a specific language. Have you read any of those “statements of purpose” (!) by some of the contemporary artists? It’s like stepping through acres of fresh tar. You pick one foot up only to find the other sinking further.

AH: Actually, I have not read any of those statements of purpose, but I can imagine what they look like. I wouldn’t be so hard on artists, though. On the one hand, every artist, writers included, have an ethics and an aesthetics, whether they can formulate them or not. I happen to think that it is good to be able to formulate—it is good to know what you are doing and to be able to talk about it. On the other hand, art is so widely (and often thinly) spread, that anything can be it. A lot of it is nothing but a gesture, not an object, not a thing unto itself, and it literally does not exist without interpretation. I am all for interpretation, but for the past century or so, an interpretation can be slapped on everything and anything. Literature, on the other hand, is always something—it is either story or poetry, ideally both. That is, you always know what it is and even if the interpretation is not available, the experience of language is. Language is so inherent to humanity, so necessary for even basic thinking, that stories and poetry are available to anyone who can process language. So it’s easy for us.

CM: I happen to think that an ounce of empathy is worth a boatload of judgment. A writer can disease himself or herself with his or her own position, thinking about it too much. But, that said, I’m slightly off-put by our world getting increasingly rarefied, like the world of art, where we must justify ourselves with our meaning. Imagine constantly explaining ourselves. Like a football commentary or something…

Don’t Play With That, Or You’ll Go Blind

Caleb Crain in n+1:

James Cameron's 3-D movie Avatar [is]…a finished corruptness. The easiest way I can think of to describe it is by comparison with The Matrix, a movie which is merely disingenuous, and to some extent struggling with its disingenuousness. The moral lesson that The Matrix purports to offer is that the glossy magic of life inside a simulation distracts from painful truth. But the moral problem faced by The Matrix is that this lesson is betrayed by the fun that the movie has in playing inside the simulation. A viewer enjoys the scenes of jumping over buildings, and of freezing explosions and fistfights in midair and then rotoscoping through them. In fact, the viewer enjoys them much more than the scenes of what, within the conceit of the movie, is considered reality. There may be a brief yucky thrill to learning that in reality people are grown in pods so their energy can be harvested by robots, but as a matter of aesthetics, reality in The Matrix turns out to be drab and constricted by gravity and other laws of physics. The closing sequence, where Neo (Keanu Reeves) plugs back into the matrix and runs a sort of special-effects victory lap, makes no sense, in terms of the moral victory he is supposed to have won. If he has really joined the red-pill team, he ought to be sitting down to another bowl of bacterial gruel with his ragged, unshowered friends, and recommitting himself to the struggle. Instead he's leaping around in a Prada suit. So the viewer departs from the movie with a slightly queasy feeling, a suspicion that visual pleasures aren't to be trusted. That queasiness is the trace of the movie's attenuated honesty.

And such queasiness and honesty are completely absent from Avatar. Some might protest: But what about Avatar's anti-imperialism and anti-corporate attitudinizing? They're red herrings, in my opinion, planted by Cameron with the cynical intention of distracting the viewer from the movie's more serious ideological work: convincing you to love your simulation—convincing you to surrender your queasiness. The audacity of Cameron's movie is to make believe that the artificial world of computer-generated graphics offers a truer realm of nature than our own.

Thursday, January 21, 2010

Sinomania

The NYT reports that China is subsidizing Chinese language instruction in US high schools. Perry Anderson in the LRB:

These days Orientalism has a bad name. Edward Said depicted it as a deadly mixture of fantasy and hostility brewed in the West about societies and cultures of the East. He based his portrait on Anglo-French writing about the Near East, where Islam and Christendom battled with each other for centuries before the region fell to Western imperialism in modern times. But the Far East was always another matter. Too far away to be a military or religious threat to Europe, it generated tales not of fear or loathing, but wonder. Marco Polo’s reports of China, now judged mostly hearsay, fixed fabulous images that lasted down to Columbus setting sail for the marvels of Cathay. But when real information about the country arrived in the 17th and 18th centuries, European attitudes towards China tended to remain ones of awed admiration, rather than fear or condescension. From Bayle and Leibniz to Voltaire and Quesnay, philosophers hailed it as an empire more civilised than Europe itself: not only richer and more populous, but more tolerant and peaceful, a land where there were no priests to practise persecution and offices of the state were filled according to merit, not birth. Even those sceptical of the more extravagant claims for the Middle Kingdom – Montesquieu or Adam Smith – remained puzzled and impressed by its wealth and order.

A drastic change of opinion came in the 19th century, when Western predators became increasingly aware of the relative military weakness and economic backwardness of the Qing empire. China was certainly teeming, but it was also primitive, cruel and superstitious. Respect gave way to contempt, mingled with racist alarm – Sinomania capsizing into Sinophobia. By the early 20th century, after eight foreign forces had stormed their way to Peking to crush the Boxer Uprising, the ‘yellow peril’ was being widely bandied about among press and politicians, as writers like Jack London or J.A. Hobson conjured up a future Chinese takeover of the world. Within another few decades, the pendulum swung back, as Pearl Buck and Madame Chiang won popular sympathy for China’s gallant struggle against Japan. After 1948, in a further rapid reversal, Red China became the focus of still greater fear and anxiety, a totalitarian nightmare more sinister even than Russia. Today, the high-speed growth of the People’s Republic is transforming Western attitudes once again, attracting excitement and enthusiasm in business and media alike, with a wave of fashion and fascination recalling the chinoiserie of rococo Europe. Sinophobia has by no means disappeared. But another round of Sinomania is in the making.

When thinking of Haiti, don’t be fooled by its borders

Our own Morgan Meis in The Smart Set:

In the first years of the 19th century, Napoleon decided he'd had enough of the Haitian patriot, freedom fighter, and self-proclaimed defender of the French Revolution, Toussaint Louverture. Louverture had organized slave revolts in Haiti and defeated armies sent by the Spanish, the English, and the French. A man of the Enlightenment, he took the ideas of liberté, égalité, and fraternité quite seriously. Never sentimental, Napoleon realized that France's Caribbean colonies were heavy on the lucre, and that he needed slave labor to keep the profits flowing. Louverture had become a nuisance.

Louverture was tricked into a meeting and then captured by the French in 1802. He was brought back to France, where he lived and quickly expired in a little dungeon called Fort de Joux. But before his death, he’d managed to stir the hearts of quite a few. William Wordsworth was one. A youngish Wordsworth penned the following lines in honor of the great Haitian:

TOUSSAINT, the most unhappy of men!
Whether the whistling Rustic tend his plough
Within thy hearing, or thy head be now
Pillowed in some deep dungeon's earless den; –
O miserable Chieftain! where and when
Wilt thou find patience? Yet die not; do thou
Wear rather in thy bonds a cheerful brow:
Though fallen thyself, never to rise again,
Live, and take comfort. Thou hast left behind
Powers that will work for thee; air, earth, and skies;
There's not a breathing of the common wind
That will forget thee; thou hast great allies;
Thy friends are exultations, agonies,
And love, and man's unconquerable mind.

Wordsworth saw Toussaint as a participant in the promising events of the time: Toussaint in Haiti, the sans-culottes in France, the revolutionaries in New York and Boston. Recent events — not to mention the last 200 years of Haitian history — have proven Wordsworth a poor prophet when it comes to Haiti.

More here.

connecting the classical and the quantum

Working in a cramped MIT laboratory in 1961, meteorologist Edward Lorenz stumbled upon a new science. Wanting a closer look at the data of a weather simulation he was running, Lorenz restarted it in the middle. Within a few minutes, everything changed and the data he had expected to see had morphed into strange new patterns. A stunned Lorenz checked his inputs. He had rounded the starting values by about .0001, which should have been insignificant. And yet it was not. At the time, scientists thought small changes in starting values should make only a small difference in most systems. But sometimes such tiny shifts will cause a very different outcome, completely out of proportion with the size of the change—this hypersensitivity to initial conditions is what Lorenz dubbed the “butterfly effect” and what we now call chaos. Chaos, which underlies systems as diverse as fractals, ferns, and weather, causes behavior so complex and unexpected that, though it is fundamental to the natural order, it was only recently that scientists began to characterize it. Chaotic movement is unstable and unpredictable, but completely deterministic, meaning that it’s controlled by its starting conditions.
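
Lorenz's hypersensitivity is easy to reproduce numerically. The short Python sketch below is not from Greenwood's piece: it simply integrates the classic Lorenz equations, with their standard textbook parameters, from two starting points that differ by 0.0001, and prints how quickly the two runs drift apart.

```python
# A minimal sketch (not from the article): forward-Euler integration of the
# classic Lorenz system, run twice from starting x-values that differ by
# 0.0001, to show how a deterministic system amplifies a tiny rounding change.

def lorenz_step(x, y, z, dt, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz equations by one Euler step of size dt."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

def trajectory(x0, steps=40000, dt=0.001):
    """Return the full path starting from (x0, 1.0, 1.0)."""
    x, y, z = x0, 1.0, 1.0
    path = [(x, y, z)]
    for _ in range(steps):
        x, y, z = lorenz_step(x, y, z, dt)
        path.append((x, y, z))
    return path

a = trajectory(1.0)       # "true" starting value
b = trajectory(1.0001)    # the same run, rounded by about 0.0001

for step in (0, 5000, 10000, 20000, 40000):
    sep = sum((p - q) ** 2 for p, q in zip(a[step], b[step])) ** 0.5
    print(f"t = {step * 0.001:5.1f}   separation = {sep:.6f}")
```

With these parameters the two trajectories track each other closely at first and then separate to a distance comparable to the size of the attractor itself, which is the butterfly effect in miniature.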

more from Veronique Greenwood at Seed here.

Terror Incognito

An interview with Amitava Kumar on his new book, in Time Out Delhi:

In 2006, Amitava Kumar went to Walavati, south of Mumbai, to report the case of an accused terrorist, Iqbal Haspatel, whom he later wrote about in Time Out. Kumar, a writer who teaches English at Vassar College, had already explored the dark world of Islamophobia in Husband of a Fanatic, prompted by his marriage to a Pakistani Muslim. In his new book, Evidence of Suspicion, he goes a step further in asking why we hate the people we hate, in this case, men accused of terrorism. “Terror is the new fetish,” he told Raghu Karnad. “Its meaning is taken for granted.”

Before you wrote the Haspatel story, you obviously already had strong feelings about the way India prosecutes terrorism.

I decided to go to Walavati and meet the Haspatels because a piece of textile machinery – a bobbin – found in their living room had been mistaken for a projectile. They were tortured for days because the police had made a malicious mistake.

But before that, I had been working on another story, also about terrorism, but in the US. It was the story of Hemant Lakhani, a used-clothing salesman convicted of selling a missile to the FBI. Both illustrated the problems of the war on terror: the state’s desperation to find villains, its ability to produce only victims.

Philip Kitcher: Religion After Darwin

Kitcher's Hampshire College lecture:

Many people believe that Darwin’s theory of evolution by natural selection poses a threat to religion (specifically to Christianity). Dr. Kitcher suggests that, taken on its own, Darwin’s work can be assimilated by many world religions and many versions of Christianity. There is, however, a deeper problem.

The scientific approach that underlies Darwin’s achievements is inimical to all but the most liberal forms of religion. Once this point is appreciated, it is tempting to believe, as the militant Darwinian atheists of our time triumphantly proclaim, that religious practices should simply be eradicated.

Dr. Kitcher argues that this is incorrect, and that a genuinely humane secularism – a real Secular Humanism – should absorb some characteristically religious attitudes. We need to discard the myths offered by supernaturalist doctrines, but we also need what Dewey called “A Common Faith.”

The Work of Art in the Age of Mechanical Reproduction

Cosma Shalizi over at Three-Toed Sloth:

This thread over at Unfogged reminds me of something that's puzzled me for years, ever since reading this: why didn't prints displace paintings the same way that printed books displaced manuscript codices? Why didn't it become the expected thing that visual artists, like writers, would primarily produce works for reproduction? (No doubt, in that branch of the wave-function, obsessive fans still want to get the original drawings, but obsessive fans also collect writers' manuscripts, or even their typewriters, as well as their mass-produced books.) 16th century engraving technology was strong enough that it could implement powerful works of art (vide), so that can't be it. And by the 18th century at least writers could make a living (however precarious) from writing for the mass public, so why weren't visual artists (for the most part) able to do the same? (Again, it's manifestly not as though technology has regressed.) Why is it still the case that a real, high-class visual artist is someone who makes one-offs? I know that reproductions have been important since at least the late 1800s, but those have mostly been reproductions of works that first made their reputation as unique, hand-made objects, which is as though the only books which got sent to the printing press were ones which had first circulated to acclaim in manuscript.

Some possibilities I don't buy:

  1. Aesthetic limitations. There are valuable effects which can be achieved with a big original painting which prints just can't match. Response: there are effects you can achieve with an illuminated, calligraphic manuscript which you can't match with movable type, either. Those weren't valuable enough to keep printed books from taking over. Why the difference? Why not a focus on what can be done through prints, which is quite a lot? (Witness the experience of the 20th century and later, when most art lovers know most works of art they enjoy through reproductions.)

Thursday Poem

Alhazen of Basra

If I could travel a thousand years back
to August 1004, to a small tent
where Alhazen has fallen asleep among books
about sunsets, shadows, and light itself,
I wouldn’t ask whether light travels in a straight line,
or what governs the laws of refraction, or how
he discovered the bridgework of analytical geometry;
I would ask about the light within us,
what shines in the mind’s great repository
of dream, and whether he’s studied the deep shadows
daylight brings, how light defines us.

by Brian Turner

from Here, Bullet; Alice James Books, 2005

Behind your secret racism

From Salon:

Of the many viral-video meltdowns pop culture has endured, few are as viscerally disturbing, as painful to watch, as Michael Richards' racist rant during a 2006 stand-up appearance. As you'll no doubt remember, the man better known as Kramer lashed out at a heckler in his audience with a shocking string of slurs, including the brutally memorable line, “Fifty years ago, we'd have you upside down with a fork up your ass.” The breakdown so outraged the general public that even today, if you Google “Michael Richards,” it auto-completes to “Michael Richards racist.”

Shankar Vedantam, a science writer with the Washington Post, uses the Michael Richards incident in his new book, “The Hidden Brain,” to illustrate the way he believes our unconscious can betray us — and reveal biases we wouldn't even acknowledge to ourselves. Vedantam uses a wide array of vivid true stories to make his point: The tragic tale of a woman who is brutally beaten in front of dozens of onlookers illustrates how a crowd's inaction can trick our brain into ignoring pleas for help; two transsexuals who've experienced both sides of the gender divide help illuminate how unconscious sexism can change lives.

More here.

Your Brain’s Got Game

From Science:

Always stunk at video games? Perhaps you've been cursed with a small striatum, a region of the brain involved in learning and memory. Researchers have found that college students with relatively large striatums learned how to play a challenging video game faster than their small-striatum peers. Large-striatum individuals were also better at shifting priorities from, say, shooting a target to outrunning an enemy–abilities that could translate to the real world.

The game isn't exactly Halo or Assassin's Creed. Instead, Space Fortress looks a lot like the very first arcade games, with geometric shapes subbing for spaceships and buildings. “The graphics stink,” admits Arthur Kramer, a psychologist at the University of Illinois, Urbana-Champaign, who designed the game in the early 1980s. Gameplay is fairly complex, however: Players must shoot down a fortress with their ship while avoiding enemies, the bad guys look a lot like the good guys, and the ship has no brakes. Over the years, researchers have used the game to study memory, motor control, and learning speed. The U.S. Air Force and the Israeli air force have even changed their training regimens based on how cadets fared as players. Recent studies have suggested that players appear to heavily utilize their striatum during gameplay. So Kramer and Kirk Erickson, a psychologist at the University of Pittsburgh in Pennsylvania, decided to investigate whether the size of the striatum alone might be responsible for these abilities.

More here.

The Darwin Show

Steven Shapin in the London Review of Books:

Even conceding the more expansive claims for Darwin’s genius and influence, we’re still some way from understanding what the festivities have been about. There are other claimants for the prize of towering scientific genius, and for ‘making the modern world’, but none of them has been the occasion for global festivities on anything like this scale. The 400th anniversary of Galileo’s birth was 1964, and Descartes’s 1996; Newton’s Principia turned 300 in 1987; Einstein’s Wunderjahr papers in Annalen der Physik, changing the way physicists think about space, time and matter, had their centenary in 2005. All were duly marked, mainly by historians, philosophers and physicists, but there was nothing remotely approaching Darwin 200. Even if we had an unambiguous metric for ranking scientific genius and modernity-making – one by which Galileo, Descartes, Newton and Einstein were chopped liver compared to Darwin – neither genius nor influence would be a sufficient explanation for the events of 2009.

The very idea of paying homage to the great scientists of the past is problematic. Scientists are not widely supposed either to be heroes or to have heroes. Modern sensibilities insist on scientists’ moral equivalence to anyone else, and notions of an impersonal Scientific Method, which have gained official dominance over older ideas of scientific genius, make the personalities of scientists irrelevant in principle. Honouring past scientists is therefore a different sort of thing from, say, paying homage to history’s generals, politicians or, indeed, imaginative artists.

More here.

Diamond Oceans Possible on Uranus

Eric Bland in Discovery News:

Oceans of liquid diamond, filled with solid diamond icebergs, could be floating on Neptune and Uranus, according to a recent article in the journal Nature Physics.

The research, based on the first detailed measurements of the melting point of diamond, found diamond behaves like water during freezing and melting, with solid forms floating atop liquid forms. The surprising revelation gives scientists a new understanding about diamonds and some of the most distant planets in our solar system.

“Diamond is a relatively common material on Earth, but its melting point has never been measured,” said Jon Eggert, a physicist at Lawrence Livermore National Laboratory. “You can't just raise the temperature and have it melt, you have to also go to high pressures, which makes it very difficult to measure the temperature.”

Other groups, notably scientists from Sandia National Laboratories, successfully melted diamond years ago, but they were unable to measure the pressure and temperature at which the diamond melted.

Diamond is an incredibly hard material. That alone makes it difficult to melt. But diamond has another quality that makes it even harder to measure its melting point. Diamond doesn't like to stay diamond when it gets hot. When diamond is heated to extreme temperatures it physically changes, from diamond to graphite.

More here. [Thanks to Sean Carroll.]

Wednesday, January 20, 2010

Moscow’s stray dogs

Suzanne Sternthal in the Financial Times:

Russians can go nutty when it comes to dogs. Consider the incident a few years ago that involved Yulia Romanova, a 22-year-old model. On a winter evening, Romanova was returning with her beloved Staffordshire terrier from a visit to a designer who specialises in kitting out canine Muscovites in the latest fashions. The terrier was sporting a new green camouflage jacket as he walked with his owner through the crowded Mendeleyevskaya metro station. There they encountered Malchik, a black stray who had made the station his home, guarding it against drunks and other dogs. Malchik barked at the pair, defending his territory. But instead of walking away, Romanova reached into her pink rucksack, pulled out a kitchen knife and, in front of rush-hour commuters, stabbed Malchik to death.

Romanova was arrested, tried and underwent a year of psychiatric treatment. Typically for Russia, this horror story was countered by a wellspring of sympathy for Moscow’s strays. A bronze statue of Malchik, paid for by donations, now stands at the entrance of Mendeleyevskaya station. It has become a symbol for the 35,000 stray dogs that roam Russia’s capital – about 84 dogs per square mile. You see them everywhere. They lie around in the courtyards of apartment complexes, wander near markets and kiosks, and sleep inside metro stations and pedestrian passageways. You can hear them barking and howling at night. And the strays on Moscow’s streets do not look anything like the purebreds preferred by status-conscious Muscovites. They look like a breed apart.

More here.