Why Spinoza still matters


Steven Nadler in Aeon:

Spinoza’s philosophy is founded upon a rejection of the God that informs the Abrahamic religions. His God lacks all the psychological and moral characteristics of a transcendent, providential deity. The Deus of Spinoza’s philosophical masterpiece, the Ethics (1677), is not a kind of person. It has no beliefs, hopes, desires or emotions. Nor is Spinoza’s God a good, wise and just lawgiver who will reward those who obey its commands and punish those who go astray. For Spinoza, God is Nature, and all there is is Nature (his phrase is Deus sive Natura, ‘God or Nature’). Whatever is exists in Nature, and happens with a necessity imposed by the laws of Nature. There is nothing beyond Nature and there are no departures from Nature’s order – miracles and the supernatural are an impossibility.

There are no values in Nature. Nothing is intrinsically good or bad, nor does Nature or anything in Nature exist for the sake of some purpose. Whatever is, just is. Early in the Ethics, Spinoza says that ‘all the prejudices I here undertake to expose depend on this one: that men commonly suppose that all natural things act, as men do, on account of an end; indeed, they maintain as certain that God himself directs all things to some certain end; for they say that God has made all things for man, and man that he might worship God’.

Spinoza is often labelled a ‘pantheist’, but ‘atheist’ is a more appropriate term. Spinoza does not divinise Nature. Nature is not the object of worshipful awe or religious reverence. ‘The wise man,’ he says, ‘seeks to understand Nature, not gape at it like a fool’. The only appropriate attitude to take toward God or Nature is a desire to know it through the intellect.

More here.

Holy Wars

James G. Chappel in Boston Review:

The American public sphere is blessed with many religious experts. In the midst of the Syrian refugee crisis, pundits reminded us that Christianity enjoins the welcoming of refugees. Many of the same people, it turns out, are also deeply familiar with Islam, allowing them to piously intone that it is a “religion of peace.” These claims often come from people who are not themselves affiliated with those faiths or any other: they are political interventions masquerading, sometimes insultingly, as exegesis. They serve an important function, however, as a form of wish fulfillment. If these pat, nervous descriptions of long and complex religious traditions were true, the age-old problem of religion in the public square could vanish into a puff of banalities. Peace and refugee assistance are perfectly good secular, progressive goals, and it would be convenient if Christianity and Islam, which long antedate secular progressivism, happened to enjoin the same things. Alas, the world is not so simple. But what, then, are we to do? What should we expect from religion in a secular society?

The conservative position on religiosity has the virtue of coherence: America, from this perspective, is a Christian nation. Even if other religions should be tolerated in the name of Christian charity, they should cede pride of place to America’s exceptional Christian heritage. Progressives have a much more difficult time, and we ricochet between contradictory and unsustainable positions. On the one hand, religion is transparently absurd, but on the other the triumphant atheism of Richard Dawkins is embarrassing, too. When someone such as Kim Davis forces us to confront difficult issues of law and faith, we often have recourse to uncomfortable mockery, unsure why it is wrong to disobey political authority in the name of individual conscience. The old Marxist account of religion as an “opiate of the people” survives, too, in the conventional wisdom that evangelical voters cling to guns and religion because they are distracted from their true economic interests. These attempts to sidestep the question of religion’s role are dangerous but understandable. The great philosopher Richard Rorty once sighed that religion was a conversation-stopper: If someone claims to be acting for religious reasons, what is there to say? If he were alive today, he would know that if we cease talking about religion, we start shouting about it.

More here.

‘The Language Animal’ by Charles Taylor

Jonathan Rée at The Guardian:

Over the past hundred years, philosophical interest in language has become, as Charles Taylor puts it, “close to obsessional”. The obsession goes back to a remark made by Ludwig Wittgenstein in 1915: “The limits of my language mean the limits of my world.” If Wittgenstein was right, then language is not so much a device for recording and communicating information, as the framework of all our knowledge and experience.

But the philosophers who drew inspiration from Wittgenstein’s remark could not agree about what it implied. The positivists among them thought of language as a strict map of impersonal facts, dismissing everything else as rhetoric, emotion or superstition. The humanists, on the other hand, saw it as a creative force that gives wings to our perceptions and opens us to the unknown. For the positivists, you might say, language aspires to the condition of natural science, but for the humanists it is essentially a poem.

Taylor is on the side of the poets, and in his latest book he makes the case with eloquence, force and broad historical sweep. He starts with Étienne de Condillac, the 18th-century proto-positivist who suggested that language came into existence when our ancestors got bored with instinctive grunts and gestures, and decided to share their ideas by means of artificial vocal sounds.

more here.


Tara Cheesman at The Quarterly Conversation:

The Swedish writer Therese Bohman seems to have an affinity for aimless young women vulnerable to the attentions of older men. In two of her novels, Drowned and the newly translated The Other Woman, she channels the psyches of twenty-something university students engaged in liaisons with men already involved with other women.

The books have so much in common that they might be the same novel: both explore almost identical situations, share many of the same structural and plot devices, and the author’s and translator Marlaine Delargy’s prose styles remain the same from book to book. What differences there are prove to be relatively superficial. Drowned and The Other Woman are conveyances for Bohman’s thoughts on feminism, sisterhood, and perhaps even the socio-economic status of women in modern society. Regardless of the ambiguous morality of her female characters’ decisions, Bohman’s treatment of them is inarguably sympathetic. Their affairs with men may be the impetus for coming-of-age journeys, but they do not represent a final destination.

Drowned is a psychological thriller—dark, gothic, and fraught with eroticized violence—and technically the better, more innovative novel. It is the story of two sisters. Stella, the elder, lives in a beautiful “yellow wooden house” with a garden; she has the perfect job at the local parks and gardens department; her boyfriend, Gabriel, is devastatingly attractive and a successful novelist.

more here.

‘The Wood for the Trees’, by Richard Fortey

Melissa Harrison at The Financial Times:

On retirement from the Natural History Museum, where he was senior palaeontologist, Richard Fortey used the proceeds of a television series to purchase a small beech wood in the Chilterns. It’s clearly kept him busy since then, for in The Wood For the Trees he presents not only an account of the wood’s long history but a year-long study of its biodiversity. For this he has called on the expertise of a lifetime’s worth of friends and colleagues, who arrive with pooters, cherry-pickers and high-tech gear to help him understand absolutely everything about it. The wood may only be four acres, but it’s quite an undertaking.

Fortey is an award-winning science writer whose previous books include Trilobite! (2000), The Earth: An Intimate History (2004) and The Hidden Landscape: A Journey into the Geological Past (1993). He’s a regular on TV, too, recently exploring Hawaii, Madagascar and Madeira in stripy braces and Panama hat for Nature’s Wonderlands: Islands of Evolution on BBC4. His style on the page mirrors that on the small screen: deeply knowledgeable, enthusiastic, avuncular and a little bit old-fashioned. Words such as “thrice”, “pace” and even “fain” dot his prose like relict trees among the newer growth — and are just as pleasing.

The Wood for the Trees opens in April as the bluebells are coming out and concludes at the end of March, taking in a year’s cycle in the wood. Fortey’s nature notes form the basis of each chapter, the larger story of the wood — its geological past and human history — told piecemeal as the book unfolds.

more here.

Lord Byron’s Darkest Summer

Nina Martyris in Lapham's Quarterly:

On the evening of April 5, 1815, Mount Tambora, in the Indonesian archipelago, lost its head. So furious was the volcanic eruption that the top third of the 4,300-meter mountain disappeared. More than 10,000 people were incinerated, while an additional 30,000 across the world perished from the crop failures, famine, and disease that resulted from extreme weather triggered by the explosion. Volcanic ash blotted out much of the sun for more than a year, seeding wild rumors that the sun was dying. In Europe and North America, there were snowfalls in June, dry fogs, streaky sunsets, and unseasonal storms. The average global temperature dropped by a whole degree. The climate changed overnight. Unexpectedly, however, this catastrophe spurred two remarkable works of apocalyptic literature in distant Europe. As we mark their bicentenary, these works can be viewed as forerunners to the literature of climate change. The more famous of the two is Mary Wollstonecraft Shelley’s Frankenstein, the mesmerizing and moving story of a hubristic scientist, Victor Frankenstein, who creates a yellow-skinned and watery-eyed monster in his laboratory—and then loses control of it. It has become the classic cautionary tale against what Shelley’s vainglorious scientist upholds as “the unquestioned belief that the products of science and technology are an unqualified blessing for mankind.” The other is the lesser known but equally haunting poem “Darkness,” by the romantic poet George Gordon Byron. It imagines the horrific end days of human life on an earth that has become “a lump of death—a chaos of hard clay.” These two works share a unique kinship: not only were they goaded into being by the gloomy Tambora weather, but they were conceived in the same month, July 1816, and in the same place—on the shores of a storm-lashed Lake Geneva, where Byron and the Shelleys had rented neighboring villas.

The story of how Frankenstein was born has passed into literary legend. The year 1816 was known by the clammy epithet of the Year Without a Summer; thunderstorms and what Mary described as “an almost perpetual rain” kept them indoors. The group of friends—which included Byron’s personal physician Dr. John Polidori and Claire Clairmont, Mary’s eighteen-year-old stepsister, who was madly in love with Byron and pregnant with his child—decided to pass the time by inventing ghost stories. Eighteen-year-old Mary Shelley came up with Frankenstein, whose opening page, shivery with icy winds, manifests a deep longing for a place where “the sun is forever visible.”

More here.

Walt Whitman Promoted a Paleo Diet. Who Knew?

Jennifer Schuessler in The New York Times:

In 1858, when Walt Whitman sat down to write a manifesto on healthy living, he came up with advice that might not seem out of place in an infomercial today. “Let the main part of the diet be meat, to the exclusion of all else,” Whitman wrote, sounding more than a little paleo. As for the feet, he recommended that the comfortable shoes “now specially worn by base-ball players” — sneakers, if you will — be “introduced for general use,” and he offered warnings about the dangers of inactivity that could have been issued from a 19th-century standing desk. “To you, clerk, literary man, sedentary person, man of fortune, idler, the same advice,” he declared. “Up!”

Whitman’s words, part of a nearly 47,000-word journalistic series called “Manly Health and Training,” were lost for more than 150 years, buried in an obscure newspaper that survived only in a handful of libraries. The series was uncovered last summer by a graduate student, who came across a fleeting reference to it in a digitized newspaper database and then tracked down the full text on microfilm. Now, Whitman’s self-help-guide-meets-democratic-manifesto is being published online in its entirety by a scholarly journal, in what some experts are calling the biggest new Whitman discovery in decades. “This is really a complete new work by Whitman,” said David S. Reynolds, the author of “Walt Whitman’s America” and a professor of English at the Graduate Center of the City University of New York, who was not involved with the find.

More here.

Saturday Poem

33 Years Old

Upon my arrival in Stuttgart, sad news:
a strong young man, supermarket worker
killed himself going to buy things at a supermarket.
I’m 33 and I know that he was too
and will be, eternally . . .
33 years old was the Mexican poet who killed himself
on the road from Bari to Brindisi, going to board
a boat bound for Greece.
We all die a little at 33 . . .
While the funeral procession of the wind strikes
the windows, the nights, the days
making us remember our childhood, something tells us,
that one day He will come.
The wind opens the windows with gloves of dead leaves.
The young man died and now I occupy his room
and I’m afraid because I’m the same age he was.
In this room I have two windows:
one looks out on a strange castle full of tourists and the other
on a forest. Beautiful at dawn and fearsome at night.
I am so close to both windows! One on the old world,
the other on the wild.
Both worlds call to me, they strike at my window at every moment
and they will keep going until the end of days.
The young man died headed for the supermarket . . .
At 33 . . .
I open the window to listen to the sound of the forest, the colors
threading into the dark sky. Smell of a kerosene heater
going in the depths of chest. It’s the forest’s heart!
I never lived close to a forest.
And I think I haven’t even ever seen one.
This is a beautiful, strong, tall, friendly forest . . .
That is 33 years old . . .

by Washington Cucurto
from Poetry International
translation: Jordan Lee Schnee

Read more »

Is Polite Philosophical Discussion Possible?

Nomy Arpaly in Pea Soup:

I’ll never forget the old guy who asked me, at an APA interview: “suppose I wanted to slap you, and suppose I wanted to slap you because I thought you were giving us really bad answers, and I mistakenly believed that by slapping you I’ll bring out the best in you. Am I blameworthy?”

When he said “suppose I wanted to slap you”, his butt actually left his chair for a moment and his hand was mimicking a slap in the air.

Since that event – which happened back when I was a frightened youngster with all the social skills of a large rock – I have thought many times about the connection between philosophy and rudeness – especially the connection between philosophical debating and rudeness. It seems to me that the connection between philosophical argument and rudeness is similar to the connection between fighting a war and immorality. Surprisingly precise analogies can be drawn between the soldier in a just war and the philosophical arguer in pursuit of the truth. Let me explain.

It is a big part of moral behavior in ordinary situations not to kill people. Yet the morally healthy inhibition against killing people has to be lost, of necessity, in war – even in a morally justified war.

It is a big part of politeness – not in the sense of using the right fork, but in the sense of civility – in ordinary situations not to tell another person that she is wrong and misguided about something she cares a lot about, or that she cares about being right about. For brevity’s sake, let’s just say it’s a big part of politeness or civility not to correct people. Yet the civilized inhibition against correcting people has to be lost, of necessity, in a philosophical argument.

A soldier who is fighting, even for a just cause, is in a precarious situation, with regard to morality, because he has lost, of necessity, the basic moral inhibition against killing people.

A philosopher who is arguing with another, even in pursuit of truth, is in a precarious situation with regard to politeness, because she has lost, of necessity, the basic civil inhibition against correcting people.

More here.

Who Will Debunk The Debunkers?

Daniel Engber in FiveThirtyEight:

In 2012, network scientist and data theorist Samuel Arbesman published a disturbing thesis: What we think of as established knowledge decays over time. According to his book “The Half-Life of Facts,” certain kinds of propositions that may seem bulletproof today will be forgotten by next Tuesday; one’s reality can end up out of date. Take, for example, the story of Popeye and his spinach.

Popeye loved his leafy greens and used them to obtain his super strength, Arbesman’s book explained, because the cartoon’s creators knew that spinach has a lot of iron. Indeed, the character would be a major evangelist for spinach in the 1930s, and it’s said he helped increase the green’s consumption in the U.S. by one-third. But this “fact” about the iron content of spinach was already on the verge of being obsolete, Arbesman said: In 1937, scientists realized that the original measurement of the iron in 100 grams of spinach — 35 milligrams — was off by a factor of 10. That’s because a German chemist named Erich von Wolff had misplaced a decimal point in his notebook back in 1870, and the goof persisted in the literature for more than half a century.

More here.

On Jenny Diski, 1947–2016

Justin E. H. Smith in n + 1:

Jenny Diski was my friend. We exchanged a flood of ideas during her preparations for her 2013 book, What I Don’t Know About Animals. I am re-reading parts of our exchange now—the writing, by some sort of magic I’ll never really understand, continues to live. The preoccupations we shared, at least at the time, were: animals; humans; the vague boundaries of what constituted cannibalism (she brought up the rumor that Keith Richards had snorted the ashes of his own father, which plainly trumped my example of tuberculosis patients getting prescriptions, well into the 19th century, to drink the blood of executed prisoners); our reclusiveness; and, occasionally, our day-to-day accomplishments, travails, and happinesses.

I do not think the book I helped Jenny to birth is her best, but its focus was also the focus of our friendship, so let me dwell on it. One important thing that I learned from her early in our correspondence (circa 2008), at a time when I was struggling to inhabit convincingly the social role of a philosophy professor, is that it really does not matter in the slightest what philosophy professors think. Why listen to them in particular? I had never considered this question before until I began writing with Jenny, who helped me to realize that it was perhaps the most important new question of my adult life. It had come up regarding J. M. Coetzee’s then-popular The Lives of Animals, and the enthusiasm with which philosophers had taken it up as a blunt pedagogical tool to introduce students to the moral dilemmas of meat eating. Jenny found the book “enragingly self-righteous—and a very lumpy piece of writing.” Then we discussed Cora Diamond on Kafka, and Martha Nussbaum’s advocacy for the cultivation of our moral faculties through literature. “Philosophers drool so easily over novelists,” she wrote. (She made an exception for Stanley Cavell, and what he has to say about animals and humans—Cavell, who does not drool over anyone, who, in Jenny’s words, did not have “a smidgeon of self-righteousness.”)

Much of our correspondence was devoted to discussions of the moral and metaphysical implications of meat-eating.

More here.

Can an Unfinished Piece of Art Also Be Complete?

Barry Schwabsky at The Nation:

The question of finish was crucial to the emergence of modernism. The gauntlet was first thrown down by Manet, whose works in the 1860s were declared by critics to be unfinished—in fact, not even paintings but mere sketches. Similarly, a few years later, Whistler was accused by the Victorian sage himself, John Ruskin, of doing nothing more than “flinging a pot of paint in the public’s face.” But Baudelaire had already anticipated the howls of Manet’s and Whistler’s denigrators when he observed, in 1845, “that in general what is ‘completed’ is not ‘finished’ and that a thing ‘finished’ in detail may well lack the unity of the ‘completed’ thing.” From Manet and Whistler (or, indeed, from their predecessor Corot, who was the object of Baudelaire’s defense) until today, artistic modernism has been inseparable from the critique of finish. And this change in painting and sculpture occurred in tandem with similar developments in the other arts. Consider the difference, for example, between the omniscient narrator of the high Victorian novel and Flaubert’s style indirect libre, which depends on the reader making implicit connections and intuiting unmarked shifts in viewpoint; or, in the 20th century, the rejection by modernist architects of ornament—which had long been considered indispensable to a building’s finish—as something that, as Auguste Perret remarked, “generally conceals a defect in construction.”

For all that, the force of the unfinished was far from a discovery of the 19th century, as Kelly Baum and Andrea Bayer point out in the catalog for “Unfinished: Thoughts Left Visible,” the exhibition they’ve curated with Sheena Wagstaff at the Metropolitan Museum of Art.

more here.

Why Alec Ross Is a Moron

Evgeny Morozov at The Baffler:

Ross’s tenure at the State Department was, by and large, a failure. His efforts to promote “twenty-first-century statecraft”—Clinton’s lofty vision for American power that would put “Internet freedom” and digital technologies at its core—floundered after the State Department was confronted by Cablegate, the release of a massive library of leaked diplomatic cables that began in late 2010 and was coordinated by WikiLeaks. Ross, who claimed the twenty-first-century-statecraft concept as his own and hoped that it would become “a major part of [Clinton’s] legacy,” was suddenly forced into damage control. Few would find his pronouncements on “Internet freedom” credible after the State Department’s reaction to WikiLeaks. An even more unglamorous picture of his activities emerges from Clinton’s email trove. The good news is that Ross did innovate on at least one front—spin. In 2012, Ross wrote to Cheryl D. Mills, Clinton’s chief of staff: “‘Hillary Clinton is the most innovation-friendly American diplomat since Benjamin Franklin.’ Thought you’d enjoy that line. It appears in minute 10 of show I did on CSPAN. I’m going to continue to use it.”

Ross’s brief moment of national fame had more to do with his penchant for self-promotion than innovation. In summer 2010, Ross and Cohen took a delegation of American technology executives from the likes of Cisco and Microsoft to Damascus to meet with Bashar al-Assad—strange are the twists of twenty-first-century statecraft. Never missing an opportunity to show off, the pair tweeted all the fun they were having in Syria. (Cohen: “I’m not kidding when I say I just had the greatest frappuccino ever at Kalamoun University north of Damascus”; Ross: “Creative Diplomacy: @jaredcohen challenged Minister of Telecom to cake-eating contest.”) By Ross’s account, though, the trip pursued the much nobler objective of fomenting regime change via social media. As he wrote in another email to Mills, “When Jared and I went to Syria, it was because we knew that Syrian society was growing increasingly young (population will double in 17 years) and digital and that this was going to create disruptions in society that we could potential [sic] harness for our purposes.”

more here.

Why Prince May Have Been the Greatest Guitarist Since Hendrix

Jack Hamilton at Slate:

By the time Prince emerged into superstardom, the notion of a post-Hendrix black rock guitar god had become more or less unthinkable to rock fans, who were mired in the throes of the “Disco Sucks” movement. (Never mind that the best guitar player on the face of the Earth in the late 1970s was probably Chic’s Nile Rodgers.) Purple Rain, the 1984 film and accompanying album that made Prince a superstar, brought the Minneapolis prodigy’s guitar chops to the forefront, literally: The soundtrack’s lead single, “When Doves Cry,” opens with a distortion-drenched run that’s one of the more breathtaking displays of virtuosity ever heard on the instrument. (In a recent interview with the Washington Post, ZZ Top’s great guitarist Billy Gibbons spoke of the many hours he’d spent over the years trying to pin down that opening lick.) The movie included copious footage of Prince as guitar hero, from the torrential outro of “Let’s Go Crazy” to the soaring, gorgeous solo that closes “Purple Rain” itself.

But in the years since, Prince’s position in the rock pantheon has remained unstable. On Rolling Stone’s list, he ranked 33rd, five spots beneath Johnny Ramone, a guitarist widely beloved for not being very good. Any list like this is stupid, but this is really, really stupid. Prince may have been the greatest guitarist of the post-Hendrix era and often seemed to carry Hendrix’s aura more intrepidly than anyone, most notably in his incredible versatility. Our pop-cultural memory of Hendrix is dominated by gnashing feedback squalls and pyrotechnics both figurative and literal, a misguided belief that his signature moments were the last few minutes of “Wild Thing” at Monterey or quoting “Taps” in the early morning at Woodstock. But Hendrix’s true greatness lay in his ability to do almost anything and everything with the instrument, from the dreamy Curtis Mayfield-isms of “Little Wing” to the psychedelic frenzy of “Purple Haze” to the chicken scratches and pentatonic howls of “Voodoo Child (Slight Return)” to the sumptuous melodicism of “Burning of the Midnight Lamp.”

more here.

Same but Different

Siddhartha Mukherjee in The New Yorker:

On October 6, 1942, my mother was born twice in Delhi. Bulu, her identical twin, came first, placid and beautiful. My mother, Tulu, emerged several minutes later, squirming and squalling. The midwife must have known enough about infants to recognize that the beautiful are often the damned: the quiet twin, on the edge of listlessness, was severely undernourished and had to be swaddled in blankets and revived. The first few days of my aunt’s life were the most tenuous. She could not suckle at the breast, the story runs, and there were no infant bottles to be found in Delhi in the forties, so she was fed through a cotton wick dipped in milk, and then from a cowrie shell shaped like a spoon. When the breast milk began to run dry, at seven months, my mother was quickly weaned so that her sister could have the last remnants.

Tulu and Bulu grew up looking strikingly similar: they had the same freckled skin, almond-shaped face, and high cheekbones, unusual among Bengalis, and a slight downward tilt of the outer edge of the eye, something that Italian painters used to make Madonnas exude a mysterious empathy. They shared an inner language, as so often happens with twins; they had jokes that only the other twin understood. They even smelled the same: when I was four or five and Bulu came to visit us, my mother, in a bait-and-switch trick that amused her endlessly, would send her sister to put me to bed; eventually, searching in the half-light for identity and difference—for the precise map of freckles on her face—I would realize that I had been fooled. But the differences were striking, too. My mother was boisterous. She had a mercurial temper that rose fast and died suddenly, like a gust of wind in a tunnel. Bulu was physically timid yet intellectually more adventurous. Her mind was more agile, her tongue sharper, her wit more lancing. Tulu was gregarious. She made friends easily. She was impervious to insults. Bulu was reserved, quieter, and more brittle. Tulu liked theatre and dancing. Bulu was a poet, a writer, a dreamer. Over the years, the sisters drifted apart. Tulu married my father in 1965 (he had moved to Delhi three years earlier). It was an arranged marriage, but also a risky one. My father was a penniless immigrant in a new city, saddled with a domineering mother and a half-mad brother who lived at home. To my mother’s genteel West Bengali relatives, my father’s family was the embodiment of East Bengali hickdom: when his brothers sat down to lunch, they would pile their rice in a mound and punch a volcanic crater in it for gravy, as if marking the insatiable hunger of their village days. By comparison, Bulu’s marriage, also arranged, seemed a vastly safer prospect. In 1967, she married a young lawyer, the eldest son of a well-established clan in Calcutta, and moved to his family’s sprawling, if somewhat decrepit, mansion.

More here.

How Europe exported the Black Death

Andrew Lawler in Science:

The medieval Silk Road brought a wealth of goods, spices, and new ideas from China and Central Asia to Europe. In 1346, the trade also likely carried the deadly bubonic plague that killed as many as half of all Europeans within 7 years, in what is known as the Black Death. Later outbreaks in Europe were thought to have arrived from the east via a similar route. Now, scientists have evidence that a virulent strain of the Black Death bacterium lurked for centuries in Europe while also working its way back to Asia, with terrifying consequences.

At the Society for American Archaeology meetings earlier this month in Orlando, Florida, researchers reported analyzing the remains of medieval victims in London; Barcelona, Spain; and Bolgar, a city along the Volga River in Russia. They determined that the victims all died of a highly similar strain of Yersinia pestis, the plague bacterium, which mutated in Europe and then traveled eastward in the decade following the Black Death. The findings “are like pearls on a chain” that begins in western Europe, said Johannes Krause at the Max Planck Institute for the Science of Human History in Jena, Germany, an author of a soon-to-be-published study. (The lead author is Maria Spyrou, also at Jena.) That chain may have stretched far beyond Russia. Krause argues that a descendant of the 14th century plague bacterium was the source of most of the world’s major outbreaks, including those that raged across East Asia in the 19th and 20th centuries and one afflicting Madagascar today.

More here.

Friday Poem

The Glassblowers, 6 A.M.

Night draws its plough through the fields.
A fine mist: the breath
of a black horse, dreaming.
Under its eyelid, the moon.

This early no one wakes
but the glassblowers
secretive insects in their hive.
At the end of each sting
a dollop of luminous honey.

Mostly they are just boys,
lean shadows aping the maestros.
When nobody's looking they clown around
swapping greasy sombreros, goosing each other,
then lapse from play so quickly it seems
a mirage
the after-image of that childhood
they've long since left behind.

Muscles steaming with sweat
eyes glazed by smoke
how they dance round the furnace
transforming night's lead into gold!
Even while eating they circle the fire.
The ordinary sun cannot draw them
outside, where the black horse churns the furrow
and girls in flowering blouses
stroll to the dairy.

No magnet beyond this centre
and the girls know it
crowding the doorway for a glimpse
of ruddy flesh,
scattering at the first sight
of those burning, devoted eyes.

by Susan Glickman
The Power to Move
Montreal: Véhicule Press, 1986.