The Ultimate Obit – The periodic table

I’ve always wondered if people who know what the first line in their obituary will be are lucky or cursed. Sure, you already know how (and that) history will remember you. But it’s got to be constricting, too—a feeling of already being defined, as if you can’t contribute anymore. It must be doubly hard for scientists, who often do their best work when young, and have it hanging over them for decades.

Of course, it’s even worse to know what the first line in your obit should be—and then not rate an obit at all, because people forgot you. Such was the fate of Albert Ghiorso (hard “g”), who helped discover more chemical elements, a dozen in all, than any other human being who ever lived. Yet his death earned just three measly mentions in newspapers across the country (and those came weeks after he died). I’d like to do the little I can to rectify that.

I wrote about Ghiorso in a recent book, and beyond the wizardry of his science, I remember most of all his mischief. He specialized in building radiation detectors that could pick out the presence of just a few atoms of new elements. The discovery of a new element was always a celebratory event—the periodic table is the most precious real estate in science—so during one experiment Ghiorso decided to wire his radiation detector to his building’s fire alarms at the University of California at Berkeley, so it would briiiiiing every time an atom appeared. For various reasons his team ran the experiment at night, and they cheered all through the a.m. as the atoms rang out. It was a complete success, except Ghiorso forgot to unwire the fire alarm the next morning. While he was at home sleeping, it went off during the day, forcing a panicked evacuation. The administration was not amused. In discovering a different element, berkelium, element 97, Ghiorso suggested using “Bm” as the chemical symbol for it, because it had been such a “stinker” to discover. To the eternal disappointment of every sophomore chemistry student in the world, the idea was vetoed.

Read more »

Sunday, January 23, 2011

Old Man in Winter

Morgan Meis over at The Smart Set:

It is a time of dreariness and decay. I'm speaking of winter, of course. I always think, when thinking of winter, of the opening lines of Richard III. Richard, the king-to-be, is musing upon the ascension to the throne of his brother, Edward IV. He says, in lines that are burned into the deep pathways of our neural networks, “Now is the winter of our discontent / Made glorious summer by this son of York.”

These opening lines of the play are actually quite hopeful. The first word, “now,” looks forward to the “made” in the next line. Shakespeare, in that clever way of his, makes the language fresh by making you pay attention. The “now” is a placeholder for the thought to come. It sets the scenario, grabs us with its immediacy, and lingers there for a moment while we wait for the thought to develop. The thought develops into the idea that “now” is being “made glorious summer” by this son of York. The winter of our discontent is in the past. “Now” is, in fact, a time of glorious summer, a renewal brought about by the reign of Edward IV, son of York.

But the phrase “now is the winter of our discontent” is so powerful that it often gets picked out of context and made to stand alone. When you do that, it seems as if “now” is the winter of our discontent. The winter of our discontent isn't going anywhere. It is simply the way it is right now.

Sometimes when I hear that line I even hear it as a statement not about “now” but about winter. If you think of it as a winter statement, you can almost replace the word “now” with the word “winter,” i.e., “winter is the winter of our discontent.” I don't take this as a simple tautology, “winter is winter,” but the equation of winter the season with winter the mood. Winter, the season, is a time of general discontent. Winter, in its dreariness and decay, is the season of wanting things to be otherwise.

And yet, some part of us wants winter, some part of us glories in the winteriness of winter.

Mondo Weiss

Over at Tablet, Michelle Goldberg profiles Philip Weiss:

When Philip Weiss, the Jewish anti-Zionist writer and blogger, compares himself to Theodor Herzl, he’s not being ironic. “I actually am like him in certain ways,” he says. “Herzl said, ‘Anti-Semites made me Jewish again.’ I would say that neo-conservatives made me Jewish again.”

To the legion of Jews that Weiss has enraged, this will sound perverse. It’s certainly self-aggrandizing. But it also gets at the way that Weiss has abandoned a deeply assimilated life for a profound—if idiosyncratic and tortured—engagement with Jewish questions. As the founder of Mondoweiss, a blog that has become a nucleus of anti-Zionist writing, and a co-editor of a new book about Richard Goldstone’s report on Israel’s 2008 invasion of Gaza, Weiss says that he now thinks about Jewishness all the time. In his fierce critique of tribal identity, he’s found his tribe—one he believes is growing.

“I think I was alienated from a lot of Jewish communal life in my 20s, 30s, 40s,” Weiss says. “One symptom of that is the fact that I’d never been to Israel until 2006. I was 50 before I got to Israel.” Now that he is 55, Israel has become the center of his life. He goes to rabbinical conventions and corresponds with left-wing Israelis. “I love what I’ve undergone in the last few years,” he says. “And I love my engagement with Jewish communal life now.”

Of course, much of that engagement comes in the form of relentless criticism. Weiss’ blog is fulsomely, intensely anti-Israel—it’s a universe in which even Noam Chomsky, hero of anti-imperialists worldwide, is criticized for his residual attachment to the Jewish state. His obsessive focus on Israel has come at the expense of a successful career as a magazine journalist. Harvard-educated, he got his start writing for the New Republic, later contributed features to New York and the New York Times Magazine, and wrote a column for the New York Observer. He initially launched Mondoweiss as a general-interest blog on the New York Observer website. When he started to focus on Israel, his editor warned him that he was becoming a crank.

He didn’t listen, and in 2007 he left the Observer, taking the blog with him. Today it operates under the umbrella of the nonprofit Nation Institute, which allows Weiss to solicit tax-deductible contributions. But its budget comes entirely from donations, and Weiss has to rely on his wife, the writer and editor Cynthia Kling, to help support him.

It’s a little hard to figure out why Weiss threw so much away for a cause that was so new to him. Naturally, he sees a linear moral logic to his journey. He looks at contemporary Israel and is appalled.

Is 50 really the new 34, or is it a licence to wear elasticated waistbands?

From The Guardian:

“We are welcoming an era in which 50 is the new 34,” argues Emma Soames, Saga magazine's editor-at-large. The increasingly glamorous image of 50-year-olds has even spawned a new term, the “Quintastics” – thanks, in part, to the visibility of a number of high-profile celebrities who met the event with undiminished glamour in the past year, including Bono, Nigella Lawson, Hugh Grant, Jonathan Ross, Colin Firth, Tilda Swinton and Kristin Scott Thomas. But it's not all good news. “By the time we are 50, we are definitely in the suburbs of mortality,” says Alain de Botton. “After 21, birthdays are really wakes and occasions for mourning – unfairly ascribed a degree of jollity which they absolutely don't require. Yes, older people now look a bit better for a while longer, but essentially, it's pretty much a vale of tears.”

Nevertheless there's something newly cool about turning 50. Just ask George Clooney – whose birthday falls in May and who has almost single-handedly ignited a revival of the Cary Grant/Spencer Tracy brand of suave older man – or Barack Obama (50 in August), still the closest thing we've got to a real-life superhero. As Michelle Pfeiffer said when she reached the landmark: “You just take stock and count your blessings.”

More here.

The Grounds of Courage

From The New Republic:

Early in January 1939, the precocious German theologian Dietrich Bonhoeffer, age thirty-two, learned that all males in his age cohort had been ordered to register with the military. A dedicated opponent of the Nazi regime, he might have responded by declaring himself a conscientious objector, but there were two problems with such a course of action. The first was that Bonhoeffer, although pacifist by inclination, was not opposed to violence under all conditions; indeed, he would later play an active role in the conspiracy led by German generals to assassinate Hitler. The second was that his fame in the Confessing Church (more on this below) might encourage other religious leaders critical of the regime to do the same, thereby bringing them under greater suspicion and undermining their efforts to prove that Nazi policies, and especially their rapidly intensifying Jew-hatred, were contrary to the teachings of Jesus Christ.

The solution was provided by America’s most illustrious theologian, Reinhold Niebuhr. Nine years earlier, Bonhoeffer had spent a year in the United States as a free-floating exchange student at Union Theological Seminary, arriving not long after Niebuhr had moved there from Detroit. He had made such a positive impression on Union’s faculty that Niebuhr jumped at the opportunity to bring him back. If we fail to offer him a job, he told Union’s president, Henry Sloane Coffin, Bonhoeffer will wind up in a concentration camp. This was not the stuff of run-of-the-mill letters of recommendation. Union extended the offer. Grateful to have a way out of his dilemma, Bonhoeffer booked passage, and in June 1939 found himself safe in America.

Safe, but unhappy.

More here.

Sunday Poem

The Story We Know

The way to begin is always the same. Hello,
Hello. Your hand, your name. So glad, just fine,
and Good bye at the end. That’s every story we know,

and why pretend? But lunch tomorrow? No?
Yes? An omelette, salad, chilled white wine?
The way to begin is simple, sane, Hello,

and then it’s Sunday, coffee, the Times, a slow
day by the fire, dinner at eight or nine
and Good bye. In the end, this is a story we know

so well we don’t turn the page, or look below
the picture, or follow the words to the next line:
The way to begin is always the same Hello.

But one night, through the latticed window, snow
begins to whiten the air, and the tall white pine.
Good bye is the end of every story we know

that night, and when we close the curtains, oh,
we hold each other against that cold white sign
of the way we all begin and end. Hello,
Good bye is the only story. We know, we know.

by Martha Collins
from
A Catastrophe of Rainbows
Cleveland State University Poetry Center, 1985

H M Naqvi wins first $50,000 DSC Literature Prize for South Asia at the Jaipur Literature Festival

From the Hindustan Times:

DSC Director Manhad Narula, the brain behind the award, said that he hoped it would have an impact on the literary scene in South Asia, as this was the first such prize honouring work on this subject.

“Some literature prizes tend to give more importance to the author rather than to his or her work. But I hope through this award we are able to ensure what matters is what the author is writing about,” he said.

He said that one of the measures that would go a long way in ensuring the credibility of the award is the fact that it honours work on the subject of South Asia, by an author of any nationality.

“It's about time South Asians have our own damn award,” said an elated Naqvi.

Asked if he had to say something to his critics, the 36-year-old author said, “Mercifully in the US, in India as well as in Pakistan, my critics have been few. But all criticisms of Home Boy are valued. It is a debut novel and it has all the strengths and weaknesses of a debut novel.”

The winner was decided by a jury chaired by Nilanjana S Roy, along with Lord Matthew Evans, Ian Jack, Amitava Kumar and Moni Mohsin.

Awarded for the best work of fiction pertaining to the South Asian region, the prize is given to works published in English, including translations into English, with a share of the award money going to the translator.

More here. [Congratulations to Husain from all of us at 3QD, where he was once a columnist.]

The Original Sherlock Holmes: How a French Doctor Helped Create Forensic Science

A 19th-century French medical examiner and criminologist was even more skilled than the fictional detective Sherlock Holmes. A new book recounts his biggest case, which heralded the age of forensic science.

Frank Thadeusz in Der Spiegel:

On a good day, Joseph Vacher could win over a woman with his disarmingly innocent demeanor. In these states of mind, he wrote letters in an ornate, rounded feminine handwriting and amused children by making faces at them.

But then Vacher would go into uncontrolled rages. Once, he beat his small dog to death with a club because it wasn't eating its food.

His crimes against human beings were much worse. In remote forests and barns, Vacher, the son of a farmer, raped and murdered a total of 11 people, most of them children.

In late 19th-century France, this diminutive serial killer epitomized ordinary citizens' fears of the evil that lurks in the darkness. At the time, the guillotine was still used to execute dangerous criminals in France. In the Vacher case, however, the judges were hesitant to impose the death penalty. Was the mass murderer “a cannibal” who had to be beheaded, or was he a “certifiably insane person” who was to be locked up in an asylum?

Douglas Starr, a professor of journalism at Boston University, has now reconstructed the series of murders Vacher committed.

In his book, “The Killer of Little Shepherds,” Starr does not, however, assign the leading role to Vacher, the child murderer, but to the man who was to solve the Vacher mystery: Alexandre Lacassagne, the head of forensic medicine in the southern city of Lyon.

Lacassagne solved murder cases that seemed unsolvable at the time. To this day, students in police academies are taught the methods of the master criminologist from Lyon.

More here.

Sherry Rehman, Pakistan’s defiant prisoner of intolerance, vows to stay put

Declan Walsh in The Observer:

All Sherry Rehman wants is to go out – for a coffee, a stroll, lunch, anything. But that's not possible. Death threats flood her email inbox and mobile phone; armed police squat at the gate of her Karachi mansion; government ministers advise her to flee.

“I get two types of advice about leaving,” says the steely politician. “One from concerned friends, the other from those who want me out so I'll stop making trouble. But I'm going nowhere.” She pauses, then adds quietly: “At least for now.”

It's been almost three weeks since Punjab governor Salmaan Taseer was gunned down outside an Islamabad cafe. As the country plunged into crisis, Rehman became a prisoner in her own home. Having championed the same issue that caused Taseer's death – reform of Pakistan's draconian blasphemy laws – she is, by popular consensus, next on the extremists' list.

Giant rallies against blasphemy reform have swelled the streets of Karachi, where clerics invoke her name. There are allegations that a cleric in a local mosque, barely five minutes' drive away, has branded her an “infidel” deserving of death. In the Punjabi city of Multan last week opponents tried to file blasphemy charges against her – raising the absurd prospect of Rehman, a national politician, facing a death sentence.

More here.

Islamophobia is the moral blind spot of modern Britain

Giles Fraser in The Guardian:

No one actually comes out and directly says “I hate Muslims” – at least, not on the liberal dinner party circuit that was the target of Lady Warsi's speech. Conversations generally begin with the sort of anxieties that many of us might reasonably share: it cannot be right for women to be denied access to education in some Islamic regimes; the use of the death penalty for apostasy is totally unacceptable; what about the treatment of homosexuals? The conversation then moves on to sharia law or jihad or the burqa, not all of it entirely well informed. Someone places their hands across their face and peers out between their fingers. Another guest giggles slightly. Someone inevitably mentions 9/11. Later, guests travel home on the tube and look nervously at the man in the beard sitting opposite.

The problem Warsi identifies is the problem of slippage. What can begin as a perfectly legitimate conversation about, say, religious belief and human rights, can drift into a licence for observations that in any other circumstance would be regarded as tantamount to racism. As with the 19th-century link between anti-Catholicism and racism towards the Irish, one prejudice can easily bleed into the other.

“I treat the Islamic religion with the same respect as the bubble-gum I scrape off my shoe,” suggested one contributor to the website of the Richard Dawkins Foundation for Reason and Science, in response to Warsi's speech. Another offered the following charming observation: “I don't care what the good or bad Baroness has to say about anything at all. I give her no credence nor voice. She is a person of faith so in my book a skinwaste.”

More here. [Photo shows Baroness Sayeeda Hussain Warsi.]

Royal Society – Brain Waves: Neuroscience, Society and Policy

Daniel Lende in Neuroanthropology:

The Royal Society has just put out the first module of its Brain Waves project, which provides a primer on the state of the art in neuroscience and on how neuroscience intersects with society. The ten essays cover a range of topics relevant to neuroanthropology, with an introduction written by Prof. Colin Blakemore.

The first section covers the scope and limits of neuroimaging, neuropsychopharmacology, neural interfaces, consciousness, and reward.

The second section focuses on neuroscience and society, with takes on benefits, risks, neuroethics, and governance.

All the essays, which generally run 8 to 12 pages, are written in clear prose with citations for further exploration, and come almost entirely from prominent British experts. They are freely available as PDFs.

I’ve just started to explore, and based on the quality, I am sure to look at them all. Wolfram Schultz’s essay on Reward, Decision Making, and Neuroeconomics is obviously one that immediately caught my eye. Steven Rose’s Risks raises some of the critical questions relevant to many anthropologists.

More here.

Saturday, January 22, 2011

The Medium Is McLuhan

Nicholas Carr reviews Douglas Coupland's Marshall McLuhan: You Know Nothing of My Work! in TNR:

One of my favorite YouTube videos is a clip from a Canadian television show in 1968 featuring a debate between Norman Mailer and Marshall McLuhan. The two men, both heroes of the ’60s, could hardly be more different. Leaning forward in his chair, Mailer is pugnacious, animated, engaged. McLuhan, abstracted and smiling wanly, seems to be on autopilot. He speaks in canned riddles. “The planet is no longer nature,” he declares, to Mailer’s uncomprehending stare; “it’s now the content of an art work.”

Watching McLuhan, you can’t quite decide whether he was a genius or just had a screw loose. Both impressions, it turns out, are valid. As Douglas Coupland argues in his pithy new biography, McLuhan’s mind was probably situated at the mild end of the autism spectrum. He also suffered from a couple of major cerebral traumas. In 1960, he had a stroke so severe that he was given his last rites. In 1967, just a few months before the Mailer debate, surgeons removed a tumor the size of an apple from the base of his brain. A later procedure revealed that McLuhan had an extra artery pumping blood into his cranium.

Between the stroke and the tumor, McLuhan managed to write a pair of extravagantly original books. The Gutenberg Galaxy, published in 1962, explored the cultural and personal consequences of the invention of the printing press, and argued that Gutenberg’s invention shaped the modern mind. Two years later, Understanding Media extended the analysis to the electronic media of the twentieth century, which, McLuhan famously argued, were destroying the individualist ethic of print culture and turning the world into a tightly networked global village.

McLuhan was a scholar of literature, with a doctorate from Cambridge, and his interpretation of the intellectual and social effects of media was richly allusive and erudite. But what particularly galvanized the public was the weirdness of his prose.

General Anopheles

James Pogue in Guernica:

Malaria kills about a million people every year. It's a guileful disease, not a brutish one like yellow fever or smallpox. Unlike those two illnesses, it doesn’t attack in big sweeps, killing some and leaving survivors with permanent immunity. It can hide out in cysts in the liver for years. It’s likely that much of the adult population of Zouérat was infected and that few if any of them knew or cared. Certainly, few of them knew the statistics that worry Western aid agencies: 90 percent of the people who die every year from the disease are Africans, most of them very young children, most of them undiagnosed.

A 2010 book by the journalist Sonia Shah helps explain how the disease could be so widespread but widely ignored by Africans. The Fever: How Malaria Has Ruled Humankind for 500,000 Years is a history of the disease, but also an attempt to look at the dialectic between Africans—who have lived with the disease for years—and those Westerners who attempt to cure it. In the West, one of the fundamental assumptions of the axis of development economists, philanthropists, and NGOs involved in plotting a happy new future for Africa has been that malaria is a primary cause of misery on the continent, contributing to poverty and the AIDS epidemic. The ubiquitous Columbia economist Jeffrey Sachs has written that malaria in Africa could be controlled with an investment of just three billion dollars a year. In one of the foundational papers of the Western aid movement, he argues that doing so “offers the potential to initiate a virtuous cycle in which improved health spurs economic growth and rising income further benefits human health.” In other words, Africa could be transformed by attacking a single mosquito-borne disease, and for the amount of money it takes to build a mid-market baseball stadium. What are we waiting for?

Shah complicates this picture. Might malaria be a symptom rather than a cause of poverty? Might the billions of dollars spent fighting malaria by Western organizations like the Bill and Melinda Gates Foundation be, at best, another example of misspent Western dollars and misplaced Western hopes, or, at worst, another example of neo-colonial meddling?

frozen hell

In 1940, a 22-year-old Soviet engineer named Fyodor Mochulsky finished his studies and was offered a job by the NKVD, Stalin’s secret police, in the Gulag labour-camp system. He was a candidate member of the Communist Party and typical of the so-called Stalin Generation, born after 1917 and reared on Soviet propaganda. Educated, intelligent and extremely able as an engineer and manager, he was also typical in his belief that, however young he was, he was capable of taking on colossal responsibilities. Whatever his hopes for the future, a young man like this would not turn down such an offer from the Party. After all, it was just after the Great Terror, and Europe was already at war: even if a career in the Gulag was not ideal, the consequences of saying no to the Party could be fatal. Weeks later, Mochulsky and three young friends set out for Pechorlag, one of a vast chain of camps in the Komi region in the Arctic Circle, northeast of St Petersburg. Even for the privileged elite of the NKVD and Communist Party, the journey across the tundra by steamboat, horse, foot and ski was perilous. One can only imagine the hardships faced on such trips by prisoners, many of whom died along the way.

more from Simon Sebag Montefiore at Literary Review here.

leskov and fate

Of the great Russian prose writers of the 19th century, Nikolai Leskov was an outsider. He was not a member of the gentry, he lacked a privileged education, and he wrote about common serfs and the country clergy in their own language. He managed to alienate both the left and right wings of the Russian intelligentsia early in his career, and though his work was popular, critics dismissed it. His work was capable of great darkness and brutal cynicism, but it lacks the angst, romantic and existential, present in so much other prose of the time. (Still, one of his stories was so controversial in its criticisms of the Russian church that it was only published decades later.) And Leskov himself was confused enough as to his own strengths that he said that his brilliant storytelling abilities would be forgotten in favor of his ideas, when, in fact, his legacy lies in the unique qualities of his stories, which are hilarious, unpredictable, surreal, and often baffling. Walter Benjamin and Irving Howe have both paid great tribute to Leskov (Benjamin’s essay characteristically seems to have more to do with Benjamin’s obsessions than with Leskov himself), but neither of them quite characterizes the sheer peculiarity of Leskov’s best work, where the narrative material is subject to perversion along the lines of Euripides, Kleist, Gogol, or Kafka, though with far less malevolence. Leskov’s structural perversities are in service of a particular, peculiar form of morality, one not as doctrinal or particular as Tolstoy’s or Dostoevsky’s, but one that celebrates humility in the face of fate.

more from David Auerbach at The Quarterly Conversation here.

what’s a sentence?

This question of how forms of writing produce forms of thought is one that the literary critic and legal scholar Stanley Fish has been wrestling with for most of his career. He first came to prominence in the late 1970s with his theory of “interpretative communities”. This held that all readings of literary texts are inescapably bound up with the cultural assumptions of readers, an uncontroversial proposition now but one that quickly earned him the sloppy epithet of “relativist”. In the late 1980s and early 1990s he turned the Duke University English department into the headquarters of the then burgeoning “theory” industry before, in 1999, surprising the academic world by moving to the University of Illinois at Chicago, where he set himself the task of trying to renovate undergraduate education in basic skills like writing.

Though he doesn’t mention that experience in his new book, How to Write a Sentence and How to Read One, it’s not far off stage. The problem with Strunk & White, in Fish’s view, is that “they assume a level of knowledge and understanding only some of their readers will have attained,” that is, the Cornell kids whose secondary education did at least a halfway decent job of teaching them the basics. Fish’s aim is to offer a guide to sentence craft and appreciation that is both deeper and more democratic. What, at base, is a sentence? he asks, and then goes on to argue that the standard answer based in parts of speech and rules of grammar teaches students “nothing about how to write”. Instead, we should be examining the “logical relationships” within different sentence forms to see how they organise the world. His argument is that you can learn to write and later become a good writer by understanding and imitating these forms from many different styles.

more from Adam Haslett at the FT here.