Justin Erik Halldór Smith: Me and Politics

Justin E. H. Smith in his blog:

Sometime in the summer of 1987 I walked out to our rural-route mailbox and found my membership card for the Young Socialist Alliance, accompanied by a typewritten letter filled with both practical information and elevated rhetoric about the youth being the future. I had heard that talk before at Catholic Youth Organization meetings, and was annoyed that I was made to join the mere youth auxiliary of the Socialist Workers' Party. But I was 15 and those were the rules, and I was happy enough to now be officially linked to the largest association of Trotskyists in the United States, whose publishing wing, Pathfinder Press, had already taught me so much about the larger world beyond the Sacramento Valley.

By the following year I had obtained another official document with my name on it, from the Department of Motor Vehicles, which enabled me to drive to the national convention of the SWP at Oberlin College in Cleveland. It enabled me, while my mother, for some mysterious reason, permitted me. In what would have been my junior year I had stopped attending high school for some months, out of sheer stubbornness, and didn't seem to have any other concrete plans, so driving off to do something at a university might have been hoped to hold open the possibility of what was known, even then, as a 'positive influence'. A 'positive influence on the youth'.

So I made it through the high desert of Nevada, through the salt flats of Utah, through the locust plagues of Nebraska, through Illinois, Indiana, and, finally, the state in which I would much later reside for two years and where I am still registered to vote: bleak pseudopalindromic Ohio, microcosm of all that is worst of 'these United States', the state Whitman had the most trouble rhapsodising about. But it was all new and fresh to me in 1988 and I was happy to go to some artsy café in the little town next to the campus and meet some dude named Harold who wore the best thrift-shop sweaters and knew more trivia about The Residents and Negativland than I did. This was the larger world too.

More here.

The Plan to Avert Our Post-Antibiotic Apocalypse

A new report estimates that by 2050, drug-resistant infections will kill one person every three seconds, unless the world’s governments take drastic steps now.

Ed Yong in The Atlantic:

The report's language is sober but its numbers are apocalyptic. If antibiotics continue to lose their sting, resistant infections will sap $100 trillion from the world economy between now and 2050, equivalent to $10,000 for every person alive today. Ten million people will die every year, roughly one every three seconds, more than currently die from cancer. These are conservative estimates: They don't account for procedures that are only safe or possible because of antibiotics, like hip and joint replacements, gut surgeries, C-sections, cancer chemotherapy, and organ transplants.

And yet, resistance is not futile. O’Neill’s report includes ten steps to avert the crisis. Notably, only two address the problem of supply—the lack of new antibiotics. “When I first agreed to do this, the advisors presented it to me as a challenge of getting new drugs,” says O’Neill. “But it dawned on me very quickly that there were just as many, if not more, important issues on the demand side.” Indeed, seven of his recommendations focus on reducing the wanton and wasteful use of our existing arsenal. It’s inevitable that microbes will evolve resistance, but we can delay that process by using drugs more sparingly.
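
As a quick sanity check on the headline numbers above, here is a minimal back-of-the-envelope sketch; the 2050 population figure is my assumption, not the report's.

```python
# Back-of-the-envelope check of the report's headline figures.
SECONDS_PER_YEAR = 365.25 * 24 * 3600      # about 31.6 million seconds

deaths_per_year = 10_000_000               # projected annual deaths by 2050
print(f"One death every {SECONDS_PER_YEAR / deaths_per_year:.1f} seconds")  # ~3.2 s

total_cost = 100e12                        # $100 trillion, cumulative to 2050
population_2050 = 9.7e9                    # assumed mid-range UN projection, not from the report
print(f"Roughly ${total_cost / population_2050:,.0f} per person")           # ~$10,300
```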

More here.

Unsung Champion of Literature Blanche Knopf Finally Gets Her Due

Tom Blunt in Signature:

There are two versions of the Blanche Knopf story. The first is one of triumph, documenting the calculated risks taken by the publishing maven to carve out paths for otherwise-neglected authors who would ultimately shape 20th-century culture and change the book business forever. America's Harlem Renaissance, hard-boiled detective genre, and fascination with Europe's sexual freedom can all be traced back to Mrs. Alfred A. Knopf's business gambits, which in most cases sprang directly from her personal interests, or those of her close friends.

The second version is a tale of what might have been. How differently would Mrs. Knopf's life and career have turned out if her husband had truly made her an equal partner in their business, as he promised when they were young newlyweds? To what greater heights might the company have flown if Mr. Knopf hadn't vetoed some of her more risqué choices? Might Blanche have eventually summoned enough independence to go her own way if the couple's gradual estrangement hadn't nudged her toward a diet pill habit that slowly destroyed her health and eyesight? And perhaps most regrettably: how many more women might have felt called to work in the publishing world if Alfred hadn't relentlessly downplayed Blanche's involvement at every turn, only begrudgingly admitting his wife's contributions long after her death in 1966?

These questions arise several decades too late to make any difference to Mrs. Knopf, and if it weren't for Laura Claridge's new biography The Lady With the Borzoi, they might never have been posed at all.

More here.

A Blues for Albert Murray

Thomas Chatterton Williams at The Nation:

The name Albert Murray was never household familiar. Yet he was one of the truly original minds of 20th-century American letters. Murray, who died in 2013 at the age of 97, was an accomplished novelist, a kind of modern-day oral philosopher, a founder of Jazz at Lincoln Center, and the writer of a sprawling, idiosyncratic, and consistently astonishing body of literary criticism, first-rate music exposition, and cunning autobiography. In our current moment of identity politics and multicultural balkanization, the publication of any new Murray text would serve as a powerful reminder that his complex analysis of art and life remains as timely as ever—probably more so.

More here.

The latest vogue in success literature showcases an ugly elitism

Thomas Frank at Bookforum:

It’s 2016, and another management guru is revealing the secrets of the creative mind.

It’s not really a very original thing to do. The literature on encouraging corporate nonconformity is already enormous; it goes back many years, to at least 1960, when someone wrote a book called How to Be a More Creative Executive. What was once called “the creative revolution” in advertising got going at around the same time. I myself wrote a book about that subject—a history book!—nearly twenty years ago.

There have been slight variations in the creativity genre over the half-century of its ascendancy, of course. The cast of geniuses on whom it obsessively focuses has changed, for example. And while the study of creativity has always been surrounded with a quasi-scientific aura, today that science is more micro than macro, urging us to enhance our originality by studying the functioning of the human brain.

In the larger literary sense, however, it is now clear that the capitalist’s tribute to creativity and rebellion is an indestructible form. There is something about the merging of bossery and nonconformity that beguiles the American mind. The genre marches irresistibly from triumph to triumph. Books pondering the way creative minds work dominate business-best-seller lists. Airport newsstands seem to have been converted wholly to the propagation of the faith. Travel writers and speechwriters alike have seen the light and now busy themselves revealing the brain’s secrets to aspiring professionals.

More here.

How Archival Fiction Upends Our View of History

Lucy Ives at The New Yorker:

Realist historical fictions, with the rustling demands of their costumes and their period-appropriate speech, often depend on painstakingly described physical veracity, sensory believability, to steep a reader in the past. While not necessarily factual, such works say: This really occurred, and now you, too, may experience it. As the literary historian Stephen Greenblatt enthused in a review of “Wolf Hall,” Hilary Mantel’s novel about the rise of Thomas Cromwell—perhaps the paradigmatic contemporary example of such fiction—great historical novels “provide a powerful hallucination of presence, the vivid sensation of lived life.”

But a handful of recent works of fiction have taken up history on radically different terms. Rather than presenting a single, definitive story—an ostensibly objective chronicle of events—these books offer a past of competing perspectives, of multiple voices. They are not so much historical as archival: instead of giving us the imagined experience of an event, they offer the ambiguous traces that such events leave behind. These fictions do not focus on fact but on fact’s record, the media by which we have any historical knowledge at all. In so doing, such books call the reader’s attention to both the problems and the pleasures of history’s linguistic remains.

The book that made this distinction clear to me is a new novel by Danielle Dutton, called “Margaret the First.” Dutton’s Margaret is Margaret Cavendish, Duchess of Newcastle-upon-Tyne, who lived from 1623 to 1673 and was one of the first British women to publish in print under her own name.

More here.

The disease of theory: “Crime & Punishment” at 150

Gary Saul Morson in The New Criterion:

One hundred and fifty years ago, when Dostoevsky published Crime and Punishment, Russia was seething with reform, idealism, and hatred. Four years earlier, the "tsar-liberator" Alexander II (reigned 1855–1881) had at last abolished serfdom, a form of bondage making 90 percent of the population saleable property. New charters granted considerable autonomy to the universities as press censorship was relaxed. The court system, which even a famous Slavophile said made his hair stand on end and his skin frost over, was remodeled along Western lines. More was to come, including the beginnings of economic modernization. According to conventional wisdom, Russian history alternates between absolute stasis—"Russia should be frozen so it doesn't rot," one reactionary writer urged—and revolutionary change. Between Peter the Great (died 1725) and the revolutions of 1917, nothing compared with the reign of Alexander II. And yet it was the tsar-liberator, not his rigid predecessor or successor, who was assassinated by revolutionary terrorists. The decade after he ascended the throne witnessed the birth of the "intelligentsia," a word we get from Russian, where it meant not well-educated people but a group sharing a set of radical beliefs, including atheism, materialism, revolutionism, and some form of socialism. Intelligents (members of the intelligentsia) were expected to identify not as members of a profession or social class but with each other. They expressed disdain for everyday virtues and placed their faith entirely in one or another theory. Lenin, Trotsky, and Stalin were typical intelligents, and the terrorists who killed the tsar were their predecessors.

The intelligentsia prided itself on ideas discrediting all traditional morality. Utilitarianism suggested that people do, and should do, nothing but maximize pleasure. Darwin’s Origin of Species, which took Russia by storm, seemed to reduce people to biological specimens. In 1862 the Russian neurologist Ivan Sechenov published his Reflexes of the Brain, which argued that all so-called free choice is merely “reflex movements in the strict sense of the word.” And it was common to quote the physiologist Jacob Moleschott’s remark that the mind secretes thought the way the liver secretes bile. These ideas all seemed to converge on revolutionary violence.

More here.

Did Neanderthals Die Out Because of the Paleo Diet?

Abigail Tucker in Smithsonian:

Humans tend to dismiss Neanderthals as dimwits, yet the brains of our doomed cousins were actually larger than our own. "If you go to a site from 150,000 years ago," says Miki Ben-Dor, a Tel Aviv University archaeologist, "you won't be able to tell whether Neanderthals or Homo sapiens lived there, because they had all the same tools." Which helps explain why, to fathom how our fates diverged, he recently scrutinized Neanderthals' bodies instead of their skulls. While humans have barrel-shaped chests and narrow pelvises, Neanderthals had bell-shaped torsos with wide pelvises. The prevailing explanation has been that Neanderthals, often living in colder and drier environments than their human contemporaries, needed more energy and therefore more oxygen, so their torsos swelled to hold a bigger respiratory system. But Ben-Dor had a gut feeling this was wrong. What if the difference was what they ate? Living in Eurasia 300,000 to 30,000 years ago, Neanderthals settled in places like the Polar Urals and southern Siberia—not bountiful in the best of times, and certainly not during ice ages. In the heart of a tundra winter, with no fruits and veggies to be found, animal meat—made of fat and protein—was likely the only energy source.

Alas, though fat is easier to digest, it’s scarce in cold conditions, as prey animals themselves burn up their fat stores and grow lean. So Neanderthals must have eaten a great deal of protein, which is tough to metabolize and puts heavy demands on the liver and kidneys to remove toxic byproducts. In fact, we humans have a “protein ceiling” of between 35 and 50 percent of our diet; eating too much more can be dangerous. Ben-Dor thinks that Neanderthals’ bodies found a way to utilize more protein, developing enlarged livers and kidneys, and chests and pelvises that widened over the millennia to accommodate these beefed-up organs.
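
To make the "protein ceiling" concrete, here is a minimal sketch; the 35-50 percent ceiling comes from the article, but the 4,000-kcal cold-climate energy budget is an illustrative assumption of mine.

```python
# Illustration of the "protein ceiling" described above.
# The 35-50% ceiling is from the article; the energy budget is an assumption.
KCAL_PER_GRAM_PROTEIN = 4                  # standard Atwater factor

daily_energy_kcal = 4000                   # assumed cold-climate energy budget
for ceiling in (0.35, 0.50):
    protein_kcal = daily_energy_kcal * ceiling
    grams = protein_kcal / KCAL_PER_GRAM_PROTEIN
    print(f"{ceiling:.0%} ceiling: {protein_kcal:.0f} kcal, about {grams:.0f} g protein/day")
```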

More here.

Thursday, May 19, 2016

Could We Just Lose the Adverb (Already)?

Christian Lorentzen in Vulture:

I'm cursed with a mind that looks at a sentence and sees grammar before it sees meaning. It might be that I'm doing math by other means, that I overdid it with diagramming sentences as a boy, or that my grasp of English was warped by learning Latin. Translating Horace felt like solving math problems. Reading Emily Dickinson began to feel like solving math problems. You might think this is a cold way of reading, but it's the opposite. You develop feelings. Pronoun, verb, noun — I like sentences that proceed in that way, in a forward march. Or those tricked out with a preposition, another noun, and a couple of adjectives. Conjunctions and articles leave me unfazed. If these combinations result in elaborate syntactical tangles, it thrills me. It's cheap words I hate, and I hate adverbs.

I’m unembarrassed to admit that my taste in literary style owes a lot to my adolescent reading of The Sun Also Rises — Hemingway was no friend of adverbs. He’s not alone. “Use as few adverbs as possible” is among V. S. Naipaul’s rules for beginning writers. When William Strunk and E. B. White admonish us to omit unnecessary words, I know they’re talking about adverbs without their having to say it.

More here.

“It’s Your Generation of Experimenters That Makes Me Look Good!” – An Interview with Kip Thorne

Karan Jani in The Wire:

What was your first reaction when you saw the gravitational-wave event on September 14, 2015 and the whole process which followed until the historic announcement?

I think it was just one of deep satisfaction: a dream that Rai Weiss, Ron Drever and Joseph Weber and Vladimir Braginsky and Stan Whitcomb and others had developed and shared so many decades ago was finally reaching fruition.

In fact, nature turned out to be giving us just what I had expected – I'd expected since the early 1980s that the first thing we would see would be merging black holes, because the distance you can see goes up roughly proportionally with the mass of the binary, and so the volume you survey goes up as its cube, and that factor would overwhelm the lower absolute event rate for black-hole binaries compared to neutron-star binaries. It seemed very likely to me, so that's just what I thought would happen. It's a big part of how I hoped to sell this project.

To have that come out right was pleasing, to have the strength of the waves be 10⁻²¹ – that's a number we started targeting in 1978. So it all came to pass the way we expected it to, thanks to enormous work by your generation of experimenters. You were the ones who really pulled it off. The way I like to say it is that it's your generation of experimenters that makes me look good!
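
Thorne's expectation rests on a simple scaling argument: if detectable distance grows roughly linearly with binary mass, the surveyed volume grows as its cube. A minimal sketch of that arithmetic follows; the masses and intrinsic rates are illustrative assumptions, not numbers from the interview.

```python
# Sketch of the volume argument: detections ∝ intrinsic rate × surveyed volume,
# and volume ∝ (detectable distance)³ with distance ∝ total binary mass,
# per the rule of thumb stated above. All inputs are illustrative assumptions.

def relative_detections(total_mass: float, intrinsic_rate: float) -> float:
    distance = total_mass            # distance ∝ mass (arbitrary units)
    return intrinsic_rate * distance**3

ns = relative_detections(total_mass=2.8, intrinsic_rate=1.0)    # neutron-star binary
bh = relative_detections(total_mass=65.0, intrinsic_rate=0.01)  # black-hole binary, 100x rarer
print(f"Black-hole detections outnumber neutron-star ones by ~{bh / ns:.0f}x")  # ~125x
```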

More here.

On Ruth Goodman’s ‘How to Be a Tudor’

Ed Simon at The Millions:

There is something uncanny about staying in another person's house — the stark differences and the small convergences of sameness. We all like to snoop a bit. Now, public historian Ruth Goodman gives us the chance to snoop on the lives of people who died 500 years ago. When you're watching The Tudors or Wolf Hall, Goodman is the woman behind the scenes ensuring that the clothes look right, the home interiors are accurate, and the sumptuous feasts are as true to life as possible. In How to Be a Tudor: A Dawn-to-Dusk Guide to Tudor Life, she makes her almost preternatural knowledge about life during the 16th century available to the reading public.

You wouldn't expect the intricacies of Tudor baking, brewing, ploughing, cooking, needlework, painting, dancing, and card-playing to hold an audience rapt, and yet Goodman makes the minutiae of everyday life half a millennium ago tremendously interesting. Indeed, her voluminous knowledge makes Goodman seem not so much a specialist on period authenticity as an actual time traveler. Ingeniously structuring the book around the hourly rhythms of daily life (with chapters going from "At Cock's Crow" to "And so to bed"), Goodman transmits information about food, work, medicine, education, leisure, lodging, sleep, and even sexuality. How to Be a Tudor, with its grounding in physical detail and avoidance of theoretical analysis, is true to the guidebook genre, but one featuring recipes for veal meatballs (exceedingly expensive at the time) and Galenic medical advice.

More here.

Life in ruins

Mary Beard at the Times Literary Supplement:

Inside the monastery of S. Trinità dei Monti, which stands at the top of the Spanish Steps in Rome, is a room decorated in glorious trompe l'oeil as a ruin. Created in 1766 by Charles-Louis Clérisseau, and originally intended to be the cell of the monastery's resident mathematician Fr Thomas Le Sueur, it imitates a decaying classical temple, with tumbled columns, a roof open to the sky, encroaching vegetation and a large parrot perched on one of the apparently surviving crossbeams. The irony of the design worked on several levels. It allowed the famous scholar to enjoy the pleasure of ruins without the discomfort. But it was also a wry comment on the life cycle of buildings. Ruins are one stage on their inevitable journey to destruction. As we know from some of the most ambitious modern attempts at conservation on archaeological sites all over the world, from Pompeii to Machu Picchu, collapse can be delayed – but not prevented. Here Clérisseau offered dilapidation frozen in time, a ruin built to last.

That life cycle of buildings, from conception to death, with an occasional lucky, or unlucky, resurrection, is the theme of James Crawford’s Fallen Glory – twenty chapters telling the biography of twenty structures, from across the world, ancient and modern, real and imaginary (the first chapter is on the Tower of Babel, the last on the virtual world of the web hosting service GeoCities). Some of these life stories work better than others. The Roman Forum, the subject of Chapter Six, needs so much background that we tend to lose sight of the main character as it rises out of the marshes, becomes the monumental centre of the empire, and slips back into pasture, only to be revived again in the service of Mussolini’s grandiose ambitions.

More here.

Two Old Jewish Socialists: Henry Roth Meets Bernie Sanders

Nathaniel Popkin at Tablet Magazine:

Hiat was born in Kletsk, a town south of Minsk, in Belarus. As a child, he began to doubt the possibility of God. “I’ve seen children die, small children, and the doubt of a merciful God really drove me” away from religious belief, he said to Roth during the first interview session, describing the crucible of his political consciousness and suggesting the rigor of his autodidactic mind. But at the same time, at the cheder in Kletsk, Hiat was introduced to the Jewish teaching that opened him intellectually to a “revolutionary instinctive upbringing.” “Socialism,” he said, “is part of philosophical Judaism.” There is, he explained to Roth, who never received, or pursued, a full Jewish education, “a certain Hebrew word, ein kemach, ein Torah: If you have no bread, you have no Torah.”

Bernie Sanders, who perhaps embodies this connection as thoroughly as any American public figure in history, rarely draws that line. In a speech last year to the students of the Evangelical Christian Liberty University, he quoted the Book of Matthew, not Torah or Talmud, in citing a religious influence in his political ideology. (Hillary Clinton, for her part, draws a connection between the Christianity she experienced growing up and her instinct to volunteer in poor neighborhoods of Chicago.) Sanders sometimes directs the question of how his Jewish self-identity inspired his political beliefs to the specter of the Holocaust, from which his father escaped but many of his relatives in Poland did not; more often, he simply identifies his parents as “Polish.”

More here.

Chinese Is Not a Backward Language

Tom Mullaney in Foreign Policy:

Even in the age of China's social media boom, with billion-dollar valuations for Beijing-based IT start-ups, prejudice against the Chinese language is alive and well. One would be forgiven for thinking that by 2016, the 20th century's widespread critiques of racism, colonialism, and Social Darwinism would have sounded the death knell of 19th-century Orientalism, which viewed China and the Chinese language through a condescending, colonialist lens. At the least, one might hope that if notions of Chinese Otherness were still with us, those who carry on the tradition of these threadbare ideas would generally be seen as archaically Eurocentric and gauche — the dross of airport bookshop paperbacks, unworthy of serious engagement. If only. Nineteenth-century understandings of China persist, not only surviving the decline of Social Darwinism and race science, but flourishing in this new century, driven primarily by arguments about China's unfitness for modern technology and media.

Call it Orientalism 2.0.

More here.

Homo Sapiens 2.0? We need a species-wide conversation about the future of human genetic enhancement

Jamie Metzl in KurzweilAI:

After 4 billion years of evolution by one set of rules, our species is about to begin evolving by another. Overlapping and mutually reinforcing revolutions in genetics, information technology, artificial intelligence, big data analytics, and other fields are providing the tools that will make it possible to genetically alter our future offspring should we choose to do so. For some very good reasons, we will. Nearly everybody wants to have cancers cured and terrible diseases eliminated. Most of us want to live longer, healthier and more robust lives. Genetic technologies will make that possible. But the very tools we will use to achieve these goals will also open the door to the selection for and ultimately manipulation of non-disease-related genetic traits — and with them a new set of evolutionary possibilities. As the genetic revolution plays out, it will raise fundamental questions about what it means to be human, unleash deep divisions within and between groups, and could even lead to destabilizing international conflict.

And the revolution has already begun. Today’s genetic moment is not the stuff of science fiction. It’s not Jules Verne’s fanciful 1865 prediction of a moon landing a century before it occurred. It’s more equivalent to President Kennedy’s 1962 announcement that America would send men to the moon within a decade. All of the science was in place when Kennedy gave his Houston speech. The realization was inevitable; only the timing was at issue. Neil Armstrong climbed down the Apollo 11 ladder seven years later. We have all the tools we need to alter the genetic makeup of our species. The science is here. The realization is inevitable. Timing is the only variable.

More here.

Thursday Poem

Daphne in Mourning

Palm fronds have woven out the sky.
Fog has infiltrated every vein.
My hair has interlaced with vines.
Cobwebs lash their gauze across my eyes.

I’ve stood so since the world began,
and turned almost to stone some years ago.
Who passes by perceives a lichened post,
my girlish figures, ghostly, nearly gone.

My bark is warmer than the dead’s.
Human blood still lulls the underside of leaves.
My fingers hold the very dress I loved
to dance in, when dancing mattered – and it did.

by Melissa Green
from Daphne in the Morning
Pen & Anvil Press, 2010

Wednesday, May 18, 2016

His Last Decade


Stuart Elden in Berfrois:

Foucault’s Last Decade is a study of Foucault’s work between 1974 and his death in 1984. In 1974, Foucault began writing the first volume of his History of Sexuality, developing work he had already begun to present in his Collège de France lecture courses. In that first volume, published in late 1976, Foucault promised five further volumes, and indicated some other studies he intended to write. But none of those books actually appeared, and Foucault’s work went in very different directions. At the very end of his life, two further volumes of the History of Sexuality were published, and a fourth was close to completion. In contrast to the originally planned thematic treatment, the final version was a much more historical study, returning to antiquity and early Christianity. In this book, I trace these developments, and try to explain why the transition happened.

Foucault's Last Decade has its roots as far back as the late 1990s. I had just finished a PhD thesis on Nietzsche, Heidegger and Foucault. Right at the end of that process Foucault's courses from the Collège de France began to be published – the first in 1997, the second in 1999. I already knew how much Heidegger scholarship had been changed by the insights of his lecture courses and thought that the same would be true for Foucault. (Of course, with Heidegger, much more and much worse was to come with his notebooks.) I wrote a review essay on the second published Foucault course – The Abnormals – for the journal boundary 2, on the invitation of Paul Bové, and then Paul invited me to the University of Pittsburgh when I spoke about 'Society Must Be Defended', a text which was also published in boundary 2. I thought then that if I wrote something about each course as they came out, then in time there might be the raw materials for a book.

And so, on and off, in and around other projects, I read, spoke and sometimes wrote about most of Foucault's courses as they appeared. Some of these were published here at Berfrois. Foucault taught at the Collège de France from late 1970 until his death in 1984. There were thirteen courses in total, but they were published in non-chronological order – the earliest courses presented the greatest editorial difficulties, and so were among the last to appear. The last of the Collège de France ones was published in 2015. Some courses from elsewhere and other material have also been published in the intervening years, and we now have far more material published since Foucault's death than appeared in his lifetime. This, despite his wish for 'no posthumous publications' – a request that was once followed scrupulously, then generously interpreted and is now largely ignored.

More here.