the futurity man


“Self-made” simply isn’t a strong enough term for H. G. Wells, as Michael Sherborne’s authoritative new Life makes very clear. His father was an unsuccessful shopkeeper in Bromley, Kent; his mother a lady’s maid who had to return to service as the family got gradually poorer. Lack of money meant that Bertie’s formal education was delayed until a few months before his eighth birthday and ended soon after his thirteenth: the next year he was expected to teach other children, some bigger than him, at a National School in Wookey. Lessons consisted of “whatever occurred” to the teenager, punctuated by hand-to-hand combat, as Wells recalled in his autobiography: “I fought my class, hit them about viciously and had altogether a lot of trouble with them”. Wells was put to several apprenticeships and seemed fated to replicate his father’s life in trade, either as a draper or a chemist. But he was not prepared to leave his change of fortune to luck or accident. There was a “game against life” which he was determined to win, and as his lowly hero Mr Polly comes to realize in the novel, “If the world does not please you, you can change it”.

more from Claire Harman at the TLS here.



Wednesday, August 4, 2010

Mullican, Mullican

Jessica Slaven in Paper Monument:

Matt Mullican bears the consummate art-insider’s pedigree: his father, Lee Mullican, is a well-respected painter; he attended the California Institute of the Arts in the 70s and studied with John Baldessari; he participated in Documentas IX and X and was shown in the 2008 Whitney Biennial; he has excellent gallery representation and an impressive critical bibliography; he even has an artwork commissioned in black granite in the 50th Street subway station in New York City. In the summer of 2005, he was invited to be a Visiting Artist at the Skowhegan School of Painting and Sculpture in Skowhegan, Maine, where he lectured on his life’s work—from his earliest student experiments, through his seminal sculptures and installations of the 80s, up to new drawings and video-taped performances from “Learning from That Person’s Work,” an exhibition he’d just mounted at the Ludwig Museum in Cologne.

I was there. His lecture turned out to be the surprise controversy of the summer at Skowhegan, where our cohort was rumored to have been the most professional and well behaved in recent memory. Before you cock an eyebrow, remember that 2005 was perhaps the historical apex of the American MFA system, which excelled at producing refined and competent young artists ready to meet and greet the contemporary art world. Back then, it paid to be affable and business-like; it was a time to be on your insider-y best behavior.

Charming and gracious in person, Matt Mullican has been performing and lecturing for nearly thirty years, and by now it’s often difficult to distinguish the two activities.

The “Life” of Theodor W. Adorno: One Last Genius

J.M. Woolsey reviews Detlev Claussen's Theodor W. Adorno: One Last Genius, in Politics and Culture:

In his writings dedicated to collective memory, Maurice Halbwachs argued that family memory operates as a “physiognomy” in which our remembrance of family members and relations consists of a condensation or “summation of an entire period—the idea of a type of life” (60). At the level of both content and method, Halbwachs’s observations about the condensed and physiognomic nature of family memory as “the idea of a type of life” speak directly to Detlev Claussen’s Theodor W. Adorno: One Last Genius. Anyone familiar with Adorno will surely be aware of his infamous claim in Minima Moralia that “Wrong life cannot be lived rightly” (39). This statement—usually interpreted as “survivor’s guilt,” or a declaration about the impossibility of escaping total enmeshment in the exploitative exchange relations of capitalism—has become, inter alia, the equivalent of a physiognomic sound bite, metonymically branding Adorno. It is perhaps Claussen’s most valuable biographical insight that this negation of an idea of a type of life should be understood in relation to Adorno’s own memory of family life and his membership in the extended family of his intellectual friendships.

With the publication of One Last Genius, Claussen, a former student of Adorno who is now a professor of social theory and culture at the Leibniz University of Hanover, has made an important and valuable contribution to the recent literature dedicated to exploring the relation between the dialectician’s life experiences and his unique articulation of critical theory. By combining a close, if unbalanced, reading of some of Adorno’s central texts with an astute attention to letters and other private and public testimony written by his intellectual contemporaries, Claussen has produced a work similar in scope (but not depth) to Rolf Wiggershaus’s magisterial account of the Institute of Social Research: The Frankfurt School: Its History, Theories and Political Significance (1995). However, anyone who reads One Last Genius looking for a definitive statement about Adorno or “negative dialectics” will be sorely disappointed. In fact, throughout much of the text, Adorno is a shadowy figure in the background of a story focused on his intimate circle of friends; just another face in the crowd. Furthermore, Claussen’s stylistic approach is often so repetitious and circular that it is bound to frustrate even the most interested and sympathetic reader. This problem can partially be attributed to the fact that each chapter in the book is designed to stand on its own and thus inevitably covers the same material as others, often in the same context and to make the same point.

An Excerpt from ‘Ernest Gellner: An Intellectual Biography’

From John Hall's new book, in the WSJ:

When Ernest Gellner died in December 1995, the flags of the University of Cambridge, where he had taught from 1984 to 1992, were set at half mast. This reflected the status he had achieved in the last years of his life, as a public intellectual able to comment on a very wide range of issues. It did not mean, however, that his views had lost their bite. If Gellner's name had been made during the scandal surrounding his early attack on Oxford linguistic philosophy, his late essays – not least his attack on Isaiah Berlin as a 'Savile Row postmodernist' – were capable of causing just as much outrage. Still, many felt affection for Gellner, with whose voice they had become familiar, and to whom they often turned for guidance and insight. All the same, very few people knew what to make of him. He was hard to pin down. For two decades he had the curious title of Professor of Sociology with special reference to philosophy at the London School of Economics and Political Science (LSE) – held, it should be noted, in two different departments: first Sociology, then Philosophy, Logic and Scientific Method – before taking up the William Wyse Professorship of Social Anthropology at the University of Cambridge. He had separate reputations as scholar of Islam, theorist of nationalism, philosopher of history, and historian of ideas. He ended his career in Prague, the city in which he had grown up as a boy, though in his final years he was most interested in developments in Russia. His status as public intellectual rested on this background, that of a multilingual polymath, a modern philosophe. He was sometimes cited as one of the last great thinkers from Central Europe whose Jewish background meant a direct experience of the twentieth century's horrors.

It is possible to hint at what follows by noting the very particular way in which Gellner fits into this last category. The contours of his formative experiences are clear, and were pungently expressed by Gellner himself when discussing the work of Hannah Arendt. The rise of nationalist sentiment at the end of the nineteenth century created a dilemma for Jews, especially those who had experienced the Enlightenment and an end to anti-Jewish discrimination by the state. Gellner insisted that the return to cultural roots was always an illusion, a piece of pure romanticism he neatly illustrated by noting sardonically that 'it was the great ladies at the Budapest Opera who really went to town in peasant dresses, or dresses claimed to be such'. Illusion or no, the Jews felt the pull of belonging just as much as others did – perhaps even more. But the romantic call to belong affected the minority Jewish community and the demographic majority in two very different ways.

Emotions help animals to make choices

From PhysOrg:

To understand how animals experience the world and how they should be treated, people need to better understand their emotional lives. A new review of animal emotion suggests that, as in humans, emotions may tell animals about how dangerous or opportunity-laden their world is, and guide the choices that they make.

An animal living in a world where it is regularly threatened by predators will develop a negative emotion or 'mood', such as anxiety, whereas one in an environment with plenty of opportunities to acquire resources for survival will be in a more positive mood. The researchers argue that these emotional states not only reflect the animal's experiences, they also help it decide how to make choices, especially in ambiguous situations, which could have good or bad outcomes. An animal in a negative state will benefit from adopting a safety-first, 'pessimistic' response to an ambiguous event – for example, interpreting a rustle in the grass as signalling a predator – while an animal in a positive state will benefit from a more 'optimistic' response, interpreting it as signalling prey.

More here.

Jilted salamanders lash out

From Nature:

Amphibian Casanovas beware: the ladies aren't likely to take infidelity lying down. Male salamanders returning home after a night of disloyalty can expect a beating, a new study reveals. The finding has surprised behavioural researchers: most instances of infidelity punishment place females on the receiving end of the abuse. Female red-backed salamanders are no different, but they don't just take it — they dish it out too, finds behavioural ecologist Ethan Prosen of the University of Louisiana at Lafayette, who led the research.

“This is the only species I know of where the male is intimidating the female and the female returns the favour,” says Prosen. No one is sure how common infidelity is among salamanders, but male red-backed salamanders are known to be aggressive toward female partners that have visited other males. Since males and females of this species are evenly matched in size, Prosen wondered why the females were putting up with this violent treatment.

More here.

Wednesday Poem

Snowbanks North of the House

Those great sweeps of snow that stop suddenly six feet from the house …
Thoughts that go so far.
The boy gets out of high school and reads no more books;
the son stops calling home.
The mother puts down her rolling pin and makes no more bread.
And the wife looks at her husband one night at a party, and loves him no more.
The energy leaves the wine, and the minister falls leaving the church.
It will not come closer
the one inside moves back, and the hands touch nothing, and are safe.

The father grieves for his son, and will not leave the room where the coffin stands.
He turns away from his wife, and she sleeps alone.

And the sea lifts and falls all night, the moon goes on through the unattached heavens alone.

The toe of the shoe pivots
in the dust …
And the man in the black coat turns, and goes back down the hill.
No one knows why he came, or why he turned away, and did not climb the hill.

by Robert Bly
from The Man in the Black Coat Turns

derivatives


In “Babylon Revisited,” F. Scott Fitzgerald’s 1931 short story about the aftermath of the 1929 Wall Street crash, the author makes the point that such collapses are slips in morality as much as financial failures. Charlie Wales, the story’s emotionally fragile hero, returns to Paris in a desperate effort to regain custody of his nine-year-old daughter. “I heard that you lost a lot in the crash,” says the Ritz bartender. Alluding to his moral lapses, Charlie replies that yes, he did, “but I lost everything I wanted in the boom.” In fact, upper-middle-class people like Charlie hesitated during the first months of the market’s run-up—until early in 1928. That was when they joined the gambling frenzy, and that was when, as John Kenneth Galbraith wrote in The Great Crash, 1929, the “mass escape into make-believe, so much a part of the true speculative orgy, started in earnest.” Eight decades later, stock-market investors like Charlie had no role in bringing on or profiting from the 2008 financial crisis. This time they stood on the sidelines as major financial institutions engaged in a speculative orgy. Guided by no moral compass, the most sophisticated financial players in the world were betting big with one another about interest rates, commodity prices, and whether companies or governments would default.

more from William J. Quirk at the American Scholar here.

the new great game


In the 19th century the British Empire and Tsarist Russia competed for hegemony in Central Asia. London fought to slow Moscow’s expansionism, fearing that, piece by piece, Russia would reach the borders of India, Britain’s most prized possession. Russia, for its part, worked to restrict British influence in a region it perceived both as its own backyard and as a buffer zone. The comings and goings of spies and wheeler-dealers, ruthless traders and officers, the ups and downs of plots and intrigue, double-crossing and diplomatic discourtesies reported between the Caspian and Kabul during the 19th century were catalogued under a single heading: the Great Game. The coinage, it is said, belongs to Arthur Conolly, an English secret agent serving with the East India Company; the novelist Rudyard Kipling took up the phrase and brought it to the attention of the public. Times change, as do situations; empires die and imperial democracies are born. But Central Asia, the vast portion of the world bordered on the west by the Caspian, on the east by China, on the north by Russia and on the south by Pakistan, Afghanistan and Iran, continues to be the theatre of significant manoeuvring. It is no coincidence that geopolitical analysts call it the New Great Game.

more from Matteo Tacconi at Reset here.

the listeners


“You do not interest me. No man can say these words to another without committing a cruelty and offending against justice,” writes philosopher Simone Weil. To turn a deaf ear is an offence not only to the ignored person but also to thinking, justice and ethics. Coleridge’s Ancient Mariner is cursed because no one will listen to his story. The Italian chemist-turned-writer Primo Levi was preoccupied with this fable because of his fear that on returning from Auschwitz people like him would be either ignored or simply disbelieved. Regardless, listening gets a very mixed press amongst critics and intellectuals. There is a suspicion of “wistful optimism” or the quasi-religious appeal to “hold hands” and play priest at the confessional. These qualms miss the centrality of listening to a radical humanism which recognises that dialogue is not merely about consensus or agreement but engagement and criticism. This is something that Primo Levi understood.

more from Les Back at Eurozine here.

james franco is interesting


Not so long ago, James Franco’s life and career were fairly normal. He grew up in Palo Alto, California, where his parents had met as Stanford students. Young James was, at his father’s urging, a math whiz—he even got an internship at Lockheed Martin. As a teenager, he rebelled, got in trouble with the law (drinking, shoplifting, graffiti), and eventually migrated toward the arts. His hero was Faulkner. He fell in love with acting when he played the lead in a couple of dark and heavy high-school plays. After freshman year, he dropped out of UCLA, very much against his parents’ wishes, to try to make a career of it. He was good, lucky, and driven, and within a couple of years, he got his first big break: Judd Apatow cast him in what would become the cult TV series Freaks and Geeks. When the series was canceled after just a season, Franco landed the lead in the TNT biopic James Dean. He played the part with a slumping intensity that seemed like a reasonable replication of the real thing—or at least much closer than anyone had a right to expect from a TNT biopic—and the performance won a Golden Globe. Soon after, he was cast as Robert De Niro’s drug-addicted son in the film City by the Sea. That same year, he entered mainstream consciousness as Peter Parker’s best friend in Spider-Man. Franco had become, in other words, a working Hollywood actor. An unusual actor—he overprepared for minor roles, read Dostoyevsky and Proust between takes, and occasionally drove colleagues crazy with his intensity—but still identifiably an actor, with an actor’s career. As he climbed toward leading-man status, however, Franco had a crisis of faith. He found himself cast in a string of mediocre films—Annapolis, Flyboys, Tristan + Isolde—most of which bombed. He felt like he was funneling all his effort into glossy, big-budget entertainment over which he had no control, and of which he wasn’t proud.

more from Sam Anderson at New York Magazine here.

Tuesday, August 3, 2010

The Honor of Exile

Norman Manea in Project Syndicate:

The Romanian sculptor Brancusi once said that when the artist is no longer a child, he is dead. I still don't know how much of an artist I have become, but I grasp what Brancusi was saying. I can grasp – even at my age – my childish enduring self. Writing is a childish profession, even when it becomes excessively serious, as children often are.

My long road of immaturity began more than half a century ago. It was July 1945, a few months after I returned from a concentration camp called Transnistria. I lived that paradisiacal summer in a little Moldavian town, overwhelmed by the miraculous banality of a normal, secure environment. The particular afternoon was perfect, sunny and quiet, the room's semi-obscurity hospitable. I was alone in the universe, listening to a voice that was and wasn't mine. My partner was a book of Romanian fairy tales with a hard green cover, which I had been given a few days before when I turned the solemn age of 9.

That is when the wonder of words, the magic of literature started for me. Illness and therapy began at the same time. Soon, too soon, I wanted myself to be part of that family of word wizards, those secret relatives of mine. It was a way of searching for “something else'' beyond the triviality of everyday life, and also of searching for my real self among the many individuals who inhabited me.

What Darwin Got Wrong

John Dupre reviews Fodor and Piattelli-Palmarini's What Darwin Got Wrong, in The Philosopher's Magazine:

Neo-Darwinism is, very roughly, the claim that natural selection is by far the most important explanation of biological form, the particular characteristics of particular kinds of organism. It usually includes a commitment to gradualism (the idea that evolution occurs in small steps), and often involves attributing central importance to genes as the units that natural selection selects, or at any rate as the objective measure of evolutionary change. Versions have been prominently defended in recent years by such authors as Richard Dawkins, Daniel Dennett and Jerry Coyne.

Neo-Darwinism is, however, a perspective under ever-growing pressure, not (or not only) from the antiscientific assaults of the religious, but from the advancement of science. The decline of this intellectual monolith is generally to be welcomed, not least because it may be expected to bring down with it some of its less appetising academic fellow travellers, most notably Evolutionary Psychology. At the same time those contributing to the demise of neo-Darwinism must be aware of the risk, especially in the United States, that they will provide succour for fundamentalist Creationists and aficionados of so-called Intelligent Design.

Fodor and Piattelli-Palmarini’s (henceforth FPP) book is intended as a contribution to the critical task just mentioned, and they are well aware of the potential hazards. Sadly, however, the book is an almost tragic failure: it is unlikely to be taken seriously as a contribution to the dismantling of neo-Darwinism and it has been, and will continue to be, picked up by the fundamentalist enemies of science.

The first half of the book does a decent job of summarising the recent scientific insights responsible for the growing difficulties facing neo-Darwinism. Neo-Darwinism, by virtue of its emphasis on natural selection, sees evolution as driven from outside, by the environment. Central among the difficulties that FPP emphasise are crucial respects in which evolution is constrained, or even driven, by internal features of the organism. This realisation has been promoted by evolutionary developmental biology (“evo-devo”), which has also highlighted the unacceptable black-boxing of development in mainstream evolutionary theory, a concomitant of the exclusive focus on external determinants of change. Also crucial has been a gradual move away from excessively atomistic views of organisms and an appreciation of the necessity of treating them as integrated wholes, illustrated by the impossibility of analysing the genome into a unique set of discrete elements, “genes”. And equally important has been the disclosure of the complexity of the relations between genomes and phenotypes.

While much material is presented that does indeed reveal the dire straits in which neo-Darwinism finds itself, the overall argument is generally elusive. I speculate that this is because there are two quite different conclusions in the offing.

Hard to Find

Samuel Arbesman in The Boston Globe:

If you look back on history, you get the sense that scientific discoveries used to be easy. Galileo rolled objects down slopes. Robert Hooke played with a spring to learn about elasticity; Isaac Newton poked around his own eye with a darning needle to understand color perception. It took creativity and knowledge to ask the right questions, but the experiments themselves could be almost trivial.

Today, if you want to make a discovery in physics, it helps to be part of a 10,000-member team that runs a multibillion-dollar atom smasher. It takes ever more money, more effort, and more people to find out new things.

But until recently, no one actually tried to measure the increasing difficulty of discovery. It certainly seems to be getting harder, but how much harder? How fast does it change?

This type of research, studying the science of science, is in fact a field of science itself, and is known as scientometrics. Scientometrics may sound self-absorbed, a kind of inside baseball for scientists, but it matters: We spend billions of dollars annually on research, and count on science to do such things as cure cancer and master space travel, so it’s good to know what really works.

From its early days of charting the number of yearly articles published in physics, scientometrics has broadened to yield all sorts of insights about how we generate knowledge. A study of the age at which scientists receive grants from the National Institutes of Health found that over the past decades, older scientists have become far more likely to receive grants than younger ones, suggesting that perhaps younger scientists are being given fewer chances to be innovative. In another study, researchers at Northwestern University found that high-impact research results are more likely to come from collaborative teams — often spanning multiple universities — rather than from a single scientist. In other words, the days of the lone hero scientist are vanishing, and you can measure it.

Birth of a Salesman

Amitava Kumar in Guernica:

So, sadly, the dreamers and the haters are not two groups. They are often one and the same persons. —Arjun Appadurai, Fear of Small Numbers

The U.S. government’s exhibit 1002 in United States of America v. Hemant Lakhani was a document from the commissary in the Corrections Bureau in Passaic County, New Jersey. It indicated that on March 16, 2005, in the “early afternoon hours the defendant went to the commissary and notwithstanding his medical condition ordered four bags of hot buffalo chips.” That same afternoon, the defendant also purchased one bag of crunchy cheese chips. Assistant U.S. Attorney Stuart Rabner flipped through the rest of the pages of exhibit 1002. On March 21, Rabner told the U.S. Court of Appeals for the Third Circuit, the defendant had received five bags of hot buffalo chips, five bags of salty peanuts, and five bags of crunchy cheese chips. On March 28, he received one cheese pizza, and again, five bags each of hot buffalo chips, salted peanuts, and crunchy cheese chips—and five apple pies. Turning to another page, Rabner said that on April 8 the defendant had ordered five bags of hot buffalo chips, five bags of salted peanuts, and two bags of crunchy cheese chips. And then on April 11, the food items ordered were five bags of hot buffalo chips, five bags of salted peanuts, three apple pies, two honey buns, and a cheese pizza.

“The defendant’s conduct,” the prosecutor argued, “can indeed be determined to be a contributing factor to the swollen legs that he now complains about and on which basis he seeks an adjournment of this trial. He should not be allowed.”

Hemant Lakhani’s diet was under scrutiny because he had undergone three surgeries in three weeks. The trial had begun in early January, but only ten days later the defendant had needed to be hospitalized. On the morning of January 14, a deputy marshal informed the court that the defendant had been admitted the previous evening at the St. Barnabas Medical Center in New Jersey with a variety of problems: a hernia, a congenital heart condition, and renal failure. Speaking on record four days later, Lakhani’s doctor reminded the court that his patient was nearly seventy. He was probably suffering from hypertension. And it was possible that his heart needed surgical treatment. Later that week, Lakhani underwent an angioplasty and a pacemaker was inserted into his body. He was having problems with one of his knees and a rheumatologist had been pressed into service. The court couldn’t meet for three weeks because the defendant had needed time to recuperate.

Henry Klingeman, the defendant’s lawyer, stated that his client had described the jail food as “inedible,” and had complained that he wasn’t given rice, which had “been a staple of his diet for his entire life.” The commissary food was used as a “supplement” and, because he had a “sweet tooth,” he used to order apple pies.

The judge in the case, Katharine Hayden, took a considered view of the medical opinion she had been provided about the defendant. She declared that Lakhani was “ready to go” and commented with some concern that the diet the defendant had chosen was “loaded with salt” and “loaded with sugar.” She noted that Lakhani had more than once refused nutritious meals consisting of salad, bread, beans, apples, cookies, and hard-boiled eggs. With good reason, the judge denied the appeal to adjourn.

Judicial trials by their very nature are about acts. They concern themselves with what has actually been done by an individual or group. But the Lakhani trial from the very beginning had seemed to be about who he was rather than what he had done.

It Wasn’t a War

Kate Perkins interviews Norman Finkelstein in Guernica:

The career of radical political scholar Norman Finkelstein might be described as a sort of heroic painting-into-a-corner. The son of Holocaust survivors, his life’s work has been dedicated to exposing the hypocrisy, ideology, and violence that sustains the Israeli occupation of Palestine. The dimensions of his emphatic anti-Zionism, expounded over the course of six meticulously researched and often polemical books on Israel, Palestine, and the legacy of the Holocaust, have made him a pariah in the mainstream and a hero amongst supporters of Palestinian liberation. The high controversy around Finkelstein’s politics has penetrated university walls on more than one occasion, making his academic career fraught with defensive, uphill battles. I first met Finkelstein in 2007, in the eye of a storm of controversy surrounding his academic status at DePaul University. Despite his prolific and highly influential body of critical scholarship—and after first having been approved for tenure at DePaul by both department and faculty committees—Finkelstein’s tenure had ultimately been denied—minority dissenters had campaigned successfully against his appointment. Flanked by a supporting cast of speakers including Tariq Ali, Tony Judt, and Noam Chomsky (via satellite), Finkelstein stood before some one thousand six hundred people in the University of Chicago’s packed Rockefeller Chapel to make the case for academic freedom. Contrary to his reputedly prickly demeanor, he appeared extraordinarily collected and calm, his heavy brow furrowing only slightly over sharp, dark eyes as he prepared to publicly address the charges against him. (The university’s final word on the matter was that Dr. Finkelstein’s reputation for outspoken criticism of Israel and of Israeli apologists like Harvard Law Professor Alan Dershowitz made Finkelstein unfit for tenure at DePaul, a school of “Vincentian values.”)

It was the culmination of a long struggle to advance his radical political critique of Israel and of the American Israeli lobby from within the academy. Now an independent scholar, Dr. Finkelstein remains a leading voice of dissent against the pro-Israel policies that underwrite an apartheid regime enforced by egregious war crimes and human rights violations. In This Time We Went Too Far: Truth and Consequences of the Gaza Invasion, his first book since departing from DePaul, he argues that Israel’s December 2008 invasion of Gaza, which decisively ended a fragile ceasefire brokered by Egypt that June, marked the beginning of an unprecedented decline in public support for Israel. The book’s epilogue is devoted to the Goldstone Report, a document authored by renowned South African jurist Richard Goldstone that describes the damning conclusions of a U.N.-commissioned investigation into the Gaza invasion, including charges of war crimes against Israel.

More here.

A System for Connecting Brains to the Outside World

From The New York Times:

About four years ago, John Donoghue’s son, Jacob, then 18, took his father aside and declared, “Dad, I now understand what you do — you’re ‘The Matrix’!” Dr. Donoghue, 61, is a professor of engineering and neuroscience at Brown University, studying how human brain signals could combine with modern electronics to help paralyzed people gain greater control over their environments. He’s designed a machine, the BrainGate, that uses thought to move objects. We spoke for two hours in his Brown University offices in Providence, R.I., and then again by telephone. An edited version of the two conversations follows:

Q. WHAT EXACTLY IS BRAINGATE?

A. It’s a way for people who’ve been paralyzed by strokes, spinal cord injuries or A.L.S. to connect their brains to the outside world. The system uses a tiny sensor that’s been implanted into the part of a person’s brain that generates movement commands. This sensor picks up brain signals and transmits them to a plug attached to the person’s scalp. The signals then go to a computer, which is programmed to translate them into simple actions.

Q. WHY MOVE THE SIGNALS OUT OF THE BODY?

A. Because for many paralyzed people, there’s been a break between their brain and the rest of their nervous system. Their brains may be fully functional, but their thoughts don’t go anywhere. What BrainGate does is bypass the broken connection. Free of the body, the signal is directed to machines that will turn thoughts into action.

More here.

to be tête-à-tête with things


Why photograph inanimate objects, which neither move nor change? Set aside for the moment explorations of abstract form (Paul Strand’s flower pots, Edward Weston’s peppers) and glamorous advertisements for material luxuries (Edward Steichen’s cigarette lighters, Irving Penn’s melted brie). Many of the earliest photographs were still lifes of necessity: only statues, books, and urns could hold still long enough to leave their images on salted paper. But with the still lifes of Roger Fenton, sharpness of detail and richness of texture introduce a new note: the dusty skin of a grape puckers around the stem, a flower petal curls and darkens at the edge. Photographic still life, like painted still life, is about our sensual experience of everyday objects, and the inevitability of decay. Penn famously photographed cigarette butts and trash collected from the gutter, rotting fruit and vegetables, discarded clothes, and other examples of dead nature. The nineteenth-century art critic Théophile Thoré objected to the French term for still life, nature morte, proclaiming, “Everything is alive and moves, everything breathes in and exhales, everything is in a constant state of metamorphosis… There is no dead nature!” The Czech photographer Josef Sudek tersely echoed this thought when he said that to the photographer’s eye, “a seemingly dead object comes to life through light or by its surroundings.”

more from Imogen Sara Smith at Threepenny Review here.

righteous and wrong


Obsessed as they are with their model of a “totalitarian threat” to Enlightenment liberalism, both Berman and Hirsi Ali fail to take account of well-documented facts that would challenge their presuppositions. Berman muddles kin-patronage politics, a constant in Arab societies, with fascism. Hirsi Ali—oblivious of changes in gender roles that are occurring within more developed Muslim polities, and ignoring the way that traditional systems of authority tend to oppress women in cultures as different as China, Japan, and India—confuses Islam (a malleable religious tradition) with patriarchy (a specific set of social relationships built around masculine power). As Julien Benda himself might acknowledge, a failure to look at all the facts, however complex they may be, is a kind of intellectual betrayal, a trahison des clercs.

more from Malise Ruthven at the NYRB here.