Anthropology and Racial Politics

Serena Golden interviews Lee Baker, author of the new book Anthropology and the Racial Politics of Culture, in Inside Higher Ed:

Q: You describe a “dramatic shift” in the first half of the 20th century, when the federal government “promulgat[ed]… policies to first destroy and then protect American Indian culture.” This swift change “mirrored shifts in American popular culture, aesthetics, and attitudes toward traditional or authentic Native American cultures.” Can you give an overview of how and why such a dramatic about-face occurred?

A: In 1883, the Bureau of Indian Affairs (BIA) developed a policy called the Religious Crimes Code, which authorized agents to use force or imprisonment to repress and stop American Indian religious practices that they deemed subversive, immoral, or an impediment to the goal of “civilizing” the Indian. This was followed by the much more comprehensive Dawes Act of 1887, which divided up tribal lands into small, individually owned parcels. This allotment mechanism created a putative surplus of land that was sold to developers, railroads, and ranchers. The idea was to force rapid civilization based on individualism or speed the process of assimilation by destroying communal ways of life, but the amount of land provided to individual families was not large enough to be sustainable. The act and its various amendments were in place for almost half a century, and American Indian families lost an estimated 90 million acres of treaty land. These two policies reinforced other punitive policies, practices, and violence tethered to an explicit “vanishing policy” — a policy designed to make American Indian culture disappear.

In 1890, Sitting Bull was shot dead, and the army quickly stopped one of the last so-called uprisings with the massacre at Wounded Knee. In 1893, Frederick Jackson Turner delivered his influential paper on “The Significance of the Frontier in American History,” detailing how American culture was tied to frontier expansion, but the frontier was now closed. By the turn of the 20th century, the western frontier was finished and the threat of American Indians diminished.

Americans began to focus on preservation and conservation of land, wildlife, and water, which fueled movements to establish public spaces and establish more national parks. As boarding schools, the Dawes Act, and the BIA articulated macabre vanishing policies, early anthropologists like James Mooney and Frank H. Cushing began to practice salvage ethnography. They attempted to preserve and conserve Indian culture by writing and describing the practices that they viewed as quickly disappearing. Tourism to the Southwest, a growing appreciation of Indian art, living ethnological fair exhibits, and wild-west shows all promoted a pacified yet exotic and distinctively American way of life. At the same time, summer camps and organizations like the Camp Fire Girls were promoted, and teenagers around the country began dressing up to play Indian. American Indian culture slowly became America’s exotic but safe “other.”

[H/t: Linta Varghese]

A World Without Planes

Alain de Botton in the BBC:

In a future world without aeroplanes, children would gather at the feet of old men, and hear extraordinary tales of a mythic time when vast and complicated machines the size of several houses used to take to the skies and fly high over the Himalayas and the Tasman Sea.

The wise elders would explain that inside the aircraft, passengers, who had only paid the price of a few books for the privilege, would impatiently and ungratefully shut their window blinds to the views, would sit in silence next to strangers while watching films about love and friendship – and would complain that the food in miniature plastic beakers before them was not quite as tasty as the sort they could prepare in their own kitchens.

The elders would add that the skies, now undisturbed except by the meandering progress of bees and sparrows, had once thundered to the sound of airborne leviathans, that entire swathes of Britain's cities had been disturbed by their progress.

And that in an ancient London suburb once known as Fulham, it had been rare for the sensitive to be able to sleep much past six in the morning, due to the unremitting progress of inbound aluminium tubes from Canada and the eastern seaboard of the United States.

At Heathrow, now turned into a museum, one would be able to walk unhurriedly across the two main runways and even give in to the temptation to sit cross-legged on their centrelines, a gesture with some of the same sublime thrill as touching a disconnected high-voltage electricity cable, running one's fingers along the teeth of an anaesthetised shark or having a wash in a fallen dictator's marble bathroom.

[H/t: Anil Kalhan]

Sunday Poem

The Woman Who Collects Noah's Arks

Has them in every room of her house,
wall hangings, statues, paintings, quilts and blankets,
ark lampshades, mobiles, Christmas tree ornaments,
t-shirts, sweaters, necklaces, books,
comics, a creamer, a sugar bowl, candles, napkins,
tea-towels and a tea-tray, nightgown, pillow, lamps.
        Animals two-by-two in plaster and wood,
fabric, oil paint, copper, glass, plastic, paper,
tinfoil, leather, mother-of-pearl, styrofoam,
clay, steel, rubber, wax, soap.
        Why I cannot ask, though I would like
to know, the answer has to be simply
because. Because at night when she lies
with her husband in bed, the house rocks out
into the bay, the one that cuts in here to the flatlands
at the center of Texas. Because the whole wood structure
drifts off, out under the stars, beyond the last
lights, the two of them pitching and rolling
as it all heads seaward. Because they hear
trumpets and bellows from the farther rooms.
Because the sky blackens, but morning finds them always
safe on the raindrenched land,
bird on the windowsill.

by Janet McCann
from PoemMemoirStory, Grove Press

Professor Antony Flew: philosopher (1923-2010)

From The London Times:

Antony Flew was one of the best-known atheists of his generation, but he finally repudiated the label. As an academic philosopher he subjected the question of God’s existence to careful, non-polemical analysis. When he declared himself a theist in his old age he annoyed many of his admirers — which might have been the intention.

Antony Garrard Newton Flew was born in London in 1923 to a Methodist family. His father was president of the Methodist Conference for its one-year term and was active in other organisations including the World Council of Churches.

He was educated at St Faith’s School, Cambridge and then Kingswood School in Bath. At 15 he was struck by the incompatibility of divine omnipotence and the existence of evil, and lost his faith. He later identified this as the first step towards his career as a philosopher. His study of that subject was delayed by the war; he studied Japanese and served as an intelligence officer in the RAF. After the war he went to St John’s College, Oxford to read Greats, of which classical philosophy is a part. His interest in the philosophy of religion led him to C. S. Lewis’s Socratic Club. He was impressed by the Christian apologist, calling him “an eminently reasonable man”. He was attracted, but not persuaded, by Lewis’s moral argument for God’s existence. He studied other traditional proofs for God’s existence, and developed his philosophical skills in opposition to them.

More here.

The Colour of Paradise

From The Telegraph:

For Muslims, green is the emblematic colour of Islam; traditionally, only descendants of the Prophet Mohammed were allowed to wear green turbans and green robes. So it is not surprising that when Muslim potentates amassed hoards of jewels, they prized emeralds above all. Some had verses from the Koran carved into the faces of large emeralds, which were sewn into their ceremonial robes as talismans and amulets. The hunger of some rulers for these vivid green gemstones was almost insatiable. The first East India Company merchant to visit the Mughal court at Agra (in 1610) noted that the Emperor Jahangir’s emeralds weighed a total of 412 pounds – whereas his collection of diamonds weighed little more than a quarter of that, even though India was then the world’s leading diamond producer.

Where had those emeralds come from? The Mughals and Persian Shahs had a three-fold classification: the very best were said to be from Egypt, the next category came from 'old mines’ in Asia and the lowest quality came from 'new mines’ in the Americas. But this was a fiction. Just 10 years ago, a team of mineralogists analysed the oxygen isotopes in a number of famous Mughal emeralds, and found that almost all of them were from the Americas. To be more precise, they were from the highlands of Colombia; this analysis was in fact able to identify the specific outcrops from which they had been extracted.

More here.

I’ve hurled myself at America in a sort of fury

Glimpsing the human side of major historical figures is endlessly fascinating. As Melville noted, Shakespeare in his own day wasn’t Shakespeare. He was Master William Shakespeare, the harried writer — mocked as an “upstart crow” by a critic — who churned out plays for the proprietors of the Globe Theater. Humanizing Alexis de Tocqueville poses special challenges. His magnum opus, “Democracy in America,” has gained prophetic stature since its publication in two volumes (1835 and 1840). Its grand pronouncements about America roll before us in chapter after sweeping chapter, each ringing with authority. Tocqueville covered many topics — government, commerce, law, literature, religion, newspapers, customs — in elegant prose that captured the essence of democracy. His insights, while sometimes debatable, are often eerily prescient. In “Tocqueville’s Discovery of America,” Leo Damrosch, the Ernest Bernbaum professor of literature at Harvard, reveals the man behind the sage. Damrosch shows us that “Democracy in America” was the outcome of a nine-month tour of the United States that Tocqueville, a temperamental, randy 25-year-old French apprentice magistrate of aristocratic background, took in 1831-32 with his friend Gustave de Beaumont.

more from David S. Reynolds at the NYT here.

A Reign Not of This World: On Juan Carlos Onetti

Jonathan Blitzer in The Nation:

“I am animated by the idea that you can stop reading me when you wish,” explains Juan María Brausen in A Brief Life, a 1950 novel by Juan Carlos Onetti. Brausen's remark appears in a letter to a friend: Brausen has recently left town, and he doesn't want his friend to follow him. Reading Onetti's fiction, you can't help feeling superfluous yourself, encouraged to slink away, to give up the pursuit. It's nothing personal. As the protagonist of Onetti's novella The Pit (1939) says of his writing, more with indifference than piquancy, “I don't know whether it's interesting but that doesn't bother me.” Onetti's characters need to be alone, and whether they are writers or not, they take especial pains to harvest their solitude. In this they resemble their author, who was never quite reclusive but rather willfully self-contained. His fiction appears, maybe more than for most writers, to have been a necessary, perhaps even hermetic, personal instrument; writing only for his characters, as he once professed, he could contain and give shape to the self so that he might, momentarily, forget that he existed. “My oeuvre,” Onetti wrote to Octavio Paz in the frosty and uncharacteristically public exchange that ensued after Paz was awarded the prestigious Premio Cervantes in 1981, “is nothing more than a combination of fictional works in which the only thing that mattered to me was my own self, confronting and maybe conjoined with the perspectives of many characters that life has forced on me or that I have perhaps imagined.” Onetti immerses himself in reality just long enough to fashion an escape. This is his peculiar gift.

Words appear in odd and unlikely combinations with Onetti, always courting possibilities while reducing certainty. His fictions and correspondence attest to his insurmountable remoteness. In interviews he was much the same, speaking slowly, punctuating remarks with long pauses, taking interminable drags on a cigarette in midsentence, trailing off in a bemused monotone. As the Spanish novelist Antonio Muñoz Molina once said, recalling a 1977 televised interview with the writer in Madrid, where Onetti lived for nineteen years in political, then self-imposed, exile: “I had never heard anyone speak about literature with such a lack of emphasis.”

A Shakespeare Scholar Takes on a ‘Taboo’ Subject

Jennifer Howard in The Chronicle of Higher Education:

The most startling thing about Contested Will, James Shapiro's new book about the Shakespeare authorship debate, is not what it concludes about who really wrote Hamlet and King Lear. Shapiro, a professor of English at Columbia University, is an unrepentant Stratfordian, a firm believer that William Shakespeare of Stratford-upon-Avon created the plays and poems associated with his name.

What will surprise fellow Stratfordians—as well as doubters who want to dethrone Shakespeare and install Christopher Marlowe, Edward de Vere (the Earl of Oxford), Francis Bacon, or another contender in his place—is Shapiro's argument that the different camps have more in common than they admit. As Shapiro sees it, Stratfordians, Marlovians, Oxfordians, Baconians, and the rest share an anachronistic insistence on what he calls “reading the life out of the works.” In other words, they try to find autobiographical details in the plays and poetry that will confirm the true identity of the author.

Among mainstream Shakespeare scholars, Contested Will may be disconcerting for another reason. The book, just out from Simon & Schuster, argues that the authorship question is the one subject that they have deliberately neglected.

“More than one fellow Shakespearean was disheartened to learn that I was committing my energies to it,” Shapiro writes in the prologue, “as if somehow I was wasting my time and talent, or worse, at risk of going over to the dark side. I became increasingly interested in why this subject remains virtually taboo in academic circles, as well as in the consequences of this collective silence.”

Shapiro has not, in fact, gone over to the “dark side.” Contested Will includes a chapter on why he continues to believe that the Stratford candidate is the genuine article. But rehashing the authorship debate is not the purpose of the book. It does not attempt an exhaustive review of the merits of the competing claims. As Shapiro explicitly says, what interests him is not what people think about the authorship question but why they think it and how their personal and historical circumstances help shape that.

Understanding the Split Personality of Iceland’s Volcanoes

John Timmer in Ars Technica:

The initial images of the Eyjafjallajökull eruption showed the sort of dramatic spires of molten rock that we associate with Hawaiian volcanoes. The next time it made the news, it was because air travel throughout Northern Europe had been shut down as a huge cloud of ash spread slowly across the UK and Scandinavia—very un-Hawaiian. To get a better sense of why this Icelandic volcano was showing such a split personality, we got in touch with the American Geophysical Union, which handed us on to Dr. Jeff Karson, who's chair of the Earth Sciences department at Syracuse University. Dr. Karson patiently explained what makes volcanism in Iceland distinct.

If you're like me, and know just enough geology to be dangerous, you'd probably divide volcanoes into two categories: hotspots like Hawaii, where molten rock pours out as gently as anything that's 1,200°C possibly can, and volcanism associated with subduction zones, which tends to produce massive, explosive eruptions, such as the ones at St. Helens and Pinatubo. There is, of course, a third kind, the eruptions associated with the spreading of mid-ocean ridges. But these generally take place deep underwater, and are rarely captured on film.

Iceland is distinct because it's the product of a huge hot spot located directly under the Mid-Atlantic ridge. It's not unique in that regard; Karson said that the Azores and Galapagos Islands are the product of similar situations. But Iceland is much hotter and more active than the others.

If it's not on a subduction zone, however, you might not expect it to produce the sorts of explosive eruptions that send clouds of ash across large areas of the North Atlantic. Karson said there are two factors that can make Icelandic volcanoes pack a punch. The first is that, in addition to the basaltic magma associated with mid-ocean ridges, Iceland's volcanoes produce significant amounts of rhyolite, which is silica-rich and, more significantly, contains a lot more volatile substances. As the rhyolitic magma reaches the surface and pressure is released, the result can be an explosive eruption. As Karson put it, the cause is different from the explosive volcanic eruptions that occur at subduction zones, but the result can look very similar.

Individual magma chambers beneath Iceland can have varying mixes of basaltic and rhyolitic materials, which means that individual volcanoes on the island may have a complicated eruption history.

The other factor that adds to the explosiveness of Iceland's volcanoes is the ice itself.

Graveyard of Empires: Nine Months on the Ground in Obama’s Afghanistan

The new issue of the Virginia Quarterly Review has a number of interesting pieces on the war in Afghanistan. VQR editor Ted Genoways on the issue:

Everything seemed to be going exactly to plan. For the first week after Operation Moshtarak was launched under cover of darkness on February 13, NATO and Afghan troops lived up to the offensive’s lofty name—a Dari word meaning “together,” selected to reinforce the operation’s joint effort. The Afghan National Army made up some 60 percent of the thousands of troops advancing on the dusty redoubt of Marja, an agricultural town latticed with canals and ditches irrigating the poppy fields that made it a crossroads for heroin traffickers and pro-Taliban forces in the Helmand Province. Locals, as asked, voluntarily stayed in their homes to avoid IEDs emplaced by insurgents and shared intelligence with international commanders. Even Pakistan’s Directorate for Inter-Services Intelligence (ISI) got in on the act, arresting two “shadow governors” of Afghanistan’s northern provinces and raiding a house in Karachi where Mullah Abdul Ghani Baradar, the Taliban’s military commander, was captured.

Coming nearly nine months after General Stanley McChrystal was appointed by President Barack Obama to be commander of forces in Afghanistan, the coordinated action in the southern provinces and across the border in Pakistan appeared to be an astounding exoneration of the general’s new counterinsurgency plan. And not a moment too soon. After eight long years of military stalemate and political neglect, US troops were scoring measurable victories, and the fresh focus on winning the confidence of ordinary Afghans appeared to be paying major dividends. For the first time since the shedding of burqas and shaving of beards in the exultant early days of the invasion, the Afghan people seemed to be rallying around NATO forces.

Then, on February 21, troops sweeping for insurgents on the run from Marja intercepted Taliban radio chatter near the main road in Oruzgan Province. Little Bird helicopters, flown by elite US Special Forces, were called in. Pilots discovered a tight-knit convoy of two Land Cruisers and a pickup, all overloaded and riding low, lurching up the Khotal Chowzar mountain pass toward Daykondi Province. They concluded that the vehicles were heavy-laden with arms and insurgents. They opened fire, destroying the convoy. But when ground troops moved in to collect Taliban casualties, they instead found twenty-seven dead civilians—including at least four women and a child—and fourteen more wounded. These were ordinary Afghans, it turned out, fleeing the renewed violence. President Hamid Karzai swiftly denounced the attack as “unjustifiable” and called it “a major obstacle for an effective counterterrorism effort.”

Fatima Bhutto: ‘We didn’t know what would happen tomorrow’

From The Guardian:

In December 2007, at the moment Benazir Bhutto was murdered in the chaotic run-up to the Pakistani elections – in which she hoped to win a third term as prime minister – her niece Fatima was out campaigning. Fatima's first thought, when the news came through, was disbelief: they couldn't kill another Bhutto, they wouldn't dare. Then shock set in as she made the emotional connection with the murder of her father, Murtaza, a decade before. “I cried for the next five days,” she writes in her new book, Songs of Blood and Sword. “By the time I had drained myself of tears, I had cried for everyone.” The tears were unexpected. Fatima was out campaigning for a rival party. She has for years been a ferocious critic of Benazir and her widower, Asif Ali Zardari, now president of Pakistan, and what she calls the “Bhutto cult”, whereby party leadership is handed down through the family. But she has good memories of Benazir as well. As a child she was often told she was just like her aunt. “We liked all the same revolting sweets,” she says.

The Bhuttos have dominated Pakistani politics for decades. But dynastic politics brought murderous rivalries. Benazir fell out with her brothers, one of whom was Fatima's father, after their father, Zulfikar Ali Bhutto, founder of the Pakistan People's Party and the country's fourth president, was executed by General Zia's dictatorship in 1979. While her brothers plotted to overthrow Zia from Kabul, and were accused of orchestrating a plane hijacking in 1981, Benazir pressed for political advantage and was twice elected prime minister – and twice removed on charges of corruption. The siblings became enemies, and when Murtaza was shot dead outside the family home in Karachi in 1996, Fatima, then 14, suspected her aunt of being involved.

More here.

Indecision-Making

From The New York Times:

Sheena Iyengar is the psychologist responsible for the famous jam experiment. You may have heard about it: At a luxury food store in Menlo Park, researchers set up a table offering samples of jam. Sometimes, there were six different flavors to choose from. At other times, there were 24. (In both cases, popular flavors like strawberry were left out.) Shoppers were more likely to stop by the table with more flavors. But after the taste test, those who chose from the smaller number were 10 times more likely to actually buy jam: 30 percent versus 3 percent. Having too many options, it seems, made it harder to settle on a single selection.

Wherever she goes, people tell Iyengar about her own experiment. The head of Fidelity Research explained it to her, as did a McKinsey & Company executive and a random woman sitting next to her on a plane. A colleague told her he had heard Rush Limbaugh denounce it on the radio. That rant was probably a reaction to Barry Schwartz, the author of “The Paradox of Choice” (2004), who often cites the jam study in antimarket polemics lamenting the abundance of consumer choice. In Schwartz’s ideal world, stores wouldn’t offer such ridiculous, brain-­taxing plenitude. Who needs two dozen types of jam?

More here.

Saturday Poem

Third Person Neuter

Is God mad? Was Christ
crazy? Is truth
the legal truth? (Three PhDs who swear

the human being who believes
a human being God
is what, in fairness, speaking

clinically, we call
a nut.) No jury,
given sacred laws

of science and democracy, would now
forgive so big a claim as Christ's—a claim
for good. (The wounded get

their settlements in millions, not
worlds-without-end.) We think of bliss
as ignorance, and heaven as naiveté: the doctor's

a philosopher, the priest a practicing
apologist. Not one of them
will let me see

with my own eyes my friend again.
When experts gave him time, it made
his luck and language die. What good

was love? It was the ultimate
authority to quit.
He had no use

for flesh at last
and, Christ,
I'm made of it.

by Heather McHugh
from Shades;
Wesleyan University Press, 1988

Byrne, Baby, Byrne

Michael Archer interviews David Byrne in Guernica:

Setting Imelda Marcos’s life to music—dance beats, no less—seems a perfectly mainstream concept coming from David Byrne. After all, this is a man who placed an old pump organ inside the Great Hall of the Battery Maritime Building in New York City and used hoses to connect it to pumps and motors set throughout the century-old former ferry terminal, so that when visitors pushed the organ’s keys they were “Playing the Building,” the project’s name. Still, a musical version of the former Philippine First Lady’s life will likely raise some eyebrows. But beyond that, it’s also a potential new business model for the record industry. How so? By creating a musical biography of Marcos, one with a specific narrative thread, Byrne hopes to drum up demand not just for the catchiest of the songs, but the entire arc of the CD collection.

Here Lies Love, in which Byrne, 57, collaborates with Fatboy Slim (Norman Cook) to chronicle the rise and fall of Marcos and her relationship with former servant Estrella Cumpas, seems a perfectly logical progression for the frontman and principal songwriter for the legendary Talking Heads, the influential band that placed four albums on Rolling Stone’s list of the 500 greatest and was inducted into the Rock and Roll Hall of Fame in 2002. Throughout his career, Byrne has collaborated with musicians as varied as Selena, members of Devo, and Luaka Bop. His world music label has released work from Os Mutantes, the Brazilian psychedelic rock band, and the Belgian group Zap Mama.

More here. [I wrote about Here Lies Love here.]

The Blacks of Mexico

Alexis Okeowo in More Intelligent Life:

The first time I felt deeply uncomfortable being black was when I was a kid. My family had just moved to Alabama, and I was in a car with my father and my brother. A white woman with a harshly lined face and brown frizzy hair yelled out a racial slur as we drove by. Dad immediately put the car in reverse and drove over to her as she pumped gas at a filling station. “What did you say?” he demanded. She glared at him and refused to respond. Shocked into silence, my brother and I didn't say anything for the rest of the drive home.

The second time was in a quaint town in Mexico. I am a journalist living in Mexico City and I had decided to take a trip to Veracruz, where hundreds of thousands of African slaves had been brought by Spanish colonialists five centuries prior. I wanted to visit Yanga, a place that called itself “the first free slave town in the Americas”. The town was named for Gaspar Yanga, a slave who had led a successful rebellion against the Spanish in the 16th century.

I had only just learned about Afro-Mexicans, the isolated descendants of Mexico's original slaves, who reside on the country's rural Pacific and Gulf Coasts. After months of research and a visit to the remote Afro-Mexican community on the Pacific Coast, where most of them live, I felt compelled to visit the Afro-Mexicans in Veracruz on the Gulf Coast. I ended up spending most of my time trying to figure out Yanga.

As I arrived in town, I peered out of my taxi window at the pastel-painted storefronts and the brown-skinned residents walking along the wide streets. “Where are the black Mexicans?” I wondered. A central sign proclaimed Yanga's role as the first Mexican town to be free from slavery, yet the descendants of these former slaves were nowhere to be found. I would later learn that most live in dilapidated settlements outside of town.

Neuroeconomic Theory

Isabelle Brocas and Juan D. Carrillo over at Vox EU:

Economics has always relied on a careful modelling of decision-makers. They are described by utility functions that represent their goals, and they interact at (Nash) equilibrium. Nevertheless, the discrepancies between theoretical predictions and observed behaviour have haunted the field for many decades.

To cope with this mismatch, behavioural economists have developed new theories of decision-making that are a better fit for the behavioural data than traditional models. The methodology consists in building models to demonstrate the relationship between a cause (such as a preference for a particular object) and a behavioural anomaly. This line of research formulates possible explanations for behavioural data, but it is nevertheless subject to shortcomings. Often the cause is not observable, and there is no evidence of the relationship provided by the model. Most notably, the freedom provided by the introspection method leads to a model selection problem. Also, the cause of the behavioural anomaly may simply lie elsewhere.

Neuroeconomics offers a solution through an additional set of data obtained via a series of measurements of brain activity at the time of decisions. Experimental neuroeconomics can be seen as a subfield of experimental economics, where behavioural data is enriched with brain data. Neuroeconomic theory proposes to build brain-based models capable of predicting observed behaviour.

Experimental neuroeconomics is controversial. While some consider it to be an irrelevant body of research, there are those who claim it is essential (see Camerer et al. 2005, Gul and Pesendorfer 2008). In fairness, the field is probably too young to tell. Surprisingly, the discussion has been centred on empirical issues regarding the collection method, amount, cost, and quality of brain data – the broad implications have not received as much attention. Indeed, the new set of data provided by experimental neuroeconomics will shed light on the causes of behaviour (and therefore of the behavioural anomalies) and help build new theories capable of explaining and predicting decisions, a long-term goal of economics. Neuroeconomic theory offers to do precisely this. So far, research in that direction has been very limited and its impact has been largely ignored.

On Obama’s Plan for the Future of Human Space Flight

Lee Billings in Seed:

Consider NASA: Today’s extremely expensive and somewhat unsatisfying staples of US human spaceflight, the space shuttle and a space station, can be traced back forty years or so to the aftermath of the Apollo program, when President Nixon chose them as its replacements. The legacy of these choices is that no one has ventured beyond low-Earth orbit ever since.

This has far graver implications than failing to fulfill a generation’s dreams of Pan-Am moon flights and Lunar Hiltons. In the fullness of time, lack of progress in expanding humanity’s sustainable presence off-planet represents an existential threat to our species. Even staunch critics of human spaceflight must acknowledge this. It doesn’t take a rocket scientist to realize that keeping all one’s eggs in a single, vulnerable planet-sized basket is an unsound long-term investment strategy. To do so is to court extinction.

NASA administrator Charles Bolden may have had such thoughts in mind on Tuesday when he addressed the audience at the National Space Symposium. “This is a big week for the entire nation,” Bolden said, “and it’s a week where probably more people than ever before will be thinking about space. It’s an important week for all of us in the space industry and it’s a particularly important week for NASA.”

Bolden was referring to the Obama administration’s new plans for America’s space agency, which the president himself presented and defended yesterday in a speech at Kennedy Space Center in Cape Canaveral, Florida. President Obama’s 2011 budget request eliminated funding for the earlier Bush administration’s Constellation program, which aimed to build a fleet of government-run rockets for returning astronauts to the Moon in the 2020s to build a lunar base. Instead, Obama’s plan deemphasizes the Moon and pumps more money into the commercial space sector for creating rockets to replace the aging space shuttle fleet. It also seeks to spark innovation, boosting NASA funds for development of breakthrough technologies in space-based propulsion, life-support, and power generation.

ramble on, outworn college creed

Fredric Jameson’s pre-eminence, over the last generation, among critics writing in English would be hard to dispute. Part of the tribute has been exacted by his majestic style, one distinctive feature of which is the way that the convoy of long sentences freighted and balanced with subordinate clauses will dock here and there to unload a pithy slogan. ‘Always historicise!’ is one of these, and Jameson has also insisted, under the banner of ‘One cannot not periodise,’ on the related necessity (as well as the semi-arbitrariness) of dividing history into periods. With that in mind, it’s tempting to propose a period, coincident with Jameson’s career as the main theorist of postmodernism, stretching from about 1983 (when Thatcher, having won a war, and Reagan, having survived a recession, consolidated their popularity) to 2008 (when the neoliberal programme launched by Reagan and Thatcher was set back by the worst economic crisis since the Depression). During this period of neoliberal ascendancy – an era of deregulation, financialisation, industrial decline, demoralisation of the working class, the collapse of Communism and so on – it often seemed easier to spot the contradictions of Marxism than the more famous contradictions of capitalism, and no figure seemed to embody more than Fredric Jameson the peculiar condition of an economic theory that had turned out to flourish above all as a mode of cultural analysis, a mass movement that had become the province of an academic ‘elite’, and an intellectual tradition that had arrived at some sort of culmination right at the point of apparent extinction.

more from Benjamin Kunkel at the LRB here.

when will they come for our women?

In the Cascade mountains of California, north of Lassen Peak, astronomers are looking for aliens. The Allen Telescope Array (mostly paid for by Paul Allen, co-founder of Microsoft) consists of 42 dish antennas, each six metres across, scattered across the countryside. When the array is complete, it will have 350 dishes that, by acting in concert, will have the power of a single instrument 700 metres across. The Allen telescope is looking for aliens the traditional way: by searching for radio signals that have either been sent out deliberately, or leaked into space accidentally, as human radio signals are. The search for extraterrestrial intelligence, or SETI, is a 50-year-old idea. Much progress has been made in locating Earthlike planets, and about 1,000 star systems have also been subject to serious radio scrutiny. The Allen array will increase the number to one million within a decade.

more from The Economist here.