Fossils point to a big family for human ancestors

From Nature:

Fossilized skulls show that at least three distinct species belonging to the genus Homo existed between 1.7 million and 2 million years ago, settling a long-standing debate in palaeoanthropology. A study published this week in Nature focuses on Homo rudolfensis, a hominin with a relatively flat face, which was first identified from a single large skull in 1972. Several other big-skulled fossils have been attributed to the species since then, but none has included both a face and a lower jaw. This has been problematic: in palaeoanthropology, faces and jaws function like fingerprints for identifying a specimen as a particular species (which is indicated by the second word in a Linnaean title, such as 'rudolfensis'), as opposed to the broader grouping of genus (the first word, as in 'Homo').

Without complete skulls, it has been difficult to reach a consensus on whether specimens attributed to H. rudolfensis are genuinely members of a distinct species, or actually belong to other Homo species that lived around the same time, such as Homo habilis or Homo erectus. Understanding how many different Homo species there were, and whether they lived concurrently, would help to determine whether the history of the human lineage saw fierce competition between multiple hominins, or a steady succession from one species to another. But the latest result has dissipated much of this uncertainty. It concerns three fossils — two lower jaws and a juvenile’s lower face — that were found in a desert area called Koobi Fora in northern Kenya. The team that pulled them out of the ground, led by Meave Leakey, a palaeontologist at the Turkana Basin Institute in Nairobi, describes how the dental arcade, the arch created by the teeth at the front of the mouth, is nearly rectangular, just like the palate structure of the 1972 skull. By contrast, the average modern human mouth has a curved dental arcade.

More here.

Thursday Poem

Elk Trails

Ancient, world-old Elk paths
Narrow, dusty Elk paths
Wide-trampled, muddy,
Aimless . . . wandering . . .
Everchanging Elk paths.

I have walked you, ancient trails,
Along the narrow rocky ridges
High above the mountains that
Make up your world:
Looking down on giant trees, silent
In the purple shadows of ravines—
Above the spire-like alpine fir
Above the high, steep-slanting meadows
Where sun-softened snowfields share the earth
With flowers

I have followed narrow twisting ridges,
Sharp-topped and jagged as a broken crosscut saw
Across the roof of all the Elk-world
On one ancient wandering trail,
Cutting crazily over rocks and dust and snow—
Gently slanting through high meadows,
Rich with scent of Lupine,
Rich with smell of Elk-dung,
Rich with scent of short-lived
Dainty alpine flowers.
And from the ridgetops I have followed you
Down through heather fields, through timber,
Downward winding to the hoof-churned shore of
One tiny blue-green mountain lake
Untouched by lips of men.

Read more »

Wednesday, August 8, 2012

The Last Days of Pushing on a String

Mark Blyth over at the Harvard Business Review blog:

A metaphor attributed to John Maynard Keynes maintains that using monetary policy to fight a severe recession is like “pushing on a piece of string.” When the problem is inflation, pushing up interest rates (pulling on a string) is a pretty effective policy tool — ask anyone who lived through the Volcker recession of the early 1980s. But when rates are pushed down to stimulate economic activity the 'push' becomes less and less effective the closer to zero rates get.

The power of this “pushing on a string” metaphor is especially apparent today. The Federal Reserve's balance sheet shows that, since 2008, “deposits by depository institutions” (i.e. banks) have ballooned from about $30 billion to around $1.5 trillion. Why is all that money sitting at the Fed earning a meager 0.25% nominal interest when those same banks could make a lot more than that by lending it out?

The answer is simple: uncertainty about the future. Not uncertainty over Obamacare, or “regulation,” or any of the other bêtes noires of the moment, but uncertainty over the lack of demand in an economy whose consumers and producers are paying back debt. After all, who opens a factory in the middle of a recession? But if we all think this way then investment expectations fall, which hits borrowing and lending activity, thereby bringing about the very recession that we all wanted to avoid in the first place.

The New Great Game in Central Asia

Alexander Cooley in Foreign Affairs:

In the last decade, the world has started taking more notice of Central Asia. For the United States and its allies, the region is a valuable supply hub for the Afghanistan war effort. For Russia, it is an arena in which to exert political influence. For China, it is a source of energy and a critical partner for stabilizing and developing the restive Xinjiang province in the Middle Kingdom's west. Some commentators have referred to Washington, Moscow, and Beijing's renewed activity in the region as a modern iteration of the Great Game. But unlike the British and Russian empires in their era of competition and conquest, the Central Asian governments are working to use renewed external involvement to their sovereign advantage, fending off disruptive demands and reinforcing their political control at home. Accordingly, the Central Asian case today is not a throwback to the past but a guide to what is to come: the rise of new players and the decline of Western influence in a multipolar world.

The first lesson to take from China, Russia, and the United States' involvement in Central Asia is that it has strengthened the hand of rulers, who have been able to play the suitors off one another to extract economic benefits and political support where possible. Most dramatically, in 2009, President Kurmanbek Bakiyev of Kyrgyzstan, host to the Manas Transit Center, initiated a bidding war between the United States and Russia by threatening to close the base.

the unknowable one

Marilyn

But none of that explains why, 50 years after her death, she is latent, current, mysterious yet known. When she died, the popular explanation was suicide, and it has always been easy to believe that lovely, uneducated kids often get found out by fame and stardom. But any examination of her death teaches the lesson that hers is the first death in that haunting line of the ’60s that includes the Kennedy brothers, Malcolm X, Martin Luther King, and Lee Harvey Oswald. How can such notable people die in uncertainty? Are there really infinite intrigues in the world, or do we refuse to accept simple and obvious answers? It is a kind of religion. So the collection of stories attending her death are more potent than her films, and they provide an occult explanation for that gorgeous, plaintive look she had: “What do you think happened to me?” We tell ourselves now that we are known in so many oppressive ways: Our identity is laid out in numbers ready to be stolen; all our e-mails are retrievable; increasingly, we are subject to surveillance, all meant for our “security,” but all contributing to its opposite. Monroe stands for this unlikely possibility: that in an age of mounting data and information storage, it is possible for someone beautiful and famous to be unknowable.

more from David Thomson at TNR here.

Art is becoming more loquacious


This seems to be a moment when art needs to take stock of itself, to reassess its position both historically—that is, in relation to the art of the past—and functionally, in the sense of reconsidering what distinguishes it from (and links it to) other cultural practices. After all, this is not some eccentric byway that Christov-Bakargiev has followed blindly; it can’t be a coincidence that this year’s Manifesta and Paris Triennale are both as steeped in anthropology and art history as Documenta. Perhaps because Documenta is the largest—and most distended—of the three exhibitions, it is also the one that seems to have no decisive sense of what contemporary art can be. And yet there are artists re-examining the nature and function of art today; some of them are even included in Documenta. One is Kader Attia, whose installation includes sculptures he commissioned from African craftsmen: he asked them to copy photographs of hideously disfigured World War I veterans, with the result that the “grotesque” anatomical distortions we admire in tribal sculpture are reframed as nearly naturalistic attempts to render an almost unbearably poignant reality. And I should mention here too, among others, the videos of William Kentridge and Wael Shawky and a typically interrogative performance piece by Tino Sehgal.

more from Barry Schwabsky at The Nation here.

What Was Revealed When the Lights Went Out in India

Jonathan Shainin in The New Yorker:

As world news events go, the biggest power failure in history—which struck India Tuesday afternoon, plunging almost seven hundred million people into hypothetical darkness—may have been less momentous than advertised. There was chaos, of course: gridlocked traffic, stalled trains, and stranded commuters. Water supplies were interrupted, hospitals ceased all but essential services, and at least a few hundred unlucky miners in two eastern states were trapped underground for several nerve-wracking hours.

But power cuts are hardly uncommon in India, which is why offices and factories have diesel generators and the homes of the better-off come equipped with battery backup systems. (Basharat Peer has written about how strategies for shortages are woven into daily life.) Many people caught in the middle of the world’s biggest power outage experienced it as a brief flicker of the lights. And many others didn’t experience it at all. Though the headlines announced that seven hundred million people across twenty-one states had lost power, only about three hundred and twenty million of those had any electricity to begin with: in Uttar Pradesh, India’s most populous state and one of its poorest, sixty-three per cent of households, or about a hundred and twenty-five million people, lack access to electricity. Nationwide, about one third of households (roughly four hundred million people, more than everyone in the United States) don’t have electricity—which sounds like an astonishing number, until you consider that twenty years ago fifty-eight per cent of households were without electric power.

More here.

What we don’t understand about religion just might kill us

Scott Atran in Foreign Policy:

The era of world struggle between the great secular ideological –isms that began with the French Revolution and lasted through the Cold War (republicanism, anarchism, socialism, fascism, communism, liberalism) is passing on to a religious stage. Across the Middle East and North Africa, religious movements are gaining social and political ground, with election victories by avowedly Islamic parties in Turkey, Palestine, Egypt, Tunisia, and Morocco. As Israel's National Security Council chief, Gen. Yaakov Amidror (a religious man himself), told me on the eve of Tunisia's elections last October, “We expect Islamist parties to soon dominate all governments in the region, from Afghanistan to Morocco, except for Israel.”

On a global scale, Protestant evangelical churches (together with Pentecostalists) continue to proliferate, especially in Latin America, but also keep pace with the expansion of fundamentalist Islam in southern Africa and eastern and southern Asia. In Russia, a clear majority of the population remains religious despite decades of forcibly imposed atheism. Even in China, where the government's commission on atheism has the Sisyphean job of making that country religion-free, religious agitation is on the rise. And in the United States, a majority says it wants less religion in politics, but an equal majority still will not vote for an atheist as president.

But if reams of social scientific analysis have been produced on religion's less celestial cousins — from the nature of perception and speech to how we rationalize and shop — faith is not a matter that rigorous science has taken seriously. To be sure, social scientists have long studied how religious practices correlate with a wide range of economic, social, and political issues. Yet, for nearly a century after Harvard University psychologist William James's 1902 masterwork, The Varieties of Religious Experience, there was little serious investigation of the psychological structure or neurological and biological underpinnings of religious belief that determine how religion actually causes behavior. And that's a problem if science aims to produce knowledge that improves the human condition, including a lessening of cultural conflict and war.

More here.

Education and “The Public Promotion of Moral Genius”: An Interview with Peter Hershock

Matt Bieber in The Wheat and Chaff:

Peter Hershock is the author of Buddhism in the Public Sphere, one of the most interesting books about public policy that I have ever read. The book presents a set of Buddhist perspectives on a series of political and policy challenges. Each chapter – which covers issues as varied as the environment and terrorism – is worth a read. The final chapter, which serves as the jumping-off point for this interview, is a tour de force of wide-ranging theory and fresh insight about the purposes and practices of contemporary education.

Hershock is an education specialist at the East-West Center in Honolulu. In addition to Buddhism in the Public Sphere, he has written or co-edited many other books, including Educations and Their Purposes: A Conversation Among Cultures.

MATT BIEBER: In your view, much of contemporary education concerns itself with three goals: transmitting information and knowledge, imparting “circumstantially useful skills”, and forming young people through “principle-structured character development and socialization.” Many educational theorists would argue that this forms at least a partial list, if not a complete list, of appropriate educational goals. For you, however, this educational paradigm is deeply inappropriate and, in fact, in crisis. Why?

PETER HERSHOCK: Well, that’s a big question, and we’re going to need a lot of history to be able to respond. Here are some quick thoughts, and then we can do more background if we need to.

More here.

Wednesday Poem

On the Table

I was taught to smooth the aura at the end
said my masseuse, hands hovering at the end.

Inches above my placid pummeled self
did I feel something floating at the end?

Is my naked body merely prone
to ectoplasmic vapors to no end?

Many another arthritic has lain here
seeking to roll pain's ball end over end.

Herbal oils, a CD playing soft loon calls,
wave raps, bird trills now must end.

I rise and dress, restored to lift and bend,
my ethereal wisp invisible at the end.

by Maxine Kumin
from Ravishing Dis-Unities – Real Ghazals in English
Wesleyan University Press, 2000

The Architecture of Memory

From Smithsonian:

Most of us think of memory as a chamber of the mind, and assume that our capacity to remember is only as good as our brain. But according to some architectural theorists, our memories are products of our body’s experience of physical space. Or, to consolidate the theorem: Our memories are only as good as our buildings. In the BBC television series “Sherlock,” the famous detective’s capacious memory is portrayed through the concept of the “mind palace”—what is thought to be a sort of physical location in the brain where a person stores memories like objects in a room. Describing this in the book A Study in Scarlet, Holmes says, “I consider that a man’s brain originally is like a little empty attic, and you have to stock it with such furniture as you choose…”

The mind palace—also known as the memory palace or method of loci—is a mnemonic device thought to have originated in ancient Rome, wherein items that need to be memorized are pinned to some kind of visual cue and strung together into a situated narrative, a journey through a space. The science writer and author Joshua Foer covered this technique in depth in his book Moonwalking with Einstein, in which he trained for and ultimately won the U.S. Memory Championship. To memorize long lists of words, a deck of cards, a poem, or a set of faces, mental athletes, as they’re called, fuse a familiar place—say, the house they grew up in—with a self-created fictional environment populated by the objects in their list.

More here.

Social Network Size Linked to Brain Size

From Scientific American:

As humans, we aren't born with formidable armaments or defenses, nor are we the strongest, fastest, or biggest species, yet despite this we are amazingly successful. For a long time it was thought that this success was because our enlarged brains allow each of us to be smarter than our competitors: better at abstract thinking, better with tools and better at adapting our behavior to those of our prey and predators. But are these really the most significant skills our brains provide us with? Another possibility is that we are successful because we can form long-lasting relationships with many others in diverse and flexible ways, and that this, combined with our native intelligence, explains why Homo sapiens came to dominate the planet. In every way from teaching our young to the industrial division of labour we are a massively co-operative species that relies on larger and more diverse networks of relationships than any other species.

In 1992 British anthropologist Robin Dunbar published an article showing that, in primates, the ratio of the size of the neo-cortex to that of the rest of the brain consistently increases with increasing social group size. For example, the Tamarin monkey has a brain size ratio of about 2.3 and an average social group size of about 5 members. On the other hand, a Macaque monkey has a brain size ratio of around 3.8 but a very large average group size of about 40 members. From this work Dunbar put forward what is now known as the “social brain hypothesis.” The relative size of the neo-cortex rose as social groups became larger in order to maintain the complex set of relationships necessary for stable co-existence. Most famously, Dunbar suggested that given the human brain ratio we have an expected social group size of around 150 people, about the size of what Dunbar called a “clan.”

Now, in the journal Proceedings of the Royal Society B, Dunbar and his colleagues have shown that the size of each individual’s social network is linearly related to the neural volume in a frontal region of each individual’s brain, the orbital prefrontal cortex. This provides strong support, at the individual level, for what Dunbar originally conjectured from species-level data: our brains are not as large as they are in order to provide each of us with the raw computational power to think our way out of a sticky situation; instead, our brain size helps each of us to deal with the large and complex network of relationships we rely on to thrive.

More here.

Tuesday, August 7, 2012

dementia praecox


Modern American psychiatry treats auditory hallucinations as the leading symptom of serious psychotic disorder, of which the most severe form is schizophrenia. When the German psychiatrist Emil Kraepelin first distinguished dementia praecox, as he called it, from manic-depressive disorder in 1893, back when Freud was drafting the Interpretation of Dreams, he argued that schizophrenia could be recognized by its persistent, deteriorating course. These days, schizophrenia is often imagined as the quintessential brain disease, an expression of underlying organic vulnerability perhaps exacerbated by environmental stress, but as real and as obdurate as kidney failure. The new post-psychoanalytic psychiatric science that emerged in this country in the 1980s argued that mental illnesses were physical illnesses. Many Americans and most psychiatrists took away from this science a sense that serious mental illnesses were brain dysfunctions and that the best hope for their treatment lay in the aggressive new drugs that patients often hated but that sometimes held symptoms at bay.

more from T. M. Luhrmann at The American Scholar here.

sisters of the night


The bookstore, and especially the used bookstore, is vanishing from New York City. Today there are a few, but there used to be a multitude of them, crammed between kitchen appliance shops and Laundromats and thrift stores. They all had temperamental cats prowling their aisles and they all smelled wonderfully of what a team of chemists in London has called “a combination of grassy notes with a tang of acids and a hint of vanilla over an underlying mustiness.” I will miss terribly this stimulating fragrance, and the books that produce it, when it’s washed from the city for good. Luckily, there are towns that still accommodate used bookshops. Lambertville, New Jersey, is one of them. On North Union Street, there are two used bookstores, Panoply and Phoenix Books, one right across from the other. You can spend hours here, and it’s guaranteed that you’ll return with some grassy, musty artifact of the past. On my last visit to Panoply, I came home with a copy of Sisters of the Night: The Startling Story of Prostitution in New York Today by “veteran newspaperman” Jess Stearn.

more from Jeremiah Moss at the Paris Review here.

The morality of drone warfare revisited

NOTE: Bradley Strawser has written to me and strongly protested the way he was portrayed in this article. He is making a formal complaint to the independent ombudsman at The Guardian and has also published this op-ed to correct some of what he feels are misrepresentations of his views. –Abbas Raza

Bradley Strawser in The Guardian:

In the contentious debate over drone warfare, it is necessary to separate US government policy from the broader moral question of killing by aerial robots. The policy question deserves vigorous debate by legal scholars, policy experts, and diplomats. The moral question posed by this new form of remote warfare is more abstract and has only recently begun to receive critical examination by philosophers and ethicists.

The Guardian has attempted to feature the distinct moral and philosophical side of this issue, and a recent story profiled my own views on the topic. Unfortunately – if understandably, given the complexities of the matter – I consider that some of my views were misrepresented. Most disturbingly, I was reported to claim that “there's no downside” to killing by drones. In fact, the majority of my work on drones is dedicated to elucidating and analyzing the serious moral downsides that killing by remote control can pose. The Guardian has graciously offered me this space to set the record straight.

My view is this: drones can be a morally preferable weapon of war if they are capable of being more discriminate than other weapons that are less precise and expose their operators to greater risk. Of course, killing is lamentable at any time or place, be it in war or any context. But if a military action is morally justified, we are also morally bound to ensure that it is carried out with as little harm to innocent people as possible.

More here.

Anything But Human

Richard Polt in the NYT's The Stone:

Wherever I turn, the popular media, scientists and even fellow philosophers are telling me that I’m a machine or a beast. My ethics can be illuminated by the behavior of termites. My brain is a sloppy computer with a flicker of consciousness and the illusion of free will. I’m anything but human.

While it would take more time and space than I have here to refute these views, I’d like to suggest why I stubbornly continue to believe that I’m a human being — something more than other animals, and essentially more than any computer.

Let’s begin with ethics. Many organisms carry genes that promote behavior that benefits other organisms. The classic example is ants: every individual insect is ready to sacrifice itself for the colony. As Edward O. Wilson explained in a recent essay for The Stone, some biologists account for self-sacrificing behavior by the theory of kin selection, while Wilson and others favor group selection. Selection also operates between individuals: “within groups selfish individuals beat altruistic individuals, but groups of altruists beat groups of selfish individuals. Or, risking oversimplification, individual selection promoted sin, while group selection promoted virtue.” Wilson is cautious here, but some “evolutionary ethicists” don’t hesitate to claim that all we need in order to understand human virtue is the right explanation — whatever it may be — of how altruistic behavior evolved.

I have no beef with entomology or evolution, but I refuse to admit that they teach me much about ethics. Consider the fact that human action ranges to the extremes. People can perform extraordinary acts of altruism, including kindness toward other species — or they can utterly fail to be altruistic, even toward their own children. So whatever tendencies we may have inherited leave ample room for variation; our choices will determine which end of the spectrum we approach. This is where ethical discourse comes in — not in explaining how we’re “built,” but in deliberating on our own future acts. Should I cheat on this test? Should I give this stranger a ride?

David Mamet, Gilad Atzmon and Identity Politics

James Warner in Open Democracy:

Reading Gilad Atzmon's The Wandering Who? immediately after David Mamet's The Secret Knowledge, I was surprised to find the two books, written from vehemently opposed political viewpoints, nonetheless reminded me of each other. Does Mamet's need to see the Israelis only as scapegoats grow from the same root as Atzmon's need to see them only as perpetrators? An underlying emotional argument of Mamet's The Secret Knowledge could be glossed as “I used to be a poster child for liberalism, so all the more reason to believe me now I reject everything about liberalism.” For an underlying emotional argument of Atzmon's The Wandering Who? substitute “Zionism” for “liberalism.” But even if this were a compelling line of argument, each book contains plenty of evidence Mamet and Atzmon were never exactly poster children.

Mamet's plays and other writings celebrate individual courage, discipline, and commitment. While he has only recently started identifying as a conservative, his long-term distrust of academia and high estimation of street smarts, his generally low opinion of human nature and belief that playing the victim card is a more contemptible route to power than is straightforward self-interested chicanery – while arguably bipartisan attitudes – in the contemporary U.S. tend to be more associated with the right. It's not surprising if a man whose plays observe the Aristotelian unities of Time, Place and Action leans conservative, while when it comes to Israel – more likely the driving factor behind Mamet's political conversion – he has for some time been on the right of Israel's foreign policy spectrum. According to The Secret Knowledge, he now desires a Republican victory in the U.S. in 2012 and the repeal of health care reform, Israel's infallibility apparently not extending to its system of socialized medicine. Mamet loves America and Israel for their entrepreneurialism, and tends toward the neocon line that Israel is the front line in the “War Against Terrorism,” and that anyone criticizing the Israeli government's treatment of the Palestinians must be an anti-Semite. Mamet reports he is now ashamed not to have fought in Vietnam, a lack for which his more recent hawkishness could be seen as a bid to compensate.

Atzmon on the other hand is in rebellion against his own experience of the 1980s Israel-Lebanon war, recalling in The Wandering Who? visiting a prison camp in Lebanon where Palestinians were incarcerated by Israelis, and deciding he was on the wrong side.