State of the Species

Charles C. Mann in Orion Magazine:

THE PROBLEM WITH environmentalists, Lynn Margulis used to say, is that they think conservation has something to do with biological reality. A researcher who specialized in cells and microorganisms, Margulis was one of the most important biologists in the last half century—she literally helped to reorder the tree of life, convincing her colleagues that it did not consist of two kingdoms (plants and animals), but five or even six (plants, animals, fungi, protists, and two types of bacteria).

Until Margulis’s death last year, she lived in my town, and I would bump into her on the street from time to time. She knew I was interested in ecology, and she liked to needle me. Hey, Charles, she would call out, are you still all worked up about protecting endangered species? Margulis was no apologist for unthinking destruction. Still, she couldn’t help regarding conservationists’ preoccupation with the fate of birds, mammals, and plants as evidence of their ignorance about the greatest source of evolutionary creativity: the microworld of bacteria, fungi, and protists. More than 90 percent of the living matter on earth consists of microorganisms and viruses, she liked to point out. Heck, the number of bacterial cells in our body is ten times more than the number of human cells! Bacteria and protists can do things undreamed of by clumsy mammals like us: form giant supercolonies, reproduce either asexually or by swapping genes with others, routinely incorporate DNA from entirely unrelated species, merge into symbiotic beings—the list is as endless as it is amazing. Microorganisms have changed the face of the earth, crumbling stone and even giving rise to the oxygen we breathe. Compared to this power and diversity, Margulis liked to tell me, pandas and polar bears were biological epiphenomena—interesting and fun, perhaps, but not actually significant.

Does that apply to human beings, too? I once asked her, feeling like someone whining to Copernicus about why he couldn’t move the earth a little closer to the center of the universe. Aren’t we special at all? This was just chitchat on the street, so I didn’t write anything down. But as I recall it, she answered that Homo sapiens actually might be interesting—for a mammal, anyway. For one thing, she said, we’re unusually successful.

More here.

The Human Centipede; Or, How to Move to New York

Elissa Bassist in The Paris Review:

I moved to New York for graduate school. I was in my mid-twenties, and what do we do when we’re in our mid-twenties? We move to New York with very little money and very high hopes. Like many, I entered into the nexus of love and wealth and fame looking for a piece of the glistering and transmutable dream itself. In short, I was here to write a book. But standing on the threshold of this dream, I began to panic. I thought, I have arrived, and thought nothing of how far I had to go or what it would take to get there. I could see downtown Brooklyn from my window, and most days my impression of New York came from inside my bedroom. Outside, the sidewalks were cobbled and uneven, and the houses and apartments looked like replicas of the houses and apartments I’d watch on TV.

I’d lived in Brooklyn less than a month but had already settled into an inexplicable depression I’d nicknamed The Darkness. I couldn’t leave my apartment, except to attend class in Manhattan two nights a week. Sitting on the F train, I felt sure no one could live in New York without a constantly replenished supply of antidepressants, courtesy of some kind of pharmaceutical Fresh Direct. The city and its boroughs, alternating blocks under perpetual construction, seemed to reflect its residents. Walking home on those Mondays and Wednesdays, I saw the funeral parlors and casket makers on what felt like every corner and wondered if funeral parlors and casket makers really were ubiquitous in New York, or if I was just noticing them more.

More here.

Hounding the Dissenters

In the wake of the Lahore Institute of Management Sciences' (LUMS) decision to not renew Pervez Hoodbhoy's contract, Mohammad Taqi in Daily Times:

Pakistan has an unfortunate track record of hounding dissenters from Dr Fatima Ali Jinnahbhoy to Dr Pervez Amirali Hoodbhoy. And the bullying has gone on with impunity and without shame. The recent slap-on-the-wrist verdict by the Supreme Court (SC) in the 16-year-old Asghar Khan case barely scratches the surface of how institutionalised and deep the rot is.

A hyperactive judiciary that has been adjudicating anything from samosa prices to the National Reconciliation Ordinance (NRO) 2007 and had no hesitation in sending an elected prime minister packing for non-implementation of its orders in the NRO case, danced around the substance of the Asghar Khan case. The SC has recommended action ‘under the law’ against the former army chief, General (retired) Mirza Aslam Beg and the ISI boss at the time, General (retired) Asad Durrani, for their illegal actions. The court effectively turfed the matter to the government by ordering it to set off this process through the Federal Investigation Agency.

The lawyer for the petitioner, the outstanding Salman Akram Raja, is spot on when he says that proceedings against the culprits should be initiated under Article six of the constitution. What could serve as the touchstone here is the SC’s own judgment in the Asma Jilani vs Government of Punjab case 1972 that, “As soon as the first opportunity arises, when the coercive apparatus falls from the hands of the usurper, he should be tried for high treason and suitably punished. This alone will serve as a deterrent to the would-be adventurers.” It would have certainly helped the civilian leadership if the SC had provided at least some clarity as to what exactly it had in mind when it directed the government to punish the offenders. The verdict left the door open to proceed potentially against the political activity of the president, but it would neither touch the political beneficiaries of the Mehran Bank-army collusion nor the army itself with a 10-foot pole. Nonetheless, the ruling provides the civilian leadership an opportunity to proceed against those who have directly or indirectly, as in the 1990 stolen elections, usurped the people’s mandate or refused to honour it when the Pakistani people have spoken despite their machinations.

Why IQs Rise

Meehan Crist and Tim Requarth in TNR:

IN THE MID-’80s, the political philosopher James Flynn noticed a remarkable but puzzling trend: for the past century, average IQ scores in every industrialized nation have been steadily rising. And not just a little: nearly three points every decade. Every several years, IQ tests have to be “re-normed” so that the average remains 100. This means that a person who scored 100 a century ago would score 70 today; a person who tested as average a century ago would today be declared mentally retarded.
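
To make the re-norming concrete, here is a minimal sketch of the arithmetic the article describes, assuming the steady three-points-per-decade gain it cites; the function and numbers are illustrative, not drawn from Flynn's actual data.

```python
# Minimal sketch of the Flynn-effect re-norming arithmetic described above.
# Assumes a steady gain of 3 IQ points per decade, as the article states;
# the function name and structure are illustrative, not from Flynn's work.

GAIN_PER_DECADE = 3

def score_on_todays_norms(old_score: float, decades_ago: float) -> float:
    """Express a historical score on today's re-normed scale.

    Raw performance rises ~3 points per decade, but each re-norming resets
    the average to 100, so the same raw performance from the past maps to
    a lower score against current norms.
    """
    return old_score - GAIN_PER_DECADE * decades_ago

print(score_on_todays_norms(100, 10))  # a century (10 decades) ago -> 70.0
```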

This bizarre finding—christened the “Flynn effect” by Richard Herrnstein and Charles Murray in The Bell Curve—has since accumulated so much supporting evidence that in 2007 Malcolm Gladwell declared in The New Yorker that “the Flynn effect has moved from theory to fact.” But researchers still cannot agree on why scores are going up. Are we simply getting better at taking tests? Are the tests themselves a poor measure of intelligence? Or do rising IQ scores really mean we are getting smarter?

In spite of his new book’s title, Flynn does not suggest a simple yes or no to this last question. It turns out that the greatest gains have taken place in subtests that measure abstract reasoning and pattern recognition, while subtests that depend more on previous knowledge show the lowest score increases. This imbalance may not reflect an increase in general intelligence, Flynn argues, but a shift in particular habits of mind. The question is not, why are we getting smarter, but the much less catchy, why are we getting better at abstract reasoning and little else?

Koshik the Elephant Can Speak Korean

Becky Crew in Scientific American:

Upstaged spectacularly by a young Beluga whale that can sort of speak human, an Asian elephant named Koshik can also imitate human speech, but in Korean, using his trunk.

Captive-born in 1990 and transferred to South Korea’s Everland Zoo three years later, Koshik lived with two female Asian elephants for a couple of years before being kept completely alone for the following seven years. During this time, he showed a keen interest in learning several spoken commands, and by August 2004, when he was 14 years old and about to reach sexual maturity, his trainers noticed that he was attempting to imitate their speech.

It’s not known if this was the first time Koshik imitated human speech, or if he’d started doing it earlier and his trainers hadn’t noticed, but there’s a good chance he started because, for a long period during his formative years, the only social interaction he had was with humans.

Isolation from conspecifics has led to speech imitation in a number of unlikely animals, such as Hoover the harbour seal (Phoca vitulina) from Maine, who in 1976 showed an ability to imitate human speech. Hoover was found as an orphaned pup, and was hand-reared by locals before being transferred at three months old to the New England Aquarium. Here he shared an exhibit pool with other harbour seals, but he was the oldest male for most of his life.

Science Fictions

Philip Ball in Aeon:

Scientists can be notoriously dismissive of other disciplines, and one of the subjects that suffers most at their hands is history. That suggestion will surprise many scientists. ‘But we love history!’ they’ll cry. And indeed, there is no shortage of accounts from scientists of the triumphant intellectual accomplishments of Einstein, Darwin, Newton, Galileo, and so on. They name institutes and telescopes after these guys, making them almost secular saints of rationalism.

And that’s the problem. All too often, history becomes a rhetorical tool bent into a shape that serves science, or else a source of lively anecdote to spice up the introduction to a talk or a book. Oh, that Mendeleev and his dream of a periodic table, that Faraday forecasting a tax on electricity!

I don’t wish to dismiss the value of a bit of historical context. But it’s troubling that the love of a good story so often leads scientists to abandon the rigorous attitude to facts that they exhibit in their own work. Most worrisome of all is the way these tales from science history become shoehorned into a modern narrative — so that, say, the persecution of Galileo shows how religion is the enemy of scientific truth.

There’s no point getting too po-faced about the commandeering of Newton’s almost certainly apocryphal falling apple to represent science in the Paralympic opening ceremony. But what Newton’s definitive biographer Richard Westfall says about that story warns us how these populist fables can end up giving a distorted view of science. He says that it ‘vulgarises universal gravitation by treating it as a bright idea. A bright idea cannot shape a scientific tradition.’ Besides, how many of those munching apples at the ceremony could have explained why, if the moon is indeed just like an apple, the apple falls but the moon does not? Anecdote can anaesthetise thought rather than stimulate it.

An Interview with Maurice Sendak

Emma Brockes in The Believer:

THE BELIEVER: Do you miss the city, living out here?

MAURICE SENDAK: I really don’t like the city anymore. You get pushed and harassed and people grope you. It’s too tumultuous. It’s too crazy. I’m afraid of falling over in New York. People are all insane and talking on machines and twittering and twottering. All that. I’m here looking for peace and quiet. A yummy death.

BLVR: A yummy death?

MS: I’m just reading a book about Samuel Palmer and the ancients in England in the 1820s. You were so lucky to have William Blake. He’s lying in bed, he’s dying, and all the young men come—the famous engravers and painters—and he’s lying and dying, and suddenly he jumps up and begins to sing! “Angels, angels!” I don’t know what the song was. And he died a happy death. It can be done. [Lifts his eyebrows to two peaks] If you’re William Blake and totally crazy.

BLVR: You do some teaching out here?

MS: I have a fellowship that started last year, two men and two women living in a house, and I go over when they want me to critique, or whatever the hell. I just talk dirty. They’re nice people. Young. It’s probably not very original, but old artists like to have young artists around… to destroy. I’m joking. I really want to help them. But publishing is such an outrageously stupid profession. Or has become so.

BLVR: More so than it was?

MS: Well, nobody knows what they’re doing. I wonder if that’s always been true. I think being old is very fortunate right now. I want to get out of this as soon as possible. It’s terrible. And the great days in the 1950s and after the war, when publishing children’s books was youthful and fun… it really was. It’s not just looking back and pretending that it was good. It was good. And now it’s just stupid.

The Corporatization of Higher Education

Nicolaus Mills in Dissent:

In 2003, only two colleges charged more than $40,000 a year for tuition, fees, room, and board. Six years later more than two hundred colleges charged that amount. What happened between 2003 and 2009 was the start of the recession. By driving down endowments and giving tax-starved states a reason to cut back their support for higher education, the recession put new pressure on colleges and universities to raise their prices.

When our current period of slow economic growth will end is anybody’s guess, but even when it does end, colleges and universities will certainly not be rolling back their prices. These days, it is not just the economic climate in which our colleges and universities find themselves that determines what they charge and how they operate; it is their increasing corporatization.

If corporatization meant only that colleges and universities were finding ways to be less wasteful, it would be a welcome turn of events. But an altogether different process is going on, one that has saddled us with a higher-education model that is both expensive to run and difficult to reform as a result of its focus on status, its view of students as customers, and its growing reliance on top-down administration. This move toward corporatization is one that the late University of Montreal professor Bill Readings noted sixteen years ago in his study, The University in Ruins, but what has happened in recent years far exceeds the alarm he sounded in the 1990s.

an adventure

Patrick Leigh Fermor, who died last year aged 96, had a facility for bringing together worlds usually considered incompatible. Here was a war hero who was also one of the great English prose stylists; who adored Greece and Britain with equal passion; and who was celebrated for his love of both high and low-living. His masterpiece, A Time of Gifts (1977), an account of the first stage of his 1933-34 walk from the Hook of Holland to Constantinople (“like a tramp, a pilgrim, or a wandering scholar”), has his 18-year-old self moving from doss-houses to Danubian ducal fortresses: “There is much to recommend moving straight from straw to a four-poster,” he writes, “and then back again.”

more from William Dalrymple at the FT here.

biafra

The architects of Biafra were correct in their frustration with the Nigerian government, which did not intervene as thousands of Ibos were massacred. But they were deluding themselves that Biafra was viable. The nascent state had virtually no chance of survival once the authorities in Lagos decided they were going to stamp out the secession in what they called a “police action.” Was Biafra ever really a “country,” as Achebe would have it? It had ministries, oil wells, a ragtag army, an often-shifting capital, official cars (Achebe had one) and a famous airstrip. But as a “country,” it was stillborn. Nonetheless, for over two brutal years, the Biafran war dragged on at the insistence of Ojukwu — described as “brooding, detached and sometimes imperious” in a 1969 New York Times profile by Lloyd Garrison — and meddling international players. Hundreds of thousands of civilians were killed. As many as 6,000 a day starved to death once the federal government blockaded the ever diminishing Republic of Biafra. But Ojukwu refused to give up. The final death toll was estimated at between one and three million people.

more from Adam Nossiter at the NY Times here.

Friday, November 2, 2012

Inside the Centre: The Life of J Robert Oppenheimer

From The Telegraph:

It’s 11 years since Ray Monk’s biography of Bertrand Russell, a book which, like his earlier one of Ludwig Wittgenstein, pulled off the impressive feat of explaining the philosophy while rivetingly portraying the life. The subject of his new 780-page book, Inside the Centre: the Life of J Robert Oppenheimer, would seem to be an excellent fit: Oppenheimer was intellectually brilliant, his work arcane and personally he was a disaster – an “unintegrated” personality made up, his friend Isidor Rabi said, of “many bright, shining splinters”. Monk has called his book Inside the Centre because Oppenheimer, the son of rich, assimilated German Jewish parents – the classic insider-outsider – had a talent for putting himself at the centre of things: at the birth of particle physics at the University of Göttingen in the Twenties, at the creation of the atom bomb, and as director of Princeton’s Institute for Advanced Study where he gathered about him the likes of Einstein and T S Eliot. He was also fascinated, throughout his adult life, by what lay within the centre of the atom.

Born in 1904 in New York into a tight-knit, cultured, liberal, philanthropic, Jewish social circle, Oppenheimer was an exceptionally bright child. His parents were suffocatingly attentive. Monk describes an atmosphere that was melancholy, overprotective, and short on “fun”. With a voracious appetite for, among other things, chemistry, French literature, modernist poetry, Hinduism and Sanskrit (which he taught himself), he didn’t discover physics until his second year at Harvard, blagging his way onto a postgraduate course in thermodynamics. He went on to the Rutherford laboratory in Cambridge and, aged 22, so impressed Max Born – orchestrator of the amazing advances in quantum mechanics taking place at the University of Göttingen in Germany – that the latter invited him there to collaborate. He returned to the United States from the cutting edge of theoretical physics in 1929 with the deliberate intention of building a school of physics in America to rival that in Europe. Within five years he had pretty much succeeded. In the late Thirties he made his most original scientific contribution: three articles, ignored at the time, in which he described what happened to collapsing stars and predicted the existence of black holes. Had he lived another three years – when the existence of neutron stars was confirmed – he would probably have received a Nobel Prize.

More here.

New tools reveal ‘new beginning’ in split-brain research

From PhysOrg:

Split-brain research has been conducted for decades, and scientists long ago showed that language processing is largely located in the left side of the brain. When words appear only in the left visual field—an area processed by the right side of the brain—the right brain must transfer that information to the left brain in order to interpret it. The new study at UC Santa Barbara shows that healthy test subjects respond less accurately when information is shown only to the right brain.

While hemispheric specialization is considered accurate, the new study sheds light on the highly complex interplay between the two hemispheres, with neurons firing back and forth between distinct areas in each half of the brain. The findings rely on extremely sensitive neuroscience equipment and analysis techniques from network science, a fast-growing field that draws on insights from sociology, mathematics, and physics to understand complex systems composed of many interacting parts. These tools can be applied to systems as diverse as earthquakes and brains. Fifty years ago, UC Santa Barbara neuroscientist Michael S. Gazzaniga moved the field forward when he was a graduate student at the California Institute of Technology and first author of a groundbreaking report on split-brain patients. The study, which became world-renowned, was published in the Proceedings of the National Academy of Sciences (PNAS) in August 1962. This week, in the very same journal, Gazzaniga and his team announced major new findings in split-brain research. The report is an example of the interdisciplinary science for which UC Santa Barbara is well known.

More here.

Sarah Losh, Romantic architect

What Sarah described as a “Lombardic” idiom could be at once northern and southern, regional and European, and it would honour the purity of the early Church. But it is the idiosyncratic decorative scheme of the church, rather than its structural style, that makes her design so extraordinary. Sarah’s wide reading in Romantic literature had led her to see patterns of spiritual significance in nature, and she was attracted to the myths and cults that had revered natural rhythms of birth and death long before the advent of Christianity. Her church is steeped in a history that is not confined to the traditions of Anglicanism. Lotus flowers represent light and creation, while the pomegranate symbolizes regeneration. The pulpit was made from bog oak, thousands of years old; it was carved to resemble a fossilized tree, tracing a form of growth far older than the Church. The pine cone, which gives Uglow the title of her book, is to be found everywhere, as a recurrent emblem of eternal life. Like so much that caught Sarah’s imagination, it was both local and universal. The pine cone was a familiar object in the woods she owned, but it was also a symbol common to the Romans and Egyptians, and even to the Masons, who often used it to signify renewal in their ornate halls. It embodied the mysterious multiplicity of meaning that she valued most.

more from Dinah Birch at the TLS here.

three feet high and rising

The New York Academy of Sciences has already begun examining the viability of three massive floodgates near the mouth of New York Harbor, not unlike the Thames River floodgate that protects London today. Another floodgate has been proposed for the Potomac River just south of Washington, fending off tsunami-like surge tides from future mega storms. Plus there will be levees—everywhere. Imagine the National Mall, Reagan National Airport and the Virginia suburbs—all well below sea level—at the mercy of “trust-us-they’ll-hold” levees maintained by the Army Corps of Engineers. Oceans worldwide are projected to rise as much as three more feet this century—much higher if the Greenland ice sheet melts away. Intense storms are already becoming much more common. These two factors together will in essence export the plight of New Orleans, bringing the Big Easy “bowl” effect here to New York City and Washington, as well as to Charleston, Miami, and other coastal cities. Assuming we want to keep living in these cities, we’ll have to build dikes and learn to exist beneath the surface of surrounding tidal bays, rivers and open seas—just like New Orleans.

more from Mike Tidwell at The Nation here.

it’s climate change

Hurricane Sandy has emboldened more scientists to directly link climate change and storms, without the hedge. On Monday, as Sandy came ashore in New Jersey, Jonathan Foley, director of the Institute on the Environment at the University of Minnesota, tweeted: “Would this kind of storm happen without climate change? Yes. Fueled by many factors. Is [the] storm stronger because of climate change? Yes.” Raymond Bradley, director of the Climate Systems Research Center at the University of Massachusetts, was quoted in the Vancouver Sun saying: “When storms develop, when they do hit the coast, they are going to be bigger and I think that’s a fair statement that most people could sign onto.” A recent, peer-reviewed study published by several authors in the Proceedings of the National Academy of Sciences concludes: “The largest cyclones are most affected by warmer conditions and we detect a statistically significant trend in the frequency of large surge events (roughly corresponding to tropical storm size) since 1923.”

more from Mark Fischetti at Scientific American here.

Mapping the Art Genome

From Smithsonian:

If you’re not familiar, Pandora is a Web site that takes a visitor’s preference for an individual musician or song and creates a personalized radio station to fit his or her taste. If you like the Beatles’ “Paperback Writer,” you may also like “Ruby Tuesday” by The Rolling Stones, for instance, or “I Can’t Explain” by The Who. With Art.sy, a visitor can enter an artist, artwork, artistic movement or medium into a search bar and the site will generate a list of artists and works that have been deemed related in some way. “There are a lot of people who may know who Warhol is, but they have no idea who Ray Johnson is. The ability to make those connections is what this is about,” said Cwilich, Art.sy’s Chief Operating Officer, on a recent segment of The Takeaway with John Hockenberry.

The endeavor is a true collaboration between computer scientists and art historians. (This is even evident in Art.sy’s leadership. Cleveland, Art.sy’s 25-year-old chief executive officer, is a computer science engineer, and Cwilich is a former executive from Christie’s Auction House.) To create a Web site that could generate fine-art recommendations, the Art.sy team had to first tackle the Art Genome Project. Essentially, a number of art historians have identified 800-and-counting “genes,” or characteristics, that apply to different pieces of art. These genes are words that describe the medium being used, the artistic style or movement, a concept (e.g., war), content, techniques and geographic regions, among other things. All the images that are tagged with a specific gene—say, “American Realism” or “Isolation/Alienation”—are then linked within the search technology.
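
As a rough illustration of how such gene tagging could drive recommendations, here is a hypothetical sketch in which each work carries a set of genes and related works are ranked by overlap; the works, gene names, and Jaccard scoring below are assumptions for illustration, not Art.sy's actual data or algorithm.

```python
# Hypothetical sketch of gene-based recommendation: artworks are tagged
# with "genes" and related works are ranked by overlap between gene sets.
# The works, genes, and Jaccard scoring are illustrative assumptions,
# not Art.sy's actual data or algorithm.

artworks = {
    "Nighthawks": {"American Realism", "Isolation/Alienation", "Oil Painting"},
    "Christina's World": {"American Realism", "Isolation/Alienation", "Tempera"},
    "Campbell's Soup Cans": {"Pop Art", "Consumerism", "Silkscreen"},
}

def jaccard(a: set, b: set) -> float:
    """Similarity: shared genes divided by total distinct genes."""
    return len(a & b) / len(a | b)

def related_works(query: str, k: int = 2) -> list:
    """Rank all other works by gene overlap with the query work."""
    query_genes = artworks[query]
    scored = [(title, jaccard(query_genes, genes))
              for title, genes in artworks.items() if title != query]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:k]

print(related_works("Nighthawks"))
# [("Christina's World", 0.5), ("Campbell's Soup Cans", 0.0)]
```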

More here.

Memento mori: it’s time we reinvented death

From New Scientist:

IT'S said that when a general returned in glory to ancient Rome, he was accompanied in his procession through the streets by a slave whose job it was to remind him that his triumph would not last forever. “Memento mori,” the slave whispered into the general's ear: “remember you will die”. The story may be apocryphal, but the phrase is now applied to art intended to remind us of our mortality – from the Grim Reaper depicted on a medieval clock to Damien Hirst's bejewelled skull. As if we needed any reminder. While few of us know exactly when death will come, we all know that eventually it will. It's usual to talk about death overshadowing life, and the passing of loved ones certainly casts a pall over the lives of those who remain behind. But contemplating our own deaths is one of the most powerful forces in our lives for both good and ill (see “Death: Why we should be grateful for it”) – driving us to nurture relationships, become entrenched in our beliefs, and construct Ozymandian follies.

In this, we are probably unique. Most animals seem to have hardly any conception of mortality: to them, a dead body is just another object, and the transition between life and death unremarkable. We, on the other hand, tend to treat those who have passed away as “beyond human”, rather than “non-human” or even “ex-human”. We have developed social behaviours around the treatment of the dead whose complexity far exceeds even our closest living relatives' cursory interest in their fallen comrades. Physical separation of the living from the dead may have been one of the earliest manifestations of social culture (see “Death: The evolution of funerals”); today, the world's cultures commemorate and celebrate death in ways ranging from solemn funerals to raucous carnivals. So you could say that humans invented death – not the fact of it, of course, but its meaning as a life event imbued with cultural and psychological significance. But even after many millennia of cultural development, we don't seem to be sure exactly what it is we've invented. The more we try to pin down the precise nature of death, the more elusive it becomes; and the more elusive it becomes, the more debatable our definitions of it (see “Death: The blurred line between dead and alive”).

More here.

Thursday, November 1, 2012

Peter Singer on Roe v. Wade, Obamacare, Romney

John Horgan talks to Singer in Scientific American:

Singer’s analysis of abortion surprised me. First of all, he agreed with many pro-lifers that a fetus, even at six weeks, is a “living human being.” [See postscript below] He showed us slides of fetuses, because we should not “run away from what abortion is.”

Singer nonetheless believes that abortion is ethical, because even a viable fetus is not a rational, self-aware person with desires and plans, which would be cut short by death; hence it should not have the same rights as humans who have such qualities. Abortion is also justified, Singer added, both as a female right and as a method for curbing overpopulation.

Singer further surprised me—and showed his meta-commitment to democracy and reason—when he said that he, like Mitt Romney and his running mate Paul Ryan, disliked Roe v. Wade. That 1973 Supreme Court decision, Singer felt, provides a flimsy rationale for abortion and has corrupted the process whereby Supreme Court Justices are chosen. Ideally, Singer said, voters rather than unelected judges should determine the legal status of abortion. Singer nonetheless acknowledged that if Roe v. Wade is overturned, some states might outlaw or severely restrict abortion. “I’m torn,” he admitted.