Juan Cole’s summary of the situation in Pakistan

Juan Cole in his blog, Informed Comment:

As Pakistani President Asif Ali Zardari arrived in Washington for talks with President Obama and Afghan President Hamid Karzai, fighting intensified in Pakistan's northwest.

On Tuesday morning, the Pakistani Taliban deployed a suicide bomber against Pakistani security forces near Peshawar, killing 5 and wounding 9 people, among them schoolchildren who were bystanders.

WaPo says that fighting had intensified Monday in the Swat Valley between the Pakistani Taliban and government troops, as well as in Buner, the district into which they recently made an incursion and from which the government has been attempting to dislodge them. So far some 80 militants have been killed in the Buner campaign, and 20,000 civilians have been displaced.

Tony Karon at Time explains why the Pakistani military establishment disagrees with Washington's view that the Taliban are an existential threat to the Pakistani state.

Convinced that Pakistan's problems are in part rooted in economic issues, Sens. John Kerry and Dick Lugar introduced legislation Monday aimed at tripling US foreign aid to Islamabad.

Meanwhile, on the diplomatic front, Secretary of Defense Robert Gates is calling on Saudi Arabia to help Pakistan crush the Pakistani Taliban. The Saudis have developed a fear of the vigilante radicals they once supported back in the 1980s, having spent 2003-2006 suppressing them at home, so by now Gates's idea makes some sense.

More here. Obama says Pakistan is the toughest U.S. challenge.

A Natural History of the Flu

Carl Zimmer in the New York Times:

The current outbreak shows how complex and mysterious the evolution of viruses is. That complexity and mystery are all the more remarkable because a virus is life reduced to its essentials. A human influenza virus, for example, is a protein shell measuring about five-millionths of an inch across, with 10 genes inside. (We have about 20,000.)

Some viruses use DNA, like we do, to encode their genes. Others, like the influenza virus, use single-strand RNA. But viruses all have one thing in common, said Roland Wolkowicz, a molecular virologist at San Diego State University: they all reproduce by disintegrating and then reforming.

A human flu virus, for example, latches onto a cell in the lining of the nose or throat. It manipulates a receptor on the cell so that the cell engulfs it, whereupon the virus’s genes are released from its protein shell. The host cell begins making genes and proteins that spontaneously assemble into new viruses. “No other entity out there is able to do that,” Dr. Wolkowicz said. “To me, this is what defines a virus.”

The sheer number of viruses on Earth is beyond our ability to imagine. “In a small drop of water there are a billion viruses,” Dr. Wolkowicz said. Virologists have estimated that there are a million trillion trillion viruses in the world’s oceans.
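The oceanic figure is easy to sanity-check with a back-of-envelope calculation. The inputs below are rough order-of-magnitude values from the general literature, not numbers taken from the article, so treat this as a sketch rather than the virologists' actual estimate:

```python
# Rough sanity check of the "million trillion trillion" (1e30) ocean-virus
# estimate. Both inputs are order-of-magnitude assumptions, not figures
# taken from the article.
import math

viruses_per_ml = 1e7           # assumed viruses per milliliter of seawater
ocean_volume_liters = 1.3e21   # assumed total volume of the world's oceans

# 1000 milliliters per liter
total_viruses = viruses_per_ml * 1000 * ocean_volume_liters
order_of_magnitude = math.floor(math.log10(total_viruses))

print(f"~10^{order_of_magnitude} viruses in the oceans")
```

With these assumed inputs the tally lands near 10^31, within a factor of ten of the quoted million trillion trillion (10^30), which is about as close as a back-of-envelope check can hope to get.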

More here.

Sunday, May 3, 2009

The 2012 Apocalypse — And How to Stop It

Perhaps alarmist, but here is Brandon Keim in Wired:

For scary speculation about the end of civilization in 2012, people usually turn to followers of cryptic Mayan prophecy, not scientists. But that’s exactly what a group of NASA-assembled researchers described in a chilling report issued earlier this year on the destructive potential of solar storms.

Entitled “Severe Space Weather Events — Understanding Societal and Economic Impacts,” it describes the consequences of solar flares unleashing waves of energy that could disrupt Earth’s magnetic field, overwhelming high-voltage transformers with vast electrical currents and short-circuiting energy grids. Such a catastrophe would cost the United States “$1 trillion to $2 trillion in the first year,” concluded the panel, and “full recovery could take 4 to 10 years.” That would, of course, be just a fraction of global damages.

Good-bye, civilization.

Worse yet, the next period of intense solar activity is expected in 2012, and coincides with the presence of an unusually large hole in Earth’s geomagnetic shield. But the report received relatively little attention, perhaps because of 2012’s supernatural connotations. Mayan astronomers supposedly predicted that 2012 would mark the calamitous “birth of a new era.”

Whether the Mayans were on to something, or this is all just a chilling coincidence, won’t be known for several years. But Lawrence Joseph, author of “Apocalypse 2012: A Scientific Investigation into Civilization’s End,” says: “I’ve been following this topic for almost five years, and it wasn’t until the report came out that this really began to freak me out.”

Wired.com talked to Joseph and John Kappenman, CEO of electromagnetic damage consulting company MetaTech, about the possibility of geomagnetic apocalypse — and how to stop it.

Visible Young Man

In the NYT, a review of Colson Whitehead's Sag Harbor:

Now that we’ve got a post-black president, all the rest of the post-blacks can be unapologetic as we reshape the iconography of blackness. For so long, the definition of blackness was dominated by the ’60s street-fighting militancy of the Jesses and the irreverent one-foot-out-the-ghetto angry brilliance of the Pryors and the nihilistic, unrepentantly ghetto, new-age thuggishness of the 50 Cents. A decade ago they called post-blacks Oreos because we didn’t think blackness equaled ghetto, didn’t mind having white influencers, didn’t seem full of anger about the past. We were comfortable employing blackness as a grace note rather than as our primary sound. Post-blackness sees blackness not as a dogmatic code worshiping at the altar of the hood and the struggle but as an open-source document, a trope with infinite uses.

The term began in the art world with a class of black artists who were adamant about not being labeled black artists even as their work redefined notions of blackness. Now the meme is slowly expanding into the wider consciousness. For so long we were stamped inauthentic and bullied into an inferiority complex by the harder brothers and sisters, but now it’s our turn to take center stage. Now Kanye, Questlove, Santigold, Zadie Smith and Colson Whitehead can do blackness their way without fear of being branded pseudo or incognegro.

So it’s a perfect moment for Whitehead’s memoiristic fourth novel, “Sag Harbor,” a coming-of-age story about the Colsonesque 15-year-old Benji, who wishes people would just call him Ben. He’s a Smiths-loving, Brooks Brothers-wearing son of moneyed blacks who summer in Long Island and recognize the characters on “The Cosby Show” as kindred spirits.

Sunday Poem

Found
Ron Koertge

My wife waits for a caterpillar
to crawl onto her palm so she
can carry it out of the street
and into the green subdivision
of a tree.

Yesterday she coaxed a spider
into a juicier corner. The day
before she hazed a snail
in a half-circle so he wouldn’t
have to crawl all the way
around the world and be 2,000
years late for dinner.

I want her to hurry up and pay
attention to me or go where I
want to go until I remember
the night she found me wet
and limping, felt for a collar
and tags, then put me in
the truck where it was warm.

Without her I wouldn’t
be standing here in these
snazzy alligator shoes.

A Queen for the Ages

From The Washington Post:

More than two millennia after it took place, the story of Cleopatra has lost none of its grip on the world's imagination. It has inspired great plays (Shakespeare, Shaw and Sardou), novels, poems, movies (Elizabeth Taylor!), works of art, musical compositions both serious (Handel and Samuel Barber) and silly (“Comin' Atcha,” by Cleopatra), and of course histories and biographies. Yet for all this rich documentation and interpretation, it remains at least as much legend and mystery as historical record, which has allowed everyone who tells it to play his or her own variations on the many themes it embraces.

The latest to take it on is Diana Preston, a British writer of popular history. On the evidence of “Cleopatra and Antony,” I'd say she's a thoroughgoing pro. Her research is careful and deep; her prose is lively and graceful; her sympathy for her central character is strong but wholly without sentimentality; her depiction of the worlds in which Cleopatra lived is detailed, textured and evocative. If there is a better book about Cleopatra for today's reader, I don't know what it is.

She calls her book “Cleopatra and Antony,” thus reversing the order as immortalized by Shakespeare. History and legend have usually given priority to the two great men in the Egyptian queen's life, Julius Caesar and Mark Antony, but Preston argues that “Cleopatra perhaps deserves first place” because “her tenacity, vision and ambition would have been remarkable in any age but in a female ruler in the ancient world they were unique.” She was “a charismatic, cultured, intelligent ruler,” yet thanks to the propaganda put about by Octavian — later the Emperor Augustus but in the fourth decade B.C. Mark Antony's rival for control of the Roman Empire — she “was transformed into a pleasure-loving houri, the very epitome of fatal beauty and monstrous depravity, bent on bringing animal gods, barbarian decadence and despotism to the sacred halls of Rome's Capitol.”

More here.

Why can’t we concentrate?

Laura Miller in Salon:

Here's a fail-safe topic when making conversation with everyone from cab drivers to grad students to cousins in the construction trade: Mention the fact that you're finding it harder and harder to concentrate lately. The complaint appears to be universal, yet everyone blames it on some personal factor: having a baby, starting a new job, turning 50, having to use a Blackberry for work, getting on Facebook, and so on. Even more pervasive than Betty Friedan's famous “problem that has no name,” this creeping distractibility and the technology that presumably causes it have inspired such cris de coeur as Nicholas Carr's much-discussed “Is Google Making Us Stupid?” essay for the Atlantic Monthly and diatribes like “The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future,” a book published last year by Mark Bauerlein.

You don't have to agree that “we” are getting stupider, or that today's youth are going to hell in a handbasket (by gum!) to mourn the withering away of the ability to think about one thing for a prolonged period of time. Carr (whose argument was grievously mislabeled by the Atlantic's headline writers as a salvo against the ubiquitous search engine) reported feeling the change “most strongly” while he was reading. “Immersing myself in a book or a lengthy article used to be easy,” he wrote. “Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I'm always dragging my wayward brain back to the text.” For my own part, I now find it challenging to sit still on my sofa through the length of a feature film. The urge to, for example, jump up and check the IMDB filmography of a supporting actor is well-nigh irresistible, and once I'm at the computer, why not check e-mail? Most of the time, I'll wind up pausing the DVD player before the end of the movie and telling myself I'll watch the rest tomorrow.

More here.

The Big Similarities & Quirky Differences Between Our Left and Right Brains

Carl Zimmer in Discover Magazine:

There is nothing more humbling or more perception-changing than holding a human brain in your hands. I discovered this recently at a brain-cutting lesson given by Jean-Paul Vonsattel, a neuropathologist at Columbia University. These lessons take place every month in a cold, windowless room deep within the university’s College of Physicians and Surgeons. On the day I visited, there were half a dozen brains sitting on a table. Vonsattel began by passing them around so the medical students could take a closer look. When a brain came my way, I cradled it and found myself puzzling over its mirror symmetry. It was as if someone had glued two smaller brains together to make a bigger one.

Vonsattel then showed us just how weak that glue is. He took back one of the brains and used a knife to divide the hemispheres. He sliced quickly through the corpus callosum, the flat bundle of nerve fibers that connects the halves. The hemispheres flopped away from each other, two identical slabs of fleshy neurons.

Sometimes surgeons must make an even more extreme kind of slice in the brain of a patient. A child may suffer from epilepsy so severe that the only relief doctors can offer is to open up the skull and cut out the entire hemisphere in which the seizures start. After the surgery, the space soon fills with cerebrospinal fluid. It may take a child a year of physical therapy to recover from losing a hemisphere—but the fact that patients recover at all is stunning when you consider that they have only half a brain. It makes you wonder what good two hemispheres are in the first place.

More here.

After the Great Recession

President Obama discusses how his policies on schools, energy and health care might change daily life in America.

David Leonhardt in the New York Times Magazine:

Are there tangible ways that Wall Street has made the average person’s life better in the way that Silicon Valley has?

THE PRESIDENT: Well, I think that some of the democratization of finance is actually beneficial if properly regulated. So the fact that large numbers of people could participate in the equity markets in ways that they could not previously — and for much lower costs than they used to be able to participate — I think is important.

Now, the fact that we had such poor regulation means — in some of these markets, particularly around the securitized mortgages — means that the pain has been democratized as well. And that’s a problem. But I think that overall there are ways in which people have been able to participate in our stock markets and our financial markets that are potentially healthy. Again, what you have to have, though, is an updating of the regulatory regimes comparable to what we did in the 1930s, when there were rules that were put in place that gave investors a little more assurance that they knew what they were buying.

More here.

Genius: The Modern View

David Brooks in the New York Times:

Some people live in romantic ages. They tend to believe that genius is the product of a divine spark. They believe that there have been, throughout the ages, certain paragons of greatness — Dante, Mozart, Einstein — whose talents far exceeded normal comprehension, who had an other-worldly access to transcendent truth, and who are best approached with reverential awe.

We, of course, live in a scientific age, and modern research pierces hocus-pocus. In the view that is now dominant, even Mozart’s early abilities were not the product of some innate spiritual gift. His early compositions were nothing special. They were pastiches of other people’s work. Mozart was a good musician at an early age, but he would not stand out among today’s top child-performers.

What Mozart had, we now believe, was the same thing Tiger Woods had — the ability to focus for long periods of time and a father intent on improving his skills. Mozart played a lot of piano at a very young age, so he got his 10,000 hours of practice in early and then he built from there.

More here.

Saturday, May 2, 2009

J. G. Ballard, 1930-2009

In The Independent:

J G Ballard, the award-winning writer best known for his autobiographical novel Empire of the Sun, has died at his home in Shepperton, aged 78, after a long illness. He had been unwell “for several years”, said his agent, Margaret Hanbury. He had prostate cancer.

“J G Ballard has been a giant on the world literary scene for more than 50 years,” said Ms Hanbury, who was his agent for 25 of them. “His acute and visionary observation of contemporary life was distilled into a number of brilliant, powerful novels which have been published all over the world and saw Ballard gain cult status.”

James Graham Ballard was regularly labelled a writer of science fiction, but maintained he was “picturing the psychology of the future”. He earned the rare distinction of appearing as an adjective – “Ballardian” – in the Collins English Dictionary, referring to “dystopian modernity, bleak man-made landscapes and the psychological effects of technological, social or environmental developments”.

A Biocentric Theory of The Universe

Robert Lanza and Bob Berman make their case in Discover:

According to biocentrism, time does not exist independently of the life that notices it. The reality of time has long been questioned by an odd alliance of philosophers and physicists. The former argue that the past exists only as ideas in the mind, which themselves are neuroelectrical events occurring strictly in the present moment. Physicists, for their part, note that all of their working models, from Isaac Newton’s laws through quantum mechanics, do not actually describe the nature of time. The real point is that no actual entity of time is needed, nor does it play a role in any of their equations. When they speak of time, they inevitably describe it in terms of change. But change is not the same thing as time.

To measure anything’s position precisely, at any given instant, is to lock in on one static frame of its motion, as in the frame of a film. Conversely, as soon as you observe a movement, you cannot isolate a frame, because motion is the summation of many frames. Sharpness in one parameter induces blurriness in the other. Imagine that you are watching a film of an archery tournament. An archer shoots and the arrow flies. The camera follows the arrow’s trajectory from the archer’s bow toward the target. Suddenly the projector stops on a single frame of a stilled arrow. You stare at the image of an arrow in midflight. The pause in the film enables you to know the position of the arrow with great accuracy, but you have lost all information about its momentum. In that frame it is going nowhere; its path and velocity are no longer known. Such fuzziness brings us back to Heisenberg’s uncertainty principle, which describes how measuring the location of a subatomic particle inherently blurs its momentum and vice versa.
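The trade-off the arrow analogy illustrates is the standard position-momentum uncertainty relation. The formula does not appear in the article; this is the textbook form:

```latex
\Delta x \, \Delta p \ \ge \ \frac{\hbar}{2}
```

Here Δx and Δp are the uncertainties in position and momentum, and ħ is the reduced Planck constant: freezing a single frame drives Δx toward zero, so Δp must grow without bound, and vice versa.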

Liberalism, Past and Future

George Scialabba reviews Alan Wolfe's The Future of Liberalism in The Nation:

Wolfe's account of liberalism's substantive commitments is straightforward and persuasive–much the best part of the book. The conservative and libertarian enemies of liberalism have squandered so much wealth and welfare, blighted so many lives, that it is always satisfying to see them intellectually routed yet again. Unfortunately, Wolfe does not stop there. He sees liberalism's enemies, or unreliable friends, everywhere and feels bound to scold them all. Wolfe's spiritual home is The New Republic, and he manifests the same complacent centrism as most of its regular writers (though not–for better and worse–the snarky wit and verbal edge that make the magazine at once irresistible and insufferable). Half The Future of Liberalism is valuable affirmation; the other half is an ideological Syllabus of Errors.

The first and most dangerous heresy that Wolfe rebukes from the pulpit–“the single most influential illiberal current of our time”–is evolutionary psychology. The attempt to view human behavior in Darwinian perspective amounts to “nothing short of a determined campaign to reduce human beings and their accomplishments to insignificance.” According to these anti-humanists, humans “rarely accomplish very much independent of what nature has bequeathed to them”; culture is a “side effect,” a “by-product,” just “one more way in which nature imposes its designs upon us.” All this, Wolfe protests, radically undermines liberal morale. Liberalism is about choice and purpose, but the aim of evolutionary psychology “is to show that leading any kind of life we think we are choosing is impossible.”

If science really and truly discredited liberalism, then the only honest response would be: so much the worse for liberalism. But, of course, it does not. The distinction between nature and culture that Wolfe brandishes so menacingly is far more subtle and tenuous than he recognizes. His version, like the obsolete distinction between body and soul, implies that we cannot be both purely physical and meaningfully moral. And yet we are. Whatever “free will” means, it does not mean that choices are uncaused.

Saturday Poem

Rhetorical Figures
Tom Christopher

When a sentence is composed of two independent
clauses, the second being weaker than the first,
it is called One-Legged Man Standing. If it
purposefully obscures meaning, it’s called Ring
Dropped in Muddy Creek, or if elegantly composed,
Wasp Fucking Orchid. There are words behind words,
and half the time our thought spraying out like water
from a hose, half the time banging inside our heads
like a wren in a house. When a sentence ends
unexpectedly because someone has punched
the speaker in the face, it’s Avalanche Sudden.
When instead the speaker is stopped with sloppy
kisses, it’s Dripping Cloud. Not to be confused
with Dripping Cone, when someone overturns
the table, or Bird Pecking the Mountain, when
the sentence goes on for an hour and a half and ends
in a shaking death. If the speaker lies in the driveway
so drunk on cheap wine that one listening cannot
get close to the meaning and thus runs away again,
claiming, “For the last time,” it’s Pregnant Dog
Cooked in Sun. If the speaker sells everything for
an old convertible and drives out into the desert
with unintelligible shouting to the pissed-off stars:
Aching Stones Laughing. Forced incongruent words
are Fishes on Fire, and are beautiful but bring us
no closer to the Truth or the Cosmos or the All,
so either we tour Europe looking for the bodies
of saints or drink all night playing Johnny Cash LPs.
Everything we have said, we have said all our lives.
Same for what we haven’t said. Learning the terms
doesn’t help, we’re still filled over the rim with longing.
Already in this room there is Clamshell Moon, Barn
House Burning, Cow Lowing the Field, One Hundred
Village Bells, Moth Flurry. Somewhere above, a Torn
Shirt, a Peasant Girl Crying, a Baby Dropped Through
Smoke to Voices Shouting. Not much further a Cat
in Heat, a Wailing Street, and in the end Tree Frogs
Blazing Reeds with Sound.

from Best American Poetry 2006 (Scribner Poetry, New York)

Fordham Law Class Collects Personal Info About Scalia; Supreme Ct. Justice Is Steamed

Martha Neil in the ABA Journal:

Last year, when law professor Joel Reidenberg wanted to show his Fordham University class how readily private information is available on the Internet, he assigned a group project: collecting personal information from the Web about himself.

This year, after U.S. Supreme Court Justice Antonin Scalia made public comments that seemed to question the need for more protection of private information, Reidenberg assigned the same project. Except this time Scalia was the subject, the professor explains to the ABA Journal in a telephone interview.

His class turned in a 15-page dossier that included not only Scalia's home address, home phone number and home value, but his food and movie preferences, his wife's personal e-mail address and photos of his grandchildren, reports Above the Law.

And, as Scalia himself made clear in a statement to Above the Law, he isn't happy about the invasion of his privacy:

“Professor Reidenberg's exercise is an example of perfectly legal, abominably poor judgment. Since he was not teaching a course in judgment, I presume he felt no responsibility to display any,” the justice says, among other comments.

More here.

Untangling the Brain

From Harvard Magazine:

Modern neuroscience rests on the assumption that our thoughts, feelings, perceptions, and behaviors emerge from electrical and chemical communication between brain cells: that whenever we recognize a face, read the newspaper, throw a ball, engage in a conversation, or recall a moment in childhood, a pattern of activity in our neurons makes such feats possible. It’s a tenet of modern biology that sparks fascination—and disbelief. How can a tangle of cells produce the complexity and subtlety of a mind?

Answering that question has always been propelled—and limited—by the available technologies. Accessing the living brain is difficult, and yet studying neurons outside their normal context can’t tell us how they work together normally. Today, using anatomical studies and technologies like functional magnetic resonance imaging, scientists can finally observe large-scale patterns of how the brain is organized. In animals, they have begun to map out networks of neurons responsible for processes like vision and smell. And detailed studies of individual neurons have revealed much about how they “fire” electrically in response to inputs from other neurons, or release neurotransmitters to communicate with one another. But one of the most difficult questions in neuroscience is how to connect these two scales: how do individual neurons link to one another in networks that somehow result in complex brain functions?

More here.