magpie and chameleon


Bowie has worn his dilettantism proudly and, through his dabblings, created some of the greatest music of the pop era. He is blessed with one of the most versatile voices. His talent for mimicry, coupled with a willingness to adapt his vocal approach to the song at hand, sets him apart from the competition: you could never tell which David Bowie would be singing. He has always been the bravest among his otherwise simply successful contemporaries, and, particularly in collaboration with Brian Eno, took rock music places it had never meant to go. He has namedropped Nietzsche here and there, and his attitude towards self-renewal has always been that of the Übermensch or the “homo superior”, words he relished singing in “Oh! You Pretty Things” on Hunky Dory in 1971. Bowie has always been at his best when he leaves himself open to chance, starting from scratch: his fearless mixing of genre, his willingness to enter the studio with no material (as he did for his masterpiece Station to Station, 1976), his constantly re-invented recording techniques (for example, instructing guitar players to play a song without ever having heard it, then keeping their first take), his embrace of Eno’s “oblique strategies” for “The Berlin Trilogy” (Low, 1977, Heroes, 1977, Lodger, 1979, only one of which was actually made in Berlin). He fails when his instincts desert him, when he tries to recreate consciously what he does so well unconsciously. Bowie knows all this: it just leaves him in the uncomfortable position of having endlessly to rehearse for unpreparedness.

more from Wesley Stace at the TLS here.

Our data, ourselves

From The Boston Globe:

If you’re obsessive about your health, and you have $100 to spare, the Fitbit is a portable tracking device you can wear on your wrist that logs, in real time, how many calories you’ve burned, how far you’ve walked, how many steps you’ve taken, and how many hours you’ve slept. It generates colorful graphs that chart your lifestyle and lets you measure yourself against other users. Essentially, the Fitbit is a machine that turns your physical life into a precise, analyzable stream of data.

If this sounds appealing — if you’re the kind of person who finds something seductive about the idea of leaving a thick plume of data in your wake as you go about your daily business — you’ll be glad to know that it’s happening to you regardless of whether you own a fancy pedometer. Even if this thought terrifies you, there’s not much you can do: As most of us know by now, we’re all leaving a trail of data behind us, generating 0s and 1s in someone’s ledger every time we look something up online, make a phone call, go to the doctor, pay our taxes, or buy groceries.

More here.

ScienceShot: You Are Here

From Science:

Astronomers have produced the most complete 3D map of the nearby universe to date. Using telescopes in both hemispheres, they measured distances to a whopping 45,000 galaxies out to a distance of 380 million light-years—for the astronomy buffs, a redshift of 0.07. Unlike the famous Sloan Digital Sky Survey, which mapped only part of the sky, the new 2MASS Redshift Survey covers 95% of surrounding space, skipping over only the region near the plane of our own galaxy, where the Milky Way's stars and dust block the view of remote objects. In the map, color codes for distance: purple dots are nearby galaxies; red dots are distant ones.

More here.

Wednesday, May 25, 2011

Physics and the Immortality of the Soul

Sean Carroll in Scientific American:

The topic of “life after death” raises disreputable connotations of past-life regression and haunted houses, but there are a large number of people in the world who believe in some form of persistence of the individual soul after life ends. Clearly this is an important question, one of the most important ones we can possibly think of in terms of relevance to human life. If science has something to say about it, we should all be interested in hearing it.

Adam Frank thinks that science has nothing to say about it. He advocates being “firmly agnostic” on the question. (His coblogger Alva Noë resolutely disagrees.) I have an enormous respect for Adam; he's a smart guy and a careful thinker. When we disagree it's with the kind of respectful dialogue that should be a model for disagreeing with non-crazy people. But here he couldn't be more wrong.

Adam claims that there “simply is no controlled, experimental[ly] verifiable information” regarding life after death. By these standards, there is no controlled, experimentally verifiable information regarding whether the Moon is made of green cheese. Sure, we can take spectra of light reflecting from the Moon, and even send astronauts up there and bring samples back for analysis. But that's only scratching the surface, as it were. What if the Moon is almost all green cheese, but is covered with a layer of dust a few meters thick? Can you really say that you know this isn't true? Until you have actually examined every single cubic centimeter of the Moon's interior, you don't really have experimentally verifiable information, do you? So maybe agnosticism on the green-cheese issue is warranted. (Come up with all the information we actually do have about the Moon; I promise you I can fit it into the green-cheese hypothesis.)

More here.

Could Conjoined Twins Share a Mind?

Susan Dominus in The NYT Magazine:

Twins joined at the head — the medical term is craniopagus — are one in 2.5 million, of which only a fraction survive. The way the girls’ brains formed beneath the surface of their fused skulls, however, makes them beyond rare: their neural anatomy is unique, at least in the annals of recorded scientific literature. Their brain images reveal what looks like an attenuated line stretching between the two organs, a piece of anatomy their neurosurgeon, Douglas Cochrane of British Columbia Children’s Hospital, has called a thalamic bridge, because he believes it links the thalamus of one girl to the thalamus of her sister. The thalamus is a kind of switchboard, a two-lobed organ that filters most sensory input and has long been thought to be essential in the neural loops that create consciousness. Because the thalamus functions as a relay station, the girls’ doctors believe it is entirely possible that the sensory input that one girl receives could somehow cross that bridge into the brain of the other. One girl drinks, another girl feels it.

What actually happens in moments like the one I witnessed is, at this point, theoretical guesswork of the most fascinating order. No controlled studies have been done; because the girls are so young and because of the challenges involved in studying two conjoined heads, all the advanced imaging technology available has not yet been applied to their brains. Brain imaging is inscrutable enough that numerous neuroscientists, after seeing only one image of hundreds, were reluctant to confirm the specific neuroanatomy that Cochrane described; but many were inclined to believe, based on that one image, that the brains were most likely connected by a live wire that could allow for some connection of a nature previously unknown. A mere glimpse of that attenuated line between the two brains reduced accomplished neurologists to sputtering incredulities. “OMG!!” Todd Feinberg, a professor of clinical psychiatry and neurology at Albert Einstein College of Medicine, wrote in an e-mail. “Absolutely fantastic. Unbelievable. Unprecedented as far as I know.” A neuroscientist in Kelowna, a city in British Columbia near Vernon, described their case as “ridiculously compelling.” Juliette Hukin, their pediatric neurologist at BC Children’s Hospital, who sees them about once a year, described their brain structure as “mind-blowing.”

Capital Controls or Protectionism

Hector R. Torres in Project Syndicate:

[O]ne of the main lessons of the crisis is that accumulating reserves shelters an economy from imported crises, thereby permitting governments to implement counter-cyclical policies. This is true, but, in an integrated world economy, it assumes that export-led growth is still an option.

In an environment of high liquidity, in which Latin American countries are far less successful than China in fending off capital inflows, advising them to raise real interest rates can only lure more short-term capital, compounding appreciation pressures. Nobody should be surprised to see trade tensions.

So, what should be done?

In an ideal world, liquidity creation should be regulated internationally, and the coherence of domestic exchange-rate policies ensured. But this is far from today’s real-world situation, so we need to aim for second-best solutions.

Capital controls (regulations and “prudential measures”) could help to curb appreciation pressures. Admittedly, they are not watertight and could eventually be sidestepped, but they are far better than what might follow if they prove ineffective. If capital controls do not work, governments may feel tempted to provide protection to “their” domestic industries by imposing trade restrictions.

The IMF has recently accepted that controlling capital inflows could be appropriate under certain circumstances.

Being Human

S.J. Fowler interviews A.C. Grayling over at 3:AM Magazine:

3:AM: In a meaningful sense, your atheism seems to refute the idea that atheism is a philosophical necessity that results in pessimism. To what extent must atheism and the fragility of human nature be taken as given for us to begin legitimately philosophising?

ACG: As has been well said, atheism is to religious belief what not collecting stamps is to stamp collecting. If instead of ‘atheism’ you use the word ‘afairyism’ or some such, to illustrate the fact that there is no real subject matter in play (whereas ‘religion’ – a man-made phenomenon that has been a massive presence in history – is a different matter) you see that all that talk of ‘atheism’ does is to close down certain absurdities that get in the way of doing metaphysics and ethics properly. Whereas talk of ‘religion’ requires us to address the questions of the place of religious voices in the public square; this is where secularism becomes important.

3:AM: Can it be said that if we are not overarchingly religious, nor taken with the project of self-improvement and personal responsibility, then we are inhabiting an age of ambivalence rather than nihilism or religiosity? Now it seems the question of meaning is not answered yes or no, but not asked at all, especially by the young. Do you think consumerism, isolation and distraction have taken the place of any stringent belief?

ACG: Given half an invitation to reflect philosophically on the value and direction of life, people quickly begin to do so. (The religions do not want people to think philosophically, because then they begin to question the one-size-fits-all pieties that the religions sell.) The ‘distractions’ of entertainment, consumerism and co have more of a point in them than we sometimes acknowledge, because fun, pleasure, beauty and recreation are significant aspects of experience. But they don’t entirely stop people thinking about questions of value, for human lives also have sorrow and loss in them, and difficult choices, and periods of depression, all of which remind people of the task of thinking and choosing, which is inescapable. Philosophy can provide materials and suggestions here, and encouragement to think; that is or should be one of its principal gifts.

deaf: an ethnicity?


The newly published The People of the Eye sets out to define the Deaf-World and to fight for it. Where Deaf activists have spent decades arguing that deafness is not a defect but a character trait — a benefit even — The People of the Eye goes a step further. It asserts that Deaf is an ethnicity. An ethnicity like all officially classed ethnicities, to be given its due, politically and culturally. Authors Harlan Lane, Richard C. Pillard, and Ulf Hedberg write that, although Deaf identity is based not on religion, race, or class, “there is no more authentic expression of an ethnic group than its language.” Language is the core of American Deaf life. The important characteristic that distinguishes deafness from other conditions classed as disabilities is that deafness is a matter of communication. With the emergence of Deaf schools, literacy allowed Deaf people to better communicate in the hearing world. As ASL developed, Deaf Americans could better communicate with each other, and with this came the creation of a Deaf culture, even a new way of being. ASL signers say that they spend much more time thinking about and dealing with language than most Americans, resulting in a rich and independent tradition of Deaf language arts — literature, theater, journalism. Deaf people have their own clubs, their own rituals, their own places of worship, their own newspapers, their own sense of humor. The People of the Eye discusses, too, how the fully embodied language of ASL and Deaf pride created a culture of storytelling in the Deaf-World, and how this storytelling developed a unique narrative structure based on the particularities of ASL.

more from Stefany Anne Golberg at The Smart Set here.

buster


More than fifty years have passed since critics rediscovered Buster Keaton and pronounced him the most “modern” silent film clown, a title he hasn’t shaken since. In his own day he was certainly famous but never commanded the wealth or popularity of Charlie Chaplin or Harold Lloyd, and he suffered most when talkies arrived. It may be that later stars like Cary Grant and Paul Newman and Harrison Ford have made us more susceptible to Keaton’s model of offhand stoicism than his own audiences were. Seeking his ghost is a fruitless business, though; for one thing, film comedy today has swung back toward the sappy, blatant slapstick that Keaton disdained. There’s some “irony” in what Judd Apatow and Adam Sandler do, but it’s irony that clamors to win the identification of the supposedly browbeaten everyman in every audience. Keaton took your average everyman and showed how majestically alone he was. The story of his life seems in its twists and dives borrowed from his movies, survival demanding a pure lack of sentiment. There were twenty years of child stardom in vaudeville and nearly a decade making popular silent movies, followed by alcoholism, a nasty divorce, a nastier second marriage, twenty years producing a few dreadful blockbusters for MGM followed by a long series of low-budget flops, and a third lasting marriage, until his silent work was unearthed and brought him renewed recognition. “What you have to do is create a character,” he once said. “Then the character just does his best, and there’s your comedy. No begging.” He embodied this attitude so entirely in his silent films that you can’t watch him without feeling won over, a partisan of the nonpartisan side.

more from Jana Prikryl at the NYRB here.

wikipedia, a UNESCO site?


Boasting more than 18 million entries in 279 languages, Wikipedia is arguably the largest store of human knowledge in the history of mankind. In its first decade, the digital encyclopedia has done more to challenge the way we think about the relationship between knowledge and the Internet than virtually any other website. But is this ubiquitous tree of knowledge as culturally sacred as the pyramids of Giza, the archaeological site of Troy, or the Native American mound cities of Cahokia? Jimmy Wales, co-founder of Wikipedia, thinks so. Spurred on by a German chapter of the Wikimedia Foundation, the digital encyclopedia will launch a petition this week to have the website listed on the UN Educational, Scientific, and Cultural Organization’s world heritage lists. If accepted, Wikipedia would be granted the international protection and preservation afforded to man-made monuments and natural wonders. The first digital entity to vie for recognition as a cultural treasure, Wikipedia argues that the site meets the first and foremost of UNESCO’s criteria: “to represent a masterpiece of human creative genius.”

more from Jared Keller at The Atlantic here.

Ideas festival: Great minds think and drink alike

From The Telegraph:

Most people go to Hay-on-Wye for the annual literary festival – not to ponder profound existential questions. But this year, more than 7,000 visitors will arrive for the world's largest philosophy festival. HowTheLightGetsIn was founded in 2009 by the philosopher Hilary Lawson, whose aim is to create a festival-like atmosphere in which to discuss issues that make us tick. The festival is named after lyrics from a Leonard Cohen song, “Anthem” (“There is a crack in everything/ That's how the light gets in”). Philosophical debates, this year including Philip Pullman discussing the role of fantasy in his novels and life, are combined with live bands, film screenings, comedy and all-night parties.

“I'm not a geek, interested in intellectual puzzles,” Lawson says. “I'm interested in philosophy because it's about understanding the world and our lives. When we started the festival three years ago, philosophy was more likely to appear in Monty Python. It was a laughable matter, it was technical and analytical – not about our lives. Our aim is to overturn the current intellectually conservative environment, where ideas and philosophy are not valued or taken seriously. Our goal is to create an open, vibrant, intellectual culture which combines innovative thought with rich experience.”

More here.

Top Ten Myths About the Brain

From Smithsonian:

3. It’s all downhill after 40 (or 50 or 60 or 70).
It’s true, some cognitive skills do decline as you get older. Children are better at learning new languages than adults—and never play a game of concentration against a 10-year-old unless you’re prepared to be humiliated. Young adults are faster than older adults to judge whether two objects are the same or different; they can more easily memorize a list of random words, and they are faster to count backward by sevens. But plenty of mental skills improve with age. Vocabulary, for instance—older people know more words and understand subtle linguistic distinctions. Given a biographical sketch of a stranger, they’re better judges of character. They score higher on tests of social wisdom, such as how to settle a conflict. And people get better and better over time at regulating their own emotions and finding meaning in their lives.

4. We have five senses.
Sure, sight, smell, hearing, taste and touch are the big ones. But we have many other ways of sensing the world and our place in it. Proprioception is a sense of how our bodies are positioned. Nociception is a sense of pain. We also have a sense of balance—the inner ear is to this sense as the eye is to vision—as well as a sense of body temperature, acceleration and the passage of time. Compared with other species, though, humans are missing out. Bats and dolphins use sonar to find prey; some birds and insects see ultraviolet light; snakes detect the heat of warm-blooded prey; rats, cats, seals and other whiskered creatures use their “vibrissae” to judge spatial relations or detect movements; sharks sense electrical fields in the water; birds, turtles and even bacteria orient to the earth’s magnetic field lines.

By the way, have you seen the taste map of the tongue, the diagram showing that different regions are sensitive to salty, sweet, sour or bitter flavors? Also a myth.

More here.

Tuesday, May 24, 2011

The King and I: My Quest to Build the Perfect Production of King Lear.

Jessica Winter in Slate:

Thackeray found King Lear boring. Tolstoy was no great fan. Samuel Johnson dreaded rereading the play—he recoiled from the death of Lear's youngest daughter, Cordelia. (Johnson preferred playwright Nahum Tate's sentimental rewrite of Lear, published in 1681, which inserted a happy ending and supplanted Shakespeare's version onstage for more than a century.) Nineteenth-century essayist Charles Lamb declared that staging Lear “has nothing in it but what is painful and disgusting,” concluding, “The Lear of Shakespeare cannot be acted.” Nearly two centuries later, Harold Bloom concurred: “You shouldn't even go and see somebody try and act the part,” the scholar said, “because it's unactable… I've never seen a Lear that worked.” Beginning with a vain, irrational king rejecting both his favorite child and his most faithful servant on a whim, ending with a mad, uncrowned derelict dying of a broken heart—with a detour wherein another foolish old man's eyes are gouged out—King Lear is a shocking spectacle of two families eating themselves alive.

Yet more and more actors have attempted the unactable in recent years; in New York City alone, they've included Ian McKellen, Christopher Plummer, Kevin Kline, and Stacy Keach. The latest is Derek Jacobi, who performs the title role in the rapturously received Donmar Warehouse production of the play (at BAM Harvey Theater in Brooklyn through June 5). The laurelled English actor Greg Hicks will do Lear at the Lincoln Center Festival this summer, and Law & Order's Sam Waterston takes the role for the Public Theater in the fall; a film version starring Al Pacino is also in the works. To a confident actor in the winter of his career, the notion of Shakespeare's tragedy as “a labyrinthian citadel, all but impregnable” (Kenneth Tynan) may seem less like a warning and more like a provocation.

That's how a viewer can approach King Lear, too. Like most, I first read it in college, where I took notes in lectures and seminars about its reputation as a play that resists being played—and, flush with those earnest yet contrarian energies peculiar to late adolescence, I sought out every Lear I could find. And I still do. This quasi-completist mission is perverse, because its frisson depends largely on expectations of shameless presumption and abject failure. (You fiends! How dare you dare to stage this!) But the promise—always kept—is the thrill of seeing actors try the impossible.

Bob Dylan’s Curious Dotage

Tim de Lisle over at More Intelligent Life:

A few years ago a concert promoter took the BBC television series “Walking with Dinosaurs” and turned it into a stage show that toured the world’s indoor arenas. Seen from one angle, it was an enterprising move. Seen from another, it was quite unnecessary. The world’s arenas were already crawling with dinosaurs, in the form of old rock stars.

The early years of the 21st century have been the age of the veteran in rock and pop. Records are now trumped by live music, a field where the oldies can dominate. The golden age of popular music, the Sixties, is just close enough for the central figures from it to be still on the road. The Rolling Stones do a world tour every few years; Paul McCartney, with a small child to think about, does a short tour every few months. Brian Wilson of the Beach Boys, now a doddery old teddy bear propped up by a dazzling young band, turns out every other year. Simon & Garfunkel, not always on the best of terms, manage a month here and a month there. And then there is Bob Dylan.

Dylan tours even more than the others. In the 20 years to 2010, he gave 2,045 concerts, according to the fan site ExpectingRain.com, where you can study the setlist for every one of those nights. In April he will play in Singapore, Australasia and—if Beijing lets him in, after rebuffing him last year—China. In the summer he is expected in Europe. Not for nothing are his wanderings known as the Never Ending Tour.

Dylan’s gigs are unlike those of all his peers. If a show by McCartney or the Stones has a fault—apart from some creaking on the high notes—it is that it can be predictable. The Stones always play “Satisfaction”, “Brown Sugar”, “Jumping Jack Flash”; McCartney always does the Beatles classics he wrote himself—“Let It Be”, “Get Back”, “Hey Jude”. With Dylan, the only sure thing is “Like a Rolling Stone”, locked in as the first encore. Otherwise, he reserves the right to leave out any song. And often it’s a relief when he does, given the way he treats the songs he does play, which veers between indifference and outright sabotage.

The Destroyer

Jon Lee Anderson in The New Yorker:

“Empowerment” has been one of the rhetorical pillars of Mugabe’s government, but many of the schemes to benefit black “indigenous Zimbabweans” have been used by those in positions of authority or influence to enrich themselves. For all the talk of redistribution, Mugabe and his circle have not so much broken with the past as assumed for themselves an updated version of the country-club life style once enjoyed exclusively by the nation’s whites. There are many newly built luxury villas in Harare, and a sizable number of Mercedes-Benzes and Volvos, the vehicles of choice among Zimbabwe’s black nomenklatura. (Affluent whites seem to prefer S.U.V.s.) In 2005, Mugabe and his wife moved into a new twenty-five-bedroom mansion in Borrowdale Brooke, a Harare suburb, which cost a reported ten million U.S. dollars to build. Nobody knows exactly how he paid for it, but in Harare it is received wisdom that the mansion was financed by the Chinese, to whom the President had granted lucrative mining and trade concessions. Mugabe said openly that he had the help of “foreign governments.” (He added that Malaysia’s former Prime Minister Mahathir bin Mohamad, a personal friend, had donated tropical timber for the roof; China was reported to have supplied the shiny blue roof tiles.) Grace Mugabe has become infamous for her shopping expeditions abroad and, like Imelda Marcos, her expensive taste in shoes; she has been quoted as saying that because of her narrow feet she can “only wear Ferragamo.” Shortly after her marriage to Mugabe, Grace oversaw the construction of another mansion, called Graceland, which was allegedly built with public funds. She later sold Graceland to the Libyan government.

Another legacy of the colonial era is the cross-hatching of interests between the government and the private sector. A mining-company official I met with, a white man and a prominent figure in Zimbabwe, spoke of fending off direct requests for bribes from a senior cabinet minister, whom he described as “especially rapacious.” He confided that the executives of several mining companies had, under pressure, given large sums of money to government officials that were used to help fund the ZANU-P.F. election campaign. He added that Mugabe and his cronies would probably continue to use the threat of expropriation of the mines as a “political bludgeon” to extract bribes from mining companies. Meanwhile, he expected to see “more Chinese take over more dubious concessions.”

This kleptocratic style of government has had a trickle-down effect: corruption and graft are depressingly unremarkable in Mugabe’s Zimbabwe.

the novel’s not dead


It is from Woolf that we get our sense that the novel is irrevocably divided into two kinds: new and retrograde. From her we inherit the feeling that all that matters in literature is “now,” that contemporary writing is a constant battle between the forces of innovation and life-giving freshness (“life,” “truth,” “the real”) and the turgid, sordid, compromised writers of yesteryear. Bakhtin would say such bifurcations are fatuous and a waste of time; I would go one step further and call them a convenient fiction, a chimera, and a sideshow. Or, to put it another way: there is no crisis of realism in contemporary fiction; there is only, among certain literary critics, a crisis of ownership, a last-ditch effort to keep debates over fiction stalled where they have been for nearly a century. What we have seen for the last ten years or so is a kind of proxy battle, very much in Woolf’s spirit, in which a contrived debate about novelistic method masks a silent—perhaps largely unintentional—effort to maintain cultural, racial, and geographic boundaries.

more from Jess Row at Boston Review here.

Jharia Burning


At the center of Dhanbad City, in the Jharia region of northeastern India, amid a handful of concrete buildings, stands the enormous bronze statue of a coal miner. He is shirtless, muscular, and handsome. He strides doggedly forward, a mining helmet on his head, a pickax slung over his shoulder. The message is clear: Coal is my life. The area around Dhanbad produces India’s highest grade of coking coal, which in turn fuels the blast furnaces used for smelting steel. Contained in twenty-three large underground mines and nine open cast mines across an expanse of 450 square kilometers, Jharia’s heart of coal also produces power: two thirds of all electricity in India is generated in coal-fired plants. The earth beneath Jharia contains one of the largest coal reserves in the world. But the coal is also on fire.

more from Allison Joyce at VQR here.

This will be the century of disasters


In the same way that the 20th century was the century of world wars, genocide, and grinding ideological conflict, the 21st will be the century of natural disasters and technological crises and unholy combinations of the two. It’ll be the century when the things that we count on to go right will, for whatever reason, go wrong. Late last month, as the Mississippi River rose in what is destined to be the worst flood in decades, and as the residents of Alabama and other states rummaged through the debris of a historic tornado outbreak, physicists at a meeting in Anaheim, Calif., had a discussion about the dangers posed by the sun. Solar flares, scientists believe, are a disaster waiting to happen. Thus one of the sessions at the American Physical Society’s annual meeting was devoted to discussing the hazard of electromagnetic pulses (EMPs) caused by solar flares or terrorist attacks. Such pulses could fry transformers and knock out the electrical grid over much of the nation. Last year the Oak Ridge National Laboratory released a study saying the damage might take years to fix and cost trillions of dollars. But maybe even that’s not the disaster people should be worrying about. Maybe they should worry instead about the ARkStorm. That’s the name the U.S. Geological Survey’s Multihazards Demonstration Project gave to a hypothetical storm that would essentially turn much of California’s Central Valley into a bathtub. It has happened before, in 1861-62, when it rained for 45 straight days. The USGS explains: “The ARkStorm draws heat and moisture from the tropical Pacific, forming a series of Atmospheric Rivers (ARs) that approach the ferocity of hurricanes and then slam into the U.S. West Coast over several weeks.” The result, the USGS determined, could be a flood that would cost $725 billion in direct property losses and economic impact.

more from Joel Achenbach at Slate here.