“What is life?” – We Are Entering a New Phase of Evolution

From The Daily Galaxy:

“What is life?” asks Craig Venter, author of Life at the Speed of Light and one of the first to sequence the human genome and create the first cell with a synthetic genome: “Only three simple words, and yet out of them spins a universe of questions that are no less challenging. What precisely is it that separates the animate from the inanimate? What are the basic ingredients of life? Where did life first stir? How did the first organisms evolve? Is there life everywhere? To what extent is life scattered across the cosmos? If other kinds of creatures do exist on exoplanets, are they as intelligent as we are, or even more so? Today these questions about the nature and origins of life remain the biggest and most hotly debated in all of biology.”

The code-script known as DNA has been sending out its signals since the dawn of all life, some four billion years ago. Half a century ago, writes Venter, “the great evolutionary geneticist Motoo Kimura estimated that the amount of genetic information has increased by one hundred million bits over the past five hundred million years. The DNA code-script has come to dominate biological science, so much so that biology in the twenty-first century has become an information science. Taxonomists now use DNA bar codes to help distinguish one species from another. Others have started to use DNA in computation, or as a means to store information. I have led efforts not only to read the digital code of life but also to write it, to simulate it within a computer, and even to rewrite it to form new living cells.” “Life ultimately consists of DNA-driven biological machines. All living cells run on DNA software, which directs hundreds to thousands of protein robots. We have been digitizing life for decades,” observes Venter, “since we first figured out how to read the software of life by sequencing DNA. Now we can go in the other direction by starting with computerized digital code, designing a new form of life, chemically synthesizing its DNA, and then booting it up to produce the actual organism. And because the information is now digital we can send it anywhere via biological teleporter, to re-create proteins, viruses, and living cells at a remote location at the speed of light and re-create the DNA and life at the other end, perhaps changing forever how we view life.”
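Venter's point that DNA is, at bottom, a digital medium is easy to make concrete. The sketch below is not his synthesis pipeline or any real storage scheme, just the standard toy mapping of two bits per base that underlies the idea of DNA as an information store; the example message and function names are purely illustrative.

```python
# Toy illustration of DNA as a digital medium: 2 bits per base, 4 bases per byte.
# This is not Venter's actual method, only the simplest possible encoding.

BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Map a byte string to a DNA sequence."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(dna: str) -> bytes:
    """Recover the original bytes from a DNA sequence."""
    bits = "".join(BASE_TO_BITS[base] for base in dna)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

if __name__ == "__main__":
    message = b"What is life?"
    sequence = encode(message)
    print(sequence)                      # 52 bases for the 13-byte message
    assert decode(sequence) == message   # round-trips losslessly
```

Reading and writing real genomes is of course vastly harder than this round trip suggests, but the arithmetic is the same: Kimura's hundred million bits is, in these terms, on the order of fifty million base pairs.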

More here.

Craig Venter’s ‘biological teleportation’ device

Zoe Corbyn in The Guardian:

Craig Venter reclines in his chair, puts his feet up on his desk and – gently stroking his milk chocolate-coloured miniature poodle, Darwin, asleep in his arms – shares his vision of the household appliance of the future. It is a box attached to a computer that would receive DNA sequences over the internet to synthesise proteins, viruses and even living cells. It could, for example, fill a prescription for insulin, provide flu vaccine during a pandemic or even produce phage viruses targeted to fight antibiotic-resistant bacteria. It could help future Martian colonists by giving them access to the vaccines, antibiotics or personalised drugs they needed on the red planet. And should DNA-based life ever be found there, a digital version could be transmitted back to Earth, where scientists could recreate the extraterrestrial organism using their own life-printing box. “We call it a Digital Biological Converter. And we have the prototype,” says Venter. I am visiting the office and labs of Venter's company Synthetic Genomics Incorporated (SGI) in La Jolla, a wealthy seaside enclave north of San Diego, California, where he also lives, because the pioneering American scientist dubbed biology's “bad boy” wants to talk about his new book, released this week.

…The book, Venter's second after his 2007 autobiography, is called Life at the Speed of Light: From the Double Helix to the Dawn of Digital Life. It looks at the future Venter is aiming to create through his scientific endeavours in synthetic biology, a kind of turbo-charged version of genetic engineering where scientists design new biological systems – even synthetic life – rather than just tweaking existing organisms by inserting a gene here or there.

More here.

Tuesday, October 22, 2013

To the Great God Pan

Laura Jacobs reviews Isadora Duncan's My Life: The Restored Edition, in the LRB:

There is only one piece of film that shows Isadora Duncan dancing. It is four seconds long, the very end of a performance, and it is followed by eight seconds in which Duncan accepts applause. This small celluloid footprint – light-struck in the manner of Eugène Atget – contains quite a bit of information. It is an afternoon recital, early in the 20th century, and it takes place en plein air, trees in the background, like so much of the painting of the day. Duncan enters the frame turning, her arms positioned in an upward reach not unlike ballet’s codified fourth position, but more naturally placed. She wears a loose gown draped crosswise with a white veil, a floating X over her heart. Coming out of the turn and moving in the direction of the camera, her arms melt open as her head falls back. The white column of her neck, the spade-like underside of her jaw, the lifted breastbone crossed in white gauze: had any female dancer before Duncan projected such ecstatic presence and concrete power? Because of her thrown back upper body it seems as if she is running, but she is actually slow and steady, offering herself to something so large she doesn’t need to move fast. The dance over, she stands simply and acknowledges her audience with a Christ-like proffering of her palms. In fact, her classical garb is as much that of the sandalled shepherd of men as it is a barefoot goddess of Greek mythology. ‘I have come,’ she once said, ‘to bring about a great renaissance of religion through the dance, to bring the knowledge of the beauty and holiness of the human body through its expression of movements.’ Thus spake Isadora.

We have no more than four seconds of Duncan dancing because she did not like the medium of film as it existed in the years of her solo career, which began in the 1890s, peaked between 1910 and 1920, and continued intermittently until her death in 1927, at the age of 50. In those days film was in its infancy and still silent (The Jazz Singer was released the year Duncan died). Because music – Chopin, Schubert, Brahms, Beethoven, Wagner – was the spiritual inspiration for so much of what she did, Duncan couldn’t imagine dancing on film without it.

More here.

Line of Credit


Mark Bergen on Raghuram Rajan, in his new role as head of India's central bank, in Caravan:

With the currency commanding unprecedented attention and talk of another 1991 growing louder, Delhi took action. According to several people working with the finance ministry, RBI officials were summoned to North Block with unusual frequency—and on 15 July, the central bank intervened.

The RBI initiated a series of dramatic measures to drain liquidity from the market and defend the battered rupee. The moves were abrupt, haphazard, and ineffectual: as the central bank fumbled from strategy to strategy over the following month, the rupee kept tumbling.

Close observers of the Indian economy have many disagreements—over why growth stalled, who is to blame, and what must be done—but here they reached a consensus. A chorus of former officials, economists and investors told me that the RBI had been strong-armed by a politically anxious finance ministry. The liquidity moves came as an utter surprise to the Technical Advisory Committee (TAC), a seven-member body that counsels the central bank, two of its members said. The bank’s public statements were sporadic and clumsy, uncharacteristic of the then RBI governor, Duvvuri Subbarao—solid evidence, one person with close knowledge of the RBI told me, that its hand had been forced by the government.

In Indian financial circles, where the RBI was seen as the sole government institution whose credibility had remained intact, the feckless moves that began in July signaled that its credibility was unravelling.

If those exaggerated anxieties have since been reversed, the turnaround began on 6 August, when the flailing government surprised its fiercest critics by naming Raghuram Govind Rajan as the next head of the central bank. Rajan, a distinguished professor at the University of Chicago’s Booth School of Business, had been hailed as “the oracle of the financial crisis” for his prescient warnings three years before the 2008 collapse. He had spent the past year as the government’s chief economic advisor, a post that many presumed was a brief stopover on his way to Mint Road.

More here.

Reinventing Photosynthesis


Nathan S. Lewis and William J. Royea in Project Syndicate:

In the United States and other industrialized countries, many applications that rely on fossil fuels (such as air transport or aluminum production) cannot be reconfigured to use electrical power. Moreover, fossil fuels are required to produce electricity as well, both to meet demand and to compensate for the intermittency of renewable energy systems such as wind or solar power. Is there really a scalable, low-carbon alternative?

One promising approach is artificial photosynthesis, which uses non-biological materials to produce fuels directly from sunlight. The sun is a nearly inexhaustible energy source, while energy stored in the form of chemical bonds – like those found in fossil fuels – is accessible, efficient, and convenient. Artificial photosynthesis combines these features in a viable technology that promises energy security, environmental sustainability, and economic stability.

While natural photosynthesis provides a complex, elegant blueprint for the production of chemical fuels from sunlight, it has significant performance limitations. Only about one-tenth of the sun’s peak energy is used; annualized net energy-conversion efficiencies are less than 1%; significant amounts of energy are expended internally to regenerate and maintain the exquisite molecular machinery of photosynthesis; and the energy is stored in chemical fuels that are incompatible with existing energy systems.
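The numbers in that comparison are easy to sanity-check. The back-of-the-envelope calculation below assumes a round 200 W/m² annual-mean surface insolation, which is an assumed figure rather than one from the article, and simply asks what a 1% annualized conversion efficiency would store per square metre:

```python
# Rough check on what a ~1% annualized solar-to-fuel efficiency implies.
# The insolation value is an assumed round number, not taken from the article.

AVG_INSOLATION_W_PER_M2 = 200        # assumed annual-mean surface insolation
HOURS_PER_YEAR = 365 * 24
EFFICIENCY = 0.01                    # "less than 1%" annualized conversion

incident_kwh = AVG_INSOLATION_W_PER_M2 * HOURS_PER_YEAR / 1000   # kWh per m2 per year
stored_kwh = incident_kwh * EFFICIENCY

print(f"Incident sunlight:   {incident_kwh:.0f} kWh/m2/yr")   # ~1750
print(f"Stored as fuel @ 1%: {stored_kwh:.0f} kWh/m2/yr")     # ~18
```

Under those assumptions, only around 18 kWh of chemical energy ends up banked per square metre per year, which is why efficiency, not the supply of sunlight, is the constraint artificial photosynthesis aims to relax.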

More here.

What did Tom Clancy and David Foster Wallace have in common? Not much… but they were both obsessed with packing in all the facts

Our man of the hour, Morgan Meis, in The Smart Set:

Tom Clancy’s death did not shake the literary establishment. That’s because Tom Clancy was never part of the literary establishment. He was an insurance salesman. In his spare time, Clancy wrote military thrillers. His first book, The Hunt for Red October — about a Soviet naval officer who takes his super-secret sub and defects to the US — was published by the Naval Institute Press in Annapolis. Clancy got $5,000 for it. But Ronald Reagan read the book and started telling everybody how much he loved it. Soon enough, Clancy was a bestselling author. The story gets ridiculous from there, with books spilling off the presses and hundreds of millions of dollars changing hands. Movies were made. Video games were made. This was the stuff that publishers and business-minded authors dream about on cold autumn nights.

The more famous Tom Clancy became, the more serious readers ignored him. Clancy was, after all, not so much a writer as a teller of war stories. He wrote to get the story down. Beyond that, he had little sense of craft or style. He wrote his stories. He made his money. And then he died. The end.

There is, however, a tantalizing side note to this story. There was a man whose death did very much shake the literary establishment. That man was David Foster Wallace. And David Foster Wallace, it turns out, liked to read Tom Clancy. That’s no big deal, you might say. Even the most serious writer needs to take a break from reading Dostoyevsky and Wittgenstein, and we all enjoy a good potboiler. True enough. Except that DFW seems to have valued Tom Clancy a lot more highly than that.

More here.

Bonus question: What do Morgan Meis and David Foster Wallace have in common? 🙂

the homelessness most new yorkers do not see

Ian Frazier at The New Yorker:

For baseball games, Yankee Stadium seats 50,287. If all the homeless people who now live in New York City used the stadium for a gathering, several thousand of them would have to stand. More people in the city lack homes than at any time since . . . It’s hard to say exactly. The Coalition for the Homeless, a leading advocate for homeless people in the city and the state, says that these numbers have not been seen in New York since the Great Depression. The Bloomberg administration replies that bringing the Depression into it is wildly unfair, because those times were much worse, and, besides, for complicated reasons, you’re comparing apples and oranges. The C.F.H. routinely disagrees with Mayor Bloomberg, and vice versa; of the many disputes the two sides have had, this is among the milder. In any case, it’s inescapably true that there are far more homeless people in the city today than there have been since “modern homelessness” (as experts refer to it) began, back in the nineteen-seventies.

Most New Yorkers I talk to do not know this. They say they thought there were fewer homeless people than before, because they see fewer of them. In fact, during the twelve years of the Bloomberg administration, the number of homeless people has gone through the roof they do not have.

More here.

What the modern science of memory owes to the amnesiac patient H.M.

Charles Gross at The Nation:

H.M. is, arguably, the most famous patient in the history of psychology and neuroscience. He was studied intensively for more than fifty years by hundreds of scientists (including, briefly, the author of this review); he died in 2008, and his brain is still being analyzed. Permanent Present Tense, by Suzanne Corkin, is the story of how these investigations led to a fundamental revolution in our understanding of the human brain and, particularly, of the organization and varieties of memory. Her accessible book places his story in the context of past and present research on memory and describes many of the questions initiated by research on H.M. It is a scientifically exciting and personally moving portrait of a man whose life and brain ended up being devoted to the science of memory.

By the time he was 24, in 1950, H.M. (a.k.a. Henry) had developed severe epilepsy, perhaps from a bicycle accident years earlier, and was referred to the neurosurgeon William Beecher Scoville, who had performed many frontal lobotomies on patients diagnosed as “psychotic.” Scoville had been unsatisfied with the results of frontal lobotomies and was trying a new surgery, bilateral medial temporal lobotomy, in another attempt to treat psychosis.

More here.

Chemical ‘clock’ tracks ageing more precisely than ever before

Amanda Mascareli in Nature:

Greying hair and wrinkles are external signs of aging, but they are not very precise. Now research shows that a code written into the body's epigenome — the chemical tags that modify DNA — can accurately tell the age of human tissues and cells. This ‘clock’ could provide insights into why certain tissues age faster than others, and why those tissues may be more cancer-prone. In the past few years, researchers have been homing in on regions of DNA that accrue lots of chemical tags called methyl groups as people age. Such methylation can selectively switch off genes. “What was not yet known was that one can develop an age predictor that really works well across most tissues and cell types,” says Steve Horvath, a bioinformatician at the University of California, Los Angeles.

…Strikingly, he found that women’s breast tissue accrues methylation in a way that makes it look an average of 2–3 years older than other healthy tissues from the same woman. In women with breast cancer, healthy tissue situated next to diseased tissue appeared to be an average of 12 years older than other tissues in the body. And Horvath found that tissue from 20 cancer types looked an average of 36 years older than healthy tissue.
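The ‘clock’ itself is conceptually simple: a sparse linear model over methylation fractions at many CpG sites, fitted against known chronological ages. The sketch below illustrates only the shape of that approach; it uses synthetic data and scikit-learn's elastic net, and is not Horvath's published model, whose CpG sites, coefficients, and age transform are specific to his paper.

```python
# Minimal sketch of a methylation-based age predictor (not Horvath's model).
# Synthetic data stands in for a real methylation array; all numbers are made up.

import numpy as np
from sklearn.linear_model import ElasticNetCV
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_cpgs = 300, 500

age = rng.uniform(0, 90, n_samples)              # chronological ages in years
drift = rng.normal(0, 0.002, n_cpgs)             # how strongly each CpG tracks age
methylation = np.clip(
    0.5 + np.outer(age, drift) + rng.normal(0, 0.02, (n_samples, n_cpgs)),
    0, 1,                                        # methylation fractions live in [0, 1]
)

X_train, X_test, y_train, y_test = train_test_split(methylation, age, random_state=0)

clock = ElasticNetCV(cv=5).fit(X_train, y_train)  # sparse linear "clock"
predicted = clock.predict(X_test)
print(f"Median absolute error: {np.median(np.abs(predicted - y_test)):.1f} years")
```

Tissue-specific deviations like the ones Horvath reports are then read off as the gap between a tissue's predicted age and the person's chronological age.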

More here.

Tuesday Poem

To Run

Here, in the present, I run
from you and the problems.
I run south, far away, to the warmth
of new places and new people.

In the past, I did not run
from what was happening.
I stayed in familiar places
and weathered the cold.

In the future, I will be running
on a beach with laughing children.
The water is warm and clear.
The horizon is pure blue.

I will have run all the way
to a future perfect
in all the ways that the past
and present can never be.
by Pamela Milne

New Technique Holds Promise for Hair Growth

Denise Grady in The New York Times:

Scientists have found a new way to grow hair, one that they say may lead to better treatments for baldness. So far, the technique has been tested only in mice, but it has managed to grow hairs on human skin grafted onto the animals. If the research pans out, the scientists say, it could produce a treatment for hair loss that would be more effective and useful to more people than current remedies like drugs or hair transplants. Present methods are not much help to women, but a treatment based on the new technique could be, the researchers reported Monday in Proceedings of the National Academy of Sciences.

…The senior author of the study is Angela Christiano, a hair geneticist and dermatology professor at Columbia University Medical Center in New York, who has become known for her creative approach to research. Dr. Christiano’s interest in the science of hair was inspired in part by her own experience early in her career with a type of hair loss called alopecia areata. She has a luxuriant amount of hair in the front of her head, but periodically develops bald spots in the back. The condition runs in her family. In the mid-1990s, she sent photographs of her bald spots to researchers in Pakistan, hoping her plight would persuade them to collaborate with her on a study of a rare genetic disorder there that left people with no hair at all on their heads or bodies. Her strategy worked, and the joint effort identified the gene. In subsequent studies, Dr. Christiano and other colleagues identified multiple genes that play an important role in alopecia areata.

More here.

Our own Morgan Meis wins $50,000 Whiting Writers’ Award!

Congratulations, Morgan!

The Whiting Writers' Award is a prestigious prize given to writers who have shown “exceptional talent and promise in early career,” and has previously been won by the likes of Elif Batuman, Mark Doty, Jeffrey Eugenides, Jonathan Franzen, Tony Kushner, Suketu Mehta, William T. Vollmann, and David Foster Wallace.

Morgan is certainly a deserving winner and will no doubt go on to even bigger and better things. As you may recall, he already won a $30,000 Andy Warhol Foundation Award some years ago. His collection of essays (some of which were first published here on 3QD) called Ruins was explicitly cited by the Whiting awards committee. You may buy that book by clicking the ad in the right-hand column.

I was honored to be invited to the awards ceremony in New York City last night but unfortunately I could not make it. Robin Varghese was there, however, and took this photo just after Morgan received his award:

Morgan

Needless to say, everyone at 3QD is extremely excited for Morgan. And as a mutual friend said, “That tie alone deserves an award!” Congratulations again, my dear friend! The announcement in the New York Times is here.

Sunday, October 20, 2013

Gerald Dworkin’s Philosophy Commonplace Book: A Review

Justin E. H. Smith in his blog:

Jerry Dworkin might not be hands-down the funniest person in the history of philosophy, but he's probably the funniest Dworkin. Not that he's had much competition. There was that one who had the line about 'clerking for Learned Hand', which always made me snicker but was probably just a one-off sort of thing, and there was that other who… well, never mind. As for Jerrys, there he's had some stiff competition indeed, and in the same broadly borschty category. But this much can be said with certainty: Jerry Dworkin has survived all the other Jerrys and all the other Dworkins, and now, with this rich epoch-making e-tome, has singlehandedly revived the genre of the commonplace book, and bequeathed to the generations a fine collection that is bound to survive its author. At least if anyone can figure out how to download the damned thing. I had to write and request a review copy, which was duly sent along. Which in turn compelled me, morally, to either fork over the $5.97 a Kindle download would have cost me, or to do a little write-up. Since I am now an employee of a French university and therefore am basically worrying at this point about stocking up enough coal for the coming winter, I decided to do the writing thing.

Let's get something straight right away. Philosophy humour is generally awful: dismal vocational coping, and nothing more, substantially no different from the bumpersticker you might spot on a sagging Econoline that reads 'Electricians Conduit Better'. And if anyone ever again suggests to me that awful Monty Python sketch about the philosophers' football match, I am just going to come clean and tell them that my ideal of humour is rather closer to Redd Foxx's classic routine, You've Got to Wash Your Ass.

More here.

The Last of the Sheiks?

Christopher M. Davidson in the New York Times:

This summer, disgruntled Saudis took their grievances online in droves, complaining of ever-growing inequality, rising poverty, corruption and unemployment. Their Twitter campaign became one of the world’s highest trending topics. It caused great alarm within elite circles in Saudi Arabia and sent ripples throughout the region. The rallying cry that “salaries are not enough” helped to prove that the monarchy’s social contract with its people is now publicly coming unstuck, and on a significant scale.

Many experts believe that the Gulf states have survived the Arab Spring because they are different. After all, they’ve weathered numerous past storms — from the Arab nationalist revolutions of the 1950s and ’60s to Saddam Hussein’s 1990 invasion of Kuwait to an Al Qaeda terror campaign in 2003.

But they are not different in any fundamental way. They have simply bought time with petrodollars. And that time is running out.

The sheiks of the Persian Gulf might not face the fate of Col. Muammar el-Qaddafi of Libya or Hosni Mubarak of Egypt next year, but the system they have created is untenable in the longer term and it could come apart even sooner than many believe.

More here.

The Searing, Visceral “12 Years a Slave”

Christopher Orr in The Atlantic:

“All right now, y’all fresh niggers,” a white overseer in the antebellum South tells his charges in the opening scene of 12 Years a Slave. “Y’all gonna be in the cuttin’ gang.”

We soon discover what this entails, as the slaves take machetes in hand and begin hacking their way through an almost endless expanse of sugar cane; they might as well be trying to empty the ocean using teacups. The physicality of their labor is not merely extreme, it is extravagant. We immediately understand that what we are witnessing is an economy predicated on the idea that human—that is, black—sweat and sinew are not merely cheap resources, but essentially inexhaustible ones, subject to careless squander.

The scene establishes the searing, visceral tone that will characterize director Steve McQueen’s audacious third feature. Moments later we watch as the film’s protagonist, Solomon Northup (Chiwetel Ejiofor), lies awake in a bunkhouse surrounded by fellow slaves. When he turns to the woman next to him, she takes his hand roughly to her breast, and then between her legs, grimacing with joyless urgency before she twists away from him. Day and night, a perpetuity of toil and a pantomime of love—it all comes down to this: to flesh and blood, to individual endurance in a solitary prison of pain.

More here.

The psychology of spiritualism: science and seances

The idea of summoning the spirits took thrilling hold of the Victorian imagination – and has its adherents now. But the psychology behind spiritualism is more intriguing.

David Derbyshire in The Guardian:

As the evenings get darker and the first hint of winter hangs in the air, the western world enters the season of the dead. It begins with Halloween, continues with All Saints' and All Souls' days, runs through Bonfire Night – the evening where the English burn effigies of historical terrorists – and ends with Remembrance Day. And through it all, Britain's mediums enjoy one of their busiest times of the year.

People who claim to contact the spirit world provoke extreme reactions. For some, mediums offer comfort and mystery in a dull world. For others they are fraudsters or unwitting fakes, exploiting the vulnerable and bereaved. But to a small group of psychologists, the rituals of the seance and the medium are opening up insights into the mind, shedding light on the power of suggestion and even questioning the nature of free will.

Humanity has been attempting to commune with the dead since ancient times. As far back as Leviticus, the Old Testament God actively forbade people to seek out mediums. Interest peaked in the 19th century, a time when religion and rationality were clashing like never before. In an era of unprecedented scientific discovery, some churchgoers began to seek evidence for their beliefs.

Salvation came from two American sisters, 11-year-old Kate and 14-year-old Margaret Fox. On 31 March 1848, the girls announced they were going to contact the spirit world. To the astonishment of their parents they got a reply.

More here.