Genome-editing revolution: My whirlwind year with CRISPR

Jennifer Doudna in Nature:

Some 20 months ago, I started having trouble sleeping. It had been almost two years since my colleagues and I had published a paper describing how a bacterial system called CRISPR–Cas9 could be used to engineer genomes (see ‘Based on bacteria’). I had been astounded at how quickly labs around the world had adopted the technology for applications across biology, from modifying plants to altering butterfly-wing patterns to fine-tuning rat models of human disease. At the same time, I'd avoided thinking too much about the philosophical and ethical ramifications of widely accessible tools for altering genomes. Questions about whether genome editing should ever be used for non-medical enhancement, for example, seemed mired in subjectivity — a long way from the evidence-based work I am comfortable with. I told myself that bioethicists were better positioned to take the lead on such issues. Like everyone else, I wanted to get on with the science made possible by the technology. Yet as the uses of CRISPR–Cas9 to manipulate cells and organisms continued to mount, it seemed inevitable that researchers somewhere would test the technique in human eggs, sperm or embryos, with a view to creating heritable alterations in people. By the spring of 2014, I was regularly lying awake at night wondering whether I could justifiably stay out of an ethical storm that was brewing around a technology I had helped to create.

…This year has been intense — and intensely fascinating. At times I have wished that I could step off the merry-go-round, just for a few minutes, to process everything. Ensuring that my travel and other commitments do not disrupt the progress of my lab members has been a priority, but working with them has increasingly involved meeting at night or on weekends, or conferring by e-mail or Skype. For now, time for my beloved vegetable garden and for hikes into the wilds of California with my 13-year-old son is gone. Almost three years after a colleague warned me that a “tidal wave” of research, discussion and debate involving CRISPR–Cas9 was coming, I still don't know when the wave will crest.

More here.

Tuesday, December 29, 2015

The real John le Carré

George Walden at the Times Literary Supplement:

In the life of David Cornwell, alias John le Carré, mysteries remain, since along with the author’s powers of fantasy went an easy-going relationship to the facts. After a spell with MI5 at the age of twenty-six, in 1960 he switched to the Secret Intelligence Service (SIS), yet he remains reluctant to confirm that he ever worked there, explain how he came across his nom de plume, or come clean about various episodes in his past. Professional mystification? Or as Adam Sisman repeatedly suggests, false memory? “Everything he says”, Sisman states in an introduction, “needs to be examined sceptically.” This is not the voice of an authorized biographer – Cornwell didn’t want one – though it is that of a friend.

If the book begins on a tragicomic note it is because of Ronnie, David’s con-man father. His surreal schemes can be entertaining (“In New York Ronnie checked into the Plaza Hotel, announcing that he was in town to sell Bethlehem steel, America’s largest shipbuilder . . . ”), but for his sons David and Tony they were not so funny. A childhood running from scam to scam and house to house was to cost them their mother, who abandoned her children when David was five. The effects were as you might expect, and the father’s gift for make-believe appears to have rubbed off on his son. When David claimed at Oxford that he had quit Sherborne after a housemaster tried to kiss him, for example, Sisman implies that the episode may have been invented to boost his anti-Establishment credentials, something to which Cornwell was to devote much of his life.

More here.

looking at the stars

Gene Tracy at Aeon Magazine:

Today, we are more disconnected from the stars than ever before. Even utilitarian attachments have fallen away, as the markers that form our sense of place in the wider world have shifted from the distant to the local. Navigators once used the stars as reference marks; the GPS units in modern cellphones refer instead to a constellation of artificial satellites in orbit around the Earth, synchronised to atomic clocks in ground-based laboratories. (There has been one intriguing reversal of the trend: anxiety about the wartime vulnerability of the GPS system recently prompted the US Naval Academy to reinstate the teaching of celestial navigation. This particular unease is an apt metaphor for our general anxiety about losing our way when the lights go out, about where we stand in general relation to the world.)

We have lost a part of our selves in the process. Knowing where you are in the world is fundamental to knowing who you are. The development of our sense of spatial relationships – the ongoing discovery of where I am – is deeply entwined with memory formation. Neuroscience studies reveal that this is because forming the knowledge of place, and building that sense of our relation to other parts of the world, requires the brain to combine several different sense modalities.

More here.

theater in russia, then and now

Maksim Hanukai at n+1:

THEATER HAS LONG BEEN at the center of political struggle in Russia. The theatricalization of life was one of the key aims of the Russian avant-garde, which embraced the 1917 revolution in part because it promised to transform everyday life into living theater. With the advent of socialist realism in the 1930s, Stalin turned theater into an instrument of state propaganda, but restrictions loosened again in the period of late socialism, from the 1960s to the 1980s, at which time theater acquired a near sacred status in Soviet culture. As Marina Davydova, a leading expert on Russian theater, observes in her 2005 book The End of a Theater Epoch, “Russia in the period of late socialism was not a literature- but a theater-centric country.” While censorship was strong, and many Western authors remained taboo, Soviet directors began to test the boundaries of artistic speech through the camouflaging techniques of Aesopian language. Ordinary citizens went to great lengths for the chance to see Russian bard Vladimir Vysotsky in the role of Hamlet at the Taganka or, with the introduction of glasnost, to attend a new play by Liudmila Petrushevskaya. Outside Moscow, too, amateur theater circles flourished (my own parents first met in one such circle in Baku). In a country without a functioning civil society and, officially, without religion, theater became a substitute for the church, the parliament, and the free press.

All of this changed in the decade after the fall of the Soviet Union—perhaps unfairly regarded as a period of stagnation for Russian theater. Historians cite various reasons for theater’s decline in the 1990s. The rate of emigration among the intelligentsia was high; the economy was in free fall; new social and civic institutions, however imperfect, had begun to emerge; and theater now had greater competition from other media, such as commercial film and television.

More here.

Unnatural Laws

Nancy Cartwright in IAI News:

From the faceless particles of fundamental physics to marshes, mountains, and rain forests, fleas, walruses and traffic jams, we are all supposed to live in a world governed by eternal, all-encompassing laws, laws discovered by the experiments of physics and encoded in its mathematical equations. This 400-year-old image of the governance of nature is today being undermined by exciting new modes of understanding across the sciences, including physics and biology, as well as, perhaps less surprisingly, in the study of society. There is order visible in the world, and invisible. But if we trust to these new ways of understanding, this need not be order by universal law. It can be local, piecemeal, and contextual – much like the world as we encounter it.

We live our everyday lives in a dappled world unlike the world of fundamental particles regimented into kinds, each just like the one beside it, mindlessly marching exactly as has forever been destined. In the everyday world the future is open, little is certain, the unexpected intrudes into the best-laid plans, everything is different from everything else, things change and develop, and different systems built in different ways give rise to different patterns. For centuries this everyday world was at odds with the scientific world governed through-and-through by immutable law. But many of the ways we do science today bring the scientific image into greater harmony with what we see every day: much of modern science understands and manipulates the world without resort to universal laws.

Consider biology, where our knowledge since World War II has made huge leaps forward and with it, our ability to put that knowledge to use. How is this knowledge encoded? A close look at the methodologies employed, especially in evolutionary biology, suggests that rather than good old-fashioned ‘proper laws’, biology offers instead laws that emerge historically, laws that are contingent and laws that admit exceptions.

More here.

Edmund Burke was no conservative

Richard Bourke in Aeon:

Edmund Burke, one of the great statesmen and philosophers of the 18th century, is the founder of modern conservatism. Or so it is commonly held: authorities, from Corey Robin on the left to Niall Ferguson on the right, agree that conservative ideology can be traced to this original source. The view has in fact been commonplace in the United States since the 1950s and has steadily been gaining currency across the globe. Admirers of Burke’s ‘traditionalism’ can be found in numerous countries, as different as the Netherlands and Japan. Yet there is something deeply misleading about this view of conservatism’s origins. Burke was a reforming Whig of the 18th-century British parliament whose ideas were not developed with modern politics in mind.

Even if we imagine Burke as our contemporary, his commitments are not in any way compatible with conservatism. For example, he was a defender of colonial rights against the British Empire during the period of the American Revolution. In lending his support to American defiance, he opposed the reigning tenets of British imperial policy and took a stand against successive ministries at Westminster. His defence of colonial rights included support for insurrection, for violent resistance against established authority. It is hard to reconcile this endorsement of revolt with what are usually regarded as conservative ideals.

More here.

Tuesday Poem

Carrying My Tools

Any good craftsman carries his tools.
Years ago, they were always at the ready.
In the car. In a knapsack.
Claw hammers, crisscrossed heads,
32 ouncers. Wrenches in all sizes,
sometimes with oil caked on the teeth.
Screwdrivers, with multicolored
plastic handles
(what needed screwing got screwed).
I had specialty types: Allen wrenches,
torpedo levels, taps and dies.
A trusty tape measure.
Maybe a chalk line.
Millwrights also carried dial indicators,
micrometers–the precision kind.
They were cherished like a fine car,
a bottle of rare wine,
or a moment of truth.
I believed that anyone could survive
without friends, without the comfort of blankets
or even a main squeeze
(for a short while anyway).
But without tools . . . now there was hard times.
Without tools, what kind of person could I be?
The tools were my ticket to new places.
I often met other travelers, their tools in tow,
and I'd say: “Go ahead, take my stereo and TV.
Take my car. Take my toys of leisure.
Just leave the tools.”
Nowadays, I don't haul these mechanical implements.
But I still make sure to carry the tools
of my trade: Words and ideas,
the kind no one can take away.
So there may not be any work today,
but when there is, I'll be ready.
I got my tools.
.

by Luis Rodriguez
from The Concrete River, 1995

‘Too Much of a Good Thing’ Finds a Dilemma in Our DNA

Abigail Zuger in The New York Times:

We tend to think of good genes as the ones that make us thin, calm, cheerful and healthy. This logical assumption may not be completely accurate: Dr. Lee Goldman suggests that even better genes may be the ones that make us fat, anxious and candidates for the services of a cardiologist like himself. It’s all a matter of perspective, and Dr. Goldman takes the long, long view in “Too Much of a Good Thing,” arguing that many common modern ills result from the surpassingly excellent genes that allowed our species to endure over the millenniums. Only very recently did these survivor genes turn on us, creating the collection of overweight, hypertensive, jumpy and miserable individuals we are today.

Some of his argument will probably be familiar, at least when it comes to the question of why we have all become so fat. Less has been written about other areas of human physiology where our genetic programming seems to butt up against the circumstances of modern life. Dr. Goldman integrates it all into a complex narrative — a little tough sledding at points, but still thought-provoking. Human blood had to clot efficiently, too, or people would have bled to death from the continual accidents inherent in outdoor life and women would have died from the bloody process of childbirth. Finally, primitive humans had to be continually vigilant against attack, instinctively fighting some dangers and hiding from others, never lowering their guard for a second. All those genetic predispositions tend to be nothing but trouble in modern times.

More here.

Monday, December 28, 2015

The Winners of the 3QD Philosophy Prize 2015

[The three prize logos: Top Quark, Strange Quark, Charm Quark]

John Collins has picked the three winners from the nine finalists:

  1. Top Quark, $500: Vidar Halgunset, Slow Corruption
  2. Strange Quark, $200: Daniel Silvermint, On How We Talk About Passing
  3. Charm Quark, $100: Lisa Herzog, (One of) Effective Altruism’s blind spot(s)

Here is what Professor Collins has to say about the winners:

The nine finalists for this year’s 3QD Philosophy Prize are all very fine examples of philosophical blogging. They combine clarity, immediacy, subtlety, humanity, and provocation. The task of judging them was a difficult pleasure. I learned something from every single one of these posts and I would be hard pressed indeed to explain, in any detail, my reasons for having excluded each of the other six pieces from the group to be awarded quark flavor.

After an initial read-through on December 17th, four of the entries stood out to me. On a second reading of the whole field a week later I found that I had narrowed the selection down to three of those four. But then I became stuck. It took a third and fourth reading and some agonizing, before I began to feel at all confident about a final ranking.

In 3rd Place, the Charm Quark goes to Lisa Herzog for her piece on the Justice Everywhere blog: “(One of) Effective Altruism’s Blind Spot(s) …”.

Moral theories that prescribe extreme versions of utilitarianism are sometimes criticized for being too demanding. Herzog’s focus is on a respect in which effective altruism appears to be not demanding enough.

By taking existing social institutions and practices as simply given the effective altruist finds herself choosing from a “restaurant menu” of given options, ignoring the possibility of deeper structural change. When the problem is construed as one of individual choice rather than collective action, such approaches will remain invisible.

The Strange Quark for 2nd Place is awarded to Daniel Silvermint for a sensitive and nuanced essay on the Feminist Philosophers blog: “On How We Talk About Passing”.

Silvermint’s piece, occasioned by last summer’s Rachel Dolezal incident, avoids the thorny issue of why, exactly, self-identification might be taken to be authoritative in the case of gender though not race, and asks us instead to hesitate and reconsider what we are doing when we rush to police the trespass of socially constructed categories that are tracked by highly unreliable markers. There is a valuable discussion here of the varieties of passing, though I found myself unsure as to whether to accept Silvermint’s suggestion that we apply the concept even to cases where there is neither misidentification nor intent. Can, for example, a white cisgender man, who, through privilege has had the luxury of never giving these matters a moment’s thought, really be said to be “passing” as white and male? Silvermint comments that “a trans woman that passes isn’t a man pretending to be a woman – she is a woman”. I agree wholeheartedly with the main point there, but I’d be inclined to add that her being a woman means that she isn’t simply passing as a woman either. (Whether a trans person might be said to—or want to?—pass as cisgender is another matter.)

My choices for the top two spots share this quality: they warn us to slow down, hesitate, and carefully reconsider the rush to judgment that is often encouraged by quick-fire debate in social media and in online discussion in general.

In first place, the winner of the 3QD Top Quark for philosophical blogging in 2015 is Vidar Halgunset for his piece on the Orienteringsforsøk blog: “Slow Corruption”.

I liked the simple humanity of this essay very much. Halgunset’s immediate topic is the recent public debate in Norway over the selective abortion of fetuses diagnosed with Down’s syndrome. His central suggestion is that we focus not on the question “what would be so terrible about a society without Down’s syndrome?” but ask instead, why might it be undesirable to create a society that lacked people with Down’s syndrome? And he asks us to stop and consider the reception of this debate by those of us who have Down’s syndrome.

But there’s another subtle thread woven through this piece, that has to do with what we ought to debate publicly, and how we ought to discuss it. Halgunset begins by quoting a particularly insensitive tweet of Richard Dawkins’s and asks us to consider whether the distinction between tone and content in this message is really as clear as Dawkins would maintain. The slow corrupting influence here is that of the public expression of blind certainty in 140 characters or fewer. These are matters of tone and selective silence. I am reminded of a comment my late friend and colleague Sid Morgenbesser once made to me: “Don’t you think there are situations in which it is simply indecent to deliberate at all?”

Orienteringsforsøk, we’re told, is written at 78.13 degrees of latitude North (Longyearbyen?). It’s a safe bet, I think, that this is indeed “the Northernmost Philosophy Blog in the World”.

From here this late December summer morning, writing these words at 33.87 degrees South, I send a heartfelt “Godt Nytt År!” to the distant Svalbard Archipelago. There’s reason here, I think, to be optimistic about the incoming year 2016.

John Collins, December 27th, 2015, Sydney

Congratulations also from 3QD to the winners (remember, you must claim the money within one month from today—just send me an email). And feel free, in fact we encourage you, to leave your acceptance speech as a comment here! And thanks to everyone who participated. Many thanks also, of course, to John Collins for doing the final judging.

The three prize logos at the top of this post were designed by me, Sughra Raza, and Margit Oberrauch. I hope the winners will display them with pride on their own blogs!

Details about the prize here.

Sunday, December 27, 2015

When Inequality Kills


Joseph Stiglitz in Project Syndicate:

France, for example, spends less than 12% of its GDP on medical care, compared to 17% in the US. Yet Americans can expect to live three full years less than the French.

For years, many Americans explained away this gap. The US is a more heterogeneous society, they argued, and the gap supposedly reflected the huge difference in average life expectancy between African Americans and white Americans.

The racial gap in health is, of course, all too real. According to a study published in 2014, life expectancy for African Americans is some four years lower for women and more than five years lower for men, relative to whites. This disparity, however, is hardly just an innocuous result of a more heterogeneous society. It is a symptom of America’s disgrace: pervasive discrimination against African Americans, reflected in median household income that is less than 60% that of white households. The effects of lower income are exacerbated by the fact that the US is the only advanced country not to recognize access to health care as a basic right.

Some white Americans, however, have attempted to shift the blame for dying younger to African Americans themselves, citing their “lifestyles.” It is perhaps true that unhealthy habits are more concentrated among poor Americans, a disproportionate number of whom are black. But these habits themselves are a consequence of economic conditions, not to mention the stresses of racism.

The Case-Deaton results show that such theories will no longer do. America is becoming a more divided society – divided not only between whites and African Americans, but also between the 1% and the rest, and between the highly educated and the less educated, regardless of race. And the gap can now be measured not just in wages, but also in early deaths. White Americans, too, are dying earlier as their incomes decline.

More here.

365 days: The science events that shaped 2015

Monya Baker, Ewen Callaway, Davide Castelvecchi, Lauren Morello, Sara Reardon, Quirin Schiermeier and Alexandra Witze in Nature:

Rarely has a method roared onto the scene as quickly as the accurate, easy-to-use yet controversial CRISPR–Cas9 genome-editing system. In April, scientists in China reported use of the technique to edit non-viable human embryos, which spurred researchers and bioethicists to debate in editorials and meetings whether the technology should ever be used in human embryos, even for basic research. The debate culminated in the International Summit on Human Gene Editing in early December in Washington DC, which brought together nearly 500 ethicists, scientists and legal experts from more than 20 countries. The organizers wrapped up the event with a statement: the tools are not yet ready to be used to edit the genomes of human embryos intended for pregnancy. But they did not call for an outright ban of this work for basic research.

Over the past three years, CRISPR has become the tool of choice for scientists seeking to enhance animals and crops, and to cure human disease (see ‘CRISPR craze’). In October, researchers set a record by editing the genomes of pig embryos in 62 places at once — a move that could help to revitalize the field of xenotransplantation. The genetic tinkering could lower the risk of exposure to potentially dangerous pig viruses when people receive human-like organs grown in swine. Dogs, goats and sheep have also had their DNA modified with the low-cost technology.

[Chart: ‘CRISPR craze’. Source: Scopus]

CRISPR could target human diseases as well. With that aim in mind, in August, Google and other investors pumped US$120 million into the genome-editing start-up Editas Medicine in Cambridge, Massachusetts. The firm plans to use CRISPR in clinical trials in 2017 to correct a genetic mutation in some people who are visually impaired.

Other, more mature genome-editing technologies are already entering the clinic. In November, researchers in the United Kingdom announced that they had used a different system — enzymes called TALENs — to edit human immune cells and transplant them into a one-year-old with leukaemia, possibly saving her life.

More here.

The Angst of Being a Modern Indian


Sonal Shah in The Wire:

“Being a modern Indian is hard work,” a former king tells Qayanaat, the protagonist of Anjum Hasan’s The Cosmopolitans. If this is true for the King, the dispossessed monarch of fictional, small-town Simhal, it’s certainly so for Qayanaat, a 53-year-old single woman who lives in Bengaluru, subsisting on the diminishing material wealth of one man, her deceased father, while trying to manage her excess of emotions for another, the artist Baban.

Had Hasan chosen Baban—a character who recalls certain real Indian artists, such as Subodh Gupta—as her protagonist, The Cosmopolitans would likely have been India’s first Künstlerroman set in the contemporary art world. And Baban, triumphantly returning from New York to launch his large-scale conceptual work, ‘Nostalgia’, in Bengaluru, would have been a rich character for Hasan to use to pick apart the tensions she explores: between modernity and tradition, aesthetics and ethics, art and profit.

Instead, although The Cosmopolitans opens with the inauguration of ‘Nostalgia’, Hasan sets about painting a portrait of Qayanaat, a character on the periphery of the art world, but at the center of this ambitious, yet intimate, novel of ideas. Qayanaat neither makes art, nor collects it, and her place in the wider world is unclear as well. She is hopeless with money; her quietly bohemian lifestyle, surrounded by her garden and a few works of art, is only enabled by the house that her father left her. By conventional benchmarks, she is something of a failure. This makes her an appealing and important character in a country obsessed with success.

More here.

Why I Might Not See Star Wars


Julia Felsenthal in Vogue:

A long time ago, the words “Star Wars: Episode VII” would have sent me into paroxysms of joy. The percussive clatter that opens John Williams’s famous score could make me sweat with Pavlovian zeal. That was an era when, more often than not, the hip pocket of my jeans concealed a tiny dime-store replica of the Millennium Falcon, its back panel hand-painted with glow-in-the-dark nail polish that I stole from my older sister. At night, my crude plastic hunk of junk would radiate dimly as though half-heartedly engaging its hyperdrive. During the day it was a secret talisman, a close-at-hand porthole to a world within a world.

All this did not take place in a galaxy far, far away; it took place in Chicago, where I grew up. I loved Star Wars. I don’t remember when I first saw the films. I do know that by the time I reached junior high, in the mid-’90s, my simmering ardor had reached a full, rolling boil.

As I write this, I am struck again by how not cool that sounds. I entered—and exited—middle school nearly six feet tall, weighing about 95 pounds, and fully androgynous, a look that may have played well in CK One ads but was not so popular on the bar mitzvah circuit in insular, private-school Chicagoland. I was ill-prepared for the light-speed jump of adolescence, the transformation from childhood to teenagerhood that so many of my friends seemed to be making overnight. The future felt even more terrifying than the present. Star Wars offered a wormhole between future and past, a galaxy that mashed up technology with mythology, the power of computing with the knights of the Round Table. Captivating stuff for someone who blew her bat mitzvah money on a cutting-edge IBM Aptiva PC so that I could better play a Sims-like role-playing game about the management of medieval castles and feudal estates.

More here.

Matter will be created from light within a year, claim scientists

Ian Sample in The Guardian:

Researchers have worked out how to make matter from pure light and are drawing up plans to demonstrate the feat within the next 12 months.

The theory underpinning the idea was first described 80 years ago by two physicists who later worked on the first atomic bomb. At the time they considered the conversion of light into matter impossible in a laboratory.

But in a report published on Sunday, physicists at Imperial College London claim to have cracked the problem using high-powered lasers and other equipment now available to scientists.

“We have shown in principle how you can make matter from light,” said Steven Rose at Imperial. “If you do this experiment, you will be taking light and turning it into matter.”

The scientists are not on the verge of a machine that can create everyday objects from a sudden blast of laser energy. The kind of matter they aim to make comes in the form of subatomic particles invisible to the naked eye.

The original idea was written down by two US physicists, Gregory Breit and John Wheeler, in 1934. They worked out that – very rarely – two particles of light, or photons, could combine to produce an electron and its antimatter equivalent, a positron. Electrons are particles of matter that form the outer shells of atoms in the everyday objects around us.
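The Breit–Wheeler threshold described above can be sketched as a back-of-envelope estimate from energy–momentum conservation (this is a standard textbook calculation, not drawn from the article itself):

```latex
% Breit–Wheeler pair production: \gamma + \gamma \to e^- + e^+
% The invariant mass of the two photons must reach at least twice the
% electron rest energy. For a head-on collision of photons with
% energies E_1 and E_2, the Mandelstam invariant is s = 4 E_1 E_2:
\begin{align}
  s = (p_1 + p_2)^2 c^2 = 4 E_1 E_2 &\geq (2 m_e c^2)^2 \\
  \Rightarrow\quad E_1 E_2 &\geq (m_e c^2)^2 \approx (0.511\ \text{MeV})^2
\end{align}
```

In other words, two gamma-ray photons of roughly 0.511 MeV each meeting head-on are, in principle, energetic enough to materialise an electron–positron pair, which is why the process demands the extreme photon energies and densities that only modern high-powered lasers can approach.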

More here.

Sunday Poem

And the world was Icarus
Which smelled of melting wax

…………. —Anonymous

Landscape With the Fall of Icarus

According to Brueghel
when Icarus fell
it was spring

a farmer was ploughing
his field
the whole pageantry

of the year was
awake tingling
near

the edge of the sea
concerned
with itself

sweating in the sun
that melted
the wings’ wax

unsignificantly
off the coast
there was

a splash quite unnoticed
this was
Icarus drowning

by William Carlos Williams
from Collected Poems: 1939-1962, Volume II
New Directions Publishing Corp., 1962

.