Did Neanderthals Die Out Because of the Paleo Diet?

Abigail Tucker in Smithsonian:

Humans tend to dismiss Neanderthals as dimwits, yet the brains of our doomed cousins were actually larger than our own. “If you go to a site from 150,000 years ago,” says Miki Ben-Dor, a Tel Aviv University archaeologist, “you won’t be able to tell whether Neanderthals or Homo sapiens lived there, because they had all the same tools.” Which helps explain why, to fathom how our fates diverged, he recently scrutinized Neanderthals’ bodies instead of their skulls. While humans have barrel-shaped chests and narrow pelvises, Neanderthals had bell-shaped torsos with wide pelvises. The prevailing explanation has been that Neanderthals, often living in colder and drier environments than their human contemporaries, needed more energy and therefore more oxygen, so their torsos swelled to hold a bigger respiratory system. But Ben-Dor had a gut feeling this was wrong. What if the difference was what they ate? Living in Eurasia 300,000 to 30,000 years ago, Neanderthals settled in places like the Polar Urals and southern Siberia—not bountiful in the best of times, and certainly not during ice ages. In the heart of a tundra winter, with no fruits and veggies to be found, animal meat—made of fat and protein—was likely the only energy source.

Alas, though fat is easier to digest, it’s scarce in cold conditions, as prey animals themselves burn up their fat stores and grow lean. So Neanderthals must have eaten a great deal of protein, which is tough to metabolize and puts heavy demands on the liver and kidneys to remove toxic byproducts. In fact, we humans have a “protein ceiling” of between 35 and 50 percent of our diet; eating too much more can be dangerous. Ben-Dor thinks that Neanderthals’ bodies found a way to utilize more protein, developing enlarged livers and kidneys, and chests and pelvises that widened over the millennia to accommodate these beefed-up organs.

More here.

Could We Just Lose the Adverb (Already)?

Christian Lorentzen in Vulture:

I’m cursed with a mind that looks at a sentence and sees grammar before it sees meaning. It might be that I’m doing math by other means, that I overdid it with diagramming sentences as a boy, or that my grasp of English was warped by learning Latin. Translating Horace felt like solving math problems. Reading Emily Dickinson began to feel like solving math problems. You might think this is a cold way of reading, but it’s the opposite. You develop feelings. Pronoun, verb, noun — I like sentences that proceed in that way, in a forward march. Or those tricked out with a preposition, another noun, and a couple of adjectives. Conjunctions and articles leave me unfazed. If these combinations result in elaborate syntactical tangles, it thrills me. It’s cheap words I hate, and I hate adverbs.

I’m unembarrassed to admit that my taste in literary style owes a lot to my adolescent reading of The Sun Also Rises — Hemingway was no friend of adverbs. He’s not alone. “Use as few adverbs as possible” is among V. S. Naipaul’s rules for beginning writers. When William Strunk and E. B. White admonish us to omit unnecessary words, I know they’re talking about adverbs without their having to say it.

More here.

“It’s Your Generation of Experimenters That Makes Me Look Good!” – An Interview with Kip Thorne

Karan Jani in The Wire:

What was your first reaction when you saw the gravitational-wave event on September 14, 2015 and the whole process which followed until the historic announcement?

I think it was just one of deep satisfaction, that a dream that Rai Weiss, Ron Drever and Joseph Weber and Vladimir Braginsky and Stan Whitcomb and others had developed and shared so many decades ago was finally reaching fruition.

In fact, nature turned out to be giving us just what I had expected – I’d expected since the early 1980s that the first thing we would see would be merging black holes, because the distance you can see goes up roughly proportionally with the mass of the binary, and so the volumes are cubed, and that factor would overwhelm the absolute lower event rate for black-hole binaries compared to neutron-star binaries. It seemed very likely to me, so that’s just what I thought would happen. It’s a big part of how I hoped to sell this project.

To have that come out right was pleasing, to have the strength of the waves be 10⁻²¹ – that’s a number we started targeting in 1978. So it all came to pass the way we expected it to, thanks to enormous work by your generation of experimenters. You were the ones who really pulled it off. The way I like to say it is that it’s your generation of experimenters that makes me look good!
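To make Thorne’s scaling argument concrete, here is a minimal back-of-envelope sketch (an editorial illustration, not from the interview; the component masses and the rate penalty below are assumed numbers, not figures Thorne cites):

```python
# Back-of-envelope sketch (editorial illustration, not Thorne's calculation).
# Assumption: the distance to which a detector can see a merger grows roughly
# with the binary's mass, so the volume surveyed grows as the mass cubed.

bh_mass = 30.0    # hypothetical black-hole component mass, in solar masses
ns_mass = 1.4     # typical neutron-star component mass, in solar masses

volume_advantage = (bh_mass / ns_mass) ** 3   # roughly 10,000x more volume surveyed

# Even if black-hole binaries merge far less often per unit volume
# (the factor below is purely illustrative), the volume factor can win out.
rate_penalty = 1000.0
net_advantage = volume_advantage / rate_penalty

print(f"Volume advantage: ~{volume_advantage:,.0f}x")
print(f"Net expected-event advantage: ~{net_advantage:.0f}x")
```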

More here.

On Ruth Goodman’s ‘How to Be a Tudor’

Ed Simon at The Millions:

There is something uncanny about staying in another person’s house — the stark differences and the small convergences of sameness. We all like to snoop a bit. Now, public historian Ruth Goodman gives us the chance to snoop on the lives of people who died 500 years ago. When you’re watching The Tudors or Wolf Hall, Goodman is the woman behind the scenes ensuring that the clothes look right, the home interiors are accurate, and the sumptuous feasts are as true to life as possible. In How to Be a Tudor: A Dawn-to-Dusk Guide to Tudor Life, she makes her almost preternatural knowledge about life during the 16th century available to the reading public.

You wouldn’t expect the intricacies of Tudor baking, brewing, ploughing, cooking, needlework, painting, dancing, and card-playing to hold an audience rapt, and yet Goodman makes the minutiae of everyday life half a millennium ago tremendously interesting. Indeed, her voluminous knowledge makes Goodman seem not so much a specialist on period authenticity as an actual time traveler. Ingeniously structuring the book around the hourly rhythms of daily life (with chapters going from “At Cock’s Crow” to “And so to bed”), Goodman transmits information about food, work, medicine, education, leisure, lodging, sleep, and even sexuality. How to Be a Tudor, with its grounding in physical detail and avoidance of theoretical analysis, is true to the guide book genre, but one featuring recipes for veal meatballs (exceedingly expensive at the time) and Galenic medical advice.

more here.

life in ruins

Mary Beard at the Times Literary Supplement:

Inside the monastery of S. Trinità dei Monti, which stands at the top of the Spanish Steps in Rome, is a room decorated in glorious trompe l’oeil as a ruin. Created in 1766 by Charles-Louis Clérisseau, and originally intended to be the cell of the monastery’s resident mathematician Fr Thomas Le Sueur, it imitates a decaying classical temple, with tumbled columns, a roof open to the sky, encroaching vegetation and a large parrot perched on one of the apparently surviving crossbeams. The irony of the design worked on several levels. It allowed the famous scholar to enjoy the pleasure of ruins without the discomfort. But it was also a wry comment on the life cycle of buildings. Ruins are one stage on their inevitable journey to destruction. As we know from some of the most ambitious modern attempts at conservation on archaeological sites all over the world, from Pompeii to Machu Picchu, collapse can be delayed – but not prevented. Here Clérisseau offered dilapidation frozen in time, a ruin built to last.

That life cycle of buildings, from conception to death, with an occasional lucky, or unlucky, resurrection, is the theme of James Crawford’s Fallen Glory – twenty chapters telling the biography of twenty structures, from across the world, ancient and modern, real and imaginary (the first chapter is on the Tower of Babel, the last on the virtual world of the web hosting service GeoCities). Some of these life stories work better than others. The Roman Forum, the subject of Chapter Six, needs so much background that we tend to lose sight of the main character as it rises out of the marshes, becomes the monumental centre of the empire, and slips back into pasture, only to be revived again in the service of Mussolini’s grandiose ambitions.

more here.

Two Old Jewish Socialists: Henry Roth Meets Bernie Sanders

Nathaniel Popkin at Tablet Magazine:

Hiat was born in Kletsk, a town south of Minsk, in Belarus. As a child, he began to doubt the possibility of God. “I’ve seen children die, small children, and the doubt of a merciful God really drove me” away from religious belief, he said to Roth during the first interview session, describing the crucible of his political consciousness and suggesting the rigor of his autodidactic mind. But at the same time, at the cheder in Kletsk, Hiat was introduced to the Jewish teaching that opened him intellectually to a “revolutionary instinctive upbringing.” “Socialism,” he said, “is part of philosophical Judaism.” There is, he explained to Roth, who never received, or pursued, a full Jewish education, “a certain Hebrew word, ein kemach, ein Torah: If you have no bread, you have no Torah.”

Bernie Sanders, who perhaps embodies this connection as thoroughly as any American public figure in history, rarely draws that line. In a speech last year to the students of the Evangelical Christian Liberty University, he quoted the Book of Matthew, not Torah or Talmud, in citing a religious influence in his political ideology. (Hillary Clinton, for her part, draws a connection between the Christianity she experienced growing up and her instinct to volunteer in poor neighborhoods of Chicago.) Sanders sometimes directs the question of how his Jewish self-identity inspired his political beliefs to the specter of the Holocaust, from which his father escaped but many of his relatives in Poland did not; more often, he simply identifies his parents as “Polish.”

more here.

Chinese Is Not a Backward Language

Tom Mullaney in Foreign Policy:

Even in the age of China’s social media boom, with billion-dollar valuations for Beijing-based IT start-ups, prejudice against the Chinese language is alive and well. One would be forgiven for thinking that by 2016, the 20th century’s widespread critiques of racism, colonialism, and Social Darwinism would have sounded the death knell of 19th-century Orientalism, which viewed China and the Chinese language through a condescending, colonialist lens. At the least, one might hope that if notions of Chinese Otherness were still with us, those who carry on the tradition of these threadbare ideas would generally be seen as archaically Eurocentric and gauche — the dross of airport bookshop paperbacks, unworthy of serious engagement. If only. Nineteenth-century understandings of China persist, not only surviving the decline of Social Darwinism and race science, but flourishing in this new century, driven primarily by arguments about China’s unfitness for modern technology and media.

Call it Orientalism 2.0.

More here.

Homo Sapiens 2.0? We need a species-wide conversation about the future of human genetic enhancement

Jamie Metzl in KurzweilAI:

After 4 billion years of evolution by one set of rules, our species is about to begin evolving by another. Overlapping and mutually reinforcing revolutions in genetics, information technology, artificial intelligence, big data analytics, and other fields are providing the tools that will make it possible to genetically alter our future offspring should we choose to do so. For some very good reasons, we will. Nearly everybody wants to have cancers cured and terrible diseases eliminated. Most of us want to live longer, healthier and more robust lives. Genetic technologies will make that possible. But the very tools we will use to achieve these goals will also open the door to the selection for and ultimately manipulation of non-disease-related genetic traits — and with them a new set of evolutionary possibilities. As the genetic revolution plays out, it will raise fundamental questions about what it means to be human, unleash deep divisions within and between groups, and could even lead to destabilizing international conflict.

And the revolution has already begun. Today’s genetic moment is not the stuff of science fiction. It’s not Jules Verne’s fanciful 1865 prediction of a moon landing a century before it occurred. It’s more equivalent to President Kennedy’s 1962 announcement that America would send men to the moon within a decade. All of the science was in place when Kennedy gave his Houston speech. The realization was inevitable; only the timing was at issue. Neil Armstrong climbed down the Apollo 11 ladder seven years later. We have all the tools we need to alter the genetic makeup of our species. The science is here. The realization is inevitable. Timing is the only variable.

More here.

Thursday Poem

Daphne in Mourning

Palm fronds have woven out the sky.
Fog has infiltrated every vein.
My hair has interlaced with vines.
Cobwebs lash their gauze across my eyes.

I’ve stood so since the world began,
and turned almost to stone some years ago.
Who passes by perceives a lichened post,
my girlish figures, ghostly, nearly gone.

My bark is warmer than the dead’s.
Human blood still lulls the underside of leaves.
My fingers hold the very dress I loved
to dance in, when dancing mattered-and it did.

by Melissa Green
from Daphne in the Morning
Pen & Anvil Press, 2010

His Last Decade


Stuart Elden in Berfrois:

Foucault’s Last Decade is a study of Foucault’s work between 1974 and his death in 1984. In 1974, Foucault began writing the first volume of his History of Sexuality, developing work he had already begun to present in his Collège de France lecture courses. In that first volume, published in late 1976, Foucault promised five further volumes, and indicated some other studies he intended to write. But none of those books actually appeared, and Foucault’s work went in very different directions. At the very end of his life, two further volumes of the History of Sexuality were published, and a fourth was close to completion. In contrast to the originally planned thematic treatment, the final version was a much more historical study, returning to antiquity and early Christianity. In this book, I trace these developments, and try to explain why the transition happened.

Foucault’s Last Decade has its roots as far back as the late 1990s. I had just finished a PhD thesis on Nietzsche, Heidegger and Foucault. Right at the end of that process Foucault’s courses from the Collège de France began to be published – the first in 1997, the second in 1999. I already knew how much Heidegger scholarship had been changed by the insights of his lecture courses and thought that the same would be true for Foucault. (Of course, with Heidegger, much more and much worse was to come with his notebooks.) I wrote a review essay on the second published Foucault course – The Abnormals – for the journal boundary 2, on the invitation of Paul Bové, and then Paul invited me to the University of Pittsburgh when I spoke about ‘Society Must Be Defended’, a text which was also published in boundary 2. I thought then that if I wrote something about each course as they came out, then in time there might be the raw materials for a book.

And so, on and off, in and around other projects, I read, spoke and sometimes wrote about most of Foucault’s courses as they appeared. Some of these were published here at Berfrois. Foucault taught at the Collège de France from late 1970 until his death in 1984. There were thirteen courses in total, but they were published in non-chronological order – the earliest courses presented the greatest editorial difficulties, and so were among the last to appear. The last of the Collège de France ones was published in 2015. Some courses from elsewhere and other material have also been published in the intervening years, and we now have far more material published since Foucault’s death than appeared in his lifetime. This, despite his wish for ‘no posthumous publications’ – a request that was once followed scrupulously, then generously interpreted and is now largely ignored.

More here.

The climate of history: Four theses


Dipesh Chakrabarty in Eurozine:

The current planetary crisis of climate change or global warming elicits a variety of responses in individuals, groups, and governments, ranging from denial, disconnect, and indifference to a spirit of engagement and activism of varying kinds and degrees. These responses saturate our sense of the now. Alan Weisman's best-selling book The World without Us suggests a thought experiment as a way of experiencing our present: “Suppose that the worst has happened. Human extinction is a fait accompli. […] Picture a world from which we all suddenly vanished. […] Might we have left some faint, enduring mark on the universe? […] Is it possible that, instead of heaving a huge biological sigh of relief, the world without us would miss us?” I am drawn to Weisman's experiment as it tellingly demonstrates how the current crisis can precipitate a sense of the present that disconnects the future from the past by putting such a future beyond the grasp of historical sensibility. The discipline of history exists on the assumption that our past, present, and future are connected by a certain continuity of human experience. We normally envisage the future with the help of the same faculty that allows us to picture the past. Weisman's thought experiment illustrates the historicist paradox that inhabits contemporary moods of anxiety and concern about the finitude of humanity. To go along with Weisman's experiment, we have to insert ourselves into a future “without us” in order to be able to visualize it. Thus, our usual historical practices for visualizing times, past and future, times inaccessible to us personally – the exercise of historical understanding – are thrown into a deep contradiction and confusion. Weisman's experiment indicates how such confusion follows from our contemporary sense of the present insofar as that present gives rise to concerns about our future. Our historical sense of the present, in Weisman's version, has thus become deeply destructive of our general sense of history.

I will return to Weisman's experiment in the last part of this essay. There is much in the debate on climate change that should be of interest to those involved in contemporary discussions about history. For as the idea gains ground that the grave environmental risks of global warming have to do with excessive accumulation in the atmosphere of greenhouse gases produced mainly through the burning of fossil fuel and the industrialized use of animal stock by human beings, certain scientific propositions have come into circulation in the public domain that have profound, even transformative, implications for how we think about human history or about what the historian C. A. Bayly recently called “the birth of the modern world”.

More here. And a response by Timothy J. LeCain.

How I Acted Like A Pundit And Screwed Up On Donald Trump


Nate Silver over at FiveThirtyEight:

Trump is one of the most astonishing stories in American political history. If you really expected the Republican front-runner to be bragging about the size of his anatomy in a debate, or to be spending his first week as the presumptive nominee feuding with the Republican speaker of the House and embroiled in a controversy over a tweet about a taco salad, then more power to you. Since relatively few people predicted Trump’s rise, however, I want to think through his nomination while trying to avoid the seduction of hindsight bias. What should we have known about Trump and when should we have known it?

It’s tempting to make a defense along the following lines:

Almost nobody expected Trump’s nomination, and there were good reasons to think it was unlikely. Sometimes unlikely events occur, but data journalists shouldn’t be blamed every time an upset happens, particularly if they have a track record of getting most things right and doing a good job of quantifying uncertainty.

We could emphasize that track record; the methods of data journalism have been highly successful at forecasting elections. That includes quite a bit of success this year. The FiveThirtyEight “polls-only” model has correctly predicted the winner in 52 of 57 (91 percent) primaries and caucuses so far in 2016, and our related “polls-plus” model has gone 51-for-57 (89 percent). Furthermore, the forecasts have been well-calibrated, meaning that upsets have occurred about as often as they’re supposed to but not more often.
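To make “well-calibrated” concrete, here is a minimal sketch of how one might check the calibration of probabilistic forecasts (an editorial illustration with made-up numbers; this is not FiveThirtyEight’s code or data):

```python
# Minimal calibration check for probabilistic forecasts (illustrative only;
# the forecasts below are invented and are not FiveThirtyEight's data).
from collections import defaultdict

# (stated win probability for the favorite, did the favorite win?)
forecasts = [
    (0.95, True), (0.90, True), (0.90, True), (0.85, False),
    (0.80, True), (0.75, True), (0.70, False), (0.70, True),
    (0.65, True), (0.60, False),
]

buckets = defaultdict(list)
for prob, won in forecasts:
    buckets[round(prob, 1)].append(won)   # group forecasts into coarse bins

# In a well-calibrated forecast, races given ~70% should be won ~70% of the
# time; upsets happen about as often as the stated probabilities imply.
for stated in sorted(buckets):
    outcomes = buckets[stated]
    observed = sum(outcomes) / len(outcomes)
    print(f"stated ~{stated:.0%}: favorite won {observed:.0%} of {len(outcomes)} races")
```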

But I don’t think this defense is complete — at least if we’re talking about FiveThirtyEight’s Trump forecasts. We didn’t just get unlucky: We made a big mistake, along with a couple of marginal ones.

The big mistake is a curious one for a website that focuses on statistics. Unlike virtually every other forecast we publish at FiveThirtyEight — including the primary and caucus projections I just mentioned — our early estimates of Trump’s chances weren’t based on a statistical model. Instead, they were what we sometimes called “subjective odds” — which is to say, educated guesses. In other words, we were basically acting like pundits, but attaching numbers to our estimates. And we succumbed to some of the same biases that pundits often suffer, such as not changing our minds quickly enough in the face of new evidence. Without a model as a fortification, we found ourselves rambling around the countryside like all the other pundit-barbarians, randomly setting fire to things.

More here.

The empty brain


Robert Epstein in Aeon:

No matter how hard they try, brain scientists and cognitive psychologists will never find a copy of Beethoven’s 5th Symphony in the brain – or copies of words, pictures, grammatical rules or any other kinds of environmental stimuli. The human brain isn’t really empty, of course. But it does not contain most of the things people think it does – not even simple things such as ‘memories’.

Our shoddy thinking about the brain has deep historical roots, but the invention of computers in the 1940s got us especially confused. For more than half a century now, psychologists, linguists, neuroscientists and other experts on human behaviour have been asserting that the human brain works like a computer.

To see how vacuous this idea is, consider the brains of babies. Thanks to evolution, human neonates, like the newborns of all other mammalian species, enter the world prepared to interact with it effectively. A baby’s vision is blurry, but it pays special attention to faces, and is quickly able to identify its mother’s. It prefers the sound of voices to non-speech sounds, and can distinguish one basic speech sound from another. We are, without doubt, built to make social connections.

A healthy newborn is also equipped with more than a dozen reflexes – ready-made reactions to certain stimuli that are important for its survival. It turns its head in the direction of something that brushes its cheek and then sucks whatever enters its mouth. It holds its breath when submerged in water. It grasps things placed in its hands so strongly it can nearly support its own weight. Perhaps most important, newborns come equipped with powerful learning mechanisms that allow them to change rapidly so they can interact increasingly effectively with their world, even if that world is unlike the one their distant ancestors faced.

Senses, reflexes and learning mechanisms – this is what we start with, and it is quite a lot, when you think about it. If we lacked any of these capabilities at birth, we would probably have trouble surviving.

But here is what we are not born with: information, data, rules, software, knowledge, lexicons, representations, algorithms, programs, models, memories, images, processors, subroutines, encoders, decoders, symbols, or buffers – design elements that allow digital computers to behave somewhat intelligently. Not only are we not born with such things, we also don’t develop them – ever.

More here.

Is Life Worth Living?

Catherine Hollis at Public Books:

Any number of recent memoirs—most, but not all, by women—face down the question James posed in his essay “Is Life Worth Living?” Should we go on living, and if so, what will our lives look like? If terrible things have happened to us, is healing possible? Each of the four writers under discussion here—Jessa Crispin, Jacqueline Rose, Rachel Moran, and Sandra Cisneros—confronts these questions to varying degrees. Constructing a life, like constructing a memoir or a biography, involves asking how to live. Any reader confined to her bed by depression may find whole communities of fellow travelers in the ranks of memoirs published during the so-called memoir boom. Best-selling examples of this trend include Elizabeth Gilbert’s Eat, Pray, Love (2006), Jeannette Walls’s Glass Castle (2006), and Cheryl Strayed’s Wild (2012). Although Walls and Strayed face serious issues including poverty, homelessness, and addiction, Gilbert’s own memoir has been taken to exemplify a genre of narrative we might call “first-world white girl problems.” It’s hard to feel sorry for someone who treats her depression with trips to Italy, India, and Bali, even if we empathize with her grief at ending a marriage.

Other notable, if less famous, entrants in the “Live Through This” genre include three memoirs about surviving difficult parents. Nick Flynn’s Another Bullshit Night in Suck City (2004) is probably the best known, and documents with lyrical honesty his relationship with his homeless father. Domenica Ruta’s With or Without You (2013) is a stunning portrait of a mother-daughter relationship shaped by addiction and violence in an Italian working-class neighborhood near Boston. And Ariel Gore (publisher of Hip Mama) writes tenderly about caregiving and emotional boundaries in The End of Eve (2014), when she finds herself tending to an insane and impossible mother dying from cancer.

more here.

How Photomontage Ended the Lebanese Civil War

Rasha Salti at The Brooklyn Rail:

Officially, the Lebanese Civil War ended when the Taif Agreement was ratified (November 1989), the Lebanese parliament voted to adopt an Amnesty Law (March 1991), and militias were dissolved (May 1991). The Amnesty Law pardoned all political crimes committed prior to its enactment. I, for myself, cannot recall a specific moment or event when the war ended. The transition from the state of war to a state of non-war is blurred in my memory. It was gradual, and even though the country felt relatively safer, tension pervaded the atmosphere and the looming threat of the conflict reigniting remained. The actors of the Civil War—those still alive, as well as the ghosts—transformed from warlords to political leaders, populating the legislative and executive bodies of governance. It was as if we were all handed a new, additional script, and we, mutatis mutandis, all slipped into new roles. The militias became political parties with an overt agenda to strictly serve their sectarian constituencies, even as the militants upheld the claim that they were constitutive protagonists of a republic. And we, perpetrators and victims, everyday folks, claimed to be citizens of this republic. The longest serial ever performed: for the past twenty-five years, we have been voting for these very same leaders to govern our destinies—to ensure our safety, well-being, and prosperity. It has not been smooth sailing; the “old script” surges every once in a while, in relative degrees of intensity. It was and remains necessary to believe that the war is over, and that we all wanted it to end. In the span of twenty-five years, the theatricality deployed to that end has been superlative and tireless, with public showcases of redemption, songs, music videos, and speeches to cheer our resolve to end the war.

more here.

Is this the end of sex?

Philip Ball at The New Statesman:

Is it time to give up sex? Oh, it has plenty to recommend it; but as a way of making babies it leaves an awful lot to chance. I mean, you might have some pretty good genes, but – let’s face it – some of them aren’t so great. Male pattern baldness, phenylketonuria, enhanced risk of breast cancer: I’m not sure you really want those genetic conditions passed on in the haphazard shuffling of chromosomes after sperm meets egg.

It is already possible to avoid more than 250 grave genetic conditions by genetic screening of few-days-old embryos during in vitro fertilisation (IVF), so that embryos free from the genetic mutation responsible can be identified for implantation. But that usually works solely for diseases stemming from a single gene – of which there are many, though most are rare. The procedure is called pre-implantation genetic diagnosis (PGD), and it is generally used only by couples at risk of passing on a particularly nasty genetic disease. Otherwise, why go to all that discomfort, and possibly that expense, when the old-fashioned way of making babies is so simple and (on the whole) fun?

In The End of Sex, Henry Greely, a law professor and bioethicist at Stanford University, argues that this will change. Thanks to advances in reproductive and genetic technologies, he predicts that PGD will become the standard method of conception in a matter of several decades. (Recreational sex might nonetheless persist.)

more here.

The spectrum of sex development: Eric Vilain and the intersex controversy

Sara Reardon in Nature:

As a medical student in Paris in the 1980s, Eric Vilain found himself pondering the differences between men and women. What causes them to develop differently, and what happens when the process goes awry? At the time, he was encountering babies that defied simple classification as a boy or girl. Born with disorders of sex development (DSDs), many had intermediate genitalia — an overlarge clitoris, an undersized penis or features of both sexes. Then, as now, the usual practice was to operate. And the decision of whether a child would be left with male or female genitalia was often made not on scientific evidence, says Vilain, but on practicality: an oft-repeated, if insensitive, line has it that “it's easier to dig a hole than build a pole”. Vilain found the approach disturbing. “I was fascinated and shocked by how the medical team was making decisions.”

Vilain has spent the better part of his career studying the ambiguities of sex. Now a paediatrician and geneticist at the University of California, Los Angeles (UCLA), he is one of the world's foremost experts on the genetic determinants of DSDs. He has worked closely with intersex advocacy groups that campaign for recognition and better medical treatment — a movement that has recently gained momentum. And in 2011, he established a major longitudinal study to track the psychological and medical well-being of hundreds of children with DSDs. Vilain says that he doesn't seek out controversy, but his research seems to attract it. His studies on the genetics of sexual orientation — an area that few others will touch — have attracted criticism from scientists, gay-rights activists and conservative groups alike. He is also a medical adviser for the International Olympic Committee, which about five years ago set controversial rules by which intersex individuals are allowed to compete in women's categories.

More here.