Jerry Coyne catches Sean Carroll in a subtle error

On his blog, Why Evolution Is True, Jerry Coyne discusses Sean Carroll's interview in the New York Times:

Carroll is a smart and amiable guy, and gives a good interview. There’s one place, however, where I think he misses the mark. That’s where he discusses the effect of time’s directionality on biology, specifically ageing:

Q. THE CENTERPIECE OF THE RECENT MOVIE “BENJAMIN BUTTON” AND THE ABC TELEVISION SERIES “FLASH FORWARD” IS TIME TRAVEL. HOW DO YOU RATE THE SCIENCE OF THOSE ENTERTAINMENTS?

A. Well, the Benjamin Button character ages in reverse. In “Flash Forward” people glimpse the future. These are great story-telling devices.

But the writers can’t resist the temptation to bend the rules. If time travel were possible, you still wouldn’t be able to change the past — it’s already happened! Benjamin Button, he’s born old and his body grows younger. That can’t be true because being younger is a very specific state of high organization. A body accumulates various failures and signs of age because of the arrow of time.

But I don’t think that entropy (at least in bodies) is the only solution here, or even an important solution, for it’s perfectly possible for a body to be immortal, and some plants (and bdelloid rotifers, which appear to reproduce entirely asexually) have approached physical immortality. There’s far more to ageing than just “the arrow of time.” Indeed, the inexorable increase in entropy encapsulated in the second law of thermodynamics, a law that holds over the whole universe, is only seemingly violated locally by two biological phenomena: development and evolution.

More here. [Sean admits that ageing is too complicated to be explained by entropy alone.]
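A quick sketch of the thermodynamic bookkeeping behind this exchange (standard second-law reasoning, not a claim made by either Carroll or Coyne): an organism is an open system, so it can grow more organized locally as long as it exports at least as much entropy to its surroundings as heat and waste. In symbols,

$$\Delta S_{\text{total}} = \Delta S_{\text{body}} + \Delta S_{\text{surroundings}} \ge 0,$$

which permits $\Delta S_{\text{body}} < 0$ whenever $\Delta S_{\text{surroundings}} \ge |\Delta S_{\text{body}}|$. On this accounting, development and evolution only appear to run against the arrow of time; the second law holds for the body plus its environment.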

WHAT DO PHILOSOPHERS BELIEVE?

Anthony Gottlieb in More Intelligent Life:

There was once a website on which academic philosophers listed the curious things that strangers had said to them upon learning that they were in the presence of a philosopher. The following conversation allegedly took place on an aeroplane:
“May I ask you a question?”
“Yes.”
“It’s a philosophical question. Is that ok?”
“Sure.”
“There’s a boy I fancy. Should I text him or e-mail him?”
In a similar vein, also from the skies:
“What do you do?”
“I’m a philosopher.”
“What are some of your sayings, then?”
This exchange makes professional philosophers titter, because their daily work is far removed from the production of sage utterances. But the request for “sayings” was not an unreasonable one. The great philosophers of old are remembered largely by their posthumous contributions to dictionaries of quotations. How is an ordinary person to know what today’s professional philosophers think?
One answer – a novel one, it seems – comes from a new survey of philosophers’ views. A preliminary analysis of the results has been published in an electronic journal, PhilPapers. Unfortunately, however, the survey was written for philosophy nerds. So here is a translation for airline passengers.
More here. [Thanks to Stefany Anne Golberg.]

There Was Never Any Pay-day For the Negroes

From History Matters:

As slavery collapsed at the close of the Civil War, former slaves quickly explored freedom’s possibilities by establishing churches that were independent of white control, seeking education in Freedmen’s Bureau schools, and even building and maintaining their own schools. Many took to the roads as they sought opportunities to work and to reconstitute their families. Securing their liberty meant finding the means of support to obtain land or otherwise benefit from their own labor, as Jourdon Anderson made clear in this letter to his former owner. He addressed Colonel Anderson from Ohio, where he had secured good wages for himself and schooling for his children. Many freedpeople argued that they were entitled to land in return for their years of unpaid labor and looked to the federal government to help achieve economic self-sufficiency. Black southerners understood the value of their own labor and looked for economic independence and a free labor market in their battle over the meaning of emancipation in post-Civil War America.

Dayton, Ohio, August 7, 1865

To My Old Master, Colonel P.H. Anderson, Big Spring, Tennessee

Sir: I got your letter and was glad to find you had not forgotten Jourdon, and that you wanted me to come back and live with you again, promising to do better for me than anybody else can. I have often felt uneasy about you. I thought the Yankees would have hung you long before this for harboring Rebs they found at your house. I suppose they never heard about your going to Col. Martin’s to kill the Union soldier that was left by his company in their stable. Although you shot at me twice before I left you, I did not want to hear of your being hurt, and am glad you are still living. It would do me good to go back to the dear old home again and see Miss Mary and Miss Martha and Allen, Esther, Green, and Lee. Give my love to them all, and tell them I hope we will meet in the better world, if not in this. I would have gone back to see you all when I was working in the Nashville Hospital, but one of the neighbors told me Henry intended to shoot me if he ever got a chance.

Read the rest of this remarkable letter here. [Thanks to David Schneider.]

Friday, April 23, 2010

Pocket Protectors and Politics: Is (Stephen Jay Gould’s) Science Political?

Chadwick Jenkins in Popmatters (via bookforum):

[S]o we come to the subject of this essay: David F. Prindle’s well-intentioned but deeply flawed effort, Stephen Jay Gould and the Politics of Evolution. Prindle’s project is one that I cannot help but applaud. He endeavors to expose the deep connections between the political and the scientific beliefs of a prominent public figure.

By all rights, this ought to be a stellar book, but it quickly flounders in its inability to forge any real causal (or even implicative) connection between Gould’s politics and his science. Indeed, Prindle offers us an almost immediate opportunity to gauge his failure by providing a rare glance into the process of publication. He reproduces two reader reviews that he received from a publisher other than Prometheus Books.

One reader excoriates Prindle’s project as completely wrong-minded insofar as science is an objective pursuit and politics play no role. The other reader revealingly claims that Gould’s “political and social views biased his scientific views” and that these “social attitudes… led him to his exaggerated views on the role of chance in evolution” (p. 12; emphases mine).

Now this leaves Prindle in a rather precariously moderate position, and in this case the moderate position is not necessarily the rational one. On the one hand, reader 1 disavows any linkage between the scientific and the political. On the other hand, reader 2 basically disavows anything resembling the scientific. Everything is simply political.

Both worldviews, at least, can claim to be coherent. Prindle, in attempting to forge a weakly buttressed middle ground, finds himself steeped in contradiction. The problem arises from the “Credo” he includes in his introduction, in which he proclaims that he believes that “there is such a thing as objective reality in nature, independent of the human mind” (p. 12). Only a few short pages later, Prindle asserts that Gould’s “scientific ideas were seamlessly wedded to his political positions, so that his methodological and philosophical stance always buttressed his political values and vice-versa” (emphasis mine).

Let us think about this carefully for a moment, something that Prindle obviously never bothered to do. There are only a few possibilities here:

1) Gould’s pursuit of scientific method led him to certain political beliefs (Prindle explicitly denies this);

2) Gould pursued science in a more or less purist manner and his scientific insights fortuitously coincided with his political outlook, in which case the science is all that truly matters (this approximates the first reader response that Prindle rejects);

3) Gould held certain political beliefs and he twisted his scientific outlook to buttress those beliefs (this conforms to the second reader response that Prindle’s Credo contradicts);

4) Gould’s political beliefs predisposed him to certain insights into the truth; and finally

5) by some magic coincidence, Gould’s science and his politics not only coincided but were mutually reinforcing.

Since Prindle rejects the first three positions, that leaves us only with the last two.

Ending the Slavery Blame-Game

Henry Louis Gates Jr. in the NYT:

For centuries, Europeans in Africa kept close to their military and trading posts on the coast. Exploration of the interior, home to the bulk of Africans sold into bondage at the height of the slave trade, came only during the colonial conquests, which is why Henry Morton Stanley’s pursuit of Dr. David Livingstone in 1871 made for such compelling press: he was going where no (white) man had gone before.

How did slaves make it to these coastal forts? The historians John Thornton and Linda Heywood of Boston University estimate that 90 percent of those shipped to the New World were enslaved by Africans and then sold to European traders. The sad truth is that without complex business partnerships between African elites and European traders and commercial agents, the slave trade to the New World would have been impossible, at least on the scale it occurred.

Advocates of reparations for the descendants of those slaves generally ignore this untidy problem of the significant role that Africans played in the trade, choosing to believe the romanticized version that our ancestors were all kidnapped unawares by evil white men, like Kunta Kinte was in “Roots.” The truth, however, is much more complex: slavery was a business, highly organized and lucrative for European buyers and African sellers alike.

The African role in the slave trade was fully understood and openly acknowledged by many African-Americans even before the Civil War. For Frederick Douglass, it was an argument against repatriation schemes for the freed slaves. “The savage chiefs of the western coasts of Africa, who for ages have been accustomed to selling their captives into bondage and pocketing the ready cash for them, will not more readily accept our moral and economical ideas than the slave traders of Maryland and Virginia,” he warned. “We are, therefore, less inclined to go to Africa to work against the slave trade than to stay here to work against it.”

Brad DeLong comments:

1. The first generations of African kings who began selling slaves to European traders for guns indeed did not know what slavery was like on the other side of the Middle Passage–they thought it was like slavery in Africa, where if you become a slave you then become part of a single household in which you have roughly the status of the very poor third cousin. But slavery in the Caribbean was a much harsher and more vicious institution–as capitalist slavery driven by production of staple cash crops so often is.

2. Once the slave trade was started and once the kings of Africa knew what they were doing, no individual African kingdom along the coast can back off and stop. If it does, the guns and ammunition stop coming–and it gets conquered in short order by its coastal neighbors who are still engaged in the slave trade.

3. Only when European consumer demand for Caribbean staple crops appears–only when the profits from slave agriculture and thus slave-raiding become really large–is it worth African kings' while to start substantial slave-raiding in the interior (and is it worth the Europeans' while to start shipping people across the Middle Passage).

That Henry Louis Gates makes fun of these arguments doesn't make them untrue.

And so I reject the quitclaim deed he offers: just because there were people with skin of another color on another continent who aided and conspired with my ancestors in their crimes does not mean that I am quits of all obligations as I sit here still enjoying the fruits of their crimes.

The pleasure principle he despised


Suddenly Rothko is everywhere, and it’s safe to say he would have liked that. In New York, everyone wants to see Red, John Logan’s play about the artist. In London, the play and Alfred Molina’s confrontational performance for the Donmar Warehouse predictably triggered an outbreak of eye-rolling among the Art Classes. The prospect of yet another melodrama featuring a heroically tormented painter trowelling on the angst in heavy pigment, and monologues about nailing the Tragic to the canvas, brought on an attack of sneering at romantic platitudes; much muttering about Sturm und Drang for the middlebrow. But on the other side of the Atlantic, Rothko’s own side, the play has been received as deep, dark and moving, much like the artist’s late works. As it happens, that’s the right response. Whether the myths make the man or vice-versa, there are some artists who actually do live with the old burden of the melancholy temperament, richly chronicled in Rudolf and Margot Wittkower’s wonderful 1963 anthology of artistic gloom, Born Under Saturn. Rothko’s humour had more than its fair share of black bile, and wherever his painterly impulse took him, it was, by his own account, always engaged in the struggle to register the sacrificial and destructive habits ingrained in the human condition.

more from Simon Schama at the FT here.

the other adam smith


The Theory of Moral Sentiments, Adam Smith’s first book, was published in early 1759. Smith, then a young professor at the University of Glasgow, had some understandable anxiety about the public reception of the book, which was based on his quite progressive lectures. On 12 April, Smith heard from his friend David Hume in London about how the book was doing. If Smith was, Hume told him, prepared for “the worst”, then he must now be given “the melancholy news” that unfortunately “the public seem disposed to applaud [your book] extremely”. “It was looked for by the foolish people with some impatience; and the mob of literati are beginning already to be very loud in its praises.” This light-hearted intimation of the early success of Smith’s first book was followed by serious critical acclaim for what is one of the truly outstanding books in the intellectual history of the world. After its immediate success, Moral Sentiments went into something of an eclipse from the beginning of the 19th century, and Smith was increasingly seen almost exclusively as the author of his second book, An Inquiry into the Nature and Causes of the Wealth of Nations, which, published in 1776, transformed the subject of economics. The neglect of Moral Sentiments, which lasted through the 19th and 20th centuries, has had two rather unfortunate effects.

more from Amartya Sen at The New Statesman here.

Transgression was alarmingly present


When I was young, I thought that Hillel’s “do unto others as you would have others do unto you”—the Golden Rule—was all the political theory necessary to make the world a good place in which to live. The Rule demanded only that I honor the same irreducible humanity in my fellows that I identified in myself. Just as I understood that I wished—no, needed—to not be dismissed, discounted, or traduced; restrained, cheated, or humiliated; robbed, raped, or murdered; so I understood that all other persons needed the same. This practice alone would provide the equality necessary to make all of us see ourselves in one another. Equality. The word itself moved me, made my heart sing. Instinctively, I felt that equality was the key to human comradeship. In fact, I thought that it almost didn’t matter how impoverishing or threatening a circumstance might be, so long as it was experienced equally. Repeatedly, throughout history, it had been shown that the most soul-destroying of conditions—wars, plagues, depressions—could be borne if shared equally. (A recent study, published in the United States as The Spirit Level: Why Greater Equality Makes Societies Stronger, actually argues that everything—life expectancy, infant mortality, obesity levels, crime rates, literacy scores, even garbage collection—improves in societies that are more rather than less equal). It was inequality, I was certain, that did the damage; inequality that destroyed one’s innate sense of self-worth.

more from Vivian Gornick at The Boston Review here.

Barack Obama and the question of race

Howard W French in The National:

Just who is Barack Obama?

Fifteen months into his presidency, we may have acquired an intuitive sense of the answer to this question, and yet Obama remains elusive, like a fidgety subject posing for a daguerreotype. He nods and bobs forward and back, in and out of focus, never altogether fixed.

By now we have all been sufficiently exposed to the Obama act to suspect real method. The recent passage of major healthcare reform presents one case in point: early in his term, Obama placed healthcare at the centre of his domestic agenda, and yet he long seemed content to avoid defining his own parameters for the reform, or even, for that matter, establishing a bottom line.

Along the way, compromise with irredentist Republicans was treated as an almost sacred virtue – maddeningly so for Obama supporters, who began to suspect that he was weak, or worse, fired by insipid conciliatory instincts. Until, at the 11th hour, the president revealed a hitherto unseen mailed fist, and the bill was pushed through Congress without a single supporting vote from an opposition that had been marginalised by its very refusal to negotiate.

The key to this unusual style, if one is to be found, would seem to exist in Obama’s own life story, uncommonly rich in crossed genes and mixed signals. This story has now received its third major retelling, in the form of a massive new biography by the New Yorker editor David Remnick.

More here.

Cruel Ethiopia

Helen Epstein in the New York Review of Books:

Parts of southern Ethiopia resemble the scenery in a Tarzan movie. When I was there last fall, the green forested hills were blanketed in white mist and rain poured down on the small farms and homesteads. In the towns, slabs of meat hung in the butchers’ shops and donkeys hauled huge sacks of coffee beans, Ethiopia’s major export, along the stony dirt roads. So I was surprised to see the signs of hunger everywhere. There were babies with kwashiorkor, a disease caused by malnutrition, which I’d assumed occurred only in war zones. Many of the older children were clearly stunted and some women were so deficient in iodine they had goiters the size of cannonballs.

This East African nation, famous for its ancient rock-hewn churches, Solomonic emperors, and seemingly intractable poverty, has a long history of famine. But I had always assumed that food shortages were more common in the much drier north of the country than in the relatively fertile south. Although rainfall throughout Ethiopia had been erratic in 2008 and 2009, the stunting and goiter I saw were signs of chronic malnutrition, which had clearly existed for many years.

What was causing it?

More here.

Sounds Make Memories Stick During Sleep

From Scientific American:

MONTREAL—A good night's sleep, or even just a nap, can be an aid to memory. Psychologists have known for years that sleep solidifies what we've learned during the day, transforming tenuous associations into stable ones. Learning while you snooze seems supremely efficient, and so people have long dreamed of co-opting this process so that their dozing brain shores up what matters to them—say, material they've studied for a test or a talk, or verbiage in a foreign language they want to master. But until now there has been little support for the notion that studying in your sleep is useful. Psychology graduate student John Rudoy at Northwestern University in Illinois reported findings here on Monday at the Cognitive Neuroscience Society 2010 annual meeting that hint at a way to do that.

Rudoy, who works in neuroscientist Ken Paller's group, and his colleagues showed study participants 50 photographs and asked them to memorize where each one appeared on a computer screen. To help the participants remember the locations, the researchers asked them to practice moving each picture to where they thought it had appeared, and after they’d made their move, showed them the picture's correct location. In addition, the participants were taught to associate each photograph with a distinct sound—say, a chirp, ring, buzz or tone—that was related to the image. For example, the sound of an object hitting the water accompanied a picture of a splash. The participants then took a nap lasting for up to 90 minutes in an easy chair in the laboratory. As they dozed, the investigators exposed the subjects to 25 distinct sounds—the ones they had associated with half of the photographs. When the nappers woke up, they again tried to move each of the 50 photographs to its previously assigned spot on a screen.

More here.

Dreams Linked to Better Memories

From Science:

You probably don't need a neuroscientist to tell you that sleep helps your brain absorb new information. But what about dreams? Some researchers have speculated that they, too, might improve memory. Now, a new study provides some of the first experimental evidence: People who dreamed about a virtual reality maze they'd encountered a few hours earlier were quicker to find a way out when tested a second time. Lots of studies have suggested that our brains are busy while we sleep, consolidating memories of the day's events and putting them in the context of things we already know. In sleeping rodents, for example, neurons in the hippocampus fire in patterns remarkably similar to those recorded during a previous maze-running session—almost as if the animals replay the experience in their sleep. Some researchers wondered whether the rodents were dreaming of the maze, but of course there was no way to ask them.

So neuroscientists Erin Wamsley, Robert Stickgold, and colleagues at Harvard Medical School in Boston turned to a more verbal species: Harvard undergraduates. Participants in the study sat at a computer for 45 minutes and played with a virtual reality maze (see image). During this time, the researchers tested their memory by asking them to remember a particular object in the maze and find their way back to it from various starting points chosen at random. Fifty of the 99 participants then had the opportunity to take a nap while the others watched videos. The researchers used electroencephalography to monitor the brain activity of the napping students and either woke them once to ask about the content of any dreams or asked them at the end of their naps. Not surprisingly, people who took a nap improved more on the maze—as judged by the speed with which they found requested objects—than did those who stayed awake. But the four students who reported thoughts of the maze just as they were falling asleep or dreams of the maze during their nap improved on their previous performance about 10 times more, on average, than other nappers did, the researchers report online today in Current Biology.

More here.

Thursday, April 22, 2010

The Improbability Pump

Jerry A. Coyne in The Nation:

Imagine for a moment that a large proportion of Americans–let's say half–rejected the “germ theory” of infectious disease. Maladies like swine flu, malaria and AIDS aren't caused by micro-organisms, they claim, but by the displeasure of gods, whom they propitiate by praying, consulting shamans and sacrificing goats. Now, you'd surely find this a national disgrace, for those people would be utterly, unequivocally wrong. Although it's called germ theory, the idea that infections are spread by small creatures is also a fact, supported by mountains of evidence. You don't get malaria unless you carry a specific protozoan parasite. We know how it causes the disease, and we see that when you kill it with drugs, the disease goes away. How, we'd ask, could people ignore all this evidence in favor of baseless superstition?

But that's fiction, right? Well, not entirely, for it applies precisely to another “theory” that is also a fact: the theory of evolution. Over the past quarter-century, poll after poll has revealed that nearly half of all Americans flatly reject evolution, many clinging to the ancient superstition that the earth was created only 6,000 years ago, complete with all existing species.

More here.

Representing Mother******s


In the year 867, a new portrait mosaic of the Virgin Mary & Son was unveiled in the apse of the Hagia Sophia — the seat of the Eastern Orthodox Catholic Church in Istanbul (then Constantinople) — homilized by the Armenian-born soon-to-be–Patriarch Photios as a victory over Iconoclasm: the almost-century-long proscription on depiction that had rocked the ages-old Byzantine art world to its foundations. This forbearance of graven images was (and remains) one of the most profound differences between Islam and its Great Satanic neighbors. At least until the advent of modernism, and the eruption of abstract painting — just about a century ago. The craving for persuasive facsimiles of human bodies has reached a recent epitome with James Cameron’s Avatar, a realization that makes one yearn for the rigorous formalism of the Taliban — or at least Clement Greenberg. Mid-20th-century critic Greenberg’s successful championing of the abstract expressionists and insistence on the transcendental flatness of the painterly medium have just reached an epitome of their own in the most awesome sheet of U.S. postage in recent memory, including dollhouse-ready reproductions of Jackson Pollock’s Convergence (1952), Willem de Kooning’s Asheville (1948) and eight other iconic images of the New York School.

more from Doug Harvey at the LA Weekly here.

shakespeare was shakespeare


What, aside from international fame, did Mark Twain, Helen Keller, Henry James, Sigmund Freud, Charlie Chaplin and Orson Welles have in common? The answer is that they all believed that the plays and poems attributed to William Shakespeare were really written by someone else. The first three belong to the classic “Baconian” era of the late nineteenth and early twentieth centuries, when the claims of Sir Francis Bacon’s authorship were uppermost; and were argued most vociferously in America. Freud and Welles were more modern “Oxfordians”, believing the true author to be Edward de Vere, 17th Earl of Oxford, as first proposed by J. Thomas Looney in 1920. Chaplin was a floating voter, a generic “anti-Stratfordian”. He did not know who wrote the plays, he explained in his 1964 autobiography, “but I can hardly think it was the boy from Stratford. Whoever wrote them had an aristocratic attitude”. These are essentially celebrity endorsements: none of the above, with the possible exception of Freud, could be called a Shakespeare scholar. It is an impressive list but also a very elderly one. One could continue it through to the present day (Malcolm X, Enoch Powell, Derek Jacobi, Mark Rylance, Jim Jarmusch . . .), but those early big names look back to the heyday of the authorship controversy, when the anti-Stratfordian cause seemed daring and even excitingly modern in its challenge to traditional (and, from the American point of view, to English) orthodoxy. And if to many it also seemed barmy, it was a flamboyant, newsworthy sort of barminess. Some of the frontline Baconian theorists were themselves minor celebrities, eccentric exemplars of the epoch’s passion for discovery, on a par with wild-haired inventors and staring-eyed explorers in search of lost cities.

more from Charles Nicholl at the TLS here.

Thursday Poem

Wicked Woman

If you come to my city
you are bound to find
my name in the roster
of wicked women
I have all that it takes
to be as wicked
as they come
I have a goblet
brimming over
in my hand
My laughter is known
for its abandon
Flames find a home
in my mouth
My heart beats and
every nerve does
a little dance
The road is at my feet
And just the sky above
I have the courage to bear
and express myself without fear

by Nirupama Dutt
Translation by author

Beyond the Brain

From The National Geographic:

The ancient Egyptians thought so little of brain matter they made a practice of scooping it out through the nose of a dead leader and packing the skull with cloth before burial. They believed consciousness resided in the heart, a view shared by Aristotle and inherited by medieval thinkers. Even when consensus for the locus of thought moved northward into the head, it was not the brain that was believed to be the sine qua non, but the empty spaces within it, called ventricles, where ephemeral spirits swirled about. As late as 1662, philosopher Henry More scoffed that the brain showed “no more capacity for thought than a cake of suet, or a bowl of curds.” Around the same time, French philosopher René Descartes codified the separation of conscious thought from the physical flesh of the brain. Cartesian “dualism” exerted a powerful influence over Western science for centuries, and while dismissed by most neuroscientists today, still feeds the popular belief in mind as a magical, transcendent quality.

A contemporary of Descartes named Thomas Willis—often referred to as the father of neurology—was the first to suggest that not only was the brain itself the locus of the mind, but that different parts of the brain give rise to specific cognitive functions. Early 19th-century phrenologists pushed this notion in a quaint direction, proposing that personality proclivities could be deduced by feeling the bumps on a person's skull, which were caused by the brain “pushing out” in places where it was particularly well developed. Plaster casts of the heads of executed criminals were examined and compared to a reference head to determine whether any particular protuberances could be reliably associated with criminal behavior.

More here.

Changing the Dating Game

From Scientific American:

Women are much choosier than men when it comes to romance. This is well known, but the reason for this gender difference is unclear. Evolutionary psychologists think it is because back in prehistoric times “dating” was much riskier for women. Men who made an ill-advised choice in the ancient version of a singles bar simply had one lousy night. Women who chose unwisely could end up facing years of motherhood without the critical help that a stable partner would have provided.

That is less true today, yet women remain much more selective. Is this difference a vestige of our early ancestry? Or might it be totally unrelated to reproductive risk, the result of something more modern and mundane? A couple of Northwestern University psychologists, Eli J. Finkel and Paul W. Eastwick, decided to explore this question in an unusual laboratory: a real-life speed-dating event.

More here.