The Moral Significance of Sex Workers and People With Disabilities

Tauriq Moosa in Big Think:

When prostitution cases are brought before a judge in Britain, a particular kind of “John” (or customer) will almost always have the case tossed out of court: that is, if the customer is a person with a disability. So explains a member of the group TLC Trust to me: a group that defends and promotes the interaction between sex workers and people with disabilities – TLC, as you’ll see, and similar groups, have become my new favourite advocacy groups. This powerful statement, of judges dismissing almost out of hand any prostitution case involving a person with a physical disability, sets an important moral precedent that I think we all ought to follow.

For many, voluntary sex work – that is, sex work done by people who are not physically forced* or blackmailed into it – is inherently wrong, for reasons I find extremely wanting (unless they mean sex trafficking, in which case we’re not talking about the same thing. Please see the notes below for more). But many, including judges, accept that there is a unique situation when it comes to the relationship between sex workers and people with disabilities**. However, what I perceive in sex workers and this relationship is an element of morality worth emulating and promoting – and thus, ultimately, a way of treating both sex workers and people with disabilities with the respect both groups deserve, as persons with interests that warrant wider respect and, indeed, admiration.

More here.

The talent myth: How to maximise your creative potential

From The Independent:

If you thought that geniuses were born not bred, you'd be wrong, says Daniel Coyle. He visited centres of excellence across the world and discovered that, if we all just followed a few key rules, success could be ours for the taking.

A few years back, on an assignment for a magazine, I began visiting talent hotbeds: tiny places that produce large numbers of world-class performers in sports, art, music, business, maths, and other disciplines. My research also took me to a different sort of hotbed: the laboratories and research centres around the country investigating the new science of talent development. For centuries, people have instinctively assumed that talent is largely innate, a gift given out at birth. But now, thanks to the work of a wide-ranging team of scientists, including Dr K Anders Ericsson, Dr Douglas Fields, and Dr Robert Bjork, the old beliefs about talent are being overturned. In their place, a new view is being established, one in which talent is determined far less by our genes and far more by our actions: specifically, the combination of intensive practice and motivation that produces brain growth.

… What follows is a collection of simple, practical tips – all field-tested and scientifically sound – for improving skills, taken directly from the hotbeds I visited and the scientists who research them.

1. Stare at who you want to become

If you were to visit a dozen talent hotbeds tomorrow, you would be struck by how much time the learners spend observing top performers. When I say observing, I'm not talking about passively watching. I'm talking about staring – the kind of raw, unblinking, intensely absorbed gazes you see in hungry cats or newborn babies.

More here.

Saturday Poem

I'm Alone and You're in a Bottle

In you, empty blue bottle on the windowsill,
people walk on a paved sky,
turn a swimming, sun-stroked periwinkle.
Birds fly backwards and upside-down,
traffic is truncated, tiny, curving into nothingness.

Sunlight filters through, and you,
open-mouthed and tinted blue, are learning
the world's so silly,
and nothing sticks around long enough.

I know, I've been at the window too,
standing there all blue, watching
the world come and go,
unable to hold on to any of it

by Angela Rydell
from Barrow Street, Winter 2001

Debunking the Hunter-Gatherer Workout

From The New York Times:

DARWIN isn’t required reading for public health officials, but he should be. One reason that heart disease, diabetes and obesity have reached epidemic levels in the developed world is that our modern way of life is radically different from the hunter-gatherer environments in which our bodies evolved. But which modern changes are causing the most harm?

Many in public health believe that a major culprit is our sedentary lifestyle. Faced with relatively few physical demands today, our bodies burn fewer calories than they evolved to consume — and those unspent calories pile up over time as fat. The World Health Organization, in discussing the root causes of obesity, has cited a “decrease in physical activity due to the increasingly sedentary nature of many forms of work, changing modes of transportation and increasing urbanization.”

This is a nice theory. But is it true? To find out, my colleagues and I recently measured daily energy expenditure among the Hadza people of Tanzania, one of the few remaining populations of traditional hunter-gatherers. Would the Hadza, whose basic way of life is so similar to that of our distant ancestors, expend more energy than we do? Our findings, published last month in the journal PLoS ONE, indicate that they don’t, suggesting that inactivity is not the source of modern obesity.

…All of this means that if we want to end obesity, we need to focus on our diet and reduce the number of calories we eat, particularly the sugars our primate brains have evolved to love. We’re getting fat because we eat too much, not because we’re sedentary. Physical activity is very important for maintaining physical and mental health, but we aren’t going to Jazzercise our way out of the obesity epidemic. We have a lot more to learn from groups like the Hadza, among whom obesity and heart disease are unheard of and 80-year-old grandmothers are strong and vital. Finding new approaches to public health problems will require further research into other cultures and our evolutionary past.

More here.

Friday, August 24, 2012

Playboy Interview: Richard Dawkins

Chip Rowe in Playboy:

PLAYBOY: Your call for militant atheism is one reason you were featured as a character on an episode of South Park. The show’s creators, Trey Parker and Matt Stone, had been accused of being atheists, so they thought of the most militant atheist they could skewer.

DAWKINS: It’s the only South Park episode I’ve seen. There was an attempt at something approaching satire in the idea of an imagined future in which different sects of atheists are fighting each other. But most of that episode was ridiculous in the sense that what they had the cartoon figure of me doing, like buggering the bald transvestite——

PLAYBOY: Transsexual, actually.

DAWKINS: Transsexual, okay. That isn’t satire because it has nothing to do with what I stand for. And the scatological part, where they had somebody throwing shit, which stuck to my forehead—that’s not even funny. I don’t understand why they couldn’t go straight to the atheists fighting each other, which has a certain amount of truth in it. It reminded me of the bit from Monty Python’s Life of Brian with the Judean People’s Front and the People’s Front of Judea.

PLAYBOY: President Obama acknowledged “nonbelievers” in his inaugural address, which caused a fuss. But when you consider religious belief, one of the largest groups in the U.S. is atheists and agnostics. Why do they get overlooked in political discussions?

DAWKINS: It’s a good point. Of course, it depends how you slice it. Christians are by far the largest group. If you divide Christians into denominations, agnostics and atheists come in third, behind Catholics and Baptists. That’s interesting when you contrast it with the lack of influence of nonbelievers. And if you count up the number of Jews, certainly observant Jews, it’s much smaller than the number of nonbelievers. Yet Jews have tremendous influence. I’m not criticizing that—bully for them. But we could do the same.

More here.

The worst art restoration project of all time

Raphael Minder in the New York Times:

An elderly woman stepped forward this week to claim responsibility for disfiguring a century-old “ecce homo” fresco of Jesus crowned with thorns, in Santuario de la Misericordia, a Roman Catholic church in Borja, near the city of Zaragoza.

Ecce homo, or behold the man, refers to an artistic motif that depicts Jesus, usually bound and with a crown of thorns, right before his crucifixion.

The woman, Cecilia Giménez, who is in her 80s, said on Spanish national television that she had tried to restore the fresco, which she called her favorite local representation of Jesus, because she was upset that parts of it had flaked off due to moisture on the church’s walls.

The authorities in Borja said they had suspected vandalism at first, but then determined that the shocking alterations had been made by an elderly parishioner. The authorities said she had acted on her own.

But Ms. Giménez later defended herself, saying she could not understand the uproar because she had worked in broad daylight and had tried to salvage the fresco with the approval of the local clergy. “The priest knew it,” she told Spanish television. “I’ve never tried to do anything hidden.”

More here.

The subtle perils of Eurocentrism

Mihir S. Sharma in the Business Standard:

Few events have so up-ended the established order as Japan’s crushing victory over the Russians at the Battle of Tsushima in 1905. This, the first salvo in the long war to push back the subjugation of the East by the West, was heard around the colonised world; and it is where Pankaj Mishra begins From the Ruins of Empire, which purports to be a history of the ways in which the East imagined that war. Sadly, the glaring flaws that populate Mishra’s book, reducing it even from pop history to puerile polemic, begin there, too. Misleading quotes, for example: he says Gandhi responds by recognising it was “self-respect” that won Japan the battle, except most of Gandhi’s writing on Tsushima actually praised Japan’s patriotism and national unity, a considerably more inward-looking and less reactive claim.

Mishra’s treatment of attitudes to Japanese ambition, in fact, is just one instance of the double standards – which match those of the most devoted apologist of empire – that riddle this book. The Russo-Japanese war was a battle of empires for land in Manchuria; but throughout, Mishra insists on describing the horrors of Japanese imperialism as “but a reaction”. So, too, could the British Empire be a “reaction” to the Spanish Empire, and the German Empire a “reaction” to the British. But white people are granted agency by Mishra, and people of colour are not – one of the many, many ways in which this book fits squarely into the Eurocentric, mentally colonised framework which Mishra wants us to believe he is helping us escape. Later on in the book, the moral blindness that comes with such double standards is hideously exposed in his description of Japanese expansionism, where the Rape of Nanking is hastily glossed over, and that empire’s brutality against fellow-Asians is excused as “revenge for decades of racial humiliation.” Indeed, he goes on to say essentially that the occupied should be thankful for this good, Asian, empire: it allowed them to imagine what freedom from the West would be like.

More here.

No Epiphanies Whatsoever

Jane Hu in New Inquiry:

Wednesday morning, I wake up to face the computer screen—still open—on my bedside table. One swipe of keypad and a line of tabs brightens into view. I glance at the last page open and last night blows by like a smudge. Did I really read Cat Marnell’s Vice columns until I fell asleep? Rubbing liner from an eyelid, I shift the laptop onto my stomach and lie back down again. Where did I drop off?

What, you don’t know who Cat Marnell is? Oh, you don’t care. Then just forward this to the nine friends on your contacts list who do. We might be hopelessly hooked on her exploits, but that doesn’t mean you need be too.

Honestly, I hadn’t even heard of Marnell until this June, when she left xoJane.com (after failed attempts to resolve her drug addiction) and subsequently joined Vice as their “pills and narcissism” correspondent with a column titled “Amphetamine Logic.” As writers buzzed about Marnell’s media crackup, they tracked to the start of her writing career, when she interned and edited at various Condé Nast publications. Marnell worked at magazines such as NYLON, Teen Vogue, Glamour, and Lucky for, predominantly, their beauty sections. A narrative was set: Young talent starts early, works hard, rises only to go out prematurely—though with a bang.

As one tag affixed to Marnell’s xoJane columns reassures: “It Happened to Me.” That phrase performs a democratizing gesture—prompting readers to engage with a writer’s specific experience—that finally normalizes what “happened” for both writer and reader. Could the two main things that finally happened to Marnell be found in her job title for Vice? Pills and narcissism.

The public ate it up. Eager readers followed Marnell, as her articles moved deep inside half-lit bedrooms, sticky with sex and shaded with angel dust. With each new scandalous detail of her addictions, Marnell’s audience couldn’t wait to see where she would spiral next. Gimme gimme more, gimme more, gimme gimme more.

As Jen Doll has repeatedly observed in the Atlantic, “the same habit of addiction that drives a person to return again and again to the drug of his or her choice may have found a parallel in the reward-and-shame cycle of writing about oneself.” The more readers saw, the more they wanted to see. Inversely, the more Marnell displayed her disintegration, the more she had, and even wanted, to show.

The Lost Futures of Chris Marker

J. Hoberman on Chris Marker, in the New York Review of Books blog:

Gracefully off-kilter, stylized as semaphores, the shadow of a man and an outlined woman are positioned at the center of a sea shell spiral. Are they dancing on air—or falling into the void?

The poster for Alfred Hitchcock’s Vertigo is scarcely less haunting than the movie. I first saw the image, without understanding what it was, as a nine-year-old on summer vacation and carried the memory with me for some twelve or fifteen years before I first saw the film. It was, as the filmmaker Chris Marker—one of Vertigo’s most ardent admirers—might say, a memory of the future.

As Vertigo is the most uncanny of movies, it feels more than coincidental that on August 1, two days following Marker’s death, at ninety-one, in Paris, the British film journal Sight and Sound announced, with no little fanfare, that after forty years his favorite movie had finally dethroned Citizen Kane atop the magazine’s once-a-decade critics poll.

Kane is the movie that hyper-dramatized the act of filmmaking. Vertigo is about film-watching in extremis—the state of being hopelessly, obsessively in love with an image. Unlike Kane (or Psycho for that matter), Vertigo was not immediately recognized as great cinema, except in France by people like Marker. Steeped as it is in the pathos of unrecoverable memory, Marker’s La Jetée (1962) was probably the first movie made under Vertigo’s spell.

Tied for fiftieth place (one vote ahead of Rear Window) in the Sight and Sound poll, La Jetée is Marker’s most generally known work, in part because it was remade in the mid-1990s by Terry Gilliam as 12 Monkeys. Marker was the opposite of a celebrity; he was famous not for his well-knownness but for a certain willful unknowability. The man born Christian François Bouche-Villeneuve was permanently incognito. He allowed few interviews and carefully concealed his personal life; although he turned his camera on countless people, including several fellow filmmakers (Andrei Tarkovsky, Akira Kurosawa), he never allowed himself to be photographed.

I called Marker a “filmmaker” but it would be more accurate to term him a “film artist.” His oeuvre encompasses movies, photography, videos, TV series, CD-ROMs, computer games, and gallery installations. Some of these might be considered memento mori, often for the film medium. Others propose cinema as a model for historical consciousness. “We can see the shadow of a film on television, the longing for a film, the nostalgia, the echo of a film, but never a film,” is a characteristic Marker observation; one of his favorite aphorisms is borrowed from George Steiner: “It is not the past that rules us—it is the image of the past.”

Notes of a Novice Student of India

Justin E. H. Smith in Berfrois:

Any specialist on anything will have had that peculiar experience of coming across some casual comment from a total non-specialist about the very thing to which one has devoted one’s life, a comment made as if there were no such thing as specialist knowledge, as if what we know in any domain at all were just so much hearsay and vulgarisation. Lord knows I’ve seen plenty of people denouncing Descartes, for example, or praising Spinoza (seldom the reverse), who know nothing, but nothing, about Descartes or Spinoza. This is easy and costless to do (and we all do it, including those of us who pride ourselves on being specialists and who really care about getting things right in our special domains), so long as one doesn’t mingle with the specialists in the domain about which one holds forth.

I’ve been thinking about how this works, about this seldom-discussed aspect of the sociology of knowledge, quite a bit recently, as I go deeper in my mid-career shift to what used to be called ‘Indology’ (more on this telling term soon). I am still a near-absolute beginner, yet I am now reaching the point where I can no longer say whatever I want to say on the grounds that I don’t know anything anyway, and that the people with whom I’m speaking don’t know anything either. I am now interacting with people who do not find it at all peculiar to care about Pāṇinian syntax theory, or about the rules of proper inference in Navya-Nyāya logic. The days are over when I could make sweeping claims about civilizational differences (the sort of sweeping claims my colleagues in philosophy often make) as regards rationality, for example. So in short I’m learning to be careful about what I say, which is really nothing other than entering a community of specialists. I expect anything I say now will appear naive to me when I look back on it in a few years, which is only to say that I will have entered more fully into that community. But one has to start somewhere.

What used to be called ‘Indology’ is now referred to more obliquely by phrases such as ‘South Asian Studies’, ‘Religions- und Kulturgeschichte Südasiens’, and so on. To some extent this shift can be explained as part of the broader changes that turned geology into ‘earth science’, and so on. Here, it’s just a matter of rebranding, and has nothing to do with respecting the sensibilities of the subjects themselves that are being studied (rocks and sediment don’t have sensibilities). In addition, there is the broad impact of Saïd’s critique of Orientalism, and the bizarre presumption that if we redescribe ourselves as doing ‘studies’ of something rather than the ‘-logy’ of it, then we are somehow immune to that critique. But unlike the transformation of Sinology into East-Asian Studies (it’s gone translinguistic now, too: in Montreal you can major in ‘Études est-asiatiques’), Indology is weighted down by other historical legacies than just the one Saïd picked out, since the gaze upon India has often been one that did not treat it as exotically other, but also, for often less than liberal reasons, treated it as fundamentally, autochthonously, the same.

Wilde in the Office

From LARB:

For those interested in, or like me obsessed with, anniversaries: this summer marks the quasquicentennial of Oscar Wilde’s first ever office job (fifty years to go before the dodransbicentennial and a century before the sestercentennial). Admittedly, the significance of the event pales in comparison with the centennial of the sinking of Titanic or the bicentenary of Charles Dickens’s birth, but Wilde’s experience in the office provides that curious anniversary where the writer, who wants the best of both worlds as a journalist and a serious author, can see in practice whether such a thing is possible or desirable.

When he was 33 years old, Wilde began working for the publishing firm Cassell & Company, where he stayed for more than two years. His best non-fiction and fiction work was produced during the time he spent in the office at Ludgate Hill, near Fleet Street. Between May 18, 1887, when he signed the contract with Thomas Wemyss Reid, the general manager of the company, and October 1889, when he was handed his notice, Wilde managed to write the most brilliant and lengthy of his essays, including “The Critic as Artist,” “The Decay of Lying,” “Pen, Pencil and Poison,” and “The Portrait of Mr W. H.,” a speculation on Shakespeare's Sonnets (which later became a favourite of Borges), not to mention The Picture of Dorian Gray, which is often considered Wilde's best work and the defining text of the late-Victorian age.

It is difficult to imagine a serious author of our day performing a similar feat. Could Jonathan Franzen, that great enemy of superficial twittering, have written The Corrections while editing GQ, spending his weekdays in its offices? While numerous contemporary authors prefer unplugging the network cable from their laptops while writing, Wilde did the opposite and tried to have as many connections as possible, which he thought would contribute to his competence and inventiveness as an author.

Having toured the United States and parts of England during the early 1880s for a series of lectures about decoration, fashion, and applied arts, Wilde had amused American and British audiences with his personality and oratorical skills. When this great tour came to an end, he immediately looked for fame in prestigious literary magazines and newspapers where he could review books and publish essays about his favourite subjects. In the course of a year he reviewed dozens of books, some of which he confessed to not reading in their entirety (“I never read a book I must review,” he wrote, “it prejudices you so.”) Building for himself a credible byline which he hoped would open new opportunities for him, Wilde inhabited a freelancer's existence for a few years. This period was central to his growth as an independent thinker.

More here.

Two Steps to Free Will

From Harvard Magazine:

Astronomy naturally inspires cosmic thinking, but astronomers rarely tackle philosophical issues directly. Theoretical astrophysicist Robert O. Doyle, Ph.D. ’68, associate of the department of astronomy, is an exception. For five years, Doyle has worked on a problem he has pondered since college: the ancient conundrum of free will versus determinism. Do humans choose their actions freely, exercising their own power of will, or do external and prior causes (even the will of God) determine our acts?

Since the pre-Socratics, philosophers have debated whether we live in a deterministic universe, in which “every event has a cause, in a chain of causal events with just one possible future,” or an indeterministic one, in which “there are random (chance) events in a world with many possible futures,” as Doyle writes in Free Will: The Scandal in Philosophy (2011). The way out of the bottle, he says, is a “two-stage model” whose origin he traces to William James, M.D. 1869, LL.D. ’03, philosopher, psychologist, and perhaps the most famous of all Harvard’s professors.

Some of the confusion, Doyle believes, stems from how thinkers have framed the question—in an either/or way that allows only a rigidly predetermined universe or a chaotic one totally at the mercy of chance. David Hume, for example, asserted that there is “no medium betwixt chance and an absolute necessity.” But Doyle also finds the term “free will” unclear and even unintelligible, because the condition of “freedom” applies to the agent of action, not the will: “I think the question is not proper, whether the will be free, but whether a man be free,” in John Locke’s concise phrasing. “The element of randomness doesn’t make us random,” Doyle says. “It just gives us possibilities.”

Doyle limns a two-stage model in which chance presents a variety of alternative possibilities to the human actor, who selects one of these options and enacts it. “Free will isn’t one monolithic thing,” he says. “It’s a combination of the free element with selection.” He finds many antecedents in the history of philosophy—beginning with Aristotle, whom he calls the first indeterminist. But he identifies James as the first philosopher to clearly articulate such a model of free will, and (in a 2010 paper published in the journal William James Studies and presented at a conference honoring James; see “William James: Summers and Semesters”) he honors that seminal work by naming such a model—“first chance, then choice”—“Jamesian” free will.

More here.

Thursday, August 23, 2012

ryan, rand, hayek

In actuality, Ryan is like a lot of politicians who merely cherry-pick Hayek to promote neoclassical policies, says Peter Boettke, an economist at George Mason University and editor of The Review of Austrian Economics. “What Hayek has become, to a lot of people, is an iconic figure representing something that he didn’t believe at all,” Boettke says. For example, despite his complete lack of faith in the ability of politicians to affect the economy, Hayek, who is frequently cited in attacks on entitlement programs, believed that the state should provide a base income to all poor citizens. To be truly Hayekian, Boettke says, Ryan would need to embrace one of his central ideas, known as the “generality norm.” This is Hayek’s belief that any government program that helps one group must be available to all.

more from Adam Davidson at the NY Times Magazine here.

beauty and self-hatred

Ilse has grown up in the shadow of Cosmo-culture, where drastic measures are encouraged if beauty, and therefore confidence and “empowerment”, is the end result. The death of Cosmopolitan’s Helen Gurley Brown, plastic surgery pioneer, has brought some of her choice quotes to the surface. “Self-help,” she said to Nora Ephron, explaining the methods she used to improve her flaws. “I wish there were better words, but that is my whole credo. You cannot sit around like a cupcake asking other people to come and eat you up and discover your great sweetness and charm. You’ve got to make yourself more cupcakable all the time so you’re a better cupcake to be gobbled up.” The formula she laid down for Cosmopolitan in 1965 relied on constant renovation, improvement and a continual quest for achievement, where anybody can be beautiful, if only they try hard enough.

more from Eva Wiseman at The Observer here.

the inimitable

Dickens is so brilliant a stylist, his vision of the world so idiosyncratic and yet so telling, that one might say that his subject is his unique rendering of his subject, in an echo of Mark Rothko’s statement, “The subject of the painting is the painting”—except of course, Dickens’s great subject was nothing so subjective or so exclusionary, but as much of the world as he could render. If Dickens’s prose fiction has “defects”—excesses of melodrama, sentimentality, contrived plots, and manufactured happy endings—these are the defects of his era, which for all his greatness Dickens had not the rebellious spirit to resist; he was at heart a crowd-pleaser, a theatrical entertainer, with no interest in subverting the conventions of the novel as his great successors D.H. Lawrence, James Joyce, and Virginia Woolf would have; nor did he contemplate the subtle and ironic counterminings of human relations in the way of George Eliot and Thomas Hardy, who brought to the English novel an element of nuanced psychological realism not previously explored. Yet among English writers Dickens is, as he once called himself, part-jesting and part-serious, “the inimitable.”

more from Joyce Carol Oates at the NYRB here.

In-law infighting boosted evolution of menopause: Conflict between generations of unrelated childbearing women affects offspring survival

From Nature:

Conflict between women and their daughters-in-law could be a factor in explaining an evolutionary puzzle — the human menopause. Humans, pilot whales and killer whales are the only animals known to stop being able to reproduce long before they die. In terms of evolution, where passing on your genes is the main reason for living, the menopause remains puzzling. Now, using a large data set from Finland, researchers have for the first time been able to test a hypothesis that competition between different generations of genetically unrelated breeding women could have promoted the evolution of the menopause. The results are published today in Ecology Letters.

Mirkka Lahdenperä, an ecologist at the University of Turku in Finland, and her colleagues used data from meticulous birth, death and marriage records kept by the Lutheran church in the country between 1702 and 1908. As they dug into the data, the researchers found that the chances of children dying increased when mothers-in-law and daughters-in-law gave birth around the same time. For children of the older women, survival dropped by 50%. For children of the daughters-in-law, it dropped by 66%. However, if mothers and daughters had children at the same time, the survival of those children wasn’t affected.

The results suggest that it would be beneficial to stop having children once your daughter-in-law entered the fray. “We were surprised that the result was so strong,” says Andrew Russell, an ecologist at the University of Exeter, UK, who was part of the research team. He suggests that perhaps in-laws fought over food for their children instead of cooperating as mothers and daughters might.

Other theories to explain the menopause include the mother hypothesis, which suggests that older women have an increased chance of dying in childbirth, and the grandmother hypothesis — that the benefits to the family when women care for their grandchildren provide an evolutionary reason to stay alive after reproductive age. Using an inclusive-fitness model, which counts the number of gene equivalents passed from generation to generation, the team showed that when mothers and their sons' wives had children at the same time, there was strong selection against women remaining fertile past the age of 51.

More here.