Carnal Ethics


Richard Marshall interviews Ann Cahill in 3:AM Magazine:

3:AM: You say: “Intersubjectivity says the relation is first and constitutes the being of the parties involved in it, and the parties involved constitute the relationship… As humans we can not come into existence without someone caring for us. Our very being as existence is being with another. … Existence is intersubjective.” Does all your work rely on a notion of intersubjectivity? Is this connected to notions of postmodernity in that it decentres identity: are you a postmodernist philosopher?

AC: Yes, intersubjectivity is a strong thread that runs through virtually all of my work. It’s absolutely connected to postmodern theories that challenge the modern notion of the self as autonomous, self-contained, and ideally free from the demands of the other. I do identify as a postmodernist philosopher, and tend to work from and with postmodern thinkers such as Foucault, Butler, Irigaray, etc. But I don’t think the concept of intersubjectivity is contrary to identity, unless you understand identity as necessarily innate and stable. Focusing on intersubjectivity has led me to understand identity more as location. Just as one can’t have a location without reference to other entities, one can’t have an identity except in relation to other beings. Which is not to say that one’s identity is reducible to those relations, or that one could predict aspects of a person’s identity simply by extrapolating from those relations; such assumptions would deny the dynamism of intersubjectivity. Identity and relations are co-constituting: who I am (at this moment, at this place, keeping in mind that identity is always a process) affects the kind and quality of relations I engage in, just as those relations simultaneously affect my identity. I should emphasise that I’m thinking here of the location of a being who can move, not the location of a static or fixed object (if such a thing even exists).

3:AM: You say that words like ‘intersubjectivity’ are ‘big words’, and by that you don’t just mean that they are large but that they are unfamiliar. You defend them, don’t you?

AC: Ah, I do defend them. I love big words. I understand the critique of accessibility, that is, that big words can serve to alienate and intimidate readers, and I certainly believe that philosophy (especially feminist philosophy) has a responsibility to be accountable and relevant to the real lives of human and other-than-human beings. When using big words gets in the way of that responsibility, we need to be careful. But big words also have the capacity to break through the fog of dominant assumptions, to do the hard work of substantially reframing familiar problems or questions so that we can gain new and better leverage.

Is Philosophy Finally Without God?


Daniel Tutt reviews Christopher Watkin's Difficult Atheism: Tracing the Death of God in Contemporary Continental Thought, in Berfrois (image: Warkton, Northamptonshire: Monument by Vangelder, 1775, John Piper, 1964):

Declaring oneself an “atheist” isn’t what it used to be. Growing numbers of Generation Y prefer to remain agnostic, which is why so many of them go by the “nones,” or those with no religious preference. My wife used to work at a large university and she told me that on standardized tests many of the students write in “human” in the ethnic and racial identity box. A friend of mine launched a social media campaign to have “Jedi” recognized as a religion in Great Britain. It took off like wildfire, and by 2006 Jedis were the fourth largest religion in all of Great Britain. Occupying one of these undecided identities (“none,” “Jedi,” “human”) makes a lot of sense. In so doing, one renders no judgment upon the status quo, nor does one negate traditional religious identities to which many of us still have some allegiance.

The truth is, declaring oneself an atheist is a difficult process, but we’ve lost touch with this difficulty. Kierkegaard notoriously said “the biggest problem with Christians today is that no one wants to kill them anymore.” What I think he meant by this is that a healthy sense of atheism is good for religion, and lest we forget, Christianity is perhaps the most resilient religion the world has seen. This resiliency is due in part to the fact that Christianity can handle a complicated belief in God and still retain followers. Hegel saw in Christ’s utterance on the cross, “my father, why have you forsaken me” a splitting in two of the absolute itself, a splitting in two of God. What this split represented was the death of the metaphysical God. Nietzsche’s “God is dead” mostly had to do with an epistemological death of suprasensory truths, a death that ushered in a new type of nihilism.

Most atheists today who are firm in their convictions tend to be entranced by the so-called “Four Horsemen of the New Atheism.” Although their best-selling whirlwind and the larger discourse that rose from it are on the decline, to the point that they are now losing followers, much of atheist identity is intertwined with Dawkins, Hitchens, Dennett and Harris. The weapons they use against religion are as tired as they are outdated: Darwinian natural selection and evolution (Richard Dawkins), naturalizing reductions of religion via general science (Daniel Dennett), brash literary humanism (Christopher Hitchens) and quite paradoxically, racist appeals to reason (Sam Harris).

For the nones and the atheists, as well as for the religious, I might add, a healthy debate about God is vital to sustaining a larger dialogue about religion, morality, and ethics in the public sphere. But we’ve been deprived of such a discourse. This is why it is a perfect time to ask: what is/can/should philosophy contribute to the question of God and atheism?

Nadeem Aslam: a life in writing

From The Guardian:

Nadeem Aslam was years into his second novel when the 11 September attacks took place. “Many writers said the books they were writing were now worthless,” he recalls. Martin Amis, for one, felt his work in progress had been reduced to a “pitiable babble”. But Aslam's saddened reaction to 9/11 was one of recognition. “I thought, that's Maps for Lost Lovers – that's the book I'm writing.” The link might seem tenuous to a novel set many miles from the twin towers or Bin Laden's lair, in an almost cocooned urban community of Pakistani migrants and their offspring in the north of England, where Aslam grew up from the age of 14. The novel was almost pastoral in its tracing of the seasons, with riffs on jazz, painting and spectacular moths. Each chapter was as minutely embellished as the Persian and Mughal miniatures Aslam has in well-thumbed volumes on his coffee table. But the plot turns on a so-called honour killing, as an unforgiving brand of Islam takes hold. In his view, and above all for women, “we were experiencing low-level September 11s every day.”

Maps for Lost Lovers, which took 11 years to write, and was published in 2004, won the Encore and Kiriyama awards (the latter recognises books that contribute to greater understanding of the Pacific Rim and South Asia). It was shortlisted for the Dublin Impac prize and longlisted for the Man Booker prize. His debut, Season of the Rainbirds (1993), set in small-town Pakistan, had also won prizes, and been shortlisted for the Whitbread first novel award. The books confirmed Aslam as a novelist of ravishing poetry and poise – admired by other writers including Salman Rushdie and AS Byatt.

More here. (Note: While I have read all his books, Maps for Lost Lovers remains my favorite. I strongly recommend it.)

That Daily Shower Can Be a Killer

Jared Diamond in The New York Times:

You see, falls are a common cause of death in older people like me. (I’m 75.) Among my wife’s and my circle of close friends over the age of 70, one became crippled for life, one broke a shoulder and one broke a leg in falls on the sidewalk. One fell down the stairs, and another may not survive a recent fall. “Really!” you may object. “What’s my risk of falling in the shower? One in a thousand?” My answer: Perhaps, but that’s not nearly good enough. Life expectancy for a healthy American man of my age is about 90. (That’s not to be confused with American male life expectancy at birth, only about 78.) If I’m to achieve my statistical quota of 15 more years of life, that means about 15 times 365, or 5,475, more showers. But if I were so careless that my risk of slipping in the shower each time were as high as 1 in 1,000, I’d die or become crippled about five times before reaching my life expectancy. I have to reduce my risk of shower accidents to much, much less than 1 in 5,475. This calculation illustrates the biggest single lesson that I’ve learned from 50 years of field work on the island of New Guinea: the importance of being attentive to hazards that carry a low risk each time but are encountered frequently.
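(Diamond's arithmetic is easy to verify; here is a quick sketch, using only the figures quoted above.)

```python
# Jared Diamond's shower arithmetic, restated as a quick check.
# All inputs are the figures quoted in the excerpt above.
years_remaining = 15              # his statistical quota of life at age 75
showers = years_remaining * 365   # 5,475 more daily showers
per_shower_risk = 1 / 1000        # the hypothetical "careless" slip risk

# Expected number of accidents before reaching life expectancy:
expected_accidents = showers * per_shower_risk   # about 5.5, i.e. "about five times"

# To expect fewer than one accident in that span, the per-shower risk
# must be well below 1 in 5,475.
max_safe_risk = 1 / showers
```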

I first became aware of the New Guineans’ attitude toward risk on a trip into a forest when I proposed pitching our tents under a tall and beautiful tree. To my surprise, my New Guinea friends absolutely refused. They explained that the tree was dead and might fall on us. Yes, I had to agree, it was indeed dead. But I objected that it was so solid that it would be standing for many years. The New Guineans were unswayed, opting instead to sleep in the open without a tent. I thought that their fears were greatly exaggerated, verging on paranoia. In the following years, though, I came to realize that every night that I camped in a New Guinea forest, I heard a tree falling. And when I did a frequency/risk calculation, I understood their point of view. Consider: If you’re a New Guinean living in the forest, and if you adopt the bad habit of sleeping under dead trees whose odds of falling on you that particular night are only 1 in 1,000, you’ll be dead within a few years. In fact, my wife was nearly killed by a falling tree last year, and I’ve survived numerous nearly fatal situations in New Guinea.
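(The New Guineans' reasoning compounds the same way. A minimal sketch, assuming one night under a dead tree per night of forest living, as in Diamond's example:)

```python
# Sleeping under dead trees: each night is survived with probability
# 1 - 1/1000, and independent nights compound multiplicatively.
nightly_risk = 1 / 1000

def p_hit_within(nights: int) -> float:
    """Probability that a falling tree gets you within `nights` nights."""
    return 1 - (1 - nightly_risk) ** nights

# After three years of such nights, the odds are roughly two in three,
# which is why "you'll be dead within a few years."
p_three_years = p_hit_within(3 * 365)
```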

More here.

Tuesday Poem

Frost Over Ireland
Roses hang their withered heads
Beneath the white cap of Christmas frost.
The ones without hope, without shelter,
Shiver in the hollow of the cold.

Terrified at the hunger upon them,
Small birds peck at emptiness.
Here in the snow, redwings from the East
Search in the frosted absences.

From the dark heights of a fir tree
The magpie’s greedy eye observes
The songbirds’ growing panic
When a fat rat sends them scurrying.

It is the small bird that struggles
While the predator takes his ease.
In this blank hardness without mercy
Will they find even a worm’s worth of hope?

It is the berries of ivy and holly
Who give the wren its bed and board;
Buds glistening under the frosty cap
Are the waiting June where songbirds are.

by Bríd Ní Mhóráin
from Mil ina Slaoda
publisher: An Sagart, Dingle, 2011
translation: 2012, Thomas McCarthy

Sunday, January 27, 2013

Wittgenstein’s Master: Frank Ramsey, the genius who died at 26

AC Grayling in Prospect:

Frank Ramsey was 26 years old when he died after an operation at Guy’s Hospital in January 1930. In his short life, he had made lasting contributions to mathematics, economics and philosophy, and to the thinking of a number of his contemporaries, including Ludwig Wittgenstein.

When I taught at St Anne’s, Oxford during the 1980s, I was introduced by my colleague Gabriele Taylor to Ramsey’s sister, Margaret Paul, by then retired from teaching economics at Lady Margaret Hall college. As with anyone with some knowledge of the fields of enquiry Ramsey influenced, I was immediately recruited into helping with her research into his life and thought, though in a minor capacity; she had a formidable array of other helpers besides, from eminent philosophers like Taylor and PF Strawson onwards.

Frank Ramsey was 18 when Margaret was born, so her own memories of him were those of a little girl. A large part of her motivation in writing about him was to get to know him. In this quest she was equally tireless and scrupulous. Most aspects of his work require advanced technical competence, but she was determined to understand them; an afternoon at her house talking about him could be as gruelling as it was educative.

Her memoir has now been published. It is a remarkable book, a window not just into a prodigious mind—Ramsey translated Wittgenstein’s Tractatus as a second year Trinity undergraduate, simultaneously publishing original work in probability theory and economics—but into the amazingly rich intellectual world of his day. The book’s roll-call includes John Maynard Keynes, Bertrand Russell, GE Moore and Wittgenstein, and the mise-en-scène equals it: Ramsey’s father was president of Magdalene college at Cambridge, his famously bushy-eyebrowed brother, Michael, later became Archbishop of Canterbury, and Ramsey himself, after scholarships at Winchester and Trinity, became a fellow of King’s, aged 21.

Suffering unrequited love for a married woman drove Ramsey to Vienna to be psychoanalysed by one of Freud’s pupils. It was there that he met Wittgenstein, spending hours every day in conversation with him, and later helping Keynes to bring him back to Cambridge. In the last year of his life, the 26-year-old Ramsey was the 40-year-old Wittgenstein’s nominal PhD thesis supervisor, the thesis being the Tractatus Logico-Philosophicus itself.

More here.

Diving Deep into Danger

Nathaniel Rich in the New York Review of Books:

The first dive to a depth of a thousand feet was made in 1962 by Hannes Keller, an ebullient twenty-eight-year-old Swiss mathematician who wore half-rimmed glasses and drank a bottle of Coca-Cola each morning for breakfast. With that dive Keller broke a record he had set himself one year earlier, when he briefly descended to 728 feet. How he performed these dives without killing himself was a closely guarded secret. At the time, it was widely believed that no human being could safely dive to depths beyond three hundred feet. That was because, beginning at a depth of one hundred feet, a diver breathing fresh air starts to lose his mind.

This condition, nitrogen narcosis, is also known as the Martini Effect, because the diver feels as if he has drunk a martini on an empty stomach—the calculation is one martini for every additional fifty feet of depth. But an even greater danger to the diver is the bends, a manifestation of decompression sickness that occurs when nitrogen gas saturates the blood and tissues. The problem is not the descent, but the ascent. As the diver returns to the surface, the dissolved nitrogen forms bubbles that increase in size, lodging in the joints, arteries, organs, and sometimes the brain or spine, where they can cause pain and potentially death. The deeper a diver descends, the more slowly he must ascend in order to avoid the bends.
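(One reading of the Martini Effect rule of thumb, sketched in code; the excerpt gives only the rate of one martini per additional fifty feet past the first hundred, so the exact zero point is an assumption:)

```python
def martinis(depth_ft: float) -> float:
    """Rough 'Martini Effect' count at a given depth, per the rule of
    thumb quoted above: narcosis sets in around 100 ft, adding about
    one 'martini' per additional 50 ft of depth."""
    return max(0.0, (depth_ft - 100) / 50)

# At the supposed 300 ft safe-diving limit of the era, that is the
# equivalent of four martinis on an empty stomach.
at_limit = martinis(300)
```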

More here.

The Afghan End Game?

Ann Jones in TomDispatch:

The euphemisms will come fast and furious. Our soldiers will be greeted as “heroes” who, as in Iraq, left with their “heads held high,” and if in 2014 or 2015 or even 2019, the last of them, as also in Iraq, slip away in the dark of night after lying to their Afghan “allies” about their plans, few here will notice.

This will be the nature of the great Afghan drawdown. The words “retreat,” “loss,” “defeat,” “disaster,” and their siblings and cousins won’t be allowed on the premises. But make no mistake, the country that, only years ago, liked to call itself the globe’s “sole superpower” or even “hyperpower,” whose leaders dreamed of a Pax Americana across the Greater Middle East, if not the rest of the globe, is… not to put too fine a point on it, packing its bags, throwing in the towel, quietly admitting — in actions, if not in words — to mission unaccomplished, and heading if not exactly home, at least boot by boot off the Eurasian landmass.

Washington has, in a word, had enough. Too much, in fact. It’s lost its appetite for invasions and occupations of Eurasia, though special operations raids, drone wars, and cyberwars still look deceptively cheap and easy as a means to control… well, whatever. As a result, the Afghan drawdown of 2013-2014, that implicit acknowledgement of yet another lost war, should set the curtain falling on the American Century as we’ve known it. It should be recognized as a landmark, the moment in history when the sun truly began to set on a great empire. Here in the United States, though, one thing is just about guaranteed: not many are going to be paying the slightest attention.

More here.

Cambridge, Cabs and Copenhagen: My Route to Existential Risk

Huw Price in the New York Times:

In Copenhagen the summer before last, I shared a taxi with a man who thought his chance of dying in an artificial intelligence-related accident was as high as that of heart disease or cancer. No surprise if he’d been the driver, perhaps (never tell a taxi driver that you’re a philosopher!), but this was a man who has spent his career with computers.

Indeed, he’s so talented in that field that he is one of the team who made this century so, well, 21st – who got us talking to one another on video screens, the way we knew we’d be doing in the 21st century, back when I was a boy, half a century ago. For this was Jaan Tallinn, one of the team who gave us Skype. (Since then, taking him to dinner in Trinity College here in Cambridge, I’ve had colleagues queuing up to shake his hand, thanking him for keeping them in touch with distant grandchildren.)

I knew of the suggestion that A.I. might be dangerous, of course. I had heard of the “singularity,” or “intelligence explosion”– roughly, the idea, originally due to the statistician I J Good (a Cambridge-trained former colleague of Alan Turing’s), that once machine intelligence reaches a certain point, it could take over its own process of improvement, perhaps exponentially, so that we humans would soon be left far behind. But I’d never met anyone who regarded it as such a pressing cause for concern – let alone anyone with their feet so firmly on the ground in the software business.

I was intrigued, and also impressed, by Tallinn’s commitment to doing something about it.

More here.

Genetic evidence suggests that, four millennia ago, a group of adventurous Indians landed in Australia

From The Economist:

The story of the ascent of man usually casts Australia as the forgotten continent. Both archaeology and the genes of aboriginal Australians suggest that a mere 15,000 years were required for humanity to spread from its initial toehold outside Africa, on the Arabian side of the straits of Bab el Mandeb, to the land of Oz. The first Australians thus arrived about 45,000 years ago. After that, it took until 1788, when Captain Arthur Phillip, RN, turned up in Sydney Cove with a cargo of ne’er-do-wells to found the colony of New South Wales, for gene flow between Australia and the rest of the world to be resumed.

This storyline was called into question a few years ago by the discovery, in some aboriginal Australian men, of Y chromosomes that looked as though they had come from India. But the details were unclear. Now a study by Irina Pugach of the Max Planck Institute for Evolutionary Anthropology, in Leipzig, and her colleagues, which has just been published in the Proceedings of the National Academy of Sciences, has sorted the matter out. About 4,000 years before Captain Phillip and his merry men arrived to turn the aboriginals’ world upside down, it seems that a group of Indian adventurers chose to call the place home. Unlike their European successors, these earlier settlers were assimilated by the locals. And they brought with them both technological improvements and one of Australia’s most iconic animals.

More here.

Proton is smaller than we thought

Hamish Johnston in Physics World:

The radius of the proton is significantly smaller than previously thought, say physicists who have measured it to the best accuracy yet. The surprising result was obtained by studying “muonic” hydrogen in which the electron is replaced by a much heavier muon. The finding could mean that physicists need to rethink how they apply the theory of quantum electrodynamics (QED) – or even that the theory itself needs a major overhaul.

A proton contains three charged quarks bound by the strong force, and its radius is defined as the distance at which the charge density drops below a certain value. The radius has been measured in two main ways – by scattering electrons from hydrogen and by looking very closely at the difference between certain energy levels of the hydrogen atom called the Lamb shift. Until recently the best estimate of the proton radius was 0.877 femtometres with an uncertainty of 0.007 fm.

This Lamb shift is a result of the interactions between the electron and the constituent quarks of the proton as described by QED. These interactions are slightly different for electrons occupying the 2S and 2P energy levels and the resulting energy shift depends in part on the radius of the proton.

However, in muonic hydrogen the Lamb shift is much more dependent on the proton radius because the much heavier muon spends more time very near to – and often within – the proton itself.

Now an international team led by Randolf Pohl at the Max Planck Institute for Quantum Optics in Garching, Germany, has measured the Lamb shift in muonic hydrogen for the first time and found the proton radius to be 0.8418 fm with uncertainty 0.0007 fm.
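(A quick way to see why the result counts as surprising is to compare the two numbers in units of their combined uncertainty; a back-of-envelope sketch using only the figures quoted above:)

```python
from math import hypot

old_r, old_err = 0.877, 0.007      # earlier best estimate (fm)
new_r, new_err = 0.8418, 0.0007    # new muonic-hydrogen result (fm)

# The new radius is about 4% smaller than the old one...
shrinkage = (old_r - new_r) / old_r

# ...and the gap is roughly five combined standard deviations, far too
# large to dismiss as a statistical fluctuation.
discrepancy = (old_r - new_r) / hypot(old_err, new_err)
```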

More here.

Sunday Poem

Clary

Her cart like a dugout canoe.

Had been an oak trunk.

Cut young. Fire-scoured.

What was bark what was heartwood : P u r e C h a r – H o l e

Adze-hacked and gouged.

Ever after (never not) wheeling hollow there behind her.

Up the hill toward Bennett Yard; down through Eight-Mile, the Narrows.

C o m e s C l a r y b y h e r e n o w

Body bent past bent. Intent upon horizon and carry.

Her null eye long since gone isinglassy, opal.

—The potent (brimming, fluent) one looks brown.

C o u r s e s C l a r y s u r e a s b a y o u t h r o u g h h e r e n o w

Bearing (and borne ahead by) hull and hold behind her.

Plies the dark.

Whole nights most nights along the overpass over Accabee.

C r o s s e s C l a r y b l e s s h e r b a r r o w u p t h e r e n o w

Pausing and voweling there— the place where the girl fell.

( )

Afterwhile passing.

Comes her cart like a whole-note held.

by Atsuro Riley
from Poetry, Vol. 192, No. 5
publisher: Poetry, Chicago, 2008

The end of an epithet: How hate speech dies

From Time:

I thought about that moment last weekend, when my 12-year-old daughter was having a Harry Potter-themed sleepover with a few of her friends. One of the girls was recalling a moment in a Potter book and came up short as she groped for a word. She was looking for ferret, but what came out was faggot. Another girl immediately jumped. “That’s a bad word,” she said. The first girl asked what it meant and after she was told, simply nodded her head at the nastiness of the thing. The girls, in effect, had gang-tackled the word, first by opprobrium, then by indifference—and then they went back to their playing. The slow, inexorable sunset of this most-used and most-loathed gay slur is by no means complete. It still burns brightly and horribly in far too many places and far too many lives, but its day is undeniably passing — a process only hastened by President Obama’s inaugural address, which included an explicit call for the rights of “our gay brothers and sisters” and memorably invoked the lessons of Seneca Falls, Selma and Stonewall. How this particular bit of hate speech finally dies will be a lesson both in the way a language and, more important, a culture matures.

The roots of the anti-gay f-word are not what most people think they are. Popular lore has it that suspected homosexuals were once put to death by fire, and that piles of sticks — or “faggots,” in the antiquated term — were used as kindling. The pile-of-sticks definition is correct, but everything else appears not to be. “There’s no historical evidence that this is how and why it originated,” says Ben Zimmer, language columnist for the Boston Globe and executive producer of the website Vocabulary.com. “Its first recorded use was in the early 20th century, when it was applied to women. As with words like queen, it then became an epithet for gay men.” But there’s value even in the etymological misconception. Gay people may never have been put to the torch, but the widespread belief that they were serves to sensitize people to the very real bigotry—and often very real danger—they’ve faced over the centuries. “Even if it has no historical truth it has a different kind of truth as a lesson,” Zimmer says. Epithets fade not just by public censure and growing disuse, but by appropriation. Queer used to pack a terrible punch of its own until gays picked it up and began using it in chants (“We’re here, we’re queer, get used to it!”), as a name for an activist group (Queer Nation) and in the “queer studies” programs offered in many college curricula.

More here.

Pride and Prejudice: universally acknowledged guide to the human heart

From The Telegraph:

“It is a truth universally acknowledged that a single man in possession of a good fortune must be in want of a wife.” Thus begins Jane Austen’s Pride and Prejudice, one of the most famous opening lines of any novel ever written. It is a story that has touched hearts for exactly 200 years: girl meets boy, girl loses boy, girl gets boy.

…And when Austen wasn’t slicing up the men, she was defining women into tribes (long before the Spice Girls): the pretty, the funny, the clever, the bookish, the bold. Of course, I knew that in real life I was an Elizabeth – not the handsomest, not the fastest, but the “sparkiest” of girls. My true love would value me for my mind first and foremost, and that – like Elizabeth – is what I would want.

Some warn that Pride and Prejudice sets modern girls up to fail. At night, we dream of an honourable man like Darcy. By day, we learn that many modern men favour the pulchritudinous countenance of a Miss Jane Bennet, the rather relaxed morals sported by Lydia-a-likes, and especially the juicy inheritance behind an Anne de Bourgh. Against those temptations, which Elizabeth among us fancies our chance? Coraggio, whispers the author, be true to yourself. Thirty-five years later, living just eight miles from Chawton, Austen’s home, now a museum devoted to her, I find my love for the book endures (although I have long since found my Darcy). So what keeps me – and so many others – wedded to this novel? Especially when we could just whack on the Colin Firth box set instead? Certainly, I enjoy a hit of Georgian grace and fantasy: a dip into that world where problems could be solved by a new gown, an invitation to a ball, or some scrumptious item of gossip. And I appreciate more knowingly Austen’s descriptions of how money rules society. But it is Austen’s knack of describing the human heart that still sets my literary pulse racing, and makes me long for a quiet corner in which to curl up with the book. And now I read it with my daughter in mind; will she, too, find Pride and Prejudice, the gold standard of love stories, a primer for romantic life?

More here.

Saturday, January 26, 2013

For the first time in history we could end poverty while protecting the global environment. But do we have the will?

John Quiggin in Aeon:

Even to those who are thoroughly inured to warnings of impending catastrophe, the World Bank’s recent report on climate change, Turn Down the Heat (November, 2012), made for alarming reading. Looking at the consequences of four degrees of global warming, a likely outcome under current trajectories, the Bank concludes that the full scope of damage is almost impossible to project. Even so, it states: ‘The projected impacts on water availability, ecosystems, agriculture, and human health could lead to large-scale displacement of populations and have adverse consequences for human security and economic and trade systems.’ Among the calamities anticipated in the paper are large-scale dieback in the Amazon, the collapse of coral reef systems and the subsistence fishing communities that depend on them, and sharp declines in crop yields.

By contrast, most of us are already inured to the continuing catastrophe reported in the Bank’s annual World Development Report. Hundreds of millions of people go hungry every day. Tens of millions die every year from easily treatable or preventable diseases. Uncontrolled climate change could produce more crop failures and famines, and spread diseases and the pests that cause them even more widely.

Economic development and technological progress provide the only real hope of lifting billions of people out of poverty and destitution, just as it has done for the minority in the developed world. Yet the living standards of the developed world have been built on cheap energy from carbon-based fossil fuels. If everyone in the world used energy as Americans or even Europeans do, it would be impossible to restrict climate change to even four degrees of warming.

For those of us who seek a better life for everybody, the question of how much our environment can withstand is crucial. If current First World living standards can’t safely be extended to the rest of the world, the future holds either environmental catastrophe or an indefinite continuation of the age-old struggle between rich and poor. Of course, it might hold both.

More here.

Why Did Men Stop Wearing High Heels?


Via Laura Agustín, William Kremer in the BBC:

[T]he intellectual movement that came to be known as the Enlightenment brought with it a new respect for the rational and useful and an emphasis on education rather than privilege. Men's fashion shifted towards more practical clothing. In England, aristocrats began to wear simplified clothes that were linked to their work managing country estates.

It was the beginning of what has been called the Great Male Renunciation, which would see men abandon the wearing of jewellery, bright colours and ostentatious fabrics in favour of a dark, more sober, and homogeneous look. Men's clothing no longer operated so clearly as a signifier of social class, but while these boundaries were being blurred, the differences between the sexes became more pronounced.

“There begins a discussion about how men, regardless of station, of birth, if educated could become citizens,” says Semmelhack.

“Women, in contrast, were seen as emotional, sentimental and uneducatable. Female desirability begins to be constructed in terms of irrational fashion and the high heel – once separated from its original function of horseback riding – becomes a primary example of impractical dress.”

High heels were seen as foolish and effeminate. By 1740 men had stopped wearing them altogether.

But it was only 50 years before they disappeared from women's feet too, falling out of favour after the French Revolution.

By the time the heel came back into fashion, in the mid-19th Century, photography was transforming the way that fashions – and the female self-image – were constructed.

Fraud, Disclosure, and Degrees of Freedom in Science


Robert Trivers in Psychology Today:

I point out in The Folly of Fools that science is naturally self-correcting—it requires experiments, data gathering and modes of analysis to be fully explicit, the better to be replicated and thus verified or falsified—but where humans or social behavior are involved, the temptation for quick and illegitimate progress is accelerated by the apparent importance of the results and the difficulty of checking on their veracity. Recently cases of deliberate fraud have been uncovered in the study of primate cognition (Harvard), the health benefits of resveratrol (U Conn), and numerous social psychology findings (Tilburg U, Netherlands). I will devote some later blogs to other aspects of fraud in science but will begin here with a very clever analysis of statistical fraud and lack of data sharing in psychology papers published in the United States. This and related work suggest that the problem of fraud in science is much broader than the few cases of deliberate, large-scale fraud might suggest.

Wicherts and co-authors made use of a little-noted feature of all papers published in the more than 50 journals of the American Psychological Association (APA)—the authors of these papers commit by contract to sharing their raw data with anyone who asks for it, in order to attempt replication. Yet earlier work by this same group showed that for 141 papers in four top APA journals, 73 percent of the scientists did not share data when asked to. Since, as they point out, statistical errors are known to be surprisingly common, accounts of statistical results are sometimes inaccurate, and scientists are often motivated to make decisions during statistical analysis that are biased in their own preferred direction, they were naturally curious to see whether there was any connection between failure to share data and evidence of statistical bias.

Here is where they got a dramatic result. They limited their research to two of the four journals whose scientists were slightly more likely to share data and most of whose studies were similar in having an experimental design. This gave them 49 papers. Again, the majority failed to share any data, instead behaving as a parody of academics.