Making Organizations Moral

Sophia Nguyen in Harvard Magazine:

In the early 2000s, a riptide of business scandals toppled Enron, Arthur Andersen, and WorldCom. In the aftermath, says Straus professor of business administration Max Bazerman, “society turned to professional schools” to ask why their graduates were misbehaving. Behavioral ethics—combining aspects of moral philosophy, cognitive science, psychology, and economics—was born: “a creation of the new millennium.” As a teacher in this field, Bazerman explains, “My job is not about what ethics you follow, but how to bring you up to your own ethical standards.”

…In his book, Bazerman highlights promising directions in behavioral decision research that have the potential to promote more effective and more ethical noticing. One involves “choice architecture,” a term coined by Richard H. Thaler and Walmsley University Professor Cass R. Sunstein in their 2008 book Nudge: Improving Decisions about Health, Wealth, and Happiness. Choice architecture taps knowledge of psychology to identify better ways to present options—and Bazerman asserts that organizations can use it to create systems that increase the likelihood of their staff noticing key data. In a study he conducted with Kennedy School colleagues Alexandra van Geen and professor of public policy Iris Bohnet, supervisors were asked to assess a pool of job candidates. When judging applicants one at a time, they tended to favor men on quantitative tasks and women on verbal tasks. But when judging male and female candidates side-by-side, they relied on performance-related data; gender biases no longer factored into the decision. Changing the structure of the hiring process encouraged people to pay attention to the important information.

More here.



the death of fraternity

Chris Lehmann at The Baffler:

Whether we like it or not, the big idea behind American democracy is to make us like each other more. It’s a faintly embarrassing dimension of our social experiment, carved out of the crack-up of the original British colonies, that the great theorists and practitioners of new world order in America were looking for something more than political independence. They sought to create a basis for the small-r republican ideal of fraternity: a territorially limited, widely participatory, and socially equitable economy made up principally of small producers—home manufacturers, merchants, and farmers. Only on such a basis, the theory went, could America be prevented from regressing into anarchy, despotism, or worse.

But things didn’t exactly go as planned. Come the Jacksonian age, the legal interpreters of the U.S. Constitution, spurred on by the directives of a fast-consolidating national and corporate economy, ratcheted the whole enterprise upward into something that many of the founders would have seen as a blatant contradiction in terms: a “commercial republic,” as the jurisprudence of the Federalist-on-the-make John Marshall (echoing the political rhetoric of his close political ally Daniel Webster) had it.

more here.

bartleby at the office

Nikil Saval at Dissent:

Few institutions have offered themselves as less promising for the novelist than the modern office. Work of any kind is a tricky subject for representation; office work—gray, gnomic, and unknowable—even more so. After all, what is it that people do in offices? Herman Melville’s “Bartleby the Scrivener: A Story of Wall Street,” the locus classicus for discussions of early clerical work, begins by depicting strategies for avoiding work at what is nominally a law office. Few of the unnamed narrator’s employees seem to do much lawyering: Turkey works through the morning, but gets drunk at lunch; Nippers never finds an appropriate position to sit at his desk. And then there’s Bartleby, who, unlike his colleagues, works—and does so without fanfare, “silently, palely, mechanically.” But rather than producing things, he seems to consume them. “As if long famishing for something to copy,” the narrator observes, “he seemed to gorge himself on my documents.” And then—famously—Bartleby suddenly loses interest in his work. The tedium of office life offers a brief moment of satisfaction for Bartleby, which just as quickly vanishes; eventually deprived of his paperwork sustenance, Bartleby starves to death.

While “Bartleby” has remained unmatched as a parable of white-collar alienation (it was adapted to the contemporary, cubicular, and computerized workplace as a film in 2001), its casual treatment of the actual substance of work makes it unexceptional in the history of the literature of the office. Like many office novels that have followed, it is primarily one of manners—or in Bartleby’s case, a lack thereof.

more here.

Joshua Reynolds: portraits in action

Norma Clarke at The Times Literary Supplement:

Reynolds dominated British art for some three decades before his death in 1792, by which time the British portrait was firmly established. Jonathan Richardson, in his influential Essay on the Theory of Painting (1715), remarked of contemporary portraitists that they had “prostituted a Noble Art, chusing to exchange the honorable Character of good painters for that sordid one of profess’d, mercenary flatterers”. Richardson’s essay was a powerful influence on the young Reynolds. Reynolds went on to cultivate an excellent character as a painter, becoming the first president of the Royal Academy of Arts on its foundation in 1768 (a position he kept for the rest of his life) and acquiring a knighthood; meanwhile, understanding how important it was to distinguish himself from those whom William Hogarth labelled a “nest of Phizmongers”, he worked tirelessly to combat sordid associations. His success is all the more remarkable given his equally tireless attention to the business side of his art. Reynolds quickly became very famous and rich as a portraitist without attracting the opprobrium of being a mercenary flatterer. How did he do it?

Avoiding insipidity was a good start, and to leaf through Hallett’s sumptuous volume is to feel the vibrancy. In the early portraits especially, something is generally going on: Commodore Augustus Keppel is striding towards us (a boiling sea behind); David Garrick is being pulled one way by the muse of comedy and the other by tragedy; Colonel Acland and Lord Sydney are flying through the forest in “The Archers”, a gloriously silly masquerade of heroic masculinity which draws from Hallett one of his rare acknowledgements that Reynolds, in his determination to make the picture move, might sometimes “teeter on the brink of absurdity”.

more here.

Obama Is a Republican: He’s the heir to Richard Nixon, not Saul Alinsky.

Bruce Bartlett in The American Conservative:

A Republican stimulus would undoubtedly have had more tax cuts and less spending, even though every serious study has shown that tax cuts are the least effective method of economic stimulus in a recession. Even so, tax cuts made up 35 percent of the budgetary cost of the stimulus bill—$291 billion—despite an estimate from Obama’s Council of Economic Advisers that tax cuts barely raised the gross domestic product $1 for every $1 of tax cut. By contrast, $1 of government purchases raised GDP $1.55 for every $1 spent. Obama also extended the Bush tax cuts for two years in 2010.
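To put those multipliers side by side, here is a quick back-of-the-envelope sketch using only the figures Bartlett cites above (the 35 percent share, the $291 billion in tax cuts, and the $1.00 and $1.55 multipliers); the implied total stimulus and the forgone GDP are derived illustratively, not numbers reported in the article.

```python
# Illustrative arithmetic based on the figures quoted above; the derived totals
# are back-of-the-envelope estimates, not figures from the article itself.

tax_cut_cost = 291e9          # tax cuts in the stimulus bill, in dollars
tax_cut_share = 0.35          # tax cuts as a share of the bill's budgetary cost
tax_multiplier = 1.00         # GDP raised per $1 of tax cuts (CEA estimate cited)
purchase_multiplier = 1.55    # GDP raised per $1 of government purchases

# Implied total size of the stimulus bill
total_stimulus = tax_cut_cost / tax_cut_share
print(f"Implied total stimulus: ${total_stimulus / 1e9:.0f} billion")  # ~$831 billion

# GDP difference had the same $291 billion gone to purchases instead of tax cuts
forgone_gdp = tax_cut_cost * (purchase_multiplier - tax_multiplier)
print(f"Forgone GDP from favoring tax cuts: ${forgone_gdp / 1e9:.0f} billion")  # ~$160 billion
```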

It’s worth remembering as well that Bush did not exactly bequeath Obama a good fiscal hand. Fiscal year 2009 began on October 1, 2008, and one third of it was baked in the cake the day Obama took the oath of office. On January 7, 2009, the Congressional Budget Office projected significant deficits without considering any Obama initiatives. It estimated a deficit of $1.186 trillion for 2009 with no change in policy. The Office of Management and Budget estimated in November of that year that Bush-era policies, such as Medicare Part D, were responsible for more than half of projected deficits over the next decade.

Republicans give no credit to Obama for the significant deficit reduction that has occurred on his watch—just as they ignore the fact that Bush inherited a projected budget surplus of $5.6 trillion over the following decade, which he turned into an actual deficit of $6.1 trillion, according to a CBO study—but the improvement is real.


Republicans would have us believe that their tight-fisted approach to spending is what brought down the deficit. But in fact, Obama has been very conservative, fiscally, since day one, to the consternation of his own party. According to reporting by the Washington Post and New York Times, Obama actually endorsed much deeper cuts in spending and the deficit than did the Republicans during the 2011 budget negotiations, but Republicans walked away.

Obama’s economic conservatism extends to monetary policy as well. His Federal Reserve appointments have all been moderate to conservative, well within the economic mainstream.

More here.

Expressions of the American Mind


An excerpt from Tom Shachtman's Gentlemen Scientists and Revolutionaries in Scientific American:

During the Revolutionary War, while American laboratory and field research was much reduced, science did not grind to a halt. Scientific thought helped frame America’s initiating rhetoric of the war, and throughout the conflict innovations in medicine and disease control and in arms and armaments were integral to the American effort. This and the next two chapters deal with science-related aspects of the war, the present one with the initiating rhetoric, the next with the medical aspects, and the following chapter with technology in armament.

In the seventeenth and eighteenth centuries, Jürgen Habermas writes, the “light of reason” entered the public sphere in stages, cropping up first among the elite and in a semiprivate way before being adopted by ever wider groups. Broad public participation in debate did not take place until the “problemization of areas that had until then not been questioned,” and when “the issues discussed became ‘general’ not merely in their significance but also in their accessibility: everyone had to be able to participate.” Those stages had characterized the path of natural philosophy in the American colonies, from the initial debate in the public sphere of 1721–1722 about smallpox prevention in the Boston epidemic, increasing through the middle decades of the century and cresting in the broad participation in the recording of the 1769 transit of Venus and in the growing audience for the efforts of the renewed American Philosophical Society. The same path to acceptance was being hewed in the consideration of non-monarchical and non-church governance: the debate was moving steadily from the elite’s private colloquies to publicly available written materials and thence to open assemblies. Habermas insists that the “communicative” aspects of this path were absolutely vital; in his view, the availability of newspapers able to operate beyond the day-to-day control of governing powers geometrically increased a populace’s ability to engage in public argument. From 1765 on, there were increasingly sophisticated discussions in colonial newspapers of direct and indirect taxes and of an accused person’s right to habeas corpus, as well as of such scientific matters as the parallax to be computed from observations of the transit of Venus. At stake in all these sorts of discussions were Enlightenment ideals, particularly those of liberty, justice, and equality, enabling ordinary citizens to openly consider wresting their collective freedom from what Habermas labels the “restrictive particularism” of fealty to kings, lords, and church hierarchies.

More here.

Thursday, October 23, 2014

remembering our friend Matt Power (who would have turned 40 yesterday)

Mark Kirby at GQ:

Today would've been Matthew Power's 40th birthday. The GQ contributor and friend of GQ staffers past and present died this March while reporting a story along the Nile River in Uganda. Matt was curious, adventurous, and always empathetic—he worked tirelessly to understand his subjects and bring them alive on the page. You can read Matt's stories about drone pilots, urban explorers, and the art world here on GQ.com.

In the days after Matt died, dozens of tributes to him sprang up online, coming in from all around the world. Even for those of us who knew Matt well, who knew what a constant and generous friend he was, what was especially remarkable was the number of young journalists Matt had taken the time to coach and mentor: beginning writers who'd gotten a fan letter from Matt at a moment when they were considering giving up, established journalists who owed at least part of their success to a chance Matthew Power had encouraged them to take on themselves.

more here.

Gough Whitlam, 1916-2014


John Quiggin in Crooked Timber:

Gough Whitlam, Prime Minister of Australia from 1972 to 1975, died on Tuesday. More than any other Australian political leader, and as much as any political figure anywhere, Gough Whitlam embodied social democracy in its ascendancy after World War II, its high water mark around 1970 and its defeat by what became known as neoliberalism in the wake of the crises of the 1970s.

Whitlam entered Parliament in 1952, having served in the Royal Australian Air Force during the War, and following a brief but distinguished legal career. Although Labor had already chosen a distinguished lawyer (HV Evatt) as leader, Whitlam’s middle-class professional background was unusual for Labor politicians.

Whitlam marked a clear break with the older generation of Labor politicians in many other respects. He was largely indifferent to the party’s socialist objective (regarding the failure of the Chifley government’s bank nationalisation referendum as having put the issue off the agenda) and actively hostile to the White Australia policy and protectionism, issues with which Labor had long been associated.

On the other hand, he was keen to expand the provision of public services like health and education, complete the welfare state for which previous Labor governments had laid the foundations, and make Australia a fully independent nation rather than being, in Robert Menzies’ words, ‘British to the bootstraps’.

More here.

Who’s afraid of ‘Klinghoffer’?


Adam Shatz in the London Review of Books:

The Death of Klinghoffer, John Adams’s 1991 opera about the hijacking of the Achille Lauro by the Palestine Liberation Front in 1985, has achieved a rare distinction in contemporary classical music: it’s considered so dangerous by its critics that they’d like to have it banned. For its opponents – the Klinghoffer family, Daniel Pearl’s father, conservative Jewish organisations, and now the former New York mayor Rudy Giuliani and former New York governor George Pataki, who took part in a noisy demonstration outside the Met last night – Klinghoffer is no less a sacrilege than The Satanic Verses was to Khomeini and his followers. They haven’t issued a fatwa, but they have done their best to sabotage the production ever since the Met announced it.

Peter Gelb, the Met’s general manager, capitulated in the summer to pressure from the Anti-Defamation League (and, according to the New York Times, from ‘three or four’ major Jewish donors), cancelling a live broadcast to cinemas around the world. The rationale for the decision, made against the backdrop of the Gaza offensive, was that the opera might be exploited by anti-semites. How, they didn’t say. For some reason the opera’s enemies don’t seem concerned that its unflinching portrayal of the murder of an elderly Jew in a wheelchair might be ‘used’ to foment anti-Muslim sentiment.

The notion that Adams and his librettist, Alice Goodman, are justifying terrorism is absurd. The hijacking is depicted in all its horror, chaos and fear. The scene that raised accusations of anti-semitism, a dinner table conversation among ‘the Rumors’, an American-Jewish family, was excised from the libretto long ago.

More here.

Benjamin C. Bradlee (1921-2014)

David Remnick in The New Yorker:

Benjamin Crowninshield Bradlee, the most charismatic and consequential newspaper editor of postwar America, died at the age of ninety-three on Tuesday. Among his many bequests to the Republic was a catalogue of swaggering anecdotes rich enough to float a week of testimonial dinners. Bradlee stories almost always relate to his glittering surface qualities, which combined the Brahmin and the profane. Let’s get at least one good one out of the way:

During his reign, from 1968 to 1991, as the executive editor of the Washington Post, Bradlee took time periodically to dictate correspondence into a recorder. His letters in no way resembled those of Emily Dickinson. He was given neither to self-doubt nor to self-restraint. In his era, there may have been demands by isolated readers for greater transparency, for correction or explanation, but there was no Internet, no Twitter, to amplify them. Bradlee was, by today’s standards, unchallengeable, and he was expert in the art of florid dismissal. His secretary, Debbie Regan, was, in turn, careful to reflect precisely his language when transcribing his dictation. One day, Regan approached the house grammarian, an editor named Tom Lippman, and admitted that she was perplexed. “Look, I have to ask you something,” she said. “Is ‘dickhead’ one word or two?”

This sort of stuff was especially entertaining when you remembered that Bradlee’s family was a concoction of seventeenth-century Yankees and semi-comic Vanity Fair-like European royalty.

More here.

Afghanistan: ‘A Shocking Indictment’

Rory Stewart in the New York Review of Books:

Ashraf Ghani, who has just become the president of Afghanistan, once drafted a document for Hamid Karzai that began:

There is a consensus in Afghan society: violence…must end. National reconciliation and respect for fundamental human rights will form the path to lasting peace and stability across the country. The people’s aspirations must be represented in an accountable, broad-based, gender-sensitive, multi-ethnic, representative government that delivers daily value.

That was twelve years ago. No one speaks like that now—not even the new president. The best case now is presented as political accommodation with the Taliban, the worst as civil war.

Western policymakers still argue, however, that something has been achieved: counterterrorist operations succeeded in destroying al-Qaeda in Afghanistan, there has been progress in health care and education, and even Afghan government has its strengths at the most local level. This is not much, given that the US-led coalition spent $1 trillion and deployed one million soldiers and civilians over thirteen years. But it is better than nothing; and it is tempting to think that everything has now been said: after all, such conclusions are now reflected in thousands of studies by aid agencies, multilateral organizations, foreign ministries, intelligence agencies, universities, and departments of defense.

But Anand Gopal’s No Good Men Among the Living shows that everything has not been said. His new and shocking indictment demonstrates that the failures of the intervention were worse than even the most cynical believed.

More here.

“Lingering Scars” by Farzana Hossen

Billy Kung in ArtAsiaPacific:

From Dhaka, 32-year-old photographer Farzana Hossen has produced a harrowing document called “Lingering Scars” (2013), a series of photographs depicting women victims of acid attacks. Hossen is one of 13 photographers taking part in an exhibition called “Voice of Tacitness: Asian Women Photography,” currently running at the Hong Kong Arts Centre from October 19 until November 2. According to Acid Survivors Foundation (ASF) in Bangladesh, from 1999 to 2011 there were 1,084 reported cases of acid assaults against women. Most of these attacks were marriage related, or lovers spurned, and far too often women are blamed for family breakups and divorce. Worse still is the fact that both society and state have often turned a blind eye to the situation. Violence against women is an ongoing issue that has been addressed by many artists, but none with the sobering directness and honesty that are shown through Hossen’s body of work. Her bravery and sensitivity in the portrayal of these women are admirable, and the trust she has established with them comes across in her images. Upon first viewing the photographs, one is almost affronted by the horror, but those immediate responses are soon overtaken by an unbelievable sadness and at the same time, the courage and warmth displayed between the subjects and the photographer. Hossen wrote:

“I intend to work with women and girls who have survived an acid attack, and are trying to rebuild their lives despite carrying horrific mental and physical wounds. This is a profoundly personal undertaking and an important part of reflecting upon my past. I plan to travel to ten different districts of Bangladesh where groups of survivors have built support structures. I want to document their lives, their struggles, their sufferings and their resilience. The stories of these women will shed light on a very dark corner of human existence. They will give voice to individuals who’ve been silenced by their oppressors. They will tell the world that we need to campaign for women’s rights.”

More here.

The discovery of Homo floresiensis: Tales of the hobbit

Ewen Callaway in Nature:

In 2004, researchers announced the discovery of Homo floresiensis, a small relative of modern humans that lived as recently as 18,000 years ago. The ‘hobbit’ is now considered the most important hominin fossil in a generation. Here, the scientists behind the find tell its story. The hobbit team did not set out to find a new species. Instead, the researchers were trying to trace how ancient people travelled from mainland Asia to Australia. At least that was the idea when they began digging in Liang Bua, a large, cool cave in the highlands of Flores in Indonesia. The team was led by archaeologists Mike Morwood and Raden Soejono, who are now deceased.

Roberts: It was a very small body. That was the first thing that was immediately apparent — but also an incredibly small skull. We first thought, “Oh, it’s a child.” There was a guy who was working with us called Rokus. He did all the faunal identifications of the bones. But Rokus said, “No, no, no, it’s not a child. It’s not modern human at all. It’s a different species.”

Saptomo: Thomas drew the skeleton on paper, and he faxed the drawing to Mike and to Professor Soejono in Jakarta.

Sutikna: Mike called me at night. I couldn’t understand what he was saying over the phone, he was so excited.

More here.

It’s Possible to Live in More than One Time, More than One History of the World


John Crowley in Lapham's Quarterly:

“Then what is time?” St. Augustine asked himself in his Confessions. “I know what it is if no one asks; but if anyone does, then I cannot explain it.”

Augustine saw the present as a vanishing knife edge between the past, which exists no longer, and the future, which doesn’t yet. All that exists is the present; but if the present is always present and never becomes the past, it’s not time, but eternity. Augustine’s view is what the metaphysicians call “presentism,” which holds that a comprehensive description of what exists (an ontology) can and should include only what exists right now. But among the things that do exist now are surely such things as the memory of former present moments and what existed in them, and the archives and old calendars that denote or describe them. Like the dropped mitten in the Ukrainian tale that is able to accommodate animals of all sizes seeking refuge in it from the cold, the ever-vanishing present is weirdly capacious—“There’s always room for one more!”

Time is continuous, but calendars are repetitive. They end by beginning again, adding units to ongoing time just by turning in place, like a stationary bicycle. Most calendars these days are largely empty, a frame for our personal events and commitments to be entered in; but historically calendars have existed in order to control time’s passage with recurring feasts, memorials, sacred duties, public duties, and sacred duties done publicly—what the church I grew up in calls holy days of obligation. Such a calendar can model in miniature the whole of time, its first day commemorating the first day of Creation, its red-letter days the great moments of world time coming up in the same order they occurred in history, the last date the last day, when all of time begins again. The recent fascination with the Mayan “long count” calendar reflects this: the world cycle was to end when the calendar did.

It’s possible to live in more than one time, more than one history of the world, without feeling a pressing need to reconcile them. Many people live in a sacred time—what the religious historian Mircea Eliade called “a primordial mythical time made present”—and a secular time, “secular” from the Latin saeculum, an age or a generation. Sacred time, “indefinitely recoverable, indefinitely repeatable,” according to Eliade, “neither changes nor is exhausted.” In secular time, on the other hand, each year, month, second, is a unique and unrepeatable unit that disappears even as it appears in the infinitesimal present.

More here.

Thursday Poem

An Arundel Tomb

Side by side, their faces blurred,
The earl and countess lie in stone,
Their proper habits vaguely shown
As jointed armour, stiffened pleat,
And that faint hint of the absurd –
The little dogs under their feet.

Such plainness of the pre-baroque
Hardly involves the eye, until
It meets his left-hand gauntlet, still
Clasped empty in the other; and
One sees, with a sharp tender shock,
His hand withdrawn, holding her hand.

They would not think to lie so long.
Such faithfulness in effigy
Was just a detail friends would see:
A sculptor's sweet commissioned grace
Thrown off in helping to prolong
The Latin names around the base.

They would not guess how early in
Their supine stationary voyage
The air would change to soundless damage,
Turn the old tenantry away;
How soon succeeding eyes begin
To look, not read. Rigidly they

Persisted, linked, through lengths and breadths
Of time. Snow fell, undated. Light
Each summer thronged the grass. A bright
Litter of birdcalls strewed the same
Bone-littered ground. And up the paths
The endless altered people came,

Washing at their identity.
Now, helpless in the hollow of
An unarmorial age, a trough
Of smoke in slow suspended skeins
Above their scrap of history,
Only an attitude remains:

Time has transfigured them into
Untruth. The stone fidelity
They hardly meant has come to be
Their final blazon, and to prove
Our almost-instinct almost true:
What will survive of us is love.

by Philip Larkin
from The Whitsun Weddings, 1964


Do We Live in the Matrix?


Zeeya Merali in Discover:

In the 1999 sci-fi film classic The Matrix, the protagonist, Neo, is stunned to see people defying the laws of physics, running up walls and vanishing suddenly. These superhuman violations of the rules of the universe are possible because, unbeknownst to him, Neo’s consciousness is embedded in the Matrix, a virtual-reality simulation created by sentient machines.

The action really begins when Neo is given a fateful choice: Take the blue pill and return to his oblivious, virtual existence, or take the red pill to learn the truth about the Matrix and find out “how deep the rabbit hole goes.”

Physicists can now offer us the same choice, the ability to test whether we live in our own virtual Matrix, by studying radiation from space. As fanciful as it sounds, some philosophers have long argued that we’re actually more likely to be artificial intelligences trapped in a fake universe than we are organic minds in the “real” one.

But if that were true, the very laws of physics that allow us to devise such reality-checking technology may have little to do with the fundamental rules that govern the meta-universe inhabited by our simulators. To us, these programmers would be gods, able to twist reality on a whim.

So should we say yes to the offer to take the red pill and learn the truth — or are the implications too disturbing?

More here.

The Rise of Data and the Death of Politics


Evgeny Morozov in The Observer:

In the near future, Google will be the middleman standing between you and your fridge, you and your car, you and your rubbish bin, allowing the National Security Agency to satisfy its data addiction in bulk and via a single window.

This “smartification” of everyday life follows a familiar pattern: there's primary data – a list of what's in your smart fridge and your bin – and metadata – a log of how often you open either of these things or when they communicate with one another. Both produce interesting insights: cue smart mattresses – one recent model promises to track respiration and heart rates and how much you move during the night – and smart utensils that provide nutritional advice.

In addition to making our lives more efficient, this smart world also presents us with an exciting political choice. If so much of our everyday behaviour is already captured, analysed and nudged, why stick with unempirical approaches to regulation? Why rely on laws when one has sensors and feedback mechanisms? If policy interventions are to be – to use the buzzwords of the day – “evidence-based” and “results-oriented,” technology is here to help.

This new type of governance has a name: algorithmic regulation. In as much as Silicon Valley has a political programme, this is it. Tim O'Reilly, an influential technology publisher, venture capitalist and ideas man (he is to blame for popularising the term “web 2.0”) has been its most enthusiastic promoter. In a recent essay that lays out his reasoning, O'Reilly makes an intriguing case for the virtues of algorithmic regulation – a case that deserves close scrutiny both for what it promises policymakers and the simplistic assumptions it makes about politics, democracy and power.

More here.

Wednesday, October 22, 2014

Behind the Mask: The Life of Vita Sackville-West

Rachel Trethewey in The Independent:

In the famous image of Vita Sackville-West, Lady with a Red Hat, the writer is the embodiment of the confident young aristocrat. Exuding a languid elegance, her heavy-lidded Sackville eyes gaze out from beneath the broad brim. But this portrait captures another element of Vita’s persona. It was painted in 1918, shortly after her sexual awakening with Violet Keppel, and beneath the flamboyant clothes and bright lipstick there is an androgynous quality. In Behind the Mask, the first biography of Vita for 30 years, Matthew Dennison focuses on this ambiguity, exploring the duality which was rooted in her genetic inheritance and her eccentric upbringing.

Vita’s identity embraced masculine and feminine elements; her stiff-upper-lip English ancestry was in conflict with the Latin blood from her grandmother Pepita, a Spanish dancer who was the mistress of Lionel, Baron Sackville. Among their illegitimate offspring was Vita’s mother Victoria, who by marrying her cousin became the mistress of the Sackvilles’ ancestral home, Knole in Kent. The author of acclaimed biographies of Queen Victoria and her daughter Princess Beatrice, Dennison is particularly good at analysing complex mother-daughter relationships. Here, he sees Victoria’s identity interwoven with Vita’s. The former was a capricious character, he explains: “The fairy godmother was also a witch.” She claimed she could not bear to look at Vita because she was so ugly; the cruelty in Vita’s treatment of her lovers was learnt from her mother. An only child, Vita was often left at Knole with nannies and governesses while her parents travelled abroad. The house became like a person to her; built like a medieval village, it fired her imagination. Tragically for Vita, because she was a female she could not inherit the house. Dennison sees her fiction as addressing this; in her fantasy life, she celebrated a heroic male version of herself.

More here.