samuel johnson’s slave

Kathryn Sutherland at the Times Literary Supplement:

In consideration of the extraordinary life he records, Michael Bundock has given his fine biography of Francis Barber a subtitle that invokes the authenticating formula of the eighteenth-century novel: this is The true story of the Jamaican slave who became Samuel Johnson’s heir. Born on a sugar plantation in 1742/3 (the date is uncertain), the boy who later became Francis Barber was allotted the name Quashey; a generic slave name, it may also indicate he was born on a Sunday. Quashey inherited slave status, being literally the property of his master, Colonel Richard Bathurst, to sell or lend or give away. When the failure of his estates forced Bathurst to leave Jamaica, Quashey went with him along with the rest of his luggage. Was he Bathurst’s son? Perhaps, though there is no evidence to confirm this. In London, they lodged with Dr Richard Bathurst, who was the Colonel’s son and a friend of Johnson. Both men were passionate opponents of slavery. Here Quashey was baptized, receiving the name Francis Barber (the reason for the choice is unclear), his baptism possibly remitting his slavery (again, this is uncertain). Almost immediately, he was packed off to school some 250 miles away, to the small village of Barton in North Yorkshire, where his must surely have been the only black face. He returned to London two years later, at which time he joined Johnson’s household in Gough Square, Fleet Street. Already seasoned in adventures, Francis Barber was now probably around ten years old.

From the late seventeenth century, British involvement in the transatlantic slave trade led to a significant expansion of the black population of London and other port cities – Southampton, Bristol, Liverpool. Black slaves attended returning sea captains, colonial officials, merchants and plantation owners.

more here.

A spy’s daughter remembers the haunting unreality of embassy life in South Vietnam before the fall

Sarah Mansfield Taber at The American Scholar:

One summer evening in Saigon in 1974, we were invited to dinner at the home of another U.S. embassy employee, probably a covert operative like my father. I don’t remember who he was, but I recall the house—an elegant colonial villa with high ceilings and boldly colored tiled floors, surrounded by a high concrete wall. We parked on the street, walked past two guards, and slipped through a slender door cut into the wall’s façade. Like so many experiences during this sojourn of mine, stepping through that portal felt uncanny and intriguing and off. The country was at war, the enemy digging its steady way down the Ho Chi Minh Trail. Yet here I was, a rising junior in college, tagging along with my parents in hand-tailored dresses to elegant dinner parties featuring French food served by beautiful Vietnamese girls. I was annoyed at my mother that whole summer and jealous of my brother, two years younger, but I remember feeling, as I passed through that almost invisible door in the gate, that I was entering into a private and ephemeral principality, a world that could crumble at any second.

After dinner, my mother pleasantly tipsy, we ventured into the warm tropical night, across the dusky garden with its pots of fragrant plants, and out the magic door. Not a guard was in sight. Residence guards, in their little booths, often fell asleep in the evening, worn out by their bored, day-long vigils in the unrelenting heat. The embassy people joked that they hoped the guards would wake up if the Vietcong arrived.

more here.

american violence and apple pie

A.S. Hamrah at The Baffler:

When things are very American, they are as American as apple pie. Except violence. H. Rap Brown said violence “is as American as cherry pie,” not apple pie. Brown’s maxim makes us see violence as red and gelatinous, spooned from a can.

But for Brown, in 1967, American violence was white. Explicitly casting himself as an outsider, Brown said in his cherry pie speech that “violence is a part of America’s culture” and that Americans taught violence to black people. He explained that violence is a necessary form of self-protection in a society where white people set fire to Bowery bums for fun, and where they shoot strangers from the towers of college campuses for no reason—this was less than a year after Charles Whitman had killed eleven people that way at the University of Texas in Austin, the first mass shooting of its kind in U.S. history. Brown compared these deadly acts of violence to the war in Vietnam; president Lyndon B. Johnson, too, was burning people alive. He said the president’s wife was more his enemy than the people of Vietnam were, and that he’d rather kill her than them.

more here.

The Space Within: The future of healthcare

Adam Simpson in The Atlantic:

Instead of looking at broad populations to pinpoint trends within subsets of them, the medical world is increasingly turning to the individual, who can now be studied in higher definition than ever before. Precision medicine—the idea that treatments can be based on a patient’s unique biological and physiological characteristics—is gaining momentum.

More here.

Thursday Poem

The Clod and the Pebble

'Love seeketh not itself to please,
Nor for itself hath any care,
But for another gives its ease,
And builds a heaven in hell's despair.'

So sung a little clod of clay,
Trodden with the cattle's feet;
But a pebble of the brook
Warbled out these meters meet:

'Love seeketh only Self to please,
To bind another to its delight,
Joys in another's loss of ease,
And builds a hell in heaven's despite.'

by William Blake

Wednesday, July 29, 2015

What Is Wrong with the West’s Economies?

Edmund S. Phelps in The New York Review of Books:

Our prevailing political economy is blind to the very concept of inclusion; it does not map out any remedy for the deficiency. A monograph of mine and a conference volume I edited are among the few book-length studies of ways to remedy failure to include people generally in an economy in which they will have satisfying work.3

Commentators are talking now about injustice of another sort. Workers in decent jobs view the economy as unjust if they or their children have virtually no chance of climbing to a higher rung in the socioeconomic ladder. And moving up appears harder now. Even in the Gilded Age, many of the moguls came up from the bottom. (The rungs were far apart, yet the ladder was climbed.) The feeling of injustice comes from a sense of unfair advantages: that those above are using their connections to stay there—or to ensure that their children can follow them. The bar to upward mobility is always the same: barriers to competition put up by the wealthy, the connected, corporations, professional associations, unions, and guilds.

But the truth is that no degree of Rawlsian action to pull up low-end wages and employment—or remove unfair advantages—could have spared the less advantaged from a major loss of inclusion since Rawls’s time. The forces of productivity slowdown and globalization have been too strong. Moreover, though the injustices in the West’s economies are egregious, they ought not to be seen as a major cause of the productivity slowdowns and globalization. (For one thing, a slowdown of productivity started in the US in the mid-1960s and the sharp loss of manufacturing jobs to poorer countries occurred much later—from the late 1970s to the early 1990s.) Deeper causes must be at work.

More here.

The Moments of Realism

Ben Parker reviews Fredric Jameson's The Antinomies of Realism, in the LA Review of Books:

THE ODD THING about literary “realism” is that it is not a descriptive term at all, but a period: roughly 1830–1895, from Stendhal’s The Red and the Black to Hardy’s Jude the Obscure. Many classics of 19th-century realism would be conspicuously ruled out if plausibility were any criterion. Balzac’s first successful novel, La Peau de chagrin, is about a gambler who purchases a magical, wish-fulfilling animal skin that shrinks with every wish granted; Stendhal’s Charterhouse of Parma is essentially a swashbuckling romp through Napoleonic Europe; Anna Karenina includes the interior monologue of a dog, long before Kafka; Flaubert’s works include a lurid, violent novel about the fall of ancient Carthage, and a play in which Saint Anthony confronts the Buddha, Isis, the Devil, and the Seven Deadly Sins in the desert. “Magical realism” is something of a pleonasm; 19th-century realism is already reliably outrageous, phantasmagoric, and credibility-straining.

The past tends to be evacuated of its specifics, and so realism becomes, in the folk vocabulary of everyday criticism, simply “the way that we used to do things.” The implication here is “… before we learned better,” where modernism, and most often Virginia Woolf, plays the role of pedagogue. By a curious twist, “realism” then becomes descriptive once again, as the term now encompasses a warehouse of discarded, seemingly ingenuous (but covertly ideological) techniques for the misguided project of grasping “reality.”

In her 2008 essay “Two Paths for the Novel,” Zadie Smith — in the same vein of condescension toward a hazy, credulous past — identified realism, specifically “the nineteenth-century lyrical Realism of Balzac and Flaubert,” as “a literary form in long-term crisis,” an archaic obstruction on the highway of literary culture. This realism was supposedly built on “the transcendent importance of form, the incantatory power of language to reveal truth, the essential fullness and continuity of the self.” Realism was a “bedtime story,” propagating the ideology that “the self is a bottomless pool,” and dating to a prelapsarian epoch when “novels weren’t neurotic.” All of this would come as a surprise, I think, to readers of Balzac and Flaubert: surely the latter is the most neurotic of novelists.

In fact, realism was never this way. Nineteenth-century realism was not a “bedtime story.” On the contrary, the prevailing idea that before modernism we all innocently believed in an essential plenitude of the self is itself a comforting fable by which to tuck in undergraduates. Even in as Masterpiece Theatre–ready a work as Thomas Hardy’s Tess of the D’Urbervilles, the heroine is crucially “absent” (narcoleptic, automaton-like) from her own attention at catastrophic, life-determining moments of rape and violence.

More here.

Young, smart and want to save lives? Become a banker

Joseph D'Urso at Thomson Reuters Foundation:

Investment banking doesn't rank highly on most people's lists of ethical career choices, but according to one of the world's most famous living philosophers, becoming a hot shot in finance may be the best way for a bright graduate to help the global poor.

A high earner in the corporate world who is giving away large sums can create more social gain than if they did charity work, said Peter Singer, who teaches at Princeton University.

“If they are able to live modestly and give a lot away, they can save many lives,” he told the Thomson Reuters Foundation.

Singer is part of a movement of donors known as 'effective altruists', who demand concrete results from charitable donations, and often come from the business world. Silicon Valley billionaire Elon Musk will address the movement's global conference at Google headquarters in California in September.

The growing community encourages people to give big chunks of their income, typically around ten percent but in some cases more than half, to charities that alleviate global poverty.

More here.

A brief history of the European future

Robert Menasse at Eurozine:

Creating peace in Europe was a moral ambition that all could share. Yet Monnet was fully aware that moral appeals and trust in people's war-weariness would provide no more lasting security than international peace treaties. His idea, formulated as a plan with the French foreign minister Robert Schuman, was to overcome nationalism by gradually encouraging the nations to abandon rights of sovereignty, until, equally hollowed out and deprived of their very core, they would cease to have a future, hence undermining nationalism definitively. For this to work, supra-national institutions would have to gradually take over from national institutions. This process began with the creation of a high authority that regulated coal and steel production on behalf of the member states. Coal and steel were crucial not only for war but for reconstruction and economic revival. Creating a supra-national authority that controlled these products, ensuring their fair distribution and preventing secret rearmament, was the first step in a planned post-national development that would lead to the political and economic integration of the European nations, prevent them from deviating from the path, and that would ultimately supersede the nations entirely.

“Nationalism has destroyed European culture and civilization.” (Stefan Zweig)

“The nations and their political institutions have proved once and for all that they are not equal to the task of lasting peace and rule of law.” (Jean Monnet)

more here.

what is ‘not writing’?

Anne Boyer at Bookforum:

When I am not writing I am not writing a novel called 1994 about a young woman in an office park in a provincial town who has a job cutting and pasting time. I am not writing a novel called Nero about the world’s richest art star in space. I am not writing a book called Kansas City Spleen. I am not writing a sequel to Kansas City Spleen called Bitch’s Maldoror. I am not writing a book of political philosophy called Questions for Poets. I am not writing a scandalous memoir. I am not writing a pathetic memoir. I am not writing a memoir about poetry or love. I am not writing a memoir about poverty, debt collection, or bankruptcy. I am not writing about family court. I am not writing a memoir because memoirs are for property owners and not writing a memoir about prohibitions of memoirs.

When I am not writing a memoir I am also not writing any kind of poetry, not prose poems contemporary or otherwise, not poems made of fragments, not tightened and compressed poems, not loosened and conversational poems, not conceptual poems, not virtuosic poems employing many different types of euphonious devices, not poems with epiphanies and not poems without, not documentary poems about recent political moments, not poems heavy with allusions to critical theory and popular song.

more here.

HOW TO REMAIN HUMAN

In Cleveland, the ghost of d.a. levy is everywhere, even animating MOCA Cleveland's summer show. But what is it that makes the poet's legacy endure?

Morgan Meis in The Smart Set:

A young poet killed himself in Cleveland on November 24, 1968. He did it with a .22 caliber rifle he’d owned since childhood. In the years leading up to his death, the poet often demonstrated to friends how he could operate the gun with his feet and put the muzzle against his forehead, right at the spot of his “third eye.” The poet’s name was d. a. levy, as he liked to spell it (he was born Darryl Alfred Levy). He was just 26 years old when he died.

Just a year before his death, levy was arrested by the Cleveland police. He’d been indicted in 1966. The specific charge was “contributing to the delinquency of a minor.” At a poetry reading, he allowed juveniles to read work deemed obscene by city officials. levy’s own poetry had its share of bad words, sex, and drugs. The poet was a public advocate for the legalization of marijuana. It all seems rather tame by today’s standards. But in Cleveland in 1968, the d. a. levy affair created quite a ruckus. His arrest brought national attention. Guys like Allen Ginsberg and Gary Snyder got involved in the case, advocating for the dismissal of the charges against levy. The call to “legalize levy” became a rallying cry at protests and on t-shirts and flyers, not just in Cleveland but around the country.

After his death, many people in Cleveland adopted levy as a kind of local hero. And there it should have ended, if history is any guide. A young poet takes his own life. A city mourns. The relentless wheel of history churns on, forgetting as it goes.

More here.

thinking about fukuyama

Daniel Luban at The Point:

If the right attacked Fukuyama for being insufficiently fearful about political threats to Western liberalism, the left attacked him for being insufficiently hopeful about economic alternatives to it. Fukuyama’s argument came on the heels of a set of developments that seemed to fit a pattern: the collapse of the USSR; Deng Xiaoping’s decision to move China toward something that looked a great deal like capitalism; Margaret Thatcher’s and Ronald Reagan’s attacks on the postwar welfare state. The closing-off of systematic alternatives to capitalism coincided with capitalism’s own transition from “Fordism” to “neoliberalism” (to use the now-conventional terminology), and Fukuyama seemed to exemplify both of these pernicious trends. To detractors on the left, his thesis was at best a failure of political imagination and at worst a highfalutin version of Thatcher’s taunt that “there is no alternative” to the free market.

However unappealing Fukuyama’s view may have been to the left, the lean years of Third Way liberalism and compassionate conservatism did little to disconfirm it. But more recent events have offered critics of the left, like those of the right, the chance to claim vindication by history.

more here.

Genomics pioneer Jun Wang on his new AI venture

David Cyranoski in Nature:

Jun Wang is one of China’s most famous scientists. Since joining the genome-sequencing powerhouse BGI when it started up 16 years ago, he has participated in some of its biggest accomplishments. These include sequencing the first genome of an Asian person1, the giant panda2 and the human gut microbiome3, as well as contributions to the Human Genome Project. Wang has led BGI since 2007 (when it stopped using the name Beijing Genomics Institute and moved its headquarters to Shenzhen). But on 17 July, the institute announced that he will give up that position to pursue research into artificial intelligence (AI).

What is the concept behind your AI project?

Basically, I am just trying to feed an AI system with masses of data. Then that system could learn to understand human health and human life better than we do. The AI will try to draw a formula for life. Life is digital, like a computer program. If you want to understand the results of the programming, how the genes lead to phenotypes, it is sufficiently complicated that you need an AI system to figure out the rules. The AI system will basically consist of two components. The first is the big supercomputing platforms. We already have access to those through cloud computing and supercomputing centres. These will run or devise algorithms that look for relationships between genes, lifestyle and environmental factors, and predict phenotypes. The other thing is big data. We want to have data from one million individuals. And we want the data to be alive, in the sense that they can update their phenotype information at any time point. Other big computing companies, such as Google, could eventually do this, but we want to do it first. And we have the experience with the big data.

More here.

Tuesday, July 28, 2015

The false dichotomy of Islamophobia

Massimo Pigliucci in Scientia Salon:

A false dichotomy is a basic type of informal logical fallacy, consisting in framing an issue as if there were only two choices available, while in fact a range of nuanced positions may be on offer upon more careful reflection. While I have argued together with my colleagues Maarten Boudry and Fabio Paglieri that often so-called logical fallacies turn out to be pretty reasonable heuristic strategies [1], there are nonetheless plenty of instances where they do identify truly bad reasoning. I have recently discussed one such case in reference to so-called trigger warnings in the context of college classes [2], but another one is arguably represented by the never ending “debate” about Islamophobia.

It is easy to find stark examples of people defending what appear to be two irreconcilable positions about how to view Islam in a post-9/11 world. For the sake of discussion, I will bypass pundits and other pseudo-intellectuals, and use instead two comedians as representative of the contrasting positions: Jon Stewart [3] and Bill Maher [4].

Before proceeding I must acknowledge that while I’ve liked Stewart for a long time, and followed with pleasure his evolution from being solely a comedian to a savvy social commentator during his run at the Daily Show [5], my appreciation of Maher has slid further and further. I used to like his brusque style back when he was doing his “Politically Incorrect” show, first on Comedy Central, then on ABC [6]. I was aghast when ABC (allegedly) let him go because he had dared to make the truly politically incorrect (but clearly correct) statement that the 9/11 hijackers could properly be labelled with a number of negative epithets, but that “coward” wasn’t one of them. But then he made his Religulous movie [7], where he slid into crass new atheism-style “criticism” of religion, and finally came out as an anti-vaxxer all the while chastising some of his guests who were “skeptical” of climate change for being anti-science.

More here.

After the crash, can biologists fix economics?

Kate Douglas in New Scientist:

The global financial crisis of 2008 took the world by surprise. Few mainstream economists saw it coming. Most were blind even to the possibility of such a catastrophic collapse. Since then, they have failed to agree on the interventions required to fix it. But it’s not just the crash: there is a growing feeling that orthodox economics can’t provide the answers to our most pressing problems, such as why inequality is spiralling. No wonder there’s talk of revolution.

Earlier this year, several dozen quiet radicals met in a boxy red building on the outskirts of Frankfurt, Germany, to plot just that. The stated aim of this Ernst Strüngmann Forum at the Frankfurt Institute for Advanced Studies was to create “a new synthesis for economics”. But the most zealous of the participants – an unlikely alliance of economists, anthropologists, ecologists and evolutionary biologists – really do want to overthrow the old regime. They hope their ideas will mark the beginning of a new movement to rework economics using tools from more successful scientific disciplines.

Drill down, and it’s not difficult to see where mainstream “neoclassical” economics has gone wrong. Since the 19th century, economies have essentially been described with mathematical formulae. This elevated economics above most social sciences and allowed forecasting. But it comes at the price of ignoring the complexities of human beings and their interactions – the things that actually make economic systems tick.

The problems start with Homo economicus, a species of fantasy beings who stand at the centre of orthodox economics. All members of H. economicus think rationally and act in their own self-interest at all times, never learning from or considering others.

We’ve known for a while now that Homo sapiens is not like that (see “Team humanity”). Over the years, there have been various attempts to inject more realism into the field by incorporating insights into how humans actually behave. Known as behavioural economics, this approach has met with some success in microeconomics – the study of how individuals and small groups make economic decisions.

More here.

The Greek Warrior

Ian Parker in The New Yorker (Photo by Davide Monteleone):

Varoufakis, a mathematical economist with a modest academic reputation, had become a popular writer in Greece. When the snap election was called, he interrupted his professorship at the University of Texas, flew home to Greece, and launched a ten-day election campaign whose sole expense was the cost of gas for his motorcycle. He was running for parliament, with the aim of becoming the finance minister in a Syriza government. The vote was held on January 25th. Syriza doubled its number of seats in parliament, and Tsipras formed a government in coalition with a small right-of-center party that shared its opposition to the troika’s terms. Varoufakis was elected with a larger share of the vote than any other candidate, and he was named the finance minister. His only previous experience of representative office was as the (white, Greek) leader of the Black Students’ Alliance at the University of Essex, a British institution, in the late seventies. Privately, he asked himself, “What have I done?” On his blog, he borrowed some thoughts of defiance—and, by implication, certain failure—from Dylan Thomas. “Greek democracy today chose to stop going gently into the night,” Varoufakis wrote. “Greek democracy resolved to rage against the dying of the light.”

A few years ago, Varoufakis told Yorgos Avgeropoulos, a documentary filmmaker, that the difference between a debt of ten thousand euros and one of three hundred billion euros is that only the latter gives you negotiating power. And it does so only under one condition: “You must be prepared to say no.” Upon his election, Varoufakis used the less than ideal influence available to a rock climber who, roped to his companions, announces a willingness to let go. On behalf of Tsipras’s government, Varoufakis told Greece’s creditors, and the world’s media, that his country objected to the terms of its agreements. This position encouraged widespread commentary about Greece following a heedless path from “no” to default, and from default to a “Grexit” from the euro currency, which might lead to economic catastrophe in Europe and the world.

It was as if Christopher Hitchens had woken up one day as Secretary of State. Varoufakis was no longer writing elegantly prosecutorial blog posts about Christine Lagarde, the managing director of the I.M.F.; he was meeting with Lagarde. Within days of Greece’s election, an academic with Marxist roots, a shaved head, and a strong jaw had become one of the world’s most recognizable politicians. He showed a level of intellectual and rhetorical confidence—or, perhaps, unearned swagger—that lifted Greek hearts and infuriated Northern European politicians.

More here.

Ta-Nehisi Coates woke me up: Lessons on race, atheism and my white privilege

Greg Epstein in Salon:

Coates, an award-winning journalist for the Atlantic, is primarily seen as a writer on race. And “Between the World and Me” is, on one level, a book about race, with the story of his murdered friend Prince Jones making Sandra Bland’s seemingly similar death look all the more like a depressing and infuriating act of terror. But atheists and humanists tend to see ourselves as transcending culture and race. So much so that I’ve always been dismayed to find the majority of people who tend to show up at the meetings of organizations with words like atheist and humanist in their names, are so very, very white. Why? Maybe, as I explored in my book “Good Without God” (a title meant to offer a three-word definition of humanism), in an America where religious identity is all many minorities have to fortify them against a society that treats them as inferior and other, identifying as an atheist is far easier for people of privilege.

But Coates’ new book is also, boldly, about atheism. It is even more so about humanism. Crafting a powerful narrative about white Americans — or, as he says, those of us who need to think we are white — who are living The Dream — Coates makes a profound statement of what is, and is not, good, with or without god. Coates refers not to Martin Luther King Jr.’s dream, not quite even to the “American Dream,” but rather to The Dream in which we forget our history, our identity and much of our nation’s prosperity is built on the foundation of the suffering of people of color in general and black people in particular. The Dream, in other words, is not a state in which only Fox News Watchers find themselves. It is a state that can cancel out the very best of white, liberal, humanist intentions.

More here.