A brief history of the European future

Robert Menasse at Eurozine:

Creating peace in Europe was a moral ambition that all could share. Yet Monnet was fully aware that moral appeals and trust in people's war-weariness would provide no more lasting security than international peace treaties. His idea, formulated as a plan with the French foreign minister Robert Schuman, was to overcome nationalism by gradually encouraging the nations to abandon rights of sovereignty, until, equally hollowed out and deprived of their very core, they would cease to have a future, hence undermining nationalism definitively. For this to work, supra-national institutions would have to gradually take over from national institutions. This process began with the creation of a high authority that regulated coal and steel production on behalf of the member states. Coal and steel were crucial not only for war but for reconstruction and economic revival. Creating a supra-national authority that controlled these products, ensuring their fair distribution and preventing secret rearmament, was the first step in a planned post-national development that would lead to the political and economic integration of the European nations, prevent them from deviating from the path, and that would ultimately supersede the nations entirely.

“Nationalism has destroyed European culture and civilization.” (Stefan Zweig)

“The nations and their political institutions have proved once and for all that they are not equal to the task of lasting peace and rule of law.” (Jean Monnet)

More here.



what is ‘not writing’?

Anne Boyer at Bookforum:

When I am not writing I am not writing a novel called 1994 about a young woman in an office park in a provincial town who has a job cutting and pasting time. I am not writing a novel called Nero about the world’s richest art star in space. I am not writing a book called Kansas City Spleen. I am not writing a sequel to Kansas City Spleen called Bitch’s Maldoror. I am not writing a book of political philosophy called Questions for Poets. I am not writing a scandalous memoir. I am not writing a pathetic memoir. I am not writing a memoir about poetry or love. I am not writing a memoir about poverty, debt collection, or bankruptcy. I am not writing about family court. I am not writing a memoir because memoirs are for property owners and not writing a memoir about prohibitions of memoirs.

When I am not writing a memoir I am also not writing any kind of poetry, not prose poems contemporary or otherwise, not poems made of fragments, not tightened and compressed poems, not loosened and conversational poems, not conceptual poems, not virtuosic poems employing many different types of euphonious devices, not poems with epiphanies and not poems without, not documentary poems about recent political moments, not poems heavy with allusions to critical theory and popular song.

More here.

HOW TO REMAIN HUMAN

In Cleveland, the ghost of d.a. levy is everywhere, even animating MOCA Cleveland's summer show. But what is it that makes the poet's legacy endure?

Morgan Meis in The Smart Set:

A young poet killed himself in Cleveland on November 24, 1968. He did it with a .22 caliber rifle he’d owned since childhood. In the years leading up to his death, the poet often demonstrated to friends how he could operate the gun with his feet and put the muzzle against his forehead, right at the spot of his “third eye.” The poet’s name was d. a. levy, as he liked to spell it (he was born Darryl Alfred Levy). He was just 26 years old when he died.

Just a year before his death, levy was arrested by the Cleveland police. He’d been indicted in 1966. The specific charge was “contributing to the delinquency of a minor.” At a poetry reading, he had allowed juveniles to read work deemed obscene by city officials. levy’s own poetry had its share of bad words, sex, and drugs. The poet was a public advocate for the legalization of marijuana. It all seems rather tame by today’s standards. But in Cleveland in 1968, the d. a. levy affair created quite a ruckus. His arrest brought national attention. Guys like Allen Ginsberg and Gary Snyder got involved in the case, advocating for the dismissal of the charges against levy. The call to “legalize levy” became a rallying cry at protests and on t-shirts and flyers, not just in Cleveland but around the country.

After his death, many people in Cleveland adopted levy as a kind of local hero. And there it should have ended, if history is any guide. A young poet takes his own life. A city mourns. The relentless wheel of history churns on, forgetting as it goes.

More here.

thinking about fukuyama

Daniel Luban at The Point:

If the right attacked Fukuyama for being insufficiently fearful about political threats to Western liberalism, the left attacked him for being insufficiently hopeful about economic alternatives to it. Fukuyama’s argument came on the heels of a set of developments that seemed to fit a pattern: the collapse of the USSR; Deng Xiaoping’s decision to move China toward something that looked a great deal like capitalism; Margaret Thatcher’s and Ronald Reagan’s attacks on the postwar welfare state. The closing-off of systematic alternatives to capitalism coincided with capitalism’s own transition from “Fordism” to “neoliberalism” (to use the now-conventional terminology), and Fukuyama seemed to exemplify both of these pernicious trends. To detractors on the left, his thesis was at best a failure of political imagination and at worst a highfalutin version of Thatcher’s taunt that “there is no alternative” to the free market.

However unappealing Fukuyama’s view may have been to the left, the lean years of Third Way liberalism and compassionate conservatism did little to disconfirm it. But more recent events have offered critics on the left, like those on the right, the chance to claim vindication by history.

More here.

Genomics pioneer Jun Wang on his new AI venture

David Cyranoski in Nature:

Jun Wang is one of China’s most famous scientists. Since joining the genome-sequencing powerhouse BGI when it started up 16 years ago, he has participated in some of its biggest accomplishments. These include sequencing the first genome of an Asian person [1], the giant panda [2] and the human gut microbiome [3], as well as contributions to the Human Genome Project. Wang has led BGI since 2007 (when it stopped using the name Beijing Genomics Institute and moved its headquarters to Shenzhen). But on 17 July, the institute announced that he will give up that position to pursue research into artificial intelligence (AI).

What is the concept behind your AI project?

Basically, I am just trying to feed an AI system with masses of data. Then that system could learn to understand human health and human life better than we do. The AI will try to draw a formula for life. Life is digital, like a computer program: if you want to understand the results of the programming, how the genes lead to phenotypes, it is sufficiently complicated that you need an AI system to figure out the rules. The AI system will basically consist of two components. The first is the big supercomputing platforms. We already have access to those through cloud computing and supercomputing centres. These will run or devise algorithms that look for relationships between genes, lifestyle and environmental factors, and predict phenotypes. The other thing is big data. We want to have data from one million individuals. And we want the data to be alive, in the sense that they can update their phenotype information at any time point. Other big computing companies, such as Google, could eventually do this, but we want to do it first. And we have the experience with big data.
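Wang’s description is conceptual, but the core task he sketches, learning a mapping from genes, lifestyle and environment to phenotypes, is at heart a supervised-learning problem. Here is a minimal illustrative sketch in Python; the simulated data, feature layout and choice of model are assumptions made for exposition, not BGI’s actual pipeline or scale.

```python
# Toy illustration of the genotype-plus-environment -> phenotype task Wang
# describes. All data here are simulated; this is not BGI's system.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_people, n_snps = 1000, 200

genotypes = rng.integers(0, 3, size=(n_people, n_snps))  # allele counts: 0, 1 or 2
lifestyle = rng.normal(size=(n_people, 5))               # e.g. diet/exercise scores
features = np.hstack([genotypes, lifestyle])

# A toy "formula for life": the phenotype depends on a few SNPs plus one
# lifestyle factor, with noise. The learner must recover this rule from data.
phenotype = (0.8 * genotypes[:, :3].sum(axis=1)
             + 1.5 * lifestyle[:, 0]
             + rng.normal(scale=0.5, size=n_people))

X_train, X_test, y_train, y_test = train_test_split(
    features, phenotype, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out people:", model.score(X_test, y_test))
```

A real system at the scale Wang describes would swap the toy model for algorithms run on supercomputing platforms, and the simulated matrix for measured genotypes plus continuously updated phenotype records from a million individuals.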

More here.

Tuesday, July 28, 2015

The false dichotomy of Islamophobia

Massimo Pigliucci in Scientia Salon:

A false dichotomy is a basic type of informal logical fallacy, consisting in framing an issue as if there were only two choices available, while in fact a range of nuanced positions may be on offer upon more careful reflection. While I have argued together with my colleagues Maarten Boudry and Fabio Paglieri that often so-called logical fallacies turn out to be pretty reasonable heuristic strategies [1], there are nonetheless plenty of instances where they do identify truly bad reasoning. I have recently discussed one such case in reference to so-called trigger warnings in the context of college classes [2], but another one is arguably represented by the never-ending “debate” about Islamophobia.

It is easy to find stark examples of people defending what appear to be two irreconcilable positions about how to view Islam in a post-9/11 world. For the sake of discussion, I will bypass pundits and other pseudo-intellectuals, and use instead two comedians as representative of the contrasting positions: Jon Stewart [3] and Bill Maher [4].

Before proceeding I must acknowledge that while I’ve liked Stewart for a long time, and followed with pleasure his evolution from being solely a comedian to a savvy social commentator during his run at the Daily Show [5], my appreciation of Maher has slid further and further. I used to like his brusque style back when he was doing his “Politically Incorrect” show, first on Comedy Central, then on ABC [6]. I was aghast when ABC (allegedly) let him go because he had dared to make the truly politically incorrect (but clearly correct) statement that the 9/11 hijackers could properly be labelled with a number of negative epithets, but that “cowards” wasn’t one of them. But then he made his Religulous movie [7], where he slid into crass new atheism-style “criticism” of religion, and finally came out as an anti-vaxxer, all the while chastising some of his guests who were “skeptical” of climate change for being anti-science.

More here.

After the crash, can biologists fix economics?

Kate Douglas in New Scientist:

The global financial crisis of 2008 took the world by surprise. Few mainstream economists saw it coming. Most were blind even to the possibility of such a catastrophic collapse. Since then, they have failed to agree on the interventions required to fix it. But it’s not just the crash: there is a growing feeling that orthodox economics can’t provide the answers to our most pressing problems, such as why inequality is spiralling. No wonder there’s talk of revolution.

Earlier this year, several dozen quiet radicals met in a boxy red building on the outskirts of Frankfurt, Germany, to plot just that. The stated aim of this Ernst Strüngmann Forum at the Frankfurt Institute for Advanced Studies was to create “a new synthesis for economics”. But the most zealous of the participants – an unlikely alliance of economists, anthropologists, ecologists and evolutionary biologists – really do want to overthrow the old regime. They hope their ideas will mark the beginning of a new movement to rework economics using tools from more successful scientific disciplines.

Drill down, and it’s not difficult to see where mainstream “neoclassical” economics has gone wrong. Since the 19th century, economies have essentially been described with mathematical formulae. This elevated economics above most social sciences and allowed forecasting. But it comes at the price of ignoring the complexities of human beings and their interactions – the things that actually make economic systems tick.

The problems start with Homo economicus, a species of fantasy beings who stand at the centre of orthodox economics. All members of H. economicus think rationally and act in their own self-interest at all times, never learning from or considering others.

We’ve known for a while now that Homo sapiens is not like that (see “Team humanity”). Over the years, there have been various attempts to inject more realism into the field by incorporating insights into how humans actually behave. Known as behavioural economics, this approach has met with some success in microeconomics – the study of how individuals and small groups make economic decisions.

More here.

The Greek Warrior

Ian Parker in The New Yorker (Photo by Davide Monteleone):

Varoufakis, a mathematical economist with a modest academic reputation, had become a popular writer in Greece. When the snap election was called, he interrupted his professorship at the University of Texas, flew home to Greece, and launched a ten-day election campaign whose sole expense was the cost of gas for his motorcycle. He was running for parliament, with the aim of becoming the finance minister in a Syriza government. The vote was held on January 25th. Syriza doubled its number of seats in parliament, and Tsipras formed a government in coalition with a small right-of-center party that shared its opposition to the troika’s terms. Varoufakis was elected with a larger share of the vote than any other candidate, and he was named the finance minister. His only previous experience of representative office was as the (white, Greek) leader of the Black Students’ Alliance at the University of Essex, a British institution, in the late seventies. Privately, he asked himself, “What have I done?” On his blog, he borrowed some thoughts of defiance—and, by implication, certain failure—from Dylan Thomas. “Greek democracy today chose to stop going gently into the night,” Varoufakis wrote. “Greek democracy resolved to rage against the dying of the light.”

A few years ago, Varoufakis told Yorgos Avgeropoulos, a documentary filmmaker, that the difference between a debt of ten thousand euros and one of three hundred billion euros is that only the latter gives you negotiating power. And it does so only under one condition: “You must be prepared to say no.” Upon his election, Varoufakis used the less than ideal influence available to a rock climber who, roped to his companions, announces a willingness to let go. On behalf of Tsipras’s government, Varoufakis told Greece’s creditors, and the world’s media, that his country objected to the terms of its agreements. This position encouraged widespread commentary about Greece following a heedless path from “no” to default, and from default to a “Grexit” from the euro currency, which might lead to economic catastrophe in Europe and the world.

It was as if Christopher Hitchens had woken up one day as Secretary of State. Varoufakis was no longer writing elegantly prosecutorial blog posts about Christine Lagarde, the managing director of the I.M.F.; he was meeting with Lagarde. Within days of Greece’s election, an academic with Marxist roots, a shaved head, and a strong jaw had become one of the world’s most recognizable politicians. He showed a level of intellectual and rhetorical confidence—or, perhaps, unearned swagger—that lifted Greek hearts and infuriated Northern European politicians.

More here.

Ta-Nehisi Coates woke me up: Lessons on race, atheism and my white privilege

Greg Epstein in Salon:

Coates, an award-winning journalist for the Atlantic, is primarily seen as a writer on race. And “Between the World and Me” is, on one level, a book about race, with the story of his murdered friend Prince Jones making Sandra Bland’s seemingly similar death look all the more like a depressing and infuriating act of terror. But atheists and humanists tend to see ourselves as transcending culture and race. So much so that I’ve always been dismayed to find that the majority of people who tend to show up at the meetings of organizations with words like atheist and humanist in their names are so very, very white. Why? Maybe, as I explored in my book “Good Without God” (a title meant to offer a three-word definition of humanism), in an America where religious identity is all many minorities have to fortify them against a society that treats them as inferior and other, identifying as an atheist is far easier for people of privilege.

But Coates’ new book is also, boldly, about atheism. It is even more so about humanism. Crafting a powerful narrative about white Americans (or, as he says, those of us who need to think we are white) who are living The Dream, Coates makes a profound statement of what is, and is not, good, with or without god. Coates refers not to Martin Luther King Jr.’s dream, not quite even to the “American Dream,” but rather to The Dream in which we forget that our history, our identity and much of our nation’s prosperity are built on the foundation of the suffering of people of color in general and black people in particular. The Dream, in other words, is not a state in which only Fox News watchers find themselves. It is a state that can cancel out the very best of white, liberal, humanist intentions.

More here.

Tuesday Poem

Thank You

Danke, merci, gracias
for the heat of the sun,
the kindness of teaching,
the smell of fresh bread.

Diolch, nkosi, shur-nur-ah-gah-lem
for the sound of sand,
children singing,
the book and the pen.

Dhannyabad, blagodaria, hvala
for the blue of small flowers,
the bobbing seal’s head,
the taste of clean water.

Shukran, rahmat, shukriya
for the stripe of the zebra,
the song of the chaffinch,
the gentleness of snails.

Mh goi, abarka, xièxiè
for the length of time,
the loveliness of eyelashes,
the arc of the ball.

Dziekuje, abrigado, shakkran
for the excitement of falling,
the stillness of night,
for my heart beating, thank you.

by Mandy Coe
from If You Could See Laughter
Salt Publishing, Cromer, 2011

Cellular ‘Cheaters’ Give Rise to Cancer

George Johnson in The New York Times:

Maybe it was in “some warm little pond,” Charles Darwin speculated in 1871, that life on Earth began. A few simple chemicals sloshed together and formed complex molecules. These, over great stretches of time, joined in various combinations, eventually giving rise to the first living cell: a self-sustaining bag of chemistry capable of dividing and spawning copies of itself. While scientists still debate the specifics, most subscribe to some version of what Darwin suggested — genesis as a fortuitous chemical happenstance. But the story of how living protoplasm emerged from lifeless matter may also help explain something darker: the origin of cancer.

As the primordial cells mutated and evolved, ruthlessly competing for nutrients, some stumbled upon a different course. They cooperated instead, sharing resources and responsibilities and so giving rise to multicellular creatures — plants, animals and eventually us. Each of these collectives is held together by a delicate web of biological compromises. By surrendering some of its autonomy, each cell prospers with the whole. But inevitably, there are cheaters: A cell breaks loose from the interlocking constraints and begins selfishly multiplying and expanding its territory, reverting to the free-for-all of Darwin’s pond. And so cancer begins.

More here.

Why a technologically enhanced future will not be as good as we think

Nicholas Agar in the OUP blog (Image credit: “City Lights”, by Unsplash. Public Domain via Pixabay):

Humans have flexible psychologies that enable us to flourish in environments ranging from the Arctic to the Kalahari Desert. Walruses and meerkats lack this psychological flexibility. They are unlikely to work out how to survive an exchange of habitats. Hedonic normalization permits a human raised in the high Himalayas to find that environment normal. The same psychological mechanism that hedonically normalizes humans to Arctic and desert environments normalizes us to the very different technological environments of the 1st and 21st centuries. We can predict that it will normalize us to the technologies of the 23rd century. Differences in hedonic normalization mean that ancient Romans, 21st century New Yorkers, and 23rd century residents of Cairo view cars powered by internal combustion engines very differently. What for the Romans is a quite miraculous technology is boringly familiar to the New Yorkers, and repellently primitive and polluting for the Cairenes.

When we overlook hedonic normalization we tend to significantly overstate the extent to which technological progress will boost the happiness of future people. I would be very happy to abruptly find myself on board a 23rd century starship. But this is not how people hedonically normalized to the 23rd century will feel. The error of ignoring hedonic normalization is especially apparent when we think about the past. Techno-optimists point to the big differences that technological change has made to our world. Mary Beard’s description of the streets of ancient Pompeii covered in animal dung, rotting vegetables, human excrement and flies makes modern city dwellers glad to be alive now. But imagining how a time traveller from the early 21st century would feel to find herself marooned in Pompeii does not tell us how people hedonically normalized to that time felt. Doubtless the Pompeians would have preferred cleaner streets. But the filthiness of their streets did not affect them in the way that it would affect someone normalized to our comparatively refuse- and excrement-free highways and byways. To see this more clearly, consider how people from the 23rd century will feel about life in our times. The conditions of our cities are clearly not perfect – but they are not nearly as bad for us as they will seem to someone normalized to the cities that 23rd century technologies will build.

More here.

This is what economists don’t understand about the euro crisis – or the U.S. dollar

Kathleen McNamara in the WaPo's Monkey Cage:

Economists are condescendingly scolding the Europeans for venturing into a single currency without the proper underlying economic conditions. Paul Krugman has relentlessly excoriated the leaders of Europe for being what he calls “self-indulgent politicians” who have “spent a quarter-century trying to run Europe on the basis of fantasy economics.” The conventional wisdom seems to be that the problems of the euro zone are, as economist Martin Feldstein once put it, “the inevitable consequence of imposing a single currency on a very heterogeneous group of countries.”

What this commentary gets wrong, however, is that single currencies are never the product of debates about optimal economic solutions. Instead, currencies like the U.S. dollar itself are the result of political battles, where motivated actors try to centralize power. This has most often occurred “through iron and blood,” as Otto von Bismarck, the unifier of Germany, put it, as a result of catastrophic wars. Smaller geographic units were brought together to build the modern nation state, with a unified fiscal system, a common national language that was often imposed by force, a unified legal system, and a single currency. Put differently (with apologies to sociologist Charles Tilly), war makes the state, and the state makes the currency.

The U.S. case is instructive. America used to have a chaotic multitude of state currencies and privately issued bank notes, with complex exchange rates between them. This only changed thanks to the Civil War. The American greenback was created in 1863 when Abraham Lincoln’s Republican Party muscled through legislation giving the federal government exclusive currency rights. It was only able to do this because Southern legislators, who opposed more centralization of power, had seceded from the American union. The Union side wanted a common currency to help the war effort by rationalizing revenue raising and wartime payments. But it was also a potent symbol of the power of the federal state in the face of the challenges of a disintegrating union.

More here.

The Raj at War: A People’s History of India’s Second World War

Patrick French reviews Yasmin Khan's new book, in The Guardian (Photograph: Michael Ochs Archives/Getty):

Yasmin Khan reminds us at the start of her book that “Britain did not fight the second world war, the British empire did”. Remembrance is a great British virtue. Whether it’s a Spitfire display, replica red poppies streaming out of the Tower of London or a commemoration of the battle of Waterloo, we know how to do it. Winston Churchill’s idea of a plucky island race standing firm against tyranny in two world wars continues to resonate. Troops from Africa, the West Indies, India and beyond are historically more awkward: they tend to be seen as an adjunct to the main event, although Britain’s success in both wars came from the logistics and manpower derived from its massive empire. At last year’s centenary of 1914, the government avoided the E-word and called such people “Commonwealth soldiers”, although the Commonwealth did not exist at the time. In South Asia, too, the 2.5 million volunteers who served in the second world war are forgotten, since they do not fit easily with the nationalist narrative of independence attained by non-violent resistance.

In The Raj at War, Khan sets herself a tough task: to recover the weft of India during the second world war and tell a story not only of servicemen but of nurses, bearers, political activists, road builders, seamen, interned central European Jews, schoolgirls, Bengali famine victims, enlightened officials, 22,000 African American GIs and even destitute Kazakhs, Iraqi beggars and orphaned Polish children who were escaping upheavals elsewhere. “At many stops on their way to Bombay, local people greeted the children at the stations, treating them with sweets, fruits, cold drinks and toys,” reported the wife of the Polish consul general.

Telling history from the bottom up is difficult, since those in extremis rarely record their experiences; it is easier to come in from the sides than from below, and use the diaries and letters of Europeans or members of India’s Anglophone elite.

More here.

Monday, July 27, 2015

Eating Iceland: A Photo Essay

by Akim Reinhardt

I am recently returned from Iceland, the land of waddling puffins, roaring volcanoes, and horses that look like Justin Bieber.

It was my first time visiting, and before arriving, I didn't know much about this nearly arctic island other than some vagaries about vikings and banking scandals. So I had very little in the way of preconceived notions about the cuisine, and didn't expect anything in particular.

It turns out the food was quite good. There’s lots of soup, and I’m a whore for soup, so that was a good match. Also tons of seafood, which is another favorite of mine, although it doesn’t quite drive me to walk the streets with a handkerchief dangling behind my shoulder. And then there are also various treats, ranging from liquor to throat lozenges, that feature harshly medicinal herbal flavors. Cheers to that, I say.

Oh, and the chocolate. Far better than I would've guessed. No nonsense. Dark, chalky and delicious.

I don't eat meat, so all that mutton was lost on me, but overall I found Iceland to be a wonderful culinary experience. However, there were also elements of the surreal, which is often the case when one ventures into a new land for the first time. And that is what I would like to share in this photo essay.

What follows are images of and brief comments about things that are neither right nor wrong, but rather just make me smile and remind me that we are all very strange.

Read more »

Strained Analogies Between Recently Released Films and Current Events: Minions and the Illusion of Voting

by Matt McKenna

Avoiding ads featuring Universal Pictures' Minions is almost as difficult as avoiding political ads for the upcoming presidential election. In case you're somehow not familiar with the minion characters, they are small Tic-Tac-looking creatures who speak half-gibberish and first appeared as bumbling sidekicks in the animated Despicable Me franchise. When it became clear that the marketing potential of the minions had outgrown the confines of the passable children's movie from which they originated, Universal spun out a film focusing on the minion characters themselves. Thus, we have Minions, a story about the eponymous characters' attempt to find an evil leader to whom to pledge allegiance and fulfill their species' destiny. The film's premise may be simple, but it provides a view into our own election process by describing its apparent opposite: instead of politicians being forced to pander to the voting public in order to be elected, Minions inverts the who-must-ingratiate-themselves-to-whom situation and considers what the world would be like if voters (minions) had to convince politicians they are worthy followers. When reexamining American elections through the lens of Minions, it becomes clear that though the minions' leadership-acquiring process may appear to be the exact inverse of the American voters' leadership-acquiring process, they are, in fact, identical.

Minions opens by showing the evolution of the minion species from single-celled organism to the plush-doll-friendly form they take in the Despicable Me franchise. Through narration, we learn that minions are a species that forms a symbiotic/parasitic relationship with the most “despicable” organism in its ecosystem. Over time, minions are forced to find new villains to follow: from the biggest organism in the primordial soup, to the most fearsome dinosaur in the jungle, to eventually Napoleon, the fiercest dictator on the planet. Unfortunately, after failing Napoleon for the last time, the minions are banished to an ice cave where they toil away until the 1960s, conveniently rendering them absent from Europe during World War II, presumably so the filmmakers wouldn't have to grapple with the minions' desire to serve Hitler.

Read more »