How Carbon Dioxide Melted the World

Sonja van Renssen in Nature News:

Rising levels of carbon dioxide really did bring about the end of the most recent ice age, say researchers. By compiling a global climate record, a team has shown that regions in which concentrations of greenhouse gases increased after the warming were exceptions to the big picture.

There are many ideas about what caused the end of the last ice age 10,000 years ago, but a lack of data has hindered testing of hypotheses. It has been known that changes in temperature and CO2 levels were closely correlated during this time, but no one has been able to prove that CO2 caused the warming.

Ice samples from Antarctica even suggested that temperatures rose a few centuries before CO2 levels, making it impossible for the gas to have been responsible. This provided ammunition for climate sceptics.

But, by analysing data gathered from 80 locations around the world, Jeremy Shakun, a palaeoclimatologist at Harvard University in Cambridge, Massachusetts, and his colleagues have shown that at the global level, warming followed CO2 increases. The team used seven different records of past temperature, including ice cores, ancient pollen and the chemical composition of fossilized microscopic life.

“This is the first effort to get most of the data that’s out there together,” says Shakun. “It's the first hard empirical proof that CO2 was a big driver of global warming out of the ice age.”

Slacker at Twenty

Aaron Lake Smith in n+1:

New York City is the great circling bathtub drain that young people from the college towns and mid-sized cities of North America disappear into, unable to resist the siren song of their own cosmopolitan ambitions. The drainage of souls from second- and third-tier cities like Cleveland, Columbus, and Houston culturally balkanizes the nation—the family-oriented and content stay at home, breeding more of the same, while the driven and career-minded pack off to New York, Los Angeles, and San Francisco to join thousands of others like them in the endless cultural orgy.

The most effective propaganda for the young urban way of life is friends. Many of those who stay in hometowns inevitably end up consuming the lives of their New York and California friends on Flickr and Facebook—their all-night parties and hangover brunches, their real careers, their expansive dating pool, their avant-garde theatre and Upright Citizens Brigade, until they eventually decide to give in and make a considered move to the city to join the party. This cultural natural selection—the coasts thriving and the middle dying out—is quietly undermined by a fifth column of factions that resiliently stick around smaller, less glitzy places to build them up and make them better places to live. These cultural Maoists bunker down against the forces of gravity to start up community spaces, independent video and record stores, and bike shops, seemingly undaunted by the losing war they’re fighting against attrition.

In Richard Linklater’s 1991 portrait of Austin’s freak gentry Slacker, there is a piece of recurrent graffiti that reads, “If you don’t like NYC, don’t go.” This early lumpen premonition of the coming Friends-era migration seems to be a preoccupation for this filmmaker. After a short stint living in New York, Linklater moved back to his Texas hometown and made Slacker on a shoestring budget of loans and maxed-out credit cards. When the film met with success, Linklater refused to move to Hollywood—instead he took an unconventional route, bunkering down in Austin and buying a two-story warehouse for Detour, his fledgling film production company. Nurtured by Detour and the Austin Film Society nonprofit, Linklater in Austin built up a small-film cottage industry. Slacker, though often canonized as a portrait of 1990s youth culture, is at root a local film. It was shot and produced entirely in Austin with local non-actors and musicians like the Butthole Surfers’ drummer Teresa Taylor. The fictionalized, documentary-style film doesn’t have a plot or recurring characters—long, omniscient shots track from one set of Austin hipsters to the next—but it manages to succeed on its own terms. Now, Slacker feels like a terrarium—a little universe of the living ideas and archetypes of a bygone time, preserved behind glass.

Just One More Game …: Angry Birds, Farmville and Other Hyperaddictive ‘Stupid Games’

Sam Anderson in the NYT Magazine:

In 1989, as communism was beginning to crumble across Eastern Europe, just a few months before protesters started pecking away at the Berlin Wall, the Japanese game-making giant Nintendo reached across the world to unleash upon America its own version of freedom. The new product was the Game Boy — a hand-held, battery-powered plastic slab that promised to set gamers loose, after all those decades of sweaty bondage, from the tyranny of rec rooms and pizza parlors and arcades.

The unit came bundled with a single cartridge: Tetris, a simple but addictive puzzle game whose goal was to rotate falling blocks — over and over and over and over and over and over and over — in order to build the most efficient possible walls. (Well, it was complicated. You were both building walls and not building walls; if you built them right, the walls disappeared, thereby ceasing to be walls.) This turned out to be a perfect symbiosis of game and platform. Tetris’s graphics were simple enough to work on the Game Boy’s small gray-scale screen; its motion was slow enough not to blur; its action was a repetitive, storyless puzzle that could be picked up, with no loss of potency, at any moment, in any situation. The pairing went on to sell more than 70 million copies, spreading the freedom of compulsive wall-building into every breakfast nook and bank line in the country.

And so a tradition was born: a tradition I am going to call (half descriptively, half out of revenge for all the hours I’ve lost to them) “stupid games.” In the nearly 30 years since Tetris’s invention — and especially over the last five, with the rise of smartphones — Tetris and its offspring (Angry Birds, Bejeweled, Fruit Ninja, etc.) have colonized our pockets and our brains and shifted the entire economic model of the video-game industry. Today we are living, for better and worse, in a world of stupid games.

Game-studies scholars (there are such things) like to point out that games tend to reflect the societies in which they are created and played. Monopoly, for instance, makes perfect sense as a product of the 1930s — it allowed anyone, in the middle of the Depression, to play at being a tycoon. Risk, released in the 1950s, is a stunningly literal expression of cold-war realpolitik. Twister is the translation, onto a game board, of the mid-1960s sexual revolution. One critic called it “sex in a box.”

Günter Grass Attacks Israel in New Poem

In Spiegel Online:

Günter Grass, Germany's most famous living author and the 1999 recipient of the Nobel Prize in Literature, sparked outrage in Germany on Wednesday with the publication of a poem, “What must be said,” in which he sharply criticizes Israel's policies on Iran.

“Why did I wait until now at this advanced age and with the last bit of ink to say: The nuclear power Israel is endangering a world peace that is already fragile?” Grass writes in the poem. The 84-year-old also criticizes the planned delivery of submarines “from my country” to Israel, a reference to Germany's plan to deliver Dolphin-class submarines to Israel that are capable of carrying nuclear-armed missiles. At the same time, Grass also expresses his solidarity with Israel.

In the poem, published by Germany's Süddeutsche Zeitung newspaper and other European dailies on Wednesday, Grass also calls for an “unhindered and permanent monitoring of Israel's nuclear potential and Iran's nuclear facility through an international entity that the government of both countries would approve.” It is widely believed that Israel possesses nuclear weapons, although it has never been proven.

In response to the publication, the Israeli Embassy in Berlin issued a statement offering its own version of “What must be said.” “What must be said is that it is a European tradition to accuse the Jews before the Passover festival of ritual murder,” the statements reads. “Earlier, it was Christian children whose blood the Jews allegedly used to make their unleavened bread, but today it is the Iranian people that the Jewish state allegedly wants to annihilate. What also must be said is that Israel is the only state in the world whose right to exist is openly doubted. That was true on the day of its founding and it remains true today. We want to live in peace with our neighbors in the region. And we are not prepared to assume the role that Günter Grass is trying to assign to us as part of the German people's efforts to come to terms with the past.”

the dream is szold


A hundred years ago, on the Jewish holiday of Purim in 1912, a group of women founded the Hadassah Chapter of the Daughters of Zion at Temple Emanu-El in Manhattan. Soon thereafter, this group began its medical endeavors in Palestine by sending two nurses to Jerusalem, which at the time was still under Ottoman rule. Today, the heart of the Hadassah Medical Organization remains in Jerusalem, and it includes two hospitals and related institutions that employ more than 5,000 men and women. Moreover, last week the organization began celebrating its centennial by opening a new hospital tower, an immense building with state-of-the-art technology that solidifies the organization’s place as arguably the leading medical center in the Middle East. But as I learned recently, the story here is not merely one of professional success but of human achievement in adverse political conditions. At a time when peace in the Middle East seems more distant than ever—and the Palestinian and Israeli governments appear happy in their immobility—places in civil society such as Hadassah exemplify a flourishing coexistence among Jews and Arabs.

more from Eyal Chowers at Dissent here.

guileless belief disguised as cynicism


No one commands higher prices than Damien Hirst, and nothing is more fashionable than to loathe him. Still, we can’t do without him. In his person and his work, Hirst embodies the current condition of the art market: aloof, reckless, profligate, creepy, fast, fat and out of control. He is to art what Dubai is to architecture and Michael Bay is to movies: the leading exponent of the current blockbuster style. No one else has been as good at giving material drama and visual form to the vast accumulations of wealth during the latest, rococo phase of capitalist accumulation. That makes him our canary in the mineshaft. Whether despicable or dumb, whatever he does is at least worth noticing. This month, an exhibition of Hirst’s spot paintings opened at every outpost of the Gagosian Gallery empire the world over. It’s a terrific marketing trick, as is almost everything Hirst does. Anyone who visits all eleven galleries (spread among eight cities) will get a free print—and, in spite of myself, I’ve been wondering if I could swing a trip to Athens and Hong Kong next month. As an art exhibit, though, “The Complete Spot Paintings” offers a strange mix of commercial megalomania and aesthetic tedium.

more from Jacob Mikanowski at The Point here.

putin’s dilemma


Making sense of Putin’s elections during the past decade is as important for getting his regime right as is making sense of the show trials in the 1930s for getting Stalin’s regime right. A major task of Stalin’s spin doctors seventy-five years ago was to use the trials’ pre-decided verdicts to showcase Stalin’s power – a demonstration that was all the more effective the more painfully innocent those were who, in a choreographed mise-en-scène, falsely confessed their betrayal of the Great Leader and were speedily executed for their compliance. Similarly, though much less cruelly, the show elections between 2000 and 2008 demonstrated the Putin government’s puppeteer power. The Kremlin not only manipulated those elections, it also insisted (contrary to what one might expect) that everyone be made vividly aware that it was directing the movements of every single player in the electoral charade and orchestrating every apparent crisis in the run-up to an election. The Kremlin did not play the czar, it played God. Until recently, the paradox of Putin’s Russia has been that elections, though blatantly unfree and unfair, have been at the very heart of both the regime’s popular appeal and its authoritarian credentials. Just as Stalin’s claim to power was based on his constant purging of the Party of never-ending internal enemies, Putin’s claim to power has been based on his ability to organize elections that, although obviously rigged, have excited almost no open protest.

more from Stephen Holmes and Ivan Krastev at Eurozine here.

Killing Babies

Kenan Malik over at his site, originally in Göteborgs-Posten:

Is there no moral distinction between killing a newborn baby and aborting a fetus? And should an academic paper that seemingly advocated the killing of newborns have ever been published?

Those are the questions at the heart of a controversy that has erupted after the publication of a paper entitled ‘After-birth abortion: Why should the baby live?’ in the Journal of Medical Ethics. Two Australian academics, Alberto Giubilini and Francesca Minerva, argued that the moral status of a newborn baby was identical to that of a fetus. Given that most people view abortion as morally acceptable, they argued, there is no reason not to see infanticide as morally acceptable, too, even in ‘cases where the newborn has the potential to have an (at least) acceptable life, but the well-being of the family is at risk’. Indeed, Giubilini and Minerva reject the term ‘infanticide’, preferring to talk of ‘after-birth abortion’.

The paper, which would normally have been read only by a handful of moral philosophers, was picked up by newspapers and websites and caused outrage worldwide. ‘Slaughter newborn kids, say academics’, read the headline in one British tabloid. Australian commentators, American chat show hosts and Catholic bishops weighed in, many claiming that infanticide was the logical consequence of the legalization of abortion. The two authors say that they have received death threats.

There is, in fact, little new in Giubilini and Minerva’s argument. Philosophers such as Peter Singer have long championed similar kinds of claims. Humans, Singer suggests, have no intrinsic claim to life. The interests of an individual, including their right to life, depend upon their cognitive abilities. ‘The fact that a being is a human being, in the sense of a member of the species Homo sapiens, is not relevant to the wrongness of killing it', he argues; 'it is, rather, characteristics like rationality, autonomy, and self-consciousness that make a difference. Infants lack these characteristics. Killing them, therefore, cannot be equated with killing normal human beings, or any other self-conscious beings.’

Since a newborn, unlike an adult, is incapable ‘of anticipating the future, of having wants and desires for the future’, Singer has written, so they do not suffer by being deprived of a life they could never have imagined anyway. ‘Killing a newborn baby is', in his view, 'never equivalent to killing a person, that is, a being who wants to go on living’.

A Local Approach to Continuing Higher-ed: Bar Room U


Christopher Beha looks at a new approach to continuing higher education started by our friends over at The Brooklyn Institute for Social Research, in New York Magazine:

One recent Tuesday evening, nine twenty- and thirtysomethings gathered in the back room of Boerum Hill’s Building on Bond to discuss a crucial text for understanding our sociopolitical moment: Plato’s Republic. While a waitress brought dinner and $3 pints of Bud, their conversation meandered from the foundational treatise to related matters left unexplored by its author, like whether Ron Paul’s libertarianism is more deontological or consequentialist. (The consensus: probably deontological at heart, though voters demand consequentialist arguments.) Two hours in, the crowd migrated up to the bar, where the discussion continued in the same vein. They were still drinking and talking when the bartender announced last call.

What transpired that night just may represent the future of higher education—or at least one proudly low-tech vision of it. Politics of the City, the formal name for the somewhat informal gathering, is the first course offered by the new Brooklyn Institute for Social Research. Its instructor, Ajay Chaudhary, dreamed up the institute while teaching in Columbia’s famed Core Curriculum, in which every undergraduate reads the classics of Western civilization. “Whenever I talked with people outside the university about what I did,” Chaudhary said, “they would tell me, ‘I want to do that. I want to read Aristotle and Augustine.’ ”

Continuing-education programs tend to be bluntly functional (professional-development courses like computer programming or bookkeeping), less than rigorous (culture “appreciation” classes), or flat-out silly (see “Transformers Star Tyrese Gibson: How to Get Out of Your Own Way—Tips for Making It” at the Learning Annex). More serious academic fare is proliferating online, but those classes are primarily for quants not quals.

In addition to classes, they are raising money over at Kickstarter to develop a knowledge tool, ~Archive. Consider a donation:

The ~Archive is a tool to provide easy electronic access to out-of-print or hard to find texts.

Okay, it's a tool. How does it work?

It happens every day. Mostly to academics, journalists, and other knowledge professionals, but also to anyone who is conducting independent research or simply trying to figure out something that's just beyond the reach of Google, Wikipedia, or even the local library. You find a reference to an important but impossible to find text. It could be old. It could be out of print. It could be rare. All you know is that you need it and you can't have it. These are not the old books you can already get for free on your Kindle or iPad through Project Gutenberg, or what you can find, sometimes incomplete, on Google Books. We love these services and wonder how we ever lived without them. We are talking about a lot of other stuff. Stuff that fell through the cracks. Works that history forgot to record, except for a tiny reference in an essay or a newspaper review. Books that are crumbling in an archive or private collection, which normally couldn’t be reproduced without permanent damage. And that's where our ~Archive comes in.

Facebook: The Next Tool in Fighting STDs

Tracy Clark-Flory in Salon:

Imagine being able to download a Facebook app that would alert you to your sexually transmitted infection risk based on your friends’ status updates. This may sound far-fetched, and it still is, but as some researchers shift their focus to risk among friend groups, as opposed to just sexual partners, social networks are rapidly becoming a tool to prevent the spread of STIs.

Peter Leone, a professor of medicine at the University of North Carolina’s Center for Infectious Diseases, is one of those experts. Earlier this month, he spoke at an international health conference and underscored the importance of exploring such possibilities. Real-world social networks — in other words, a person’s circle of friends and sexual partners — have already proved to be strong predictors of STI risk, he says. It follows that sites like Facebook, which convene all of those real-world connections in one virtual setting, have huge potential in this arena.

Leone found that when sexual partners of patients newly diagnosed with HIV came in for testing, 20 percent turned up HIV-positive. It might seem counter-intuitive to extend the targeted test circle to those a newly diagnosed patient is merely friends with, but people in the same social circle often sleep with the same people, and might engage in similar risk-related behavior. Instead of looking at people within a particular at-risk demographic, this approach allows them to target known clusters of infection.

Makes you think of the people on your “Close Friends” list a bit differently, doesn’t it?

The art of staying alive

From The Independent:

Tom was my husband. Writing was his life and work. But in September 2008 he was diagnosed with a 'grade four' brain tumour, situated in the left temporal lobe, the area responsible for speech and language. During his last year, articulate speech became an effort. He willed words into being as they vanished again. It was a transcendent time, volatile and strange; full of danger. Tom's work was to keep his illness and his life in clear sight. His task was, in his own words, “a lesson in imagination, in self-imagination”. The incredible thing was that he could write this down.

Our house had long been a word factory. Since 11am on 11 March 2010, when we noticed Tom's words skip their sense in a more radical way, we knew that we were really in trouble. Tom had a second craniotomy on 13 April. After this, we were focused on the making of meaning. The level of production was intense. Tom kept writing. He revised older texts, collated essays, worked on images for an exhibition and wrote new material.

More here.

A Little Device That’s Trying to Read Your Thoughts

From The New York Times:

SAN DIEGO — Already surrounded by machines that allow him, painstakingly, to communicate, the physicist Stephen Hawking last summer donned what looked like a rakish black headband that held a feather-light device the size of a small matchbox. Called the iBrain, this simple-looking contraption is part of an experiment that aims to allow Dr. Hawking — long paralyzed by amyotrophic lateral sclerosis, or Lou Gehrig’s disease — to communicate by merely thinking.

The iBrain is part of a new generation of portable neural devices and algorithms intended to monitor and diagnose conditions like sleep apnea, depression and autism. Invented by a team led by Philip Low, a 32-year-old neuroscientist who is chief executive of NeuroVigil, a company based in San Diego, the iBrain is gaining attention as a possible alternative to expensive sleep labs that use rubber and plastic caps riddled with dozens of electrodes and usually require a patient to stay overnight. “The iBrain can collect data in real time in a person’s own bed, or when they’re watching TV, or doing just about anything,” Dr. Low said.

The device uses a single channel to pick up waves of electrical brain signals, which change with different activities and thoughts, or with the pathologies that accompany brain disorders. But the raw waves are hard to read because they must pass through the many folds of the brain and then the skull, so they are interpreted with an algorithm that Dr. Low first created for his Ph.D., earned in 2007 at the University of California, San Diego. (The original research, published in The Proceedings of the National Academy of Sciences, was done on zebra finches.)

About the Hawking experiment, he said, “The idea is to see if Stephen can use his mind to create a consistent and repeatable pattern that a computer can translate into, say, a word or letter or a command for a computer.”

More here.

Tuesday Poem

Grammar

Maxine, back from a weekend with her boyfriend,
smiles like a big cat and says
that she's a conjugated verb.
She's been doing the direct object
with a second person pronoun named Phil,
and when she walks into the room,
everybody turns:

some kind of light is coming from her head.
Even the geraniums look curious,
and the bees, if they were here, would buzz
suspiciously around her hair, looking
for the door in her corona.
We're all attracted to the perfume
of fermenting joy,

we've all tried to start a fire,
and one day maybe it will blaze up on its own.
In the meantime, she is the one today among us
most able to bear the idea of her own beauty,
and when we see it, what we do is natural:
we take our burned hands
out of our pockets,
and clap

by Tony Hoagland
from Donkey Gospel, 1998
Graywolf Press, St. Paul, Minn.

The Wave Cry, the Wind Cry


‘I sat there, as the others worked, and wished, as I so often do, that I could draw.’ Where the poet Kathleen Jamie sat was within the rib cage of a blue whale, in the hvalsalen (the whale hall) of the natural history museum in Bergen. Her wish was needless because her written words make readers see with a clarity bestowed by only a few most gifted writers. It was, however, an enlightening wish. It expressed the intensity of her own seeing, her gift. Only someone with obsessively hungry eyes can write as she does. It makes her, to borrow John Berger’s words quoted on the jacket of Sightlines, ‘a sorceress of the essay form’. It does not matter what she is describing, you see it with her. In the first of these essays she is on a ship threading its way between icebergs up the longest fjord in the world. In the morning sunlight an iceberg glows ‘marsh-mallow pink’, and ‘trinkets’ of white ice are scattered along the shores. In the next essay, she is in a hospital in Dundee: having concluded that nature ‘is not all primroses and otters’, she needs to get the feel of our own intimate inner natural world, the body’s shapes and forms. She has therefore found her way into a pathology lab, and then into a post-mortem. ‘I thought “we are just meat”, then called it back. Flesh, bodily substance, colons, livers and hearts, had taken on a new wonder.’

more from Diana Athill at Literary Review here.

notes on sontag


THE TITLE OF THE SECOND VOLUME of Susan Sontag’s private writings is taken from an entry dated May 22, 1965, when Sontag was thirty-two years old. “Novel about thinking—” it begins. “An artist thinking about his work.” In the margins, she adds, “A spiritual project—but tied to making an object (as consciousness is harnessed to flesh).” It’s a strange and spooky phrase, the richest image in the diary’s five hundred pages. There’s something sad about this emblem of captivity, the spirit being put under reins. There’s also something enabling and empowering—the inanimate being directed, gaining strength, driving forward. Being harnessed to flesh means being flexible enough to move. The animating force at the heart of everything Sontag wrote—the cultivation of aesthetic and intellectual experience—is not properly speaking an idea; it’s a stance, or an attitude. It is itself a way of moving. There is no magnum opus or theoretical treatise that we can point to as Sontag’s distinct contribution, no “takeaway” we can pierce under glass. So it may not be very surprising that since her death eight years ago, the many provocations of her thinking have drifted out of view to make room for the more obvious fact of her celebrity. Besides, she’s a woman; we make good icons.

more from Christine Smallwood at Bookforum here.

the wolf knife


By the time Laurel Nakadate’s The Wolf Knife premiered, in 2010, Nakadate was already known as one of the most provocative and ambitious video artists in New York. Her fearless short films of unglamorous, middle-aged bachelors and the youthful filmmaker herself dancing to Britney Spears, stripping, or singing over a birthday cake, were “incredibly twisted,” as Jerry Saltz put it in the Village Voice. The Wolf Knife, Nakadate’s second feature film, is the daughter of this early work, and inspires similar creepy feelings about desire, domination, and voyeurism. It is also a significant artistic leap forward. Unsurprisingly, the film received nominations for an Independent Spirit Award and a Gotham Independent Film Award for “Best Film Not Playing at a Theater Near You.” Also unsurprisingly, the film has provoked some viewers to walk out within the first fifteen minutes of a screening. Variety called her “an interesting, infuriating artist” and wondered whether many people would be “willing to withstand what she has to say,” but then grudgingly admitted the film was worthy of respect. At the very least, one might call the film “uncomfortable.” Or one might dub it, as the New Yorker did, “a neorealist version of a Lynchian nightmare.”

more from Deb Olin Unferth’s intro and a link to watch the trailer at The Believer here.

In and Of the City: The Cost of Urban Ecology’s Foundational Distinction

by Liam Heneghan

Urban ecology, the environmental sciences' youngest and most rambunctious cousin, is in a position to influence the design of the cities of the future. Its clout comes from its willingness to think big, to think about the ecology of entire cities as if they were just any other ecosystem. Urban ecologists call this big-picture view the “ecology of the city”.

From this disciplinary perspective, Chicago is just another savannah, one where admittedly the commonest species is the human animal.

However, by taking this bird’s eye view of cities, is urban ecology losing sight of the bird-on-the-ground? I mean this quite literally. Is urban ecology losing its roots in natural history? Will the successful cultivation of relationships with decision makers, municipal authorities, city planners and other governmental powers-that-be come at the expense of urban ecologists’ knowledge about birds, wildlife, beetles and the other creeping things inhabiting the city?

Are we urban ecologists (and I count myself in this troupe) forgetting the world-fascination, the intense delight, that comes from direct encounters with nature in the city?

***

Urban ecology is not the first discipline to encounter the tensions accompanying distinctions between the bird’s-eye view and the bird-on-the-ground view of the city. An instructive example is found in the work of Michel deCerteau (1925-1986), who makes of this tension a theory of the everyday interactions of people who both conform to and resist the strictures of the culture to which they belong.

In its entry on deCerteau, the Routledge Encyclopedia of Philosophy describes him as “a French philosopher trained in history and ethnography, [who] was a peripatetic teacher in Europe, South America and North America.”[1] To describe him as peripatetic is apropos in two senses, as the adjective describes a follower of Aristotle and also signifies one who moves about quite a bit. Etymologically it comes from the Greek patein, which is to tread. Followers of Aristotle are referred to as Peripatetics, though the term refers not to a supposed habit of wandering in the Lyceum after the lecturing Aristotle, but to the practice of teaching in a colonnade (a peripatos). Whatever about the Aristotelian influences on his work, deCerteau, a Jesuit priest, was certainly a wanderer both intellectually and physically, having taught in many places and written on history, mysticism, everyday life, spiritual life, literary history and so on.

Read more »

Hip Hop and the “African Spring”

by Edward B. Rackley

Why didn’t the momentum and exuberance of last year’s “Arab Spring” extend to African countries south of the Sahel? Sub-Saharan populations, many immediate neighbors of Tunisia, Libya and Egypt, followed the drama with fascination and some envy. When we spoke, I was surprised how few colleagues and friends in sub-Saharan Africa were optimistic about a counterpoint “African Spring.” They claimed their societies “weren’t ready” to rally widespread discontent towards a political tipping point.

Historically, my friends were wrong—SSA has much experience with successful opposition movements, from colonialism to apartheid. But I took their resignation to mean that social fragmentation had secured the upper hand, proof that poverty and cynical governance were not just misanthropic but bitterly divisive as well. The process of overcoming deep social, generational and political divisions, with their common denominator of skepticism and self-interest, cannot simply be ignited like the proverbial box of tinder.

Internet connectivity was clearly an enabler for the Arab Spring, and SSA still lacks reliable connectivity and familiarity with social media. But coastal North African countries are different from their southern neighbors in infinite other ways as well. Despite non-western culture, values and religious beliefs, North Africa’s Mediterranean exposure imposes a definite political and economic orientation towards Europe, for ill or good. Solidarity in any form—security, economic, ideological—is almost non-existent between countries divided by the Sahel. Few North African countries look south for constructive economic or political opportunity. Exploitation of less developed southern countries (human trafficking, resource predation) is more the norm.

I’ve written here before about the Nile Basin Initiative, an internationally-funded effort to negotiate equitable use rights for the countries of the great river, killed by mutual mistrust in 2010. The late Colonel Gaddafi led Pan-Africanism, the only other north-south unification effort. His utopianism managed to defy open ridicule thanks to his hefty wallet, but never commanded serious attention. In hindsight it proved far more effective at ensconcing the dinosaur club of out-of-touch leaders, like Gaddafi himself, for decades. This retrograde model of leadership, widely practiced among newcomers to power, is arguably the continent’s greatest impediment to modernity.

Read more »