Is cold fusion truly impossible, or is it just that no respectable scientist can risk their reputation working on it?

Huw Price in Aeon:

A few years ago, a physicist friend of mine made a joke on Facebook about the laws of physics being broken in Italy. He had two pieces of news in mind. One was a claim by a team at the Oscillation Project with Emulsion-tRacking Apparatus (OPERA) in Gran Sasso, who said they’d discovered superluminal neutrinos. The other concerned Andrea Rossi, an engineer from Bologna, who claimed to have a cold fusion reactor producing commercially useful amounts of heat.

Why were these claims so improbable? The neutrinos challenged a fundamental principle of Albert Einstein’s theory of special relativity, which says that nothing can travel faster than light. Meanwhile cold fusion (or LENR, for ‘low-energy nuclear reaction’) is the controversial idea that nuclear reactions similar to those in the Sun could, under certain conditions, also occur close to room temperature.

The latter was popularised in 1989 by Martin Fleischmann and Stanley Pons, who claimed to have found evidence that such processes could take place in palladium loaded with deuterium (an isotope of hydrogen). A few other physicists, including the late Sergio Focardi at Bologna, claimed similar effects with nickel and ordinary hydrogen. But most physicists were highly skeptical, and the field subsequently gained, as Wikipedia puts it, ‘a reputation as pathological science’.

It turned out that my physicist friend and I disagreed about which of these unlikely claims was more credible. He thought it was the neutrinos, because the work had been done by respectable scientists rather than a lone engineer with a somewhat chequered past. I favoured Rossi, on grounds of the physics. Superluminal neutrinos would overturn a fundamental tenet of relativity, but all Rossi needed was a previously unnoticed channel to a reservoir of energy whose existence is not in doubt.

More here.

In Defense of Makeup

Jane Shmidt in Open Letters Monthly:

Earlier this year, a potentially fictitious, albeit tremendously entertaining, story was reported by Yahoo news about an Algerian man who sued his bride after seeing her without makeup for the first time on the morning after the wedding. The man claimed that he was deceived, unable to recognize her as the same woman he married. Beyond its evident humor, this story is a thinly veiled criticism of the application of makeup, suggesting that women treat their body as an object in a sale with marriage as the ultimate goal, and that they use makeup as a tool to mislead and capture a mate, to beguile the male gaze.

Bitter animosity toward women’s use of cosmetics is nothing new. In her book on makeup practices and production, Face Paint: The Story of Makeup, Lisa Eldridge – a renowned professional makeup artist – charts out the history of the debate on the value of applying color to the body. She finds that makeup has been considered a form of artifice, or even indecency, for a large portion of history from ancient Greece through the present, while, on the other side of the spectrum, some contemporary feminists have denounced makeup as an instrument of oppression that forces women to conform to an ideal.

More here.

This year marks a new language shift in how English speakers use pronouns

Gretchen McCulloch in Quartz:

You’ve probably come across the singular pronoun “they” recently. Perhaps it was in the Washington Post’s recent addition of it to the paper’s style guide. Perhaps it was in this BBC article about gender-neutral pronouns. Perhaps it was in this viral Tumblr post comparing singular “they” to singular “you.” Wherever the source, singular “they” has become more popular in 2015 than ever before – so popular, in fact, that it’s Quartz’s (unofficial) nomination for Word of the Year.

Let’s clear something up right away. Using “they” to refer to a single person isn’t new, but words of the year rarely are. Rather, this usage has been simmering for many years, finally bursting onto the scene this year with a newfound prominence. And just in time, too. Language can and should keep up with cultural shifts, including developments in society’s understanding of gender. While some holdout grammarians and copy editors might squirm, it’s become increasingly clear that our current pronoun palette simply isn’t sufficient. Luckily, we already have a perfectly good word at the ready.

More here.

What Led to the End of Kerala’s Matrilineal Society?

As Kerala was ushered into the modern era, closer to democracy and republicanism, the women of Travancore came to occupy a central role in its fortunes. In this excerpt from the book, Pillai delves into Kerala’s unusual history of matriliny.

Manu Pillai in The Caravan:

Some anthropologists regard Kerala’s system of matrilineal kinship as the continuation of a practice that at one time existed all over the world, while others contend that it was conceived due to some mysterious, compelling circumstances that replaced patriarchy at a historical point. There are, however, two views on this that have been passed down within the region. One is mythological and based on a Malayalam treatise called Keralolpathi, as well as a Sanskrit work called the Kerala Mahatmyam. These refer to the creation of Kerala by the legendary hero Parasurama, who is supposed to have hurled his battle-axe from Gokarna to Cape Comorin and claimed from the sea all the land in between. He is then said to have awarded this new region (conveniently) to Brahmins, after which he summoned (equally conveniently) deva (divine), gandharva (celestial minstrel), and rakshasa (demon) women for the pleasure of these men. The Nairs, the principal matrilineal caste, were the descendants of these nymphs and their Brahmin overlords, tracing their lineages in the maternal line. This version was dismissed quite appropriately by William Logan in his Malabar Manual as “a farrago of legendary nonsense.”

The other theory relates to the ancient martial tradition of the Nairs. Boys were sent off to train in military gymnasiums from the age of eight, and their sole occupation thereafter was to master the art of warfare. For them, death by any means other than at the end of a sword on the battlefield was a mortifying ignominy, and in their constant zeal for military excellence and glorious bloodshed, they had no time to husband women or economic resources. So a man would never “marry” a woman, as in other parts of India, and start a family with their children. Instead he would visit a lady in her natal home every now and then, solely for sexual purposes, and the offspring would be her responsibility entirely. Matriliny was, as per this theory, consequent upon the men being purely instruments of war rather than householders. So the onus of family and succession was taken care of by women, who formed large establishments and managed their affairs independently in the absence of men. While the military tradition of the Nairs, famous for its suicide bands called chavers, was well known, this theory is also more circumstantial than absolute.

More here.

What It’s Like to Be Noam Chomsky’s Assistant

Beverly S. Stohl in the Chronicle of Higher Education:

The first time I didn’t meet Noam Chomsky was in 1992, when a TV news channel asked him to interview me about my ability to talk backward fluently. He said no. I’d like to believe he was actually away, or sick, or that he didn’t get the message at all, but most likely he brushed off the request, sticking to more serious issues. Another linguist, Steven Pinker, took the assignment and determined that my skill, rather than being a sign of linguistic brilliance, was just a trick, like “juggling lit torches from a unicycle” (which I have to admit is on my bucket list).

The second time I didn’t meet Noam Chomsky was a year later, at the Massachusetts Institute of Technology, when I was interviewed in his office in Building 20, a decrepit army research building, by his longtime colleague and friend Morris Halle, for the position of Chomsky’s assistant. Morris met with several star-struck people whose priority was to become part of Chomsky’s inner world, but my skill set and lack of familiarity with his politics made me his choice for the position. When Morris learned that psychology was my field of study, he issued me a few caveats: “This will not be a warm and fuzzy position. You have not been hired to be Professor Chomsky’s friend.”

On my second day of work, a man in his mid-60s with longish graying hair and a chiseled face I recognized from photos arrived in my office looking preoccupied. He wore a gray crew neck sweater over a blue denim shirt and blue jeans rolled up to expose sensible white socks. He held two briefcases, one of heavy blue canvas and the other worn brown leather, with the letters “NC” stamped in faded gold at the top.

More here.

On the Surfaces of Things: Mathematics and the Realm of Possibility

What follows is an essay adapted from a talk, delivered in 2010 to teenagers and parents in my hometown of Cupertino, California. The talk concerned the surfaces of things, like bodies and planets, and abstract surfaces, like the Möbius strip and the torus. The goal was to learn about the world by studying surfaces mathematically, to learn about mathematics by studying the way we study surfaces, and, ideally, to learn about ourselves by studying how we do mathematics.

Joshua Batson in Hypocrite Reader:


An atlas of squares assembles into a cube.

Imagine that you are an emperor regarding a great expanse from a tall peak. Though your territory extends beyond your sight, a map of the entire empire hangs on the wall in your study, seventy interlocking cantons. On your desk lies a thick, bound atlas. Each page is a detailed map of a county whose image on the great map is smaller than a coin.

You dream of an atlas of the world in which the map of your empire would fill but a single page. You send forth a fleet of cartographers. They scatter from the capital and each, after traveling her assigned distance, will make a detailed map of her surroundings, note the names of her colleagues in each adjacent plot, and return.

If the world is infinite, some surveyors will return with bad news: they traveled the furthest of anyone in their band, and uncharted territory lies on one side of their map. But if the world is finite, there will be many happy surprises: a friend last seen in the capital is rediscovered thousands of miles from home, having taken another path to the same place. Their maps form an atlas of the world.

You have another dream, in which the whole world is contained in your capable hands, miniature and alive. Upon waking, you begin to tear the pages out of the atlas and piece them together, using the labels on the edge of each page to stitch it to its neighbors, constructing ever larger swaths of territory.

Soon you are left with six grand charts. You fit them together and they lift off the table to form a cube.
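
The six charts closing up into a cube is the essay’s point in miniature: an atlas of local maps, with no view from outside, can determine the global shape of a world. As a quick check (not in the excerpt itself, just standard surface topology), count the assembled cube’s corners, edges, and faces and compute its Euler characteristic:

\[ \chi = V - E + F = 8 - 12 + 6 = 2 = \chi(S^2) \]

So the emperor’s world is, topologically, a sphere. Had the pages glued up into a torus instead, the same count would give \( \chi = 0 \): the atlas alone distinguishes the two.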

More here.

What if Trump is winning because of his racism and bigotry, not despite it?

Jamelle Bouie in Slate:

There is no question that Trump has run the most unapologetically racist and nativist campaign since George Wallace made his first national play in 1964. And, like Wallace before him, it’s been successful, drawing tens of thousands of people to massive rallies across the country. Trump probes their fears, excites their passions, and gives them voice in a way they love and understand. “We have losers. We have people that are morally corrupt. We have people that are selling this country down the drain,” Trump declares. These voters may feel anxious about their economic status. But they also hold racial and cultural resentments. They’re worried about their futures and they dislike immigrants, Muslims, and blacks.

On Monday, the Washington Post looked at the white supremacists and white nationalists who cheer Trump as an asset to their movement. Trump has opened “a door to conversation” and “electrified” some members of the movement, says one leader in the Ku Klux Klan. “I think a lot of what he says resonates with me,” says David Duke, a former “Grand Wizard” in the Klan and erstwhile Louisiana politician. In a similar piece for the New Yorker, writer Evan Osnos spoke to Jared Taylor, a prominent white nationalist who described the situation this way: “I’m sure he would repudiate any association with people like me,” said Taylor, “but his support comes from people who are more like me than he might like to admit.” These voices are self-serving, but that doesn’t mean they’re wrong. Trump has shot to the top, fueled by vicious rhetoric against Latino immigrants and Syrian refugees. He has shared racist memes about black Americans and called for a ban on Muslim travel to the United States. And each time, his support ticks higher.

Economic anxiety plays a part here. But maybe Trump has discovered something we all like to deny: That in the 21st century, the racist vote is larger, louder, and more influential than we ever thought.

More here.

Down From the Trees, Humans Finally Got a Decent Night’s Sleep

Carl Zimmer in The New York Times:

Over the past few million years, the ancestors of modern humans became dramatically different from other primates. Our forebears began walking upright, and they lost much of their body hair; they gained precision-grip fingers and developed gigantic brains. But early humans also may have evolved a less obvious but equally important advantage: a peculiar sleep pattern. “It’s really weird, compared to other primates,” said Dr. David R. Samson, a senior research scientist at Duke University. In the journal Evolutionary Anthropology, Dr. Samson and Dr. Charles L. Nunn, an evolutionary biologist at Duke, reported that human sleep is exceptionally short and deep, a pattern that may have helped give rise to our powerful minds.

Until recently, scientists knew very little about how primates sleep. To document orangutan slumber, for example, Dr. Samson once rigged up infrared cameras at the Indianapolis Zoo and stayed up each night to watch the apes nod off. By observing their movements, he tracked when the orangutans fell in and out of REM sleep, in which humans experience dreams.

… Dr. Samson and Dr. Nunn found that the time each primate species spends asleep generally corresponded to its physical size, along with other factors, such as the average number of primates in a group. The one big exception: humans. We sleep a lot less than one would predict based on the patterns seen in other primates. From time to time while sleeping, we slip into REM sleep and dream. All told, we spend about 22 percent of sleep in REM, the highest ratio of REM to total sleep in any primate, the researchers reported.

Dr. Samson and Dr. Nunn have an explanation for how humans ended up sleeping so little, and so often in REM. Over tens of millions of years, they assert, changes in our ancestors’ ecology drove the evolution of new sleeping patterns. Humans increasingly have been able to achieve a good night’s sleep. A number of studies suggest that REM sleep benefits the brain. Some scientists argue that it sweeps out molecular debris, and others say it consolidates new memories into lasting impressions. But it was not easy for our monkey-like ancestors to reach REM sleep. They slept on branches, and their nights were anything but easy. As monkeys try to sleep today, they get roused by winds, tree snakes and the jostling of their fellow primates. “It’s like economy class on a plane,” said Dr. Samson. Monkeys, he believes, have to rest longer to get the benefits of REM sleep.

More here.

Blow Up the University: A Modest Proposal for Reform

by Thomas R. Wells

Are universities places for neoliberal human capital formation, or for the construction of a secular cathedral of human knowledge, or for the development and promulgation of policies and technologies for the material benefit of society, or for a finishing school to ensure the critical thinking skills and moral character of our future rulers? Clearly they cannot very well do all these things at the same time. They must choose. And that choice, according to a series of articulate, erudite critics from the humanities, like Martha Nussbaum and William Deresiewicz, should be for the liberal arts finishing school.

The diagnosis is correct but not the solution. The university cannot be saved.

I. The problem: The pathology of prestige

Universities have become too big to succeed and too big to challenge. They funded their expansion by promising all things to all stakeholders – jobs for parents, economic growth for governments, a podium for activists, an education for students. They said we had to trust them because, like doctors, they were the best judges of the wisdom they were selling. Well, we did trust them, and now the university has become gatekeeper to all sorts of essential parts of modern life, from intellectual credibility to middle-class jobs with dignity to scientific facts.

But they haven't actually kept their promises. They hardly even try. The leftist critique that universities have been taken over by the logic of the market is laughable. Paying customers, whether students, parents, or government, are an afterthought. Of course the money has to come from somewhere, but universities have always found ways to escape the discipline of the market. They invest a lot in marketing but not in connecting their education to the skills valued by the labour market. They have strenuously resisted providing students with the independent standardised information about teaching quality and employability that they need to make rational choices about where and what to study. The degrees they offer, even the ones with practical-seeming titles, are stuffed with irrelevant courses about theory that mainly serve to provide jobs for academics rather than practical skills for students (and much of the actual teaching work is then outsourced to grad students anyway). One is often more likely to get a real education in ideas by joining the student debating society, and real work skills by leaving for a six-month internship.

Read more »

Intelligent design or intricate deception? What I told students during the Kitzmiller trial

by Paul Braterman


University of North Texas, where I was teaching in 2005

Kitzmiller v Dover Area School District, in which judgment was pronounced on 20th December 2005, is the court case that established that Intelligent Design is not science, but a form of religiously motivated creationism, and as such may not be taught in publicly funded schools in the US. This is a shortened version of what I told the students at Texas Academy of Mathematics and Science, University of North Texas's early admissions programme, whom I was privileged to be teaching at the time of the trial. I have omitted my discussion of the embarrassing Intelligent Design pseudotext, Of Pandas and People, and the even more embarrassing statement that the Dover School Board instructed teachers to read, for reasons of space and because I have discussed them here before. I have tried to avoid rewriting in the light of what I have learnt since, but insert some comments for clarity, and links where relevant.


The “supplementary textbook” at the core of the case

This is a rather unusual presentation. It is the only presentation that I have ever given in response to a specific request from the [then] President of the United States, who has given as his opinion that Intelligent Design should be discussed in schools. It is the only presentation in which you will see me, a chemistry professor, practicing philosophy and even biblical exegesis; and I should warn you that I am practicing without a license. It is the only presentation I have ever given with the expectation that a number of people in the audience will be actively hostile to what I intend to say, because the point of view that I stand for is often misrepresented in this society as being hostile to religion.

But what is really extraordinary about this presentation is, that it is necessary at all. Having been a hundred years in the making, the central notions of evolutionary biology erupted into public awareness a century and a half ago, and, over the following 50 years, the major religious groups of the industrialised world came to terms with these ideas. The creationist challenge to what has been, for over a century, the central theoretical framework of biology, is a recent development, and, very specifically, a 20th-century American phenomenon. Very recently, creationism has changed its name to Intelligent Design Theory, but this is a purely cosmetic change.

I expect that this talk will please no one. I will, as you might expect, argue against Intelligent Design arguments. Indeed, I will go much further, claiming that such arguments are part of a particular kind of mindset, which I will call literalism (although some call it fundamentalism), and that the rise of this mindset represents a most serious threat to knowledge.

Read more »

Against terrorism, let’s try idealism

by Emrys Westacott

When terrorist atrocities are visited on civilian populations, the immediate emotional response is a combination of shock, sadness, and anger. That is natural and understandable. But the anger people feel fuels the thought that “something must be done,” and political leaders, acutely aware of what is expected of them, immediately proceed to take some action or other. Thus, after the recent massacre in Paris, French president François Hollande ordered bombing raids on Raqqa, an ISIS stronghold in Syria. After 9/11 George Bush ordered a military campaign against al Qaeda in Afghanistan. These knee-jerk responses may gratify the urge to act; they also satisfy the politicians' need to appear to be doing something. But these are unworthy ends. Always, the crucial question regarding any action a government takes should be: What are its likely long-term consequences? And very often, it seems, the long-term consequences of the responses to terrorist atrocities are quite contrary to what is intended or hoped for.

Without question, the violence threatened and perpetrated by organizations like ISIS and al Qaeda has to be addressed directly. Appropriate surveillance, improved security procedures, and sometimes military measures are all in order. But we should challenge the idea that those who support large-scale offensive military actions or draconian domestic security measures are the hard-heads, the realists, the pragmatists, while those who tend to be skeptical about the likely efficacy of such actions are weak, soft, unrealistic, and naïve. If anything, the opposite is true.

Let's face it, the track record of the hard-headed hawks is not exactly inspiring. After 9/11, a US military campaign in Afghanistan ousted the Taliban (who had provided a supportive platform for Al Qaeda). This was certainly applauded by most Afghans. But fourteen years on, around 150,000 people have been killed,[1] about $700 billion has been spent,[2] and Afghanistan was ranked a year ago by Transparency International as the fourth most corrupt country in the world.[3]

The cost of the US-led invasion of Iraq in 2003 and its aftermath has been even greater: over 224,000 deaths (according to the Iraq Body Count Project),[4] $815 billion spent, and Iraq is ranked by Transparency International as the sixth most corrupt country in the world.

Moreover, the dollars and death stats just cited seriously understate the real costs. In addition to all the deaths there are hundreds of thousands of people who are crippled, blinded, deafened, maimed, disfigured, or traumatized, not to speak of millions who are widowed, orphaned, or suffer inconsolable grief at the deaths of their children, siblings, family and friends. Millions have lost or been forced to leave their homes.

Read more »

The Cyborg of Practical Wisdom

by Charlie Huenemann

The biggest struggle my fellow modern-day cyborgs and I face is to create a virtual reality that connects more wholesomely with the human part of our nature. The artificial reality we currently plug into is a Terry Gilliam nightmare. Too many characters within it are armed, dangerous, and barbaric. The bright spots within it – few and far between – are either so childish and sugary as to seem like a parody of our hearts’ deepest needs, or so smart and ironic as to mock any nobler aims. It’s Grand Theft Auto, or Sesame Street, or South Park – take your pick.

Other virtual diversions exist for us, of course. One can find meaningful examinations of human experience, sensible and judicious overviews of economic tensions, intelligent and respectful discussions of critical issues, wonderfully rich book reviews, and so on. But one has to seek out such treasures deliberately – they seldom pop up of their own accord out of the collective net consciousness – and one must have the time, patience, and discipline to attend to them. This is a bit like trying to read Moby Dick in a strip club. And, cyborg nature being what it is, not many of us will end up spending much time with brother Ishmael.

Aristotle, a human being from twenty-five centuries ago, did his best to put together a sensible account of what makes human beings happy. By his own estimate, we are social beings who like to enjoy one another’s company, usually with some nice food and drink, some music, and a conversation that stimulates the mind. We like to exercise, and to apply our best ideas to laws and social policies. Through drama and art, we love to explore vicariously the troubles we can get into, and discover for ourselves how we would feel in other people’s tragic circumstances. We are, as one might say, multidimensional beings. The trick then, according to Aristotle, is to manage this multidimensionality with reason and experience. We need to monitor our cultural intake with the fastidiousness of a Weight Watcher, judging for ourselves how much is too much, how much too little, and when more attention needs to be directed here or there.

Read more »

Stray Memories

by Hari Balasubramanian

During my middle and high school years, I became fascinated with two generations of stray dogs that lived in my neighborhood. This was in the early 1990s. My family lived in the central Indian city of Nagpur, in a third-story flat. The flat had a couple of balconies (decks) which gave me the chance to watch the dogs go about their routines. Instead of studying for exams – which involved the dreary task of memorizing entire sections of textbooks – I would get up in the morning and spend time watching the neighborhood strays. The dogs liked the cool air of early mornings. They played frantically, chasing each other down, trying to wrest torn rags from each other as if the rags were of great value. At 8 am, with the sun up and strong, they would be exhausted. They would lie in the shade, front limbs stretched, their snouts nuzzling in between but their noses still twitching and ears still alert for anything untoward.


Stray dogs (desi kutta in Hindi, theru nai in Tamil) can be found almost everywhere in India. The term ‘stray', in the South Asian context, does not refer to abandoned pets (although some mixing with pet European breeds does happen). From the genetic viewpoint these dogs are actually very ancient and have been around for millennia. They seem to have evolved independently by natural selection (they were not bred commercially) and have adapted well to living in and around human settlements. And they are still around, living on the dirt shoulders of streets, alleyways, the platforms of railway stations, and the ignored nooks and interstices of infrastructure. Residential families and street vendors may occasionally feed them and look after them informally, but the strays largely fend for themselves, scavenging in rubbish dumps or wherever leftover food is available. They mark their territories with their seemingly bottomless bladders, participate heartily in the chaotic and noisy mating season which happens once a year, and work hard to raise their offspring. In this sense, the strays are as wild and independent as, say, the squirrels and crows we find everywhere. They are not always liked due to the risk of rabies, and there is an ongoing debate on how their numbers should be controlled (see this for another perspective).

Read more »

Notes on a Catastrophe

by Mathangi Krishnamurthy

The rains visited on the 1st of December, 2015. This monsoon in Chennai, we had already experienced deluge and mayhem, mid-November. The city's low-lying areas had flooded, and thousands of people had to be evacuated.

When the rains made a dramatic appearance again on the 1st of December, I wasn't prepared for a repeat of the November events. One spate had already weakened the city, and it had tired me out. One only prepares for one natural disaster a year, though all evidence should have us scurrying otherwise. The waters seeped in through the walls of my ancient building, and I looked on, fervent in my belief that this too would pass. Neither the slowly forming pool in the dining room of my second-floor walkup, nor the increasingly bleak weather reports, prepared me for anything but a day's worth of indoor activity, and a night of sleeping to the sound of soothing, lashing rains.

Picture © Neetesh Kumar

From the evening of the 1st of December to the evening of the 5th of December, large parts of the city went under water and lost power. The rains came at us relentlessly as the sky dissolved into eddies of Mordor-like blackness.

I have experienced the phenomenon known as a natural disaster a few times before: first as a child, and then as an almost adult. Other floods and earthquakes had produced in me a vague sense of preparation. As if this is but another event out of a series of life possibilities, and one is bound to emerge unscathed. When the lights went out that 1st of December, I retained this sense of possibility. Fumbling around with the help of a few forgotten birthday candles, I wolfed down some leftovers, and went to sleep in the darkness. Before going to bed, I gathered all the dust cloths, old towels, and washcloths in the house and laid them down like a patchy quilt next to the walls, hoping they'd absorb the waters. I slept in the living room, the daybed sticking up against the windows, hoping that I would hear the rains stop.

Read more »

Thinking Against Violence


Natasha Lennard and Brad Evans in The NYT's The Stone:

Natasha Lennard: The premise of your book “Disposable Futures” is that “violence is ubiquitous” in the media today. There seems to be plenty of evidence to support this claim — just look at the home page of this news site for a start. But the media has always been interested in violence — “if it bleeds, it leads” isn’t exactly new. And the notion that there is just more violence in the world today — more violent material for the media to cover — doesn’t seem tenable. So what do you think is specific about the ubiquity of violence today, and the way it is mediated?

Brad Evans: It is certainly right to suggest the connections between violence and media communications have been a recurring feature of human relations. We only need to open the first pages of Aeschylus’ “Oresteia” to witness tales of victory in battle and its communicative strategies — on this occasion the medium of communication was the burning beacon. But there are a number of ways in which violence is different today, in terms of its logics, intended forced witnessing, and ubiquitous nature.

We certainly seem to be entering into a new moment, where the encounter with violence (real or imagined) is becoming more ubiquitous and its presence ever felt. Certainly this has something to do with our awareness of global tragedies as technologies redefine our exposure to such catastrophic events. But it also has to do with the raw realities of violence and people’s genuine sense of insecurity, which, even if it is manufactured or illusory, feels no less real.

One of the key arguments I make throughout my work is that violence has now become the defining organizational principle for contemporary societies. It mediates all social relations. It matters less if we are actual victims of violence. It is the possibility that we could face some form of violent encounter that shapes the logics of power in liberal societies today. Our political imagination as such has become dominated by multiple potential catastrophes that appear on the horizon. The closing of the entire Los Angeles city school system after a reported terrorist threat yesterday is an unsettling reminder of this. From terror to weather and everything in between, insecurity has become the new normal. We see this played out at global and local levels, as the effective blurring between older notions of homeland/battlefields, friends/enemies and peace/war has led to the widespread militarization of many everyday behaviors — especially in communities of color.

More here.

Returning to Ethiopia


Dinaw Mengestu in Guernica:

Growing up, we had strange bedtime rituals. In Peoria, Illinois, when my sister and I were very young, my father would sit between our beds and tell us stories of animals who fought, lied, and cheated their way through the jungle world he invented for us. There were dense forests, green hills, and rivers. There were lions, crocodiles, zebras, giraffes, and laughing hyenas, which my father, in his raspy scarred voice, would imitate. The heroes of the stories were always two mischievous monkeys who could cheat all the other animals who, while taller, stronger, and more ferocious than them, lacked their wit. In the end the monkeys always found refuge at the top of the tallest trees—a vantage point from which, in retrospect, they would have had a clear view of all the havoc they had caused.

As a child, I didn’t think of the stories as being particularly related to Ethiopia, or, on a broader scale, Africa. I didn’t think about where this landscape, with trees that, according to my father, were larger than anything I could imagine, came from, or what these animals, whom my father spoke of as if real intimates, were doing in the crowded and deeply divided bedroom my sister and I shared. They were ordinary fictions, bedtime tales invented wholesale each night, sprung effortlessly from my father’s mind like a long, deep breath. And so there he is, in both memory and imagination, straddling the narrow space between our beds with these stories that my sister and I were both desperate to hear, clueless as to how far they had traveled to wash up, as if by accident, in Middle America.

My father, of course, eventually stopped with the stories. He might have done so because we no longer asked him to tell them to us, or because we were old enough to read on our own, or because it was the mid-1980s, and Caterpillar, where my father worked, was going through a round of layoffs that would bankrupt my parents’ plans of buying their first home. Or perhaps he stopped because suddenly, everywhere we turned, Ethiopia, or one tragic version of it, was staring back at us.

More here.