The organic myth of the British constitution

Michael Gardiner writes at openDemocracy on 'public' services in Britain:

The British left is packed with voices demanding an unreflective defence of ‘public services’. This public is frozen beyond any evaluation of commonality, is held to be equalising even as its bases fall away to reveal the private ownership concealed within them. The barrage is triggered in part by the Great Recession, but also in part by the sovereignty challenge being felt in the UK, concretely in Scotland in 2014. Now is a good time to reflect that the British sovereignty behind these public services has always in fact defined itself as a defence against popular sovereignty, a defence projected as timeless inheritance which is intuitive and ‘just there’.

If the nationalism standing behind the ‘British public’ throughout the press and left commentary seems oddly transparent, this transparency derives from Britain’s unusual licence to exist ‘beyond’ the national. For this is less a nation than it is a rationalisation of credit. The British union arises from the import of the Anglo-Dutch financial system after 1688, its guarantee in perpetuity by the Hanoverian crown, and central banks which supported it from the 1690s. As Daniel Defoe was describing at exactly the time of the Acts of Union in 1706-07, Britain’s raison d’état is as an investment entity, a guarantor of global money. As has been described in many accounts of the close of the seventeenth century, in this new state citizenship is understood in terms of naturalised property and the avoidance rather than the promotion of shared action. Reform it as much as you like, but collectivity is not within the scope of the British constitution.

Read the rest here.

M. J. Rosenberg: Pro-Palestinian Is Not Anti-Israel But the Opposite

M. J. Rosenberg in the Washington Spectator:

Sometimes it is instructive to listen to what Harvard law professor Alan Dershowitz says because his way of seeing the Israel-Palestinian conflict is typical of the thinking of both the Netanyahu government and its lobby here. I say “sometimes,” because most of Dershowitz’s opinions can be found in a dozen other places — from AIPAC, the “major Jewish organizations,” neocon websites like Commentary and in statements and tweets from the Israeli government itself.

But sometimes Dershowitz inadvertently provides solid insight into the mentality that enables a 45-year occupation that, even Dershowitz admits, has proven so destructive to Israel.

In a debate last week with Peter Beinart, the Daily Beast columnist and author of the bestseller The Crisis of Zionism, Dershowitz said that, for Jews, Israel is now “an embarrassment.”

In 1967, Jews were able to beat their chest and say “wow, we’re proud to be Israel [sic], look how tough Israelis are. It was a source of pride. Today, it’s a source of embarrassment.”

And he knows why, as evidenced by his reference to 1967, the year the occupation began.

But when Beinart pointed that out, Dershowitz responded that Israel’s evolution into “an embarrassment” has nothing to do with the occupation.

More here.

infinite fossil fuel


For years, environmentalists have hoped that the imminent exhaustion of oil will, in effect, force us to undergo this virtuous transition; given a choice between no power and solar power, even the most shortsighted person would choose the latter. That hope seems likely to be denied. Cheap, abundant petroleum threw sand in the gears of solar power in the 1980s and stands ready to do it again. Plentiful natural gas, a geopolitical and economic boon, is a climatological shackle. To Vaclav Smil, the University of Manitoba environmental scientist, the notion that we can move so fast is naive, even preposterous. “Energy transitions are always slow,” he told me by e-mail. Modern energy infrastructures, assembled over decades, cannot be revamped overnight. Worse still, in his view, there is little public appetite for beginning the process, or even appreciating the magnitude of what lies ahead. “The world has been running into fossil fuels, not away from them.”

more from Charles C. Mann at The Atlantic Monthly here.

Permanent Present Tense


Henry had his first epileptic episode in 1936, at the age of ten; by 1953 his seizures had become increasingly frequent and debilitating. His family doctor referred him to William Beecher Scoville, a leading neurosurgeon at Yale Medical School. When massive doses of medication failed to quell his attacks and EEGs revealed no obvious locus of brain damage, Scoville suggested a novel surgical procedure. Using a trepanning drill he had constructed himself from auto parts, he cut two coin-sized holes in the skull, ‘doorways to Henry’s brain’, and suctioned out most of his medial temporal lobes, the front half of the hippocampus and most of the amygdala. After recovery, Henry’s seizures were significantly reduced, but it soon became apparent that the operation had vacuumed away any recollection of his hospital stay, and indeed most of the significant events of the previous few years. Catastrophically, it had also created a global anterograde amnesia: the loss of the ability to form new memories of any kind.

more from Mike Jay at the LRB here.

isadora duncan


From childhood, Duncan saw herself as a liberator, opposed but never vanquished by philistines. In My Life she recalls that in elementary school she gave an impromptu lecture in front of the class on how there was no Santa Claus, whereupon she was sent home by an angry teacher. This was not the last of what, with pride, she called her “famous speeches.” When she became a professional, she routinely ended her concerts by coming out in front of the curtain and describing to the audience, at length, how profound her way of dancing was, as opposed to the triviality of other ways—she called ballet “an expression of degeneration, of living death”—and on how, therefore, they should contribute to the expenses of her school. (This declamatory bent was probably the least attractive aspect of Duncan’s personality, as it is of My Life, and some reviewers had a lot of fun with it.) What appeared to her most vile about ballet was its unnaturalness: the rigid back, the studied positions, the relentless daintiness. Duncan was an exemplary bohemian—a quality that was partly rooted, no doubt, in the fact that she was from California. (She was born in San Francisco and raised, mostly, in Oakland.) That region has a history of breeding idealists, animists, nonconformists.

more from Joan Acocella at the NYRB here.

Princess Not-So-Charming

From Harvard Magazine:

“Fairy tales have always tapped into the subconscious, bringing to light children’s deepest fears,” says Soman Chainani ’01. In his new fantasy-adventure novel, The School for Good and Evil, he has brought that tenet into the twenty-first century. The first of a trilogy for middle-grade readers (ages nine and up), The School for Good and Evil tracks two archetypal heroines: the lovely Sophie, with her waist-long blond hair and her dreams of becoming a princess, and her friend Agatha, an unattractive, unpopular contrarian who chooses to wear black. A giant bird snatches the pair and carries them off to the School for Good and Evil, a two-pronged magical academy that trains children to become fairy-tale heroes and villains. When, to her horror, Sophie arrives at the Evil branch to learn “uglification,” death curses, and other dark arts, while Agatha finds herself at the School for Good amid handsome princes and fair maidens, the line between good and evil blurs, the meaning of beauty twists, and the girls reveal their true natures.

At the core of their journey is the “princess culture,” which Chainani defines as today’s “tyranny of pink in young-girl marketing. It tells them their responsibility is to be pink, sparkly, ultra-feminine, and—most of all—pretty.” With such an emphasis on looks, “girly girls are terrified of being ugly, and normal girls are afraid of being outcasts.” Even boys are unnerved. “They have no idea how to live up to the expectations,” he says. “That’s what I am interested in capturing: what kids fear most today.” Sophie and Agatha inhabit a world like that of classic fairy tales: a place where magic and reality coexist, and dangers lurk. Yet those dangers reflect modern issues. Several episodes tackle the fear of aging; one chapter riffs on the current obsession with physical self-improvement. In a scene where Sophie is asked to contribute to the school, she becomes a campus celebrity by offering “Malevolent Makeovers” and a presentation titled “Just Say No to Drab.” When Agatha challenges her, Sophie replies, “Isn’t this compassion? Isn’t this kindness and wisdom? I’m helping those who can’t help themselves!”

“So much is based on image,” Chainani explains. “It’s such a pervasive, destructive thing.”

More here.

The emergence of individuality in genetically identical mice

From Kurzweil AI:

How do people and other organisms evolve into individuals that are distinguished from others by their own personal brain structure and behavior? Why do identical twins not resemble each other perfectly even when they grew up together? To shed light on these questions, the scientists observed 40 genetically identical mice that were kept in an enclosure that offered a rich shared environment with a large variety of activity and exploration options. They showed that individual experiences influence the development of new neurons in mice, leading to measurable changes in the brain. “The animals were not only genetically identical, they were also living in the same environment,” explained principal investigator Gerd Kempermann, Professor for Genomics of Regeneration, CRTD, and Site Speaker of the DZNE in Dresden. “However, this environment was so rich that each mouse gathered its own individual experiences in it. Over time, the animals therefore increasingly differed in their realm of experience and behavior.” Each of the mice was equipped with a special microchip emitting electromagnetic signals. This allowed the scientists to construct the mice’s movement profiles and quantify their exploratory behavior.

The result: despite a common environment and identical genes, the mice showed highly individualized behavioral patterns. In the course of the three-month experiment, these differences increased in size.

“These differences were associated with differences in the generation of new neurons in the hippocampus, a region of the brain that supports learning and memory,” said Kempermann. “Animals that explored the environment to a greater degree also grew more new neurons than animals that were more passive.” Adult neurogenesis [generation of new neurons] in the hippocampus allows the brain to react to new information flexibly. With this study, the authors show for the first time that personal experiences and ensuing behavior contribute to the “individualization of the brain.” The individualization they observed cannot be reduced to differences in environment or genetic makeup. “Adult neurogenesis also occurs in the hippocampus of humans,” said Kempermann. “Hence we assume that we have tracked down a neurobiological foundation for individuality that also applies to humans.”

More here.

How the Case for Austerity Has Crumbled


In the New York Review of Books, Paul Krugman reviews Neil Irwin's The Alchemists: Three Central Bankers and a World on Fire, David A. Stockman's The Great Deformation: The Corruption of Capitalism in America, and Mark Blyth's Austerity: The History of a Dangerous Idea:

It’s an ill wind that blows nobody good, and the Greek crisis was a godsend for anti-Keynesians. They had been warning about the dangers of deficit spending; the Greek debacle seemed to show just how dangerous fiscal profligacy can be. To this day, anyone arguing against fiscal austerity, let alone suggesting that we need another round of stimulus, can expect to be attacked as someone who will turn America (or Britain, as the case may be) into another Greece.

If Greece provided the obvious real-world cautionary tale, Reinhart and Rogoff seemed to provide the math. Their paper seemed to show not just that debt hurts growth, but that there is a “threshold,” a sort of trigger point, when debt crosses 90 percent of GDP. Go beyond that point, their numbers suggested, and economic growth stalls. Greece, of course, already had debt greater than the magic number. More to the point, major advanced countries, the United States included, were running large budget deficits and closing in on the threshold. Put Greece and Reinhart-Rogoff together, and there seemed to be a compelling case for a sharp, immediate turn toward austerity.

But wouldn’t such a turn toward austerity in an economy still depressed by private deleveraging have an immediate negative impact? Not to worry, said another remarkably influential academic paper, “Large Changes in Fiscal Policy: Taxes Versus Spending,” by Alberto Alesina and Silvia Ardagna.

One of the especially good things in Mark Blyth’s Austerity: The History of a Dangerous Idea is the way he traces the rise and fall of the idea of “expansionary austerity,” the proposition that cutting spending would actually lead to higher output. As he shows, this is very much a proposition associated with a group of Italian economists (whom he dubs “the Bocconi boys”) who made their case with a series of papers that grew more strident and less qualified over time, culminating in the 2009 analysis by Alesina and Ardagna.

In essence, Alesina and Ardagna made a full frontal assault on the Keynesian proposition that cutting spending in a weak economy produces further weakness. Like Reinhart and Rogoff, they marshaled historical evidence to make their case. According to Alesina and Ardagna, large spending cuts in advanced countries were, on average, followed by expansion rather than contraction. The reason, they suggested, was that decisive fiscal austerity created confidence in the private sector, and this increased confidence more than offset any direct drag from smaller government outlays.

The case against empathy

Paul Bloom in The New Yorker:

The immense power of empathy has been demonstrated again and again. It is why Americans were riveted by the fate of Natalee Holloway, the teen-ager who went missing in Aruba, in 2005. It’s why, in the wake of widely reported tragedies and disasters—the tsunami of 2004, Hurricane Katrina the year after, or Sandy last year—people gave time, money, and even blood. It’s why, last December, when twenty children were murdered at Sandy Hook Elementary School, in Newtown, Connecticut, there was a widespread sense of grief, and an intense desire to help. Last month, of course, saw a similar outpouring of support for the victims of the Boston Marathon bombing.

Why do people respond to these misfortunes and not to others? The psychologist Paul Slovic points out that, when Holloway disappeared, the story of her plight took up far more television time than the concurrent genocide in Darfur. Each day, more than ten times the number of people who died in Hurricane Katrina die because of preventable diseases, and more than thirteen times as many perish from malnutrition.

More here.

Commercial quantum computer leaves PC in the dust

Jacob Aron in New Scientist:

For the first time, a commercially available quantum computer has been pitted against an ordinary PC – and the quantum device left the regular machine in the dust.

D-Wave, a company based in Burnaby, Canada, has been selling quantum computers since 2011, although critics expressed doubt that their chips were actually harnessing the spooky action of quantum mechanics. That's because they use a non-mainstream method called adiabatic quantum computing.

Unlike classical bits, quantum bits, or qubits, can take the values 0 and 1 at the same time, theoretically offering much faster computing speed. To be truly quantum, the qubits must be linked via the quantum property of entanglement. That's impossible to measure while the device is operating. But in March, two separate tests of the D-Wave device showed indirect evidence for entanglement.

Now Catherine McGeoch of Amherst College, Massachusetts, a consultant to D-Wave, has put their computer through its paces and shown that it can beat regular machines. The D-Wave hardware is designed to solve a particular kind of optimisation problem: minimising the solution of a complicated equation by choosing the values of certain variables. It sounds esoteric, but the problem crops up in many practical applications, such as image recognition and machine learning.
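The "particular kind of optimisation problem" described above is what D-Wave's literature frames as quadratic unconstrained binary optimisation (QUBO): choose 0/1 values for the variables so as to minimise a quadratic expression. A minimal sketch in Python, with an invented three-variable matrix and brute-force search standing in for the annealer:

```python
from itertools import product

# Toy QUBO instance: minimise sum over i, j of Q[i][j] * x[i] * x[j]
# for binary vectors x. The matrix values are invented for illustration
# and have nothing to do with D-Wave's actual hardware graph.
Q = [
    [-1,  2,  0],
    [ 0, -1,  2],
    [ 0,  0, -1],
]

def qubo_energy(x, Q):
    """Value of the quadratic form for a binary assignment x."""
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

# Exhaustive search over all 2^n assignments stands in for the annealer.
best = min(product([0, 1], repeat=len(Q)), key=lambda x: qubo_energy(x, Q))
print(best, qubo_energy(best, Q))  # lowest-energy assignment and its value
```

On three variables, all 2^3 = 8 assignments can be checked directly; the annealer's appeal is on hundreds of coupled variables, where exhaustive search becomes hopeless.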

McGeoch and her colleague Cong Wang of Simon Fraser University, in Burnaby, ran the problem on a D-Wave Two computer, which has 439 qubits formed from superconducting niobium loops. They also tried to solve the problem using three leading algorithms running on a high-end desktop computer. The D-Wave machine turned out to be around 3600 times faster than the best conventional algorithm.

More here.

Enlightened monsters

Michael Saler in the Times Literary Supplement:

The child may be father to the man, but how did a girl become mother to the monster? We continue to ask that of Mary Shelley, who wrote Frankenstein, or the Modern Prometheus (1818) before she turned twenty. It is a startling work from someone so young, combining profound philosophic disquisitions with melodramatic blood and thunder. Some see it as the first science fiction novel, but as Roseanne Montillo shows in The Lady and Her Monsters, Shelley’s narrative of a scientist’s quest to discover and harness the “principle of life” was less an extrapolation into the future than a faithful representation of contemporary practices. Indeed, Frankenstein is one of the earliest horror novels about modernity, directly confronting the instabilities provoked by the Scientific, Industrial and French Revolutions. Shelley seemed predestined for this task: she was the daughter of the Enlightenment philosopher William Godwin and the radical critic and early feminist Mary Wollstonecraft, as well as the wife of the Romantic poet Percy Bysshe Shelley. The novel’s power stems from its young author’s often ambivalent wrestling with Enlightenment and Romantic responses to modernity, as well as her own traumas involving issues such as parenting and childbirth. (Her mother died eleven days after giving birth to her, and Shelley herself lost her first child shortly before commencing the book.)

More here.

punk


Questions of what punk is aside, it’s difficult to deny that, other than the crude beauty of the Ramones, the noisy dirges of bands like Flipper, or the shouts that “Civilization’s Dying” by the Indianapolis band Zero Boys, punk is best explained by its style. It’s hard to say whether somebody thinks like a punk, but if you see somebody with a red Mohawk and a bullet belt, chances are you will make assumptions as to which subculture that person best relates. And while people who might identify as punk will probably tell you they aren’t into high fashion, it is hard to ignore the profoundly impactful relationship between punk and fashion, intertwined since Dame Vivienne Westwood and Malcolm McLaren turned their Kings Road boutique into the iconic SEX store in 1974. And now everything that Westwood, McLaren, Johnny Rotten (né Lydon), Richard Hell, Patti Smith, and a host of other punks wore, and everything that followed, is getting the high-art treatment with the Metropolitan Museum of Art exhibition PUNK: Chaos to Couture.

more from Jason Diamond at Paris Review here.

Three Ways to Catch a Liar

Paul Ekman in Big Think:

Spotting a micro expression is the single most useful thing. This is an expression that lasts about a 25th of a second. We’ve tested over 15,000 people in all walks of life and over 99 percent of them don’t see them and yet with an hour's training on the Internet they can learn to see them. So if you want to catch liars, learn how to see micro expressions, use the micro expression training tool on my website.

However, that may only tell you that the person’s concealing an emotion. That’s a lie. It’s not going to tell you how they really feel. It may not tell you that they’re the perpetrator of a crime. Take this example – it’s a terrible example, but I have to use it: my wife is found dead. I will be the first suspect because, regrettably, the person most likely to kill their wife is the husband.

So I’m the first suspect but I love my wife. I didn’t kill her. The police are wasting their time and they’re insulting me. Time is going by and they’re not looking for the right person. I could be furious at them and concealing my anger. And so if you spot my concealed anger, it doesn’t mean I killed my wife. It only means that I’m concealing my anger.

More here.

Pakistan elections: how Nawaz Sharif beat Imran Khan and what happens next

Mohammed Hanif in The Guardian:

Here's a little fairytale from Pakistan. Fourteen years ago a wise man ruled the country. He enjoyed the support of his people. But some of his treacherous generals thought he wasn't that smart. One night he was held at gunpoint, handcuffed, put in a dark dungeon, sentenced to life imprisonment. But then a little miracle happened; he, along with his family and servants, was put on a royal plane and exiled to Saudi Arabia, that fancy retirement home for the world's unwanted Muslim leaders.

Two days ago that same man stood on a balcony in Lahore, thanked Allah and said: Nawaz Sharif forgives them all.

But wait, if it was a real fairytale, Imran Khan would have won the election instead, right? Can't Pakistani voters tell the difference between a world-famous, world cup-winning, charismatic leader and a mere politician who refers to himself in the third person?

Well he has, sort of. But not in the way he would have liked. Visiting foreign journalists have profiled Imran Khan more than they have profiled any living thing in this part of the world. If all the world's magazine editors were allowed to vote for Imran Khan he would be the prime minister of half the English-speaking world. If Imran Khan had contested in west London he would have won hands-down. But since this is Pakistan, he has won in Peshawar and two other cities. His party is set to form a government in Khyber Pakhtunkhwa, that north-western frontier province of Pakistan which Khan's profile writers never fail to remind us is the province that borders Afghanistan and the tribal areas that the world is so scared of. Or as some others never fail to remind the world: the land of the fierce pathans.

More here.

what is bangladesh?


The United Nations categorizes Bangladesh as a moderate Muslim democracy. Meanwhile, the current Foreign Minister called Bangladesh a secular country. She defined Bangladesh as a “non-communal country” with a “Muslim majority population”. The Foreign Minister further added that the concept of a moderate Muslim democracy cannot be applied in the case of Bangladesh because it fought its war of independence on the basis of the ideal of secularism. For Bangladesh, embracing religion or creating a secular identity has been a major contestation in the creation of its national identity. Identity questions for Bangladesh still stand: is it a country of secular Bengalis or Muslim Bangladeshis? This split personality of Bangladesh confounds the international observer. For an outsider, it makes perfect sense to call it a moderate Muslim democracy, as a Muslim majority population lives in a country that recognizes Islam as the state religion. Yet since the Shahbag movement erupted with its demands for the death penalty for war criminals, the international media has remained substantially silent about it.

more from Lailufar Yasmin at berfrois here.

Churchill’s First War

From The Telegraph:

So accustomed are we to the memory of Winston Churchill as a great statesman and war leader that it is easy to forget he was also once a nobody. The son of a failed, syphilitic politician and a promiscuous and penniless society beauty, Winston’s prospects in the 1890s looked unpromising. The only things he had to recommend him were his gift for language, his mother’s address book and his own astonishing ambition. In 1897, during the war on the borders of Afghanistan and India, he ruthlessly exploited all three to launch one of the most successful political careers in British history. Con Coughlin’s book does not pull punches about young Winston’s character. Weak of disposition, a plodder at school and a bully at Sandhurst, there are moments when he comes across like Flashman. His contemporaries loathed him. Fellow officers branded him a “self-advertiser” or “insufferably bumptious”, and mocked his inability to pass a mirror without inspecting himself or practising a speech. His involvement in the war that first made his name was not always glorious either. He only got to go to the Northwest Frontier by pulling strings. While there, he became a fan of the soon-to-be-banned dumdum bullet, and enthusiastically joined in with actions which today would be considered war crimes.

And yet there is a sense of respect here, too. Yes, Churchill was a shameless self-promoter – but he was aware his efforts would be pointless unless he had something worth promoting. While fellow subalterns lazed around the polo field, Winston pursued a gruelling course of self-improvement, quickly catching up on squandered school years. As a staff officer to General Sir Bindon Blood he worked twice as hard as any other subaltern.

More here.

My Medical Choice

Angelina Jolie in The New York Times:

MY MOTHER fought cancer for almost a decade and died at 56. She held out long enough to meet the first of her grandchildren and to hold them in her arms. But my other children will never have the chance to know her and experience how loving and gracious she was. We often speak of “Mommy’s mommy,” and I find myself trying to explain the illness that took her away from us. They have asked if the same could happen to me. I have always told them not to worry, but the truth is I carry a “faulty” gene, BRCA1, which sharply increases my risk of developing breast cancer and ovarian cancer. My doctors estimated that I had an 87 percent risk of breast cancer and a 50 percent risk of ovarian cancer, although the risk is different in the case of each woman. Only a fraction of breast cancers result from an inherited gene mutation. Those with a defect in BRCA1 have a 65 percent risk of getting it, on average.

Once I knew that this was my reality, I decided to be proactive and to minimize the risk as much as I could. I made a decision to have a preventive double mastectomy. I started with the breasts, as my risk of breast cancer is higher than my risk of ovarian cancer, and the surgery is more complex. On April 27, I finished the three months of medical procedures that the mastectomies involved. During that time I have been able to keep this private and to carry on with my work.

More here.

Musical noise? or noisy music?

by Dave Maier

In a previous post I considered and rejected the idea that music and noise (= non-musical sound art) are entirely incompatible – that noise is the very negation of musical meaning. This left open the question of how these things might co-exist – or, turning the question around, how composers might make use of both resources in the same composition. Today I revisit that issue. If you've heard the podcasts I've posted here, you know what sort of compositions I'm talking about, even if I haven't been able to explain how or why they work. If you want to skip the theoretical blather and check out another fab mix (a bit noisier than usual this time), scroll down now.

In his book Aesthetics & Music, philosopher Andy Hamilton seems to position himself to answer this question, but then skirts it, on the way to what he takes to be more important questions which we will not discuss here. However, that he does even this much is interesting, given most philosophers' attitudes toward music, so let's take a look. The context is the traditional philosophical question of what music is. It's clearly an art of sound in some sense, but what makes a sound musical and/or artistic? According to Hamilton, the traditional assumption (“universalism”) that music is the only art of sound has gone together with another assumption “that music exploits as material a particular range of sounds, namely tones” (45). As music has “embraced more noise elements,” he says, these two assumptions have come apart. This allows a variety of positions on the matter. We might, for example, keep the former stipulation, but expand the range of “music” to accommodate the new forms it may now take (i.e. after a century of experimentation).

Hamilton, however, goes the other route: “[P]aradoxically, the tonal basis of music has been clarified by the rejection of instrumental puritanism. Thus I reassert that music is the art of tones, while rejecting universalism and recognizing an emergent non-musical sound art which takes non-tonal sounds as its material.” In support of this, he insists that the fact that any sounds can be incorporated into music doesn't mean that any sounds can constitute music. Music, he says, is on a continuum with non-musical sound art; which something is depends on which type of sound is “predominant”. However, “this distinction is not in any way evaluative and is not intended to mark any great metaphysical divide.”

Well, I'm glad to hear that, I suppose, but now I wonder what exactly we're supposed to take away from it. In the context, it seems like he's simply making whatever concessions he needs to make to recent musical/sound-artistic history in order to continue with his story about music as the art of tone. But that gray area between tone and noise is where I live, or at least hang out a lot, so that's what I want to hear about. It's not just a theoretical exception to be swept under the rug. How does this stuff work?

Read more »