isadora duncan

Isadora Duncan and the sea

From childhood, Duncan saw herself as a liberator, opposed but never vanquished by philistines. In My Life she recalls that in elementary school she gave an impromptu lecture in front of the class on how there was no Santa Claus, whereupon she was sent home by an angry teacher. This was not the last of what, with pride, she called her “famous speeches.” When she became a professional, she routinely ended her concerts by coming out in front of the curtain and describing to the audience, at length, how profound her way of dancing was, as opposed to the triviality of other ways—she called ballet “an expression of degeneration, of living death”—and how, therefore, they should contribute to the expenses of her school. (This declamatory bent was probably the least attractive aspect of Duncan’s personality, as it is of My Life, and some reviewers had a lot of fun with it.) What appeared to her most vile about ballet was its unnaturalness: the rigid back, the studied positions, the relentless daintiness. Duncan was an exemplary bohemian—a quality that was partly rooted, no doubt, in the fact that she was from California. (She was born in San Francisco and raised, mostly, in Oakland.) That region has a history of breeding idealists, animists, nonconformists.

more from Joan Acocella at the NYRB here.

Princess Not-So-Charming

From Harvard Magazine:

“Fairy tales have always tapped into the subconscious, bringing to light children’s deepest fears,” says Soman Chainani ’01. In his new fantasy-adventure novel, The School for Good and Evil, he has brought that tenet into the twenty-first century. The first of a trilogy for middle-grade readers (ages nine and up), The School for Good and Evil tracks two archetypal heroines: the lovely Sophie, with her waist-long blond hair and her dreams of becoming a princess, and her friend Agatha, an unattractive, unpopular contrarian who chooses to wear black. A giant bird snatches the pair and carries them off to the School for Good and Evil, a two-pronged magical academy that trains children to become fairy-tale heroes and villains. When, to her horror, Sophie arrives at the Evil branch to learn “uglification,” death curses, and other dark arts, while Agatha finds herself at the School for Good amid handsome princes and fair maidens, the line between good and evil blurs, the meaning of beauty twists, and the girls reveal their true natures.

At the core of their journey is the “princess culture,” which Chainani defines as today’s “tyranny of pink in young-girl marketing. It tells them their responsibility is to be pink, sparkly, ultra-feminine, and—most of all—pretty.” With such an emphasis on looks, “girly girls are terrified of being ugly, and normal girls are afraid of being outcasts.” Even boys are unnerved. “They have no idea how to live up to the expectations,” he says. “That’s what I am interested in capturing: what kids fear most today.” Sophie and Agatha inhabit a world like that of classic fairy tales: a place where magic and reality coexist, and dangers lurk. Yet those dangers reflect modern issues. Several episodes tackle the fear of aging; one chapter riffs on the current obsession with physical self-improvement. In a scene where Sophie is asked to contribute to the school, she becomes a campus celebrity by offering “Malevolent Makeovers” and a presentation titled “Just Say No to Drab.” When Agatha challenges her, Sophie replies, “Isn’t this compassion? Isn’t this kindness and wisdom? I’m helping those who can’t help themselves!”

“So much is based on image,” Chainani explains. “It’s such a pervasive, destructive thing.”

More here.

The emergence of individuality in genetically identical mice

From Kurzweil AI:

How do people and other organisms evolve into individuals that are distinguished from others by their own personal brain structure and behavior? Why do identical twins not resemble each other perfectly even when they grew up together? To shed light on these questions, the scientists observed 40 genetically identical mice that were kept in an enclosure that offered a rich shared environment with a large variety of activity and exploration options. They showed that individual experiences influence the development of new neurons in mice, leading to measurable changes in the brain. “The animals were not only genetically identical, they were also living in the same environment,” explained principal investigator Gerd Kempermann, Professor for Genomics of Regeneration, CRTD, and Site Speaker of the DZNE in Dresden. “However, this environment was so rich that each mouse gathered its own individual experiences in it. Over time, the animals therefore increasingly differed in their realm of experience and behavior.” Each of the mice was equipped with a special microchip emitting electromagnetic signals. This allowed the scientists to construct the mice’s movement profiles and quantify their exploratory behavior.

The result: despite a common environment and identical genes, the mice showed highly individualized behavioral patterns. In the course of the three-month experiment, these differences increased in size.

“These differences were associated with differences in the generation of new neurons in the hippocampus, a region of the brain that supports learning and memory,” said Kempermann. “Animals that explored the environment to a greater degree also grew more new neurons than animals that were more passive.” Adult neurogenesis [generation of new neurons] in the hippocampus allows the brain to react to new information flexibly. With this study, the authors show for the first time that personal experiences and ensuing behavior contribute to the “individualization of the brain.” The individualization they observed cannot be reduced to differences in environment or genetic makeup. “Adult neurogenesis also occurs in the hippocampus of humans,” said Kempermann. “Hence we assume that we have tracked down a neurobiological foundation for individuality that also applies to humans.”
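
The excerpt says the microchip signals let the scientists construct movement profiles and “quantify their exploratory behavior,” but it does not spell out the measure used. Purely as a hedged illustration (the function name, antenna IDs, and scoring rule below are assumptions for this sketch, not the study’s actual analysis), here is one way detections from antennas scattered around an enclosure could be collapsed into a single exploration score: a mouse whose visits spread evenly over many antennas scores near 1, while one that lingers around a few spots scores near 0.

```python
# A minimal sketch, assuming RFID-style antenna logs: each detection records
# which antenna a given mouse's transponder passed. The score is a normalised
# Shannon entropy of visit frequencies, not the study's own code.
from collections import Counter
from math import log

def roaming_entropy(antenna_hits, n_antennas):
    """Entropy of antenna-visit frequencies, normalised to the range [0, 1]."""
    counts = Counter(antenna_hits)
    total = sum(counts.values())
    if total == 0 or n_antennas < 2:
        return 0.0
    probs = [c / total for c in counts.values()]
    entropy = -sum(p * log(p) for p in probs)
    return entropy / log(n_antennas)  # 1.0 = visits spread evenly over all antennas

# Hypothetical detection logs for two mice (antenna IDs are made up).
explorer = ["A1", "A2", "A3", "A4", "A1", "A5", "A6", "A2", "A7", "A3"]
homebody = ["A1", "A1", "A1", "A2", "A1", "A1", "A1", "A1", "A2", "A1"]

print(roaming_entropy(explorer, n_antennas=20))  # closer to 1: more exploratory
print(roaming_entropy(homebody, n_antennas=20))  # closer to 0: more passive
```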

More here.

Tuesday, May 14, 2013

How the Case for Austerity Has Crumbled


In the New York Review of Books, Paul Krugman reviews Neil Irwin's The Alchemists: Three Central Bankers and a World on Fire, David A. Stockman's The Great Deformation: The Corruption of Capitalism in America, and Mark Blyth's Austerity: The History of a Dangerous Idea:

It’s an ill wind that blows nobody good, and the Greek crisis was a godsend for anti-Keynesians. They had been warning about the dangers of deficit spending; the Greek debacle seemed to show just how dangerous fiscal profligacy can be. To this day, anyone arguing against fiscal austerity, let alone suggesting that we need another round of stimulus, can expect to be attacked as someone who will turn America (or Britain, as the case may be) into another Greece.

If Greece provided the obvious real-world cautionary tale, Reinhart and Rogoff seemed to provide the math. Their paper seemed to show not just that debt hurts growth, but that there is a “threshold,” a sort of trigger point, when debt crosses 90 percent of GDP. Go beyond that point, their numbers suggested, and economic growth stalls. Greece, of course, already had debt greater than the magic number. More to the point, major advanced countries, the United States included, were running large budget deficits and closing in on the threshold. Put Greece and Reinhart-Rogoff together, and there seemed to be a compelling case for a sharp, immediate turn toward austerity.

But wouldn’t such a turn toward austerity in an economy still depressed by private deleveraging have an immediate negative impact? Not to worry, said another remarkably influential academic paper, “Large Changes in Fiscal Policy: Taxes Versus Spending,” by Alberto Alesina and Silvia Ardagna.

One of the especially good things in Mark Blyth’s Austerity: The History of a Dangerous Idea is the way he traces the rise and fall of the idea of “expansionary austerity,” the proposition that cutting spending would actually lead to higher output. As he shows, this is very much a proposition associated with a group of Italian economists (whom he dubs “the Bocconi boys”) who made their case with a series of papers that grew more strident and less qualified over time, culminating in the 2009 analysis by Alesina and Ardagna.

In essence, Alesina and Ardagna made a full frontal assault on the Keynesian proposition that cutting spending in a weak economy produces further weakness. Like Reinhart and Rogoff, they marshaled historical evidence to make their case. According to Alesina and Ardagna, large spending cuts in advanced countries were, on average, followed by expansion rather than contraction. The reason, they suggested, was that decisive fiscal austerity created confidence in the private sector, and this increased confidence more than offset any direct drag from smaller government outlays.

The case against empathy

Paul Bloom in The New Yorker:

The immense power of empathy has been demonstrated again and again. It is why Americans were riveted by the fate of Natalee Holloway, the teen-ager who went missing in Aruba, in 2005. It’s why, in the wake of widely reported tragedies and disasters—the tsunami of 2004, Hurricane Katrina the year after, or Sandy last year—people gave time, money, and even blood. It’s why, last December, when twenty children were murdered at Sandy Hook Elementary School, in Newtown, Connecticut, there was a widespread sense of grief, and an intense desire to help. Last month, of course, saw a similar outpouring of support for the victims of the Boston Marathon bombing.

Why do people respond to these misfortunes and not to others? The psychologist Paul Slovic points out that, when Holloway disappeared, the story of her plight took up far more television time than the concurrent genocide in Darfur. Each day, more than ten times the number of people who died in Hurricane Katrina die because of preventable diseases, and more than thirteen times as many perish from malnutrition.

More here.

Commercial quantum computer leaves PC in the dust

Jacob Aron in New Scientist:

For the first time, a commercially available quantum computer has been pitted against an ordinary PC – and the quantum device left the regular machine in the dust.

D-Wave, a company based in Burnaby, Canada, has been selling quantum computers since 2011, although critics expressed doubt that their chips were actually harnessing the spooky action of quantum mechanics. That's because they use a non-mainstream method called adiabatic quantum computing.

Unlike classical bits, quantum bits, or qubits, can take the values 0 and 1 at the same time, theoretically offering much faster computing speed. To be truly quantum, the qubits must be linked via the quantum property of entanglement. That's impossible to measure while the device is operating. But in March, two separate tests of the D-Wave device showed indirect evidence for entanglement.

Now Catherine McGeoch of Amherst College, Massachusetts, a consultant to D-Wave, has put their computer through its paces and shown that it can beat regular machines. The D-Wave hardware is designed to solve a particular kind of optimisation problem: minimising the solution of a complicated equation by choosing the values of certain variables. It sounds esoteric, but the problem crops up in many practical applications, such as image recognition and machine learning.

McGeoch and her colleague Cong Wang of Simon Fraser University, in Burnaby, ran the problem on a D-Wave Two computer, which has 439 qubits formed from superconducting niobium loops. They also tried to solve the problem using three leading algorithms running on a high-end desktop computer. The D-Wave machine turned out to be around 3600 times faster than the best conventional algorithm.
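
The excerpt describes the hardware’s target problem only loosely (“minimising the solution of a complicated equation by choosing the values of certain variables”). As a rough sketch of what such a problem looks like, and not D-Wave’s method or the benchmark McGeoch and Wang ran, the toy search below picks binary variables to minimise a quadratic cost. The coefficients are made up for illustration; an exhaustive search like this scales as 2^n, which is why faster ways of attacking large instances are of interest.

```python
# A toy quadratic binary optimisation problem (values chosen only for
# illustration; this is not D-Wave's API or algorithm). We pick x_i in {0, 1}
# to minimise sum_i h[i]*x[i] + sum_{i<j} J[(i, j)]*x[i]*x[j] by brute force.
from itertools import product

def brute_force_qubo(h, J):
    """Return (best_assignment, best_energy) for a small problem instance."""
    n = len(h)
    best_x, best_energy = None, float("inf")
    for x in product([0, 1], repeat=n):  # all 2^n assignments
        energy = sum(h[i] * x[i] for i in range(n))
        energy += sum(Jij * x[i] * x[j] for (i, j), Jij in J.items())
        if energy < best_energy:
            best_x, best_energy = x, energy
    return best_x, best_energy

# Hypothetical 4-variable instance.
h = [1, -2, 3, -1]
J = {(0, 1): -3, (1, 2): 2, (2, 3): -4}
print(brute_force_qubo(h, J))  # -> ((1, 1, 0, 1), -5)
```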

More here.

Enlightened monsters

Michael Saler in the Times Literary Supplement:

The child may be father to the man, but how did a girl become mother to the monster? We continue to ask that of Mary Shelley, who wrote Frankenstein, or the Modern Prometheus (1818) before she turned twenty. It is a startling work from someone so young, combining profound philosophic disquisitions with melodramatic blood and thunder. Some see it as the first science fiction novel, but as Roseanne Montillo shows in The Lady and Her Monsters, Shelley’s narrative of a scientist’s quest to discover and harness the “principle of life” was less an extrapolation into the future than a faithful representation of contemporary practices. Indeed, Frankenstein is one of the earliest horror novels about modernity, directly confronting the instabilities provoked by the Scientific, Industrial and French Revolutions. Shelley seemed predestined for this task: she was the daughter of the Enlightenment philosopher William Godwin and the radical critic and early feminist Mary Wollstonecraft, as well as the wife of the Romantic poet Percy Bysshe Shelley. The novel’s power stems from its young author’s often ambivalent wrestling with Enlightenment and Romantic responses to modernity, as well as her own traumas involving issues such as parenting and childbirth. (Her mother died eleven days after giving birth to her, and Shelley herself lost her first child shortly before commencing the book.)

More here.

punk


Questions of what punk is aside, it’s difficult to deny that, other than the crude beauty of the Ramones, the noisy dirges of bands like Flipper, or the shouts that “Civilization’s Dying” by the Indianapolis band Zero Boys, punk is best explained by its style. It’s hard to say whether somebody thinks like a punk, but if you see somebody with a red Mohawk and a bullet belt, chances are you will make assumptions as to which subculture that person best relates. And while people who might identify as punk will probably tell you they aren’t into high fashion, it is hard to ignore the profoundly impactful relationship between punk and fashion, intertwined since Dame Vivienne Westwood and Malcolm McLaren turned their Kings Road boutique into the iconic SEX store in 1974. And now everything that Westwood, McLaren, Johnny Rotten (né Lydon), Richard Hell, Patti Smith, and a host of other punks wore, and everything that followed, is getting the high-art treatment with the Metropolitan Museum of Art exhibition PUNK: Chaos to Couture.

more from Jason Diamond at Paris Review here.

Three Ways to Catch a Liar

Paul Ekman in Big Think:

Spotting a micro expression is the single most useful thing. This is an expression that lasts about a 25th of a second. We’ve tested over 15,000 people in all walks of life and over 99 percent of them don’t see them and yet with an hour's training on the Internet they can learn to see them. So if you want to catch liars, learn how to see micro expressions, use the micro expression training tool on my website.

However, that may only tell you that the person’s concealing an emotion. That’s a lie. It’s not going to tell you how they really feel. It may not tell you that they’re the perpetrator of a crime. Take this example – it’s a terrible example, but I have to use it: my wife is found dead. I will be the first suspect because, regrettably, the person most likely to kill their wife is the husband.

So I’m the first suspect but I love my wife. I didn’t kill her. The police are wasting their time and they’re insulting me. Time is going by and they’re not looking for the right person. I could be furious at them and concealing my anger. And so if you spot my concealed anger, it doesn’t mean I killed my wife. It only means that I’m concealing my anger.

More here.

Pakistan elections: how Nawaz Sharif beat Imran Khan and what happens next

Mohammed Hanif in The Guardian:

Here's a little fairytale from Pakistan. Fourteen years ago a wise man ruled the country. He enjoyed the support of his people. But some of his treacherous generals thought he wasn't that smart. One night he was held at gunpoint, handcuffed, put in a dark dungeon, sentenced to life imprisonment. But then a little miracle happened; he, along with his family and servants, was put on a royal plane and exiled to Saudi Arabia, that fancy retirement home for the world's unwanted Muslim leaders.

Two days ago that same man stood on a balcony in Lahore, thanked Allah and said: Nawaz Sharif forgives them all.

But wait, if it was a real fairytale, Imran Khan would have won the election instead, right? Can't Pakistani voters tell between a world-famous, world cup-winning, charismatic leader and a mere politician who refers to himself in the third person?

Well, he has, sort of. But not in the way he would have liked. Visiting foreign journalists have profiled Imran Khan more than they have profiled any living thing in this part of the world. If all the world's magazine editors were allowed to vote for Imran Khan he would be the prime minister of half the English-speaking world. If Imran Khan had contested in west London he would have won hands-down. But since this is Pakistan, he has won in Peshawar and two other cities. His party is set to form a government in Khyber Pakhtunkhwa, that north-western frontier province of Pakistan which Khan's profile writers never fail to remind us is the province that borders Afghanistan and the tribal areas that the world is so scared of. Or as some others never fail to remind the world: the land of the fierce Pathans.

More here.

what is bangladesh?


The United Nations categorizes Bangladesh as a moderate Muslim democracy. Meanwhile, the current Foreign Minister called Bangladesh a secular country. She defined Bangladesh as a “non-communal country” with a “Muslim majority population”. The Foreign Minister further added that the concept of a moderate Muslim democracy cannot be applied in the case of Bangladesh because it fought its war of independence on the basis of the ideal of secularism. For Bangladesh, embracing religion or creating a secular identity has been a major contestation in the creation of its national identity. Identity questions for Bangladesh still stand: is it a country of secular Bengalis or Muslim Bangladeshis? This split personality of Bangladesh confounds the international observer. For an outsider, it makes perfect sense to call it a moderate Muslim democracy as a Muslim majority population lives in a country that recognizes Islam as the state religion. Yet since the Shahbag movement erupted with demands for the death penalty for war criminals, the international media has remained substantially silent about it.

more from Lailufar Yasmin at berfrois here.

Churchill’s First War

From The Telegraph:

So accustomed are we to the memory of Winston Churchill as a great statesman and war leader that it is easy to forget he was also once a nobody. The son of a failed, syphilitic politician and a promiscuous and penniless society beauty, Winston’s prospects in the 1890s looked unpromising. The only things he had to recommend him were his gift for language, his mother’s address book and his own astonishing ambition. In 1897, during the war on the borders of Afghanistan and India, he ruthlessly exploited all three to launch one of the most successful political careers in British history. Con Coughlin’s book does not pull punches about young Winston’s character. Weak of disposition, a plodder at school and a bully at Sandhurst, there are moments when he comes across like Flashman. His contemporaries loathed him. Fellow officers branded him a “self-advertiser” or “insufferably bumptious”, and mocked his inability to pass a mirror without inspecting himself or practising a speech. His involvement in the war that first made his name was not always glorious either. He only got to go to the Northwest Frontier by pulling strings. While there, he became a fan of the soon-to-be-banned dumdum bullet, and enthusiastically joined in with actions which today would be considered war crimes.

And yet there is a sense of respect here, too. Yes, Churchill was a shameless self-promoter – but he was aware his efforts would be pointless unless he had something worth promoting. While fellow subalterns lazed around the polo field, Winston pursued a gruelling course of self-improvement, quickly catching up on squandered school years. As a staff officer to General Sir Bindon Blood he worked twice as hard as any other subaltern.

More here.

My Medical Choice

Angelina Jolie in The New York Times:

MY MOTHER fought cancer for almost a decade and died at 56. She held out long enough to meet the first of her grandchildren and to hold them in her arms. But my other children will never have the chance to know her and experience how loving and gracious she was. We often speak of “Mommy’s mommy,” and I find myself trying to explain the illness that took her away from us. They have asked if the same could happen to me. I have always told them not to worry, but the truth is I carry a “faulty” gene, BRCA1, which sharply increases my risk of developing breast cancer and ovarian cancer. My doctors estimated that I had an 87 percent risk of breast cancer and a 50 percent risk of ovarian cancer, although the risk is different in the case of each woman. Only a fraction of breast cancers result from an inherited gene mutation. Those with a defect in BRCA1 have a 65 percent risk of getting it, on average.

Once I knew that this was my reality, I decided to be proactive and to minimize the risk as much as I could. I made a decision to have a preventive double mastectomy. I started with the breasts, as my risk of breast cancer is higher than my risk of ovarian cancer, and the surgery is more complex. On April 27, I finished the three months of medical procedures that the mastectomies involved. During that time I have been able to keep this private and to carry on with my work.

More here.

Monday, May 13, 2013

A North American’s Guide to the Use and Abuse of the Modern PhD

by Colin Eatock

You applied to the program, and you got in. Then you spent the next four, six, eight or more years stroking the capricious egos of professors, jockeying for position within your peer group and marking bad undergraduate essays for the minimum wage. You completed the research, the grant applications, the writing, the comprehensive exams, and finally the defence.

You got through it all, somehow – and now it's yours. You walked across the stage at a graduation ceremony, and an Important Person in a robe gave you the paper scroll that made it official. You are no longer a Mr. or a Ms. Now, you are a Doctor. You have a PhD.

A PhD isn't just something you've acquired, it's something you've become. It's part of who you are – and you're proud that you've transformed yourself in a way that's meaningful to you. Now that you can hold it in your hands, you feel you are someone special, and you want to tell the whole world.

But can you – or should you? And if so, how?

This is where it gets tricky. Indeed, knowing when it is professionally and socially acceptable to “use” your PhD – to call yourself Doctor, and to hope to be addressed as such in return – is a minefield where values, conventions and contexts intersect in fluid and intricate ways. And nowhere has the question ever been more perplexing than in North America today.

Ironically, this issue is often less troublesome in parts of Europe, Asia and Latin America. In many societies, scholarship and professional rank are highly respected things – and terms of address are an art form, requiring subtlety and precision. It would be tantamount to an insult to fail to address any kind of a doctor as Doctor.

But in North America – where traditions are discarded, hierarchies are flouted, and everything is supposed to be so much easier as a result – the rules surrounding the PhD designation are as clear as mud. Today's freshly minted scholars stand on shifting sands, and often have no idea when or where – or even if – it is acceptable to casually slip the initials Dr. in front of their name.

Read more »

The skies and scenes of Südtirol: Photographs

by S. Abbas Raza

The light is very dramatic in the mountains of the Eisacktal at sunrise and sunset and changes from minute to minute. Even at a given time the sky can often look completely different in different directions. Yesterday evening I took a walk from my house in Brixen to the nearby village of Neustift and took some photographs along the way. Here they are without further commentary:


Read more »

Sunday, May 12, 2013

Assad’s Castaways

Guernica-assad

J. Malcolm Garcia in Guernica:

Moutasem and Sarah watch their breath in the frigid February air. We are in the principal’s office of Muhammad al-Fatih, a secondary school for teenagers of Syrian refugees in Antakya, Turkey. The school has no heat, but it is better to freeze here than to be in Syria right now, my Syrian translator Hazim tells me. There, the army patrols villages and cities, killing suspected activists. Men, women, children. No one is safe. If the army could arrest the air, he says, it would.

Hazim is a Sunni Muslim, as are the students in the school and the rebels in the Free Syrian Army (FSA). The rebels have been battling the troops of President Bashar al-Assad since March 2011.

Syria’s Alawite minority and Sunni majority have been at odds for hundreds of years. The minority Alawites, like Assad, dominate Syria’s government, hold key military positions, and enjoy a disproportionate share of the country’s wealth.

Moutasem and Sarah, and other children I will meet in the coming days, have had their lives upended by a war made more complicated by centuries of ethnic rivalry.

Moutasem, fifteen, wears black shoes, pressed blue jeans, and a red wool sweater, and slouches, relaxed, in his chair. His eyes stare intently. When he sees me shiver in the cold, he offers me his coat.

Without Concepts

Richard Marshall interviews Edouard Machery in 3:AM Magazine:

3:AM: I think one way that we can immediately see the importance of your approach to philosophy and cognitive science is by discussing your work on racism. Racism has traditionally been thought of as either a question of nature – roughly, the thought that we’re born to think in racial terms – or nurture – roughly, our culture, upbringing, environment constructs races, and that they don’t exist in nature. You took the two research traditions, the nature tradition and the nurture tradition, and combined them. Can you say something about why you thought this combined approach was important at the time and what difference such an approach has made on research into this? Has it been an approach that has been well received by those in the previously opposing camps?

EM: Many social phenomena, such as racism, have been studied by, on the one hand, cultural anthropologists, sociologists, and historians, and on the other hand, by biologists and by evolutionary-minded behavioral scientists (anthropologists and psychologists). Sadly, these two traditions have failed to engage with one another, and, as a result, our understanding of many social phenomena remains incomplete. In my opinion, it is uncontroversial that social and psychological phenomena like racism or morality result from evolved cognitive structures, whose understanding requires an evolutionary perspective, but that many of their properties are the product of contingent historical trajectories. Integrating the two explanatory traditions is what I called the “integration challenge.” In my view, the theory of cultural evolution provides a framework for this integration.


[Photo of Gil-White from his home page.]

Racism is a case in point. As I have argued, following in part Gil-White’s groundbreaking work, we have evolved a sensitivity to “ethnic markers” (roughly, to markers such as clothes, accent, etc., that indicate what cultural group one belongs to) and a motivation to interact preferentially with members of our own cultural group. Racism is a by-product of this evolved sensitivity and motivation, and it emerges when skin color and other physical properties trigger our sensitivity to ethnic markers.

This hypothesis is useful to understand the unity of a large range of social and psychological phenomena, whose fundamental identity has often been ignored, or even denied, by historians and cultural anthropologists. On the other hand, research in history and cultural anthropology is needed to understand the peculiarities of racism in different historical contexts.