This video contains the most preposterous performance I have seen yet in a talk show, by the “Republican strategist” Jack Burkman. It is as offensive as it is ludicrous and laughable, and Robert Fisk and Anas al-Tikriti seem to have trouble deciding whether to get angry or break out in guffaws at the ridiculousness of it all!
What makes a person gay? Is it genetics, upbringing, or some combination of the two? Over the past few decades, a slew of scientific research has bolstered the notion that sexuality is, at least in part, innate. Studies of the sexual behavior of various animal species have shown that homosexuality is not just a human phenomenon. Then there is the curious finding that the number of older brothers a male has may biologically increase his chances of being gay.
Now Simon LeVay, a former Harvard neuroscientist, has written, “Gay, Straight, and the Reason Why: The Science of Sexual Orientation,” a comprehensive, engaging and occasionally quite funny look at the current state of the research on the topic. LeVay is one of the leading authorities in the field: Back in 1991, he discovered that INAH3, a structure in the hypothalamus of the brain that helps to regulate sexual behavior, tended to be smaller in gay men than in straight men. It was a watershed moment in our understanding of sexual orientation (the study was published at the height of the AIDS epidemic, when the disease was widely regarded in religious circles as divine punishment for the sin of being gay) and the first scientific finding to support the idea that gayness might be more than just a lifestyle.
People have used mind control to change images on a video screen, a study reports. The volunteers, whose brains were wired up to a computer, enhanced one of two competing images of famous people or objects by changing firing rates in individual brain cells.
The research, by Moran Cerf from the California Institute of Technology in Pasadena and his colleagues, demonstrates how our brains, which are constantly bombarded with images, noise and smells, can, through conscious thought, select what stimuli to notice and what to ignore (see video). The research is particularly exciting, says neuroengineer John Donoghue of Brown University, “because it shows how we can now peer into the process of thinking at a level we have not been able to get at before”. Donoghue was responsible for the first successful transplantation of a chip into the motor cortex of a tetraplegic man, enabling him to move a computer cursor and manipulate a robotic arm with his mind.
In the last six years or so, Cerf and his colleagues have shown that single neurons can fire when subjects recognise — or even imagine — just one particular person or object. They propose that activity in these neurons reflects the choices the brain is making about what sensory information it will consider further and what information it will neglect.
Over at The Immanent Frame:
NS: What makes modern sociological terms like “secularism” and “secularization” useful for interpreting eighteenth- and nineteenth-century literature? Is there a danger of falling into misleading anachronism?
CJ: There’s always that danger when we use a term from one historical period to describe aspects of another one. “Secularism” first emerges in Victorian England as a self-description, a way to avoid being labeled an atheist, and it has a long history within Christianity before that, as the secular, or worldly, time before the Second Coming. “Secularization” is a bit trickier, since it aims to describe a process and to give that process the aura of scientific neutrality, like the weather. I think the danger is not so much anachronism—which, frankly, I don’t think is a bad thing anyway—but rather forgetting that terms are never merely descriptive. So, I use the term, and I try to be reflexive about it. It’s comforting for many people to see themselves as living on the far side of a secularization process, and it’s that sense of comfort that I’d like to disrupt a bit.
NS: What does it mean for you to be reflexive?
CJ: What I mean by “reflexivity” is really just a critical consciousness that whenever you invoke a term, you are also invoking its history—the conditions under which it was forged and the uses to which it has been subsequently put. At the same time, we need these terms: something has changed over the course of modernity, for instance, and I’m comfortable with calling that change “secularization,” as long as it’s defined very carefully and I know what the stakes are in a given definition. Reflexivity is just my shorthand for the process, which I take to be central to serious intellectual practice, anyway—to strike the balance between using a term or concept or idea and simultaneously being aware of what you’re doing when you use it. It’s a mental habit of disembedding from the stuff you really care about—which, appropriately enough, is a pretty good definition of the secular!
My most beloved and second oldest friend, my younger sister, turned a new decade a few days ago. It was an occasion to openly appreciate the happiness she brings me. A recent study suggests that this effect is wholly normal. Deborah Tannen in the NYT offers some further thoughts on the topic:
“Having a Sister Makes You Happier”: that was the headline on a recent article about a study finding that adolescents who have a sister are less likely to report such feelings as “I am unhappy, sad or depressed” and “I feel like no one loves me.”
These findings are no fluke; other studies have come to similar conclusions. But why would having a sister make you happier?
The usual answer — that girls and women are more likely than boys and men to talk about emotions — is somehow unsatisfying, especially to a researcher like me. Much of my work over the years has developed the premise that women’s styles of friendship and conversation aren’t inherently better than men’s, simply different.
A man once told me that he had spent a day with a friend who was going through a divorce. When he returned home, his wife asked how his friend was coping. He replied: “I don’t know. We didn’t talk about it.”
His wife chastised him. Obviously, she said, the friend needed to talk about what he was going through.
This made the man feel bad. So he was relieved to read in my book “You Just Don’t Understand” (Ballantine, 1990) that doing things together can be a comfort in itself, another way to show caring. Asking about the divorce might have made his friend feel worse by reminding him of it, and expressing concern could have come across as condescending.
The man who told me this was himself comforted to be reassured that his instincts hadn’t been wrong and he hadn’t let his friend down.
But if talking about problems isn’t necessary for comfort, then having sisters shouldn’t make men happier than having brothers. Yet the recent study — by Laura Padilla-Walker and her colleagues at Brigham Young University — is supported by others.
[H/t: Maeve Adams]
Let me start from the beginning. Vasilij had fallen ill, and I went to see him after visiting an exhibition in the Riga Art Space. The exhibition included paintings of the most diverse quality (including very poor ones), grouped by decade and forming what was quite literally a kind of labyrinth. The subterranean exhibition space was crammed with works, which had even prompted someone to write in the visitors’ book that art isn’t firewood (evidently meaning that paintings cannot be piled up like logs of firewood). I recalled this comment when, trying to step back in order to get a better view of a large painting, I tripped on the steps directly in front of the painting. And so I sat in Vasilij’s living room with my sprained leg on a pillow and, while sipping tea, recounted my impressions. Our state of health prompted us to adopt a resignedly ironic view. At the beginning of the conversation I mentioned the guiding principle of the exhibition: to cast a look at the art of the Soviet period without ideological prejudices, something that may have accounted for the varying quality of the exhibited works. “I didn’t know that ideological prejudice or the lack of it could serve as a criterion for quality in art”, said Vasilij scornfully.
more from Janis Taurens at Eurozine here.
‘I am a camera with its shutter open, quite passive, recording, not thinking.’ Anyone familiar with the declaration by the narrator of Christopher Isherwood’s most enduring work of fiction, Goodbye to Berlin (1939), will be surprised by how uncinematic, indeed incomprehensive, his diary entries can be. There’s a lot of thinking, and nothing like the gestures towards abandoning subjectivity and self-consciousness that Isherwood crafted into his novels, not least the one masterpiece penned during the period covered by this second collection – A Single Man (1964). As in the first volume of diaries, published in 1996, Isherwood comes across as, by turns, rebarbative, loving, insecure, opinionated, self-critical, self-destructive, reticent, controlling and grand. His sing-song voice – caught in the 2007 documentary Chris and Don: A Love Story – is hard to square with these entries, which are rarely light-hearted. What they are, however, is a huge relief after this book’s thousand-page predecessor.
more from Richard Canning at Literary Review here.
For most of us, one of the fundamental appeals of art is its exemplary capacity in the struggle against entropy — a cultural artifact is valued according to the degree of order it embodies — and the strength of its resistance to the ravages of time. The more intricately woven the tapestry or solidly constructed the pyramid, the more reassured we are that perhaps Kansas got it wrong with regard to all we are being dust in the wind. Of course, this being the case, modernist and postmodernist artists have made it their business to challenge this preconception on a number of fronts — by ostentatiously reintegrating the already discarded detritus of culture into new arrangements, as in the collages of Kurt Schwitters and the Combines of Robert Rauschenberg; by emphasizing the spontaneous improvisational gesture in order to destabilize the balance between order and chaos, as in the abstract expressionist drip paintings of Jackson Pollock; by creating deliberately ephemeral performances, happenings and installations whose only record is whatever documentation or relics happen to be left over, as in Chris Burden’s often life-endangering actions of the early 1970s, whose collectible evidence consists of snapshots, Super-8 film, audiocassettes and a handful of used bullets. One of the pivotal figures in the development of this broad-spectrum aesthetic of decay was Alberto Burri (1915-1995), an Italian painter who first gained attention with his abstract compositions stitched together from scraps of surplus burlap sacks, then proceeded to explore the surface possibilities of shredded and burned plastic, welded plates of scrap metal, eroded acoustic tile and other quotidian industrial materials.
more from Doug Harvey at the LA Weekly here.
Pretty women wonder where my secret lies.
I'm not cute or built to suit a fashion model's size
But when I start to tell them,
They think I'm telling lies.
It's in the reach of my arms
The span of my hips,
The stride of my step,
The curl of my lips.
I'm a woman
I walk into a room
Just as cool as you please,
And to a man,
The fellows stand or
Fall down on their knees.
Then they swarm around me,
A hive of honey bees.
It's the fire in my eyes,
And the flash of my teeth,
The swing in my waist,
And the joy in my feet.
I'm a woman
Men themselves have wondered
What they see in me.
They try so much
But they can't touch
My inner mystery.
When I try to show them
They say they still can't see.
It's in the arch of my back,
The sun of my smile,
The ride of my breasts,
The grace of my style.
I'm a woman
Now you understand
Just why my head's not bowed.
I don't shout or jump about
Or have to talk real loud.
When you see me passing
It ought to make you proud.
It's in the click of my heels,
The bend of my hair,
The palm of my hand,
The need of my care,
'Cause I'm a woman
by Maya Angelou
from And Still I Rise, 1978
From The Independent:
The Booker prize-winning writer Arundhati Roy has made a strident defence of comments she made over the disputed territory of Kashmir after the Indian government threatened to arrest her for sedition. The authorities in Delhi have taken legal advice over whether to bring charges against the novelist and activist after she said Kashmir had never been an “integral part of India”. “Even the Indian government has accepted this. Why are we trying to change this now?” she added, at a public meeting, at which one of the other speakers was veteran separatist leader, Syed Ali Shah Geelani. The comments of Ms Roy were immediately seized on by political opposition, which demanded she be charged. Law minister, Veerappa Moily, said while India enjoyed freedom of speech, “it can't violate the patriotic sentiments of the people”.
But Ms Roy, writing from Srinagar, the largest town in the Kashmir valley and the scene of numerous deaths of protesters this year, said she had only given voice to what millions of people in Kashmir had been saying for a long time. “Pity the nation that has to silence its writers for speaking their minds,” she said. “Pity the nation that needs to jail those who ask for justice while communal killers, mass murderers, corporate scamsters, looters, rapists and those who prey on the poorest of the poor, roam free.” Ms Roy's comments come after the deaths of dozens of protesters in the Kashmir valley since new demonstrations for autonomy erupted in June. The once-independent kingdom has been fought over since 1947 when its Hindu ruler decided the Muslim-majority state should join independent India, rather than the newly-created Pakistan.
From Harvard Magazine:
Attending a quality kindergarten that provides experienced teachers and small classes yields measurable benefits, such as higher salaries in adulthood. That finding, in a study led by professor of economics Raj Chetty, has caused a stir by demonstrating that even the earliest school experiences can affect students’ subsequent quality of life, exerting more influence than researchers previously thought. Chetty and his colleagues, including Harvard Kennedy School associate professor of public policy John Friedman, examined data from Project STAR, a study of nearly 12,000 Tennessee kindergartners conducted in the mid-1980s. The children were randomly assigned to their teachers and to classes that were small (about 15 students) or large (around 24 students) and subsequently tracked (see “The Case for Smaller Classes,” May-June 1999, page 34).
Previous analyses of Project STAR showed that children in small classes, and those with the best teachers, scored higher on standardized tests in the primary grades. But by the time those students reached junior high, the advantage vanished, a phenomenon known as “fade out.” “By the time they’re in seventh or eighth grade, the kids in a better kindergarten class are not doing any better on tests than the kids in not-so-good classes,” Chetty says. Conventional wisdom held that the boost from a good kindergarten ebbed with time. “What’s really surprising about our study,” Chetty says, “is that [the advantage] comes back in adulthood.” When he and his colleagues looked at what the students—now in their early thirties—recently earned, they found that those who had the best kindergarten teachers make more money.
Lilya Kaganovsky in Kritik's Mad World blog:
In 2008, six years after The Wire first aired on HBO, Film Quarterly published a review of the complete fourth season on DVD, which described (in a positive way) the show’s fans as junkies and the show as a drug. Fans come to video stores, wrote J. M. Tyree, jonesing for a hit, desperate for another dose of a show that missed its audience only by finding it (much like Proust suggests a good novelist does) after the fact, when its five (really four and a half) seasons were nearly over:
There is a growing cult around The Wire, although many of its members do not subscribe to HBO, appearing instead like junkies at their local video rental stores months after the original broadcasts, and helping the show continue its extraordinary afterlife.
If Season 4 of Mad Men has been about anything, it has been about addiction. Cigarettes, alcohol, and sex appeared this season no longer bathed in the retrospective glow of nostalgia, but as vice, pure and simple, starting with Don’s masochistic sex with a prostitute in the first episode, and ending with Midge’s heroin addiction. As with so much television since the 1990s, and in the realist novel before that, smoking and drinking are used only to show weakness of character, a man (or woman) out of control. Pete, for example, who has been a much more upstanding citizen this season, doesn’t smoke and barely drinks, and the same goes for Henry Francis, as he holds his moral high ground against Betty’s fits of rage. “You need a drink? What are you, a wino? You need a drink?” he snarls during a memorable car ride home (“The Summer Man,” 4.8). No wonder Don vomits twice during this season: his very being is rejecting the thing he has become, while addiction itself is linked to cancer that eats away at you from the inside (another theme that begins with Anna Draper’s illness and ends with a possible anti-smoking campaign for the American Cancer Society).
But what, as Avital Ronell asks in Crack Wars, do we hold against the drug addict?
… that he cuts himself off from the world, in exile from reality, far from objective reality and the real life of the city and the community; that he escapes into a world of simulacrum and fiction. We disapprove of hallucinations. . . . We cannot abide the fact that his is a pleasure taken in an experience without truth.
Ex-stasis, going beyond/outside of yourself, the “high” of transgression. For pleasure to be what it is, says Ronell, it has to exceed a limit of what is altogether wholesome and healthy. Otherwise, “it’s something like contentedness, which can be shown to be in fact an abandonment of pleasure.”
Fred Pearce reviews Laurence C. Smith's The World in 2050, in Seed:
The World in 2050 is the best new geography book of the year. If that sounds underwhelming, it shouldn’t. Geography is the new hot discipline. A new generation of geographers is integrating the myriad concerns of the world, whether economic or political, social or environmental.
They are making sense of the globalisation of economics and environmental concerns in a way potentially as important as the cartographers of the middle ages. They are charting our limits and firing our imaginations. In this “thought experiment” into what kind of a world we may face just 40 years hence, Larry Smith shows the power of geography superbly with some literary ability.
To set the scene, he offers four global forces that will shape the coming decades. The first is escalating human demand on diminishing global resources, from water to oil to food. Smith skilfully sums up the global revolution created by the widespread use of fossil fuels in a sentence. “Packed inside a single barrel of oil is about the same amount of energy as would be produced from eight years of labour by an average-sized man.”
And he poses a central ethical dilemma with similar pithiness, asking: “What if you could play God and do the noble, ethically fair thing by converting the entire developing world’s level of material consumption to that now carried out by North Americans. By merely snapping your fingers you could eliminate this misery. Would you?” He adds: “I sure hope not.”
Then there is demography. He looks forward to the completion of the “demographic transition” and the end of population growth, but wonders how close a stable population may be.
Mark Blyth over at Triple Crisis:
In my first Triple Crisis piece I wrote about the thesis of John Quiggin’s new book concerning zombie economic ideas. Lead zombie of the moment is the idea of fiscal austerity as the way out of the crisis, despite oodles of evidence to the contrary. In short, we need to cut budgets to restore fiscal sanity, and we know that this is the way forward since small open economies in the 1980s (Ireland, Belgium, Denmark) that cut their budgets still grew. The economic (ir)rationale for this has been pointed out by Krugman, Stiglitz, and others. But for me the most interesting, and most tragic, part of this story is the distributional consequences of these policies, and the politics that they engender.
The first problem with such a policy is that if it works at all, it only works when everyone else is growing. If everyone else shrinks at the same time then what is individually rational becomes collectively disastrous, and viciously zero-sum. The second problem, the distributional one, is who pays for this debt crisis? The answer is ‘not those who made the mess in the first place’ – namely, finance. Instead, the double ‘put’ (quite literally) is on those who can afford it least, lower income taxpayers and consumers: once in the form of the bailouts, lost revenue, and lost growth, and now twice in the form of the fiscal consolidation (zombie-slashed public services) needed to pay back the debt generated from the bailout.
It is in this context that the much-anticipated budget cuts of the British government announced last week come to the fore. Britain has embarked upon a giant natural experiment to settle the stimulus versus austerity debate once and for all by plumping for austerity, and on a truly epic scale.
As Reinhart and Rogoff remind us, approximately eighty percent of the time you have a banking crisis it will be followed by a sovereign debt crisis. As the public sector levers up to compensate for the fall in private spending, deficits are generated and new debt issues become a necessity. The UK economy was hit harder than many of its European peers when finance imploded because a full quarter of all British tax receipts came from the financial sector. This, plus the effect of the British economy’s automatic stabilizers, resulted in a budget deficit of 10.1 percent of GDP by 2011, with British government debt issues rising to 58.5 percent of GDP to plug these gaps. This ‘death spiral,’ so the argument of the British government goes, has to be reversed since ever-increasing debts will lead to ever-increasing interest payments, eventually turning Britain into Greece. To avoid this the proposed sacrifice is a $128 billion reduction in public spending over four years, which it is hoped will reduce the budget deficit from 10.1 percent of GDP to 2.1 percent by 2014. Virtue, it seems, favors the bold.
Roger Boylan reviews The Autobiography of Mark Twain: Volume I, in the Boston Review:
The posthumous career of Samuel Langhorne Clemens, better known as Mark Twain, has been a busy one.
According to the staff of the University of California’s Mark Twain Project, more than 5,000 previously unknown letters of Twain’s have surfaced in the last 50 years. This represents an average of two new letters per week, but still only about one-tenth of the 50,000 or so he is believed to have written. Two of his best-known works were published after his death: the iconoclastic Letters from the Earth, in which a not-yet-fallen Satan, on a fact-finding trip to Earth, analyzes the follies of the human race in a series of letters to his fellow angels (“Now my kids can learn how to be good atheists!” a friend of mine exclaimed); and the bizarre supernatural fantasy No. 44, The Mysterious Stranger, set in a medieval Austria no better prepared than Twain’s America to deal with the harsh truth about humanity, as expounded by an otherworldly visitor calling himself “No. 44.” The former was released in 1962, the latter in 1982 (a fraudulent version appeared in 1916).
These repeated encores would neither have displeased nor surprised Twain, who approved of the idea of withholding publication until after death. “I will leave it behind,” he said of one of his unpublished writings, “and utter it from the grave. There is free speech there, and no harm to the family.” He said substantially the same thing about his autobiography, decreeing that it remain unpublished until a hundred years after he died:
A book that is not to be published for a century gives the writer a freedom which he could secure in no other way. In these conditions you can draw a man without prejudice exactly as you knew him and yet have no fear of hurting his feelings or those of his sons or grandsons.
Twain was also deeply fond of his own celebrity, which, by delaying publication, he sought to extend into the future. And the first volume of his Autobiography, with two more to come, is now rolling off the presses, as requested, a hundred years after the occasion when news of his death was not exaggerated. Volume I is divided into two main sections: “Preliminary Manuscripts and Dictations, 1870–1905,” consisting of autobiographical jottings and early odds and ends, and the “Autobiography” proper, starting in 1906, when Twain began to dictate his reminiscences to a stenographer. Some of the earlier pieces were published in his lifetime; in 1906 the North American Review printed excerpts, titled “Chapters from My Autobiography,” to generally favorable critical reception.
But beset by doubts and chronically depressed—for good reason, having gone bankrupt or nearly bankrupt twice, and having lost a son, a daughter, and a wife over the years (the second of his three daughters, Jean, would die in 1909)—Twain continued to blow hot and cold on the whole idea of writing an autobiography.
Andrew Gelman over at Statistical Modeling, Causal Inference, and Social Science:
Nate Silver and Justin Wolfers are having a friendly blog-dispute about momentum in political polling. Nate and Justin each make good points but are also missing parts of the picture. These questions relate to my own research so I thought I'd discuss them here.
There ain't no mo'
Nate led off the discussion by writing that pundits are always talking about “momentum” in the polls:
Turn on the news or read through much of the analysis put out by some of our friends, and you're likely to hear a lot of talk about “momentum”: the term is used about 60 times per day by major media outlets in conjunction with articles about polling.
When people say a particular candidate has momentum, what they are implying is that present trends are likely to perpetuate themselves into the future. Say, for instance, that a candidate trailed by 10 points in a poll three weeks ago — and now a new poll comes out showing the candidate down by just 5 points. It will frequently be said that this candidate “has the momentum”, “is gaining ground,” “is closing his deficit,” or something similar.
Each of these phrases is in the present tense. They create the impression that — if the candidate has gone from being 10 points down to 5 points down, then by next week, he'll have closed his deficit further: perhaps he'll even be ahead!
But, as Nate points out, this ain't actually happening:
Say that a candidate has improved her position in the polls from August to September. Is her position more likely than not to improve further from September to October? . . . this is not what we see at all . . . Sometimes, a candidate who has gained ground in the polls continues to do so; otherwise, the trend reverses itself, or the race simply flatlines. . . . There is also no sign of momentum when we look at the change in polling between other periods. . . . In general elections, the direction in which polls have moved is not predictive of the direction in which they will move. [italics added]
I like Nate's analysis. It's very much in the Bill James style, but with graphs.
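Nate's test can be sketched in a few lines of code. The sketch below uses made-up poll margins (the real analysis draws on historical general-election polling, which is not reproduced here): for each race, it asks whether the direction of the August-to-September change matches the direction of the September-to-October change. "Momentum" would mean the directions match well more than half the time.

```python
# Hypothetical races: the candidate's margin (in points) at three
# successive dates: (August, September, October).
races = [
    (-10, -5, -6),   # gained ground, then slipped back
    (-10, -5, -3),   # gained ground and kept gaining
    (4, 7, 7),       # gained ground, then flatlined
    (2, -1, 3),      # lost ground, then reversed
]

same_direction = 0
comparable = 0
for aug, sep, octm in races:
    first = sep - aug      # change over the earlier period
    second = octm - sep    # change over the later period
    if first == 0 or second == 0:
        continue  # a flat period has no direction to compare
    comparable += 1
    if (first > 0) == (second > 0):
        same_direction += 1

# Momentum would show up as same_direction well above half of the
# comparable races; Nate's point is that in real data it is not.
print(same_direction, comparable)
```

With these invented numbers only one of three comparable races continues in the same direction, which is the "no momentum" pattern Nate describes; the variable names and the toy margins are my own illustration, not his data.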
Consider the time scale
Enter Justin Wolfers, who writes that Nate is all wrong, that there is momentum in political polling.
Justin argues that Nate made a mistake by using the same data in his “before” and “after” comparison.
“Liberal principles,” declares Milbank, “will always ensure that the rights of the individual override those of the group.” For this reason, he concludes, “liberalism cannot defend corporate religious freedom.” The neutrality liberalism proclaims “is itself entirely secular” (it brackets belief; that’s what it means by neutrality) and is therefore “unable to accord the religious perspective [the] equal protection” it rhetorically promises. Religious rights “can only be effectively defended pursuant to a specific and distinctly religious framework.” Liberal universalism, with its superficial respect for everyone (as long as everyone is superficial) and its deep respect for no one, can’t do it. If that is so, then the other contributors to this volume are whistling “Dixie,” at least with respect to the hope declared by Rawls that liberalism in some political form might be able to do justice to the strongly religious citizens of a liberal state. Milbank’s fellow essayists cannot negotiate or remove the impasse he delineates, but what they can do, and do do with considerable ingenuity and admirable tact, is find ways of blunting and perhaps muffling the conflict between secular and religious imperatives, a conflict that cannot (if Milbank is right, and I think he is) be resolved on the level of theory, but which can perhaps be kept at bay by the ad-hoc, opportunistic, local and stop-gap strategies that are at the heart of politics.
more from Stanley Fish at The Opinionator here.
The ethico-political preference for a democratic model in which parties are – formally, at least – subordinate to state mechanisms falls into the trap of the ‘democratic fiction’. It ignores the fact that, in a ‘free’ society, domination and servitude are located in the ‘apolitical’ economic sphere of property and managerial power. The Party’s distance from state apparatuses and its ability to act without legal constraint afford a unique possibility: ‘illegal’ activity can be undertaken not only in the interest of the market but – sometimes – in the interest of the workers too. For example, when the 2008 financial crisis hit China, the instinctive reaction of the Chinese banks was to follow the cautious approach of Western banks, radically cutting back on lending to companies wishing to expand. Informally (no law legitimised this), the Party simply ordered the banks to release credit, and thus succeeded – for the time being – in sustaining the growth of the Chinese economy. To take another example, Western governments complain that their industries cannot compete with the Chinese in producing green technology, since Chinese companies get financial support from their government. But what’s wrong with that? Why doesn’t the West simply follow China and do the same?
more from Slavoj Žižek at the LRB here.