No Fire Zone – Trailer from Zoe Sale on Vimeo.
[Thanks to Wolf Böwig.]
Mohsin Hamid in The Guardian:
In 2007, six years after the terrorist attacks of 11 September 2001, I was travelling through Europe and North America. I had just published a novel, The Reluctant Fundamentalist, and as I travelled I was struck by the large number of interviewers and of audience members at Q&As who spoke of Islam as a monolithic thing, as if Islam referred to a self-contained and clearly defined world, a sort of Microsoft Windows, obviously different from, and considerably incompatible with, the Apple OS X-like operating system of “the west”.
I recall one reading in Germany in particular. Again and again, people posed queries relating to how “we Europeans” see things, in contrast to how “you Muslims” do. Eventually I was so exasperated that I pulled my British passport out of my jacket and started waving it around my head. “While it's true the UK hasn't yet joined the eurozone,” I said, “I hope we can all agree the country is in fact in Europe.”
Six years on, a film inspired by the novel is in the process of appearing on screens around the world, and I am pleased to report that those sorts of questions are a little rarer now than they were in 2007. This represents progress. But it is modest progress, for the sense of Islam as a monolith lingers, in places both expected and unexpected.
More here.
Justin Sonnenburg, a microbiologist at Stanford, suggests that we would do well to begin regarding the human body as “an elaborate vessel optimized for the growth and spread of our microbial inhabitants.” This humbling new way of thinking about the self has large implications for human and microbial health, which turn out to be inextricably linked. Disorders in our internal ecosystem — a loss of diversity, say, or a proliferation of the “wrong” kind of microbes — may predispose us to obesity and a whole range of chronic diseases, as well as some infections. “Fecal transplants,” which involve installing a healthy person’s microbiota into a sick person’s gut, have been shown to effectively treat an antibiotic-resistant intestinal pathogen named C. difficile, which kills 14,000 Americans each year. (Researchers use the word “microbiota” to refer to all the microbes in a community and “microbiome” to refer to their collective genes.) We’ve known for a few years that obese mice transplanted with the intestinal community of lean mice lose weight and vice versa. (We don’t know why.) A similar experiment was performed recently on humans by researchers in the Netherlands: when the contents of a lean donor’s microbiota were transferred to the guts of male patients with metabolic syndrome, the researchers found striking improvements in the recipients’ sensitivity to insulin, an important marker for metabolic health. Somehow, the gut microbes were influencing the patients’ metabolisms.
more from Michael Pollan at the NY Times Magazine here.
According to genetics, there is not much that makes us human; depending on how you count, we share 98.5 per cent of our genes with chimpanzees. Perhaps this is not such a significant matter, given that we also share about 60 per cent of our genes with tomatoes. As this shows, human beings are fully part of nature, and the elements that make us make not just the rest of the animal and vegetable kingdoms, but the rocks beneath our feet and the stars in the sky above us. So what does make us human? It is not that we live in social groups: ants, antelopes and sparrows do the same. It is not that we have nuanced emotional lives: so do dogs and baboons. It is not even that we have language, for other things – including trees, as it happens – have communication systems, too, and it might be that some of those systems are quite complex, as appears to be the case with dolphins, for example. But in the human case the system of communication – language – is particularly complex and flexible, with great expressive power, and this makes possible the phenomenon of culture. If I were to pick one thing that separates humanity from the rest of the living world, culture is it.
more from AC Grayling at The New Statesman here.
In the summer of 1494, soon after his engagement, Albrecht Dürer made a startlingly intimate drawing of his fiancée Agnes Frey. One might have expected a twenty-three-year-old to depict his betrothed as a source of love, or comfort or well-being, all the more since her substantial dowry would soon launch his independent career. Instead, Albrecht showed Agnes twisted up in a knot of anxious introversion. She looks withdrawn and preoccupied, and the circles under her heavy-lidded eyes may even make one think she has been crying. In its frank portrayal of an informal moment of unguarded emotion, there had never been a drawing quite like this before. Typically portraiture was honorific and meant to represent the exemplary virtues of the person shown; Dürer instead often sought to capture the idiosyncratic and psychological characteristics of the people he portrayed. He was fascinated with the close scrutiny of dark and brooding emotion. This is especially evident in his self-portraits, many of which show him in states of melancholy, doubt, or disease.
more from Andrew Butterfield at the NYRB here.
From Kurzweil AI:
A world-first clinical study completed by an Australian team has found Kava, a medicinal South Pacific plant, significantly reduced the symptoms of people suffering from anxiety. The study, led by the University of Melbourne, revealed Kava could be an alternative to pharmaceutical products for the hundreds of thousands of Australians who suffer from generalized anxiety disorder (GAD). “In this study we’ve been able to show that Kava offers a potential natural alternative for the treatment of chronic clinical anxiety; unlike some other options, it has less risk of dependency and less potential for side effects,” said lead researcher Dr Jerome Sarris from the Department of Psychiatry at the University of Melbourne. The study also found that people’s genetic differences (polymorphisms) of certain neurobiological mechanisms called GABA transporters may modify their response to Kava. “If this finding is replicated, it may pave the way for simple genetic tests to determine which people may be likely to have a beneficial anxiety-reducing effect from taking Kava,” Sarris said.
An additional novel finding of the study, recently published in Phytotherapy Research, was that Kava increased women’s sex drive compared to those in the placebo group, believed to be due to the reduction of anxiety rather than any aphrodisiac effect. Future studies confirming the genetic relationship to therapeutic response, and any libido-improving effects from Kava, are now required. Dr Sarris said these significant findings are of importance to sufferers of anxiety and to the South Pacific region, which relies on Kava as a major export.
More here.
From The Atlantic:
The season finale marked the last regular SNL appearance of Seth Meyers (slated to succeed Jimmy Fallon as host of NBC's Late Night), Fred Armisen, and Bill Hader. (Jason Sudeikis's return remains uncertain.) The show sent them off with a mostly strong episode and some fitting farewell moments. Host Ben Affleck was joined by his wife, Jennifer Garner, during the monologue. Amy Poehler joined Seth Meyers for Weekend Update. Musical guest Kanye West performed “Black Skinhead” and “New Slaves.”
Picture: Seth Meyers beats out Anderson Cooper for Stefon's hand in marriage.
More here.
Richard Marshall interviews Gordon Finlayson in 3:AM Magazine:
3:AM: A key discussion in contemporary liberal theory of ethics and politics is the relationship and differences between Habermas and Rawls. Can you say something about what you take the main points of dispute are and where you stand on this?
GF: Sure. In my view, despite the amount of ink that has been spilt on Habermas and Rawls in their respective fields, relatively little attention has been paid to the dispute between them. This is largely because influential commentators and critics were quick to judge their exchange in the Journal of Philosophy a damp squib.
This was in part because expectations ran high at the time: two of the greatest social and political theorists of the 20th century, although working in different traditions (roughly, analytic political philosophy and German social theory), had engaged each other in debate. It was also because in truth neither thinker was sufficiently well apprised of the detail of the other’s theory – unsurprisingly, really, since they worked in very different traditions and each had just spent the last few years writing his own major work of political theory. Finally, everyone at the time, including the disputants themselves, was seduced by the assumption that the salient point of comparison between their respective theories was Habermas’s principle (U) and his conception of the moral standpoint, and Rawls’s argument that the principles of justice are those that would be chosen by rational and reasonable persons in the Original Position. Almost everyone who has written on Habermas and Rawls makes that particular mistake.
My take on that is straightforward. The debate between them concerns their respective political theories. It is basically a dispute between Rawls’s theory of Political Liberalism and Habermas’s Discourse Theory of Law. It is not primarily a dispute between Rawls’s A Theory of Justice and Habermas’s Discourse Ethics. Principle (U) is the central idea in Habermas’s Discourse Ethics, which is a moral theory, not a theory of law or of democratic legitimacy, while the argument from the Original Position takes a back seat in Rawls’s Political Liberalism. People who interpret the Habermas–Rawls dispute in the light of the contrast between Habermas’s principle (U) and Rawls’s Original Position are looking at the wrong thing and so miss the real points of dispute.
What people should have been asking is this. What are the central organizing ideas of their respective political theories, and on what significant points do these ideas conflict?
Mike Jay reviews Suzanne Corkin's Permanent Present Tense: The Man with No Memory, and What He Taught the World in the LRB:
Memory creates our identity, but it also exposes the illusion of a coherent self: a memory is not a thing but an act that alters and rearranges even as it retrieves. Although some of its operations can be trained to an astonishing pitch, most take place autonomously, beyond the reach of the conscious mind. As we age, it distorts and foreshortens: present experience becomes harder to impress on the mind, and the long-forgotten past seems to draw closer; University Challenge gets easier, remembering what you came downstairs for gets harder. Yet if we were somehow to freeze our memory at the youthful peak of its powers, around our late twenties, we would not create a polished version of ourselves analogous to a youthful body, but an early, scrappy draft composed of childhood memories and school-learning, barely recognisable to our older selves.
Something like this happened to the most famous case of amnesia in 20th-century science, a man known only as ‘H.M.’ until his death in 2008. When he was 27, a disastrous brain operation destroyed his ability to form new memories, and he lived for the next 55 years in a rolling thirty-second loop of awareness, a ‘permanent present tense’. During this time he was subjected to thousands of hours of tests, of which naturally he had no recall; he provided data for hundreds of scientific papers, and became the subject of a book (Memory’s Ghost by Philip Hilts) and a staple of popular science journalism; by the 1990s digital images of his uniquely disfigured hippocampus featured in almost every standard work on the neuroscience of memory. Since his death his brain has been shaved into 2401 slices, each 70 microns thick, compared in one account to the slivers of ginger served with sushi. Suzanne Corkin, an MIT neuroscientist, first met him in 1962 and after 1980 became his lead investigator and ‘sole keeper’. Permanent Present Tense is her account of Henry Gustave Molaison – his full identity can finally be revealed – and the historic contribution he made to science.
Corkin had a reputation for strict policing of access to Henry, a charge she happily concedes: ‘I did not want him to become a sideshow attraction – the man without a memory.’ After the death of his mother, his last thirty years were spent at a Connecticut nursing home in strict anonymity, with staff sworn to secrecy and filming prohibited. More than a hundred carefully screened researchers were admitted over the years to perform brain scans and cognitive tests, but were never told his name. Corkin’s lucid, well-organised telling of Henry’s story merges intimate case history with an account of the current scientific understanding and how it was reached.
Christopher Benfey in the NYRB's blog:
Spring should be a time of portents and premonitions, winged harbingers (“I dreaded that first Robin, so,” as Emily Dickinson put it with characteristic ambivalence) and new beginnings.
This thought struck me as I read Megan Marshall’s sympathetic new biography of Margaret Fuller, which opens with a familiar phrase from Virgil’s Aeneid, one that inspired an essay Fuller wrote during her precocious childhood. Possunt, quia posse videntur means, roughly, “They can because they think they can,” and describes a team of rowers who, according to Marshall, “will themselves to win a race.” The phrase, which Fuller thought demonstrated “confidence in the future,” gives Marshall an overarching theme for Fuller’s fiercely driven life.
But Fuller also made use of the Aeneid when she was less confident of the future. She was known to perform the ancient form of divination in which a passage of Virgil selected at random is assumed to reveal what lies ahead. Sir Philip Sidney described the practice, with a dash of skepticism, in his Defence of Poesy:
And so far were they carried into the admiration thereof, that they thought in the chanceable hitting upon any such verses great fore-tokens of their following fortunes were placed; whereupon grew the word of Sortes Virgilianae, when by sudden opening Virgil’s book they lighted upon some verse of his making.
she peers, stunned, from cell 22
that such dumb minuteness
can shake the earth
Enda O'Doherty in Eurozine:
Germans have featured prominently among those who have sometimes had difficulty in believing that their native tongue is quite up to the mark, or, as we say in our barbarous contemporary jargon, fit for purpose. The German invention of printing in the mid-fifteenth century was certainly to give a boost to the prestige of vernacular languages (at the expense of the universal language, Latin). It was also to be important in spreading the new religion, Protestantism. Martin Luther enthused:
Printing is God's most recent gift and his greatest. Through it, in effect, God wishes to make known the true religion to the whole world, right to the extremities of the Earth.
And so it came to pass. But the Whiggish Protestantism (still alive in the popular cultural histories of Lord Bragg, formerly Melvyn) which celebrates the unstoppable spread of the Word, to be read and chewed over by the individual in private – an improving substitute for the “nonsense” mumbled by the priest in an incomprehensible language – tends to forget that in the short term virtually no one could read, whereas all could see and grasp the meaning of the wall paintings, statues and altarpieces in the church, which the Protestants for the most part were so keen to efface or destroy. The short term in this context, we should remember, was rather long. Mass literacy came to England only in the nineteenth century. Luther, however, after his initial enthusiasm, seems to have had second thoughts about the wisdom of translating the Bible into German and making it available to everyone, or everyone who could read (he was to find, disturbingly, that they disagreed with him about what it meant).
John B. Thompson reviews Stephen R. Platt's Autumn in the Heavenly Kingdom: China, the West, and the Epic Story of the Taiping Civil War and Tobie Meyer-Fong's What Remains: Coming to Terms with Civil War in 19th Century China in the LA Review of Books:
The Taiping Civil War (1850–1864) started with a dream. Hong Xiuquan, a young scholar from Guangdong, a province in southern China, aspired to the government position and the unassailable status guaranteed by success in imperial civil service examinations. However, in 1837, Hong flunked the provincial-level examination in Canton, the province’s major city, for the third time and returned home broken. He collapsed into episodic trances in which he traveled to a heavenly realm and met an old man in a black dragon robe. The man, whom Hong understood to be his “father,” stood grieving at the edge of heaven, dismayed by the people of his creation who had been led astray by demons. He dispatched Hong to earth, along with a middle-aged man identified as Hong’s “elder brother,” to slay these devils.
Until 1843, Hong had no vocabulary to explain his visions. That year, he rediscovered a collection of Bible passages he had obtained in Canton years before, and the meaning of his visions became clear: his heavenly father was God. His elder brother was Jesus. The demons were China’s false idols and Hong was China’s savior. Hong immediately began to preach his vision along with the New Testament in the mountains of southern China and quickly amassed a growing following among the farmers and villagers.
Over time, Hong resolved to establish on earth the kingdom he had seen in heaven. He redefined the demons from the idols of China’s cultural inheritance to the alien Manchu rulers of the Qing Dynasty. “God had divided the kingdoms of the world […] just as a father divides his estates among his sons,” Hong said. “Why should these Manchus forcibly enter China and rob their brothers of their estate?” In 1850, Hong and his Society of God Worshippers openly rebelled against Qing authorities. In 1851, Hong formally declared the existence of the Taiping Heavenly Kingdom with himself as Heavenly King. By 1853, his resourceful, ever-growing army had captured the old Ming Dynasty capital of Nanjing. From that point until the end of the civil war, there were effectively two states within China.
Claude S. Fischer in Boston Review:
Some observers respond to questions raised by the Flynn Effect by dismissing intelligence testing as an exercise in cultural domination. This ostrich-like response ignores the fact that IQ scores, whatever they measure, consistently correlate with important outcomes such as how well people perform their jobs and how long they live. Such dismissal also ignores the growing evidence that there is a physical, neurological basis to cognition and cognitive skills.
A more serious critique of the research attacks the definition of intelligence. Researchers in the intelligence field define it as a general capability to reason, understand complex ideas, think abstractly, and solve problems. You can measure it, they argue, using IQ tests. Critics consider these tests to be superficial and argue that they ignore other kinds of intelligence such as emotional intelligence or deeper traits such as wisdom. While researchers cannot track historical trends in wisdom, they are trying to wise up about the apparent historical increase in IQ.
One might suspect that the tests have gotten easier. They haven’t. In fact, the tests have gotten harder in order to keep the average IQ at one hundred. By reversing that process, Flynn showed the long-term rise in real performance.
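The renorming-and-reversal logic Flynn used can be sketched numerically. This is a minimal illustration with made-up numbers, not Flynn's actual data: each cohort's test is rescaled so that its own mean is 100, which hides the rise; rescoring every cohort against one fixed baseline recovers it.

```python
# Hypothetical mean raw test scores per decade (arbitrary units, invented
# for illustration only).
raw_means = {1950: 40.0, 1980: 49.0, 2010: 58.0}
BASELINE_YEAR = 1950
POINTS_PER_RAW_UNIT = 1.0  # assumed conversion: 1 raw unit ~ 1 IQ point

for year, raw in sorted(raw_means.items()):
    # Re-normed score: by construction, every cohort averages 100 on its
    # own contemporary test, so the rise is invisible.
    renormed_iq = 100.0
    # Score against the fixed 1950 norms instead, and the rise reappears.
    baseline_iq = 100.0 + (raw - raw_means[BASELINE_YEAR]) * POINTS_PER_RAW_UNIT
    print(year, renormed_iq, baseline_iq)
```

Under these invented numbers, the re-normed column stays flat at 100 while the fixed-baseline column climbs from 100 to 118, which is the shape of the effect Flynn documented.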
Other challengers argue that we are not really smarter than our great-grandparents; it’s just that people today learn the answers to test questions in school or have become familiar with testing. However, scores on the parts of tests that are most easily taught and are the most culture-laden—say, recognizing vocabulary or knowing geography—have not changed much. Scores on those parts of tests that measure the most abstract, presumably culture-free thinking—say, drawing logical inferences from patterns in designs—have risen the most. The sorts of thinking that are supposedly most detached from classroom and cultural learning are the ones that have really improved.
So if a real increase in some kind of cognitive ability is under way, the question is why.
A conversation with Lee Smolin in Edge:
The main question I'm asking myself, the question that puts everything together, is how to do cosmology; how to make a theory of the universe as a whole system. This is said to be the golden age of cosmology and it is from an observational point of view, but from a theoretical point of view it's almost a disaster. It's crazy the kind of ideas that we find ourselves thinking about. And I find myself wanting to go back to basics—to basic ideas and basic principles—and understand how we describe the world in a physical theory.
What's the role of mathematics? Why does mathematics come into physics? What's the nature of time? These two things are very related, since mathematical description is supposed to be outside of time. And I've come, through a long evolution since the late 80s, to a position which is quite different from the ones that I had originally, and quite surprising even to me. But let me get to it bit by bit. Let me build up the questions and the problems that arise.
One way to start is what I call “physics in a box” or, theories of small isolated systems. The way we've learned to do this is to make an accounting or an itinerary—a listing of the possible states of a system. How can a possible system be? What are the possible configurations? What were the possible states? If it's a glass of Coca Cola, what are the possible positions and states of all the atoms in the glass? Once we know that, we ask, how do the states change? And the metaphor here—which comes from atomism that comes from Democritus and Lucretius—is that physics is nothing but atoms moving in a void and the atoms never change.
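The "physics in a box" recipe Smolin describes has two steps: enumerate the possible states of a small isolated system, then specify a law for how states change. A toy sketch of that recipe, using three two-state "atoms" and an invented flip rule purely for illustration:

```python
from itertools import product

# Step 1: list the complete state space of a tiny isolated system --
# three "atoms", each either 0 or 1, giving 2**3 = 8 configurations.
states = list(product([0, 1], repeat=3))
print(len(states))  # 8

def step(state):
    """Step 2: a simple deterministic law of motion (invented for this
    sketch): each atom flips if its right-hand neighbour, cyclically,
    is in state 1."""
    n = len(state)
    return tuple(s ^ state[(i + 1) % n] for i, s in enumerate(state))

# Evolve one configuration a few ticks of time.
s = (1, 0, 0)
for _ in range(3):
    s = step(s)
print(s)
```

The point of the toy is the structure, not the rule: once the state space and the update law are fixed, everything about the system's behaviour follows, which is exactly the accounting Smolin says we have learned to do for systems in a box.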