On Miklós Bánffy’s Transylvanian Trilogy

Matt Seidel at The Millions:

These and other sketches of Transylvanians gone wild demonstrate a benign ridiculousness, but Bánffy also sees the corrosive effects of comedy. (Tellingly, one of the novel’s villains, Pal Uzdy, occasionally bursts out in strange, meaningless laughter.) When a newly appointed Prefect is pelted with eggs in Parliament, Abady laughs along with the others before being overcome with sadness: “He thought only of the fact that an innocent man had been humiliated, and that it was callous and distasteful that everyone should think the whole affair a tremendous joke and nothing more.”

His Hungarian colleagues think most everything is a tremendous joke, a quality directly related to their failure to take the gathering international storm seriously:

The sad truth was that all of them found anything that did not concern their own country fit only for mockery and laughter. To them such matters were as remote from reality as if they had been happening on Mars; and therefore fit only for schoolboy puns and witty riposte.

Abady mistrusts his countrymen’s love of the comic as a form of irresponsibility.

more here.

How To Think Like A Neandertal

Eva McGuire at Dublin Review of Books:

One of the most famous Neandertal individuals, and the most complete Neandertal skeleton that has been found, is Shanidar 1, who lived and died in Iraqi Kurdistan about fifty thousand years ago. Shanidar 1 was male, between thirty and forty when he died, about 5’8” tall. His skeleton also reveals the many physical traumas he suffered during his life. His right arm had been injured beyond use or else amputated above the elbow many years prior to death. Several bones in his right foot had been badly broken leading to serious arthritic degeneration in his right ankle and knee, while his left leg, knee and foot were normal. He had received a devastating blow to the left side of his face which had crushed his left cheekbone, the left side of his cranium and probably blinded him in his left eye. This facial trauma had also healed many years before his death. He had received a separate wound to his right scalp, deep enough to cut the bone. This too had healed before his death. Whether these injuries occurred due to a single incident or separate events is not known. They may have been the result of a hunting accident or have been due to a violent interaction with another Neandertal. After all, interpersonal violence is a characteristic not only of modern humans but of many non-human primates, including chimps, our closest cousins. While his injuries demonstrate that Shanidar 1 had a very tough life, perhaps what is most significant about them is the fact that he survived them. He must have been cared for and this is a huge clue to the Neandertal mind. He would have, for some time at least, been incapacitated, unable to take part in hunting and unable to care for or feed himself.

more here.

Has wealth made Qatar happy?

Matthew Teller in BBC News Magazine:

From desperate poverty less than a century ago, this, after all, has become the richest nation in the world, with an average per-capita income topping $100,000 (£60,000).

What's less well understood is the impact of such rapid change on Qatari society itself.

You can feel the pressure in Doha. The city is a building site, with whole districts either under construction or being demolished for redevelopment. Constantly snarled traffic adds hours to the working week, fuelling stress and impatience.

Local media report that 40% of Qatari marriages now end in divorce. More than two-thirds of Qataris, adults and children, are obese.

Qataris benefit from free education, free healthcare, job guarantees, grants for housing, even free water and electricity, but abundance has created its own problems.

“It's bewildering for students to graduate and be faced with 20 job offers,” one academic at an American university campus in Qatar tells me. “People feel an overwhelming pressure to make the right decision.”

In a society where Qataris are outnumbered roughly seven-to-one by expatriates, long-term residents speak of a growing frustration among graduates that they are being fobbed off with sinecures while the most satisfying jobs go to foreigners.

The sense is deepening that, in the rush for development, something important has been lost.

Read the rest here.

The Continuing Evolution of Genes

Carl Zimmer in The New York Times:

Each of us carries just over 20,000 genes that encode everything from the keratin in our hair down to the muscle fibers in our toes. It’s no great mystery where our own genes came from: our parents bequeathed them to us. And our parents, in turn, got their genes from their parents. But where along that genealogical line did each of those 20,000 protein-coding genes get its start? That question has hung over the science of genetics ever since its dawn a century ago. “It’s a basic question of life: how evolution generates novelty,” said Diethard Tautz of the Max Planck Institute for Evolutionary Biology in Plön, Germany. New studies are now bringing the answer into focus. Some of our genes are immensely old, perhaps dating all the way back to the earliest chapters of life on earth.

But a surprising number of genes emerged more recently — many in just the past few million years. The youngest evolved after our own species broke off from our cousins, the apes. Scientists are finding that new genes come into being at an unexpectedly fast clip. And once they evolve, they can quickly take on essential functions. Investigating how new genes become so important may help scientists understand the role they may play in diseases like cancer.

More here.

Sunday, April 27, 2014

A Fundamental Theory to Model the Mind

Jennifer Ouellette in Quanta:

In 1999, the Danish physicist Per Bak proclaimed to a group of neuroscientists that it had taken him only 10 minutes to determine where the field had gone wrong. Perhaps the brain was less complicated than they thought, he said. Perhaps, he said, the brain worked on the same fundamental principles as a simple sand pile, in which avalanches of various sizes help keep the entire system stable overall — a process he dubbed “self-organized criticality.”
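
(A quick aside, not from the article: the sand pile Bak had in mind is usually formalised as the Bak–Tang–Wiesenfeld model, and a few lines of code are enough to watch it self-organise. The sketch below is purely illustrative — the grid size, toppling threshold and number of grains dropped are arbitrary choices of ours, not anything Bak or Quanta specifies — but it shows the essential behaviour: add one grain at a time, let overloaded sites topple onto their neighbours, and the resulting avalanches range from single topplings to sprawling cascades while the pile as a whole hovers near its critical state.)

```python
# Minimal, illustrative Bak-Tang-Wiesenfeld sandpile (parameters are arbitrary).
import random
import numpy as np

SIZE = 50         # side length of the square grid
THRESHOLD = 4     # a site topples once it holds this many grains

def drop_grain(grid):
    """Drop one grain at a random site, relax the pile, return the avalanche size."""
    x, y = random.randrange(SIZE), random.randrange(SIZE)
    grid[x, y] += 1
    avalanche = 0
    unstable = [(x, y)]
    while unstable:
        i, j = unstable.pop()
        if grid[i, j] < THRESHOLD:
            continue
        grid[i, j] -= THRESHOLD
        avalanche += 1
        # one grain goes to each neighbour; grains that fall off the edge leave the system
        for ni, nj in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
            if 0 <= ni < SIZE and 0 <= nj < SIZE:
                grid[ni, nj] += 1
                if grid[ni, nj] >= THRESHOLD:
                    unstable.append((ni, nj))
    return avalanche

grid = np.zeros((SIZE, SIZE), dtype=int)
sizes = [drop_grain(grid) for _ in range(20000)]
# once the pile has self-organised, avalanche sizes span many orders of magnitude:
# mostly tiny slides, punctuated by rare, very large cascades
print("largest avalanche:", max(sizes), "topplings")
```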

As much as scientists in other fields adore outspoken, know-it-all physicists, Bak’s audacious idea — that the brain’s ordered complexity and thinking ability arise spontaneously from the disordered electrical activity of neurons — did not meet with immediate acceptance.

But over time, in fits and starts, Bak’s radical argument has grown into a legitimate scientific discipline. Now, about 150 scientists worldwide investigate so-called “critical” phenomena in the brain, the topic of at least three focused workshops in 2013 alone. Add the ongoing efforts to found a journal devoted to such studies, and you have all the hallmarks of a field moving from the fringes of disciplinary boundaries to the mainstream.

More here.

The Possibility of Self-Sacrifice

Oded Na’aman in Boston Review:

Normally, death is present in our lives as an ending-yet-to-arrive. For most of us, Simone Weil writes, “Death appears as a limit set in advance on the future.” We make plans, pursue goals, navigate relationships—all under the condition of death. We lead our lives under the condition of death; our actions are shaped by it as a surface is shaped by its boundaries.

However, as we approach this boundary, when our end is present, we are nothing but terror. All pursuits disintegrate, and our self-understanding collapses. At once we are expelled from the sphere of meaning. We are nothing more than this body. This body and its last breath. It is not simply that we cannot survive our own death; we cannot bear the sight of it. We do not want to die. Not now.

And yet the possibility of self-sacrifice suggests that this terror can be overcome, that death can be meaningful. One recent example is that of Mohamed Bouazizi, the Tunisian street vendor who set himself on fire in December 2010 and whose death put in motion the massive uprising known as the Arab Spring. But there are many less noted acts of self-sacrifice. In different places and moments in time, in different languages and cultures, soldiers, activists, lovers, friends, and parents exhibit a willingness to die that demands our attention.

Such acts, so difficult to comprehend, may seem at first sight unworthy of serious consideration. But rushing to this conclusion would be a mistake. It is not only that by dismissing acts of self-sacrifice as unintelligible we disavow a prevalent and influential human phenomenon. Understanding these acts may also shed light on the way we value things more generally. Indeed, we will see that even if most of us will never actually take such extreme measures, the possibility of self-sacrifice is part of living a meaningful life.

More here.

Ants Swarm Like Brains Think

Carrie Arnold in Nautilus:

Each of the brain’s 86 billion neurons can be connected to many thousands of others. When a neuron fires, it sends a signal to nearby neurons that changes the probability that they will also fire. Some neurons are excitatory, and increase the chances that other neurons will fire. Others are inhibitory, and reduce this chance. A combination of inputs from a given neuron’s neighbors will determine if it fires. If two neurons make each other fire often, the synapse between them (a small gap across which chemical or electrical signals are passed) will strengthen, so that they can more readily provide feedback to each other.

“This is where you get the saying that ‘Neurons that fire together, wire together,’ ” says Dmitri Chklovskii, a neuroscientist at the Howard Hughes Medical Institute. But what is often not appreciated about this adage is that wiring also requires the second neuron to send a message to the first that it, too, has fired.

“The only way the upstream neuron knows that the second neuron fired is that it produces a feedback spike. This helps the synapse make the decision to get stronger,” Chklovskii says. Feedback is where the similarity with ants begins. “Feedback loops are everywhere on every level. They allow the system to realize that what it used to be doing isn’t working any more, and to try something new.”
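
(For readers who want the mechanism spelled out, here is a tiny, hypothetical Python sketch — ours, not anything from the article — of a single model neuron: it sums weighted excitatory and inhibitory inputs, fires probabilistically, and applies the “fire together, wire together” update only when an input was active and the neuron itself fired, which is the coincidence requirement Chklovskii describes. Real synaptic plasticity also involves weakening and precise spike timing, all omitted here.)

```python
# Toy Hebbian neuron (illustrative only; all parameters are made up).
import numpy as np

rng = np.random.default_rng(0)
n_inputs = 10
weights = rng.normal(0.0, 0.5, n_inputs)   # positive = excitatory input, negative = inhibitory
learning_rate = 0.05

def step(input_spikes):
    """Integrate one time step of inputs, maybe fire, then update the synapses."""
    drive = weights @ input_spikes                       # net excitation minus inhibition
    fired = rng.random() < 1.0 / (1.0 + np.exp(-drive))  # more drive -> more likely to fire
    if fired:
        # strengthen only the synapses whose input was active when this neuron fired:
        # the postsynaptic "I fired too" signal is what licenses the update
        weights[:] += learning_rate * input_spikes
    return fired

for _ in range(1000):
    step((rng.random(n_inputs) < 0.3).astype(float))     # random 30%-active input pattern

print(np.round(weights, 2))   # synapses repeatedly active when the neuron fired have strengthened
```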

Both ants and brains actually rely on two types of feedback, held in a delicate balance: negative (or inhibitory) feedback, and positive (or excitatory) feedback. “Negative feedback tends to cause stability. Positive feedback tends to cause runaway behavior,” said Tomer Czaczkes, an ant biologist at the University of Regensburg in Germany. “These two simple rules make something very powerful.”

The foraging response to food is an example of a positive feedback loop, and familiar to anyone who has had a picnic ruined by a line of ants marching in single file toward their meal. But knowing when not to leave the nest and risk predation and dehydration may be just as important as knowing when to take advantage of a windfall of seeds. At low levels of an input (a small amount of food, for example) positive feedback dominates. At high levels of input, negative feedback dominates, helping to prevent runaway processes.
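
(Again purely as illustration — this is a generic, textbook-style recruitment loop, not the researchers’ model, and every parameter is invented. Positive feedback, in the form of foragers recruiting idle nestmates, dominates while the trail is small; negative feedback, in the form of a shrinking pool of idle ants and a constant giving-up rate, takes over as the trail grows, so activity settles at a stable level instead of running away.)

```python
# Illustrative recruitment dynamics with balanced positive and negative feedback.
def simulate_trail(colony_size=1000, recruit_rate=0.5, giving_up_rate=0.1, steps=200):
    """Return the number of ants on the trail at each time step (toy model)."""
    foraging = 1.0                                   # a single scout discovers the food
    history = []
    for _ in range(steps):
        idle = colony_size - foraging
        # positive feedback: each forager recruits idle nestmates, damped as the
        # pool of idle ants shrinks (one source of negative feedback)
        recruited = recruit_rate * foraging * (idle / colony_size)
        quitting = giving_up_rate * foraging         # negative feedback: foragers give up
        foraging += recruited - quitting
        history.append(foraging)
    return history

trail = simulate_trail()
print(round(trail[5]), round(trail[15]), round(trail[-1]))
# early steps: near-exponential growth; later: a stable plateau set by the feedback balance
```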

More here.

India: Censorship by the Batra Brigade

Wendy Doniger in the NYRB (photo: Kuni Takahashi/The New York Times/Redux):

In February of this year, after a long career of relative obscurity in the ivory tower, I suddenly became notorious. In 2010, Penguin India had published a book of mine, The Hindus: An Alternative History, which won two awards in India: in 2012, the Ramnath Goenka Award, and in 2013, the Colonel James Tod Award. But within months of its publication in India, a then-eighty-one-year-old retired headmaster named Dina Nath Batra, a proud member of the far-right organization Rashtriya Swayamsevak Sangh (RSS), had brought the first of a series of civil and criminal actions against the book, arguing that it violated Article 295a of the Indian Penal Code, which forbids “deliberate and malicious acts intended to outrage religious feelings of any class” of citizens.

After fighting the case for four years, Penguin India, which had recently merged with Bertelsmann, abandoned the lawsuit, agreeing to cease publishing the book. (It also agreed to pulp all remaining copies, but—as it turned out—not a single book was destroyed; all extant copies were quickly bought up from the bookstores.) When Penguin told me it was all over, I thought it was all over, and was grateful for the long run we’d had.

There wasn’t anything special about my book; Batra had been attacking other books for some time. But what was special, and unexpected, was the volume and intensity and duration of the outcry in reaction to Penguin’s action: other authors withdrew their books from Penguin, defying it to pulp them, too; people accused the publishers of cowardice for giving up without even taking the case to court, in contrast with their former courage in successfully (and at very great expense) defending Lady Chatterley’s Lover in 1960. One Bangalore law firm issued a legal notice suggesting that the Penguin logo be changed from a penguin to a chicken.

More here.

Davidson and Derrida

Richard Marshall interviews Sam Wheeler III in 3:AM Magazine:

3:AM: So firstly, what are the similarities between Davidson and Derrida you find interesting? And how is Wittgenstein part of this ‘deconstruction’ mix you develop? You say it’s taken you longer to work out how Davidson and Derrida differ? Have you got it sorted now? What are the differences?

SW: The correspondences are between notions like indeterminacy of interpretation and Derrida’s “free play” and their shared denial that there is a given. Absent a language (the logoi, the language of thought, Fregean senses) whose words, by their very nature, refer to their referents, interpreting one system of signs in terms of another allows choices. That is, without a common thing that expressions from two systems express, there is nothing to go on but what people say when. So, there is slack.

Without a given domain of objects, another kind of fixity of meaning, “referring to the same object” is unavailable. Derrida in relation to Heidegger is somewhat akin to Davidson in relation to Quine, in dropping the remaining bit of realistic metaphysics.

Derrida and Davidson go in different directions. Davidson applies his semantic and ontological views largely to topics central to analytic philosophy, with some exceptions. Derrida, of course, primarily writes about topics in continental philosophy. He is interested in interpretations that go beyond the logical/truth-conditional. He takes the denial of logoi to mean that rhetorical features of a discourse can be just as important as what we would regard as “logical.” So, many of Derrida’s interpretations of, say, Plato, rest on what we could regard as accidental features, for instance that “pharmakon” is both drug and poison.

The one figure I know they both thought about is J.L. Austin. Davidson spent several classes on Austin’s How to Do Things with Words in his 1967 class on philosophy of language. Derrida wrote Signature, Event, Context on the same text. They raise some of the same issues. Both Davidson and Derrida are suspicious of theories which ignore “marginal” cases. Davidson was particularly interested in arguing that what you can do with a sentence is not a good starting-point for semantics. Derrida and Davidson both point out that there is no way to fix a particular speech act to a sentence. Davidson notes that no sign can label something an assertion. Derrida makes a similar point about signatures.

Derrida may have read Davidson, since he did read my essay relating his views to indeterminacy of interpretation. I’m pretty sure Davidson never read Derrida. But Derrida would have very much admired Davidson’s essay on James Joyce, I think. Davidson would, if patient, have admired Derrida’s “White Mythology.”

More here.

Toward Cultural Citizenship

Jonathan Shaw in Harvard Magazine:

Bhabha, the Rothenberg professor of the humanities, who is teaching “The Art of Reading” this spring with Marquand professor of English Peter Sacks, asserts that the humanities are “the preeminent sciences of interpretation.” Whether assessing linguistic, aural, or visual evidence, “the humanities through literature, the classics, modern languages, [or]…philosophy” use interpretation to create a “whole world of associations, contexts, significations, and values.” Interpretation, he stresses, is therefore an activity that through the exercise of judgment about important works (of art, literature, music, sculpture, architecture, etc.) “creates social and cultural values. And therefore, the humanities help us to become…not just political citizens, not social citizens, not citizens in a legal sense, but cultural citizens. That is the real force of the humanities.”

Humanistic interpretation also plays an important role in coping with the outpouring of information from the digital world, Bhabha says. “As we teach our students how to interpret, that allows the flood of facts and information to be turned into knowledge. Interpretation is the mediating force that winnows through all the information” to produce and categorize knowledge.

More here.

Gabriel García Márquez: ‘His world was mine’

Salman Rushdie in The Telegraph:

Gabo lives. The extraordinary worldwide attention paid to the death of Gabriel García Márquez, and the genuine sorrow felt by readers everywhere at his passing, tells us that the books are still very much alive. Somewhere a dictatorial “patriarch” is still having his rival cooked and served up to his dinner guests on a great dish; an old colonel is waiting for a letter that never comes; a beautiful young girl is being prostituted by her heartless grandmother; and a kindlier patriarch, José Arcadio Buendía, one of the founders of the new settlement of Macondo, a man interested in science and alchemy, is declaring to his horrified wife that “the earth is round, like an orange”. We live in an age of invented, alternate worlds. Tolkien’s Middle-earth, Rowling’s Hogwarts, the dystopic universe of The Hunger Games, the places where vampires and zombies prowl: These places are having their day. Yet in spite of the vogue for fantasy fiction, in the finest of literature’s fictional microcosms there is more truth than fantasy. In William Faulkner’s Yoknapatawpha, RK Narayan’s Malgudi, and, yes, the Macondo of Gabriel García Márquez, imagination is used to enrich reality, not to escape from it.

One Hundred Years of Solitude is 47 years old now, and in spite of its colossal and enduring popularity, its style – magic realism – has largely given way, in Latin America, to other forms of narration, in part as a reaction against the sheer size of García Márquez’s achievement. The most highly regarded writer of the next generation, Roberto Bolaño, notoriously declared that magic realism “stinks”, and jeered at García Márquez’s fame, calling him “a man terribly pleased to have hobnobbed with so many Presidents and Archbishops”. It was a childish outburst, but it showed that for many Latin American writers the presence of the great colossus in their midst was more than a little burdensome. (“I have the feeling,” Carlos Fuentes once said to me, “that writers in Latin America can’t use the word ‘solitude’ anymore, because they worry that people will think it’s a reference to Gabo. And I’m afraid,” he added, mischievously, “that soon we will not be able to use the phrase ‘one hundred years’ either.”) No writer in the world has had a comparable impact in the past half-century. Ian McEwan has accurately compared his pre-eminence to that of Charles Dickens. No writer since Dickens was so widely read, and so deeply loved, as Gabriel García Márquez.

More here.

Saturday, April 26, 2014

The Art of Willem de Kooning

Jackie Wullschlager at The Financial Times:

As postwar art recedes into history, its giant figures stand out more starkly. In American painting, the status of Willem de Kooning, last of the abstract expressionists to die, in 1997, is outstripping that of everyone else.

His career was not only longer, but more unexpected and contrary, less dependent on a signature style, than those of his peers Jackson Pollock the Dripper, Barnett Newman the Zipper, or Mark Rothko, the high priest of transcendent rectangles. De Kooning is as rowdy and irreverent as a pop artist, but tuned into European modernism in ways that give depth and emotional richness. By refusing to conform to ideals of abstraction, dissolving gaps between figurative and non-figurative, he appears immediate and present within today’s non-hierarchical approaches.

Not that he is easy. A show at New York’s Gagosian Gallery last winter reignited the debate over his old-age work: is it exquisite simplicity, or the outpourings of an alcoholic, demented fool?

more here.

Akhil Sharma: ‘I feel as if I’ve shattered my youth on this book’

Nicholas Wade in The Guardian:

“It took 12-and-a-half years and I can't believe how bad that time was,” says Akhil Sharma. “I was such a different person when I began writing it that I feel as if I've shattered my youth on this book. I still find it hard to believe that it's over, and I have this constant fear that I need to go and sit at my computer.” Sharma is talking about his second novel, Family Life, which is published in the UK next week. It tells the autobiographical story of a family's emigration from India to the US in the late 1970s, and how an accident that left the elder son severely brain-damaged brought them close to collapse. The book has already been published to much acclaim in the US – “Deeply unnerving and gorgeously tender at its core” said the New York Times – matching the praise Sharma received when he emerged in the late 90s with prize-winning short stories and then a 2001 debut novel, An Obedient Father, which won the PEN/Hemingway award. But the positive response to Family Life still feels “almost as unreal as the book being done,” he says. The intervening period of silence – although he was named on Granta's 2007 list of best young American writers – has not been easy.

Just as Sharma moved to New York with his parents and elder brother, Anup, so the eight-year-old Ajay Mishra makes the same journey in Family Life. While Ajay finds the transition difficult, his elder brother Birju thrives in America, and even wins a place at a prestigious high school. But shortly before term starts, Birju hits his head on the bottom of a swimming pool and lies unconscious underwater for three minutes, resulting in catastrophic brain damage that leaves him blind and unable to communicate or move. After two years in hospital and nursing homes the family take Birju home, and the rest of Ajay's childhood is played out against a backdrop of 24-hour care, a stream of crackpot faith healers and a family increasingly defined by alcoholism, destructive tensions and lies. But while the broad facts of that story match those of Sharma and his brother Anup's own lives, finding a way to tell it most effectively proved almost insuperably difficult. From the beginning he was aware that it was “a coming-of-age story, an illness story and the story of a child's love for his parents and his brother”.

More here.

The Romantics and the Orient: What English Poetry owes to the Middle East

Samar Attar in Informed Comment:

Many times I have asked myself: how can I concentrate on the British Romantic Poets when people in Syria, the country of my birth, have been killing each other for the last four years, supposedly in the name of liberty and freedom? Thousands of men, women and children have been killed, maimed, or made homeless. Historical monuments that stood there for centuries were erased from the ground. Was I trying desperately to forget my misery and immerse myself in a mythical world? If so, I did not entirely succeed.

Ironically, even in the poetry of the Romantics, I cannot escape the haunting images of such past paroxysms of violence in my homeland. Coleridge much admired Hulagu, the brother of Kublai Khan (the Mongol Emperor of China), but Syrians remember him as a tyrant who sacked northern Syria in 1260, and attempted to destroy the last traces of civilization. Byron greatly admired Timur Lang (Tamerlane), who destroyed Aleppo and Damascus in the fourteenth century, leaving behind him mountains of skulls out of which hundreds of towers and pyramids were built. But neither of these icons of the romantics can be blamed for the current destruction. It is the Syrians themselves who are engaged in self-destruction, not, as in former times, intruders.

More here.

Time’s Arrow Traced to Quantum Source

Natalie Wolchover in Quanta:

Coffee cools, buildings crumble, eggs break and stars fizzle out in a universe that seems destined to degrade into a state of uniform drabness known as thermal equilibrium. The astronomer-philosopher Sir Arthur Eddington in 1927 cited the gradual dispersal of energy as evidence of an irreversible “arrow of time.”

But to the bafflement of generations of physicists, the arrow of time does not seem to follow from the underlying laws of physics, which work the same going forward in time as in reverse. By those laws, it seemed that if someone knew the paths of all the particles in the universe and flipped them around, energy would accumulate rather than disperse: Tepid coffee would spontaneously heat up, buildings would rise from their rubble and sunlight would slink back into the sun.

“In classical physics, we were struggling,” said Sandu Popescu, a professor of physics at the University of Bristol in the United Kingdom. “If I knew more, could I reverse the event, put together all the molecules of the egg that broke? Why am I relevant?”

Surely, he said, time’s arrow is not steered by human ignorance. And yet, since the birth of thermodynamics in the 1850s, the only known approach for calculating the spread of energy was to formulate statistical distributions of the unknown trajectories of particles, and show that, over time, the ignorance smeared things out.

Now, physicists are unmasking a more fundamental source for the arrow of time: Energy disperses and objects equilibrate, they say, because of the way elementary particles become intertwined when they interact — a strange effect called “quantum entanglement.”
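
(A small numerical aside, our illustration rather than the researchers’ calculation: the simplest case of the effect described above is a pair of qubits in a Bell state. The pair as a whole is in a perfectly definite, zero-entropy pure state, yet either qubit viewed on its own is maximally mixed — as random-looking as a system in thermal equilibrium. That is the sense in which entanglement can make a subsystem look “equilibrated” even though no information has been lost overall.)

```python
# Reduced state of one half of an entangled pair (illustrative calculation).
import numpy as np

# Bell state (|00> + |11>) / sqrt(2) as a 4-component state vector
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(bell, bell.conj())            # density matrix of the full two-qubit system

# partial trace over the second qubit -> reduced density matrix of the first qubit
rho_a = np.zeros((2, 2), dtype=complex)
full = rho.reshape(2, 2, 2, 2)               # indices: (a, b, a', b')
for b in range(2):
    rho_a += full[:, b, :, b]

def entropy_bits(r):
    """Von Neumann entropy in bits: 0 for a pure state, 1 for a maximally mixed qubit."""
    eigvals = np.linalg.eigvalsh(r)
    eigvals = eigvals[eigvals > 1e-12]
    return float(-np.sum(eigvals * np.log2(eigvals)))

print(entropy_bits(rho))      # ~0.0: the pair as a whole is in a definite (pure) state
print(entropy_bits(rho_a))    # ~1.0: one qubit alone looks completely random, thermal-like
```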

More here.