Gabriel García Márquez, Conjurer of Literary Magic, Dies at 87

Jonathan Kandell in the New York Times:

Gabriel García Márquez, the Colombian novelist whose “One Hundred Years of Solitude” established him as a giant of 20th-century literature, died on Thursday at his home in Mexico City. He was 87.

Cristóbal Pera, his former editor at Random House, confirmed the death. Mr. García Márquez learned he had lymphatic cancer in 1999, and a brother said in 2012 that he had developed senile dementia.

Mr. García Márquez, who received the Nobel Prize for Literature in 1982, wrote fiction rooted in a mythical Latin American landscape of his own creation, but his appeal was universal. His books were translated into dozens of languages. He was among a select roster of canonical writers — Dickens, Tolstoy and Hemingway among them — who were embraced both by critics and by a mass audience.

“Each new work of his is received by expectant critics and readers as an event of world importance,” the Swedish Academy of Letters said in awarding him the Nobel.

Mr. García Márquez was a master of the literary genre known as magical realism, in which the miraculous and the real converge. In his novels and stories, storms rage for years, flowers drift from the skies, tyrants survive for centuries, priests levitate and corpses fail to decompose. And, more plausibly, lovers rekindle their passion after a half-century apart.

More here.

Thursday, April 17, 2014

Thursday Poem

Lullaby for a Daughter

Go to sleep. Night is a coal pit
full of black water —
……… night is a dark cloud
full of warm rain.

Go to sleep. Night is a flower
resting from bees —
……… night's a green sea
swollen with fish.

Go to sleep. Night is a white moon
riding her mare —
……… night's a bright sun
burned to black cinder.

Go to sleep,
night's come,
cat's days,
owl's day,
star's feast of praise,
moon to reign over
her sweet subject, dark.

by Jim Harrison
from The Shape of the Journey
Copper Canyon Press, 1998

In nature, death is not defeat

Eva Saulitis in Orion Magazine:

FOR TWENTY-SIX SEPTEMBERS I’ve hiked up streams littered with corpses of dying humpbacked salmon. It is nothing new, nothing surprising, not the stench, not the gore, not the thrashing of black humpies plowing past their dead brethren to spawn and die. It is familiar; still, it is terrible and wild. Winged and furred predators gather at the mouths of streams to pounce, pluck, tear, rip, and plunder the living, dying hordes. This September, it is just as terrible and wild as ever, but I gather in the scene with different eyes, the eyes of someone whose own demise is no longer an abstraction, the eyes of someone who has experienced the tears, rips, and plunder of cancer treatment. In spring, I learned my breast cancer had come back, had metastasized to the pleura of my right lung. Metastatic breast cancer is incurable. Through its prism I now see this world.

…NO ONE TEACHES US how to die. No one teaches us how to be born, either. In an essay about visiting the open-air cremation pyres of Varanasi, India, Pico Iyer quotes the scholar Diana L. Eck: “For Hindus, death is not the opposite of life; it is, rather, the opposite of birth.” It happens that my stepdaughter, Eve, is pregnant. I’ve known her since she was three years old; she’s thirty now. One late afternoon this spring, early in her pregnancy, early in my diagnosis, we picked bags of wild rose petals together in a meadow below my house; she intended to make rose-flavored mead. We hadn’t talked much about the implications of my cancer recurrence; in the meadow, we almost didn’t have to. It hovered in the honeyed sunlight between us. That light held the fact of life growing inside her and the cancer growing inside me equally, strangely. We talked around the inexplicable until, our bags full of pale pink petals, we held each other in the tall grass and cried. Watching her body change in the months since, without aid of technology or study or experience, watching her simply embody pregnancy, should teach me something about dying. In preparation for giving birth, she reads how-to books, takes prenatal yoga, attends birthing classes. She studies and imagines. Yet no matter how learned she becomes, how well informed, with the first contraction, her body will take over. It will enact the ancient, inborn process common to bears, goats, humans, whales, and field mice. She will inhabit her animal self. She will emit animal cries. She will experience the birth of her child; she will live it. Her body—not her will or her mind or even her self—will give birth. Can I take comfort in the countless births and deaths this earth enacts each moment, the jellyfish, the barnacles, the orcas, the salmon, the fungi, the trees, much less the humans?

More here.

A New Idea of Why Eating Less Increases Life Span

Annie Sneed in Scientific American:

Nematode worms, fruit flies, mice and other lab animals live longer, healthier lives when they eat less than they otherwise would if more food were available. Primates may also benefit, and perhaps humans—which is why research funds are pouring into this phenomenon. But all this raises a puzzling question: Why did creatures evolve such a mechanism in the first place? Researchers have declared the most popular theory doesn’t make evolutionary sense, and they’ve proposed a new explanation in its place. The most prominent theory involves what happens physiologically during times of food scarcity. When the living is good, natural selection favors organisms that invest energy in reproduction. In times of hardship, however, animals have fewer offspring, diverting precious nutrients to cell repair and recycling so they can survive until the famine ends, when reproduction begins anew. Cell repair and recycling appear to be substantial antiaging and anticancer processes, which may explain why underfed lab animals live longer and rarely develop old-age pathologies like cancer. Margo Adler agrees with the basic cellular pathways, but she’s not so sure about the evolutionary logic.

Adler, an evolutionary biologist at the University of New South Wales in Australia, says this popular idea relies on a big assumption: that natural selection favors this energy switch from reproduction to survival because animals will have more young in the long run—so long as they actually survive and reproduce. “This idea is repeated over and over again in the literature as if it’s true, but it just doesn’t make that much sense for evolutionary reasons,” she says. The problem, Adler says, is that wild animals don’t have the long, secure lives of their laboratory cousins. Instead, they’re not only endangered by famine but by predators and pathogens, random accidents and rogue weather as well. They also face physiological threats from a restricted diet, including a suppressed immune system, difficulty with healing and greater cold sensitivity. For these reasons, delaying reproduction until food supplies are more plentiful is a huge risk for wild animals. Death could be waiting just around the corner. Better to reproduce now, Adler says. The new hypothesis she proposes holds that during a famine animals escalate cellular repair and recycling, but they do so for the purpose of having as many progeny as possible during a famine, not afterward.

More here.

Wednesday, April 16, 2014

Why Chris Marker’s Radical Images Influenced So Many Artists

Sukhdev Sandhu, William Gibson, Mark Romanek, and Joanna Hogg discuss Marker in The Guardian (h/t: Meg Toth; image of a museum built by Chris Marker in Second Life). Sandhu:

Marker didn't regard artistic forms as sacred. He didn't believe in the primacy of celluloid or the cinema screen. He was continually embracing and experimenting with new technologies: one of his richest later works was a CD-Rom entitled Immemory (1997); he created Photoshop cartoon-collages for the French website Poptronics; the Whitechapel show includes a projection of Ouvroir: The Movie (2010), a tour of a museum he created on Second Life, as well as the UK premiere of Zapping Zone (Proposal for an Imaginary Television) (1990-94), a sprawling assemblage of videos, computers and light boxes.

“Marker was always interested in transformation,” recalls Darke. This fascination with the ability of new technologies to transform ideas of human identity, social connection and the nature of memory makes him a strikingly contemporary figure whose work has been embraced by young art students as much as cinephiles. His claim to be a “bricoleur” – a collector of pre-existing visual material – is resonant now that the harvesting, assembling and curation of images has become as important as their creation. His fondness for revisiting old material and reusing it in new contexts resonates with the present era's unprecedented ability not only to store huge digital archives, but to click, drag and recontextualise their contents across limitless formats.

At a time when corporations and governments alike are hell-bent on surveilling and snooping on citizens, Marker's anonymity feels like a thrilling and prophetic act of resistance.

More here.

Karl Polanyi Explains It All

Robert Kuttner in The American Prospect:

In November 1933, less than a year after Hitler assumed power in Berlin, a 47-year-old socialist writer on Vienna’s leading economics weekly was advised by his publisher that it was too risky to keep him on the staff. It would be best both for the Österreichische Volkswirt and his own safety if Karl Polanyi left the magazine. Thus began a circuitous odyssey via London, Oxford, and Bennington, Vermont, that led to the publication in 1944 of what many consider the 20th century’s most prophetic work of political economy, The Great Transformation: The Political and Economic Origins of Our Time.

Polanyi, with no academic base, was already a blend of journalist and public intellectual, a major critic of the Austrian School of free-market economics and its cultish leaders, Ludwig von Mises and Friedrich Hayek. Polanyi and Hayek would cross swords for four decades—Hayek becoming more influential as an icon of the free-market right but history increasingly vindicating Polanyi.

Reluctantly, Polanyi left Vienna for London. Two of his British admirers, the Fabian socialist intellectuals G.D.H. Cole and Richard Tawney, found him a post at an Oxford-sponsored extension school for workers. Polanyi’s assignment was to teach English social and economic history. His research for the course informed the core thesis of his great book; his lecture notes became the working draft. This month marks the 70th anniversary of the book’s publication and also the 50th anniversary of Polanyi’s death in 1964.

Looking backward from 1944 to the 18th century, Polanyi saw the catastrophe of the interwar period, the Great Depression, fascism, and World War II as the logical culmination of laissez-faire taken to an extreme. “The origins of the cataclysm,” he wrote, “lay in the Utopian endeavor of economic liberalism to set up a self-regulating market system.” Others, such as John Maynard Keynes, had linked the policy mistakes of the interwar period to fascism and a second war. No one had connected the dots all the way back to the industrial revolution.

More here.

How the President Got to ‘I Do’ on Same-Sex Marriage

Jo Becker in the NYT Magazine (photo illustration by Daan Brand for The New York Times. Obama: Mark Wilson/Getty Images.):

Despite the president’s stated opposition, even his top advisers didn’t believe that he truly opposed allowing gay couples to marry. “He has never been comfortable with his position,” David Axelrod, then one of his closest aides, told me.

Indeed, long before Obama publicly stated that he was against same-sex marriage, he was on the record supporting it. As an Illinois State Senate candidate from Chicago’s liberal Hyde Park enclave, Obama signed a questionnaire in 1996 saying, “I favor legalizing same-sex marriages, and would fight efforts to prohibit such marriages.” But as his ambitions grew, and with them the need to appeal to a more politically diverse electorate, his position shifted.

In the course of an unsuccessful run for a House seat in 2000, he said he was “undecided” on the question. By the time he campaigned for the presidency, he had staked out an even safer political position: Citing his Christian faith, he said he believed marriage to be the sacred union of a man and a woman.

The assumption going into the 2012 campaign was that there was little to be gained politically from the president’s coming down firmly in favor of same-sex marriage. In particular, his political advisers were worried that his endorsement could splinter the coalition needed to win a second term, depressing turnout among socially conservative African-Americans, Latinos and white working-class Catholics in battleground states.

But by November 2011, it was becoming increasingly clear that continuing to sidestep the issue came with its own set of costs. The campaign’s internal polling revealed that the issue was a touchstone for likely Obama voters under 30.

More here.

How Philosophy Makes Progress

Rebecca Newberger Goldstein in The Chronicle of Higher Education (image: André da Loba for The Chronicle):

Questions of physics, cosmology, biology, psychology, cognitive and affective neuroscience, linguistics, mathematical logic: Philosophy once claimed them all. But as the methodologies of those other disciplines progressed—being empirical, in the case of all but logic—questions over which philosophy had futilely sputtered and speculated were converted into testable hypotheses, and philosophy was rendered forevermore irrelevant.

Is there any doubt, demand the naysayers, about the terminus of this continuing process? Given enough time, talent, and funding, there will be nothing left for philosophers to consider. To quote one naysayer, the physicist Lawrence Krauss, “Philosophy used to be a field that had content, but then ‘natural philosophy’ became physics, and physics has only continued to make inroads. Every time there’s a leap in physics, it encroaches on these areas that philosophers have carefully sequestered away to themselves.” Krauss tends to merge philosophy not with literature, as Wieseltier does, but rather with theology, since both, by his lights, are futile attempts to describe the nature of reality. One could imagine such a naysayer conceding that philosophers should be credited with laying the intellectual eggs, so to speak, in the form of questions, and sitting on them to keep them warm. But no life, in the form of discoveries, ever hatches until science takes over.

There’s some truth in the naysayer’s story. As far as our knowledge of the nature of physical reality is concerned—four-dimensional space-time and genes and neurons and neurotransmitters and the Higgs boson and quantum fields and black holes and maybe even the multiverse—it’s science that has racked up the results. Science is the ingenious practice of prodding reality into answering us back when we’re getting it wrong (although that itself is a heady philosophical claim, substantiated by concerted philosophical work).

And, of course, we have a marked tendency to get reality wrong.

More here.

Hejinian, Whitman, and more on the politics of sleep

Siobhan Phillips at Poetry Magazine:

Sleep is invisible and inconsistent. Aping death, sleep in fact prevents it; at the very least, sleep deprivation leads to premature demise (and before that, failures in mood, metabolism, cognitive function). All animals sleep, and it makes sense for none of them, evolutionarily, since it leaves the sleeper defenseless to predation. Sleep is common, public, a vulnerability we all share—even as sleep also brackets the sleeper in the most impenetrable of privacies. Nothing, everyone knows, is harder to communicate than one’s dream.

And then there’s time. Sleep seems to remove us from the general tyranny of the advancing clock. When you wake, 20 minutes could have passed as easily as three hours. But sleep defines time, dividing day and night. Humans discover circadian rhythm through the urge to sleep. That urge is, of course, cyclic, endless: always more sleep to be had. But sleep measures forward progress by consolidating our sense of the past. (Steven W. Lockley and Russell G. Foster lay out the evidence for this and other facts in their briskly informative Sleep: A Very Short Introduction.) In sleep, our brains decide what to keep and discard. Without sleep, we would dissolve into overloaded confusion.

more here.

crimean meditations

Jacob Mikanowski at The Millions:

Who does the Crimea belong to?

First of all, to the sea that made it. Seven thousand years ago, the Black Sea was much lower than it is today. Then a waterfall tumbled over the Bosporus, and the waters began to rise. The flood cut the Crimea off from the mainland – all the way except for a narrow isthmus called the Perekop. Ever since, it has been a rocky island on the shores of a sea of grass.

The steppes belonged to the nomads. Grass meant horses, and freedom. The steppes stretched north, from the mouth of the Danube to the Siberian Altai. Across the centuries they were home to various nomadic confederations and tribes: Scythians, Sarmatians, Huns, Pechenegs, Cumans, Mongols, and Kipchak Turks. The legendary Cimmerians predate them all; the Cossacks are still there today.

At times, the nomadic tribes made their home in Crimea too.

more here.

The Dadliest Decade

Willie Osterweil at The Paris Review:

The eighties, at least, were drenched in cocaine and neon, slick cars and yacht parties, a real debauched reaction. But nineties white culture was all earnest yearning: the sorrow of Kurt Cobain and handwringing over selling out, crooning boy-bands and innocent pop starlets, the Contract With America and the Starr Report. It was all so self-serious, so dadly.

Today, by some accounts, the nineties dad is cool again, at least if you think normcore is a thing beyond a couple NYC fashionistas and a series of think pieces. Still, that’s shiftless hipsters dressed like dads, not dads as unironic heroes and subjects of our culture. If the hipster cultural turn in the following decades has been to ironize things to the point of meaninglessness, so be it. At least they don’t pretend it’s a goddamn cultural revolution when they have a kid: they just let their babies play with their beards and push their strollers into the coffee shop. In the nineties, Dad was sometimes the coolest guy in the room. He was sometimes the butt of the joke. He was sometimes the absence that made all the difference. But he was always, insistently, at the center of the story.

more here.

Shift the meat-to-plant ratio

Miles Becker in Conservation:

Can farmers feed an additional 4 billion people with current levels of crop production? A team from the University of Minnesota tackled the problem by shifting the definition of agricultural productivity from the standard measure (tons per hectare) to the number of people fed per hectare. They then audited the global caloric budget and found a way to squeeze out another 4 quadrillion calories per year from existing crop fields. Their starting point was meat production, the most inefficient use of calories to feed people. The energy available from a plant crop such as corn dwindles dramatically when it goes through an intermediate consumer such as a pig. Beef has the lowest caloric conversion efficiency: only 3 percent. Pork and chicken do three to four times better. Milk and eggs, animal products that provide us essential nutrients in smaller batches, are a much more efficient use of plant calories.

The researchers calculated that 41 percent of crop calories made it to the table from 1997 to 2003, with the rest lost mainly to gastric juices and droppings of livestock. Crop calorie efficiency is expected to fall as the meat market grows. Global meat production boomed from 250.4 million tons in 2003 to 303.9 million tons by 2012, as reported by the FAO. Rice production, mainly for human food, dwindled by 18 percent over the same time period. The authors of the 2013 paper, published in Environmental Research Letters, suggested a trend reversal would be desirable. They estimated that a shift from crops destined for animal feed and industrial uses toward human food could hypothetically increase available calories by 70 percent and feed another 4 billion people each year.

More here.
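
The conversion arithmetic in the excerpt above is easy to sketch. Below is a minimal back-of-the-envelope illustration, assuming a round corn yield of about 15 million kcal per hectare per year and a 2,500-kcal daily diet; these are placeholder assumptions for illustration, not figures from the paper.

```python
# Back-of-the-envelope sketch of feed-to-food conversion losses.
# All inputs are illustrative assumptions, not data from the
# Environmental Research Letters paper cited above.

CROP_KCAL_PER_HA = 15_000_000        # assumed annual corn yield, kcal per hectare
PERSON_KCAL_PER_YEAR = 2500 * 365    # assumed 2,500 kcal/day diet

# Approximate conversion efficiencies mentioned in the excerpt:
# beef ~3 percent, pork and chicken roughly three to four times better.
EFFICIENCIES = {
    "direct (plant)": 1.00,
    "chicken/pork": 0.10,
    "beef": 0.03,
}

for pathway, eff in EFFICIENCIES.items():
    kcal_on_table = CROP_KCAL_PER_HA * eff
    people_fed = kcal_on_table / PERSON_KCAL_PER_YEAR
    print(f"{pathway:>15}: ~{people_fed:.1f} people fed per hectare per year")
```

Under these assumed numbers, a hectare feeds roughly 16 people directly but fewer than one by way of beef, a roughly thirty-fold gap, which is consistent with the paper's estimate that reallocating feed and industrial crops to human food could feed billions more people.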

brain’s anti-distraction system

From Phys.Org:

Two Simon Fraser University psychologists have made a brain-related discovery that could revolutionize doctors' perception and treatment of attention-deficit disorders.

This discovery opens up the possibility that environmental and/or genetic factors may hinder or suppress a specific brain activity that the researchers have identified as helping us prevent distraction. The Journal of Neuroscience has just published a paper about the discovery by John McDonald, an associate professor of psychology, and his doctoral student John Gaspar, who made the discovery during his master's thesis research. This is the first study to reveal that our brains rely on an active suppression mechanism to avoid being distracted by salient irrelevant information when we want to focus on a particular item or task.

McDonald, a Canada Research Chair in Cognitive Neuroscience, and other scientists first discovered the existence of the specific neural index of suppression in his lab in 2009. But, until now, little was known about how it helps us ignore visual distractions. “This is an important discovery for neuroscientists and psychologists because most contemporary ideas of attention highlight brain processes that are involved in picking out relevant objects from the visual field. It's like finding Waldo in a Where's Waldo illustration,” says Gaspar, the study's lead author.

More here.

Tuesday, April 15, 2014

Paul de Man Was a Total Fraud: The literary-critical giant lied about every part of his life

Robert Alter in The New Republic:

Evelyn Barish begins her impressively researched biography by flatly stating that “Paul de Man no longer seems to exist.” This may be an exaggerated expression of frustration by a biographer whose long-incubated work now appears after what might have been the optimal time for it. Yet there is considerable truth in what she says. De Man is now scarcely remembered by the general public, though he was the center of a widely publicized scandal in 1988, five years after his death at the age of 64. In the 1970s and 1980s, he was a central figure, an inevitable figure, in American literary studies, in which doctoral dissertations, the great barometer of academic fashion, could scarcely be found without dozens of citations from his writings. But the meteor has long since faded: over the past decade and more, I have only rarely encountered references to de Man in students’ work, committed as they generally are to marching with the zeitgeist.

Paul de Man arrived in the United States from his native Belgium in the spring of 1948. He would remain in this country illegally after the expiration of his temporary visa, on occasion finding ways to elude the Immigration and Naturalization Service. But that, as Barish’s account makes clear, was the least of his infractions of the law. Eventually he would be admitted, with a considerable amount of falsification on his part, to the doctoral program in comparative literature at Harvard, from which he would receive a degree, in somewhat compromised circumstances, in 1960. He then went on to teach at Cornell, briefly at Johns Hopkins, and most significantly at Yale, where he became a “seminal” scholar and an altogether revered figure.

More here.

2014 Pulitzer Prize Winners in Journalism, Letters, Drama and Music

From the New York Times:

FICTION

DONNA TARTT

“The Goldfinch” (Little, Brown)

Ms. Tartt’s best-selling novel is about a boy who comes into possession of a painting after an explosion at a museum.

In a phone conversation on Monday, Ms. Tartt, 50, said the novel “was always about a child who had stolen a painting,” but it was only two years into writing the book that she saw “The Goldfinch,” a 17th-century work by Carel Fabritius.

“It fit into the plot of the book I was writing in ways I couldn’t have imagined,” she said. “It had to be a small painting that a child could carry, and that a child could be obsessed by.”

Finalists: Philipp Meyer, “The Son”; Bob Shacochis, “The Woman Who Lost Her Soul.”

More here.

Why Nobody Can Tell Whether the World’s Biggest Quantum Computer is a Quantum Computer

Leo Mirani and Gideon Lichfield in Quartz (via Jennifer Ouellette, D-Wave Systems photo):

For the past several years, a Canadian company called D-Wave Systems has been selling what it says is the largest quantum computer ever built. D-Wave’s clients include Lockheed Martin, NASA, the US National Security Agency, and Google, each of which paid somewhere between $10 million and $15 million for the thing. As a result, D-Wave has won itself millions in funding and vast amounts of press coverage—including, two months ago, the cover of Time (paywall).

These machines are of little use to consumers. They are delicate, easily disturbed, require cooling to just above absolute zero, and are ruinously expensive. But the implications are enormous for heavy number-crunching. In theory, banks could use quantum computers to calculate risk faster than their competitors, giving them an edge in the markets. Tech companies could use them to figure out if their code is bug-free. Spies could use them to crack cryptographic codes, which requires crunching through massive calculations. A fully-fledged version of such a machine could theoretically tear through calculations that the most powerful mainframes would take eons to complete.

The only problem is that scientists have been arguing for years about whether D-Wave’s device is really a quantum computer or not. (D-Wave canceled a scheduled interview and did not reschedule.) And while at some level this doesn’t matter—as far as we know, D-Wave’s clients haven’t asked for their money back—it’s an issue of importance to scientists, to hopeful manufacturers of similar machines, and to anyone curious about the ultimate limits of humankind’s ability to build artificial brains.

More here.

russia and the history of ‘eurasianism’

Pádraig Murphy at The Dublin Review of Books:

There is thus a lively debate in Russia itself on the country’s orientation. The question is, where does the leadership stand in this debate? The answer is difficult, because not only has Russia become more autocratic under Putin, but the circle of real decision-makers has become ever smaller. According to some accounts, it may consist of no more than five people. But, reviewing the period since 2000, when Putin assumed power, it is plausible that it began with a continuation of a commitment to democracy and a market economy, associated with a growing resentment at lack of consideration on the part of the West to certain deep Russian concerns – NATO enlargement, treatment as a poor supplicant, disregard for what are seen as legitimate interests in the neighbourhood etc. Angela Stent cites a senior German official complaining of an “empathy deficit disorder” in Washington in dealing with Russia. The pathology that this caused became progressively more virulent in the intervening years, culminating in 2003 in the invasion of Iraq without any Security Council mandate, indeed, in open defiance of the UN. After this, the New York Times magazine’s Ron Suskind reported on a visit to the Bush White House in 2004 in the course of which he recounts that “an aide” (commonly supposed to be Karl Rove) “said that guys like me were ‘in what we call the reality-based community’, which he defined as people who ‘believe that solutions emerge from your judicious study of discernible reality’… ‘That’s not the way the world really works any more’, he continued. ‘We’re an empire now, and when we act, we create our own reality. And while you’re studying that reality – judiciously, as you will – we’ll act again, creating other new realities, which you can study too, and that’s how things will sort out. We’re history’s actors … and you, all of you, will be left to just study what we do.’”

more here.