The Lost Futures of Chris Marker

J. Hoberman on Chris Marker, in the New York Review of Books blog:

Gracefully off-kilter, stylized as semaphores, the shadow of a man and an outlined woman are positioned at the center of a seashell spiral. Are they dancing on air—or falling into the void?

The poster for Alfred Hitchcock’s Vertigo is scarcely less haunting than the movie. I first saw the image, without understanding what it was, as a nine-year-old on summer vacation and carried the memory with me for some twelve or fifteen years before I first saw the film. It was, as the filmmaker Chris Marker—one of Vertigo’s most ardent admirers—might say, a memory of the future.

As Vertigo is the most uncanny of movies, it feels more than coincidental that on August 1, two days following Marker’s death, at ninety-one, in Paris, the British film journal Sight and Sound announced, with no little fanfare, that after forty years his favorite movie had finally dethroned Citizen Kane atop the magazine’s once-a-decade critics’ poll.

Kane is the movie that hyper-dramatized the act of filmmaking. Vertigo is about film-watching in extremis—the state of being hopelessly, obsessively in love with an image. Unlike Kane (or Psycho for that matter), Vertigo was not immediately recognized as great cinema, except in France by people like Marker. Steeped as it is in the pathos of unrecoverable memory, Marker’s La Jetée (1962) was probably the first movie made under Vertigo’s spell.

Tied for fiftieth place (one vote ahead of Rear Window) in the Sight and Sound poll, La Jetée is Marker’s most generally known work, in part because it was remade in the mid-1990s by Terry Gilliam as 12 Monkeys. Marker was the opposite of a celebrity; he was famous not for his well-knownness but for a certain willful unknowability. The man born Christian François Bouche-Villeneuve was permanently incognito. He allowed few interviews and carefully concealed his personal life; although he turned his camera on countless people, including several fellow filmmakers (Andrei Tarkovsky, Akira Kurosawa), he never allowed himself to be photographed.

I called Marker a “filmmaker” but it would be more accurate to term him a “film artist.” His oeuvre encompasses movies, photography, videos, TV series, CD-ROMs, computer games, and gallery installations. Some of these might be considered memento mori, often for the film medium. Others propose cinema as a model for historical consciousness. “We can see the shadow of a film on television, the longing for a film, the nostalgia, the echo of a film, but never a film,” is a characteristic Marker observation; one of his favorite aphorisms is borrowed from George Steiner: “It is not the past that rules us—it is the image of the past.”

Notes of a Novice Student of India

Justin E. H. Smith in Berfrois:

Any specialist on anything will have had that peculiar experience of coming across some casual comment from a total non-specialist about the very thing to which one has devoted one’s life, a comment made as if there were no such thing as specialist knowledge, as if what we know in any domain at all were just so much hearsay and vulgarisation. Lord knows I’ve seen plenty of people denouncing Descartes, for example, or praising Spinoza (seldom the reverse), who know nothing, but nothing, about Descartes or Spinoza. This is easy and costless to do (and we all do it, including those of us who pride ourselves on being specialists and who really care about getting things right in our special domains), so long as one doesn’t mingle with the specialists in the domain about which one holds forth.

I’ve been thinking about how this works, about this seldom-discussed aspect of the sociology of knowledge, quite a bit recently, as I go deeper in my mid-career shift to what used to be called ‘Indology’ (more on this telling term soon). I am still a near-absolute beginner, yet I am now reaching the point where I can no longer say whatever I want to say on the grounds that I don’t know anything anyway, and that the people with whom I’m speaking don’t know anything either. I am now interacting with people who do not find it at all peculiar to care about Pāṇinian syntax theory, or about the rules of proper inference in Navya-Nyāya logic. The days are over when I could make sweeping claims about civilizational differences (the sort of sweeping claims my colleagues in philosophy often make) as regards rationality, for example. So in short I’m learning to be careful about what I say, which is really nothing other than entering a community of specialists. I expect anything I say now will appear naive to me when I look back on it in a few years, which is only to say that I will have entered more fully into that community. But one has to start somewhere.

What used to be called ‘Indology’ is now referred to more obliquely by phrases such as ‘South Asian Studies’, ‘Religions- und Kulturgeschichte Südasiens’, and so on. To some extent this shift can be explained as part of the broader changes that turned geology into ‘earth science’, and so on. Here, it’s just a matter of rebranding, and has nothing to do with respecting the sensibilities of the subjects themselves that are being studied (rocks and sediment don’t have sensibilities). In addition, there is the broad impact of Saïd’s critique of Orientalism, and the bizarre presumption that if we redescribe ourselves as doing ‘studies’ of something rather than the ‘-logy’ of it, then we are somehow immune to that critique. But unlike the transformation of Sinology into East-Asian Studies (it’s gone translinguistic now, too: in Montreal you can major in ‘Études est-asiatiques’), Indology is weighted down by other historical legacies than just the one Saïd picked out, since the gaze upon India has often been one that did not treat it as exotically other, but also, for often less than liberal reasons, treated it as fundamentally, autochthonously, the same.

Wilde in the Office

From LARB:

For those interested in, or like me obsessed with, anniversaries: this summer marks the quasquicentennial of Oscar Wilde’s first-ever office job (fifty years to go before the dodransbicentennial and a century before the sestercentennial). Admittedly, the significance of the event pales in comparison with the centennial of the sinking of the Titanic or the bicentenary of Charles Dickens’s birth, but Wilde’s experience in the office provides that curious anniversary where the writer, who wants the best of both worlds as a journalist and a serious author, can see in practice whether such a thing is possible or desirable. At 33, Wilde began working for the publishing firm Cassell & Company, where he stayed for more than two years. His best non-fiction and fiction work was produced during the time he spent in the office at Ludgate Hill, near Fleet Street. Between May 18, 1887, when he signed the contract with Thomas Wemyss Reid, who was general manager of the company, and October 1889, when he was handed his notice, Wilde managed to write the most brilliant and lengthy of his essays, including “The Critic as Artist,” “The Decay of Lying,” “Pen, Pencil and Poison,” and “The Portrait of Mr W. H.,” a speculation on Shakespeare's Sonnets (which later became a favourite of Borges), not to mention The Picture of Dorian Gray, which is often considered Wilde's best work and the defining text of the late-Victorian age. It is difficult to imagine a serious author of our day performing a similar feat. Could Jonathan Franzen, that great enemy of superficial twittering, have written The Corrections while editing GQ, spending his weekdays in its offices? While numerous contemporary authors prefer unplugging the network cable from their laptops while writing, Wilde did the opposite, trying to have as many connections as possible, which he thought would contribute to his competence and inventiveness as an author.

Having toured the United States and parts of England during the early 1880s for a series of lectures about decoration, fashion, and applied arts, Wilde had amused American and British audiences with his personality and oratorical skills. When this great tour came to an end, he immediately looked for fame in prestigious literary magazines and newspapers where he could review books and publish essays about his favourite subjects. In the course of a year he reviewed dozens of books, some of which he confessed to not reading in their entirety (“I never read a book I must review,” he wrote, “it prejudices you so.”) Building for himself a credible byline which he hoped would open new opportunities for him, Wilde inhabited a freelancer's existence for a few years. This period was central to his growth as an independent thinker.

More here.

Two Steps to Free Will

From Harvard Magazine:

Astronomy naturally inspires cosmic thinking, but astronomers rarely tackle philosophical issues directly. Theoretical astrophysicist Robert O. Doyle, Ph.D. ’68, associate of the department of astronomy, is an exception. For five years, Doyle has worked on a problem he has pondered since college: the ancient conundrum of free will versus determinism. Do humans choose their actions freely, exercising their own power of will, or do external and prior causes (even the will of God) determine our acts? Since the pre-Socratics, philosophers have debated whether we live in a deterministic universe, in which “every event has a cause, in a chain of causal events with just one possible future,” or an indeterministic one, in which “there are random (chance) events in a world with many possible futures,” as Doyle writes in Free Will: The Scandal in Philosophy (2011). The way out of the bottle, he says, is a “two-stage model” whose origin he traces to William James, M.D. 1869, LL.D. ’03, philosopher, psychologist, and perhaps the most famous of all Harvard’s professors. Some of the confusion, Doyle believes, stems from how thinkers have framed the question—in an either/or way that allows only a rigidly predetermined universe or a chaotic one totally at the mercy of chance. David Hume, for example, asserted that there is “no medium betwixt chance and an absolute necessity.” But Doyle also finds the term “free will” unclear and even unintelligible, because the condition of “freedom” applies to the agent of action, not the will: “I think the question is not proper, whether the will be free, but whether a man be free,” in John Locke’s concise phrasing. “The element of randomness doesn’t make us random,” Doyle says. “It just gives us possibilities.”

Doyle limns a two-stage model in which chance presents a variety of alternative possibilities to the human actor, who selects one of these options and enacts it. “Free will isn’t one monolithic thing,” he says. “It’s a combination of the free element with selection.” He finds many antecedents in the history of philosophy—beginning with Aristotle, whom he calls the first indeterminist. But he identifies James as the first philosopher to clearly articulate such a model of free will, and (in a 2010 paper published in the journal William James Studies and presented at a conference honoring James; see “William James: Summers and Semesters”) he honors that seminal work by naming such a model—“first chance, then choice”—“Jamesian” free will.
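Doyle’s model has a natural computational reading: stage one is a stochastic generator of options, stage two a determinate evaluator that picks among them. Here is a minimal sketch in Python — the candidate actions and the preference function are invented for illustration, not drawn from Doyle’s work:

import random

# "First chance, then choice": stage one proposes alternatives at
# random; stage two selects among them by a determinate evaluation.

def generate_alternatives(candidates, k=3):
    """Stage one (chance): randomly propose k possible actions."""
    return random.sample(candidates, k)

def choose(alternatives, preference):
    """Stage two (choice): pick the option the agent values most."""
    return max(alternatives, key=preference)

actions = ["apologize", "stay silent", "write a letter", "walk away"]
options = generate_alternatives(actions)       # first chance...
decision = choose(options, preference=len)     # ...then choice
print(options, "->", decision)

The point of the separation, on Doyle’s account, is that randomness enters only at the generation stage; the selection remains the agent’s own, which is why “the element of randomness doesn’t make us random.”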

More here.

Thursday, August 23, 2012

ryan, rand, hayek

In actuality, Ryan is like a lot of politicians who merely cherry-pick Hayek to promote neoclassical policies, says Peter Boettke, an economist at George Mason University and editor of The Review of Austrian Economics. “What Hayek has become, to a lot of people, is an iconic figure representing something that he didn’t believe at all,” Boettke says. For example, despite his complete lack of faith in the ability of politicians to affect the economy, Hayek, who is frequently cited in attacks on entitlement programs, believed that the state should provide a base income to all poor citizens. To be truly Hayekian, Boettke says, Ryan would need to embrace one of his central ideas, known as the “generality norm.” This is Hayek’s belief that any government program that helps one group must be available to all.

more from Adam Davidson at the NY Times Magazine here.

beauty and self-hatred

Ilse has grown up in the shadow of Cosmo-culture, where drastic measures are encouraged if beauty, and therefore confidence and “empowerment”, is the end result. The death of Cosmopolitan’s Helen Gurley Brown, plastic surgery pioneer, has brought some of her choice quotes to the surface. “Self-help,” she said to Nora Ephron, explaining the methods she used to improve her flaws. “I wish there were better words, but that is my whole credo. You cannot sit around like a cupcake asking other people to come and eat you up and discover your great sweetness and charm. You’ve got to make yourself more cupcakable all the time so you’re a better cupcake to be gobbled up.” The formula she laid down for Cosmopolitan in 1965 relied on constant renovation, improvement and a continual quest for achievement, where anybody can be beautiful, if only they try hard enough.

more from Eva Wiseman at The Observer here.

the inimitable

Dickens is so brilliant a stylist, his vision of the world so idiosyncratic and yet so telling, that one might say that his subject is his unique rendering of his subject, in an echo of Mark Rothko’s statement, “The subject of the painting is the painting”—except of course, Dickens’s great subject was nothing so subjective or so exclusionary, but as much of the world as he could render. If Dickens’s prose fiction has “defects”—excesses of melodrama, sentimentality, contrived plots, and manufactured happy endings—these are the defects of his era, which for all his greatness Dickens had not the rebellious spirit to resist; he was at heart a crowd-pleaser, a theatrical entertainer, with no interest in subverting the conventions of the novel as his great successors D.H. Lawrence, James Joyce, and Virginia Woolf would have; nor did he contemplate the subtle and ironic counterminings of human relations in the way of George Eliot and Thomas Hardy, who brought to the English novel an element of nuanced psychological realism not previously explored. Yet among English writers Dickens is, as he once called himself, part-jesting and part-serious, “the inimitable.”

more from Joyce Carol Oates at the NYRB here.

In-law infighting boosted evolution of menopause: Conflict between generations of unrelated childbearing women affects offspring survival

From Nature:

Conflict between women and their daughters-in-law could be a factor in explaining an evolutionary puzzle — the human menopause. Humans, pilot whales and killer whales are the only animals known to stop being able to reproduce long before they die. In terms of evolution, where passing on your genes is the main reason for living, the menopause remains puzzling. Now, using a large data set from Finland, researchers have for the first time been able to test a hypothesis that competition between different generations of genetically unrelated breeding women could have promoted the evolution of the menopause. The results are published today in Ecology Letters. Mirkka Lahdenperä, an ecologist at the University of Turku in Finland, and her colleagues used data from meticulous birth, death and marriage records kept by the Lutheran church in the country between 1702 and 1908. As they dug into the data, the researchers found that the chances of children dying increased when mothers-in-law and daughters-in-law gave birth around the same time. For children of the older women, survival dropped by 50%. For children of the daughters-in-law, it dropped by 66%. However, if mothers and daughters had children at the same time, the survival of those children wasn’t affected. The results suggest that it would be beneficial to stop having children once your daughter-in-law entered the fray. “We were surprised that the result was so strong,” says Andrew Russell, an ecologist at the University of Exeter, UK, who was part of the research team. He suggests that perhaps in-laws fought over food for their children instead of cooperating as mothers and daughters might.

Other theories to explain the menopause include the mother hypothesis, which suggests that older women have an increased chance of dying in childbirth, and the grandmother hypothesis — that the benefits to the family when women care for their grandchildren provide an evolutionary reason to stay alive after reproductive age. Using an inclusive-fitness model, which counts the number of gene equivalents passed from generation to generation, the team showed that when mothers and their sons' wives had children at the same time, there was strong selection against women remaining fertile past the age of 51.
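The bookkeeping behind an inclusive-fitness model is straightforward: each surviving descendant is credited to a woman in proportion to her genetic relatedness to it (0.5 for a child, 0.25 for a grandchild), and the weighted total gives her “gene equivalents.” A minimal sketch in Python of that general accounting — the counts are invented, and this is not the paper’s actual model:

# Standard coefficients of relatedness: 0.5 to one's own child,
# 0.25 to a grandchild. The counts below are purely illustrative.

def inclusive_fitness(surviving_relatives):
    """Sum gene equivalents: relatedness times surviving offspring."""
    return sum(r * n for r, n in surviving_relatives)

# A woman credited with 2 surviving children and 3 surviving grandchildren:
print(inclusive_fitness([(0.5, 2), (0.25, 3)]))  # 1.75 gene equivalents

Under such weights, the co-breeding penalties quoted above cut into both the half-weighted own children and the quarter-weighted grandchildren, which is why a model of this shape can favor ceasing reproduction once a daughter-in-law begins breeding.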

More here.

Just Think No

Maureen Dowd in The New York Times:

There’s something trying about an unforgiving man suddenly in need of forgiveness. Yet Todd Akin is right. He shouldn’t have to get out of the United States Senate race in Missouri simply for saying what he believes. He reflects a severe stance on abortion that many in his party embrace, including the new vice presidential candidate.

…In asserting that women have the superpower to repel rape sperm, Akin ratcheted up the old chauvinist argument that gals who wear miniskirts and high-heels are “asking” for rape; now women who don’t have the presence of mind to conjure up a tubal spasm, a drone hormone, a magic spermicidal secretion or mere willpower to block conception during rape are “asking” for a baby.

“The biological facts are perhaps inconvenient, but whether the egg meets the sperm is a matter of luck or prevention,” says Dr. Paul Blumenthal, a professor of obstetrics and gynecology who directs the Stanford Program for International Reproductive Education and Services. “If wishing that ‘I won’t get pregnant right now’ made it so, we wouldn’t need contraceptives.” When you wish upon a rape. Dr. Blumenthal is alarmed that Akin is a member of the House Committee on Science, Space and Technology. “What is very disturbing to me is that people like Mr. Akin who have postulated this secret mechanism for avoiding pregnancy have developed their own make-believe world of science based on entirely self-serving beliefs of convenience or just ignorance,” he said. “I don’t think we want these people to be responsible for the lives of others.” But, for all the Republican cant about how they want to keep government out of the lives of others, the ultraconservatives are panting to meddle in the lives of others. Contrary to President Obama’s refreshing assertion Monday that a bunch of male politicians shouldn’t be making health care decisions for women, this troglodyte tribe of men and Bachmann-esque women craves that responsibility.

More here.

Thursday Poem

A Song

(A poem composed in 28 A.D. Korea)

When my dead mother comes to me
and asks me to lend her my shoes
I take off my shoes.

When my dead mother comes to me
and asks me to hold her up, for she has no feet
I take off my feet.

When my dead mother comes to me
and asks me to lend her, lend her
I even rip out my heart.

In the sky, mountains rise, trails rise.
At a place where there is no one
two round moons ascend.

(Translated from the Korean by Don Mee Choi)

by Kim Hye-Sun
from Arts & Letters, Fall 2000

Zbigniew Brzezinski: U.S. Fate Is in U.S. Hands

Robert W. Merry interviews Zbigniew Brzezinski in The National Interest:

No one disputes that Zbigniew Brzezinski resides within the circle of America’s most brilliant and prolific foreign-policy experts. The former White House national-security adviser under Jimmy Carter has written or coauthored eighteen books, including his most recent, Strategic Vision: America and the Crisis of Global Order, a probing analysis of America’s challenges in a fast-changing world. Brzezinski is a counselor and trustee at the Center for Strategic and International Studies and a senior research professor at the School of Advanced International Studies at Johns Hopkins University. The National Interest caught up with Brzezinski at his CSIS office for an interview about his book and the current state of the world. The interview was conducted by TNI editor Robert W. Merry.

In your book, you talk about the Atlantic West’s grand opportunity for what you called a “new era of Western global supremacy” after the Soviet collapse. But it didn’t happen. To what extent do you think this failure resulted from human folly, and to what extent was it a product of forces beyond the control of the Atlantic West or its leaders?

I think both. But the West was fatigued, and Europe, certainly, lost a sense of its global responsibility and became more provincial in outlook. That, in part, was connected unavoidably with the task of constructing something that was called, originally, the European Community, that led to the European Union (although the two names should have been in a different sequence, because the European Community had more coherence than the current European Union). And the United States embarked on a kind of self‑gratification and self‑satisfaction, almost acting as if it really thought that history had come to an end.

More here.

Quackery and Mumbo-Jumbo in the U.S. Military

Harriet Hall in Slate:

The military uses some of the most technologically sophisticated machinery and innovative medical techniques in history. But a disturbing current of pseudoscience in the military is wasting money, perpetuating myths, and putting our troops in danger. I am a retired U.S. Air Force colonel, so this hits close to home. An organization I was once proud to belong to has become a source of embarrassment.

An ongoing DoD failure is the infiltration of quackery into military medicine. It’s not as dangerous to our troops as a bomb detector that can’t detect bombs, but it’s wasting tax dollars and medical resources on unscientific mumbo-jumbo that “works” only as a placebo. In some cases, it is demonstrably harmful.

Acupuncture is based on a mythical, nebulous energy called qi that has never been detected, even though scientific instruments are capable of measuring quantum energies at the subatomic level. It is said to flow through hypothetical meridians and to be altered by sticking needles into hypothetical acupuncture points. Originally, there were 360 acupuncture points, corresponding to the days of the year, which is not surprising since the idea grew out of astrology. Now so many acupoints have been described that one wag suggested there was no place left on the skin that wasn’t an acupuncture point in someone’s system.

More here.

Fans Worry After Pakistan Twitter Star Goes Off Line

Declan Walsh in the New York Times:

Channeling the American comic Stephen Colbert, the determinedly anonymous blogger behind @MajorlyProfound adopted the voice of a pompous, paranoid, honor-obsessed nationalist — Twitter posts typically started with cries of “whoa!” or “OUTRAGE!!” — then took things a step or three further. The result was a searingly funny and often jet-black perspective on Pakistan’s rolling crises that pushed the boundaries of what is considered politically acceptable — or personally prudent.

A Pakistani should have been given the honor of lighting the Olympic flame, @MajorlyProfound declared during the recent opening ceremony, in recognition of “our expertise at burning things” like NATO supply trucks and Indian luxury hotels.

Later, he suggested that the national team could do well in archery, but only if a photo of an Ahmadi — a religious minority that suffers grave persecution — were placed on the target board.

“Pakistani shooters sure to win gold,” he wrote on Twitter. “But there is a danger they might throw grenade instead.”

Such jagged wit won @MajorlyProfound more than 10,000 followers on Twitter, many of them influential in the Pakistani and Indian news media. Foreign journalists started to quote him in stories, sensing he had become a cultural touchstone of sorts.

But the man behind the phenomenon assiduously shunned the spotlight. “I’m just a nobody,” he wrote in an e-mail exchange started by The New York Times before his disappearance. “I like to poke fun at absurdity.”

More here.

Wednesday, August 22, 2012

war and gardening

We are in an era when gardens are front and center for hopes and dreams of a better world or just a better neighborhood, or the fertile space where the two become one. There are farm advocates and food activists, progressive farmers and gardeners, and maybe most particular to this moment, there’s a lot of urban agriculture. These city projects hope to overcome the alienation of food, of labor, of embodiment, of land, the conflicts between production and consumption, between pleasure and work, the destructiveness of industrial agriculture, the growing problems of global food scarcity, seed loss. The list of ideals being planted and tended and sometimes harvested is endless, but the question is simple. What crops are you tending? What do you hope to grow? Hope? Community? Health? Pleasure? Justice? Gardens represent the idealism of this moment and its principal pitfall, I think. A garden can be, after all, either the ground you stand on to take on the world or how you retreat from it, and the difference is not always obvious.

more from Rebecca Solnit at Orion Magazine here.

Paradoxes of Altruism in the Digital Age

William Flesch in the Los Angeles Review of Books:

Some evolutionary biologists, David Sloan Wilson among them, think that there are reasons for seeing human cooperation as deriving from a genuine genetic propensity for altruism. Altruism and prosocial tendencies may be taken as roughly synonymous. Species (humans pre-eminently) that tend to engage in behavior which promotes the general welfare — even at the cost of some individual sacrifice — are able to cooperate in ways that help everyone.

They can do this despite the huge risk that free riders will derail the whole system. What prevents free riders from undermining altruism by taking such advantage of altruists that they die out? The answer that many evolutionary biologists, evolutionary psychologists, sociologists, anthropologists, neuroscientists, game theorists, and even narrative theorists like yours truly have converged on is the concept of what’s now known as altruistic punishment.

The idea behind altruistic punishment is that uninvolved third party witnesses will punish defectors, cheaters, and free riders. They, or a significant number of them, won’t let a self-dealer or serious violator of social norms get away with social or moral transgressions, even if they have to pay a price themselves. And they won’t even let those who are indifferent to the violators get away with such transgressions either. Many people are still angry at the 38 people who allegedly witnessed the rape and murder of Kitty Genovese from the safety of their apartments in 1964 and didn’t bother to call the police.

More here.

Westernistic civilization

Instead of subscribing to the ideology that views the world through the “hard” lens of conflict between “the West and the Rest”, let us try a theory that looks at the world through the “soft” lens of “westernistic” civilization. An analogy between Hellenistic and westernistic civilization is helpful. In much the same way as classical Greece cannot be equated with Hellenistic civilization, the modern West is not the same as westernistic civilization. Until the fourth century BC and the twilight of city-states, classical Greek civilization remained within the territorial borders of the southern Balkans. Similarly, the civilization of Latin Christianity or the traditional West was firmly rooted in the western countries of Europe until the advent of modernity. The Hellenistic civilization of Alexander the Great emanated from classical Greek heritage, but territorially it stretched across the entire world then known to man, reaching to Egypt and India, Tajikistan and Afghanistan. In the same way, the westernistic civilization that has arisen from modern western heritage comprises the entire known world today.

more from Ales Debeljak at Eurozine here.

Not In My Name: Islam, Pakistan and the Blasphemy Laws

Mehdi Hasan in the Huffington Post:

You could not make it up. An 11-year-old Christian girl in Pakistan with Down's syndrome is in police custody, and could face the death penalty, for allegedly burning pages from the Quran.

The girl, who has been identified as Rifta Masih, was arrested on blasphemy charges and is being held in Islamabad pending a court appearance later this month. She was detained by police after an angry mob turned up at her family's single-roomed home in a poor district on the outskirts of the Pakistani capital.

“About 500-600 people had gathered outside her house in Islamabad, and they were very emotional, angry, and they might have harmed her if we had not quickly reacted,” Pakistani police officer Zabi Ullah told reporters.

“Harmed her”? Really? I mean, really? What on Allah's earth is wrong with so many self-professed Muslims in the self-styled Islamic Republic of Pakistan? Have they taken leave of their morals as well as their senses?

More here.