revisiting ‘pickup on south street’

Willie Osterweil at The Paris Review:

If you’re feeling polemical, you might argue that all Hollywood cinema is anticommunist: as the central commodity of the culture industry, big studio movies are designed for nothing so much as circulating and producing capital. But if we want to talk Communist with a capital C—you know, where the C stands for USSR—then Hollywood’s anticommunist films are a special and specific genre of flops and farces, a cinematic tradition featuring such classics as I Married a Communist, The Red Menace, Assignment: Paris, and My Son John. (Spoiler: John’s a goddamned Bolshie!)

The fifties saw the heyday of anticommie popcorn flicks. True, the silent era had its Bolshevism on Trial and Red Russia Revealed, and the eighties met with Soviet invasion in Red Dawn and some serious anti-Vietcong violence in the later Rambo movies. But when you wanna see a square-jawed U.S. American call a sweaty creep a commie and slug him in the mouth, it’s the postwar period you turn to. Though most of the era’s anticommunist films were too vulgar and outlandish to survive as anything other than hilarious artifacts—or as evidence of the ever-imperialist, state-serving agenda of the Hollywood apparatus, depending on which side of the bed you woke up on—a few, Robert Aldrich’s Kiss Me Deadly and Samuel Fuller’s Pickup on South Street among them, are truly great works of cinema. (Granted, 1982’s Rambo: First Blood—if you excise the last four minutes, when Sly gives a speech crying about how hippies, those “maggots at the airport,” spit on him—is also pretty great.) Both are tense, pulpy noirs, both center around the sale of nuclear secrets, and both take anticommunism more as a genre than a narrative drive. But only one, Pickup on South Street (1953), is being revived this week at Film Forum, in New York.

more here.



The many layers of Mark Bradford’s work

Calvin Tomkins at The New Yorker:

Bradford had recently installed a major sculpture at the Los Angeles International Airport, and we went to see it the next morning. On the way, he told me that he used to get his mother to drive him to LAX so that they could have dinner there and watch the planes take off and land. Later, as a teen-ager, he’d skip school and take the bus. “I loved the old Pan Am terminal, the international one,” he said. “I’d see a plane land from Switzerland or Ghana or someplace, and I’d run to where the passengers were getting off and pretend to be getting off with them. First time I ever heard foreign languages. I’d push the Smarte Cartes back into the terminal and collect a dollar each for my lunch money.”

His sculpture was clearly visible from the main entrance to one of the international terminals—a four-sided wooden structure, suspended from the skylight at the far end of the departure hall. Bradford called Sarah Cifarelli, the airport’s art manager, on his mobile phone; while we waited for her to arrive, he said that he’d wanted to make something that felt both ancient and modern—a cross between a medieval bell tower and “that thing for sports events, the Jumbotron.” The sculpture, called “Bell Tower,” was made of aluminum, paper, and weathered plywood sheets, stained and graffitied from years of being used as barricades. (He’d salvaged them from construction sites.)

more here.

A Practical Vision of a More Equal Society

Thomas Piketty in the New York Review of Books:

Anthony Atkinson occupies a unique place among economists. During the past half-century, in defiance of prevailing trends, he managed to place the question of inequality at the center of his work while demonstrating that economics is first and foremost a social and moral science. In his new book, Inequality: What Can Be Done?—more personal than his previous ones and wholly focused on a plan of action—he provides us with the broad outlines of a new radical reformism.

There’s something reminiscent of the progressive British social reformer William Beveridge in Atkinson’s reformism, and the reader ought to enjoy his way of presenting his ideas. The legendarily cautious English scholar reveals a more human side, plunges into controversy, and sets forth a list of concrete, innovative, and persuasive proposals meant to show that alternatives still exist, that the battle for social progress and equality must reclaim its legitimacy, here and now. He proposes universal family benefits financed by a return to progressive taxation—together, they are intended to reduce British inequality and poverty from American levels to European ones.

He also argues for guaranteed public-sector jobs at a minimum wage for the unemployed, and democratization of access to property ownership via an innovative national savings system, with guaranteed returns for the depositors. There will be inheritance for all, achieved by a capital endowment at age eighteen, financed by a more robust estate tax; an end to the English poll tax—a flat-rate tax for local governments—and the effective abandonment of Thatcherism. The effect is exhilarating. Witty, elegant, profound, this book should be read: it brings us the finest blend of what political economy and British progressivism have to offer.

More here.

Anti-ageing pill pushed as bona fide drug

Erika Check Hayden in Nature:

Doctors and scientists want drug regulators and research funding agencies to consider medicines that delay ageing-related disease as legitimate drugs. Such treatments have a physiological basis, researchers say, and could extend a person’s healthy years by slowing down the processes that underlie common diseases of ageing — making them worthy of government approval. On 24 June, researchers will meet with regulators from the US Food and Drug Administration (FDA) to make the case for a clinical trial designed to show the validity of the approach.

Current treatments for diseases related to ageing “just exchange one disease for another”, says physician Nir Barzilai of the Albert Einstein College of Medicine in New York. That is because people treated for one age-related disease often go on to die from another relatively soon thereafter. “What we want to show is that if we delay ageing, that’s the best way to delay disease.” Barzilai and other researchers plan to test that notion in a clinical trial called Targeting Aging with Metformin, or TAME. They will give the drug metformin to thousands of people who already have one or two of three conditions — cancer, heart disease or cognitive impairment — or are at risk of them. People with type 2 diabetes cannot be enrolled because metformin is already used to treat that disease. The participants will then be monitored to see whether the medication forestalls the illnesses they do not already have, as well as diabetes and death.

More here.

Wednesday Poem

The First Man on the Moon
.

now i’ve been here a week. a year, a day –
seen all there is to be seen: dust, dust, dust –
eyes now only for the blue balloon of the earth
hanging above me, teeming with living life,
strangely enough, i don’t often think of my wife,
but could cry again and again
over that robot in denmark
who said to journalists:
i don’t think it’d be at all a bad thing
to be a human being
.

by C. Buddingh'
from Gedichten 1938-1978
publisher: De Bezige Bij, Amsterdam, 1971
.

Read more »

All Possible Humanities Dissertations Considered as Single Tweets

Stephen Burt in The New Yorker:

The name we’ve been using for this stuff is anachronistic. Here’s a better name.

Truth-claims from our discipline cannot be properly judged without expertise that almost no one in our discipline has.

Our discipline should study its own disciplinary formation; that study proves that our discipline shouldn’t exist.

An old, prestigious thing still deserves its prestige, but for a heretofore undiscovered reason.

This feature of modern life began slightly earlier than you thought, and my single text proves it.

Please adopt my buzzword.

This author, normally seen as opposed to certain bad things, in fact supported them without realizing it.

This author, normally seen as naïve or untrained, is in fact very self-aware, and hence more like us.

That obscure, élite thing once had a popular audience.

This short text, seen rightly, reveals the contradictions of a whole culture.

A supposedly fanatical, militant movement that readers have been taught to fear makes perfect sense to those who support it.

The true meaning of a famous work can be recovered only through juxtaposition with this long obscure historical moment or artifact.

I found a very small thing in an archive, but I can relate it to a big thing.

Read the full post here.

Tuesday, June 16, 2015

Do Corporations Have Minds?

Joshua Knobe in the New York Times:

Back in 2007, Viacom filed a copyright infringement lawsuit. It alleged that the defendants had created a website that was hosting copyrighted Viacom videos, including everything from “The Colbert Report” to “SpongeBob SquarePants.” But that was not all. Viacom made a series of allegations about the defendants’ mental states. It alleged that they specifically “intended” to host these illegal videos, that they were doing so “knowingly” and with “brazen disregard” for the law.

So far, all of this may seem perfectly straightforward. But here is the surprising part. The defendants in the suit were not individual human beings. They were YouTube and its parent company, Google. In other words, the entities that were alleged to have all of these intentions and attitudes were actually corporations.

Cases like this one have long puzzled philosophers. In everyday speech, it seems perfectly correct to say that a corporation can “intend,” “know,” “believe,” “want” or “decide.” Yet, when we begin thinking the matter over from a more theoretical standpoint, it may seem that there is something deeply puzzling here. What could people possibly mean when they talk about corporations in this way?

More here.

Nabokov in the Utah mountains

Robert Roper at The American Scholar:

The slender Russian man is on vacation. He has an arrogantly beautiful face and is accompanied by an oddly tall little boy, as he stalks up and down a trout stream in the Wasatch Range, a few miles east of Sandy, Utah. They deploy butterfly nets. “I walk from 12 to 18 miles a day,” he writes in a letter mailed in July of 1943, “wearing only shorts and tennis shoes … always a cold wind blowing in this particular cañon.”

The eccentric Russian novelist chasing butterflies—the signature image of Vladimir Nabokov in America—came to beguile millions. “A man without pants and shirt” was how a local teenager, John Downey, saw him when encountering Nabokov on the Little Cottonwood Canyon road. He was “dang near nude,” Downey recalled, and when he asked Nabokov what he was up to, the stranger would not explain at first.

Nabokov was 44. That November he would have his two front teeth removed, all the rest soon following. (“My tongue is like someone who comes home and finds all his furniture gone.”) He smoked five packs of cigarettes a day and had a tubercular look. For the previous 20 years he had lived on an edge—he was an artist, after all, and deprivation went with the territory. His wife worked odd jobs to support them, and neither of them had ever been much for cooking or packing on the pounds.

more here.

Letting go of Philip Roth

J.C. Hallman at The Baffler:

The phenomenon of Philip Roth’s “retirement”—and that seems to be what it is now, a phenomenon—is not about a writer’s vanity, an ego grown so massive it’s like a publicity black hole sucking up limelight that might have shined warmly on other equally deserving authors. Nor is it about an inability to shut up, even though Roth admitted that his decision to quit writing, announced abruptly in 2012, had triggered in him an impulse to “chatter.” (Almost everyone has taken this quotation out of context, and I have too, which means that “chatter” may be on its way to becoming one of those offhand remarks that gets used to make a famous person appear to mean the opposite of what he probably did mean.)

No, Roth’s announcement that he would leave the literary stage, followed by his conspicuous failure to do so in favor of a series of curtain calls, is about us—Roth’s audience, a community of readers. We’re the ones endlessly fascinated by Roth’s penchant to pontificate about himself in public, from an interview with the BBC aired last spring (titled “Philip Roth Unleashed”) to a promised appearance on The Colbert Report (reportedly scheduled for last summer, but apparently scrapped). Through it all, Roth continues to insist that he’s retreating into full Garbo mode. “You can write it down,” he told a reporter last May after a star turn at the 92nd Street Y. “This was absolutely the last public appearance I will make on any public stage, anywhere”—this just a week before collecting an award from the Yaddo writer’s retreat and two weeks before accepting an honorary doctorate at the conservative Jewish Theological Seminary.

more here.

the early meteorologists

Patricia Fara at Literary Review:

The Greek origins of the word 'meteorology' indicate that trying to foretell the weather is nothing new. In Aristotle's model, the universe was divided, like a two-tiered gobstopper, into the heavens – where the planets serenely rotated in orderly circles – and the central chaotic terrestrial sphere, the skies of which extended out as far as the moon's orbit, home not only to birds, clouds and rainbows, but also to meteors. Like other transient phenomena, such as comets, meteors were often interpreted as messages from God. Only gradually were they relegated from terrestrial to astronomical science.

Following the invention of thermometers, barometers and other measuring instruments, Enlightenment gentlemen attempted to rationalise the cosmos, faithfully recording everything and anything that might help establish regular patterns of behaviour. To their humiliation, despite investing in the latest equipment, these meticulous observers were outperformed by illiterate farmers and sailors drawing on decades of experience. Even frogs and farm animals seemed more prescient, apparently possessing a sixth sense about trouble lying ahead. Self-styled experts remained fallible, especially when viewed in hindsight: in 1938, a British steam engineer correctly calculated that industrially generated carbon dioxide would cause the earth's temperature to rise, but concluded that the extra few degrees would help ward off an impending ice age.

more here.

The Downside of Treadmill Desks

Gretchen Reynolds in The New York Times:

Treadmill desks are popular, even aspirational, in many offices today since they can help those of us who are deskbound move more, burn extra calories and generally improve our health. But an interesting new study raises some practical concerns about the effects of walking at your workspace and suggests that there may be unacknowledged downsides to using treadmill desks if you need to type or think at the office. The drumbeat of scientific evidence about the health benefits of sitting less and moving more during the day continues to intensify. One study presented last month at the 2015 annual meeting of the American College of Sports Medicine in San Diego found that previously sedentary office workers who walked slowly at a treadmill desk for two hours each workday for two months significantly improved their blood pressure and slept better at night. But as attractive as the desks are for health reasons, they must be integrated into a work setting, so it seems sensible that they should be tested for their effects on productivity. But surprisingly little research had examined whether treadmill desks affect someone’s ability to get work done.

So for the new study, which was published in April in PLOS One, researchers at Brigham Young University in Provo, Utah, recruited 75 healthy young men and women and randomly assigned them to workspaces outfitted with a computer and either a chair or a treadmill desk. The treadmill desk was set to move at a speed of 1.5 miles per hour with zero incline. None of the participants had used a treadmill desk before, so they received a few minutes of instruction and practice. Those assigned a chair were assumed to be familiar with its use.

More here.

Tuesday Poem

Snow

Lost in an infinity of misted mirrors
among shelves of Optrex,

Pepsodent and pink calamine,
I dunked net petticoats into sugar solution

to froth out the nylon frills of that
first dance dress.

Hanging it to drip-dry over
the porcelain sink I squeezed

obdurate adolescent flesh
into a rubber roll-on that chaffed my thighs,

attaching 15 deniers
to silk suspenders,

before turning to straighten
the wayward seam along

a newly shaved leg and wriggle
into my strapless Wonderbra.

Then spitting into the little
Bakelite box to soften the black wax,

a flick of mascara
applied with a tiny brush.

Backcombing my hair, the lady
on the Elnett tin of hairspray

smiled with a poise
I could never muster.

So much preparation
to end alone

beneath the rotating mirror-ball,
as the last waltz faded

and flakes of light spangled
my bare arms in falling snow.

.
by Sue Hubbard
from Ink Sweat and Tears

Dinomania: the story of our obsession with dinosaurs

Tom Holland in The Guardian:

Dinosaurs, though, have never just been for children. From the time when they were first classified, back in the days of the Napoleonic wars, up to the present, they have served as the focus for fittingly weighty themes: industrialisation, national rivalries, and survival and extinction. Their fascination reaches deep indeed.

There is certainly nothing new about the instinct to marvel at giant fossils, nor to dream of putting flesh back on their bones. At the height of the Roman empire, during the reign of Tiberius, a devastating earthquake – “the worst in human memory”, according to Pliny the Elder – exposed a series of colossal skeletons. The locals, convinced that these were the remains of ancient heroes, were reluctant to desecrate their graves; but knowing of the emperor’s interest in such matters, they reverently sent him a single, massive tooth. Tiberius, eager to see with his own eyes just how large the man from whom it came would have stood, commissioned a mathematician to calculate the hero’s proportions, and then to build him a scale model. The tooth – which we are informed was over a foot long – was not, of course, human, but most likely from a mastodon. Elsewhere, though, in lands where rocks bore the fossils of dinosaurs, ancient peoples were perfectly capable of recognising them as the remains of non-human creatures. In China, they were identified as dragon bones, while in North America, as the historian Adrienne Mayor has convincingly demonstrated, tales told by the Plains Indians of the Thunder Bird were inspired at least in part by the spectacle of pterosaur fossils. There seems never to have been a time nor a culture in which mysterious bones did not captivate those who beheld them.

Even in early 19th-century Britain, where dinosaurs were first dated correctly and classified, flights of imagination were at least as important to the project of conceptualising them as painstaking anatomical study. The rocks of England were not, as those of Alberta would prove to be, rich in articulated skeletons. When, in 1824, an assortment of fossilised bones and teeth discovered in the depths of various Oxfordshire quarries were assembled in one place, it was not immediately apparent from what kind of animal they had come. William Buckland, a clergyman who was also a professor of geology at Oxford University, identified them as having belonged to a massive lizard, which produced the decorous Greek translation Megalosaurus. Meanwhile, at much the same time, other no less revelatory finds were being made along the length of the south coast. In Sussex, a local doctor uncovered the fragmentary remains of what appeared to have been two more species of colossal, extinct land-reptiles: Iguanodon and Hylaeosaurus, as they were named. Simultaneously, in Lyme Regis, a professional fossil hunter named Mary Anning was busy demonstrating that the seas and skies of prehistoric England had been no less the haunt of remarkable monsters than had been its swamps. Many, when they inspected the long-necked plesiosaurs, the shark-like ichthyosaurs and the bat-like pterosaurs discovered by Anning, found that only the language of epic was adequate to the primordial vistas that opened up before their gaze. The family likeness of Anning’s monsters, so her biographer declared rhapsodically, was “to the evil spirits who beset Aeneas or Satan in an old illustrated Virgil or Paradise Lost”.

Read the rest here.

Black Like Her

Jelani Cobb in The New Yorker (Photo by Colin Mulvany/The Spokesman Review via AP):

Among African-Americans, there is a particular contempt, rooted in the understanding that black culture was formed in a crucible of degradation, for what Norman Mailer hailed as the “white Negro.” Whatever elements of beauty or cool, whatever truth or marketable lies there are that we associate with blackness, they are ultimately the product of a community’s quest to be recognized as human in a society that is only ambivalently willing to see it as such. And it is this root that cannot be assimilated. The white Negroes, whose genealogy stretches backward from Azalea through Elvis and Paul Whiteman, share the luxury of being able to slough off blackness the moment it becomes disadvantageous, cumbersome, or dangerous. It is an identity as impermanent as burnt cork, whose profitability rests upon an unspoken suggestion that the surest evidence of white superiority is the capacity to exceed blacks even at being black. The black suspicion of whites thus steeped in black culture wasn’t bigotry; it was a cultural tariff—an abiding sense that, if they knew all that came with the category, they would be far less eager to enlist.

But this is precisely what makes the Dolezal deception complicated. Artists like Eminem and Teena Marie, white people who were by and large accepted by black people as a legitimate part of black cultural life, nonetheless had to finesse a kind of epidermal conflict of interest. Irrespective of their sincerity, a portion of their profitability lay in their status as atypically white. Dolezal’s transracialism was imbued with exactly the opposite undertaking. She passed as black and set about shouldering the inglorious, frustrating parts of that identity—the parts that allocate responsibility for what was once called “uplifting the race.” It’s an aspect of her story that at least ought to give her critics—black ones, particularly—a moment of pause.

Dolezal is, like me, a graduate of Howard University, a place where the constellation of black identities and appearances is so staggeringly vast as to ridicule the idea that blackness could be, or ever has been, any one thing. What I took from Howard, besides that broadened sense of a world I’d presumed to know, was an abiding debt to those who’d fought on its behalf and a responsibility to do so for those who came afterward. It’s easy to deride Dolezal’s dishonesty—to ridicule her hoax as a clever means of sidestepping the suspicion with which white liberals are commonly greeted—until we reflect on a photograph of Walter White, the aptly named man who served as the second black president of the N.A.A.C.P. Or one of Louis T. Wright, who served as the national chairman of the N.A.A.C.P. board during the Great Depression. In the nineteen-twenties, amid a feud with the organization, the black nationalist Marcus Garvey criticized the N.A.A.C.P. for being an organization whose black and white members were essentially indistinguishable.

More here.

No Reason To Blame Liberals (Or, The Unbearable Lightness of Perversity Arguments)

Margo Schlanger reviews Naomi Murakawa's The First Civil Right: How Liberals Built Prison America, in The New Rambler:

Moving then to what Murakawa demonstrates about federal law-and-order politics, I think it is useful to reslice her arguments a bit crosswise. Murakawa is most successful, but least novel, when she argues that liberals, or, at least, Democrats, have contributed importantly to mass incarceration. President Clinton’s 1994 Crime Bill was undoubtedly a highly punitive intervention in federal crime policy. And it must have had effects (though they are hard to quantify) on state incarceration levels too: All those thousands of cops must have arrested people. And federal truth-in-sentencing incentives had at least a marginal effect of increasing incarceration in a number of states. (Although abundant recent work undermines the claim that increased sentences, rather than changes in other aspects of the criminal justice process, have been the main drivers of increased incarceration in recent decades.) The 1984 Sentencing Reform Act was, likewise, a consequentially punitive change in federal policy, and likewise passed with Democratic support. But we didn’t need a new book to make those well-recognized points. The contribution of The First Civil Right is not to argue that President Clinton’s 1994 Crime Bill was punitive and problematic, that Clinton was a triangulator, or more generally that sometimes Democrats aren’t very liberal. Nor is Murakawa’s novel contribution the argument that liberals in 1984 got rolled, when compromise after compromise led Ted Kennedy and his fellow Democrats to acquiesce to a punitive federal sentencing regime change.

Rather, Murakawa’s contribution—the attention-grabbing claim at the core of her book—is her argument that liberalism has built modern American mass incarceration. Her major point along these lines is that the liberal preoccupation with using fair, non-racist procedures has contributed importantly to the growth of the carceral state, taming reform urges, entrenching the punitive regime. This argument sounds in perversity—on Murakawa’s account, liberalism’s attempt to improve racial justice using procedural tools not only fails, it is counter-productive, entrenching and worsening the system’s inequities.

Albert Hirschman, in The Rhetoric of Reaction: Perversity, Futility, Jeopardy, analyzed the appeal of the perversity argument as a reactionary trope: “What better way to show him up as half foolish and half criminal than to prove that he is achieving the exact opposite of what he is proclaiming as his objective?”

More here.

Artless: Why do intelligent people no longer care about art?

Michael Lind tries to make the case in The Smart Set:

There is still an art world, to be sure, in New York and London and Paris and elsewhere. But it is as insular and marginal as the fashion world, with a similar constituency of rich buyers interacting with producers seeking to sell their wares and establish their brands. Members of the twenty-first century educated elite, even members of the professoriate, will not embarrass themselves if they have never heard of the Venice Biennale.

Many of the Arts Formerly Known as Fine seem to have lost even a small paying constituency among rich people, and live a grant-to-mouth existence. In the old days, bohemian painters lived in garrets and tried to interest gallery owners in their work. Their modern heirs — at least the ones fortunate to have university jobs — can teach classes and apply for grants from benevolent foundations, while creating works of art that nobody may want to buy. Born in bohemia, many aging arts have turned universities into their nursing homes.

What happened? How is it that, in only a generation or two, educated Americans went from at least pretending to know and care about the fine arts to paying no attention at all?

The late Hilton Kramer, editor of The New Criterion, blamed the downfall of the fine arts on purveyors of Pop Art like Andy Warhol and Jeff Koons, who replaced Arnoldian “high seriousness” and the worship of capital-c Culture with iconoclasm, mockery, and irony. A Great Tradition of two millennia that could be felled by Andy Warhol must have been pretty feeble! But the whole idea of a Phidias-to-Pollock tradition of Great Western Art was unhistorical. The truth is that the evolution (or if you like the degeneration) from Cezanne to Warhol was inevitable from the moment that royal, aristocratic and ecclesiastical patronage was replaced by the market.

Having lost their royal and aristocratic patrons, and finding little in the way of public patronage in modern states, artists from the 19th century to the 21st have sought new patrons among the wealthy people and institutions who have formed the tiny art market. It was not the mockery of Pop artists but the capitalist art market itself which, in its ceaseless quest for novelty, trivialized and marginalized the arts.

More here.

Monday, June 15, 2015