White supremacy is everywhere: How do we fight a concept that has so thoroughly permeated our politics and culture?
Anis Shivani in Salon:
In the first part of this series, I focused on some of the history of white supremacy, particularly its late 20th-century versions, which continue to have so much influence today upon the current alt-right movement. It’s important to understand this history — some of which enters into truly exotic terrain — to understand the continuity of ideas, and to realize that we are not facing anything really new in the current manifestation of white supremacy.
But there’s a more mundane side to white supremacy, which deserves to be studied with as much attention: the way in which white supremacy works in and through institutions that we otherwise think of as legitimate to the core, and even essential to the workings of liberal democracy. If we explore how this has occurred recently, then we can no longer push white supremacy aside as an ideology that can be prevented from infecting so-called “mainstream” institutions. I’m thinking primarily of political parties, but once we admit that white supremacy is a fundamental influence on how parties reinvent and calibrate themselves, then this necessarily sweeps the social organism as a whole into the indictment.
White supremacy implies a certain logic that is inimical to that of the Enlightenment (the foundation of modern democracy). It is no coincidence that much of contemporary white supremacy continues to focus on the Illuminati and Freemasons as the disseminators of “secular humanism” (i.e., the core values of the Enlightenment), or that conspiracy theory mines the same territory when it takes on “The Protocols of the Elders of Zion” (attacked as a worldwide conspiracy to bring about godless materialism) or such obsessions as the Bilderberg Group, the Trilateral Commission, the Council on Foreign Relations and the rest of the institutions associated with the New World Order (largely meaning the forces of globalization). Against the Enlightenment, which is said to lead to the weakening of the nation as an embodiment of the pure idea of race, the white supremacist insists on separation of races as his natural right. Against mongrelization, the white supremacist desires purity.
More here.
Wednesday Poem
In Manufacturing
If one of us, because his wife had left,
Pulled a base tube from the press too late or soon,
And another, because a child had cried
All night, brought her caliper too quickly
Around its flange, a hundred or more tubes
Might come off the line thin-walled, out of round.
And if the noise drowned a loud joke to drone,
When you stopped to hear, the saw would kick
A crimped or kinked tube you could never catch
Until it swung toward the next machine.
Like medicine, the line was meant to work
One way. Nothing went better than planned
Though the plan's one phrase repeated again
And again, while the tubes, soft from furnaces,
Diminished in the dies, a hot and balky work
I would not choose for a daughter or son,
Though part of the dullest business is fun.
And on dog days all our horseplay was ice
That set us running between cooling vats
Where the tubes went tallowy and vanished
And erupted again in cowls of green smoke
While the cranes roared overhead like angels.
And the boss would lay the gospel down
Because someone complained from far away,
So to this day when I stand in a green pool
Under a clanking air conditioner
Or read how a plane goes mysteriously down
I wonder if it might have been us or some
Other crew who did or did not do what
Now is certain, or if the error is even human.
by Rodney Jones
from Transparent Gestures
Houghton Mifflin, 1989
A Generation of Sociopaths – how Trump and other Baby Boomers ruined the world
Jane Smiley in The Guardian:
The day before I finished reading A Generation of Sociopaths, who should pop up to prove Bruce Cannon Gibney’s point, as if he had been paid to do so, but the notorious Joe Walsh (born 1961), former congressman and Obama denigrator. In answer to talkshow host Jimmy Kimmel’s plea for merciful health insurance, using his newborn son’s heart defect as an example, Walsh tweeted: “Sorry Jimmy Kimmel: your sad story doesn’t obligate me or anyone else to pay for somebody else’s health care.” Gibney’s essential point, thus proved, is that boomers are selfish to the core, among other failings, and as a boomer myself, I feel the “you got me” pain that we all ought to feel but so few of us do. Gibney is about my daughter’s age – born in the late 1970s – and admits that one of his parents is a boomer. He has a wry, amusing style (“As the Boomers became Washington’s most lethal invasive species … ”) and plenty of well parsed statistics to back him up. His essential point is that by refusing to make the most basic (and fairly minimal) sacrifices to manage infrastructure, address climate change and provide decent education and healthcare, the boomers have bequeathed their children a mess of daunting proportions. Through such government programmes as social security and other entitlements, they have run up huge debts that the US government cannot pay except by, eventually, soaking the young. One of his most affecting chapters is about how failing schools feed mostly African American youth into the huge for-profit prison system. Someday, they will get out. There will be no structures in place to employ or take care of them.
The boomers have made sure that they themselves will live long and prosper, but only at the expense of their offspring. That we are skating on thin ice is no solace: “Because the problems Boomers created, from entitlements on, grow not so much in linear as exponential terms, the crisis that feels distant today will, when it comes, seem to have arrived overnight.” As one who has been raging against the American right since the election of Ronald Reagan, as someone with plenty of boomer friends who have done the same, I would like to let myself off the hook, but Gibney points out that while “not all Boomers directly participated, almost all benefited; they are, as the law would have it, jointly and severally liable”.
More here.
Century-old tumours offer rare cancer clues
Heidi Ledford in Nature:
Deep in the basement archives of London's Great Ormond Street Hospital for Children reside the patient records that cancer researcher Sam Behjati hopes will put the hospital's past to work for the future. On 2 May, he and his colleagues published the result: DNA sequences from the genomes of three childhood tumour samples collected at the facility almost a century ago. Those historic cells help to address a modern problem: the small number of tumour samples from rare cancers that are available for researchers to sequence. Behjati knows this problem well. At the Wellcome Trust Sanger Institute in Hinxton, UK, he tracks the genomic miswiring that can lead to rare childhood cancers. And as someone who also treats patients, he has been frustrated by the paucity of evidence backing up much of his practice. “The treatment regimens for children with rare cancers are essentially made up,” Behjati says. “If you’ve got three or four patients nationally, how are you ever going to conduct a reasonable clinical trial?” To expand the pool of samples that he could sequence, he decided in 2014 to harness advances in genome sequencing that had already made it possible to sequence DNA from pathology samples a few decades old. The hospital's 165-year archive of samples and patient records provided the opportunity to see how far back in time he could go.
The work highlights the wealth of material that is available in such archives, says Danielle Carrick, a programme director at the US National Cancer Institute in Rockville, Maryland. Mining such archives can expand the options for studying rare conditions and understudied ethnic populations, she notes, and make large, population-scale studies possible. Researchers have analysed DNA from much older specimens: fragments of genome sequence have been used to study ancient human populations from hundreds of thousands of years ago. But DNA tends to degrade over time, and cancer researchers need high-quality sequences to pinpoint the many individual mutations that can contribute to tumour growth.
More here.
Tuesday, May 16, 2017
Have You Ever Had an Intense Experience of Mystical Communion with the Universe, Life, God, etc?
Louis Kahn’s work is accessible, minimal, simple, solid, systematic, and self-evident. It is also the exact opposite.
Thomas de Monchaux in n + 1:
Here are two things to know about architects. First, they are fastidious and inventive with their names. Frank Lincoln Wright was never, unlike Sinatra, a Francis. He swapped in the Lloyd when he was 18—ostensibly in tribute to his mother’s surname on the occasion of her divorce, but also to avoid carrying around the name of a still more famous man, and for that nice three-beat meter, in full anticipation of seeing his full name in print. In 1917, Charles-Edouard Jeanneret-Gris—who is to modern architecture what Freud is to psychoanalysis—was given the byline Le Corbusier (after corbeau, crow) by his editor at a small journal, so that he could anonymously review his own buildings. The success of the sock puppet critic meant that after the critiques were collected into a book-length manifesto, the nom-de-plume eventually took over Jeanneret-Gris’ architect persona, as well. Ludwig Mies—the inventor of the glass-walled skyscraper—inherited an unfortunate surname that doubled as a German epithet for anything lousy or similarly defiled. He restyled himself Miës van der Rohe—vowel-bending heavy-metal umlaut and all—with the Dutch geographical tussenvoegsel “van” from his mother’s maiden name to add a simulation of the German nobiliary particle, von. Ephraim Owen Goldberg became Frank Gehry.
Second, all architects are older than you think. Or than they want you to think. Unlike the closely adjacent fields of music and mathematics, architecture has no prodigies. Design and construction take time. At 40, an architect is just starting out. Dying at 72 in architecture is like dying at 27 in rock and roll.
More here.
The Myth That Humans Have Poor Smell Is Nonscents
Ed Yong in The Atlantic:
For years, John McGann has been studying the science of smell by working with rats and mice at Rutgers University. But when he turned his attention to humans, he was in for a shock. The common wisdom is that our sense of smell stinks, compared to that of other mammals. McGann had always suspected that such claims were exaggerated, but even he wasn’t prepared for just how acute his volunteers’ noses were. “We started with an experiment that involved taking two odors that humans can’t tell apart—and we couldn’t find any,” he says. “We tried odors that mice can’t tell apart and humans were like: No, we’ve got this.”
In a new paper, McGann joins a growing list of scientists who argue that human olfaction is nothing to sniff at. We can follow smell trails. We discriminate between similar odors and detect a wide range of substances, sometimes more sensitively than rodents and dogs can. We live in a rich world of scents and sensibility, where odors deeply influence our emotions and behavior. “I was taught in school that human olfaction isn’t a great sense,” he says. “It’s taught in introductory psychology courses and it’s in the textbooks. But this whole thing is a crazy myth.”
For this crime against olfaction, McGann accuses Paul Broca, a 19th-century French neuroscientist. Broca was a materialist who argued that the mind arose from the brain—a position that brought vigorous opposition from the Catholic Church, which believed in a separate and disembodied soul. This intellectual battle colored Broca’s interpretation of the brain.
For example, he noted that the lobes at the front of the human brain, which had been linked to speech and thought, are relatively bigger than those of other animals. By contrast, he noticed that our olfactory bulbs—a pair of structures that govern our sense of smell—are relatively smaller, flatter, and positioned less prominently.
More here.
The Fantastic Mr Feynman
Writers Should Maintain a Certain Distance from the World: Anita Desai
Veena Gokhale in The Wire:
Anita Desai, three-time Booker Prize nominee, winner of several prestigious awards and with more than 17 books to her credit, was awarded the International Literary Grand Prize at the Blue Metropolis Literary Festival in Montreal on April 29. The 10,000-Canadian-dollar prize has been awarded each year since 2000 to a world-renowned author in recognition of a lifetime of literary achievement. Former winners include Norman Mailer, Margaret Atwood, A.S. Byatt and Amitav Ghosh. Desai, known for books like Baumgartner’s Bombay, Clear Light of Day and In Custody, spoke to The Wire in Montreal.
You have been awarded the Blue Metropolis International Literary Grand Prize. You received the Benson Medal in 2003 and the Padma Bhushan in 2014. Do awards matter, or are they incidental to the writing?
This particular award tells me I have crossed a border and am now of an age where I can be given certain awards! (Smiles). Awards are certainly incidental. They are unexpected; they are not something you work towards, no.
What do you think is the purpose of literature? The worth of literature is being questioned these days, certainly here in Canada.
One works on two levels. At the subconscious level one is not working with an agenda, one is working out of a compulsion to tell your story, to put words on paper, to keep something from disappearing. And the joy of using language ought not to be forgotten.
On a conscious level, after you’ve written your work, sometimes it takes you by surprise. You say, oh, is that what it was about? At the end of the book you say, so that’s why it stayed in your mind for so long. What’s the reason for writing it? And invariably the reason is to tell the truth, in a somewhat sideways, somewhat subversive way. You don’t always manage to do that openly, face-to-face, you have to find a kind of a secret way.
More here.
Tuesday Poem
One of the Citizens
What we have here is a mechanic who reads Nietzsche,
who talks of the English and the French Romantics
as he grinds the pistons; who takes apart the Christians
as he plunges the tarred sprockets and gummy bolts
into the mineral spirits that have numbed his fingers;
an existentialist who dropped out of school to enlist,
who lied and said he was eighteen, who gorged himself
all afternoon with cheese and bologna to make weight
and guarded a Korean hill before he roofed houses,
first in east Texas, then here in North Alabama. Now
his work is logic and the sure memory of disassembly.
As he dismembers the engine, he will point out damage
and use, the bent nuts, the worn shims of uneasy agreement.
He will show you the scar behind each ear where they
put in the plates. He will tap his head like a kettle
where the shrapnel hit, and now history leaks from him,
the slow guile of diplomacy and the gold war makes,
betrayal at Yalta and the barbed wall circling Berlin.
As he sharpens the blades, he will whisper of Ruby and Ray.
As he adjusts the carburetors, he will tell you
of finer carburetors, invented in Omaha, killed by Detroit,
of deals that fall like dice in the world's casinos,
and of the commission in New York that runs everything.
Despiser of miracles, of engineers, he is as drawn
by conspiracies as his wife by the gossip of princesses,
and he longs for the definitive payola of the ultimate fix.
He will not mention the fiddle, though he played it once
in a room where farmers spun and curses were flung,
or the shelter he gouged in the clay under the kitchen.
He is the one who married early, who marshaled a crew
of cranky half-criminal boys through the incompletions,
digging ditches, setting forms for culverts and spillways
for miles along the right-of-way of the interstate;
who moved from construction to Goodyear Rubber
when the roads were finished; who quit each job because
he could not bear the bosses after he had read Kafka;
who, in his mid-forties, gave up on Sartre and Camus
and set up shop in this Quonset hut behind the welder,
repairing what come to him, rebuilding the small engines
of lawnmowers and outboards. And what he likes best
is to break it all down, to spread it out around him
like a picnic, and to find not just what's wrong
but what's wrong and interesting— some absurd vanity,
or work, that is its own meaning— so when it's together
again and he's fired it with an easy pull of the cord,
he will almost hear himself speaking, as the steel
clicks in the single cylinder, in a language almost
like German, clean and merciless, beyond good and evil.
by Rodney Jones
from Transparent Gestures
Houghton Mifflin, 1989
Thinking Machines
Matthew Parkinson-Bennett at the Dublin Review of Books:
Yet the fixation on transcending traditional human existence may have something to do with a failure to appreciate the richness and fullness of lived life. Kurzweil endorses a pathetically inadequate idea of gathering data on a person’s life – photographs, biographical detail, social media activity – with which to reconstruct their personality after death. How hellish an experience would that be, condemned to live forever only what portion of life could be skimmed off the surface and recorded as data? As O’Connell puts it, “Kurzweil’s vision of the future might be an attractive one if you already accept the mechanistic view of the human being.” It’s a vision which leaves little room for the “rich inner life” prized by David Chalmers, originator of the hugely influential “hard problem of consciousness”. With the hard problem, Chalmers throws a spanner in the works of the “mechanistic view of the human being” by showing that it cannot account for the qualitative dimension of subjective conscious experience – “the quality of deep blue, the sensation of middle C”.
This deep incomprehension extends of course to the political. When asked whether there isn’t a risk that only the wealthiest will have access to the benefits of transhumanist technology, one prominent advocate replies: “Probably the most extreme form of inequality is between people who are alive and people who are dead.” Is that the sort of person who will control the hardware on which our minds are to live in eternal simulation?
more here.
Against Willpower
Carl Erik Fisher in Nautilus:
Ideas about willpower and self-control have deep roots in western culture, stretching back at least to early Christianity, when theologians like Augustine of Hippo used the idea of free will to explain how sin could be compatible with an omnipotent deity. Later, when philosophers turned their focus away from religion, Enlightenment-era thinkers, particularly David Hume, labored to reconcile free will with the ascendant idea of scientific determinism. The specific conception of “willpower,” however, didn’t emerge until the Victorian Era, as described by contemporary psychology researcher Roy Baumeister in his book Willpower: Rediscovering the Greatest Human Strength. During the 19th century, the continued waning of religion, huge population increases, and widespread poverty led to social anxieties about whether the growing underclass would uphold proper moral standards. Self-control became a Victorian obsession, promoted by publications like the immensely popular 1859 book Self-Help, which preached the values of “self-denial” and untiring perseverance. The Victorians took an idea directly from the Industrial Revolution and described willpower as a tangible force driving the engine of our self-control. The willpower-deficient were to be held in contempt. The earliest use of the word, in 1874 according to the Oxford English Dictionary, was in reference to moralistic worries about substance use: “The drunkard … whose will-power and whose moral force have been conquered by degraded appetite.”
In the early 20th century, when psychiatry was striving to establish itself as a legitimate, scientifically based field, Freud developed the idea of a “superego.” The superego is the closest psychoanalytic cousin to willpower, representing the critical and moralizing part of the mind internalized from parents and society. It has a role in basic self-control functions—it expends psychic energy to oppose the id—but it is also bound up in wider ethical and value-based judgments. Even though Freud is commonly credited with discarding Victorian mores, the superego represented a quasi-scientific continuation of the Victorian ideal. By mid-century, B.F. Skinner was proposing that there is no internally based freedom to control behavior. Academic psychology turned more toward behaviorism, and willpower was largely discarded by the profession.
That might have been it for willpower, were it not for an unexpected set of findings in recent decades which led to a resurgence of interest in the study of self-control. In the 1960s, American psychologist Walter Mischel set out to test the ways that children delayed gratification in the face of a tempting sweet with his now-famous “marshmallow experiment.” His young test subjects were asked to choose between one marshmallow now, or two later on. It wasn’t until many years later, after he heard anecdotes about how some of his former subjects were doing in school and in work, that he decided to track them down and collect broader measures of achievement. He found that the children who had been better able to resist temptation went on to achieve better grades and test scores. This finding set off a resurgence of scholarly interest in the idea of “self-control,” the usual term for willpower in psychological research.
These studies also set the stage for the modern definition of willpower, which is described in both the academic and popular press as the capacity for immediate self-control—the top-down squelching of momentary impulses and urges. Or, as the American Psychological Association defined it in a recent report, “the ability to resist short-term temptations in order to meet long-term goals.”
More here.
Joan Didion’s 1970s notes on a journey south
Sarah Nicole Prickett at Bookforum:
The West in South and West is an old destination for anyone who's read any Didion. She never wrote the piece on the South, or any piece on the South, and though she did not write the piece she was assigned on the trial of Patty Hearst, she did eventually write an essay about her, collected in After Henry, and a somewhat personal history of California, called Where I Was From. We know she pays attention to snakes and likes gold silk organza. We have been told so often that she no longer has fixed ideas that it's anticlimactic to see how long-ago and odd these fixed ideas are, for instance an idea held by her middle school classmates: "We find Joan Didion as a White House resident / Now being the first woman president." Remembering only the "failures and slights and refusals" of her teenage years, she allows that, in fact, she was always an editor or a president, a member of all the right clubs, a recipient of more prizes and scholarships than her "generally undistinguished academic record" would seem to permit. (She adds proudly and a bit contradictorily, "merit scholarships only: I did not qualify for need.") She believed that she "would always go to teas," because she had not yet seen the termites in the teacups.
There are, as I learned at age twenty from Women in Love, "three cures for ennui: sleep, drink, and travel." By her own account and by the accounts of some who knew her in Los Angeles, Didion drank enough and took enough pills that I would believe she went to rehab when she said, in The White Album, that she had gone to a psych ward. (Dunne said the speed and the benzos as well as the barbiturates were prescribed for migraines, never mind the contraindication, and who knows, but I do think temporary insanity would have seemed less embarrassing to Didion, more appropriate to the period, than dipsomania.)
more here.
A Life of Thomas De Quincey
Nicholas Spice at the London Review of Books:
De Quincey’s size mattered to him. He was uncommonly small. But he was also uncommonly clever, and his ambitions were large. As a young man, he idolised Wordsworth and Coleridge, and then sought them out and tried to make them his friends. For a while they all got on, but then increasingly they didn’t. Wordsworth was in the habit of condescending to De Quincey, but Wordsworth condescended to most people and anyway condescending to De Quincey was hard to resist: ‘He is a remarkable and very interesting young man,’ Dorothy Wordsworth wrote, ‘very diminutive in person, which, to strangers, makes him appear insignificant; and so modest, and so very shy.’ ‘Little Mr De Quincey is at Grasmere … I wish he were not so little, and I wish he wouldn’t leave his greatcoat always behind him on the road. But he is a very able man, with a head brimful of information,’ Southey wrote. As relations soured, the belittlements grew sardonic: for Wordsworth, De Quincey was ‘a little friend of ours’; for Lamb, ‘the animalcule’; Dorothy and Mary Wordsworth took to calling him Peter Quince. Even his friends tended to diminish him: ‘Poor little fellow!’ Carlyle exclaimed to his wife, Jane, who mused: ‘What would one give to have him in a box, and take him out to talk.’
Scarcely surprising, then, that De Quincey was touchy, quick to detect a snub and fiercely proud. He claimed he first took opium as a palliative for toothache. But it isn’t hard to imagine that he used it to muffle his social discomfort, coming to depend on it as a way of sidestepping the world. By 1815, hunkered down among his books in Dove Cottage (known at the time as Town End, the lease of which De Quincey had taken over from the Wordsworths when they moved out), he was fully addicted. He was 29 and he was never to be free of the drug again.
more here.
Monday, May 15, 2017
perceptions
Henry Luther Threadgill. Pulitzer Prize-winning composer of In For a Penny, In For a Pound, 2016.
Thanks to Nadia Sirota for featuring Mr. Threadgill on an episode of Meet The Composer, her fabulous, award-winning podcast.
CATSPEAK
by Brooks Riley
Sunday, May 14, 2017
False Alarmism: Technological Disruption and the U.S. Labor Market, 1850–2015
Robert D. Atkinson and John Wu over at the Information Technology and Innovation Foundation:
It has recently become an article of faith that workers in advanced industrial nations are experiencing almost unprecedented levels of labor-market disruption and insecurity. From taxi drivers being displaced by Uber, to lawyers losing their jobs to artificial intelligence-enabled legal-document review, to robotic automation putting blue-collar manufacturing workers on unemployment, popular opinion is that technology is driving a relentless pace of Schumpeterian “creative destruction,” and we are consequently witnessing an unprecedented level of labor market “churn.” One Silicon Valley gadfly now even predicts that technology will eliminate 80 to 90 percent of U.S. jobs in the next 10 to 15 years.
As the Information Technology and Innovation Foundation has documented, such grim assessments are the products of faulty logic and erroneous empirical analysis, making them simply irrelevant to the current policy debate. (See: “Robots, Automation, and Jobs: A Primer for Policymakers.”) For example, pessimists often assume that robots can do most jobs, when in fact they can’t, or that once a job is lost there are no second-order job-creating effects from increased productivity and spending. But the pessimists’ grim assessments also suffer from being ahistorical. When we actually examine the last 165 years of American history, statistics show that the U.S. labor market is not experiencing particularly high levels of job churn (defined as new occupations being created while older occupations are destroyed). In fact, it’s the exact opposite: Levels of occupational churn in the United States—defined as the rates at which some occupations expand while others contract—are now at historic lows. The levels of churn in the last 20 years—a period of the dot-com crash, the financial crisis of 2007 to 2008, the subsequent Great Recession, and the emergence of new technologies that are purported to be more powerfully disruptive than anything in the past—have been just 38 percent of the levels from 1950 to 2000, and 42 percent of the levels from 1850 to 2000.
Other than being of historical interest, why does this matter? Because if opinion leaders continue to argue that we are in uncharted economic territory and warn that just about anyone’s occupation can be thrown on the scrap heap of history, then the public is likely to sour on technological progress, and society will become overly risk averse, seeking tranquility over churn, the status quo over further innovation.
Tides and Prejudice: Racial Attitudes During Downturns in the United States 1979-2014
Robert Johnson and Arjun Jayadev over at INET Economics:
What happens to racial prejudice during economic downturns? This paper analyzes white attitudes towards African Americans in the United States at different points in the business cycle from 1979 to 2014. Using a number of indicators of hostility towards African Americans available from the General Social Survey, we develop an indicator of racial prejudice. We combine this with data on unemployment from the Current Population Survey and find robust evidence that racial hostility, as measured by our indicator of prejudice, is countercyclical and rises during periods of higher unemployment for whites. Specifically, a one-standard-deviation increase in the unemployment rate experienced by whites is associated with a 0.03 to 0.05 standard deviation increase in the discrimination index, a magnitude comparable to one year less of education. We undertake a quantile regression to show that this effect is widespread across the distribution of prejudice and that, apart from those with initially low levels of prejudice, increasing own-group unemployment results in statistically significant increases of similar magnitude in prejudice across that distribution. Finally, we show that discrimination is robustly positively correlated with measures of life dissatisfaction, further underscoring the significance of periods of distress in generating racial hostility.
More here.
Has art ended again?
Owen Hulatt in Aeon:
Hegel was, in many ways, the father of what we now call the history of art. He gave one of the earliest and most ambitious accounts of art’s development, and its importance in shaping and reflecting our common culture. He traced its beginnings in the ‘symbolic art’ of early cultures and their religious art, admired the clarity and unity of the ‘classical’ art of Greece, and followed its development through to modern, ‘romantic’ art, best typified, he claimed, in poetry. Art had not just gone through a series of random changes: in his view, it had developed. Art was one of the many ways in which humanity was improving its understanding of its own freedom, and improving its understanding of its relationship to the world. But this was not all good news. Art had gone as far as it could go and stalled; it could, according to Hegel’s Lectures on Aesthetics (1835), progress no further:
the conditions of our present time are not favourable to art […] art, considered in its highest vocation, is and remains for us a thing of the past.
In 1835, Hegel claimed that art had ended. Almost exactly 10 years later, the German composer Richard Wagner premiered Tannhäuser in Dresden, the first of his great operas; the beginning in earnest of a career that would change musical composition forever. Less than a century after Hegel’s claim, the visual arts saw the onset of Impressionism, Cubism, Surrealism and Fauvism, among other movements, and literature, poetry and architecture were deeply changed by Modernism.
In 1964, the philosopher Arthur Danto attended an exhibition at the Stable Gallery in New York. He came across Andy Warhol’s artwork Brillo Boxes (1964) – a visually unassuming, highly realistic collection of plywood replicas of the cardboard boxes in which Brillo cleaning products were shipped. Danto left the exhibition dumbstruck. Art, he was convinced, had ended. One could not tell the artworks apart from the real shipping containers they were aping. One required something else, something outside the artwork itself, to explain why Warhol’s Brillo Boxes were art, and Brillo boxes in the dry goods store were not. Art’s progress was over, Danto felt; and the reign of art theory had begun.
More here.
Radicalism Begins in the Body
Junot Diaz and Samuel Delany in Boston Review:
JD: People have called you a sex radical. What do you suppose they mean? What does it mean to you? Does it come with any political commitments?
SD: Intellectual radicals, rather than actual radicals, are people who say things where they are not usually said. And, yes, all true radicalism has to begin in the body—so being a sex radical means you have to be ready to act radically and be willing to speak about it in places you ordinarily wouldn’t—such as an interview about an activity you might otherwise confine to a journal. That’s how I started—and the world got started around me, as it were, when my mother found my secret writings, took them to my therapist, and they ended up in print: Kenneth Clark, who was the head of the Northside Center where I was going for child therapy, quoted them in an article in Harper's and again in his book, Prejudice and Your Child (1955), and I found myself published because of it. My first professional sale, as it were. I got a lot of attention for it, too. It is the source of most of my “radicalism.”
JD: You once said that “there were far more opportunities for sex among men before Stonewall than since.” Let’s expand that a little to the larger question of what generational differences among gay men strike you as most significant?
SD: You have to remember there’s always what’s said and then there’s what happens. And there’s always a discrepancy between them. Human beings are definitely tribal, as much as wolves and apes are. And the fact that only one sex carries the young to term immediately starts the separation into cultures. Do you want it in public, in private, or in a special space that’s socially marked out? Do you want pictures or reproductions (and if so, what sort) of those public or private or socially marked-out spaces? That’s finally what my book Times Square Red, Times Square Blue (1999) was about.
More here.
