an interview with artist Sheila Hicks

Danielle Mysliwiec interviews Sheila Hicks for The Brooklyn Rail:

Rail: The woven works and paintings in your M.F.A. thesis exhibition lean heavily toward abstraction. When you began at Yale, Clement Greenberg gave the Ryerson lecture there, entitled “Abstract and Representational,” attempting to make a case for abstraction. Given the climate, do you remember your first explorations in abstraction? Did you experience a debate in your own practice?

Hicks: The first year was a compulsory course conducted by Sewell Sillman. We made watercolors of melons or onions. We did figure drawing, exercises in perception of all kinds, and I took Albers’s course on color, Interaction of Color. Anyone who’s ever taken Interaction of Color, or taught it, which I taught to young architects when I had my Fulbright in Chile, inevitably thinks in terms of color as an exercise. Color is an emotion, it’s an idea, but it’s also a visual exercise. What happens if a color like this slice of lemon is next to this hot chocolate and then moves next to bougainvillea? Consider what kind of emotional response it evokes. When I exhibited both my paintings and weavings for my M.F.A. evaluation, there were definitely landscape references from Chile. Chilean landscape is overwhelmingly beautiful. I traveled with the photographer Sergio Larrain all the way down into the Beagle Canal and Strait of Magellan where there are immense manganese blue glaciers. I saw spectacular landscapes and seascapes. Inevitably I think that migrated into my work—not seeking to represent it, not seeking to portray it, but to emulate the sense, the feeling one has in perceiving that aura.

more here.

what does “atheism” really mean?

Mike Dobbins at Killing the Buddha:

But there is another definition of atheism available to us: “a: a disbelief in the existence of a deity. b: the doctrine that there is no deity.” (Merriam-Webster), or even Urban Dictionary’s second definition, “A person who believes no god or gods exist.” This is a meaningful definition of atheism one can sink their teeth into. This accurately informs me and the world what atheists actually do believe about God. Most important for the atheist, it is in line with reality. Atheists do have beliefs or disbeliefs regarding God, just as they have beliefs and disbeliefs regarding heaven, the soul, and the afterlife.

When approached with such celestial concepts, an atheist does not try to conceal their actual beliefs by saying they have a ‘lack of belief in’ a soul. They properly state either a positive belief that there is no soul, a negative belief that they don’t believe in a soul, or on rare occasion, a belief in a soul. The atheist is perfectly willing, and able, to state their beliefs regarding this and other supernatural propositions. Should God be an exception? Of course he shouldn’t be.

Whether it be heaven, a soul, God, or a favorite atheist God, The Flying Spaghetti Monster, one takes a belief or disbelief on the concept. The introduction of the idea forces the conscious and intelligent human brain to automatically deliberate the proposition, especially ones of such magnitude.

more here.

defending John Updike

Robert Wilson at The American Scholar:

Begley records the harsh things writers like James Wood and David Foster Wallace said about Updike late in his career—the former writing, “Updike is not, I think, a great writer” and the latter accusing Updike in 1997 of being, along with Roth and Norman Mailer, in his “senescence.” As for himself, Begley says, “Predicting his eventual place in the pantheon of American literature is an amusing pastime, but no more useful than playing pin-the-tail with the genius label.” Still, I wish he had said more about the influence of Updike’s own criticism when he was not writing about novelists he saw as potential rivals. Updike reintroduced an American audience to the 20th-century British novelist Henry Green (Party Going; Loving) and wrote thoughtful and generous reviews of other novelists from safely distant shores, ranging from John McGahern and William Trevor in Ireland to Christina Stead in Australia, to Wole Soyinka and a raft of other African writers. Any book by Vladimir Nabokov, whom Updike admired and at times emulated, was sure to get his notice. No other American writer of Updike’s stature contributed so much to the literary culture of our time.

Even if fixing Updike’s place in the firmament is only an amusing and useless pastime, it is hard to resist. I suspect that readers down the years will return to Updike as we do to Balzac, not for the single masterpiece, perhaps, but for the cumulative power of his close attention to his world (“he was enthralled by the detail of his own experience,” as Begley gracefully puts it).

more here.

Private parts: writers and the battle for our inner lives

Josh Cohen in New Statesman:

At the end of last year, an international group of writers drafted a petition decrying the escalation of state surveillance and calling for a digital bill of rights to protect the privacy of all global citizens. Scattered among the numerous expressions of solidarity in the online comment boxes were a good few barbs aimed at the presumption of a self-appointed elite of “arch-pseuds” that their writerly status conferred on them some special authority to speak on this question. Why, after all, should a petition of writers carry any more weight in the debate on privacy than one of welders or florists? A few weeks later, I had an encounter with an author that brought to life the specific and urgent link between literature and privacy. The writer was Otto Dov Kulka, the winner of this year’s Jewish Quarterly-Wingate Prize, for which I was on the judging panel. A distinguished Czech-born Israeli historian of the Nazi genocide, Kulka received the prize for Landscapes of the Metropolis of Death (2013), a personal and philosophical meditation on his experience as a child survivor of Auschwitz.

Receiving the prize, Kulka spoke of having dedicated his life after Auschwitz to documenting Nazism’s crimes in the rigorously disinterested language of the historian. But alongside this contribution to the public record, he had silently amassed a personal archive of memories, dreams and images of his time as an inmate of the so-called family camp at Auschwitz, which he called his “private mythology”. No one could fail to be moved by the undisguised delight and incredulity with which this slight yet robust old man received the award. The source of that incredulity was not false modesty but the genuine conviction he’d had when writing the book that the experience to which he was giving voice was too private to be shared – echoing the terrible recurring dream related by Primo Levi, of telling his experiences of Auschwitz to a group of oblivious listeners. Landscapes is the fruit not of any long-held literary ambition on Kulka’s part but of his search for a language that would do justice to the terrible singularity of his story. The form of the book wasn’t so much chosen as imposed on him by the privacy of the experience he sought to convey.

More here.

Stress alters children’s genomes

Jyoti Madhusoodanan in Nature:

Growing up in a stressful social environment leaves lasting marks on young chromosomes, a study of African American boys has revealed. Telomeres, repetitive DNA sequences that protect the ends of chromosomes from fraying over time, are shorter in children from poor and unstable homes than in children from more nurturing families. When researchers examined the DNA of 40 boys from major US cities at age 9, they found that the telomeres of children from harsh home environments were 19% shorter than those of children from advantaged backgrounds. The length of telomeres is often considered to be a biomarker of chronic stress. The study, published today in the Proceedings of the National Academy of Sciences, brings researchers closer to understanding how social conditions in childhood can influence long-term health, says Elissa Epel, a health psychologist at the University of California, San Francisco, who was not involved in the research.

Participants’ DNA samples and socio-economic data were collected as part of the Fragile Families and Child Wellbeing Study, an effort funded by the US National Institutes of Health to track nearly 5,000 children, the majority of whom were born to unmarried parents in large US cities in 1998–2000. Children's environments were rated on the basis of their mother's level of education; the ratio of a family’s income to needs; harsh parenting; and whether family structure was stable, says lead author Daniel Notterman, a molecular biologist at Pennsylvania State University in Hershey. The telomeres of boys whose mothers had a high-school diploma were 32% longer than those of boys whose mothers had not finished high school. Children who came from stable families had telomeres that were 40% longer than those of children who had experienced many changes in family structure, such as a parent with multiple partners.

More here.

Tuesday, April 8, 2014

The Eye of the Mind

J. Mae Barizo on The Poetry of Derek Walcott, 1948-2013, in the LA Review of Books:

The collection, published by FSG and edited by Glyn Maxwell, is not the first selected by Walcott, but it is the most comprehensive. It includes seldom-seen poems written by the teenage Walcott, and provides a sweeping yet thorough examination of the octogenarian’s work. Walcott is usually referred to as a Caribbean poet (he was born in St. Lucia, educated in Jamaica), but that classification alone diminishes the breadth and significance of his oeuvre. Walcott embraces the formal English tradition to elucidate his Caribbean experience. The uniqueness of his voice stems from its hybrid of formal extravagance and graceful simplicity. This is apparent even in his 25 Poems, published when he was 18:

Where you rot under the strict gray industry
Of cities of fog and winter fevers, I
Send this to remind you of personal islands
For which Gauguins sicken, and to explain
How I have grown to know your passionate
Talent and this wild love of landscape.

(from “Letter to a Painter in England,” 25 Poems)

Walcott absorbs the world as a painter. He has always excelled with his lush collection of visual details (“flare of the ibis, rare vermilion”; “darkening talons of the tide”; “roads as small and casual as twine”), but his poetry is not simply a meditation on art and nature. His work devotes itself not to interpretations, but to intimacies. That is, he uses nature to explore his poetic experience. Walcott noted in a 1986 Paris Review interview that “the body feels it is melting into what it has seen,” and “if one thinks a poem is coming on […] you do make a retreat, a withdrawal into some kind of silence that cuts out everything around you.” So, while Walcott’s work is sometimes rooted in heady descriptions, his poems indulge both in transitory moments and the quiet after something has been seen, in the wake of astonishment.

More here.

Kripke’s Unfinished Business

Richard Marshall interviews Scott Soames in 3:AM Magazine:

3:AM: You argue that a theory of meaning needs propositions but also needs to account for the cognitive stance of a person towards a proposition and also how propositions manage to represent the world and have truth conditions. Is that right? So firstly, can you explain why you find Russell’s attempts to account for them not right?

SS: Yes, a theory of meaning for a language L needs propositions that represent the world and so have truth conditions. Yes, it also needs an account of cognitive stances agents take to propositions – if L has sentences – e.g. attitude ascriptions – that predicate properties of propositions. My answer to your question about Russell is explained in chapter 9 of my new The Analytic Tradition in Philosophy, Vol. 1. Here is the gist of it. Between 1900 and 1910 Russell believed in propositions constituted by objects and properties, but he couldn’t explain their “unity”. Just as sentences aren’t collections of unrelated expressions, but have a structural unity that distinguishes them from mere lists and allows us to use them to represent the world truly or falsely, so propositions aren’t collections of unrelated meanings of the words used to express them, but have a unity that endows them with truth conditions that mere aggregations of their parts don’t have. Russell struggled unsuccessfully to explain this unity until he rejected propositions in 1910 in favor of his multiple relation theory of judgment. Although that theory was disastrous, the insight behind it was brilliantly correct. The intentionality of agents can’t be derived from the supposed sui generis intentionality of propositions to which agents bear attitudes. Instead, the unity that brings together Desdemona and being unfaithful in Othello’s belief that Desdemona was unfaithful is provided by the sui generis fact that the agent predicates being unfaithful of Desdemona. What Russell failed to see was how this insight can be used to reconstruct genuinely unified (i.e. representational) propositions by deriving the intentionality of propositions from the intentionality of agents who entertain them.

More here.

Drowning in Light

Dirk Hanson in Nautilus:

In 1996, Yale economist William D. Nordhaus calculated that the average citizen of Babylon would have had to work a total of 41 hours to buy enough lamp oil to equal a 75-watt light bulb burning for one hour. At the time of the American Revolution, a colonial would have been able to purchase the same amount of light, in the form of candles, for about five hours’ worth of work. And by 1992, the average American, using compact fluorescents, could earn the same amount of light in less than one second. That sounds like a great deal.
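As a rough check on those figures (the arithmetic below is mine, not Nordhaus's), converting each era's labor cost into seconds makes the scale of the decline concrete:

```python
# Labor cost of one hour of 75-watt-equivalent light, per the figures
# quoted above. The 1992 value ("less than one second") is an upper
# bound, so the computed ratios are lower bounds.
SECONDS_PER_HOUR = 3600

labor_seconds = {
    "Babylon (lamp oil)": 41 * SECONDS_PER_HOUR,         # 41 hours of work
    "Colonial America (candles)": 5 * SECONDS_PER_HOUR,  # ~5 hours of work
    "1992 US (compact fluorescent)": 1,                  # under one second
}

modern = labor_seconds["1992 US (compact fluorescent)"]
for era, secs in labor_seconds.items():
    print(f"{era}: at least {secs / modern:,.0f}x the 1992 labor cost")
```

On these numbers, light was at least ~8x cheaper in labor terms for a colonial American than for a Babylonian, and roughly 150,000x cheaper again by 1992.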

Except for one thing: We treat light like a drug whose price is spiraling toward zero. In the words of sleep expert Charles A. Czeisler of Harvard Medical School, “every time we turn on a light, we are inadvertently taking a drug that affects how we will sleep and how we will be awake the next day.” Our daily metabolic cycles are not precisely 24 hours long, and this turns out to be a crucial evolutionary glitch in the mammalian circadian system. Circadian rhythms must be reset daily to keep us in behavioral synch with the earth’s rotation, so we will sleep when it is dark and wake when it is light. This process is called entrainment, and it is achieved by means of light exposure. In the brain, a region of the hypothalamus called the suprachiasmatic nucleus receives input from the retina, causing specialized “24-hour” cells to oscillate in specific patterns. This affects how we eat, sleep, and work. And in most people, the circadian response is intensity-dependent, meaning the greater the light, the greater the effect on the human circadian system.

To complicate matters, our relationship with light is profoundly psychological as well. In “Psychological processes influencing lighting quality,” published in Leukos, the Journal of the Illuminating Engineering Society of North America in 2001, Jennifer A. Veitch analyzes the available scientific evidence concerning the manner in which lighting conditions affect mood and behavior in office settings. Veitch found that “preferences for illuminance levels are generally higher than the recommended levels.” Researchers in the Netherlands, Sweden, the United Kingdom, the United States, and Canada have all documented the same tendency to “overlight” things. Veitch also references studies showing that “people with seasonal affective disorder or the milder, subsyndromal, form of this mood disorder consistently preferred higher room illuminance levels than matched, normal controls.”

More here.

Growing up in Kundera’s Central Europe

Jonathan Bousfield in Eurozine:

Thirty years ago, Czech novelist Milan Kundera dealt with cultural estrangement and its consequences in his celebrated essay “The Tragedy of Central Europe” (first published in the French journal Le Débat in November 1983, then in the New York Review of Books the following April), sparking off a long-running debate about the fate of European cultures caught on the “wrong side” of the Cold War divide.

Kundera's essay initially made for pessimistic reading. Not only did it argue that Central Europe constituted a “kidnapped West” abducted by an alien, Byzantine-Bolshevik civilisation, but it also claimed that the rest of the continent was in too deep a state of decadence to be fully aware of what it had lost. What initially looked like a requiem, however, soon gained an altogether more optimistic sheen. Mikhail Gorbachev came to power in the Kremlin, the Soviet Bloc showed signs of opening its windows and then the multi-ethnic, cosmopolitan Central Europe eulogised so evocatively by Kundera was quickly re-spun as a symbol of what Europe could be again, rather than what had forever been left behind.

Thirty years on, most of the countries in Kundera's Central Europe have been integrated into the European Union and NATO, and the very term “Central Europe” is no longer necessary, either as an anti-Soviet rallying cry or a badge of cultural belonging. However, the cultural concerns addressed by Kundera have not necessarily gone away simply because the context has changed. Europe is still sandwiched between two superpowers with differing worldviews, and small nations can still be the bearers of important truths.

More here.

climate change may not be the fault of rapacious humanity

Robert Colvile in The Telegraph:

As the inventor of Gaia theory, James Lovelock is used to thinking big. Ever since he came up with the idea that the planet and its inhabitants form one vast, self-regulating system – initially scoffed at, but now taken seriously across a variety of disciplines – his focus has been wider than that of his more hidebound colleagues. In A Rough Ride to the Future, Lovelock outlines a new theory. He argues that since 1712, the year in which the Newcomen steam engine was created, we have moved into a new age, the Anthropocene, in which humanity’s ability to liberate energy and information from the Earth has rapidly outpaced both Darwinian evolution and the planet’s ability to cope.

What is refreshing about Lovelock’s approach to these issues is that it is blessedly free of dogma. He does not blame humanity for doing what comes naturally: exploiting the wonders available to it. And he is happy to outline the gaps in our understanding of climate science, not least the role of living beings in helping to regulate the system. This clarity extends to his conclusions. Ultimately, he suggests, climate change is down to ignorance, not negligence – but while we do not yet know its exact contours, the process is both extremely serious and probably unfixable. Unlike the situation with CFCs, or chlorofluorocarbons, a generation ago, there are too many actors – countries, companies and individual humans – that would need to be cudgelled into self-denial if the status quo were to be retained. Where he differs from the consensus, however, is in suggesting that this might not be such a bad thing. What we are seeing around us, Lovelock argues, may be the large-scale destruction of the planet’s ecosystem by rapacious humanity. But it may also be “no more than the constructive chaos that always attends the installation of a new infrastructure”. Humanity is already concentrating itself in bigger and bigger cities, so rather than trying to “save the Earth”, or restore some artificial version of a normal climate, why not live comfortable lives in clustered, air-conditioned mega-cities? This serves ants and termites perfectly well, he argues – as well as the inhabitants of Singapore.

More here.

Pieing for fun and profit

Rex Weiner at The Paris Review:

Popular belief has it that the pie-in-the-face gag (a word derived from the Norse gagg, meaning “yelp”) originated in the silent-movie era. Performed by the slapstick director Mack Sennett’s Keystone Kops, Fatty Arbuckle, Laurel and Hardy, the Three Stooges, and various imitators then and since, the stunt seems uniquely American. What better enforcer of the democratic dogma than a tossed pie? A gooey face is an instant social equalizer.

Of course, the self-important have always been targets for a takedown; even if pieing is a predominantly American phenomenon, the puncturing of pomposity is universal. I’m reminded of certain graffiti left behind in Egyptian tombs by workers who remarked on the pharaoh’s resemblance to the buttocks of an ox.

The French have a word for it, of course: lèse-majesté, “injured sovereignty,” an ancient crime from the Roman era, still on the books in such countries as Turkey and Thailand. Even in the Netherlands, a man was jailed in 2012 for calling Queen Beatrix a “con woman” and a “sinner,” and demanding the abolition of the monarchy. Royalty has always deserved, short of the guillotine, the pie.

more here.

WHEN YEARS ARE CELEBS

Simon Reid-Henry in More Intelligent Life:

In 1970 the great novelist Aleksandr Solzhenitsyn was putting the finishing touches to what he called “the chief artistic design of my life”. Its title, “August 1914”, was intended to convey to readers everything they needed to know about the content, even if they had never heard of the Battle of Tannenberg, the actual focus of the narrative, or read “The Guns of August”, the Pulitzer prize-winning account of the start of the first world war. This was the book with which Solzhenitsyn hoped finally to outdo his literary nemesis, Tolstoy, by blending history and fiction in a manner so “urgent…so hectic and choppy,” wrote his translator, Michael Glenny, that, “at times it almost leaves you breathless”. Alas, breathlessness can be tiresome over 6,000 pages, and “August 1914” never captured the public imagination. Yet Solzhenitsyn, the great inventor, was on to something. Today the market is in full bloom for what the writer Henry Grabar, tongue firmly in cheek, calls “annohistory”. The display tables are groaning with copies of Max Hastings’ “Catastrophe: Europe Goes to War 1914”, Allan Mallinson’s “1914: Fight the Good Fight”, Christopher Clark’s “The Sleepwalkers: How Europe Went to War in 1914” and Mark Bostridge’s “The Fateful Year: England 1914”. Any day now, something similar will happen with 1989.

What is it about some years that they come to hold such a prominent place in our culture, decades after their passing? Are some years really that much more important than others? Was what happened in them so dramatic that the dates themselves have become a rift that lifts up out of the earth, leaving all historical activity merely sloping away—“a drama never surpassed,” as Churchill once put it? What matters in history used to be a matter for the intellectual class to decide. For better and worse, those days are gone. Edinburgh University’s Tom Devine clearly thought he was criticising Britain’s education secretary, Michael Gove, when he said during a recent spat over the teaching of school history: “you cannot [just] pick out aspects of the past that may be pleasing to people”. But that is precisely what many history books do. And it is what readers do every time they walk into the history section of a bookshop.

More here.

Can a Single Book Sum Up a Nation?

Michael Dirda at VQR:

Initially, the nineteenth-century passion for defining and creating “national” novels arose from what Buell dubs “cultural legitimation anxiety.” Early attempts at the GAN aimed to portray the distinctive character of the young United States, often celebrating Yankee virtues, such as drive and know-how, but sometimes revealing how far the country had fallen away from the foundational ideals of liberty and equality. These days, of course, we have grown leery of “exceptionalist self-imaginings” or grand unitary visions of our ethnically and culturally diverse society. According to critic Mark McGurl, the contemporary American writer often “‘disaffiliates from the empirical nation … in order to affiliate with a utopian sub-nation, whether that be African- or Asian- or Mexican-’ or Native.” Buell argues back, however, that American life has always been characterized by “the tension between synthesis and particularism.” Even the lack of glue, “the perceived (non)relation between fractious parts has itself been one of the drivers of GAN thinking from the start.”

Buell, as these quotations should make clear, has thought hard and carefully about his subject. He is, after all, a distinguished (if now emeritus) professor of American literature at Harvard, admired by some of the shrewdest scholars of our literature, including Robert D. Richardson (biographer of Henry David Thoreau, Ralph Waldo Emerson, and William James) and Philip F. Gura, who dedicated his last book, Truth’s Ragged Edge: The Rise of the American Novel, to Buell.

more here.

SAUL LEITER IN BLACK AND WHITE

Genevieve Fussell at The New Yorker:

Saul Leiter lived in an apartment on a quiet street in New York’s East Village, a neighborhood that evolved, during the six decades he lived there, nearly as much as Leiter himself. An undervalued photographer for most of his life, Leiter quietly amassed a body of work that has only recently begun receiving the credit it deserves. Since his death, last fall, the apartment has become Leiter’s de facto archive; Margit Erb, his gallery representative, and Anders Goldfarb, his long-time assistant, have spent months organizing the boxes of prints, negatives, portfolios, and books that he left haphazardly piled throughout the space.

The apartment today is far more organized than it was when Leiter died, but evidence of his life is everywhere. A high-backed wooden chair, where he painted and drank coffee, sits in a corner of a large room lit by a wall of windows. Old saucers that he used as palettes are stacked on the window ledge above a quiet courtyard. Figurative paintings by Soames Bantry, Leiter’s partner, hang alongside his own abstract watercolors. Primitive trinkets and vintage toys, including a Mickey Mouse doll, sit on the mantel; canvases of folk and Japanese art lean against the walls.

more here.

The 16th Century’s Line of Fire: ‘Infinitesimal,’ a Look at a 16th-Century Math Battle

John Allen Paulos in the New York Times:

Bertrand Russell once wrote that mathematics had a “beauty cold and austere.” In this new book, the historian Amir Alexander shows that mathematics can also become entangled in ugliness hot and messy.

The time was the late 16th and 17th centuries, and the mathematics in question was the proper understanding of continua — straight lines, plane figures, solids. Is a line segment, for example, composed of an infinite number of indivisible points?

If so, and if these infinitesimals have zero width, how does the line segment come to have a positive length? And if they have nonzero widths, why isn’t the sum of their widths infinite?

For reasons like these, Aristotle had argued that continua could not consist of indivisibles. But new developments were demonstrating that thinking of them this way yielded insights not easily obtained from traditional Euclidean geometry.

Huh? It’s natural to wonder how such a seemingly arcane issue could possibly arouse much passion. But this fascinating narrative by Dr. Alexander (an occasional book reviewer for Science Times) vivifies the era and the fault lines that the mathematical dispute revealed.

Let’s begin with the math. The mathematicians, Cavalieri, Torricelli, Galileo and others, were at the forefront of the new geometric approaches involving infinitesimals. If classical Euclidean geometry is conceived as a top-down approach with all theorems following by pure logic from a few self-evident axioms, the new approaches can be thought of as bottom-up, inspired by experience. For example, just as a piece of cloth can be considered a collection of parallel threads and a book a collection of pages, so too might a plane be considered an infinite number of parallel lines and a solid an infinite number of parallel planes.
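A modern gloss on the zero-width paradox raised above (mine, not Paulos's or Alexander's): the resolution that eventually emerged replaces the naive sum of infinitely many zero-width slices with a limit of sums of thin slabs of nonzero width, i.e. the definite integral:

```latex
% Cavalieri-style indivisibles in modern dress: the area under a curve f
% on [a,b] is not a sum of zero-width line segments but a limit of
% Riemann sums over slabs of width \Delta x.
A \;=\; \lim_{n \to \infty} \sum_{i=1}^{n} f(x_i)\,\Delta x
  \;=\; \int_a^b f(x)\,dx,
\qquad \Delta x = \frac{b-a}{n}.
```

Each slab has positive width, so the sum is finite at every stage; the paradox of "adding up" uncountably many widthless points simply never arises. This machinery was, of course, unavailable to the 17th-century disputants, which is part of why the quarrel was so fierce.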

But why was this so controversial?

More here.

Monday, April 7, 2014

Sunday, April 6, 2014

Buying the Future

Mike Konczal in The New Inquiry:

Finance can go beyond bringing the future of raw commodities into cash value in the present. A wave of mathematical modeling and computer simulations allows investors to predict the likely value of everything from apartment rentals to sovereign crises to human beings, and an elaborate contractual infrastructure lets them lock down their bets. These fruits of financial engineering will increasingly play a role in our economic lives.

But are these fruits poison? Where will this sort of predictive financial engineering lead, and can anything be done to alter the path? This system of buying and selling the future requires a level of control far beyond the normal standardization and commodification that comes with capitalist societies. To specify the future in the ways that futures contracts demand means locking down its forms in advance, with an abstract conception suitable to financial exchange positing what will become lived reality. Knowledge of the future breaks down, while financial markets overwhelm areas of everyday life once fully separate from what has been traditionally seen as finance. The consequences of this domination by finance have already begun to unfold and may only intensify as finance’s realms expand.

It might be useful to start with an old philosophical debate: about the relationship between universals and specific objects. As Marco d’Eramo notes in his book The Pig and the Skyscraper: Chicago: A History of Our Future, during the medieval period philosophers fought over whether the names of things resulted from social conventions and the everyday reality of their existence, or whether the names of things existed in a reality independent of the actual objects that represent them in everyday lives. Are there only particular, individual, material things out there, with generic names arising only from social conventions? Or are there ideal Platonic universal entities, which exist separately from individual iterations of them?

More here.