The New Dark Ages, Part II: Materialism

by Akim Reinhardt

In part I of this essay, I offered a broad re-definition of the term “Dark Ages,” using it to describe any historical period when dogma becomes ascendant and flattens people's perceptions of humanity's very real complexities. From there, I discussed how the conventional Dark Ages, marked by religious dogma's domination of medieval Europe, were supplanted by a subsequent Dark Age: during the 19th and 20th centuries, racism and ethnocentrism complemented the rise of ethnic nation-states, casting a pall over much of the Western world.

If part I of this essay sought to expand Dark Age perils beyond the threat of religious totalitarianism, then part II will seek to drag it out of the past and into the present, identifying modern forms of dogma that threaten to flatten our understanding of life's complexities.

In particular, I will focus on various forms of materialism as among the most potent dogmas that have created Dark Ages during the 20th century, and which continue to threaten the West here in the 21st century.

I began part I of this essay by begging forgiveness from European historians for recycling and attempting to redefine the term “Dark Age,” which most of them have long since discarded. I should probably begin part II of this essay then by requesting patience from philosophers. For I am not using the term “materialism” in the philosophic sense.

Rather, I am using “materialism” to identify dogmatic interpretations of the human condition that are based on economics. That of course is closer to the term “historical materialism,” which refers to Marxist interpretations of the past. And while I will discuss Marxism and the past, I will also be talking about free market interpretations and the present, so the strict Marxist phrase “historical materialism” simply will not do. Therefore, I am claiming the word “materialism” in this essay to mean various economic interpretations, from both the Left and the Right, which make grand claims not just about the economy, but also about broader social, political, and cultural realms.

I define materialism as dogma, originally emanating out of Europe, that views economics as an all-encompassing filter for explaining the human condition. Such dogma has since subdivided into numerous factions, each with millions of followers. And while various doctrines are in stiff competition with each other, all dogmatic forms of materialism place economics front and center in an effort to explain and interpret the human condition, erroneously downplaying various cultural and social elements.

Marxism is hardly the oldest economic philosophy to be widely accepted in Europe, but it was the first to become a truly dominant dogma that has initiated Dark Ages in various parts of the world.

Read more »



Falling in love with a beautiful bronze

by Leanne Ogasawara

Not unlike the stories surrounding my favorite Carpaccio painting, my beloved bronze is surrounded by mystery and romance.

He is utterly compelling: whenever I used to come home to LA, one of the first things I would long to do was pay him a visit. He is so breathtakingly handsome that truly a lifetime of visits to see him would never be enough. Physically perfect, with the most exquisite patina, it is that hand, pointing toward his victory wreath, that always gets me.

(sigh~~~~).

Created between 300–100 BCE, the Getty Bronze is a victory statue celebrating a youth's win in one of the Greek Olympic Games. Perhaps he was the son of a wealthy family who wanted to commemorate their golden boy's athletic achievements. Such a beautiful start. But then, several hundred years after his creation, the Romans literally ripped his feet off when dragging him back to Rome as booty (maybe for melting down, since so little care was taken to pry him off his pedestal?).

That was not the end of his bad luck either, for he would then sink to the bottom of the cold waters of the Adriatic when the ship carrying him went down – presumably in a storm.

Read more »

Is There Such A Thing As A Sane Republican? No.

by Evert Cilliers aka Adam Ash

You can't understand what a Republican is about until you zone in on his core belief:

“I don't want the government to take my money and give it to poor people. Especially poor black people.”

Republicans are children who never learned to share. Selfish. What's mine is mine.

Children is the right word, because Republicans get childishly bratty and emotional about their beliefs (consider the recent Republican government shutdown temper tantrum, for example). It's a visceral thing for them. They feel.

What do they feel the most? Threatened. They feel threatened by the Other, the Different, the New. They're paranoid. They see so many threats: poor people, blacks, Mexicans, gays, even women. Modernity itself gives them the jitters. They want to move backwards, to some white Christian paradise of the South, when men were men, and women and blacks were slaves.

They want the world to be like them. Reality scares them. They suffer from arrested development. In fact, Republicans are not fully developed human beings.

Their appeal is to childish emotion, not to adult reason. That's why they find it so difficult to compromise.

Democrats are very different. They're about doing the sensible, practical thing. They don't have an ideology, like Republicans do. They don't try to bend the world to some fundamentalist worldview. They try to fix things, not shape the world to some pre-ordained edenic vision.

So, today's question: can we rely on the Republicans to keep screwing up to the point that they lose the House in 2014?

Yes.

Read more »

Monday, October 14, 2013

Why Study Logic?

by Scott F. Aikin and Robert B. Talisse

Logic, as a field of study, is primarily focused on arguments. Logicians ask questions like: What counts as an argument? What counts as a good argument? How does argument go wrong? The overriding objective is to articulate the ways in which good reasoning differs from bad reasoning, and to employ those explanations in extending our capacity to reason.

For the most part, we argue and reason when thinking about things – tigers, taxis, and ties. Logic is an investigation into how we think about these things. So, as we argue in a language, we do logic in a language about that language; logic is a meta-language. Now, we have many other meta-languages. There is the meta-language of grammar that captures our rules for well-formed sentences. There is the meta-language of artistic criticism that articulates rules or norms of the use of language for beauty. And so we may speak of crooks and hooks in our first order language, but it is the meta-languages that permit us to speak of nouns and rhymes. Logic, as a meta-language, then takes what comes naturally to us – reasoning and argument – and provides a vocabulary with which we may talk about that reasoning, and hence scrutinize it. But in what way is it useful to have such a meta-language?

Consider the usefulness of the meta-language of grammar. With some basic grammatical concepts, we can identify the infelicity of the sentence My tie are blue or the ambiguity of I met a smart logician's husband. Without grammar, we may correct the first sentence with My tie is blue, or we may clarify the second with a well-placed question: Who was the smart one – the logician or the husband? But the explanation of what had gone wrong is inaccessible in the absence of a vocabulary designed to talk about the language. In developing the skill of making this ascent from first-order talk to the meta-language, we come to possess our thoughts and statements in a more complete fashion. We don't just know how to use the language, we also know why.

Similarly, logic supplies the tools with which to explain why (and not just see that) the reasoning in the following inference is good:

If Penelope is a cat, Penelope is a mammal

Penelope is a cat. So, Penelope is a mammal

Moreover, with logic, we can explain what goes wrong with fallacious reasoning, too:

If Violet is a cat, Violet is a mammal.

Violet is a mammal. So, Violet is a cat.

Logic provides names for good forms of reasoning (the first example above is an instantiation of what's called modus ponens) and we have names for bad forms, too (the second example is an instantiation of the fallacy of affirming the consequent). This attention to the forms of reasoning allows us to distinguish two reasons we may give for taking issue with an argument.
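The contrast between the two forms can even be checked mechanically. The sketch below is my own illustration, not part of the essay, and its function and variable names are invented for the purpose: it enumerates every truth assignment for the two propositions and confirms that modus ponens admits no counterexample row, while affirming the consequent does.

```python
from itertools import product

def valid(premises, conclusion):
    """An argument form is valid iff no assignment of truth values
    makes every premise true while making the conclusion false."""
    for p, q in product([True, False], repeat=2):
        if all(prem(p, q) for prem in premises) and not conclusion(p, q):
            return False  # found a counterexample row in the truth table
    return True

# The material conditional: "if a then b" is false only when a is true and b is false.
implies = lambda a, b: (not a) or b

# Modus ponens: (p -> q), p; therefore q.
modus_ponens_valid = valid(
    [lambda p, q: implies(p, q), lambda p, q: p],
    lambda p, q: q)

# Affirming the consequent: (p -> q), q; therefore p.
affirming_valid = valid(
    [lambda p, q: implies(p, q), lambda p, q: q],
    lambda p, q: p)

print(modus_ponens_valid)  # True
print(affirming_valid)     # False (the row p=False, q=True is a counterexample)
```

The counterexample the checker finds for the second form is exactly the Violet case: Violet can be a mammal (q true) without being a cat (p false), yet both premises hold.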

Read more »

The Muzak of Jhumpa Lahiri

by Ahsan Akbar

Summer bids farewell. It is the perfect time to long for a dip into the warmth of homeland nostalgia aka “immigrant fiction”, though the term is not favoured by Jhumpa Lahiri, whose new book piqued my interest. The Lowland (Bloomsbury 2013) is her second novel and fourth work of fiction. Immediately after the announcement of the 2013 Man Booker shortlist, its sales became astronomical. And Lahiri, no stranger to prizes and shortlists, reaffirms her place in the pantheon with yet another bestseller.

London may be Lahiri's place of birth, but she grew up on the East Coast of America – Rhode Island – finishing college with multiple degrees from Boston. She cannot read Bengali, but she can speak the language and she certainly takes an interest in her roots: Calcutta. Her debut, Interpreter of Maladies, a slim collection of short stories, won the Pulitzer Prize in 2000. That surprised much of the literary establishment, perhaps shook some pillars too. As a fellow Bengali, daft as it may sound, I could not but rejoice in her achievement. Comprising nine stories, the bestseller offered refreshing insight into the lives of Indians and Indian Americans without pulling punches. Personally, I enjoyed how Lahiri had common components in all the stories, which gave an overarching feel to the collection. Despite a lot that was both admirable and enjoyable about the book, I was also baffled by the fact that she would name a Bengali character 'Pirzada' in her '71 story (When Mr. Pirzada Came to Dine). This is especially distressing since she holds a PhD and is presumably skilled in research. Critics in the West who choose to downplay such mistakes in works about cultures they don't know should just ask themselves this: Could an otherwise perfectly good story about the Civil Rights movement get placed anywhere if the black central character were called, say, Aaron Steinmetz?

In any case, Lahiri got the upper hand of the cultural politics of America-endorsed ethnic fiction: many of the stories from Interpreter of Maladies were about exotic places but written in the context of a safe American suburb, a soft focus also adopted by the Sri Lankan-born Canadian novelist Michael Ondaatje in his latest work of fiction, The Cat's Table.

Read more »

Monday Poem

Getting to Know You
.
I’m getting to know you who came
with the first Archaeon’s spark

Everything was new then, even you, you
parenthetical tail of vital events, you
old telegraphic protoplasmic stop, you
callous caboose bringing up the rear of trains
of eloquent clauses, fertile words,
grunts and final remains, you
small but lethal punctuational dot

You came on the scene with the first cellknots waiting
You stood in the dark as first hearts began beating
In celebrations of birth you took orchestra seating
At wakes you confirmed your ruthless deleting

Never kind to lovers you roamed the earth like a shade
after light —being its nether side
what it made you unmade

Here it comes! the word went out
when your coughing heralds came through
making it clear you’d arrive to nullify anything new

Alone in your shadow lovers wept
embracing only the smoke they had kept
of the flame you snuffed before you had left
.

by Jim Culleny
10/7/13

“Saying” the Ghazal: Duende and Performing the Courtly Art of the Ghazal

by Shadab Zeest Hashmi


Mughal miniature showing a poetry reading, c. 1640-50

The ghazal entered my consciousness first as music (on Radio Pakistan or my parents’ LPs), accessible only through melody, beat, rhyme, refrain; the poem’s literary heft, of course, utterly lost on me. The ghazal was really a visceral stimulus in my pre-language existence and as such revealed itself as sad, cold, dim, energetic, red, blue or sweet depending on what emotion its sonic synthesis suggested. Later, when I studied the form in school, I was filled with the sense of awe that surrounds the Urdu ghazal in Pakistan.

The ghazal is distinguished as the most elevated of poetic forms, and considered to be the litmus test of a true poet. I learned about the Urdu ghazal’s formal constraints, and how, in the hands of the masters the form has been known to embody in the elegant brevity of a couplet, a vast range of subjects with depth and precision. All this talk was useful in understanding the craft and reach of the ghazal but it created a chasm of sorts and cut me off from my earliest response to the ghazal—hearing in the ghazal a color or temperature of emotion, and falling under its spell. This loss of connection with the spirit of the form became apparent to me after writing and teaching the ghazal in English and reading the Spanish poet Lorca’s lectures on the Duende.

Before I discuss the ghazal and the duende, here is a brief history of the ghazal and how we have come to know and utilize it in English: The ghazal form originated in pre-Islamic, pre-literate Arabia, spreading across parts of Asia, Africa, and Europe soon after the Muslim conquests of these regions. The Persians cultivated and refined this form to the extent that it became a defining feature of Persian poetics and was further transmitted to many other literary traditions, including that of Urdu. The Urdu ghazal took root in the court of the Sultanate of Delhi in the thirteenth century. The foremost ghazal poet Amir Khusrau was a famed scholar, Sufi mystic and musician, and was a poet in the court through the rule of seven emperors of Muslim India. His Persian and Hindavi (an early dialect of Urdu) ghazals would later have a significant influence on the Urdu ghazal.

Read more »

Should Doctors ‘Google’ Their Patients?

by Jalees Rehman

Beware of what you share. Employers now routinely utilize internet search engines or social network searches to obtain information about job applicants. A survey of 2,184 hiring managers and human resource professionals conducted by the online employment website CareerBuilder.com revealed that 39% use social networking sites to research job candidates. Of the group who used social networks to evaluate job applicants, 43% found content on a social networking site that caused them to not hire a candidate, whereas only 19% found information that caused them to hire a candidate. The top reasons for rejecting a candidate based on information gleaned from social networking sites were provocative or inappropriate photos/information, including information about the job applicants' history of substance abuse. This should not come as a surprise to job applicants in the US. After all, it is not uncommon for employers to invade the privacy of job applicants by conducting extensive background searches, ranging from the applicant's employment history and credit rating to checking up on any history of lawsuits or run-ins with law enforcement agencies. Some employers also require drug testing of job applicants. The internet and social networking websites merely offer employers an additional array of tools to scrutinize their applicants. But how do we feel about digital sleuthing when it comes to a relationship that is very different from the employer-applicant relationship – one characterized by profound trust, intimacy, and respect, such as the relationship between healthcare providers and their patients?


The Hastings Center Report, a peer-reviewed academic bioethics journal, discusses the ethics of “Googling a Patient” in its most recent issue. It first describes a specific case of a twenty-six year old patient who sees a surgeon and requests a prophylactic mastectomy of both breasts. She says that she does not have breast cancer yet, but that her family is at very high risk for cancer. Her mother, sister, aunts, and a cousin have all had breast cancer; a teenage cousin had ovarian cancer at the age of nineteen; and her brother was treated for esophageal cancer at the age of fifteen. She also says that she herself suffered from a form of skin cancer (melanoma) at the age of twenty-five and that she wants to undergo the removal of her breasts without further workup because she wants to avoid developing breast cancer. She says that her prior mammogram had already shown abnormalities and she had been told by another surgeon that she needed the mastectomy.

Such prophylactic mastectomies, i.e. removal of both breasts, are indeed performed if young women are considered to be at very high risk for breast cancer based on their genetic profile and family history. The patient's family history – her mother, sister and aunts being diagnosed with breast cancer – is indicative of a very high risk, but other aspects of the history, such as her brother developing esophageal cancer at the age of fifteen, are rather unusual. The surgeon confers with the patient's primary care physician prior to performing the mastectomy and is puzzled by the fact that the primary care physician cannot confirm many of the claims made by the patient regarding her prior medical history or her family history. The physicians find no evidence of the patient ever having been diagnosed with a melanoma and they also cannot find documentation of the prior workup. The surgeon then asks a genetic counselor to meet with the patient and help resolve the discrepancies. During the evaluation process, the genetic counselor decides to ‘google' the patient.

Read more »

Duct Tape, Plywood and Philosophy

by Misha Lepetic

When all is finished, the people say, “We did it ourselves.”
~ Tao Te Ching, Verse 17

What does philosophy in action look like? Casual thoughts about the discipline may be united by the cliché of the philosopher as a loner. From Archimedes berating a Roman soldier to not “disturb my circles” (which subsequently cost him his head), to Kant's famous provinciality, to Wittgenstein's plunging into the Norwegian winter to work on the Logik, the term “armchair philosopher” might seem to be a tautology. But philosophy – or at least the parts that occupy the intersection of the interesting and the accessible – still concerns itself with the world at large, and our place in it.

New Yorkers got to see a particularly odd example of philosophy in action over the summer when artist Thomas Hirschhorn installed his Gramsci Monument in the central courtyard of a Bronx public housing complex known as Forest Houses. I won't dwell much on Antonio Gramsci himself (see here for a start), but suffice to say he was a man of the people, who died in prison after founding the Italian Communist Party. What is more interesting is how Hirschhorn used Gramsci as a jumping-off point, and where he chose to do it. Completed in 1956, Forest Houses is part and parcel of what anyone would recognize as “the projects” – a scattering of 15 buildings in a towers-in-the-park configuration, populated by nearly 3400 residents, most of whom are minorities and low-income. However, Hirschhorn didn't so much choose the site as it chose him – after visiting 47 public housing projects in the city, Forest Houses was the only one that expressed any interest in his proposal.

The arrival of Hirschhorn and his motley architectural assemblage, which seemed to be made mostly of plywood and duct tape, was met with perplexity by both residents and art critics. As far as the critics go – and hey, someone's got to play the straw man to kick things off, right? – at least one was mightily displeased. Writing in the New York Times, Ken Johnson pooh-poohed Hirschhorn as a “canny conceptualist operator” and opined that the installation would ultimately “be preserved in memory mainly by the high-end art world as just a work by Mr. Hirschhorn, another monument to his monumental ego.”

It's difficult for me to comprehend that Johnson and I visited the same place. The first thing to note is the inappropriateness of the term “installation.” The Gramsci Monument is much more of an intervention. Of course, architects and urbanists are not immune to the charms of this term, either – any bland pop-up café seems to constitute an “intervention” of the street, the urban fabric or what have you, with “dramatic” being the accompanying adjective of choice. But what made Hirschhorn's work really an intervention was its sheer physicality, its uncompromising presence in the courtyard. The towers-in-the-park paradigm, one of the baleful legacies of modernism, was introduced to the US in large measure by Le Corbusier, whose reputation is currently the subject of a risible attempt at rehabilitation by MoMA. The result is an environment of hard vertical and horizontal masonry lines, scrawny trees and threadbare lawns. As a pedestrian, you walk among 12-story brick sentinels, and the absence of any place that can provide a moment of semi-privacy, one of the key signifiers of successful public space, is palpable. The point – which was much in keeping with Le Corbusier's design ideals – was to get you to where you were going, and as efficiently as possible. “No Loitering,” as the signs say.

Read more »

The Uses and Disadvantages of History for Ecological Restoration

by Liam Heneghan


Context: One of the newer biological conservation strategies, ecological restoration, attempts to reverse the degradation of lands set aside for conservation purposes by reinstating, as closely as possible, the species and environmental conditions that existed before recent and large scale disturbances by human activities. A newly emerging framework within restoration ecology – the novel ecosystem paradigm – points out that with global change we are moving into an era for which there is no historical analogue. As a consequence, land must be managed without excessive regard for the past, which can no longer serve as our guide. This has generated a lot of controversy within the field. I was asked by Irish journalist Paddy Woodworth to speak on a panel on “The historical reference system: critical appraisal of a cornerstone concept in restoration ecology” at a conference of the Society for Ecological Restoration held in Madison, October 6–11, 2013. In recent articles and in his new book “Our Once and Future Planet: Restoring the World in the Climate Change Century,” Woodworth had been critical of the novel ecosystem paradigm, wondering if it does not undermine the case for restoration. I had not realized how controversial the topic had become. Tensions at the conference were running high, and the room in which this panel convened was over capacity, with dozens turned away. What follows is the outline of my remarks at this session.

At first glance the work of Friedrich Nietzsche (1844–1900), the German philosopher, might not seem especially helpful for restoration ecologists or indeed for anyone contemplating our relationship with the natural world. After all, his work supposedly challenges the foundations of Christianity and traditional morality. Nietzsche’s famous locutions concerning the “death of God” and his extensive discussions of nihilism should, however, be seen as his diagnosis rather than his cure. For Nietzsche our real cultural task is to overcome the annihilation of traditional morality, replacing it with something more life-affirming. The failure of our traditional precepts of value stems from the fact that these express what Nietzsche calls the ascetic ideal. This ideal measures the appropriateness of human actions against edicts coming from beyond our natural and earth-bound life. The highest human values, as we traditionally assess them, come from a denial of our natural selves. Nature, in turn, is regarded as having no intrinsic value.

Thus Nietzsche, even when he wrote in areas seemingly distant from traditional environmental concerns, has useful things to say to us environmentalists. At times, in fact, his aphorisms are those of a poetic naturalist. In The Wanderer and His Shadow (1880, collected in Human, All Too Human) he wrote, “One has still to be as close to flowers, the grass and the butterflies as is a child, who is not so very much bigger than they are. We adults, on the other hand, have grown up high above them and have to condescend to them; I believe the grass hates us when we confess our love for it.” This is not, of course, to claim that Nietzsche is a traditional naturalist. His concerns are primarily about the thriving of human life, though in this he seems less like a traditional wilderness defender and closer to a contemporary sustainability advocate who seeks to locate a promising future for humans while simultaneously solving environmental problems.

Read more »

Bound for the Future

by Quinn O'Neill

A warm wind blew over the African grassland and stirred the leaves of empty trees. A long time ago, the faint sounds of a nearby tourist lodge were carried on the breeze and the trees cradled sleeping baboons. These baboons were special. Studied by neurobiologist Robert Sapolsky in the 80s, they would sleep in the trees to gain easy access to garbage from the tourists – half eaten hamburgers and leftover drumsticks.1

The practice proved lethal for the baboons, who met their demise when a TB outbreak contaminated the food. Those who frequented the garbage site happened to be the most aggressive and least socially affiliated in their troop. The more socially-oriented and peaceful members were spared, and a curious change occurred in the dynamics of the troop, with the remaining baboons subsequently enjoying a persistent peaceful culture with relatively little aggression and more grooming.

But that was a long time ago and much had changed. Faced with climate change, human pollution, and habitat loss, the baboons had dwindled dramatically in number, and few places remained on the planet that could attract human tourists with their wild and natural beauty. Our own “alphas”, for too many decades, had put their own immediate interests above everything else, including collective human well being, animal welfare, and environmental sustainability. Our tar ponds and nuclear waste sites were now too widespread to be hidden from our view, and the few remaining old trees, still beautiful, were too wise to be enjoyed. “There used to be more of us,” you could almost hear them saying. “We were surrounded by life once.”

Over the years, lots of people had objected to what was taking place even as they watched it unfold. Credible authorities like James Hansen, head of the NASA Goddard Institute for Space Studies, and Canadian geneticist and journalist David Suzuki warned of the need to radically reduce CO2 emissions and make environmental issues a priority. Had they been part of the tiny percentage that wielded awesomely concentrated wealth and power, perhaps they could have done something to stop it sooner.

Read more »

Monday, October 7, 2013

Monday Poem

Leaving

gust
air once here goes
to fill a vacuum there

dusk
the sun no more,
is behind mañana’s door

I can’t recall my last glimpse of you
you went
imagination is your wake

and here comes Go again blazing her trail of tears
while Gone is close behind
sweeping footprints with a green pine bough
from Going’s dust as I pine now
.

by Jim Culleny
9/12/13

The Terrain of Indignities

by Namit Arora

A review of Unclaimed Terrain, a book of short stories translated from Hindi, and a conversation with its author, Ajay Navaria.

UnclaimedTerrain“Indian writing” is often equated in the West with its small subset: the work of a tiny class of Indians that thinks and writes in English. Salman Rushdie fueled this folly in his introduction to Mirrorwork: 50 Years of Indian Writing 1947-97, declaring the work of such Indians a ‘more important body of work than most of what has been produced in the “16 official languages” of India’. He co-edited this anthology and of the 32 works of fiction and non-fiction that appear in it, 31 were written in English and one in Urdu, i.e., only one translation made the cut. Some of this lopsidedness can be explained by the paucity of translations into English, but is Rushdie’s judgment defensible in a country where, even today, less than one percent of Indians consider English their first language, less than ten percent their second, and 80 percent of all books are put out by hundreds of vernacular language publishers, including from authors with far greater Indian readership than most who write in English? Rushdie doesn’t even speak most of these languages. Isn’t his claim, then, an instance of linguistic prejudice? Aren’t the dynamics of class in India, and the power of English language publishing in the West, speaking through him?

Ajay Navaria’s Unclaimed Terrain—a collection of seven short stories translated from Hindi to English by Laura Brueck—shows from its first page how different its world is from those imagined by the Indians in Rushdie’s anthology. Navaria, a faculty member in the Hindi department in Jamia Millia Islamia, Delhi, may well be the first Dalit to teach Hindu religious scriptures at a major university. He is also the author of a novel and two books of short stories. In Unclaimed Terrain the protagonists of most stories are Dalit men who have clawed their way into the urban middle-class through their wits and education, sometimes with the help of reservations. Many harbor episodic memories of social life in ancestral villages, memories in which bigotry and abuse overwhelm kindness and beauty. They love the anonymity of the big city, even as they live in fear of being “found out” and reminded—in the artful ways of the metropolis—of their “proper place”.

In the story Subcontinent, for instance, the protagonist, as a boy, has seen village men abuse and assault his groveling father and grandma—returning after a stint in the city—for breaking caste taboos. As a boy, he has seen a Dalit wedding party attacked by thugs because the groom has dared to ride a horse in the village, and later that day, a woman of the party being raped: ‘I saw, beneath the white dhoti-clad bottom of a pale pandit-god, the darkened soles of someone’s feet flailing and kicking’. Rather than file a complaint, the village policeman mocks them, ‘They say she was really tasty. Lucky bitch, now she’s become pure!’ In his middle-age, the protagonist, Siddhartha Nirmal, Marketing Manager in a government enterprise in the big city, exults at the distance he has traveled in the world: 3BR flat; car; eating out at Pizza Hut and Haldiram’s, where the counter-boys call him Sir. He can hire the services of a Brahmin doctor, keep a Garhwali Brahmin driver who bows at him, and employ a Bengali music teacher he found on the Internet for his daughter, who goes to an expensive convent school. But such welcome anonymity that the city affords him disappears in familiar spaces, such as his office, which has ‘the same snakes. The same whispers, the same poison-laden smiles. Our “quota is fixed”. I got promoted only because of the quota … that’s it. Otherwise … otherwise, maybe I’m still dirty. Still lowborn. Like Kishan, the office janitor. Like Kardam, the clerk. Because I am their caste.’

Read more »

Poem

VERACITY

Inspired by Rumi

“May I borrow your donkey?”
A neighbor asked Kavanagh

Who said, “I'm very sorry,
I loaned out my donkey yesterday.”

At that moment, the donkey brayed
In the barn. The neighbor, believing

The donkey made Kavanagh a liar,
Asked “Then what is that I hear?”

Kavanagh replied, “Friend, are you going
To believe me or a donkey?”

By Rafiq Kathwari, winner of the 2013 Patrick Kavanagh Poetry Award.

Darwin, God, Alvin Plantinga, and Evolution (Part II)

by Paul Braterman


Professor Plantinga lecturing, 2009

Prof Alvin Plantinga, of the University of Notre Dame, is perhaps the most distinguished critic of current views on evolution. He claims that if our conceptual apparatus is simply the product of naturalistic Darwinian evolution, it will generally give rise to unreliable results. From this premise he argues that it is unreasonable to accept naturalistic evolution, since[1] if naturalistic evolution were true, our reasons for accepting it would be unreliable. There is nothing wrong with his logic, but his premise rests on a basic misunderstanding of how evolution works.

Disclosure: around the time of the Kitzmiller-Dover trial, Prof Plantinga and I had a long e-mail correspondence, now unfortunately lost during a University mail system upgrade. I remember, however, the final exchange. He said that Behe, Dembski, and Thaxton, advocates of three different versions of Intelligent Design, had produced arguments that required an answer. In reply, I said that I totally agreed; the answer was, in each case, that they were wrong. Prof Plantinga did not reply.

Disclaimer: I have no credentials when it comes to philosophy. But let me plead in mitigation that Prof Plantinga has no credentials when it comes to evolutionary biology.

According to Plantinga, a belief is warranted when it is produced by cognitive functions working properly, according to a design plan aimed at producing true beliefs. The design plan could be produced by an agent (God, or a super-scientist), or by evolution. This convoluted definition is necessary to bypass cases that puzzle philosophers, such as: is a belief warranted when it happens to be true but we hold it for bad reasons?[2] Plantinga's position is encapsulated in the title of his 1993 book, Warrant and Proper Function.

I have three problems here. One is the choice of words; “design plan” and “aim” have connotations of foresightful agency, and it would be better to use neutral terms such as “adaptation” and “tendency to produce”. The second is circularity; how do we define proper function, if not in terms of giving warrant to beliefs when appropriate? The third, which is really a consequence of the second, is that it is useless in real disagreements, because it begs the question. For example, Prof Plantinga tells us that he possesses a sense of the divine, which he regards as a warrant. But how can he know that this is a warrant, unless he already knows that this sense is leading him towards the truth? And what, then, of Darwin's objection (Part I); how can we have confidence in the beliefs that this sense induces, when those who claim to possess it differ so forcibly among themselves?

Read more »

Monday, September 30, 2013

Respect for truth in science and the humanities

by Dave Maier

As you all know, not that long ago Steven Pinker wrote a piece defending “scientism” as a general approach to intellectual matters, including those usually thought to be beyond the scope of science (e.g. the humanities). Leon Wieseltier responded, restating the standard view that the humanities are indeed beyond the scope of science, except around the edges, so to speak, and reaffirming the common use of “scientism” as a term of abuse, referring for example to a tendency to regard the method of the natural sciences as setting the standard for human inquiry generally, a view Wieseltier considers arrogant.

I'm going to try not to get into the back and forth of this today, as my interest is in Daniel Dennett's brief, testy defense of Pinker against Wieseltier in HuffPo a few weeks ago (and besides, I'm still reading the same book of Dennett's I wrote about last time). Dennett is usually secure enough in his views to avoid the scorched-earth rhetoric of (for example) the other “new atheists”, but Wieseltier seems to have gotten his goat this time. I myself didn't see that much wrong with Wieseltier's essay. Most of the sentences were true, for example, but on the other hand a bunch of true sentences need not a good essay make. After all, there were a lot of true sentences in Pinker's essay too.

Dennett takes offense at what he sees as Wieseltier's blunt, ignorant rejection of Pinker's sincere offer of a friendly hand across the disciplinary divide. Thus he tells us that “Name-calling and sarcasm are typically the last refuge of somebody who can't think of anything else to say to fend off a challenge he doesn't understand and can't abide.” Indeed, in Intuition Pumps Dennett lists and endorses (psychologist Anatol) Rapoport's Rules of successful criticism. I really like Dennett's version:

1) You should attempt to re-express your target's position so clearly, vividly, and fairly that your target says “Thanks, I wish I'd thought of putting it that way”.

2) You should list any points of agreement (especially if they are not matters of general or widespread agreement).

3) You should mention anything you learned from your target.

4) Only then are you permitted to say so much as a word of rebuttal or criticism.

Dennett admits to some difficulty in following these rules himself, even as in his piece he scolds Wieseltier for his lamentable lack of charity. Nor will I follow them here, as I wouldn't be able to finish (2) or (3), let alone (1) in this short space. But I certainly was struck by Dennett's claim, a mere seven sentences after the bit about “name-calling,” that

Postmodernism, the school of “thought” that proclaimed “There are no truths, only interpretations” has largely played itself out in absurdity, but it has left behind a generation of academics in the humanities disabled by their distrust of the very idea of truth and their disrespect for evidence, settling for “conversations” in which nobody is wrong and nothing can be confirmed, only asserted with whatever style you can muster.

Yikes! So much for charity!
Read more »

Food and Power: An Interview with Rachel Laudan


by Elatia Harris

All photos courtesy of Rachel Laudan

Rachel Laudan is the prize-winning author of The Food of Paradise: Exploring Hawaii’s Culinary Heritage, and a co-editor of the Oxford Companion to the History of Modern Science. In this interview, Rachel and I talk about her new book, Cuisine and Empire: Cooking in World History, and her transition from historian and philosopher of science to historian of food.

Elatia Harris: I can remember when there was no such academic discipline as food history, Rachel. What was involved in getting there from being a historian of science and technology?

Rachel Laudan: I can remember when there was no such discipline as history of science! In fact, moving to history of food was a breeze. After all, the making of food from plant and animal raw materials is one of our oldest technologies, quite likely the oldest, and it continues to be one of the most important. The astonishing transformations that occur when, for example, a grain becomes bread or beer, or (later) perishable sugar cane juice becomes seemingly eternal sugar, have always intrigued thinkers, from the earliest philosophers to the alchemists to modern chemists. And the making of cuisines is shaped by philosophical ideas about the state, about virtue, and about growth, life, and death.

A lot of food writing is about how we feel about food, particularly about the good feelings that food induces. I'm more interested in how we think about food. In fact, I put culinary philosophy at the center of my book. Our culinary philosophy is the bridge between food and culture, between what we eat and how we relate to the natural world, including our bodies, to the social world, and to the gods, or to morality.

Read more »