Separate Americas: The Enduring Racial Divide

by Kathleen Goodwin

I struggle to organize my thoughts when it comes to the discourse on race in the United States as catalytic events play out in real time each day. I am also struggling to keep up with the tidal wave of articles, tweets, statuses, and photos that my social networks are posting about Michael Brown, Eric Garner, and other victims of a deeply unequal America. And like every white person, I feel the need to qualify my analysis of these events by positing that as a white person I cannot possibly fully understand the experiences of people of color in this country. Recent events have led me to reflect on the real and imagined boundaries that separate Americans. Nowhere has this been more apparent for me than in my life in New York City, where South Asian men drive the cabs I take, Hispanic women answer the phone in my doctor's office, East Asian women paint my nails, and black men guard the doors at my office. Yet the people I work for and with are overwhelmingly white and usually male. New York is a diverse city, to be sure, but it seems that interaction between different races and ethnic groups is at most transactional and brief. It appears that most of us are still working for white men, directly or indirectly, and those who control our government continue to be predominantly white. Only fifteen black executives have ever been CEO of a Fortune 500 company, and Obama was just the fifth black senator in U.S. history before being elected the first black president.

Linda Chavers, a black woman who teaches at Phillips Exeter Academy, an elite boarding school in New Hampshire, authored a piece on damemagazine.com with a few lines that encapsulated this problem. Chavers attended one of Missouri Governor Jay Nixon's press conferences in August in the aftermath of Michael Brown's killing in Ferguson. She writes that she realized while listening to him speak:

“This man has never dealt with a Black person in his life. I'm sure he's existed among Black people: The people who clicked his ticket on the train, put his items into the grocery bag, panhandlers on the street as he and his driver waited for the light to change. I remember thinking, He has never had anyone like me in his life in a position of authority, in a position higher than his.”

Chavers' realization gets at the crux of many of the issues with diversity and white privilege in the U.S.: the lives of black Americans and other minorities are parallel to, but rarely meaningfully intertwined with, the lives of white Americans. Different races may exist alongside the diminishing white majority, but white people still hold most positions of power and control most decision making.

Read more »



Counting Desserts


by Shadab Zeest Hashmi

As I put my wildly colicky baby to bed, I would unclench his tiny fists, and hold each finger, one by one, listing names of desserts in Urdu: gulaab jamun, halva, russ malai, jalebi, burfi. Adam was taught names in a garden; I taught my son names that likely came from the Mughal royal kitchens; names of syrupy, milky, cardamom-scented delicacies which suggested an ecstatic mix of cultures (not unlike Urdu itself which I like to think of as a sweet and sometimes sharp concoction of separate sensibilities); for example, “Laddu” has something of the Indic, “Halva” Arabic, “Gulaab Jamun,” Persian, “Zardah,” Turkish; each dessert distinct not only in appearance and taste but the type of occasion it is associated with, and most importantly, in its verbal flavor. Barely audible over my bawling newborn, I gave myself up to the slow, sustained incantation of the dessert menu.

Postnatal sleep-deprivation is a godforsaken place but the fogginess it causes can also bring clarity; the sound of dessert names became a bridge for me to cross over to my own childhood in order to find something to comfort my child. Words offered themselves as the cradle we both needed. As I rocked him and chanted, I conjured every sensory detail I wanted to pass on, each scent and shape. I pictured the delights— rectangular pieces of silvery burfi, halva garnished with blanched almonds, laddu with roasted melon seeds, orange spirals of jalebi.

Read more »

Monday, December 1, 2014

Is neuroscience really ruining the humanities?

by Yohan J. John

“Neuroscience is ruining the humanities”. This was the provocative title of a recent article by Arthur Krystal in The Chronicle of Higher Education. To me the question was pure clickbait [1], since I am both a neuroscientist and an avid spectator of the drama and intrigue on the other side of the Great Academic Divide [2]. Given the sensational nature of many of the claims made on behalf of the cognitive and neural sciences, I am inclined to assure people in the humanities that they have little to fear. On close inspection, the bold pronouncements of fields like neuro-psychology, neuro-economics and neuro-aesthetics — the sorts of statements that mutate into TED talks and pop science books — often turn out to be wild extrapolations from a limited (and internally inconsistent) data set.

Unlike many of my fellow scientists, I have occasionally grappled with the weighty ideas that emanate from the humanities, even coming to appreciate elements of postmodern thinking. (Postmodern — aporetic? — jargon is of course a different matter entirely.) I think the tapestry that is human culture is enriched by the thoughts that emerge from humanities departments, and so I hope the people in these departments can exercise some constructive skepticism when confronted with the latest trendy factoid from neuroscience or evolutionary psychology. Some of my neuroscience-related essays here at 3QD were written with this express purpose [3, 4].

The Chronicle article begins with a 1942 quote from New York intellectual Lionel Trilling: “What gods were to the ancients at war, ideas are to us”. This sets the tone for the mythic narrative that lurks beneath much of the essay, a narrative that can be crudely caricatured as follows. Once upon a time the University was a paradise of creative ferment. Ideas were warring gods, and the sparks that flew off their clashing swords kept the flames of wisdom and liberty alight. The faithful who erected intellectual temples to bear witness to these clashes were granted the boon of enlightened insight. But faith in the great ideas gradually faded, and so the golden age came to an end. The temple-complex of ideas began to decay from within, corroded by doubt. New prophets arose, who claimed that ideas were mere idols to be smashed, and that the temples were metanarrative prisons from which to escape. In this weak and bewildered state, the intellectual paradise was invaded. The worshipers were herded into a shining new temple built from the rubble of the old ones. And into this temple the invaders' idols were installed: the many-armed goddess of instrumental rationality, the one-eyed god of essentialism, the cold metallic god of materialism…

The over-the-top quality of my little academia myth might give the impression that I think it is a tissue of lies. But perhaps more nuance is called for. As with all myths, I think there are elements of truth in this narrative. To separate truth from poetic license, four questions need to be asked:

  • Was there ever an intellectual golden age?
  • Is there really a crisis in the humanities?
  • Why should we care about the humanities?
  • Who are the invading forces?

I suspect that addressing the first question will require a master's thesis' worth of research, so for now I'll accept that there really was a golden age, at least for argument's sake. The second question is also a matter of debate, but there is some interesting data suggesting that the crisis in the humanities may have more to do with perceived quality than with quantity [5, 6]. For this essay I will restrict my attention to the third and fourth questions.

Read more »

Time turned to Stone; Part 2: The Giants’ Causeway, time as process

by Paul Braterman

My previous post here described Siccar Point, where an 80-million-year time gap separates near-vertical tilted strata from the roughly horizontal strata that overlie them. This gap corresponds to the formation and subsequent erosion of fold mountains thrown up when Iapetus, precursor to the modern North Atlantic, closed. Today's post is (mainly) about the Giants' Causeway, part of the enormous lava field first produced when the modern North Atlantic began to open, a field still growing at the Mid-Atlantic Ridge and, most spectacularly, in Iceland. Fragments of the initial outpouring were separated as the Eurasian and North American plates moved away from each other, and can now be found as far apart as Greenland and Denmark.


The Antrim Lava Field shown within the British Tertiary Volcanic Province, itself part of the North Atlantic Lava Field. By Hazel Muzzy (Own work), via Wikimedia Commons.

The Antrim Lava Field, of which the Causeway is part, was formed in three separate phases, each consisting of many individual episodes. The most spectacular feature of the Causeway was produced by the second of these. Here, the lava cooled slowly, generating a solid layer that built up stress as it cooled, finally fracturing to give a complex array of columns, up to 10 metres high, and showing in places an almost regular hexagonal pattern. The lava of this second episode shows subtle chemical differences from the first, evidence of changes in the hot lava plume feeding the outflow. But what most excited me at the site was the existence of a band around 5 metres thick, between these columns and the lava beneath them. This layer is not a sediment, but a palaeosol, an ancient soil formed by in situ weathering of the top of the lavas deposited in the first episode. Its nature is confirmed by the presence of occasional unweathered lumps, and there are occasional round scars (“Giants' eyes”) in the exposed surface where these lumps have come away. Humid conditions are confirmed by the presence of valleys carved by streams and filled in by the later lava flows. The chemical composition is like that of tropical soils, which have undergone prolonged leaching under warm and wet conditions, with the most insoluble materials, iron and aluminium oxides, predominating towards the top, and there are traces of charred plant roots in the topmost layer. So here we have direct evidence of an extended interval, variously estimated at between 100,000 years and 3 million years, between the first and second phases of eruption. After my visit, I discovered that this interbasaltic layer is found across the whole area of the Antrim Lava Field, and that there is another such layer between the middle and upper lavas. There are also extensive dikes, penetrating all the lower levels, which fed the eruption of the lava layers above. The entire coastline has been extensively reshaped and eroded over the intervening millions of years, most dramatically during the Ice Ages and by subsequent exposure to the storms of the Atlantic. For more extensive descriptions, see here, p. 30, or here, and references therein.

Read more »

Poem

GARDEN IS AIR

after Iqbal’s Shabnum aur Sitaray (Dew and Stars)

A star said to a dewdrop: “Tell
the story of a garden
far from the heavens
to which the moon sings.”

The dewdrop replied: “O star
not a garden, but a world of sighs;
the breeze visits only to return;
the rose, garden’s flourish, blooms

to wither, bears the pain,
is silent as a nightingale laments,
can’t gather pearls even from its
own hem. The humming bird is

imprisoned, thorns concealed
by roses, eye of the ailing iris forever
moist, the box tree scorched,
free only in name: An outrage!

The moon revolves to cure her heart.
Stars are sparks of man’s burning.
I am the sky’s teardrop. The garden is
air, a sad image on the horizon’s canvas.”

by Rafiq Kathwari, whose first book of poems is forthcoming in April 2015 from Doire Press, Ireland. More work here.

Stranger Anger

by Debra Morris

For some time I've wanted to write about anger in politics, more specifically the conditions under which it is necessary. Necessary in the sense that there is no better response—one more appropriate, say, or more effective. As far as possible, I want to consider anger's necessity apart from the question whether it is justified. The latter question, though a difficult one, is in one sense more easily parsed: since we justify anger by giving reasons for it—it is a response to a palpable injustice, for instance—it is possible, at least in principle, to carry on an earnest discussion or reasoned deliberation about it, since any of us may call upon these reasons (as opposed to a minority of us who, by dint of superior resources, or effective power, or a monopoly on force, may compel a certain response from others, whether or not we ever bother to supply good reasons for our actions). I don't want to be merely philosophical about this; indeed, I'd like to move the conversation away from philosophy as far as possible. Still, even in a cautiously philosophical consideration of the “necessary and sufficient conditions for” something like anger, I would want to focus on the first term, which (I suspect) tends to get folded into the latter. Are there times when the necessity for anger can be established independently of whether there is sufficient (meaning, usually, a “reasoned”) justification for it? Times when anger is felt, or shown, for a different kind of objective or end, one that is only partly described—and rather inadequately, at that—in terms of reasons?
I think it bears asking: If it is ever possible to say, of anger, that it is “fully justified,” then is it really anger—“anger” as opposed to something that can and maybe should be expressed in more political terms, e.g., righteous indignation in the face of injustice, defense against harm or suffering, a self-respecting virtue (or felicitous middle way) in Aristotle's sense?

It may be helpful to say what prompted my interest in the legitimate place of anger in politics, given that it wasn't this week's events in Ferguson—though the latter, even as they've convinced me that there is a vital and unavoidable issue here, have also made clear how limited are our terms for thinking and talking about anger. As I was reading Matt Taibbi's recent The Divide: American Injustice in the Age of the Wealth Gap, an examination of the ways in which the non-prosecution of the gross malfeasance and irresponsibility underlying recent financial crises is inextricably tied to the hyper-policing and punishing of the politically marginal, I found myself thinking, repeatedly, “Why aren't people pissed about this? Why aren't we in the streets now, preparing to take our government back?” That I could confess to the same exasperation after reading pretty much anything by Thomas Frank will only invite derision, I'm afraid, especially in light of Ferguson: oh, so this is what a privileged, Nation-reading white girl gets worked up about. (That and, of course, the fact that I am not currently in the streets.)

Read more »

Do I Look Fat in These Genes?

by Carol A. Westbrook

Are you pleasantly plump? Rubenesque? Chubby? Weight-challenged? Or, to state it bluntly, just plain fat? Have you spent a lifetime being nagged to stop eating, start exercising and lose some weight? Have you been accused of lack of willpower, laziness, watching too much TV, overeating and compulsive behavior? If you are among the 55% of Americans who are overweight, take heart. You now have an excuse: blame it on your genes.

It seems obvious that obesity runs in families; fat people have fat children, who produce fat grandchildren. Scientific studies as early as the 1980s suggested that there was more to it than merely being overfed by fat, over-eating parents; the work suggested that fat families may be that way because they have genes in common. Dr. Albert J. Stunkard, a pioneering researcher at the University of Pennsylvania who died this year, did much of this early work. Stunkard showed that the weight of adopted children was closer to that of their biological parents than of their adoptive parents. Another of his studies investigated twins, and found that identical twins–those who share the same genes–had very similar levels of obesity, whereas the similarity between non-identical twins was no greater than that between their non-twin siblings. It was pretty clear to scientists by this time that one or more genes were likely to determine your level of obesity.

In spite of the compelling evidence, it has been difficult to identify the actual genes that cause us to be overweight. This is partly because lifestyle and environment influence our weight so strongly that they can obscure genetic effects, making it difficult to separate the two. But the main reason it has been difficult to find the fat gene is that there is probably not just one gene for obesity, as there is for some other diseases, such as ALS (Lou Gehrig's disease). There seem to be many forms of obesity, determined by an as yet unknown number of genes, so finding an individual gene is like looking for a needle in a haystack.

Read more »

Who are the terrorists?

by Ahmed Humayun

Two weeks ago the United Arab Emirates (UAE) put dozens of Muslim groups around the world on its terrorism list. The list includes organizations such as Al Qaeda, the al-Nusra Front, and the Islamic State of Iraq and the Levant (ISIL), whose designation is uncontroversial, but it also includes many other cultural and civic organizations in the United States and Europe, such as the Council on American-Islamic Relations (CAIR) and the Muslim American Society (MAS).

The inclusion of this second group of organizations has perplexed Western governments, which have asked the UAE for an explanation. There is no real mystery here, however. The UAE alleges that these groups are linked to the Muslim Brotherhood, which tops the UAE list and is the dominant Islamist organization in the Middle East. In outlawing the Muslim Brotherhood and organizations alleged to be connected to it, the UAE is following in the footsteps of Saudi Arabia, which declared the Muslim Brotherhood a terrorist organization in March.

The antipathy of the dynastic Arab rulers to Islamists is well established. Islamists like the Muslim Brotherhood call for political reform, which Arab tyrannies see as the end of their stranglehold on power. Hence the opposition of the Gulf states to the Muslim Brotherhood after its success in the 2011-2012 elections in Egypt, and their subsequent support for the overthrow of Mohammad Morsi, the elected president. Of course, Islamists are far from Jeffersonian democrats, and they are illiberal on many issues, but they represent an ideological alternative to the status quo that has local appeal, a terrifying prospect for the current crop of Arab rulers.

This terrorism designation, then, is a signal to Muslim groups worldwide that they should align with the Arab status quo or else expect to be stigmatized, even when the group in question is an innocuous organization like CAIR, which has actively worked to counter terrorism in partnership with American law enforcement.

So much for the politics of this ‘terrorism’ list. That the rulers of countries like Saudi Arabia and the UAE think they can define terrorism and be taken seriously is extraordinary in itself. It is well and good that they condemn ISIL today, but they provided the key financial support that fueled ISIL's conquest of Sunni provinces in northern Iraq as part of their war against Shiite influence in the region.

Read more »

Monday, November 24, 2014

Monday Poem

Elemental

Earth

Today I troll for a poem of humus
dark and rich as the French Roast
which always starts my day
and always is a gift

In this four billion year terrapoem
fungi, woodlouse and eelworms
spend millennia decomposing
in concert with nematodes
actinomycetes and protozoa
doling water and, with bacteria,
fix nitrogen in a scheme
age old and symbiotic,
while on it men
women and other animals
troll and plow,
think and sweat
—animals who draw their own life from it,
who build their lives upon it,
from which come their bones
and to which their bones
and breath go (come and go)
in intervals of comets


Air

in this rambling walkabout
with friends who’ve shed
conceits together, dropping them
as one sloughs old clothes:
into the low pressure system of our lungs
comes new atmosphere, November cool
and out
and in again
and out
in a rhythm old but not antique
for which we thank our
lobe-finned fish progenitors
who learned to suck sweet gas to reap its oxygen
and in return (until we’re absolutely through)
we essentially reply with gusts of CO2


Fire

Heraclitus said that all is flux
or, I’d say, fire

never still
the more we yearn
the more things move
they hotter burn
.

Water

rivers fall by rules of space
obliged by banks that hem obedient livers in,
pulled, it seems, by tugging mass we acquiesce,
are dragged to bottom
— inclined to give, to toss
to push to swell and plunge
(by some dark scripts)
from Paradise to Sodom

.

by Jim Culleny
11/20/14

Intersections in Middle America

by Mara Naselli

When my children entered the gallery at the Grand Rapids Art Museum that contained Anila Quayyum Agha's installation work, Intersections, they took off at a run. The sound of their little feet filled the space. I felt that cinch of parental panic and scanned the room for what they might inadvertently destroy. The room was empty. Empty in the sense that it contained no objects, save the large wood cube illuminated by a single light bulb hanging from the ceiling. The gallery, about thirty feet square, was transformed into something larger by the tapestry of shadow projected onto the walls. I hesitate to use the word sacred, but it was impossible not to feel a certain vastness. The contrast of light and dark created an immersive architecture. “You should have seen it when they were installing it,” said the security guard. “The whole room spun.”

~

Every September since 2009, Grand Rapids, Michigan, has hosted an open art contest called ArtPrize. Anyone can enter. Anyone can judge. Anyone can win. ArtPrize winners are elected by popular vote. The rules have been adjusted each year, but the basic idea has remained intact: bring art to the public, let the public judge art.

Grand Rapids is a small, quiet city. But when ArtPrize opens, art is everywhere: parking lots, rooftops, bars, bridges, abandoned buildings, churches, even the river. This reserved city transforms into a minimetropolis of raucous, unedited expression.

The cultural context of ArtPrize—that is, the culture of Grand Rapids, Michigan—bears mentioning. When ArtPrize began, I had just moved here from Chicago, and so I watched with some interest what looked like a large-scale democratic experiment. Some called it a rich kid's art party (the founder, Rick DeVos, is the grandson of the co-founder of Amway). But I thought of it as an experiment in civic discourse, where good art and bad art would duke it out through the intelligent discernment of public opinion. In many ways, the location of ArtPrize made perfect sense. The city has a venerable history in furniture making and design. There's a vibrant arts community here, a grassroots artists' collective, a sculpture garden, a symphony, a ballet, an opera, and a fine art museum—all this in a town of fewer than 200,000. The community in many ways is steeped in arts funded by local philanthropic families such as the DeVoses. Grand Rapids is also conservative and Christian. The fact that ArtPrize was in a very red region of a blue state made the democratic aspect of the contest all the more interesting to me. Taste, culture, and politics would converge as the public would play patron.

Read more »

An anthropologist among sartorialists

by Mathangi Krishnamurthy

Scott Schuman is in India. On the 6th of November, he announced that he would be posting to his immensely popular fashion blog “The Sartorialist” from the cities of Mumbai, New Delhi, and Varanasi. I must confess that for a few minutes, I cursed my luck at being in the deep South and not on the fashionable streets of Mumbai and Delhi, where Mr. Schuman would most likely be lurking, camera in hand. Surely being spotted by Mr. Schuman would be the rightful validation of my many years of changing clothes four times in a row in order to get to the library? After all, academics, especially those in the fields of cultural studies and contemporary socio-cultural anthropology, necessarily need to be fashionable, I had often argued to myself. (Let the convenience of this categorization be conveniently ignored for now.)

My sartorialism is highly suspect in any case; dressing up for particular environments has always been a particularly harrowing task. Fashion never came easy. My sensibilities were shaped, first and foremost, by a socialist secular republic of few choices and many cut corners. During my childhood, a popular sitcom lampooned the state of the market thus: Mr. Wagle, the titular character of Wagle Ki Duniya (Wagle's World), marks a festive occasion by procuring large quantities of the same bolt of cloth, out of which emerge clothes for himself, his wife, his children, and the drawing room curtains. Despite this situation, I did marvel at the effortless beauty of my parents' and their friends' wardrobes, chiffon and polyester saris and factory uniforms. I, however, thanks to particularly unfashionable school uniforms and awkward teenage years, had no possibility of displaying either ingenuity or taste.

Graduate life in America brought forth another set of quandaries. While well schooled by now and comfortable in “Western” clothing, I longed uncharacteristically for loose cottons and salwar kameezes and allowed myself in the Texas heat to switch back and forth, even as I kept away from events conducted by the Indian Cultural Association. Neither their Diwalis nor their Holis held any attraction for my thoroughly disdainful anomic self. But those few kurtas declared my allegiance to some culturally specific India, and brought me attention nevertheless. Many years later, I was told that people had seen me as performing ethnicity for their benefit.

Read more »

Music to watch girls go by

by Sarah Firisen

“The boys watch the girls while the girls watch the boys who watch the girls go by…” so sang Andy Williams in 1967.

Boys looking at girls, and then reacting with admiration; what could be more natural? In movie after movie, the barrage of wolf whistles following Sophia Loren or Marilyn Monroe as she sashays down the street is meant as an innocent sign of appreciation. To today’s men, who often react with a distinct lack of sympathy to modern women’s complaints about catcalling and its associated behavior, what we’re complaining about is no different, at least in intent, from behavior that 40 years ago was seen as a badge of honor for attractive women. I have no idea whether it was really felt and taken that way by women in past generations; perhaps that is nothing more than a romantic, rose-colored view of what was clearly felt as harassment even then. I truly have no idea. What I do know is how the modern forms of this behavior look and feel to me and to every woman I’ve ever asked about it.

If you’re on some form of social media these days, it’s almost impossible that you haven’t witnessed some of the latest volleys in the catcall wars. There has been a steady stream of women trying to fight back in one way or another. One of my personal favorites is a young woman who handed out cards to the men harassing her on the street, trying to educate them, almost always without success, about how it felt to be the recipient of that attention. Another one I really like is http://stoptellingwomentosmile.com/; because what could really be more innocuous than telling a woman how pretty she would be if only she would smile, or telling her “hey, smile, it could be worse”? Except, how the hell do you know it could be worse? Perhaps someone just died. Perhaps I just got a cancer diagnosis. Or maybe I have really bad cramps. Or maybe it’s none of your business whether I smile or not.

And of course, one of the more infamous recent examples, where a woman (an actress hired for the video) walks around New York for 10 hours with a cameraman surreptitiously recording what this woman has to put up with. She doesn’t say anything to these men, she doesn’t look at them, she’s not dressed in provocative clothing and she engages with them in no way, and yet she is harassed over 100 times. As female Facebook friend after friend reposted this video, there was a very predictable explosion of comment threads and some got pretty nasty. One friend posted an interesting observation, which I’ve now tried to verify for myself: she says that when she’s all dressed up (as she often is quite beautifully), she actually attracts less attention from men. But it’s when she’s in her sweats, no makeup, hair unwashed that she can’t seem to shake the catcalls, the men following her, harassing her.

Read more »

The aural time traveler

by Charlie Huenemann

Some years back my musicologist friend introduced me to the charming world of gramophones. (A brief history may be in order: before there were iPods and YouTube, there were CDs; before that, there were vinyl records, still very much in vogue among hipsters today; and before that – from roughly 1895 to 1950 – there were thick and heavy shellac records that were to be played at 78 revolutions per minute. That's what I'm talking about. Wikipedia, of course, offers a much longer history.) I became an enthusiast on the spot, and we formed the Logan Gramophone Society, which meets on secret dates set to the lunar calendar, and involves scones, tea, and fezzes. Our university's music department has a veritable treasure trove of old records, an inexhaustible source of the quirky, the charming, and the incredible.

The earliest recordings were made before there were any amplifiers, let alone mixers or equalizers. Recording artists played into a horn, and a mechanism translated their sound waves into a wavy line scratched into wax. (We should all devote a moment to marveling at the fact that the sound of a singer accompanied by strings and tuba can all get squashed into a single wavy line.) That wavy line was then wrapped into a spiral and stamped upon many shellac discs, which were sold through record stores. Consumers would buy the discs, take them home, place them upon their turntables, set a needle at one end of the spiral, and send the disc into motion. Then the whole process would reverse itself: the wavy line would vibrate the needle, and those vibrations would be sent out the horn for all to enjoy.

The point is that the sound travels from producer to consumer without ever disappearing into some electronic circuit to be changed or shaped. Jascha Heifetz plays his violin into a horn, those vibrations become scratches, those scratches become vibrations, and I hear Heifetz play. Everything is on the surface; nothing ever goes into a black box. I am one step away from direct, physical connection to Heifetz, as I would be if I handled his bow or tried on his hat. It is a form of aural time travel.

Read more »

The continuing relevance of Immanuel Kant

by Emrys Westacott


Immanuel Kant (1724-1804) is widely touted as one of the greatest thinkers in the history of Western civilization. Yet few people other than academic philosophers read his works, and I imagine that only a minority of them have read in its entirety the Critique of Pure Reason, generally considered his magnum opus. Kantian scholarship flourishes, with specialized journals and Kant societies in several countries, but it is largely written by and for specialists interested in exploring subtleties and complexities in Kant's texts, unnoticed influences on his thought, and so on. Some of Kant's writing is notoriously difficult to penetrate, which is why we need scholars to interpret his texts for us, and also why, in two hundred years, he has never made it onto the New York Times best seller list. And some of the ideas that he considered central to his metaphysics–for instance, his views about space, time, substance, and causality–are widely held to have been superseded by modern physics.

So what is so great about Kant? How is his philosophy still relevant today? What makes his texts worth studying and his ideas worth pondering? These are questions that could occasion a big book. What follows is my brief two penn'th on Kant's contribution to modern ways of thinking. I am not suggesting that Kant was the first or the only thinker to put forward the ideas mentioned here, or that they exhaust what is valuable in his philosophy. My purpose is just to identify some of the central strains in his thought that remain remarkably pertinent to contemporary debates.

1. Kant recognized that in the wake of the scientific revolution, what we call “knowledge” needed to be reconceived. He held that we should restrict the concept of knowledge to scientific knowledge–that is, to claims that are, or could be, justified by scientific means.

2. He identified the hallmark of scientific knowledge as what can be verified by empirical observation (plus some philosophical claims about the framework within which such observations occur). Where this isn't possible, we don't have knowledge; we have, instead, either pseudo-science (e.g. astrology), or unrestrained speculation (e.g. religion).

3. He understood that both everyday life and scientific knowledge rest on, and are made orderly by, some very basic assumptions that aren't self-evident and can't be entirely justified by empirical observations. For instance, we assume that the physical world will conform to mathematical principles. Kant argues in the Critique of Pure Reason that our belief that every event has a cause is such an assumption; perhaps, also, our belief that effects follow necessarily from their causes; but many today reject his classification of such claims as “synthetic a priori.” Regardless of whether one agrees with Kant's account of what these assumptions are, his justification of them is thoroughly modern, since it is essentially pragmatic: they make science possible. More generally, they make the world knowable. Kant in fact argues that in their absence our experience from one moment to the next would not be the coherent and intelligible stream that it is.


Monday, November 17, 2014

More Is Different

by Tasneem Zehra Husain
Emergence is a word deep enough to lose oneself in. It alludes to realities appearing, not suddenly or out of nothing, but slowly dissolving into our consciousness – like a fuzzy picture coming into focus. It refers to a gradual process, one that is smooth – not jerky – and yet results in an outcome that could not have been predicted from the origin.
Examples of such behavior abound in the natural world. In stark contrast to human mobs, there are groups that exhibit increasing coherence and/or intelligence as they grow (in fact, intelligence itself is said by some to be an emergent phenomenon). When birds or fish amass in large numbers, they move in ordered flocks, exhibiting a degree of synchronization and structure that is lacking in smaller groups. The organization of ice crystals is not hinted at in the molecules of water, any more than the structure of hurricanes is stamped onto individual air molecules, or instructions for avalanches are coded into grains of sand. So long as objects are studied in isolation, they display no hint of what becomes possible in groups that exceed a certain threshold. Emergent behavior is a property not of individuals but of the collective; it arises naturally, out of multitudes – a perfect illustration of a whole being greater than the sum of its parts.
In science, a phenomenon is deemed emergent if it cannot be attributed to the properties of the constituents of a system, but instead arises from the connections between them. It is an ability that resides not in the nodes themselves, but in the network they create. Think of that childhood game of join the dots. With each new dot that is added, the possibilities multiply; a new dot can potentially connect to every single dot that already exists, forming bonds that both strengthen and transform the system. The number of possible connections thus grows far faster than the number of nodes: n dots permit n(n-1)/2 links.
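The arithmetic behind the join-the-dots picture can be sketched in a few lines of Python (the function name is my own, purely illustrative): each new dot can link to every dot already present, so the count of possible connections grows quadratically while the dots grow only linearly.

```python
# Possible pairwise links among n dots: each new dot can connect
# to every dot already present, giving n*(n-1)/2 links in all.
def possible_links(n):
    return n * (n - 1) // 2

for n in (2, 5, 10, 100):
    print(n, possible_links(n))  # e.g. 10 dots -> 45 possible links
```

Doubling the dots from 50 to 100 roughly quadruples the possible links (1,225 to 4,950), which is one way to see why collective behavior can appear only past a certain threshold of group size.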
It so happens that in physics, there are many cases where macroscopic and microscopic behavior are best described in different vocabularies. One oft-quoted example is that of classical mechanics ‘emerging' from quantum mechanics. At the turn of the last century, the reluctant revolutionary Max Planck was forced to declare a resolution to a set of problems that had plagued physicists for years. All these contradictions would disappear, he grudgingly said, if one assumed that energy could only be absorbed and emitted in discrete blocks – he called these quanta. Hardly had the quantum been unleashed before it spread like a forest fire throughout physics. Suddenly, it became apparent that many quantities we had considered infinitely divisible existed instead in multiples of a smallest basic unit. Zeno's paradox finally had a solution: you could not keep covering “half the remaining distance” between yourself and something else, because beyond a certain point even space can no longer be subdivided. The quantization penetrated down to the very structure of the atom, which seemed to allow only certain well-defined orbits in which electrons could revolve around the nucleus.


Monday Poem

“The attitude of man is twofold in accordance with the two
basic words he can speak.”
—Martin Buber, I and Thou

.
Conjugation

In a diner my elbows rest upon Formica. I hold a book.
Curlicues of vapor rise above the coffee you’ve just poured.
I lure Thou with my take on Buber
hoping to shift the poles of my twofold attitude
from I-I to a here beyond that incarceration
when Thou and I might disappear in conjugation
.

Jim Culleny
9/18/13

7500 Miles, Part III: Ain’t No One Gonna Turn Me Around

by Akim Reinhardt

I've made some deep runs in my time.

I once drove non-stop from central Wyoming to eastern Iowa before passing out at a highway rest stop for a couple of hours, waking up with a scrambled brain, driving the short distance to Illinois, then staring with confusion and regret at the chili cheese omelette I'd ordered at a pre-cell truck stop where drivers sat with piles of quarters in front of them at booths hard wired to pay phones.

Another time I went from the Nevada-Utah line to eastern Nebraska, staving off sleep during the last several hours by frequently leaning my head out the window at 80 miles per hour, the wind and rain whipping me in the face beneath the dark night sky.

My most recent super haul was from Windsor, Arizona to northeastern Kansas, where I'd finally pulled over to sleep in a rural parking lot. But that was fifteen years ago. I was in my early thirties back then.

In the months leading up to the trip I've chronicled here, I had wondered: What do I still have left in me? What would the road be like for me in my late forties?

I had no illusions. I knew I wouldn't be busting tail nonstop for 1,200 miles. Even in my prime that was at my outer limits. It was unthinkable now.

But beyond the issue of endurance, I was more intrigued, and even fretful, about how I would take to the road.

What would it be like to long haul now compared to back then? What would my state of mind be after 600 miles? Seven hundred? Eight hundred, if that was even feasible. Would I still find driving alone for vast stretches to be meditative? Would I still marvel at the expanse of this continent? Or would I simply be middle aged and grumpy? Would I be helpless to enjoy a solo, long distance drive as I once had? Would I just be petty and impatient to reach my destination?

Even before I first left Maryland back in late August, I knew this would be the jaunt: from Pine Ridge Reservation in South Dakota to Reno, Nevada. No other stretch of the trip is much more than 500 miles. This one's over 1,200.

Going in, I knew that South Dakota to the Nevada-California border in late September would sort it all out.


American Craziness: Where it Came from and Why It Won’t Work Anymore

by Bill Benzon

During the course of my adult life I have witnessed the collapse of the political culture of my nation, the United States of America. To be sure, there have been some good things – the Civil Rights movement, for example – but the framework that served from the nation’s founding through the end of World War II no longer functions well.

Over the last three or four decades the prison population has increased enormously, as has economic inequality, and during this century we’ve become mired in an enormously destructive, expensive, and militarily ineffective series of wars in Iraq and Afghanistan. As far as I can see there is no near-term prospect of resolving either the internal problems or the hopeless and ill-founded war on terrorism.

How did this happen?

Cultural Psychodynamics

The problem, I believe, is rooted in the cultural psychodynamics of the nation-state. The sociologist Talcott Parsons diagnosed it in his classic 1947 article, “Certain Primary Sources and Patterns of Aggression in the Social Structure of the Western World” (full text online HERE). At some length and with great sophistication Parsons argued that citizens of Western nations project many of their aggressive impulses onto other peoples so that, in attempting to dominate those peoples, they are, in a psychological sense, attempting to attain mastery over themselves. I fear this problem is not only a Western one, but that’s a side issue in this context. It’s not merely that I’m writing about America, but that America remains the most powerful nation in the world, with by far the largest military establishment. Through that establishment America has tethered the rest of the world to its internal psychodynamics.

That’s crazy.

If by chance Parsons’ argument strikes you as improbable, well, I urge you to read his essay in full. Pending that, I offer as a bit of supporting evidence an extraordinary statement made by Mario Cuomo, ex-governor of New York, in an interview published in The New York Times Magazine on March 19, 1995:

The Second World War was the last time that this country believed in anything profoundly, any great single cause. What was it? They were evil; we were good. That was Tojo, that was that S.O.B. Hitler, that was Mussolini, that bum. They struck at us in the middle of the night, those sneaks. We are good, they are bad. Let’s all get together, we said, and we creamed them. We started from way behind. We found strength in this common commitment, this commonality, community, family, the idea of coming together was best served in my lifetime in the Second World War.

That’s what Parsons was talking about.

I have no idea whether or not Cuomo is familiar with Parsons but, while he is certainly an intelligent and sophisticated man, he is not an academic. When he spoke those words he was speaking as a practical politician skilled at the complex and messy business of governance. The socio-cultural milieu that Parsons analyzed is the arena in which Cuomo lived his professional life. Judging by his political success, he had a good intuitive grasp of those dynamics.
