It’s Time to Change the World, Again

by Bill Benzon, appendix by Charlie Keil


Writer’s Cypher, UMMI Living Village Community Garden, Jersey City, NJ

Adolescents seem gifted in the belief that, if only the adults would get out of the way and grow up, the world would be a much better place. In that sense I am still, at 66 going on 67 (Pearl Harbor Day of this year), an adolescent. I still believe that the world needs changing, though it’s been decades since I naively thought that letters to The New York Times were a reasonable means to that end. And I still believe that it’s the adults that need changing.

But I must also move forward in the realization that I too am an adult, and have been so for some time now.

What to do?


I painted this when I was nine or ten.

I was ten years old when the Russians put Sputnik into the heavens. I still remember the October evening when my father took me outside and pointed to a moving light in the night sky. “That’s it.”

That’s when my personal history joined world history. That’s the first event that was both personally meaningful to me–I’d been drawing sketches of spaceships for years and had even done a painting or two–and of world-historical importance. By the time I was old enough to be an astronaut, however, I’d changed.

I’d gone to college, marched against the Vietnam War, done my conscientious objector’s alternative service in the Chaplain’s Office at Johns Hopkins, and lost all interest in becoming an astronaut. Inner space had become my territory.

I got my PhD, then a job at the Rensselaer Polytechnic Institute, was astounded when Reagan was elected and re-elected–that hadn’t been in the plan, no it hadn’t. And I was really surprised when the Soviet Union collapsed. After all, I’d grown up during the height of the Cold War, read articles about back-yard bomb shelters, and had even picked out the spot in our back yard where a shelter should go. I figured that, whatever else happened in the world, I’d go to my grave in the shadow of the Cold War.

Read more »



Making sense of suicide (and Matt Walsh’s nonsense)

by Grace Boey

Imagine this: someone secretly laces your coffee with meth, every morning, for 28 mornings. Over the first week, you become increasingly hyperactive, and start to bubble with confidence and energy. You feel great, but by day 7, your behaviour starts to get erratic, and you’re irritated with everyone else who can’t keep up. By day 21, you’re having flashes of paranoia, and freak out from time to time because your mind keeps racing, and you’re convinced everyone’s watching you move too fast.

By day 28, you haven’t slept for a week. You feel invincible, so much so that you decide to take all the drugs you’ve got to see if it will kill you. Because that which doesn’t kill you makes you stronger, right? And if it does kill you, you’ll die feeling amazing… and dying would be such an incredible thing to do. In fact, this had damn well better be fatal. Thanks to the meth and sleep deprivation, you are so confused, irrational and psychotic, that this babbling seems entirely sensible.

Was the suicide attempt ‘your own decision’, in any meaningful sense? Of course it wasn’t. It certainly wasn’t my decision, when those very events happened to me a couple of years ago. The only difference? No one had secretly laced my coffee with drugs (though they might as well have). The terrifying effects were a product of my very first full-blown bipolar manic episode. Thankfully, I survived—although the doctors who treated me assured me I could just as easily not have. I had no clue what was happening at the time; my mania had swept me away, before I even realized anything was amiss.

Despite all this, people like Christian blogger Matt Walsh would say I had committed a “terrible, monstrous atrocity” that was entirely my decision. On August 12, one day after Robin Williams appeared to have killed himself as a result of depression, Walsh published an article with the headline “Robin Williams didn’t die from a disease, he died from his choice.” In it, he claimed that “suicide does not claim anyone against their will”. Depression—and by extension of Walsh’s arguments, all mental illness—is not responsible for suicide: you are. When a huge backlash ensued, he stuck to his guns and wrote a detailed response to his critics.

When I first came across the headline of Walsh's original post, I took a deep breath, read the article, took another deep breath… and read it again. My conclusion at the end of this exercise was exactly the same as my initial response: what a load of exploitative, uninformed rubbish. Walsh's statements reflect deep misconceptions about mental illness, competent decision-making and ‘free will’, which (unfortunately) hinge on the supernatural metaphysics that accompanies Christianity. It angers me that someone like this should feel entitled to piss on the grave of Robin Williams with a headline like that. And personally, as someone who has attempted suicide in the grips of both mania and depression, I am insulted by Walsh's backward ideas.

Read more »

Monday, August 18, 2014

Arguing to Win

by Scott F. Aikin and Robert B. Talisse

In the course of discussing the central themes of our recent book, Why We Argue (And How We Should), with audiences of various kinds, one kind of critical response has emerged as among the most popular. It deserves a more detailed reply than we are able to provide here; nonetheless, we want to sketch our response.

Why We Argue presents a conception of proper argumentation that emphasizes its essentially cooperative and dialectical dimension. Very roughly, our view runs as follows. The central aim of cognitive life is to believe what is true and reject what is false. We pursue this by appealing to our evidence and reasons when forming and maintaining our beliefs. Yet in pursuing this aim, we quickly encounter conflicting and inadequate evidence and reasons; furthermore, we discover that we each must rely upon other people as sources of evidence and reasons. Importantly, the others upon whom we must rely do not always speak univocally; they often provide conflicting reasons and evidence. Accordingly, in the pursuit of our central cognitive aim, we confront the inevitability of disagreement. Argumentation is the process by which we attempt to resolve disagreement rationally. Consequently, argumentation is inescapable for a rational creature like us; and the aspiration to argue properly is an indispensable corollary of our central cognitive aim.

The project of arguing well requires individuals to interact with each other in certain ways, and to avoid interacting in other ways. More specifically, in order to argue well, we must individually attempt to take the reasons, perspectives, arguments, criticisms, and objections of others seriously; we must see even those with whom we most vehemently disagree as fellow participants in the process of proper argumentation, and we must engage with them on those terms. This means, among other things, that when engaging in argument, one must seek to make the most of the reasons and considerations offered by one's opposition. Verbal tricks, insults, threats, and obfuscation are failures of argumentation, even when they prove effective at closing discussion or eliciting assent. A lot of Why We Argue (And How We Should) is devoted to cataloguing and dissecting common ways in which argumentation, especially political argumentation, fails.

So much for the nutshell version of our conception of argumentation. Let's turn now to the critical reaction it commonly invites. Critics say that our view is misguided because it cannot acknowledge the brute fact that most often we argue not to rationally resolve disagreement, but to end disagreement; and the favored way of ending disagreement is by winning an argument. Here a sports analogy is often introduced. Critics often claim that just as one plays baseball not (primarily) for the exercise, camaraderie, or the cooperative teamwork, but rather to win baseball games; so it is that when one argues, one argues to win.

Read more »

The Fields Medal

by Jonathan Kujawa

The big news in math this week was the opening of the quadrennial International Congress of Mathematicians (ICM) in Seoul. A number of prestigious awards are given at the ICM. Most famously this includes the Fields medal and the Nevanlinna prize (aka the Fields medal for computer science). Up to four winners of the Fields medal are announced along with the winner of the Nevanlinna prize. All the winners must be no older than 40.

I had the pleasure to attend the 2006 ICM in Madrid. This is the ICM famous for Grigori Perelman refusing to accept the Fields medal for his work in finishing the proof of the Poincaré conjecture. Perelman (or at least the media version of him) comes across as the stereotypical eccentric mathematician uninterested in worldly things. Fortunately for the PR folks, this year's winners all appear to be the sort you'd enjoy having over for dinner and drinks.

This year the Fields medal went to Artur Avila, Manjul Bhargava, Martin Hairer, and Maryam Mirzakhani. The Nevanlinna prize went to Subhash Khot. An excellent profile of each of the winners, including very nicely done videos, can be found on the Quanta website. The profiles are a bit short on the actual math of the winners. If you'd like a more meaty discussion of their work, former Fields medalist Terry Tao wrote blog posts here and here giving a more technical overview. Even better, former Fields medalist Timothy Gowers is blogging from the ICM itself! He's giving summaries of the main talks as well as his more general impressions while at the event. I can also recommend that you check out the excellent overviews of some of the winners' work on John Baez's Google+ page.

Rather than talk about the details of the winners' work [1], I wanted to point out a common meta-mathematical feature of their research. This is the idea of studying a collection of objects as a whole, rather than one by one.

Read more »

Monday Poem

Tide

the way it
comes, goes,
surges, disappears,
a perfect metaphor
for shapes of time,
overused as moon
for that which vanishes
and reappears.

quiet now, the wedding past
too much so—
a house that buzzed
now hushed, silence loud
sharp, slimmer
than a midnight crescent

silence also
comes, goes
empties, spills, ebbs and fills,
evaporates and billows like a cloud
above a sugarbush still
boiling down sweet water
for its essence
.

by Jim Culleny
7/25/14

Socrates, evolution, and the word “theory”

by Paul Braterman

What's wrong with this argument? More than you think!

All men are mortal.
Socrates is a man.
Therefore Socrates is mortal.

It's perfectly valid, meaning that if the premises are true, the conclusion must also be true. Despite this, as Bertrand Russell explained very clearly many years ago,[1] the argument is almost totally worthless.

There is no real doubt that Socrates is mortal. Just look at the poor chap, clearly showing his 70 years. Bent, scarred from the Peloponnesian War, his brow furrowed by decades of unhappy marriage, and even more unhappy attempts[2] to persuade his fellow citizens that the best form of government is a fascist oligarchy. Besides, he is on trial for doubting the existence of the gods, and the news from the Agora is not good. Take my advice, and do not offer him life insurance.

Even if we didn't know about his circumstances, we would readily agree that he is mortal. We see decrepitude and death around us all the time, few people have been known to live beyond a hundred years, none beyond 150, and we have no reason to believe that Socrates is any different. In fact, from our experience, we are a lot more certain that Socrates is mortal than we are that all men are mortal. Ganymede, Elijah, and the Virgin Mary were all, according to various traditions, taken directly up into heaven without having to go through the tedious process of dying. However, no Zeus-worshipper or biblical literalist or devout Catholic would for such reasons doubt the mortality of Socrates. So the premise, that all men are mortal, is actually less certain than the conclusion, and if we seriously doubted Socrates's mortality, we would simply deny that premise. In other words, this classic example of deductive logic tells us nothing that we didn't already know.

Read more »

The Psychology of Procrastination: How We Create Categories of the Future

by Jalees Rehman

“Do not put your work off till tomorrow and the day after; for a sluggish worker does not fill his barn, nor one who puts off his work: industry makes work go well, but a man who puts off work is always at hand-grips with ruin.” Hesiod in “The Works and Days”


Paying bills, filling out forms, completing class assignments or submitting grant proposals – we all have the tendency to procrastinate. We may engage in trivial activities such as watching TV shows, playing video games or chatting for an hour and risk missing important deadlines by putting off tasks that are essential for our financial and professional security. Not all humans are equally prone to procrastination, and a recent study suggests that this may in part be due to the fact that the tendency to procrastinate has a genetic underpinning. Yet even an individual with a given genetic make-up can exhibit a significant variability in the extent of procrastination. A person may sometimes delay initiating and completing tasks, whereas at other times that same person will immediately tackle the same type of tasks even under the same constraints of time and resources.

A fully rational approach to task completion would involve creating a priority list of tasks based on a composite score of task importance and the remaining time until the deadline. The most important task with the most proximate deadline would have to be tackled first, and the lowest priority task with the furthest deadline last. This sounds great in theory, but it is quite difficult to implement. A substantial amount of research has been conducted to understand how our moods, distractibility and impulsivity can undermine the best-laid plans for timely task initiation and completion. The recent research article “The Categorization of Time and Its Impact on Task Initiation” by the researchers Yanping Tu (University of Chicago) and Dilip Soman (University of Toronto) investigates a rather different and novel angle in the psychology of procrastination: our perception of the future.

Read more »

Paul’s Boutique: An Appreciation

by Misha Lepetic

“If you explain to a musician he'll tell that
he knows it but he just can't do it”
~ Bob Marley

It's hard to imagine that the Beastie Boys released “Paul's Boutique” around this time, 25 years ago. Even more astonishing is the fact that I recently had two separate conversations with members of the so-called Millennial Generation, which resulted in the extraordinary discovery that neither person had even heard of “Paul's Boutique.” Now this may make me sound like an ornery codger complaining about how the young folk of today are illiterate because they have never heard of (insert name of your own pet artist). But taken together, these two events require me to submit a modest contribution to keeping the general awareness of “Paul's Boutique” alive and well.

What makes “Paul's Boutique” so extraordinary and enduring? The sophomore effort by the brash NYC trio debuted in 1989, and was the much-anticipated follow-up to “Licensed to Ill.” But instead of a new set of frat party anthems along the lines of “Fight For Your Right (To Party),” listeners were treated to a continuous magic carpet woven out of a kaleidoscope of samples. Romping over this dense, schizophrenic bricolage, MCA, Ad-Rock and Mike D traded lightning-quick call-and-response rhymes that embraced the usual MC braggadocio but at the same time drew on a vast range of sources and styles. The effect, to this day, is a delirious sort of aural whiplash.

No one is clear on how many songs were actually sampled, although the number is certainly well over a hundred. The exegesis of both samples and lyrical references is a time-honored tradition, too. Around 1995, one of the first sites that ever made me think the World Wide Web might be a good idea was (and continues to be) the Paul's Boutique Samples and References List. When studied, Torah-like, alongside the Beastie Boys Annotated Lyrics and the record itself, one begins to appreciate the catholic taste of both the rappers and their producers, the inimitable Dust Brothers, who would go on to provide much of the genius behind Beck's seminal “Odelay” album a few years later.

Read more »

In Bed

by Tamuira Reid

I don't like writing about depression. Because it's hard to get right in words. Because I sound like an asshole when I try. Because I am too close to it still. Because my memory of what happened feels faulty at best.

I remember light streaming through the blinds, big fat rays of sun, hitting me in the face. I remember a phone next to me, maybe in the palm of my hand, maybe wedged between the mattress and my thigh. Cold coffee on the nightstand. Cigarette ash on the sheets. I remember the sounds of kids playing on the street below, throwing rocks at a metal shop gate.

Friends told me to buck up. Pull it together. Muscle through. They said things like fake it till you make it and everything happens for a reason. They blamed it on global warming. Growing pains. Venus is in retrograde, after all.

They wanted me to will myself better and all I wanted was to write my will. I thought I was dying. I believed with every fiber left of my being that I was dying. Case closed. The party is over.

The more I needed people the more I retreated from them. How could I tell them that I couldn't feel my body? That it was completely disconnected from my mind? I was a person in parts, each part trying to function in its dysfunction. Pieces that no longer fit together in a way that made any sense.

My neighbor at the time, a well-meaning philosophy professor who left his apartment each day only long enough to teach and buy wine, told me that depression comes in waves. But that made it sound too beautiful. There was nothing good about the bad. He suffered from melancholy, a sort of condition he became addicted to, enamored of. A powerful, deep sadness that became life-affirming for him. People broke his heart but in a pretty, poetic way. And this somehow gave him buoyancy in this world.

But my depression felt different.

Read more »

Do Androids Dream of Electric Tomatoes? Food and Nostalgia

by Dwight Furrow

The world of food and wine thrives on a heavy dose of nostalgia. Culinarians (“foodies” in the vernacular) chase down heritage tomatoes, ferment their own vinegar, and learn to butcher hogs in the name of “how things used to be” before the industrial food business created TV dinners and Twinkies. As we scour the Internet for authentic recipes, we imagine simpler times of family farms supporting family feasts consisting of real food, prepared in homey, immaculate kitchens with fruit pies on the windowsill, and the kids shelling beans at the table. Similarly, the wine industry continues to thrive on the romantic myth of the noble winemaker diligently tilling a small vineyard year after year to hand-produce glorious wines that taste of the local soil and climate.

Of course, in reality the winemaking of days past was not so romantic. Bad weather would have ruined some vintages, and difficulties in controlling fermentation temperatures and unsanitary conditions in the winery rendered many wines barely drinkable. As to the way we ate in the not-too-distant past, for most people, food was scarce, expensive, of poor quality and often unsafe. Kitchens, if they existed, were poorly equipped and their operation depended on difficult, relentless work by women. Only the wealthy could eat in a manner approaching the quality of contemporary nostalgic yearnings, but that quality usually depended on the work of underpaid kitchen staff after slavery was abolished.

Nostalgia is a form of selective memory, history without the bad parts, enabling us to enjoy the past without guilt.

Does this dependency on myth render our contemporary fascination with the foods of the past a kind of kitsch—a sentimental, clichéd, easily marketed longing that offers “emotional gratification without intellectual effort” in Walter Benjamin's formulation, an aesthetic and moral failure? Worse, is this longing for the past a conservative resistance to the modern world? The word “nostalgia” has Greek roots—from nostos and algia, meaning “longing to return home”. Are contemporary culinarians and wine enthusiasts longing for a return to the “good” old days?

Read more »

The Subtle Power of Financial Jargon

by Kathleen Goodwin

Few students of colonial history can deny the power of spoken and written language to subject and subdue a population. Zia Haider Rahman's “dazzling debut” of a novel, “In the Light of What We Know”, contrasts two South Asian characters who attended Oxford together as undergrads—a privileged Pakistani narrator who becomes an investment banker in London and his friend Zafar, an impoverished refugee of the 1971 Bangladeshi Liberation War turned Harvard-educated international human rights lawyer. One of the central themes of the novel is the way language can exert power over individuals and groups, as well as entire nations. Spoken language is an obvious manifestation of the tension created by modern-day neo-colonialism or the 1971 splintering of Bengali-speaking East Pakistan from Urdu-speaking West Pakistan. But Rahman also explores a parallelism in the way language—in the form of industry jargon, acronyms and other forms of coded phrasing—can create power structures with remarkable potency.

“In The Light of What We Know” skips around temporally, but the narrative is centered on London in 2008, in the midst of the unfurling financial crisis. The nameless narrator is revealed to be not only a banker but the head of the mortgage-backed securities unit of his (also unnamed) global investment bank, and thus on the verge of losing his job as the public condemns him and his counterparts for the calamity taking place. Tellingly, Rahman's résumé includes a stint as a Wall Street investment banker prior to becoming a novelist. His purpose does not appear to be to crucify the financial sector; rather, his novel explores the great irony of the financial crisis: the securities derived from residential mortgages, which, when the American housing market collapsed, became essentially worthless and set off the chain of events that have changed history forever, were vetted and encouraged by the very entities that should have understood and prevented the systemic risk these securities posed to global markets. Rahman's narrator explains this as a function of the incestuous and hierarchical relationship between the big banks and the ratings agencies and regulators. The narrator describes a financial club of sorts, headed by chummy Oxford classmates who maintain a revolving-door hiring policy and, most importantly, speak a financial language incomprehensible to the ignorant public. The critical point is the way these hidden power structures allowed the conditions preceding the financial crisis to occur. The ratings agencies and regulators were compliant with the investment banks, while the rest of society was unaware of the huge gambles being sanctioned, which eventually proved detrimental to the stability of the global economy.

Read more »

Bouquet of Nerves

by Shadab Zeest Hashmi

Starry night, a large starry night with infinite trees, is the background of what seems to be an architectural form— a balcony, bridge, courtyard with pillars? In the foreground, a sphere with a curve draped over it like an arm. This drawing has the expansiveness that suggests eternity (or waiting for what seems like an eternity) and monumentality, as well as intimacy, a sense of security. A birthday present from my teenage son, this abstract drawing is titled “colic.” The architectural form is a crib, the starry sky is the sleepless, endless night of shared anxiety between a mother and her colicky newborn.

I am handed this drawing on my return from an evening in New York City, my eyes still filled with the lambent and the monumental, with sorrows hidden under careful inscriptions; riches, anxiety, loneliness, poverty, and also a plenitude of heart, a sharing of burdens. My son’s drawing belongs in the series of photographs I have just taken of the city— of monumentality and intimacy: endless tunnel ceilings, vertiginous buildings, old trees, sparkle, strangers caught sharing a laugh as they contend with waiting in queue together. Wear this city like a jewel if you will, or a sensible shoe— carry it like a bouquet of nerves, or an empty envelope— New York is resplendent and humble, so high and mighty it gives you the cold shoulder, so electric it sings you into rebirth.

“Colic” is about birth, and the anxiety and excitement of growth. When I read New York into this drawing, I see the loftiness of empire— starry and sorry— the darkness of hierarchies, the bond of empathy. I see the struggle to meet the definition of nationhood; the founding fathers are in the high rises, in the homeless, in the cogs and wheels, in the sobs and hiccups of the centuries.

But it is my birthday today and this drawing jolts me into the realization that the night sky is still full of uncertainty, mystery and hope— colic is still a good metaphor for life, that I still long for my own mother’s protective arms, that nothing is sweeter than to be remembered as an extended arm by my son.

Monday, August 11, 2014

The Mind Matters

by Yohan J. John

What is the mind? And what is its relationship with the body? Philosophers, psychologists, cognitive scientists and neuroscientists have all attempted to bring their professional heft to bear on the “mind-body problem”, but consensus remains elusive. At best, mainstream academics and researchers share a metaphysical commitment: the belief that the seemingly immaterial mind emerges from ordinary matter, specifically the brain. This position — known as materialism or physicalism — has replaced mind-body dualism as the mainstream academic position on the mind-body problem. According to dualism, mind and matter are completely separate substances, and mind (or soul, or spirit) merely inhabits matter. Dualism is a problematic position because it doesn’t offer a clear explanation of how the immaterial mind can causally interact with the material body. How can the immaterial soul push the buttons in the body’s control room… if it doesn’t have hands?

Materialism avoids this issue by denying the existence of two separate substances — mind is matter too, and is therefore perfectly capable of influencing the body. But having made this claim, many materialists promptly forget about the influence of the mind on the body. There seems to be a temptation to skip the difficult step of linking complex mental phenomena with neural processes. Many people think this step is just a matter of working out the details, and they readily replace mental terms like ‘intention’, ‘attitude’, or ‘mood’, with terms that seem more solid, like ‘pleasure chemical’, ‘depression gene’, or ‘empathy neuron’. But these concepts have thus far proved woefully inadequate for constructing a mechanistic theory of how the mind works. Rather than explaining the mind, this kind of premature reductionism seems to explain the mind away. While we work out the details of how exactly the brain gives rise to intentions, attitudes, and moods, we should not lose sight of the fact that these kinds of mental phenomena have measurable influences on the body.

Recent studies linking epigenetics, neuroscience and medicine reveal that subjective experience can have a profound impact on our physical and mental well-being. Mounting evidence is telling us something that was often neglected in the incomplete transition from dualism to materialism — that the mind is a crucial material force that influences the body, and by extension, the world outside the body.

Read more »

How to say “No” to your doctor: improving your health by decreasing your health care

by Carol A. Westbrook

Has your doctor ever said to you, “You have too many doctors and are taking too many pills. It's time to cut back on both”? No? Well I have. Maybe it's time you brought it up with your doctors, too.

Do you really need a dozen pills a day to keep you alive, feeling well, and happy? Can you even afford them? Is it possible that the combination of meds that you are taking is making you feel worse, not better? Are you using up all of your sick leave and vacation time to attend multiple doctors' visits? Are you paying way too much out of pocket for office visits and pharmacy co-pays, in spite of the fact that you have very good insurance? If this applies to you, then read on.

I am not referring to those of you with serious or chronic medical conditions, such as cancer, diabetes, and heart disease, who really do need those life-saving medicines and frequent clinic visits. I am referring here to the average healthy adult, who has no major medical problems, yet is taking perhaps twice as many prescription drugs and seeing multiple doctors three to four times as often as he would have done ten or fifteen years ago. Is he any healthier for it?

There is no doubt that modern medical care has made a tremendous impact on keeping us healthy and alive. The average life expectancy has increased dramatically over the last half century, from about 67 years in 1950 to almost 78 years today, and those who live to age 65 can expect to have, on average, almost 18 additional years to live! Some of this is due to lifestyle changes, but most of the gain is due to advances in medical care, especially in two areas: cardiac disease and infectious diseases, particularly the treatment of AIDS. Cancer survival is just starting to make an impact as well. But how much additional longevity can we expect to gain by piling even more medical care on healthy individuals?

Read more »

Poem from the Calamity Jane Chronicles

by Mara Jebsen

Good morning, Calamity Jane

–plenty disaster for breakfast again.

Staunching of blood. Stemming of tide. Not

enough Dutchboys with fingers in the dam. . .

Do Wake Up Calamity Jane. We are killing

even our favorites, again. Soldiers & small

flash-mouth boys. Pedestrian

death.—Plenty of martyr for our

coffers. or Coffins. Plenty…fodder for our

Martinis–I mean plenty…

Don't get muddled, Calamity Jane.

Terror's a river that rises again. Brace

for the hate that laps up the spine,

Stockpile dreams for a dryer time.

Poem

RANT FOR GAZA

Now is the summer of our discontent.
Banned bombs,
Made in America,
once again rain down mercilessly
on the world’s largest open-air prison.
Photos of Raggedy Ann dolls
pulled out from Gaza’s burning debris
make steeliest men sob.

How do middle-class Muslim youth
from Seattle to Srinagar
manage, to the extent they do,
their blind rage,
their helplessness
at the most compelling moral issue
of our generation:
the organized ethnic cleansing
of Palestinians,
as well their land
by the Zionists,
aided by the most mighty democracy?

America,
you arm & enrich a colonial settler state,
inserted into Palestine
by Imperial design,
your cop on the beat in the Mid East
who assures the oil flows smoothly,
& despots keep the subjects quiet.

Read more »

Monday, August 4, 2014

Karl Marx’s Guiding Idea

by Emrys Westacott


“Nothing human is alien to me.” This was Karl Marx's favourite maxim, taken from the Roman writer Terence. But I think that if Marx had lived a century later, he might have added as a second choice the famous phrase sung by Sportin' Life in Gershwin's Porgy and Bess: “It ain't necessarily so.” For together these two sayings capture a good deal of what I think of as Marx's Guiding Idea, the idea at the heart of his philosophy that remains as valuable and as relevant today as in his own time. Let me explain.

Human beings have been around for a few million years, and for most of that time most people's material and social circumstances have been quite stable. The experiences of one generation were pretty much the same as the experiences of their forebears. In this respect the lives of humans were like those of other animals. Unlike other animals, however, human beings reflect on their lives and circumstances; moreover they communicate these reflections to one another. The result is religion, mythology, philosophy, history, literature, and the performing arts (all of which can arise within a purely oral culture), and eventually the natural sciences, and social studies of various kinds, such as psychology, sociology, economics, and political theory.

These diverse forms of reflection on the human condition perform various functions. One function is to explain why things are the way they are. For instance, the Bible explains why the Israelites lived in Israel (God made a promise to Abraham, and kept it, enabling Joshua's army to conquer the land); the theory of the four humours purported to explain personality differences between individuals. Another function is to justify a certain order of things. Thus, the doctrine of the divine right of kings sought to justify the institution of a powerful executive who stands above the law. The doctrine that individuals have a right to freedom of thought and expression is often cited to justify a policy of religious tolerance.

These two functions are sometimes hard to disentangle. For example, the alleged cultural inferiority of a people might be taken both to explain why they have been conquered and to justify that conquest as legitimate or even desirable. The “laws” of market competition provide an explanation of why some individuals and businesses do better than others, and these same laws are appealed to by those inclined to endorse the outcome of the competition.

Read more »

On PBS Nature Documentaries, and My Life as a Turkey

by Hari Balasubramanian

One late summer afternoon two years ago, I saw a monarch butterfly casually fly across my office window in Amherst, Massachusetts. If I had not known what monarchs do, I would have only admired its beauty and then forgotten all about it. But since I'd seen a documentary on these butterflies the previous year – how this little creature, barely a few inches long and wide, makes a 2000-mile journey from Canada and the US to certain forests in Mexico all by itself, traveling as much as 50 miles each day, navigating its way based on some unknown compass, and then returning to its northern haunts in the space of multiple generations – since I knew these facts, that moment when I glimpsed the butterfly was suddenly full of wonder and meaning.

I mention this because many nature documentaries, or even short videos, have had a similar effect on me. I like the PBS Nature series the most (full videos available here). The species, habitats and themes vary – and not all the episodes are consistent – but there is always something unusual to learn and contemplate. Just a few random examples: how the male stork, after having made a long journey from Africa to a rooftop nest in a German village, reunites with its late-arriving partner (Earthflight); how a relatively small creature such as the honey badger could be so powerful, intelligent, and – this was the most striking for me – be gifted with a fearless attitude, so much so that even lions know to stay away; or, how some astonishing friendships can be formed across species, as in the sanctuary where a goat, unfailingly and without any obvious benefit to itself, helps lead a blind, old horse on its graze every single day (Animal Odd Couples).

My Life as a Turkey

Today, though, I'll focus on a Nature episode that won the Emmy award for outstanding nature programming. First aired in November 2011, My Life as a Turkey (full video) skillfully recreates the year that naturalist and wildlife artist Joe Hutto spent raising 16 wild turkey chicks all by himself, in a forest in Florida. (The qualifier “wild” distinguishes wild turkeys from their domesticated cousins that are consumed as food.)


Hutto isn't simply a passive observer. He takes on the role of an emotionally invested mother from the moment the turkeys are born until they are independent. As he writes in Illumination in the Flatwoods, the book on which the film is based: “Had I known what was in store—the difficult nature of the study and the time I was about to invest—I would have been hard pressed to justify such an intense involvement. But, fortunately, I naively allowed myself to blunder into a two-year commitment that was at once exhausting, often overwhelming, enlightening, and one of the most inspiring and satisfying experiences of my life.”

Read more »