Ai Weiwei and the fine art of the art installation

by Mathangi Krishnamurthy

As a rule, I am wary of art installations. I am never sure if the form they take bears any relation to the political content they claim to espouse. Also, as a rule, I visit modern art exhibitions for their verbosity. The words speak to me of artistic intent that always races ahead, far in excess of its signifying objects. The intent itself I find to be of such beauty, nudging me with its faint hints of revolution and radical joy. Of course, it does worry me that I have to read the labels of things before I can calculate the impact they will have on my fervor and/or joy.

However, on the lowest rung of my pleasure-affording hierarchy lie modern art installations. I remember once visiting the Museum of Modern Art in New York City and staring hard at a diagonal tube light mounted on a wall. I also metaphorically bonked myself on the head for “Artist” not making the top three on the list of possibilities suitable to my eight-year-old self's artistic ability or lack thereof.

As I walked into Ai Weiwei's exhibition “Evidence” I thought to myself that I should maintain a healthy cynicism and a suitably controlled set of expectations about what a set of art installations ought to be able to evoke. In the late afternoon of a confusing Berlin summer, I got off the bus already flush with the pleasure of a scarily efficient public transport system, and walked down the lane to the spot on my Google map that said “Martin-Gropius-Bau”. The Bau is a startlingly beautiful building, all neo-Renaissance in its pastiche of dome, entryway columns, curlicued windows and shadowy moldings. Something already felt right. The sun shone bright and the clouds filtered out its strongest rays. I was suitably warm and the light was suitably right. Ai Weiwei in his entire grandfatherly wallpapered aura stared straight ahead and betrayed no amusement at my sudden and unexpected enthusiasm.

Atrium One

Ai Weiwei: Evidence, Stools, 2014, wooden stools; Martin-Gropius-Bau, Berlin

Across eighteen rooms of the Bau were spread all the works curated under the title “Evidence”. Playing both with what “discovery” means in police and detective records and with empirical “evidence” as it relates to crimes contemporary and historical, the main items of this exhibit comprise found, made, and remade artifacts—touchy, feely, gritty physical objects. Most of them display familiar hints of the Ai Weiwei oeuvre. They offer confusing and paradoxical cues by playing with the material they are composed of, they are parts of a much larger story that they bear evidence to, and they are often directly related to aspects of the artist's life.

Read more »

A sin tax on junk entertainment

by Thomas Rodham Wells

Governments should tax the production and consumption of junk entertainment like Angry Birds and The Bachelor to correct the market failures that encourage their overconsumption. As with tobacco and alcohol, the point of such sin taxes is not to prevent people from consuming things that are bad for them if they really want to. They are not like bans. Rather, such taxes communicate to consumers the real but opaque long-term costs to themselves of consuming such products so that they can better manage their choices about how much of their lives to give up to them.

At the heart of this proposal is the fact that high art – i.e. real art – like Booker Prize-winning novels and Beethoven is objectively superior to junk entertainment like Candy Crush and most reality TV. (For now, let us abstract from ‘middle-brow' entertainment like our new Golden Age of TV.) Some egalitarians of taste dispute the existence of any objective distinction in quality between pushpin and Pushkin and argue that the value of anything is merely the subjective value people put on it. I will humour them. The case for the objective superiority of art can be made entirely within a narrowly utilitarian – ‘economistic' – account of subjective value: in the long run consuming junk entertainment is less pleasurable than consuming art.

At best, junk entertainment passes the time and brings us closer to death in a relatively painless way. At worst, passing a lot of time in this way makes us stupid by atrophying our abilities to appreciate anything more difficult. Hence the pejorative term ‘junk', for there is a strong resemblance between this sort of mental activity and eating cheeseburgers: the more cheeseburgers we eat, the less we enjoy each new one, and the fatter and more unhealthy we become. In contrast, art has the capacity not only to fill up the limited time we have in our lives, but in the process also to educate us in the enjoyment of its intellectual depths so that it produces more delight in us the more of it we consume. In economics terminology, the consumption of junk entertainment exhibits diminishing marginal utility and reduces our human capital while the consumption of art exhibits the opposite. Art is special.
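In symbols – a rough textbook sketch of the distinction being drawn here, not anything from the essay itself – the claim is that junk utility is concave in consumption, while consuming art builds the capacity that future enjoyment draws on:

```latex
% Junk entertainment: diminishing marginal utility in consumption x_j
%   U_j'(x_j) > 0,  U_j''(x_j) < 0        (each extra hour adds less pleasure)
% Art: consumption x_a also builds "human capital" k, which raises future enjoyment
%   U_a = U_a(x_a, k),  \partial U_a / \partial k > 0,  with k growing in past consumption
\[
  \underbrace{U_j''(x_j) < 0}_{\text{junk: returns shrink}}
  \qquad\text{versus}\qquad
  \underbrace{\frac{\partial U_a}{\partial k} > 0,\ \ \dot{k} \propto x_a}_{\text{art: enjoyment compounds}}
\]
```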

Read more »

Monday, August 25, 2014

Re-tweeting to the High Ground

by Gerald Dworkin

Many of the quotations that appear in my e-reader Philosophy: A Commonplace Book are one-liners:

There are many ways in which the thing I am trying in vain to say may be tried in vain to be said. —Beckett

Better latent than never. —Morgenbesser

Philosophy is to the real world as masturbation is to sex. —Marx

But often the difference between a one-liner and a many- is arbitrary. Wilde's “I do all the talking. It saves time and prevents arguments” could have a semi-colon instead of a period and be a one-liner. Thus Shaw: One sees things as they are and asks why; another dreams things that never were, and asks why not. Or, as Nabokov put it: The difference between a therapist and the rapist is a matter of distance.

This brings us to the concept of an aphorism. Most definitions, wisely, do not use a precise measure of length or depend on grammatical structure. Here are some typical definitions.

1) a pithy observation
2) a terse saying
3) a short phrase
4) a brief statement
5) a concise statement
6) a laconic expression

It was Nietzsche's aim “to say in ten sentences what everyone else says in a book—what everyone else does not say in a book.” But he was also brilliant at much shorter length. “All truth is simple…is that not doubly a lie?”

Obviously this pithetic character is at best a necessary condition but not sufficient. “Today is Monday” is pithy enough but lacks a certain je ne sais quoi. Again, definitions try to supply the missing ingredient in different ways – “embodying a general truth”, “makes a statement of wisdom”, “astute observation”. The first of these seems, to us now, too weak. “Objects fall when unsupported” is both pithy and a general truth. But Bacon titled one of his books on the nature of science Aphorisms Concerning the Interpretation of Nature.

For what it is worth, the word derives from aphorismos (Greek), meaning definition. And indeed historically many aphorisms took the form of definitions, Ambrose Bierce's Devil's Dictionary being a prime example. ACADEMY. Originally a grove in which philosophers sought a meaning in nature; now a school in which naturals seek a meaning in philosophy. PHILOSOPHY. A route of many roads leading from nowhere to nothing.

All of this is by way of introducing the reader to a particularly clever and astute practitioner of that current system of aphoristic communication known as the TWEET. Now those inclined to resist all things contemporary may object that any message that may be as large as 140 characters cannot be an aphorism. (Joke interruption: I was asked the other day to supply a password with at least eight characters. I decided upon Snow White and the Seven Dwarfs.) And those truly hostile may note that the definition of Tweet given first in all dictionaries is “A weak chirping sound, as of a young or small bird.” But the tweets I am bringing to your attention are both short and clever.

Read more »

Of Dark Matter, And Resonance Across Scales

by Tasneem Zehra Husain

I'm a total pushover when it comes to stories of connection. I am delighted by accounts of barriers breaking down and disparate people uniting in purpose, of ideas coalescing and theories fusing to reveal the common threads that underlie diversity. As I look back upon the history of physics, what reaches out and grabs me are the moments of unification when strands long thought separate are suddenly braided together in a whole that is stronger and more beautiful than the sum of its parts. Sometimes we uncover hidden affinities by exploring a motif repeated in apparently unrelated contexts; at other times, we are compelled by circumstance to form alliances with those we may have neglected, to put our heads together and come up with a solution acceptable to all. The conundrum of dark matter falls solidly in the latter category.

For several decades, cosmologists and astronomers had been growing progressively distant from their particle physics colleagues. As one group craned their necks further out into uncharted space, the other crawled deeper into the recesses of the atom. The disciplines began to seem as divergent as the scales upon which they operate, but there is a surprising resonance between the minute and the colossal. Even objects of cosmic proportions are built from subatomic particles. The discovery of dark matter was a reminder that no part of the universe can be completely understood by those who turn their backs on the rest.

Discussions of dark matter (and dark energy) are often front-ended by a startling admission of ignorance: the entire gamut of matter particles we conventionally study – quarks and leptons combined – forms less than 5% of the known universe. There is about five times as much dark matter out there, we are told, while the rest of the universe is made up of dark energy. But, since neither dark matter nor dark energy can be seen, how do scientists justify this shocking claim? An analogy might help. The mechanism of human vision is such that we see objects only when they reflect light. But if you find yourself in a pitch dark room, you don't immediately conclude that just because nothing is visible, the room must be empty. You simply realize that sight is no longer a reliable guide under these circumstances, and you must lean on sounds and smells, and touch (and taste?) to probe your surroundings. For lifeforms less dependent on vision, the darkness is multi-textured and alive with variety. Consider bats, for instance. Where we rely on light hitting objects and bouncing back, bats bank on sound. They emit high frequency calls, inaudible to human ears, and use the resulting echoes to construct a sonic map of their surroundings (the further an object is, the longer it takes for the echo to come back). The moral of the story is this: as long as there is a way for you to interact with an object, you can “sense” its presence.
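For concreteness, the proportions quoted above add up roughly as follows. The specific percentages are the rounded Planck-era figures, supplied here for illustration rather than taken from the essay:

```latex
% Approximate energy budget of the universe (rounded figures, assumed here):
\[
  \underbrace{\approx 5\%}_{\text{ordinary matter (quarks and leptons)}}
  \;+\;
  \underbrace{\approx 5 \times 5\% \approx 25\text{--}27\%}_{\text{dark matter}}
  \;+\;
  \underbrace{\approx 68\text{--}70\%}_{\text{dark energy}}
  \;\approx\; 100\%
\]
```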

Read more »

Leaving (and almost leaving)

by Rishidev Chaudhuri

I

It's impossible for me to leave a place well. I used to think that I was merely bad at logistics and planning (and I am), but I manage to conspire against myself with such sinister competence that this explanation no longer seems viable. As the time to leave approaches my consciousness starts to fragment, and I become exhausted and flee into sleep. I wait too long to do things, unable to act unless I have killed my inertia with drink or other confusion, or distracted myself sufficiently that anything I do is useless. I spend hours on minutiae, reorganizing my book collection and cataloguing my kitchen equipment; they're happy hours, once I forget why I'm doing it.

Perhaps it's that leaving is quite obviously a rehearsal for death, disrupting even the faint illusions of permanence that spatial and environmental contiguity offer us. So is everything, if we have learned to listen to the philosophers and to live well, but of course we have not learned to listen and who has the time to rehearse for death these days?

I have trouble even with leaving hotel rooms and getting off of airplanes. I'm haunted by the sense that I've left traces of my self behind. Maybe in the shape of things: do I have my keys? has my wallet finalized the escape it has been plotting all these years? Perhaps these things I've left are important and their absence will make the self who leaves unviable. Eventually I get frustrated and resentful of the unreasonable claims of that future self but by then it is too late: I am nearly that future self and the instincts of self-preservation take over.

II

With leaving comes the return of beginner's mind, that flush of seeing things fresh as you did when you first arrived, of being once again surprised at the particularity of things, troubled by their contingency and delighted by the odd way the fragments of a world fit together (Louis MacNeice's delightful “drunkenness of things being various”). As everyone knows by now, the only time it is truly possible to appreciate anything is when you are faced with its transience and, by then, it is too late and the moments are inextricably entangled with the melancholy of their endings. Sometimes, though, the melancholy parts to reveal intimations of an exuberant noonday joy, as when the sun stands still and makes the world bright and shimmering for a few moments before it begins to fall towards the horizon.

Read more »

In the Shadow of Mo’ne Davis

by Akim Reinhardt

Thirteen-year-old Mo'ne Davis recently took America by storm when she pitched her south Philly baseball team deep into the Little League World Series, where clubs from around the world compete every August.

A beloved celebrity of the moment, her success brought to mind my own somewhat tortured little league experiences.

I. While not terribly big, my father was nevertheless a super-stud athlete at his high school in Fresno, California during the mid-1950s. Captain of the football team (he played end on both sides of the ball), member of the track, field, diving, swimming, and basketball teams, he was popular enough to be voted president of the class of '56. And he was good enough, despite being only 145 pounds, to earn a football scholarship to Redding College in northern California, although he would soon lose it in a gambling scandal. True.

So you'd think I grew up in a household that paid attention to sports and that I learned it all at my father's knee.

Quite to the contrary, not only didn't the old man watch sports, he didn't even understand the appeal. To him, sports were something to do, not something you watch other people do. I think he looked at it like drinking: he liked drinking, especially with others and alone if need be, but why on earth would he turn on the TV to watch someone else drink? Or drive across the city and pay for parking and admission to watch people drink? It didn't make any sense to him.

Fair enough, you say. But then he must've been a great coach when I was a kid, right? The kind of dad who could really teach the fundamentals and show you the tricks to getting ahead.

Again, not really.

Great players often make for lousy coaches. One common explanation is that their prodigious talent makes it more difficult for them to become good teachers, not easier. That the concept of pedagogy is foreign to them. That they are dumbfounded when mediocre players play, well, mediocre.

How could you not hit that ball or make that shot? That's easy, what's wrong with you? It was easy for them, of course. Not so much for the other 99% of humanity.

And that's kind of what it was like with my dad. As I became old enough to participate in organized sports on the rock- and glass-strewn fields of the Bronx, he was, more than anything, dumbfounded when it became obvious that I wasn't a great, natural athlete. He wondered about my eyesight (which was fine), and told me to concentrate more (which I did, sometimes). But generally, he was at a loss to explain it.

Read more »

On Alvin Kernan

by Eric Byrd

There's a subgenre of military memoirs produced by elderly emeriti, the crew-cut close readers of postwar English departments, who in late career published personal recollections of themselves and the other terrified teenagers who mostly fought World War Two. Alvin Kernan (Shakespeare editor, torpedo bomber crewman) is like Paul Fussell (Johnsonian, infantry officer) and Samuel Hynes (Auden biographer, Marine aviator). Seventeen-year-old Kernan joined the Navy before the war, to escape the bleakness of Depression Wyoming: Ma and Pa down on the ranch, hard winters and harder times. Kernan's mother had a representatively difficult life. She killed herself while he was at sea. Home on leave, he inspects her grave “already collapsing and pocked with gopher holes”:

The World War I generation to which she, born in 1900, belonged was the first to leave the land, and with a little education, she married a soldier, moved to town, went to Florida, lost the money from the sale of her father's farm in the land boom, had a child, divorced, and began wandering—Chicago, Memphis, a ranch in Wyoming. She remarried, became a Catholic, and put a determined face on it all, but she was part of the first generation of really rootless modern Americans, moving restlessly by car about the country, emancipated socially and intellectually to a modest degree, but lost, really, without the supporting ethos and family that had protected people in the years when the continent was being settled. Alienation was the familiar state of my generation of Depression and another world war, but the old people had few defenses against it when it appeared.

Hemingway, Fitzgerald, and Dos Passos are the favorite writers of young Seaman Kernan. He could be one of their characters. As with Hemingway's Nick Adams, death-shaded excursions in the American wilds precede and forebode initiation overseas. And Kernan must have recognized his family in Dos Passos' panorama of the wandering and the unmoored, the war-mobilized, the desperately migratory. The down but not out, bumming the freights, going to sea, following work; displaced but for all that able to dream of landing somewhere better:

Returning from our baseball game, we came alongside the ship and began to send sailors up the gangway. At that moment another landing craft came up carrying officers, including the executive officer of the Suwanee—a small, dark, mean man—who stood up in the bow, dead drunk, shouting in a loud voice to the officer-of-the-deck, “Get those fucking enlisted men out of there and get us aboard.” Protocol was that officers always take precedence in landing, and our boat shoved off immediately, circling while the officers staggered up the gangway after their afternoon drinking in the officers' club. The gap between enlisted men and officers in the American navy during WWII was medieval. Enlisted men accepted the division as a necessary part of military life, but it never occurred to us that it in any way diminished our status as freeborn citizens who, because of a run of bad luck and some unfortunate circumstances like the Depression, just happened to be down for a brief time. “When we get rich” were still words deep in everybody's psyche. But the exec's words, “those fucking enlisted men,” spoke of deep and permanent divisions. He obviously really disliked us, and his words made shockingly clear that he, and maybe the other officers he represented, had no sense that we had shared great danger and won great victories together.

Dos Passos' Three Soldiers, in a paragraph.

Beyond the charm of the Lost Generation atmosphere, the virtues of Crossing the Line are its swift pace and concision of evocation. No episode lasts longer than is necessary to make the essential impressions—usually Kernan's fear and awe (at times laced with boyish glee) before the military juggernauts whose savage collisions he is witnessing. Kernan did not set out to reconstruct the birth of his literary consciousness, or find the boy in the vitae. Quite the opposite. Seaman Kernan is a small animal in a world of threats. He thinks with his gut, senses through the soles of his feet.

Read more »

Suicide: An Act Of Supreme Bravery

by Evert Cilliers aka Adam Ash

Suicide is not for cowards.

You’ve got to be mad brave to whack yourself. Yep, suicide takes a lot of balls. The most courage any human can ever muster. Suicides are the bravest people who ever lived, because they commit the greatest act possible — a deed against actual existence, against their very being. They say no to life itself, and then have the courage of that unbelievable conviction to end everything. They suck on the barrel of a gun or cast themselves down from a great height onto the indifference of solid ground.

And we often resent them for it. Because they say no to all of us, to all of us who persist in living. They place the idea of living in jeopardy. They undermine our pathetic belief in life. How could they? How dare they?

Why do they say no to life? Because for them, living is not worthy. Life is too crappy to merit a fart. Not up to scratch. They feel this way because they are depressed. So depressed, there is no more pleasure in being alive; only persistent, absolute pain. And no advice from the living can help.

I know about that.

I’ve been mortally depressed in my life, clinically depressed, and thought about committing suicide, but never got around to trying it. (I believe I saved myself from depression by exercise: as a runner all my life, I think I finally ran my topsy-turvy brain chemistry into balance: if more people exercised, we’d need fewer therapists.)

Read more »

Darwin in the Garden

by Josh Yarden

I posted a story last month about biblical metaphor, entitled “What Fruit Grows on the Tree of Knowledge?” The class discussion I related there continues with this question from a student:

“When you said that the Tree of Knowledge is not an actual tree, didn't you just make up a story that's completely different from the one in the Book of Genesis?”

“Ok, let's get back into the mythical garden.”

“So, you're saying it's all just a myth?”

“It's not just a myth. When I say a text is mythic, I don't mean that it is false. I mean that the power of the story is in the way it reflects experiences that happen over and over again in our lives. That's how people in different cultures over thousands of years can relate to these essentially human stories. We do know for a fact that the story has existed for millennia, and it has had a powerful and memorable impact on our society. That makes it real, whether or not the events happened as described.

“Ok, but that doesn't explain whether or not the tree of knowledge of good and bad is an actual tree.”

“You can decide for yourself, but keep in mind that Torah does not claim to be ‘the truth, the whole truth and nothing but the truth.' There aren't enough details in these brief stories to suggest that any of them are full accounts of actual events, but they do contain enough symbolism to be read on three levels: the myth, the moral and the metaphor.

Read more »

Strained Analogies Between Recently Released Films and Current Events: The Guardians of the Galaxy are Taking Our Jobs

by Matt McKenna

At the end of Guardians of the Galaxy, there is much rejoicing by the citizens of the noble planet of Xandar after their having been saved by the film's titular ragtag bunch of lovable anti-heroes. What is interesting to note, however, is how unconcerned the individuals on Xandar are by the troubling labor dynamics made apparent in their Pyrrhic victory against the evil tyrant, Ronan the Accuser. Consider that a mere five “Guardians” (three humanoids, a tree, and a raccoon) were required to protect the Milky Way, a galaxy that contains three hundred billion stars and, in the Marvel canon anyway, is so utterly teeming with bipedal life forms that one can't even land a spaceship on a random abandoned husk of a planet without running into at least one English-speaking vigilante/mercenary/henchman who has dedicated her/his/its life to finding one lost relic or another. For goodness sake, just imagine the sheer number of plots against freedom-loving Xandarians that would arise in such a galaxy. And yet, Marvel's Milky Way apparently only requires a handful of part-time crime-fighting goofballs to prevent evil from running roughshod over the forces of its PG-13-themed justice. Though it may sound as if I'm suggesting this implausibly small cosmic police force is a plot hole in Guardians of the Galaxy, it is precisely this minuscule ratio of guardians-to-villains that constitutes the film's most salient point about the real world: In our Milky Way as in Marvel's, the good jobs of the future will be dominated by a lionized elite few.

Read more »

Divine Intervention

by Leanne Ogasawara

One of my favorite 3QD associates recently wrote a wonderful blog post, Old Man Bush: The Last Motherfucker. Reminiscing about the good ol' days, he asks the inevitable question, what happened to today's youth?

It's true, George HW Bush was old school. Despite being accepted at Yale, he postponed college to fight in the war, becoming a young aviator and then a war hero… and not just that, says Akim, but the badass is still jumping out of helicopters at 90 years old today. Akim is impressed and wonders how it is that we all became so soft.

Honestly. How else do you explain seedless watermelons? Nope, we can’t be bothered to spit black watermelon seeds anymore, much less just eat the white ones. Cause we’re soft.

I mean, good luck finding regular grapefruit juice. No siree Bob, it’s gotta be ruby red on every grocery store shelf, cause the plain old yellow grapefruits are a little bitter. Can’t be expected to put up with that.

Or reading a map. Or cooking dinner from scratch. Or getting up to change the channel. Or waving a hand fan. Or walking anywhere. Nope. Middle class America is too soft for any of that. Just gimme a smart phone, a remote, some takeout, a shit ton of air conditioning, and a good parking spot.

You know things are bad when you start looking back at old HW's presidency with nostalgia, right? What is really scary, though, is that I had just been thinking the exact same thing!

Read more »

It’s Time to Change the World, Again

by Bill Benzon, appendix by Charlie Keil

Writer’s Cypher, UMMI Living Village Community Garden, Jersey City, NJ

Adolescents seem gifted in the belief that, if only the adults would get out of the way and grow up, the world would be a much better place. In that sense I am still, at 66 going on 67 (Pearl Harbor Day of this year), an adolescent. I still believe that the world needs changing, though it’s been decades since I naively thought that letters to The New York Times were a reasonable means to that end. And I still believe that it’s the adults that need changing.

But I must also move forward in the realization that I too am an adult, and have been so for some time now.

What to do?

I painted this when I was nine or ten.

I was ten years old when the Russians put Sputnik into the heavens. I still remember the October evening when my father took me outside and pointed to a moving light in the night sky. “That’s it.”

That’s when my personal history joined world history. That’s the first event that was both personally meaningful to me–I’d been drawing sketches of spaceships for years and had even done a painting or two–and was also of world historical importance. By the time I was old enough to be an astronaut, however, I’d changed.

I’d gone to college, marched against the Vietnam War, done my conscientious objector’s alternative service in the Chaplain’s Office at Johns Hopkins, and lost all interest in becoming an astronaut. Inner space had become my territory.

I got my PhD, then a job at the Rensselaer Polytechnic Institute, was astounded when Reagan was elected and re-elected–that hadn’t been in the plan, no it hadn’t. And I was really surprised when the Soviet Union collapsed. After all, I’d grown up during the height of the Cold War, read articles about back-yard bomb shelters, and had even picked out the spot in our back yard where a shelter should go. I figured that, whatever else happened in the world, I’d go to my grave in the shadow of the Cold War.

Read more »

Making sense of suicide (and Matt Walsh’s nonsense)

by Grace Boey

Imagine this: someone secretly laces your coffee with meth, every morning, for 28 mornings. Over the first week, you become increasingly hyperactive, and start to bubble with confidence and energy. You feel great, but by day 7, your behaviour starts to get erratic, and you’re irritated with everyone else who can’t keep up. By day 21, you’re having flashes of paranoia, and freak out from time to time because your mind keeps racing, and you’re convinced everyone’s watching you move too fast.

By day 28, you haven’t slept for a week. You feel invincible, so much so that you decide to take all the drugs you’ve got to see if it will kill you. Because that which doesn’t kill you makes you stronger, right? And if it does kill you, you’ll die feeling amazing… and dying would be such an incredible thing to do. In fact, this had damn well better be fatal. Thanks to the meth and sleep deprivation, you are so confused, irrational and psychotic that this babbling seems entirely sensible.

Was the suicide attempt ‘your own decision’, in any meaningful sense? Of course it wasn’t. It certainly wasn’t my decision, when those very events happened to me a couple of years ago. The only difference? No one had secretly laced my coffee with drugs (though they might as well have). The terrifying effects were a product of my very first full-blown bipolar manic episode. Thankfully, I survived—although the doctors who treated me assured me I could just as easily not have. I had no clue what was happening at the time; my mania had swept me away, before I even realized anything was amiss.

Despite all this, people like Christian blogger Matt Walsh would say I had committed a “terrible, monstrous atrocity” that was entirely my decision. On August 12, one day after Robin Williams appeared to have killed himself as a result of depression, Walsh published an article with the headline “Robin Williams didn’t die from a disease, he died from his choice.” In it, he claimed that “suicide does not claim anyone against their will”. Depression—and by extension of Walsh’s arguments, all mental illness—is not responsible for suicide: you are. When a huge backlash ensued, he stuck to his guns and wrote a detailed response to his critics.

When I first came across the headline of Walsh's original post, I took a deep breath, read the article, took another deep breath… and read it again. My conclusion at the end of this exercise was exactly the same as my initial response: what a load of exploitative, uninformed rubbish. Walsh's statements reflect deep misconceptions about mental illness, competent decision-making and ‘free will’, which (unfortunately) hinge on the supernatural metaphysics that accompanies Christianity. It angers me that someone like this should feel entitled to piss on the grave of Robin Williams with a headline like that. And personally, as someone who has attempted suicide under the grips of both mania and depression, I am insulted by Walsh's backward ideas.

Read more »

Monday, August 18, 2014

Arguing to Win

by Scott F. Aikin and Robert B. Talisse

In the course of discussing the central themes of our recent book, Why We Argue (And How We Should), with audiences of various kinds, one kind of critical response has emerged as among the most popular. It deserves a more detailed reply than we are able to provide here; nonetheless, we want to sketch our response.

Why We Argue presents a conception of proper argumentation that emphasizes its essentially cooperative and dialectical dimension. Very roughly, our view runs as follows. The central aim of cognitive life is to believe what is true and reject what is false. We pursue this by appealing to our evidence and reasons when forming and maintaining our beliefs. Yet in pursuing this aim, we quickly encounter conflicting and inadequate evidence and reasons; furthermore, we discover that we each must rely upon other people as sources of evidence and reasons. Importantly, the others upon whom we must rely do not always speak univocally; they often provide conflicting reasons and evidence. Accordingly, in the pursuit of our central cognitive aim, we confront the inevitability of disagreement. Argumentation is the process by which we attempt to resolve disagreement rationally. Consequently, argumentation is inescapable for a rational creature like us; and the aspiration to argue properly is an indispensable corollary of our central cognitive aim.

The project of arguing well requires individuals to interact with each other in certain ways, and to avoid interacting in other ways. More specifically, in order to argue well, we must individually attempt to take the reasons, perspectives, arguments, criticisms, and objections of others seriously; we must see even those with whom we most vehemently disagree as fellow participants in the process of proper argumentation, and we must engage with them on those terms. This means, among other things, that when engaging in argument, one must seek to make the most of the reasons and considerations offered by one's opposition. Verbal tricks, insults, threats, and obfuscation are failures of argumentation, even when they prove effective at closing discussion or eliciting assent. A lot of Why We Argue (And How We Should) is devoted to cataloguing and dissecting common ways in which argumentation, especially political argumentation, fails.

So much for the nutshell version of our conception of argumentation. Let's turn now to the critical reaction it commonly invites. Critics say that our view is misguided because it cannot acknowledge the brute fact that most often we argue not to rationally resolve disagreement, but to end disagreement; and the favored way of ending disagreement is by winning an argument. Here a sports analogy is often introduced. Critics often claim that just as one plays baseball not (primarily) for the exercise, camaraderie, or the cooperative teamwork, but rather to win baseball games; so it is that when one argues, one argues to win.

Read more »

The Fields Medal

by Jonathan Kujawa

The big news in math this week was the opening of the quadrennial International Congress of Mathematicians (ICM) in Seoul. A number of prestigious awards are given at the ICM. Most famously this includes the Fields medal and the Nevanlinna prize (aka the Fields medal for computer science). Up to four winners of the Fields medal are announced along with the winner of the Nevanlinna prize. All the winners must be no older than 40.

I had the pleasure to attend the 2006 ICM in Madrid. This is the ICM famous for Grigori Perelman refusing to accept the Fields medal for his work in finishing the proof of the Poincaré conjecture. Perelman (or at least the media version of him) comes across as the stereotypical eccentric mathematician uninterested in worldly things. Fortunately for the PR folks, this year's winners all appear to be the sort you'd enjoy having over for dinner and drinks.

This year the Fields medal went to Artur Avila, Manjul Bhargava, Martin Hairer, and Maryam Mirzakhani. The Nevanlinna prize went to Subhash Khot. An excellent profile of each of the winners, including very nicely done videos, can be found on the Quanta website. The profiles are a bit short on the actual math of the winners. If you'd like a more meaty discussion of their work, former Fields medalist Terry Tao wrote blog posts here and here giving a more technical overview. Even better, former Fields medalist Timothy Gowers is blogging from the ICM itself! He's giving summaries of the main talks as well as his more general impressions while at the event. I can also recommend that you check out the excellent overviews of some of the winners' work on John Baez's Google+ page.

Rather than talk about the details of the winners' work [1], I wanted to point out a meta-mathematical common feature of their research. This is the idea of studying a collection of objects as a whole, rather than one by one.

Read more »

Monday Poem

Tide

the way it
comes, goes,
surges, disappears,
a perfect metaphor
for shapes of time,
overused as moon
for that which vanishes
and reappears.

quiet now, the wedding past
too much so—
a house that buzzed
now hushed, silence loud
sharp, slimmer
than a midnight crescent

silence also
comes, goes
empties, spills, ebbs and fills,
evaporates and billows like a cloud
above a sugarbush still
boiling down sweet water
for its essence
.

by Jim Culleny
7/25/14

Socrates, evolution, and the word “theory”

by Paul Braterman

What's wrong with this argument? More than you think!

All men are mortal.
Socrates is a man.
Therefore Socrates is mortal.

It's perfectly valid, meaning that if the premises are true, the conclusion must also be true. Despite this, as Bertrand Russell explained very clearly many years ago,[1] the argument is almost totally worthless.
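To make “valid” fully concrete: the syllogism's force comes entirely from its form, which can even be checked mechanically. Here is a minimal sketch in Lean 4; the identifiers Person, Mortal and socrates are placeholders introduced for this illustration, not anything from the essay:

```lean
-- The bare form of the syllogism. Validity means only this: IF every person is
-- mortal and Socrates is a person, THEN Socrates is mortal. Whether the premises
-- are actually true is a separate, empirical question.
variable (Person : Type) (Mortal : Person → Prop) (socrates : Person)

example (allMortal : ∀ p : Person, Mortal p) : Mortal socrates :=
  allMortal socrates
```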

There is no real doubt that Socrates is mortal. Just look at the poor chap, clearly showing his 70 years. Bent, scarred from the Peloponnesian War, his brow furrowed by decades of unhappy marriage, and even more unhappy attempts[2] to persuade his fellow citizens that the best form of government is a fascist oligarchy. Besides, he is on trial for doubting the existence of the gods, and the news from the Agora is not good. Take my advice, and do not offer him life insurance.

Even if we didn't know about his circumstances, we would readily agree that he is mortal. We see decrepitude and death around us all the time, few people have been known to live beyond a hundred years, none beyond 150, and we have no reason to believe that Socrates is any different. In fact, from our experience, we are a lot more certain that Socrates is mortal than we are that all men are mortal. Ganymede, Elijah, and the Virgin Mary were all, according to various traditions, taken directly up into heaven without having to go through the tedious process of dying. However, no Zeus-worshipper or biblical literalist or devout Catholic would for such reasons doubt the mortality of Socrates. So the premise, that all men are mortal, is actually less certain than the conclusion, and if we seriously doubted Socrates's mortality, we would simply deny that premise. In other words, this classic example of deductive logic tells us nothing that we didn't already know.

Read more »

The Psychology of Procrastination: How We Create Categories of the Future

by Jalees Rehman

“Do not put your work off till tomorrow and the day after; for a sluggish worker does not fill his barn, nor one who puts off his work: industry makes work go well, but a man who puts off work is always at hand-grips with ruin.” Hesiod in “The Works and Days”

Paying bills, filling out forms, completing class assignments or submitting grant proposals – we all have the tendency to procrastinate. We may engage in trivial activities such as watching TV shows, playing video games or chatting for an hour and risk missing important deadlines by putting off tasks that are essential for our financial and professional security. Not all humans are equally prone to procrastination, and a recent study suggests that this may in part be due to the fact that the tendency to procrastinate has a genetic underpinning. Yet even an individual with a given genetic make-up can exhibit a significant variability in the extent of procrastination. A person may sometimes delay initiating and completing tasks, whereas at other times that same person will immediately tackle the same type of tasks even under the same constraints of time and resources.

A fully rational approach to task completion would involve creating a priority list of tasks based on a composite score of task importance and the remaining time until the deadline. The most important task with the most proximate deadline would have to be tackled first, and the lowest priority task with the furthest deadline last. This sounds great in theory, but it is quite difficult to implement. A substantial amount of research has been conducted to understand how our moods, distractibility and impulsivity can undermine the best-laid plans for timely task initiation and completion. The recent research article “The Categorization of Time and Its Impact on Task Initiation” by the researchers Yanping Tu (University of Chicago) and Dilip Soman (University of Toronto) investigates a rather different and novel angle in the psychology of procrastination: our perception of the future.
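For what that “fully rational approach” might look like on paper, here is a minimal sketch in Python; the scoring formula, weights and example tasks are illustrative assumptions of mine, not anything from Tu and Soman's paper:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Task:
    name: str
    importance: float   # subjective importance, e.g. on a 0-10 scale
    deadline: date

def priority(task: Task, today: date, urgency_weight: float = 1.0) -> float:
    """Composite score: more important and more proximate tasks score higher."""
    days_left = max((task.deadline - today).days, 0)
    urgency = 1.0 / (1.0 + days_left)          # closer deadline -> higher urgency
    return task.importance + urgency_weight * urgency

def rational_todo_list(tasks: list[Task], today: date) -> list[Task]:
    """Order tasks from 'do first' to 'do last' by composite score."""
    return sorted(tasks, key=lambda t: priority(t, today), reverse=True)

if __name__ == "__main__":
    today = date(2014, 8, 25)
    tasks = [
        Task("submit grant proposal", importance=9, deadline=date(2014, 9, 1)),
        Task("pay electricity bill", importance=6, deadline=date(2014, 8, 28)),
        Task("reorganize bookshelf", importance=2, deadline=date(2014, 12, 31)),
    ]
    for t in rational_todo_list(tasks, today):
        print(f"{t.name}: score {priority(t, today):.2f}")
```

The urgency term simply grows as the deadline approaches; any monotone function of the days remaining would do, which is precisely why the scheme is so much easier to state than to live by.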

Read more »

Paul’s Boutique: An Appreciation

by Misha Lepetic

“If you explain to a musician he'll tell you that
he knows it but he just can't do it”
~ Bob Marley

It's hard to imagine that the Beastie Boys released “Paul's Boutique” around this time, 25 years ago. Even more astonishing is the fact that I recently had two separate conversations with members of the so-called Millennial Generation, which resulted in the extraordinary discovery that neither person had even heard of “Paul's Boutique.” Now this may make me sound like an ornery codger complaining about how the young folk of today are illiterate because they have never heard of (insert name of your own pet artist). But taken together, these two events require me to submit a modest contribution to keeping the general awareness of “Paul's Boutique” alive and well.

What makes “Paul's Boutique” so extraordinary and enduring? The sophomore effort by the brash NYC trio debuted in 1989, and was the much-anticipated follow-up to “Licensed to Ill.” But instead of a new set of frat party anthems along the lines of “Fight For Your Right (To Party),” listeners were treated to a continuous magic carpet woven out of a kaleidoscope of samples. Romping over this dense, schizophrenic bricolage, MCA, Ad-Rock and Mike D traded lightning-quick call-and-response rhymes that embraced the usual MC braggadocio but at the same time drew on a vast range of sources and styles. The effect, to this day, is a delirious sort of aural whiplash.

No one is clear on how many songs were actually sampled, although the number is certainly well over a hundred. The exegesis of both samples and lyrical references is a time-honored tradition, too. Around 1995, one of the first sites that ever made me think the World Wide Web might be a good idea was (and continues to be) the Paul's Boutique Samples and References List. When studied, Torah-like, alongside the Beastie Boys Annotated Lyrics and the record itself, one begins to appreciate the catholic taste of both the rappers and their producers, the inimitable Dust Brothers, who would go on to provide much of the genius behind Beck's seminal “Odelay” album a few years later.

Read more »