What Art Can Do

by Christopher Horner

Grasmere (Photo by author)

Why do we value art? I am going to suggest that a large part of the answer has to do with its unique power to disclose and convey areas of our lives unavailable to us through other means. Art, on this account, is a kind of communication, and a kind of act: something performative – a communication that makes something happen, in a way that eludes ordinary discourse. 

By ‘ordinary’ here I mean the kind of communication delivered by language when it is used to convey concepts: ranging from the most banal everyday speech to the most rarefied theory. Of course, ordinary speech acts themselves have a performative quality too – we don’t just communicate information through language, but make things happen, make requests (‘shut the door’, ‘the meeting is over’, ‘help me!’). Moreover, we use our bodies, tone of voice, emphasis and more: and the conceptual content may not even be what matters, especially when something isn’t banal, but matters greatly: moments of grief or joy, for instance. A gesture, a tear, or just silence may be more eloquent than words. It is this ‘beyond’ in our imperfect communications that hints at what art can do. Art aspires to a more perfect communication: one that takes us beyond the confines of the lonely self. Read more »



Looking for the Enlightened

by Marie Snyder

The season finale of The Last of Us sets up a great deontological v. teleological conundrum with the big question (tiny spoiler), which ends up being an episode-long trolley problem: Is it right to kill one person if doing so could save multitudes?

In a utilitarian view, of course we should sacrifice one person to potentially save all of humanity. It would be absurd not to see this and ensure the safety of all! But it doesn’t pass the categorical imperative sniff test. We can’t support intentional harm coming to people, any people, no matter how few, even if it will help many others. 

Kant’s famous rule is: “I am never to act otherwise than so that I could also will that my maxim should become a universal law,” so killing a person to help others is necessarily wrong. It’s not a numbers game, since “the moral worth of an action does not lie in the effect expected.” And we have to treat each person as an end in themselves, never as a means to an end. 

Or, as the great Mitchell & Webb make clear, killing some to save others is just plain wrong “because it’s offensive and evil.” 

And people will do absolutely anything to save the children they love if they turn out to be the sacrificial lambs in question. 

Fortunately, the current issues facing us don’t force us to choose. We can help our children and the world at once, so that should be easy, right?!? Unfortunately, our drive to protect our kids is often short-sighted. Life is not as cut and dried as a good zombie-like show. We often want our children to be happy today even if it means they end up suffering later. Read more »

Monday, May 1, 2017

The Pollinators of Technology

by Evan Edwards

On the night of Monday, April 3rd, a man stood in the middle of the intersection at Franklin and Columbia in Chapel Hill, NC. Within minutes, thousands of people poured out of bars, houses, apartments, fraternity and sorority homes, and who knows where else, barrelling down the largest streets in the town to join him. There’s a video that shows it happening in high speed. The University had just won the NCAA men’s basketball tournament which (if you don’t know) is a very big deal.

I grew up in North Carolina, and as the game drew closer, I watched so many people I know from middle and high school making their way back to the state, just to be there if/when they pulled it off. If they couldn’t make it, many documented their excitement wherever they were on social media, and sent messages and memes to one another as the game loomed closer, just brimming with enthusiasm. Although I never really got into sports, it was a bit moving to watch people get so very joyous about something when nearly everything else in the news is tinged with a kind of abysmal horror.

If you watch the video I linked to above, you notice that the frame shakes as it pans from side to side. Because we’re used to it, we can read this erratic movement as the work of a smartphone camera because professional cameras and drones aren’t this sloppy, and no one uses handheld video-cameras any more. In the shot, too, you see the arm of the man in the intersection upstretched in the first few frames, the luminous glow of his iPhone at its apex, almost giving him the look of an angler fish wandering the deep, or a single firefly waiting in a meadow. As the crowd rolls in, you can’t always make out the screen glow, but it’s clear that almost everyone in the crowd is either raising their phone up to take a picture, to record video, to go live, or to snapchat.

When I was younger, my friends and I did something similar to this. We would call each other during concerts, to leave voicemails or let them listen for a while if a song that meant something to both of us was being played. For me, it was a special way of using technology to deepen a personal friendship. This was before I was on Facebook (you had to have a college e-mail address to get an account when I was in High School), Myspace was not used for sharing things like this, and so the concert voice mail was, in some way, the most cutting edge social medium we had. It was extraordinary to wake up to a voicemail like that from a friend. Absolutely moving.

Read more »

Monday, February 27, 2017

Reality Check: Wine, Subjectivism and the Fate of Civilization

by Dwight Furrow

I must confess to having once been an olfactory oaf. In my early days as a wine lover, I would plunge my nose into a glass of Cabernet, sniffing about for a hint of cassis or eucalyptus only to discover a blast of alcohol thwarting my ascension to the aroma heaven promised in the tasting notes. A sense of missed opportunity was especially acute when the wine was described as "sexy, flamboyant, with a bounteous body." Disappointed but undaunted, I would hurry off to wine tastings hoping the reflected brilliance of a wine expert might inspire epithelial fitness. It was small comfort when the expert would try to soften my disappointment with the banality, "it's all subjective anyway." So one evening, while receiving instruction in the finer points of wine tasting from a charming but newly minted sommelier, I let frustration get the better of me and blurted "Well, if it's all subjective, what the hell are we doing here? Is it just your personal opinion that there is cassis in the cab, or is it really there? We all have opinions. If you're an expert you should be giving us your knowledge, not your opinion!" Someone muttered something about "chill out" and it was quickly decided that my glass needed refilling. But the point stands. The idea of expertise involves the skillful apprehension of facts. If there is no fact about aromas of cassis in that cab, there is no expertise at discerning it.

These conversations over a glass of wine are more pleasant (because of the wine) but structurally similar to the semester-long task of getting my college students to realize that moral beliefs are not arbitrary emendations of their lightly held personal attitudes but are rooted in our need to survive and flourish as social beings. Yet even after weeks of listening to me going on about the sources of value they still write term papers confidently asserting that with regard to "right" and "wrong", eh, who knows?

Subjectivism, the view that a belief is made true by my subjective attitude towards it, has long been the default belief of freshman students and arbiters of taste. Unfortunately, this tendency to treat it as the wisdom of the ages has escaped the confines of the wine bar and classroom into the larger society. Buoyed by the cheers of multitudes, our fabulist-in-chief routinely finds his "own facts" circulating in what seems to be an otherwise empty mind. Unfortunately, this is no longer mere fodder for a seminar debate.

Read more »

Monday, April 13, 2015

Do we really value thinking for oneself?

by Emrys Westacott

Why do we choose to do what we think is right even when it goes against our inclinations or interests? This is one of the oldest and toughest questions in moral psychology. Knowing the good clearly does not entail that we will do the good. So what carries us from the former to the latter?

One philosopher who wrestled with this question long and hard was Immanuel Kant (1724-1804). He considered it profoundly mysterious that we often choose to override our interests or desires and do our duty purely because we consider ourselves duty-bound. (Nietzsche expresses a similar sense of wonder when he asks, “How did nature manage to breed an animal with the right to make promises?”) Kant's explanation is that we are moved by what he calls moral feeling.[1] And he identifies two main kinds of moral feeling: respect for morality, and disgust for what is contrary to morality. Discussing these in his lectures on ethics, he says that you cannot make yourself or anyone else have these feelings. But you can inculcate them, or something that will serve the same purpose, in a child through proper training. The following passage is especially noteworthy:

We should instill an immediate abhorrence for an action from early youth onwards . . . we must represent an action, not as forbidden or harmful, but as inwardly abhorrent in itself. For example, a child who tells lies must not be punished, but shamed; we must cultivate an abhorrence, a contempt for this act, and by frequent repetition we can arouse in him such an abhorrence of the vice as becomes a habitus with him.[2]

I imagine this bit of moral pedagogy will strike many readers as morally suspect. But why?

Read more »

Monday, December 8, 2014

Why Kant Was Wrong about Food

by Dwight Furrow


From the San Francisco restaurant Atelier Crenn

Among philosophers who think about art and aesthetics, the position of food and wine is tenuous at best. Food and wine receive little discussion compared to painting or music, and when they are discussed, most philosophers are skeptical that food and wine belong in the category of fine arts.

Food and wine have not always been marginalized in discussions of aesthetics. In the 18th century, taste provided a model for how to understand aesthetic judgments in general—until Kant came along to break up the party. Kant argued that food and wine could not be genuine aesthetic objects, and his considerable influence has carried the day, continuing to shape philosophical writing on the arts.

What were these powerful arguments that succeeded in removing taste from the agenda of aesthetics? Kant thought that both “mouth taste” and genuine aesthetic appreciation are based on an individual’s subjective experience of pleasure. But with “mouth taste” there is no reflection involved and no imaginative involvement, just an immediate response. The pleasure comes first and then we judge based on the amount of pleasure experienced whether we find the flavors “agreeable” or “disagreeable”. Thus, our judgments about food and wine are based entirely on our subjective, idiosyncratic, sensuous preferences. By contrast, when we experience paintings or music aesthetically, contemplation ensues whereby our rational and imaginative capacities engage in “free play”. Our pleasure is not an immediate response to the object but comes after the contemplation and is thus based on it. We respond not only to whether the object is pleasing but to how the object engages our cognitive capacities of understanding and imagination. This yields a judgment that is not merely a subjective preference but involves a more universal form of appreciation.

Kant was wrong to argue that “mouth taste” does not provoke contemplation. Connoisseurs of wine, cheese, coffee, and beer, as well as the flavorists who analyze our food preferences for the food industry show that food and wine can be thoughtfully savored, and various components of the tasting experience can be analyzed. But that fact by itself doesn’t really refute Kant’s view. What mattered for Kant was not just the fact of contemplation, but rather how the contemplation unfolds and what its result is. So we have to look more closely at what Kant had in mind.

Read more »

Monday, November 24, 2014

The continuing relevance of Immanuel Kant

by Emrys Westacott


Immanuel Kant (1724-1804) is widely touted as one of the greatest thinkers in the history of Western civilization. Yet few people other than academic philosophers read his works, and I imagine that only a minority of them have read in its entirety the Critique of Pure Reason, generally considered his magnum opus. Kantian scholarship flourishes, with specialized journals and Kant societies in several countries, but it is largely written by and for specialists interested in exploring subtleties and complexities in Kant's texts, unnoticed influences on his thought, and so on. Some of Kant's writing is notoriously difficult to penetrate, which is why we need scholars to interpret his texts for us, and also why, in two hundred years, he has never made it onto the New York Times best seller list. And some of the ideas that he considered central to his metaphysics–for instance, his views about space, time, substance, and causality–are widely held to have been superseded by modern physics.

So what is so great about Kant? How is his philosophy still relevant today? What makes his texts worth studying and his ideas worth pondering? These are questions that could occasion a big book. What follows is my brief two penn'th on Kant's contribution to modern ways of thinking. I am not suggesting that Kant was the first or the only thinker to put forward the ideas mentioned here, or that they exhaust what is valuable in his philosophy. My purpose is just to identify some of the central strains in his thought that remain remarkably pertinent to contemporary debates.

1. Kant recognized that in the wake of the scientific revolution, what we call “knowledge” needed to be reconceived. He held that we should restrict the concept of knowledge to scientific knowledge–that is, to claims that are, or could be, justified by scientific means.

2. He identified the hallmark of scientific knowledge as what can be verified by empirical observation (plus some philosophical claims about the framework within which such observations occur). Where this isn't possible, we don't have knowledge; we have, instead, either pseudo-science (e.g. astrology), or unrestrained speculation (e.g. religion).

3. He understood that both everyday life and scientific knowledge rest on, and are made orderly by, some very basic assumptions that aren't self-evident but can't be entirely justified by empirical observations. For instance, we assume that the physical world will conform to mathematical principles. Kant argues in the Critique of Pure Reason that our belief that every event has a cause is such an assumption; perhaps, also, our belief that effects follow necessarily from their causes; but many today reject his classification of such claims as “synthetic a priori.” Regardless of whether one agrees with Kant's account of what these assumptions are, his justification of them is thoroughly modern since it is essentially pragmatic. They make science possible. More generally, they make the world knowable. Kant in fact argues that in their absence our experience from one moment to the next would not be the coherent and intelligible stream that it is.

Read more »

Monday, August 4, 2014

Karl Marx’s Guiding Idea

by Emrys Westacott


“Nothing human is alien to me.” This was Karl Marx's favourite maxim, taken from the Roman writer Terence. But I think that if Marx had lived a century later, he might have added as a second choice the famous phrase sung by Sportin' Life in Gershwin's Porgy and Bess: “It ain't necessarily so.” For together these two sayings capture a good deal of what I think of as Marx's Guiding Idea, the idea at the heart of his philosophy that remains as valuable and as relevant today as in his own time. Let me explain.

Human beings have been around for a few million years, and for most of that time most people's material and social circumstances have been quite stable. The experiences of one generation were pretty much the same as the experiences of their forbears. In this respect the lives of humans were like those of other animals. Unlike other animals, however, human beings reflect on their lives and circumstances; moreover they communicate these reflections to one another. The result is religion, mythology, philosophy, history, literature, and the performing arts (all of which can arise within a purely oral culture), and eventually the natural sciences, and social studies of various kinds, such as psychology, sociology, economics, and political theory.

These diverse forms of reflection on the human condition perform various functions. One function is to explain why things are the way they are. For instance, the bible explains why the Israelites lived in Israel (God made a promise to Abraham, and kept it, enabling Joshua's army to conquer the land); the theory of the four humours purported to explain personality differences between individuals. Another function is to justify a certain order of things. Thus, the doctrine of the divine right of kings sought to justify the institution of a powerful executive who stands above the law. The doctrine that individuals have a right to freedom of thought and expression is often cited to justify a policy of religious tolerance.

These two functions are sometimes hard to disentangle. For example, the alleged cultural inferiority of a people might be taken both to explain why they have been conquered and to justify that conquest as legitimate or even desirable. The “laws” of market competition provide an explanation of why some individuals and businesses do better than others, and these same laws are appealed to by those inclined to endorse the outcome of the competition.

Read more »

Monday, December 20, 2010

Murders, Monsters and Mirrors: The Ethics of Killing and Cannibalism

‘Murder’ differs from ‘killing’ – and must differ for the words to have their moral impact – because killing is a neutral term. Surprising as it may seem, it is most helpful for discussions of killing if we recognise that the word itself means, mostly and simply, ‘the taking of organic life’. It is another matter whether it is all or only certain forms of organic life we are concerned with.

‘Murder’ falls within the category of ‘killing’, in that the organism in question is killed but did not want to be killed. How we assess this is also another matter, but for humans we can infer in most instances whether or not someone willingly wants to die. If she does not wish to die, but still has her life taken away – violently or not is beside the point – then she was murdered.

I say this because I think we need clarity in the case of the infamous German cannibal Armin Meiwes. In March 2001, Meiwes killed and ate a willing, consenting man, Bernd Brandes. Meiwes had advertised in online chat-rooms, without euphemism or innuendo, that he was seeking a “young well-built man, who wanted to be eaten”. Brandes was a year older than his killer, but this didn’t seem to faze Meiwes, who held auditions for the position. The other potential candidates thought that “being gobbled up” was a metaphor for sexual acts. Four candidates travelled to Meiwes’ house, but were eventually told the seriousness of the description. Meiwes “let them” leave, and was not impressed with another, whom he found sexually unappealing.

After finally meeting Brandes, they started the ritual that would lead to Brandes’ death and devouring. Brandes had drawn up a will and testament, in which his money and estate would go to his live-in partner. Also, Meiwes videotaped Brandes both whilst alive and later, after his death. After all these final sentences of conscious human experience were given their appropriate full-stops and commas, Brandes ingested sleeping-tablets. Meiwes cut off Brandes’ penis, cooked it, and ate it with Brandes (eventually it was given to the dog, apparently because of a poor recipe choice). Eventually, Meiwes killed (not “murdered”) Brandes, chopped him into pieces, and ate him over several days.

Read more »

Monday, August 9, 2010

‘The Thing Itself’: A Sci-Fi Archaeology

by Daniel Rourke

Mid-way through H. G. Wells’ The Time Machine, the protagonist stumbles into a sprawling abandoned museum. Sweeping the dust off ancient relics he ponders his machine’s ability to hasten their decay. It is at this point that The Time Traveller has an astounding revelation. The museum is filled with artefacts not from his past, but from his own future: The Time Traveller is surrounded by relics whose potential to speak slipped away with the civilisation that created them.

Having bypassed the normal laws of causality The Time Traveller is doomed to inhabit strands of history plucked from time’s grander web. Unable to grasp a people’s history – the conditions that determine them – one will always misunderstand them.

Archaeology derives from the Greek word arche, which literally means the moment of arising. Aristotle foregrounded the meaning of arche as the element or principle of a Thing which, although indemonstrable and intangible in Itself, provides the conditions of the possibility of that Thing. In a sense, archaeology is as much about the present instant as it is about the fragmentary past. We work on what remains through the artefacts that make it into our museums, our senses and even our language. But to re-energise those artefacts, to bring them back to life, the tools we have access to do much of the speaking.

Like the unseen civilisations of H. G. Wells’ museum, these Things in Themselves lurk beyond the veil of our perceptions. It is the world in and of Itself; the Thing as it exists distinct from perceptions, from emotions, sensations, from all phenomena, that sets the conditions of the world available to those senses. Perceiving the world, sweeping dust away from the objects around us, is a constant act of archaeology.

Kant called this veiled reality the noumenon, a label he interchanged with the Thing-in-Itself (Ding an sich): that which truly underlies what one may only infer through the senses. For Kant, and many philosophers who followed, the Thing Itself is impossible to grasp directly. The senses we use to search the world also wrap that world in a cloudy haze of perceptions, misconceptions and untrustworthy phenomena.

In another science fiction classic, Polish writer Stanislaw Lem considered the problem of The Thing Itself as one of communication. His Master’s Voice (HMV), written at the height of The Cold War, tells the story of a team of scientists and their attempts to decipher an ancient, alien message transmitted on the neutrino static streaming from a distant star. The protagonist of this tale, one Peter Hogarth, recounts the failed attempts at translation with a knowing, deeply considered cynicism. To Peter, and to Stanislaw Lem himself, true contact with an alien intelligence is an absolute impossibility:

“In the course of my work… I began to suspect that the ‘letter from the stars’ was, for us who attempted to decipher it, a kind of psychological association test, a particularly complex Rorschach test. For as a subject, believing he sees in the coloured blotches angels or birds of ill omen, in reality fills in the vagueness of the thing shown with what is ‘on his mind’, so did we attempt, behind the veil of incomprehensible signs, to discern the presence of what lay, first and foremost, within ourselves.”

Stanislaw Lem, His Master’s Voice

Read more »