I have been thinking a lot about change recently. 2020 seemed like a good year to do this, for several reasons. There was the political turmoil in the United States where I live. There was the global pandemic. There was the birth of our daughter. There were a few projects I worked on related to long-term change on evolutionary timescales. All of these issues gave me the opportunity to think about change and some of the paradoxes associated with it. Everybody defines change in their own way, and some changes may be more important to some of us than to others, so how we react to, adapt to and enable change is ultimately very subjective. And yet we all have to deal with some very objective measures of change, at the very least those pertaining to life and death. So the paradox of change is that while it impacts us on a very subjective, personal level and each of us perceives it very differently, on another level it also unites us because of its universal aspects, aspects that can help us define our common humanity.
There was of course the pandemic that forced great changes. A way of life which we took for granted was suddenly and irrevocably changed. Careers and lives ended; we hunkered down in our homes, stopped traveling and started looking inward. For some of us who had been caught up in immediate matters of family, the pandemic even came as a welcome respite in which we got to spend more time with our significant others and children. We stepped back and reevaluated our life on the treadmill. For others, it posed a constant challenge to get work done, especially with kids whose schools were closed. For my wife and me, the pandemic was a chance to spend more time with our newborn daughter and avoid the stress and boredom of the commute and of physical meetings in the office. What can be unwelcome change for one can be unexpectedly welcome for another. In this particular case we were privileged, but the tables could well be turned. Read more »
In the media it is relatively easy to find examples of new technologies that are going to “revolutionize” this or that industry. Self-driving cars will change the way we travel and mitigate climate change, genetic engineering will allow for designer babies and prevent disease, superintelligent AI will turn the earth into an intergalactic human zoo. As a reader, you might be forgiven for being in a constant state of bewilderment as to why we do not currently live in a communist utopia (or why we are not already in cages). We are incessantly badgered with lists of innovative technologies that are going to uproot the way we live, and the narrative behind these innovations is overwhelmingly positive (call this a “pro-innovation bias”). What is often missing in such “debates”, however, is a critical voice. There is a sense in which we treat “innovation” as a good in itself, but it is important that we innovate responsibly. Or so I will argue. Read more »
Philosophy has been an ongoing enterprise for at least 2500 years in what we now call the West and has even more ancient roots in Asia. But until the mid-2000s you would never have encountered something called “the philosophy of wine.” Over the past 15 years there have been several monographs and a few anthologies devoted to the topic, although it is hardly a central topic in philosophy. About such a discourse, one might legitimately ask why philosophers should be discussing wine at all, and why anyone interested in wine should pay heed to what philosophers have to say.
This philosophical discourse about wine did not emerge in a vacuum. Prior to the mid-20th century, one would never have encountered “philosophy of economics,” “philosophy of law,” “philosophy of science,” “philosophy of social science,” or the “philosophy of art” either, each of which has become a standard part of the philosophical canon. Philosophers have always had much to say about these practices but not as organized into discrete sub-disciplines with their own subject matters.
The assumption behind the emergence of these sub-disciplines is that the study of philosophy brings something to them—particular skills or insights—that immersion in the disciplines themselves would struggle to supply. Thus, in trying to get clear on what the philosophy of wine can contribute to the community of wine lovers, we quickly run up against the question of what distinctive skills or insights characterize philosophy. Read more »
If one enters the name “Ellen Page” into the search box at en.wikipedia.org, it redirects to an entry entitled “Elliot Page” (and informs you that it has done this). This is because on December 1, 2020, as the entry itself tells us in the section marked “Personal life,” that person, an accomplished and popular actor nominated for the 2007 Best Actress Oscar for their performance in the film Juno, “came out as transgender on his social media accounts, revealed his pronouns as he/him and they/them, and revealed his new name, Elliot Page.” Naturally this led to a flurry of activity on the interwebs, much of it, not surprisingly, about gender politics. This post, however, will not be about that, except incidentally; instead, it concerns the much sexier topic of the semantics and metaphysics of naming, and will most likely (you have been warned) finish up with lengthy citations from the relevant sections of Philosophical Investigations.
My immediate reaction, that is, when I heard this, was to wonder whether and in what sense “Ellen Page” is still a referring expression, and who gets to decide this, and on what grounds. Naturally Elliot himself has a unique and in some ways authoritative perspective on this, but a) he’s only one of an entire community of English speakers; and b) if he wants to give us a theory of the reference of proper names he’s entirely welcome to do so, but in that context his own perspective, as in (a), is, I think, less authoritative than on the rather narrower question of what he should now be called, which I grant is up to him.
So I’m thinking about sentences like
1) The star of Juno is Ellen Page.
2) Ellen Page is the star of Juno.
3) The first-billed actor in the credits of season 1 of The Umbrella Academy is Ellen Page.
Are these true? False? Nonsensical? Rude to Elliot but otherwise okay? What do they mean? Did they change their meanings on December 1, 2020? What else might we say about them? Read more »
In my first column for 3 Quarks Daily I wrote that we are still fighting both the Civil War and WWII. As Henry Louis Gates Jr. puts it: “two hideous demons slumber under the floorboards of Western culture: anti-Semitism and anti-Black racism.” We have learned that any steps forward will be met with enormous resistance and backwards pressure. Gates quotes Ernst Cassirer: “every developmental step [of modern societies] can be reversed.” We saw this clearly post-Reconstruction, when everything possible was done to limit the lives of Black[i] people, and again following the two terms of the first Black president, when Americans chose an openly racist birther backed by the Ku Klux Klan and Neo-Nazis. Such a leap backwards occurred for European Jews as well, who before the Second World War believed they had successfully assimilated into secular society.
No reader of history can escape the echoes, back and forth, of racism, white nationalism, and German and American ideas of purity. For example, jazz was reviled by the Nazis, and listening to it was a crime. Americans loved jazz, but as late as the 1950s Lena Horne couldn’t go into the dining room in the Sahara Hotel in Las Vegas. Black musicians had to reach and leave the stage through a separate enclosed corridor. Artie Shaw, who by all accounts was not at all racist and performed early on with Billie Holiday when most musical groups were segregated, at the same time hid the fact that he was Jewish. Ava Gardner reports that he sat silent at a table of bigwigs making antisemitic[ii] comments and even joined in rather than speak up and give himself away[iii].
Pressure Point, a film made in 1962 and directed by Hubert Cornfield, is a mostly unknown but quite brilliant dissection of both race and Nazism in America. Sidney Poitier portrays a psychiatrist who (in a flashback) has been given a job in 1942 in a federal penitentiary. He has purposely been assigned a patient who is openly a white supremacist, played vividly by Bobby Darin. (Neither character is named in the film so I will refer to the roles by the names of the actors.) Read more »
“How do you get a philosophy major away from your front door? You pay them for the pizza.”
As a doctoral candidate in philosophy, I am often asked what I am going to “do” with my degree. That is, how will I get a job and be a good, productive little bourgeois worker? How will I contribute to society, and how will my degree (which of course was spent thinking about the meaning of “meaning”, whether reality is real, and how rigid designation works) benefit anybody? I have heard many variations on the theme of the apparent uselessness of philosophy. Now, I think philosophy has a great many uses, both in itself and pragmatically. Of concern here, however, is whether not just philosophy, but education in general might be (mostly) useless.
If you are like me, then you think education matters. Education is important, should be funded and encouraged, and it generally improves the well-being of individuals, communities, and countries. It is with this preconception that I went head-first into Bryan Caplan’s well-written (and often wonderfully irreverent) The Case Against Education, where he argues that we waste trillions in taxpayer revenue when we throw it at our mostly inefficient education system. Caplan does not take issue with education as such, but rather the very specific form that education has taken in the 21st century. Who hasn’t sat bored in a class and wondered whether circle geometry would have any bearing on one’s employability?
As the title suggests, this is not a book that is kind in its assessment of the current state of education. While standard theory in labour economics argues that education has large positive effects on human capital, Caplan claims that its effect is meagre. In contrast to “human capital purists”, Caplan argues that the function of education is to signal three things: intelligence, conscientiousness, and conformity. Education does not develop students’ skills to a great degree, but rather seeks to magnify their ability to signal the aforementioned traits to potential employers effectively. Read more »
Before the COVID pandemic, travel to academic conferences and colloquia was a large part of the job of being a professor at a research-focused university. The last few months have given us the opportunity to reflect on the hurly-burly of academic travel. We’ve keenly missed many things about those in-person events. Yet there were things we don’t miss very much at all. While academic conferences are still paused, we wanted to take note of what was worth our time and what was not, and then make some resolutions about what we can do better.
The bloom of online conferences since last Spring provides a key point of comparison. The online conference has many of the same problems that beset the in-person conference: the schedules are overfull with interesting papers at conflicting times, presenters go over their allotted times and thereby leave no time for discussion, and the Q&A sessions tend to go off the rails with people asking questions that have more to do with their own views than with the presentation. But we were still pleased that the move online allowed younger scholars the opportunity to shine and get uptake with their work. And we were able still to hear a few presentations that provided some real insight. In these respects, online conferences are much like their in-person counterparts.
But there are differences. A unique feature of in-person conferences lies in the unplanned sociality that they make possible. The in-person setting allows for the possibility of passing some luminary in the hall between sessions, or meeting someone whose work you just read. In fact, it’s a piece of unacknowledged common wisdom that the true value of in-person conferences lies in unstructured time when one is not attending sessions. Read more »
COVID-19 has forced populations into lockdown, seen the restriction of rights, and caused widespread economic, social, and psychological harm. With only 11 countries having no confirmed cases of COVID-19 (as of this writing), we are globally beyond strategies that aim solely at containment. Most resources are now being directed at mitigation strategies. That is, strategies that aim to curtail how quickly the virus spreads. These strategies (such as physical and social distancing, increased hand-washing, mask-wearing, and proper respiratory etiquette) have been effective in delaying infection rates, and therefore reducing strain on healthcare workers and facilities. There has also been a wave of techno-solutionism (not unusual in times of crisis), which often comes with the unjustified belief that technological solutions provide the best (and sometimes only) ways to deal with the crisis in question.
Such perspectives, in the words of Michael Klenk, ask “what technology”, instead of asking “why technology”, and therefore run the risk of creating more problems than they solve. Klenk argues that such a focus is too narrow: it starts with the presumption that there should be technological solutions to our problems, and then stamps some ethics on afterwards to try and constrain problematic developments that may occur with the technology. This gets things exactly backwards. What we should instead be doing is asking whether we need a given technology, and then proceed from there. It is with this critical perspective in mind that I will investigate a new technological kid on the block: digital contact tracing. Basically, its implementation involves installing a smartphone app that, via Bluetooth, registers and stores the individual’s contacts. Should a user become infected, they can update their app with this information, which will then automatically ping all of their registered contacts. While much attention has been focused on privacy and trust concerns, this will not be my focus (see here for a good example of an analysis that looks specifically at these factors, drawn up by the team at Ethical Intelligence). I will instead focus on the question of whether digital contact tracing is fair. Read more »
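The registration-and-ping mechanism described above can be made concrete with a deliberately simplified sketch. The class and method names here are my own illustration, not any real app's API, and real systems (for instance those built on the Apple/Google exposure-notification framework) use rotating anonymous identifiers rather than the stable user IDs used below:

```python
from collections import defaultdict


class ContactTracingApp:
    """Toy model of decentralized digital contact tracing.

    Illustrative only: real apps exchange rotating anonymous tokens
    over Bluetooth and never see a stable user identity.
    """

    def __init__(self):
        # user -> set of other users seen nearby
        self.contacts = defaultdict(set)

    def record_contact(self, user_a, user_b):
        # In a real app, triggered by a Bluetooth proximity event.
        self.contacts[user_a].add(user_b)
        self.contacts[user_b].add(user_a)

    def report_infection(self, user):
        # When a user reports a positive test, everyone they were
        # registered as having been near gets notified ("pinged").
        return sorted(self.contacts[user])


app = ContactTracingApp()
app.record_contact("alice", "bob")
app.record_contact("alice", "carol")
notified = app.report_infection("alice")
```

Even this toy version makes the fairness question vivid: only people who own a smartphone, install the app, and keep Bluetooth on ever appear in the contact sets at all.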
Last time, in part 1, I distinguished two strategies for combating philosophical modernism of a certain dated kind: a pluralistic post-empiricism (the exact nature of which I left open for now), and a more narrowly focused post-phenomenological approach which regards the former (and/or its main components) as merely another form of the supposedly mutually rejected picture. In sections I and II, I discussed Charles Taylor’s and Hubert Dreyfus’s phenomenological criticisms of Richard Rorty and John McDowell; today I continue with a look at Taylor’s analogous criticism of Donald Davidson. As before, the point is not to reject phenomenological approaches, but instead merely to understand why Davidson looks to Taylor even less like an anti-Cartesian ally than do Rorty and McDowell, and thus why Taylor will not be impressed by a pragmatist strategy of multiple philosophical tools in which Davidsonian semantics plays a major role. Let me also say that in reading a lot of Taylor’s work recently, I was quite impressed with the scope and rigor of his overall project, and I think that what I present as his drastic misreading of Davidson’s philosophy may most likely be detached and discarded without threatening that project. Or so it seems to me at present. Read more »
A life in which the pleasures of food and drink are not important is missing a crucial dimension of a good life. Food and drink are a constant presence in our lives. They can be a constant source of pleasure if we nurture our connection to them and don’t take them for granted.
Because food and drink are an easily accessible source of pleasure, barring poverty or disease, to care little for them is a moral failure with consequences not only for the self but for others around us. However, to nurture that connection to everyday pleasure requires thought and restraint. Pleasure can be dangerous when pursued without reason and self-control. Addictive pleasures damage us and everyone around us. Addicts, in fact, cannot feel pleasure as readily as the non-addicted and require increasing levels of stimulation to find satisfaction. Addictions and compulsions are pathological and are no model for the genuine pursuit of pleasure. Thus, we need to make a distinction between pleasure that we get from thoughtless, compulsive consumption, and pleasure that is freely chosen. Pleasure freely chosen is actually a good guide to what is good for us and what should matter to us.
This emphasis on freely chosen pleasure is important not only for keeping us healthy but because certain kinds of pleasures are deeply connected to our sense of control and independence. Some of the pleasures in life come from the satisfaction of needs. When we are cold, warm air feels good. When we are hungry even very ordinary food will taste good. But such enjoyment tends to be unfocused and passive. We don’t have to bring our attention or knowledge to the table to enjoy experiences that satisfy basic needs. We are hard-wired to care about them and our response is compelled.
However, many pleasures are not a response to need or deprivation. We have to eat several times a day, but we don’t have to eat well several times a day. Pleasure freely chosen is essential to a good life because it expresses our independence from need. Read more »
By the beginning of the 20th century, it had become clear to an influential minority of philosophers that something was badly amiss with modern philosophy. (There had been gripes of innumerable sorts since the beginning of modernity in the 17th century; but our subject today is the present.) “Modern” here means something like “Lockean and/or Cartesian,” where this means … well, it’s not immediately clear what exactly this means, nor what exactly is wrong with it, and therein lies the tale of a good deal of 20th-century philosophy. As with every broken thing, we have two choices: fix it, or throw it out and get a new one; and many philosophers have advertised their projects as doing one or the other. However, as we might expect, unclarity about the old results in corresponding unclarity about the supposedly better new. What’s the actual difference, philosophically speaking, between rehabilitation and replacement?
Let’s start with what two important groups of contemporary anti-modern philosophers (again, let’s leave pre-moderns out of it for today) say about what they’re doing. We can all agree that (in Wittgenstein’s words, but quoted by all and sundry) “a picture held us captive,” and even, in his continuation, that the way it did this was that “it lay in our language and language seemed to repeat it to us endlessly.” That is, it’s not simply a philosophical theory, the conclusion of an argument we have come to regard as unsound. Even in such relatively straightforward cases, of course, there may be plenty of disagreement about how to continue; but here part of our task is not simply to outline a better view, but also to diagnose and escape this characteristic feature of the old one. Such a treatment would explain how such captivity was possible, and how our very language could turn against us, as well as (naturally) what to do about it. Read more »
Human beings are agents. I take it that this claim is uncontroversial. Agents are that class of entities capable of performing actions. A rock is not an agent; a dog might be. We are agents in the sense that we can perform actions, not out of necessity, but for reasons. These actions are to be distinguished from mere doings: animals, or perhaps even plants, may behave in this or that way by doing things, but strictly speaking, we do not say that they act.
It is often argued that action should be cashed out in intentional terms. Our beliefs, what we desire, and our ability to reason about these are all seemingly essential properties that we might cite when attempting to figure out what makes our kind of agency (and the actions that follow from it) distinct from the rest of the natural world. For a state to be intentional in this sense it should be about or directed towards something other than itself. For an agent to be a moral agent it must be able to do wrong, and perhaps be morally responsible for its actions (I will not elaborate on the exact relationship between being a moral agent and moral responsibility, but there is considerable nuance in how exactly these concepts relate to each other).
In the debate surrounding the potential of Artificial Moral Agency (AMA) the “Standard View” presented above is often a point of contention. The ubiquity of artificial systems in our lives can often lead us to believe that these systems are merely passive instruments. However, this is not necessarily the case. It is becoming increasingly clear that intuitively “passive” systems, such as recommender algorithms (or even email filter bots), are very receptive to inputs (often by design). Specifically, such systems respond to certain inputs (user search history, etc.) in order to produce an output (a recommendation, etc.). The question that emerges is whether such kinds of “outputs” might be conceived of as “actions”. Moreover, what if such outputs have moral consequences? Might these artificial systems be considered moral agents? This is not to necessarily claim that recommender systems such as YouTube’s are in fact (moral) agents, but rather to think through whether this might be possible (now or in the future). Read more »
I often hear it said that, despite all the stories about family and cultural traditions, winemaking ideologies, and paeans to terroir, what matters is what’s in the glass. If a wine has flavor it’s good. Nothing else matters. And, of course, the whole idea of wine scores reflects the idea that there is a single scale of deliciousness that defines wine quality.
For many people who drink wine as a commodity beverage, I suppose the platitude “it’s only what’s in the glass that matters” is true. But many of the people who talk this way are wine lovers and connoisseurs. For many of them, there is something self-deceptive about this exclusive focus on what is in the glass. Although flavor surely matters, it is not all that matters, and these stories, traditions, and ideologies are central to genuine wine appreciation.
Burnham and Skilleås, in their book The Aesthetics of Wine, engage in a thought experiment that shows the questionable nature of “it’s only what’s in the glass that matters”. They ask us to imagine a scenario in 2030 in which wine science has advanced to such a point that any wine can be thoroughly analyzed, not only into its constituent chemical components (which we can already do up to a point), but with regard to a wine’s full development as well.
In 2019 Buckey Wolf, a 26-year-old man from Seattle, stabbed his brother in the head with a four-foot-long sword. He then called the police on himself, admitting his guilt. Another tragic case of mindless violence? Not quite, as there is far more going on in the case of Buckey Wolf: he committed murder because he believed his brother was turning into a lizard. Specifically, a kind of shape-shifting reptile that lives among us and controls world events. If this sounds fabricated, it’s unfortunately not. Over 12 million Americans believe (“know”) that such lizard people exist, and that they are to be found at the highest levels of government, controlling the world economy for their own cold-blooded interests. This reptilian conspiracy theory was first made popular by well-known charlatan David Icke.
What emerged from further investigation into the Wolf murder case was an interesting trend in Wolf’s YouTube “likes” over the years. Here it was noted that his interests shifted from music to martial arts, fitness, media criticism, firearms and other weapons, and video games. From there, it seems, Wolf was drawn into the world of alt-right political content.
In a recent paper Alfano et al. study whether YouTube’s recommender system may be responsible for such epistemically retrograde ideation. Perhaps the first case of murder by algorithm? Well, not quite.
In their paper, the authors aim to discern whether technological scaffolding was at least partially involved in Wolf’s atypical cognition. They make use of a theoretical framework known as technological seduction, whereby technological systems try to read users’ minds and predict what they want. In these scenarios, such as when Google uses predictive text, we as users are “seduced” into believing that Google knows our thoughts, especially when we end up following the recommendations of such systems. Read more »
Beauty has long been associated with moments in life that cannot easily be spoken of—what is often called “the ineffable”. When we are astonished or transfixed by nature, a work of art, or a bottle of wine, words, even when finely voiced, seem inadequate. Are words destined to fail? Can we not share anything of the experience of beauty? On the one hand, the experience of beauty is private; it is after all my experience not someone else’s. But, on the other hand, we seem to have a great need to share our experiences. Words fail but that doesn’t get us to shut up.
Perhaps communication about beauty is not hopeless; we do after all share some responses to beauty. Most everyone agrees the Mona Lisa is beautiful (if you can actually get close enough to enjoy the diminutive painting amidst the hordes at the Louvre). Most everyone agrees that Domaine de la Romanée-Conti makes lovely wine if you can afford a taste. Who would argue with the spectacular coastline view of Cinque Terre from Monterosso?
However, in matters of beauty, disagreements are just as common. As Alexander Nehamas argues, beauty forms communities of like-minded lovers who share an affection for certain works of art and who do find it possible to communicate their obsession. Something escapes the dark tunnels of subjectivity to survive in a clearing where others mingle. But this process excludes people who don’t get it. We are often bored to tears by something that fascinates others. Across that barrier of incomprehension words may well fail. Beauty forms communities of rivals, as the scandal surrounding the first performance of Stravinsky’s Rite of Spring exemplifies. The contretemps between conventional and natural wine is the latest to divide the wine world. May it not be the last, because these conflicts matter and are a symptom of the fundamentally normative response which beauty demands of us. Read more »
In discourse about wine, we do not have a term that both denotes the highest quality level and indicates what that quality is that such wines possess. We often call wines “great”. But “great” refers to impact, not to the intrinsic qualities of the wine. Great wines are great because they are prestigious or highly successful—Screaming Eagle, Sassicaia, Chateau Margaux, Penfolds Grange, etc. They are made great by their celebrity, but the term doesn’t tell us what quality or qualities the wine exhibits in virtue of which it deserves its greatness. Sometimes the word “great” is just one among many generic terms—delicious, extraordinary, gorgeous, superb—we use to designate a wine that is really, really good. But these are vacuous, interchangeable and largely uninformative.
It’s a peculiarity of the wine community that when designating the highest quality, we sometimes refer to a score assigned by a critic. But that tells us how much that critic liked the wine in comparison to similar wines. It doesn’t tell us why it deserves such a rating. We have criteria to judge wine quality such as complexity, intensity, balance, and focus. But these refer to various dimensions of a wine, not an overall judgement of quality.
Although most wines provide pleasure, some wines are not merely pleasurable. They stand out from the ordinary and have a special claim on our attention. We need a way of describing the depth and meaning of that experience. In the history of aesthetics “beauty” has filled this role as an indicator of remarkable aesthetic quality. It is less frequently used today than in centuries past since many works of modern or contemporary art do not aim at aesthetic pleasure. After the disruptions of 20th-century art, it seems most people in the art world are disillusioned with beauty, as if it were a fusty old term genuflecting toward conventions left behind, something false or inflated that reflective people no longer believe in. Read more »
The word “interpretivism” suggests to most people a particularly crazy sort of postmodern relativism cum skepticism. If our relations to reality are merely interpretive and perspectival (I will use these terms interchangeably as needed, the idea being that each interpreter has her own distinct perspective on a world not reducible to any single view), our very access to objective facts seems threatened. Nietzsche, for example, famously says that “there are no facts, only interpretations” (a careless misreading, but let’s not get into it here). Fast-forward to Jacques Derrida and the whole lit-crit crew, who claim that everything is a text; and with the triumphantly dismissive reference to that notorious postmodern imp, the game is over. Interpretation is for sissies; let’s get back to doing hard-nosed empirical science (or objective metaphysics).
On this account, the opposite of “interpretive” is something like “representational”: our successful beliefs simply get the world right, with no (subjective, open-ended, wishy-washy) interpretation required. This makes sense up to a point. Our beliefs portray the world as being a certain way, not as (primarily) meaningful or enlightening or useful, or whatever is characteristic of our favored interpretations. On the other hand, to distinguish belief from meaning in this way makes it seem as if interpretation does not concern itself with belief or inquiry at all. Yet even if interpretation is not the same as inquiry, or meaning the same as belief, they are – or so we post-Davidsonian pragmatists claim – more closely intertwined than this dichotomous account would indicate.
One way to sort this out is to jump right into it with a close analysis of the notions of meaning and belief in the manner of the later Davidson and Richard Rorty’s frustratingly dodgy use of same. We’ll do more of that later on (he warned); but today I wanted to try another tack. It is generally accepted that history in particular is an interpretive discipline (a “humanity,” not a “science”), yet it is commonly accepted as well that historians deal in facts. If we can see how this conceptual accommodation works in the narrower context, we may be able to transpose it, or something like it, into our larger one. In this post I will set the problem up, leaving you in suspense until next time when I reveal a possible solution. Read more »
Democracy is a precious social good. Not only is it necessary for legitimate government, in its absence other crucial social goods – liberty, autonomy, individuality, community, and the like – tend to spoil. It is often inferred from this that a perfectly realized democracy would be utopia, a fully just society of self-governing equals working together for their common good. The flip side of this idea is familiar: the political flaws of a society are ultimately due to its falling short of democracy. The thought runs that as democracy is necessary for securing the other most important social goods, any shortfall in the latter must be due to a deviation from the former. This is what led two of the most influential theorists of democracy of the past century, Jane Addams and John Dewey, to hold that the cure for democracy’s ill is always more and better democracy.
The Addams/Dewey view is committed to the further claim that democracy is an ideal that can be approximated, but never achieved. This addition reminds us that the utopia of a fully realized democracy is forever beyond our reach, an ongoing project of striving to more perfectly democratize our individual and collective lives.
This view is certainly attractive. Trouble lies, however, in making the democratic ideal concrete enough to serve as a guide to real-world politics without thereby deflating it of its ennobling character. Typically, as the ideal is made more explicit, one finds that it presumes capacities that go far beyond the capabilities of ordinary citizens. It turns out that democracy isn’t only out of our reach, it’s also not for us. Read more »
Research by linguists into wine metaphors has identified several source domains that help wine writers describe the faint and ephemeral features of poetry in a glass. “Wine is a building”, “wine is a piece of cloth”, and especially “wine is a person” are a few of the many potential likenesses that might uncover facets of a wine. There are after all many ways of being a body or a person with new variants continuously on offer. But how do writers identify, within these source domains, which likenesses will be compelling and how do readers come to understand what a metaphor means? Identifying source domains for wine metaphors must be supplemented by an account of how interpretation works.
Given the importance of variation and distinctiveness in wine appreciation and the need for linguistic innovation to capture these dimensions, theories of metaphor that explicitly link metaphor to the exercise of imagination will be most useful. The use of metaphor in wine language looks backward to conventional, entrenched descriptions while looking forward in order to capture the emergence of innovative taste profiles that require linguistic imagination.
To add more complexity to the mix, the use of metaphor in wine language serves two broad purposes that are sometimes opposed. On the one hand, writers use metaphor to communicate an accurate description of the wine they’re tasting, especially by conveying holistic properties such as elegance, intensity, or balance. On the other hand, metaphor expresses the remarkable experiences of a wine that wine importer Terry Theise calls “sublime”. “Some wines,” he writes, “…are so haunting and stirring that they bypass our entire analytical faculty and fill us with image and feeling”. Read more »