by Mike O’Brien
I’m disappointed in my columns so far. Not to say that they’ve been completely without value; I’ve managed to turn out some decent pieces and some kind readers have taken the time to tell me as much. But, taken together, my output exhibits a fault that I had sought to avoid from the outset. It has been far too occasional, too reactive, too of the moment, in an extended, torturous moment of which I very much do not wish to be a part. There are exculpatory circumstances, of course. These are difficult times, and we are weak and vulnerable beings, and I am nothing if not an entrenched doom-scroller with a tendency toward global anxiety. This is not an apology, because a core tenet of my approach to writing is a complete disregard for my audience. You’re all lovely people, I’m sure, but in order to write I have to provisionally discount your existence. This is, rather, a confession to the only person whose opinion matters to me as a writer. And that would be myself.
I’m qualified to critique my own work because I know what it is supposed to be, and I know when it falls short because I can’t be arsed to invest more effort into it. I have the great misfortune of being able to skate by on style. I know some people enjoy fluffy exercises in style; I rather enjoy such exploits myself, and my favourite writers are all great stylists. But I had set out to do more with these columns, and so far that goal is largely unmet. I can blame the general state of emergency that has engulfed this year, for robbing me of focus and constancy, for frustrating my earnest intentions to rise above immediacy and reaction. But I’ve lived with myself long enough to know that I lacked focus and constancy long before Covid showed up, and if this was a “normal” (i.e. less obviously and acutely disastrous) year, I’d have to come up with some other excuse for the same failures.
I don’t think of myself as a writer because I don’t feel any internal drive to write. Some people have a burning need to express their inner experience. Some burst out in song, or scribble urgent thoughts on whatever scraps are close at hand. Some suffer years of toil and sacrifice in service to a creative practice that is as vital to them as breathing. Such a motivated existence is rather alien to me. I’ve spent much of the last decade surrounded by artists and creators, admiring their efforts and enjoying their works, but I still don’t feel like I’m one of them. This despite having written and produced two plays of my own in the last three years. Why did I do that, if I’m not a writer?
Because I had to. Not that I was ordered by an art court to write a script, or threatened with art jail for failing to appear at curtain time. I promised a friend, after much cajoling, to cast my name into a lottery for an arts festival. I figured I had a 50% chance of getting picked. I can live with a 50/50 split. When my name was drawn, I was locked in. This being a Fringe festival, I could have discharged my responsibilities by reciting the alphabet on stage for 30 minutes, if I wished. No-refund policies are artistically liberating in that way. But I ended up doing a rather ambitious production (for someone with no theatre experience), and in a twist of cosmic comedy, I won a guaranteed spot in the next year’s festival. For those of you tracking the math, that’s a commitment to create two plays, from a probability-adjusted initial commitment of 0.5 plays. This is why I don’t gamble.
That little story illustrates the general pattern of how I get things done. I don’t, absent some external compulsion. Of course, I had to elect to be bound by that compulsion at some earlier date, usually in an uncharacteristic fit of aspiration or boredom. That’s how I got through university, despite being adrift and miserable for most of it; I simply didn’t afford myself the option of quitting. (Except for an aborted BA in economics at McGill, because neo-classical economics is a stinking, useless corpse and, besides, what does anyone need two BAs for?). All my significant accomplishments are the result of being lashed to a mast by some past self. Recognizing this fact, I keep an eye out for useful masts.
This recurring column was to be such a thing. My application was submitted on a whim, emerging from a happy confluence of circumstances. I was on the cusp of turning forty, in the early days of the first Covid lockdown, starting to think that I needed some kind of productive activity to guard my faculties from atrophy. It was clear by then that life in the creative milieu where I had been working was suspended for the foreseeable future, and opportunities for spontaneous creative collaboration would not be forthcoming. It is rare for my world to be so empty that I am moved to fill it with my own work. But I suppose this has been a rare year (I should hope).
This loss of casual contact with a creative community was paralleled in another area of my life, the bustling academic scene in Montreal. Since finishing my MA in 2010, I had been participating in conferences and seminars across town, enjoying an environment of scholarship without bearing any of the burdens of academia. I’d occasionally wonder if I was a free-rider in this, though such doubts were assuaged by the fact that I actually read the material and seemed to provide useful feedback (and, when it was unavoidable, paid the dear sums required for attendees who are neither enrolled nor destitute). Some activities have been moved online, but much of what I enjoyed has disappeared. This has prompted a reckoning, as I have to ask myself what exactly I was getting from these events, and what elements are most important to find in a substitute.
Certainly, part of the appeal lay in discussing philosophy with professional philosophers, which does not often happen outside of academe. The conventions and mood of such interactions are familiar and engaging, even when the content is not my particular cup of tea. But I don’t want to simply be a gadfly. I feel some faint motivation to do some philosophical work myself, and being around the doing of philosophy gave the feeling that some kind of philosophical accomplishment was imminent. “This is the place”, said my gut, “things are gonna happen.” But they didn’t happen because I didn’t make them happen, despite being in “the place”.
I also had some misgivings about “the place”, and its suitability to the production of the kind of philosophy that I care about. I’m wary of sniping from the sidelines, criticizing professionals who have to deal with far more burdens and responsibilities than I can imagine. But I’ve read enough accounts from talented, industrious and disenchanted academics to believe that my pessimism was somewhat accurate. I never seriously considered an academic career, or rather, a life of trying to hustle my way into a candidacy for an academic career. The job market was grim when I finished my grad studies, and now, ten years later and on the precipice of the worst depression since the 1930s, I lack the unconditional optimism required to see a path forward. Also, I’ve spent too much time dithering and I’m too damn old. Which leaves me facing a certain self-imposed ultimatum: either I find a way to do the kind of work I want to accomplish with my own resources, of my own accord, or I chalk such aspirations up to vanity and accept that I’m just a slacker with too many degrees.
I have mostly focused my attention on matters about which I can do almost nothing: global capitalism, ecological collapse, other countries’ elections. There is a safety in that, as there is no attainable goal that I can fail to reach. The real risk (to my pride, anyway) lies in focusing on matters about which I can do almost anything, like art and philosophy. What follows here is an attempt to outline the field in which I’ll risk failure. I’ve got a pretty good idea of the initial layout, because I’ve been loitering at the entrance to these projects for quite some time.
My main philosophical preoccupation is animal ethics, and in particular the ethical relations between humans and other animals. I’ve read a lot of other people’s work in this area, but not much of it has appealed to me. First, a lot of it draws heavily on concepts of rights and personhood. I don’t believe in animal rights, for the most part. In that sense I am a species egalitarian, because I don’t really believe in human rights, either.
See, you’re making that face that people make when I tell them that. Come back, I can explain.
I think rights are a useful fiction that can be instrumentalised to protect people (and any other thing) from abuse. It works for people (sort of, some of the time) because we can tell a plausible story about how people are persons, and how persons can engage in a reason-giving discourse that binds actions. I don’t think we can tell a similarly plausible story about animals. Rights and persons are pieces in a language game that only humans can play. Human persons can refer to animals when they play this game among themselves, and bind their actions with regard to animals as a result, but this is not a “recognition” of rights. It is a stipulation of status motivated by a desire to provide the protections afforded by that status.
So long as rights discourse is the most efficacious way to frame moral debates in politics, I am all for pretending that animals have rights (humans, too). I’ll even go along with calling rivers and mountains persons, if that is the language required to grant them actionable protections here and now. But the thing that makes rights discourse so useful in law also makes it a mess to work with in ethics. By moving from an “ought” (we should treat these entities in this way) to an “is” (these entities have these properties), rights discourse can make a normative discussion appear as a descriptive one. Which is fine, if everyone involved accepts the ascription of properties as true, and can move on to discussing the implications of these shared “facts”.
Humanity’s treatment of animals, both directly in agriculture and indirectly through habitat impacts, is monstrously abusive. It is therefore easy for well-meaning people to broadly agree on morally required changes to behaviour, despite disagreement on the facts of animal psychology and moral ontology. This shared normative motivation is corrosive to philosophy, because you can make all manner of mistakes in your case for why, and still get assent for your resulting what. Logically, the statement IF (x) THEN ¬(A & ¬A) is true whatever you may substitute for x, because the consequent, the law of non-contradiction, is a tautology. In this case, we can substitute “we do not have license to destroy any and all life on earth” for ¬(A & ¬A).
(Yes, I know that the structure does not track).
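The logical claim above can be checked mechanically. A minimal sketch in Python (the function name and setup are mine, purely illustrative): enumerate every truth assignment and confirm that the material conditional with a tautologous consequent holds no matter what the antecedent is.

```python
from itertools import product

def conditional_holds(x: bool, a: bool) -> bool:
    """Evaluate the material conditional x -> not(A and not A)."""
    consequent = not (a and not a)   # the law of non-contradiction: always True
    return (not x) or consequent     # x -> q is equivalent to (not x) or q

# Check every combination of truth values for x and A.
all_true = all(conditional_holds(x, a) for x, a in product([True, False], repeat=2))
print(all_true)  # True: the conditional holds for every substitution
```

The point carries over to the essay's worry: since the consequent is true under every assignment, assent to it tells you nothing about the soundness of whatever antecedent produced it.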
I have some other ideas about this, but I’ll save them for a more fully developed treatment in an upcoming column. Hobbes may make a guest appearance.
The second major issue I wish to develop is more meta-ethical, rather than nuts-and-bolts ethical. In addition to rights talk, I’ve heard a lot of justice talk. After declaring myself anti-rights, am I now declaring myself anti-justice? Kind of. Justice is fine; a just world would be much better than the one we have. But it is both excessively and insufficiently demanding as a sole moral standard. Excessively, because doing just the right amount of good may require us to make calculations for which the data may be lacking. Insufficiently, because there may be compelling reasons to do more good than our duty (as determined by available information) requires, all things considered. There may be wisdom in adopting a standard of grace and generosity, rather than justice, due to our limited ability to gather and parse relevant information. Given a history of underestimating the psychological lives of animals, erring on the side of over-estimating our moral duties to them may be in order.
I suspect that the importance placed on accurately calculating the demands of justice may reveal an assumption about moral success. That is, getting moral demands “just right” allows for an efficient allocation of resources whereby those demands can be met. But that’s a rather curious situation. Why would our moral demands be equal to the resources available to meet them? That sounds an awful lot like the Just World Hypothesis, with some assembly required. My own hunch is that we are in the unlucky position of being able to recognize exponentially more demands than we can satisfy, and that morality is largely a matter of failing gracefully. Maybe that’s just a Catholic thing.
From a more bookishly philosophical perspective, justice talk shares with rights talk the fault of sliding from “ought” to “is”, and much justice talk is methodological busy-work following a perfunctory stipulation of empirical and normative assumptions. There is a deeper problem, though. I have often read “would x be justified?”, or “is x a sufficient justification for such-and-such”. And, because I am a difficult person, my first thought is “justified to whom?”. If “justified” is merely a long way of writing “just”, then such questions amount to “what is The Good?” and have no place in a journal article, unless the author presumes to have answered philosophy’s oldest question. But if “justified” means “satisfying some shared moral standard”, I have to ask “to what standard are you referring?”. Because, you see, these discussions are held in secular settings that do not avail themselves of scriptures or other declarations of substantive moral truths. Some authors are ethical naturalists, i.e. they believe that moral facts are a subset of natural facts, and they can be discovered empirically. I can’t fathom how that would work, and every attempt I’ve seen at explaining it involves smuggling some normative concepts in via half-baked evolutionary theory. Others may believe that morality is derivative of rationality, or is some intuitable set of independently true facts. In the era of empirical cognitive science, these positions often collapse into ethical naturalism with more steps.
The answer to this normative bootstrapping problem is, I think, Nietzsche. Many serious, analytic ethical thinkers will scoff at this. Yes, he was a bombastic ironist fond of hyperbole and self-contradiction. Yes, he said some very nasty things about, well, everyone (though women, Germans and Englishmen got the worst of it). But his breakthrough in moral thinking stands fast. God, as a source of moral facts, is indeed dead in secular thinking, and science is not replacing Him in that role. Western philosophy spent generations trying to reverse-engineer some kind of rational genealogy for the moral principles inherited from Christianity, but the jig is up. To ask “what is justified?” in humanity’s unopposed dominion on earth is to ask “what are we going to do, and what will that mean?”. Holding out for some external compulsion, be it nature, God, aliens or some logical wormhole between “ought” and “is”, to force our moral beliefs is just forestalling the inevitable. Moral assent, following old Fred, is a creative act, not a disinterested recognition of fact. We created atom bombs, artificial intelligence (but not really, yes, I know…) and gene editing. We shouldn’t shrink at the audacity of creating moral facts. It’s what we’ve been doing all along. We just lacked the maturity, as a cultural species, to take full responsibility for it.
I hope that these threads will be developed into something satisfyingly substantial in my future pieces. Satisfying to me, of course, and only insofar as I am an arbiter of what form is proper to these ideas. I don’t write for myself. I don’t write for you. Most of the time, I don’t write at all. But when I do, I hope to do justice to the lofty aspirations I had when I tied myself to this mast. No more following the coast.