by Scott F. Aikin and Robert B. Talisse
Before the COVID pandemic, travel to academic conferences and colloquia was a large part of the job of being a professor at a research-focused university. The last few months have given us the opportunity to reflect on the hurly-burly of academic travel. We’ve keenly missed many things about those in-person events, yet there are things we don’t miss at all. While academic conferences are still paused, we want to take stock of what was worth our time and what wasn’t, and then make some resolutions about what we can do better.
The bloom of online conferences since last spring provides a key point of comparison. The online conference has many of the same problems that beset the in-person conference: schedules overfull with interesting papers at conflicting times, presenters who run over their allotted time and leave none for discussion, and Q&A sessions that go off the rails as people ask questions that have more to do with their own views than with the presentation. But we were still pleased that the move online gave younger scholars the opportunity to shine and win uptake for their work. And we were still able to hear a few presentations that provided real insight. In these respects, online conferences are much like their in-person counterparts.
But there are differences. A unique feature of in-person conferences lies in the unplanned sociality that they make possible. The in-person setting allows for the possibility of passing some luminary in the hall between sessions, or meeting someone whose work you just read. In fact, it’s a piece of unacknowledged common wisdom that the true value of in-person conferences lies in unstructured time when one is not attending sessions.
by Ali Minai
Given where we find ourselves in this late November of 2020, it is hard to think of a book more relevant or timely than The Hype Machine by Sinan Aral. The author is the David Austin Professor of Management and Professor of Information Technology and Marketing at the Massachusetts Institute of Technology. As one of the world’s foremost experts on social media and its effects, Prof. Aral is the perfect person to examine how this phenomenon has changed the world and the human experience. This is what he sets out to do in his new book, published under the Currency imprint of Random House this September, and he does so with considerable success.
The book provides an excellent overview of where things stand with social media, its promise and its peril. For anyone looking for a single, accessible, non-technical source of information and insight on these important issues, it is essential reading. The book is very well organized, and the logical flow – both across and within chapters – is remarkably smooth. Overall, it is an easy read that informs and educates without getting mired in technical jargon – no mean feat for a book about a technical field rife with jargon. And while a large proportion of the book simply reports where things stand and how different social media platforms are shaping the lives of their users, Prof. Aral does not shy away from building a useful abstract framework in which to place all this, and from addressing the complex issues it raises.
Dylan Kwait. Surfers by Plum Island, October 2020.
With permission … thanks Dylan!
by Claire Chambers
This has been a terrible year with almost no redeeming features. I have written elsewhere about Covid-19’s hardships, both personal and political. Today, with a vaccine on the horizon and Trump defeated, I want to find some other crumbs of comfort that, as with Hansel and Gretel, might take us somewhere.
I am privileged to live with my husband Rob and two sons in our own house with a garden, and to hold down a lecturing job which is pretty secure. Rob is even more financially stable in his career, working as he does in a growth industry this pestilent year: he’s a family doctor.
But of course being a GP has brought its own share of unexpected stresses recently. In March, when the UK government spectacularly failed to get PPE for medical professionals and disaster capitalists were circling, Rob bought some pyjamas from Marks and Spencer to turn into improvised surgical scrubs. He even fashioned his own visor using a piece of sponge, some plexiglass, and a coat hanger. Although he never had to use it, I won’t forget that time of helpless, abandoned terror.
Nigerian-American author Chimamanda Ngozi Adichie posted a microblog to Instagram early into this crisis about her fears around the pandemic. When I read these terse words, I nodded vigorously:
My husband is a doctor and each morning when he leaves for work, I worry. My throat itches and I worry. On Facetime I watch my elderly parents. I admonish them gently: Don’t let people come to the house. Don’t read the rubbish news on WhatsApp.
Adichie was worrying about both of her vulnerable parents in this essay from April; it is poignant to realize that her father would die just two months later.
by Chris Horner
Not long ago an article circulated on Facebook about ‘Hating the English’, originally published in a large-circulation newspaper. The Irish author says something to the effect that she once thought it was just a few bad ones, but now she hates the lot of them. The piece was stimulated, I think, by the repulsive English nationalism that has been raising its head since Brexit, plus the usual ignorance about Ireland, Irish history and Irish interests on the part of your typical ‘Brit’. It’s not a very good piece of writing, and the idea in it is rather slight. I’d ignore it but for the ‘likes’ and positive comments it has received, particularly from ‘leftists’. It’s an example of what we could call ‘bloc thinking’ – the emotionally satisfying but futile consignment of entire masses of people into categories of nice and nasty.
This kind of thinking has a number of obvious problems. It is deeply unwise to brand entire national groups good or bad, to declare love or hate for whole ethnic or national communities. Too many English people have branded the Irish in just that way throughout their shared and troubled history; repeating it in the other direction is hardly progress. This is the habit of the worst kinds of right-wing chauvinists, and we should steer well clear of it. We get the same kind of thing, for instance, from ‘anti-imperialists’ despising the ‘Americans’ (meaning, usually, citizens of the USA). This is particularly obtuse when it comes from people who have never visited the USA and don’t know anyone who lives there. Just think: 328 million people, rich and poor, white, black or brown, anglo and latino, from coast to coast – all dismissed because of policies emanating from ‘America’s’ ruling 1%. It is true that many US citizens – not all, by any means – will have supported those policies, but that ought to be the beginning of a problem to think about, not an invitation to simple-minded moralising. Such fatuous generalisation is so obviously foolish that it might not detain us long, were it not for the tendency of this kind of approach to encompass whole swathes of people, demographics and even generations as Good or Bad. So we get greedy ‘boomers’ versus ‘millennials’, or whatever crass label is currently in use. And so on.
by Thomas Larson
According to Donald Trump, in a statement made to MSNBC’s “Morning Joe,” April 11, 2011, about the fake “birther controversy” of President Barack Obama—the opening salvo in Trump’s campaign of political disinformation—Obama’s “grandmother in Kenya said, ‘Oh, no, he was born in Kenya and I was there and I witnessed the birth.’ She’s on tape,” Trump went on. “I think that tape’s going to be produced fairly soon. Somebody is coming out with a book in two weeks, it will be very interesting.”
And, according to Vox, President Trump, two weeks after losing the November 3, 2020, election, tweeted, “I won the election!” He had warned many times before the vote that the only way he could lose the election would be if it was rigged, and the only way he could win would be if the election was fair – a remarkably trenchant conjuration of the Three Witches’ spell in Macbeth: “Fair is foul, and foul is fair.”
And, according to Chanel Rion of One America News and Trump legal-team lawyer Sidney Powell, software engineers in Michigan and Georgia (and in parts of 26 other states) contracted with Dominion Voting Systems, which has financial ties to Nancy Pelosi, Dianne Feinstein, George Soros, the Clinton Foundation, and the seven-years-dead Hugo Chávez of Venezuela, to make ballot-counting machines switch votes from Republican to Democratic presidential candidates or to leave out a prescribed number of votes for President Trump in Joe Biden’s favor.
by Adele A Wilby
Recent protests in the US by Trump supporters since the election of Joe Biden highlight just how much potential political ideologies have to tear seemingly ‘stable’ societies apart. A political divide, however, cannot always be seen as a clear-cut contradiction between right and left in the way that Trump supporters might assert; Biden, and the Democrats more broadly, could hardly be said to represent the left. Likewise, the right has its shades of commitment to conservatism. Yet Trump’s 70 million supporters represent a congealing of far-right politics in America, identifiable by the Trump policies they endorse: anti-immigration, racism, a resurgent nationalism. While there is little doubt that such policies have been magical music to the ears of many right-wingers, for others Trump and the Republican Party do not go far enough, and it is these extreme right-wing groups that are the subject of Talia Lavin’s book Culture Warlords: My Journey into the Dark Web of White Supremacists.
Lavin opens her book with an explicit acknowledgement of her politics. She admits that she is ‘to the left of Medicare for All’. Thus there is no pretence of ‘objectivity’ in her research and analysis: the book is an account of her year-long internet engagement with a ‘sliver of a movement’ on the right-wing spectrum. Her research nonetheless adds up to a shocking yet thought-provoking first-hand account of the thinking that underpins the deep hate, and the almost godlike worship of violence, that white supremacists profess.
A lamp lighting a storefront in Vahrn, South Tyrol, in November of 2020. And sky.
by David Kordahl
When I was seventeen years old, I took my first college science course, a summer class in astronomy for non-majors. The professor narrated his wild claims in an amused deadpan, calmly showing us how to reconstruct the life cycle of stars, and how to estimate the age of the universe. This course was at the University of Iowa, and I imagine that the professor was accustomed to intermittent resistance from students like me, whose rural, religious upbringing led them—led me—to challenge his claims. Yet I often found myself at a loss. The professor used a soft sell, and his claims seemed somewhere beyond the realm of mere politics or belief. Sure, I could spot a few gaps in his vision (he batted away my psychoanalytic interpretation of the Big Bang by saying he had never heard of Freud), but I envied him. I wished that my own positions were so easy to defend.
Seventeen years later, I’m now in the professor’s position, defending physics to doubting undergraduates. My views now are mostly easy to defend. Yet as someone who never progressed much beyond an undergrad knowledge of astronomy (I took one graduate cosmology course and left it at that), I’ll admit that some of the things that bothered me back then still bother me today. Some of the grandest claims in physics are based almost solely on astrophysical evidence, including the assertion that our standard physics accounts for less than 5% of what’s really there, with the other 95+% of mass-energy in the woolly categories of “dark energy” and “dark matter.” Such views are mainstream enough now to invite few scientific naysayers.
by Tim Sommers
Cosmology is a young science. Maybe the youngest. Some people say it started in the 1920s, when these little glowing clouds, visible at certain points in the sky, were found by better and better telescopes to be composed of billions and billions of stars, just like our own galaxy – the Milky Way – and when it was discovered that, no matter what direction you looked, they were all rushing away from us. More than one cosmologist has wondered if these galaxies know something that you and I don’t.
On the other hand, maybe, cosmology is the oldest science – if looking up at the stars counts. Anyway, well before the discovery of other galaxies we knew that the universe was very old and very large, but, boy, we had no idea.
Other people say cosmology began when Einstein’s General Relativity gave us some of the math we needed to talk about the universe as a whole for the first time and began to raise questions about its shape.
For me, I say, cosmology didn’t really start until the 1960s, when instruments designed to detect tiny amounts of microwave radiation mapped what we now call the “cosmic microwave background radiation”. Here’s the thing that we had missed for so long, the thing so hard to believe that we now all take it for granted – hardly worth talking about. If the galaxies are all very far apart and getting further apart all the time, doesn’t it follow that there was a time when they were all very close together? Doesn’t it follow, in fact, as most likely, that there was a starting point? Isn’t this, for the first time in the history of humanity, empirical evidence that the universe had a beginning? And if there was such a beginning, would it make a sound even if no one was around to hear it? The answer is, yes, yes, and it would make a sound, it does. If you have an old radio or TV, detune it or stop between stations and hear the static, the fuzz. That’s it. That’s the sound of the cosmic microwave background radiation. It’s been there since the beginning of time.
But if the universe has a beginning, will it have an end? And how will it end?
Marc Caplan in the Los Angeles Review of Books:
ONE HUNDRED YEARS AGO, when most American Jews were immigrants from Eastern Europe, nearly every Jew in the United States spoke Yiddish, but no one gave it any respect. Today, by contrast, everyone is full of affection for Yiddish, even though almost no one speaks it. Though one hears from every synagogue pulpit and reads in most university Jewish Studies mission statements that Hebrew is the eternal and unifying language of the Jewish experience, Yiddish maintains an emotional claim on the descendants of Eastern European Jews, as well as leaving an indelible imprint on the popular culture created by, for, and among these immigrants and their offspring. Is this valorization of Yiddish commensurate with knowledge and appreciation of — or respect for — the language and the culture it created beyond the lexicon of sentimental melodies, off-color jokes, and redefined adjectives? One could gesture to the 2020 Seth Rogen film An American Pickle without having to answer the question further. Emotional relationships can often lead in nonrational directions, seldom directed by facts.
Toni Morrison has cautioned all Americans that no haunting can ever be entirely benign. And to the extent that Yiddish has changed American culture — as Ilan Stavans and Josh Lambert assert in the title of their readable and teachable new anthology — it is as a haunting, a ghostly reminder of deceased ancestors, defunct aspirations, and lost causes.
Frédérique de Vignemont in Aeon:
Heini Hediger, a noted 20th-century Swiss biologist and zoo director, knew that animals ran away when they felt unsafe. But when he set about designing and building zoos himself, he realised he needed a more precise understanding of how animals behaved when put in proximity to one another. Hediger decided to investigate the flight response systematically, something that no one had done before.
Hediger found that the space around an animal could be partitioned into zones, nested within one another, and measurable down to a matter of centimetres. The outermost circle is what’s known as flight distance: if a lion is far enough away, a zebra will continue to graze warily, but any closer than that, the zebra will try to escape. Closer still is the defence distance: pass that line and the zebra attacks rather than fleeing. Finally, there’s the critical distance: if the predator is too close, there’s nothing to do but freeze, play dead and hope for the best. While different species of wild animals have different limits, Hediger discovered that they’re remarkably consistent within a species. He also offered a new definition of a tame animal, as one that no longer treats humans as a significant threat, and so reduces its flight distance for humans to zero. In other words, a tame animal was one to which you could get close enough to touch.
Like all animals, humans also protect themselves from potential threats by keeping them at a distance. Those of us beginning to see friends again after months of pandemic-induced social distancing can feel this at a visceral level, as we balance the desire for contact against a sense of risk. Once we evaluate something as a potential threat – even if that assessment is informed by public policy or expert prescription – there’s a powerful urge to maintain a buffer of space.
Annie Zaidi in Scroll.in:
The beating heart of literature is writers’ engagement with sadness and the conflicts of their time. Many of these conflicts are centred on wealth and access to natural resources: land, water, mineral, forest, stone, sand, clean air. Big money, often with the aid of big media, attempts to shape public opinion about who controls the world, who deserves what, how resources ought to be shared. In a similar vein, traditional hegemonies in India – patriarchy and the caste system – try to control the stories we tell about each other.
Why, then, do some of these powerful groups enable spaces where they can be challenged? Why do they invest in literature or theatre or film festivals where non-hegemonic views are invited? Writers are easy to take down, to put away or, at the very least, politely ignore. Why are we invited, given a platform and asked to comment on contentious issues?
I have struggled with this question for a few years now: what do the wealthy hope to gain?
Sara Harrison in Undark:
THERE’S A WEALTH OF information floating in the air, though we rarely take the time to notice. Olfaction, or the ability to smell, may be the least appreciated of the five senses. A 2011 poll by the marketing firm McCann Worldgroup, for instance, found that 53 percent of young people would prefer to give up their sense of smell rather than their use of technology.
But that was before the Covid-19 pandemic suddenly made us acutely aware of the dangers in the air around us: the droplets expelled from unmasked mouths and noses, the potentially infectious soup of molecules in unventilated, indoor spaces. And before anosmia, or the loss of smell, emerged as one of the most common Covid-19 symptoms. So perhaps it’s time we pay closer attention to what else is in the air.
As Harold McGee shows in “Nose Dive: A Field Guide to the World’s Smells,” olfaction is a fascinating landscape that adds much to our sensory experience of the world, if only we would breathe a bit deeper. He devotes some 600 pages to the vast and exciting “osmocosm,” his term for the odors that swirl around us every day, even if we don’t notice them.
Rasha Aridi in Smithsonian:
The Covid-19 pandemic has made the world feel lonelier than ever as people have been shut away in their homes, aching to gather with their loved ones again. This instinct to evade loneliness is deeply ingrained in our brains, and a new study published in the journal Nature Neuroscience suggests that our longing for social interaction elicits a neurological response similar to that of a hungry person craving food, reports Ali Pattillo for Inverse. Livia Tomova, a cognitive neuroscientist at the Massachusetts Institute of Technology, and her collaborators conducted a study in which they had a test group of 40 people fast for ten hours. At the end of the day, the hungry subjects were shown images of pizza and chocolate cake while receiving a brain scan, reports Bethany Brookshire for Science News.
In a second round of experimentation, the subjects were barred from social interaction—no in-person or virtual human contact—for ten hours. Afterward, they were shown images of people gathering and playing sports as the team scanned their brains. The scans revealed that the same part of their brains perked up in response to both food and social gatherings, reports Science News.
…”[This study] provides empirical support for the idea that loneliness acts as a signal—just like hunger—that signals to an individual that something is lacking and that it needs to take action to repair that,” Tomova tells Inverse.
At the still point of the turning world. Neither flesh nor fleshless;
Neither from nor towards; at the still point, there the dance is,
But neither arrest nor movement. And do not call it fixity,
Where past and future are gathered. Neither movement from nor towards,
Neither ascent nor decline. Except for the point, the still point,
There would be no dance, and there is only the dance.
I can only say, there we have been: but I cannot say where.
And I cannot say, how long, for that is to place it in time.
from Four Quartets: Burnt Norton
Painting by John Civitello
Jennifer Senior in The New York Times:
More than 40 years ago, three psychologists published a study with the eccentric, mildly seductive title, “Lottery Winners and Accident Victims: Is Happiness Relative?” Even if you don’t think you know what it says, there’s a decent chance you do. It has seeped into TED talks, life-hack segments on morning shows, even the occasional whiff of movie dialogue. The paper is the peanut butter and jelly sandwich of happiness studies, a staple in any curriculum that looks at the psychology of human flourishing.
The study is straightforward. As the title suggests, the authors surveyed lottery winners and accident victims, plus a control group, hoping to compare their levels of happiness. But what the authors found violated common intuition. The victims, while less happy than the controls, still rated themselves above average in happiness, even though their accidents had recently rendered them all either paraplegic or quadriplegic. And the lottery winners were no happier than the controls, at least in any statistically meaningful sense. If anything, the warp and weft of their everyday lives was a little more threadbare. Talking to friends, hearing jokes, having breakfast — all of these simple pleasures now left them less satisfied than before.
There were flaws in the study — its design, alas, was as crude as an ax — but you can see why it became famous. It had an irresistible takeaway: Money! It doesn’t buy you happiness! Perhaps even more fundamentally, it had a sexy, almost absurd, premise. What kind of mind would think to pair lottery winners and accident victims in a research paper? Who in academic psychology had such a cockeyed imagination? It was social science by way of Samuel Beckett.