The Social Aspects of Illness

by Joan Harvey

Like most people of a certain age, at any one time I have the unfortunate experience of knowing several people, some close, some not, who have cancer. It has become standard for the friend or spouse of the ill person to join one of the many message boards devoted to the subject and post updates to keep their friends and relatives informed. Others use Facebook to share information. Currently there are three people whose lives I follow, mostly from a distance, all with serious forms of cancer: one newly diagnosed but already metastasized, two others who have been fighting for months and months.

I began to think more about the social aspects of illness when today’s usual protocol was not followed. It’s a truism to say we often notice things more in their absence, and it was the lack of shared information when someone I cared for was dying that made me aware of how accustomed we are to being included in the progress (or lack thereof) of the cancer patient and how we come to depend on receiving a steady stream of updates on the ill. When this doesn’t happen and we aren’t notified of all the steps toward healing or death along the way, we feel cut off, and this can create a sense of deprivation. Read more »



World Enough and Time

by Samia Altaf

“Let’s go look at the flowers outside,” I say, as I sense Dad sinking into the recesses of his fading memory. I wheel him out. Look at the petunias. What a riot—purple, pink, white, that ordinary pedestrian flower in such abundant glory. I hold a bunch to his nose and he takes a deep breath. “Wow,” he says, and opens his eyes. The misty and faraway look hits me hard. It is like looking inside a bombed-out building that has few windows left intact and very little light. But he tries, never having been one to give up; he blinks shortsightedly at the greenery, the flowers, and the blue sky, and shakes his head at the wonder of it all—and of him being there in the middle of it. He struggles to say something but gives up halfway—words too have faded. “Wow,” he says again.

Dad has steadily and imperceptibly lost all memory to Alzheimer’s. Memories of his children, friends, family, his wife now gone. Only shards of crystallized knowledge—encoded so deeply—remain beyond the cognitive deterioration. He recognizes the simple beauty of flowers; they are still real. We look at the yellow rose bush, the petunias, and the marigolds on this lovely spring evening. His face lights up. “Wow,” he says, over and over. And then: “I have these in my house” and “when will I go there?” he asks, a complete sentence, and looks anxiously at me, his brow scrunched with a look fit to break your heart. That he remembers—his house and the feeling of wanting to go there. One he had to leave when he and Mom got to be too sick to live on their own. Do memories plague his ears, like flies?

“Do memories plague their ears like flies?

They shake their heads. Dusk brims the shadows.

Summer by summer all stole away,

The starting gates, the crowds and cries.”

(Philip Larkin)

Read more »

The Enlightened Luddite

“Luddite” is a word that is thrown around a lot these days. It signifies someone who is opposed to technological progress, or who is at least not climbing on board the technological bandwagon. Twenty-first-century luddites tend to eschew social media, prefer presentations without PowerPoint, still write cheques, and may even, in extreme cases, get by without a cell phone. When used in the first person, “luddite” is often a badge of honour. “I’m a bit of a luddite” usually means “I see through and am unimpressed by the false promise of constant technological novelty.” Used in the third person, though, it typically suggests criticism. “So-and-so’s a bit of a luddite” is likely to imply that So-and-so finds the latest technology confusing and has failed to keep up with it, probably due to intellectual limitations.

The original Luddites were English textile workers in the Midlands and the North who organized themselves in secret and destroyed machines in the newly established factories and mills. They were called “Luddites” after their fictitious leader, “General Ludd.” Peak Luddism occurred around 1812–13, although there were incidents of machine-breaking before and after then. The most notable of these were the Swing riots that swept across parts of Southern England in 1830, when impoverished agricultural labourers, sometimes carrying out threats issued by the fictitious Captain Swing, smashed threshing machines, burned barns, and demanded higher wages.

A common view today of these machine-breakers in the early stages of the industrial revolution is that they represent a clearly futile attempt to block technological innovation. One can sympathize with their plight, of course. Weavers who were proud of skills they had spent years mastering could now be replaced by relatively unskilled workers operating the new machines. Agricultural labourers who counted on threshing with hand flails for several months of employment following the harvest now faced winters without work or wages. But the times they were a-changing. Quite simply, the new methods of production could produce much more in less time and at lower cost. Trying to halt progress of this sort is as pointless as ordering back the tide. Resistance is futile.

But this view of the people who smashed stocking looms and threshing machines is oversimplified and somewhat unjust. It also carries some questionable ideological freight. Let me explain. Read more »

Sparrows in Winter

by Liam Heneghan

The traffic had been slow all day, but by four p.m. it was reduced to a trickle. Those cars that passed him on the street did so in twos and threes, as if they were sticking together for safety like lumbering animals caught out in a storm. It was, in fact, a very harsh winter day. The afternoon temperatures dipped well below zero: one of the coldest days ever recorded in Chicago. The only sounds now were from an occasional plane passing overhead, and from the distant cackling of those venturesome neighbors who had left snug homes to experience the cold. He could hear the sound of his feet crunching through the snow.

The husband and wife had left the house to drop off a bag of groceries at a nearby homeless shelter. Peanut butter, cans of peaches, mandarin oranges in a light syrup, crackers, nuts, and so on: things that people accumulate in abundance for emergencies that rarely materialize. He stepped out of the car to drop off the goods and walked down the short flight of stairs into a reception area in the basement of the church. The volunteer welcomed him with gusto; she said “Welcome,” rather grandly, and looked at him expectantly. She seemed confused when he extended his shopping bag. “Donations,” he said. “Ah…!” she replied as she realized that he would not be staying. Another volunteer elaborately produced a receipt, and he was required to jot down the list of items he was leaving behind. He found that he could not remember all the items, and made a few of them up.

As he left the shelter, he suppressed a smile and once back in the car he and his wife laughed at the confusion as she drove them away. It seemed like the sort of thing that always happened to him. The sort of thing that would also happen to his father, he reflected. After a few minutes, they fell silent, both ruminating, no doubt, on those now suffering without shelter. The husband abruptly declared he wanted to walk home and got out of the car. “I am looking more and more like my father,” thought the husband. The wife drove on. Read more »

God wanted me to write this column

by Joseph Shieber

One of the biggest early 20th century philosophical challenges to the belief in God stemmed from the doctrine of verificationism.

Roughly, according to verificationism, a claim has meaning just in case it is possible to verify the claim — either through empirical evidence or logical proof.

Now here’s the problem for a claim that expresses belief in God. There is no empirical evidence that can prove the existence of God, nor is it possible, through logic alone, to prove God’s existence. So, according to verificationism, a claim like “God exists” is quite literally meaningless. It’s just nonsense. Though grammatical, it’s no more meaningful than “Colorless green ideas sleep furiously”.

What made me think of verificationism and theism recently is the fact that, in the last couple of weeks (and, indeed, years), a number of religious people have claimed that God had a particular interest in the outcome of the 2016 United States Presidential election.

Now it seems to me that we can show that such claims are pretty obviously nonsensical. And it seems to me that we can do this without accepting anything as strong as the verificationist thesis. In fact, we can do this even according to the standards of a committed religious believer. Read more »

The Borrowed and the New: American Wine and French Tradition

by Dwight Furrow

The wine world is an interesting amalgam of stability and variation. As I noted last month, agency in the wine community is dispersed, with many independent actors having some influence on wine quality. This dispersed community is held together by conventions and traditions that foster the reproduction of wine styles and maintain quality standards. Most major wine styles are embedded in traditions that go back hundreds of years and are still vibrant today. Although the genetic instability of grapes and their sensitivity to minor changes in weather, soil, and topography are agents of change, most of these changes are minor variations within a context of stability. We create new varietals, discover new wine regions, and develop new technologies and methods, but these produce minor deviations from a core concept that sometimes seems immune to radical change. There are, after all, only so many ways to ferment grape juice. Red and white still wine, sparkling wine, and fortified wine have been around for centuries and are still the main wine styles on offer. Every wine we drink is a modification of those major themes.

Nevertheless, sometimes wine styles change, often massively. In a community so bound by tradition, how does that change take place? One example of a massive change in taste took place in the U.S. in the decades following WWII, when, in the course of about twenty years, American wine consumers changed their preference from sweet wine to dry. How did such a revolution in taste occur in such a relatively short period of time? Read more »

Sunday, February 10, 2019

Reading in the Age of Constant Distraction

Mairead Small Staid in The Paris Review:

“I read books to read myself,” Sven Birkerts wrote in The Gutenberg Elegies: The Fate of Reading in an Electronic Age. Birkerts’s book, which turns twenty-five this year, is composed of fifteen essays on reading, the self, the convergence of the two, and the ways both are threatened by the encroachment of modern technology. As the culture around him underwent the sea change of the internet’s arrival, Birkerts feared that qualities long safeguarded and elevated by print were in danger of erosion: among them privacy, the valuation of individual consciousness, and an awareness of history—not merely the facts of it, but a sense of its continuity, of our place among the centuries and cosmos. “Literature holds meaning not as a content that can be abstracted and summarized, but as experience,” he wrote. “It is a participatory arena. Through the process of reading we slip out of our customary time orientation, marked by distractedness and surficiality, into the realm of duration.”

Writing in 1994, Birkerts worried that distractedness and surficiality would win out. The “duration state” we enter through a turned page would be lost in a world of increasing speed and relentless connectivity, and with it our ability to make meaning out of narratives, both fictional and lived. The diminishment of literature—of sustained reading, of writing as the product of a single focused mind—would diminish the self in turn, rendering us less and less able to grasp both the breadth of our world and the depth of our own consciousness. For Birkerts, as for many a reader, the thought of such a loss devastates.

More here.

The Special Sleep That Kicks In During a Sickness

Ed Yong in The Atlantic:

In 1909, the Japanese scientist Kuniomi Ishimori collected spinal fluid from sleep-deprived dogs and injected it into active, rested pooches. Within hours, the latter fell into a deep sleep. By coincidence, a pair of French researchers did the same experiments a few years later and got the same results. These studies, and others like them, suggested that the blood of sleepy animals contains some kind of soporific secret sauce of chemicals. Ishimori called these “hypnogenic substances.” Others labeled them “somnogens.”

The sources of these sleep-inducing chemicals have proved surprisingly elusive, and scientists have found only a few that fit the bill. Now Hirofumi Toda from the University of Pennsylvania has discovered another—a gene called nemuri that triggers sleep, at least in fruit flies. Unexpectedly, it also becomes active during infections and acts to kill incoming microbes. It seems to be part of a self-regulating system, analogous enough to an internal thermostat that we might call it a sleep-o-stat. It can send animals to sleep when they most need shut-eye, whether because they’re sick or because they just haven’t slept enough.

This sleep-o-stat works separately from the daily body clocks that make us feel more tired at night.

More here.

‘There Is No Such Thing as Emancipatory Technology’: Marxist Scholar David Harvey

Jipson John and Jitheesh P.M. in The Wire:

British scholar David Harvey is one of the most renowned Marxist scholars in the world today. His course on Karl Marx’s Capital is highly popular and has even been turned into a series on YouTube. Harvey is known for his support of student activism, community and labour movements.

In an interview with The Wire, he talks about the problems arising out of the neo-liberal project and the resulting surge of populist politics and right-wing movements. He also talks about the relevance of Marx’s critique of capitalism in the present context and the threat to labour from automation.

Could you trace the origin of neo-liberalism? What were the structural reasons for its emergence?

The idealist interpretation of liberalism rests on a utopian vision of a world of individual freedom and liberty for all guaranteed by an economy based on private property rights, self-regulating free markets and free trade, designed to foster technological progress and rising labour productivity to satisfy the wants and needs of all.

More here.

Meet the Guardian of Grammar Who Wants to Help You Be a Better Writer

Sarah Lyall in the New York Times:

With his finely tuned editing ear, Benjamin Dreyer often encounters things so personally horrifying that they register as a kind of torture, the way you might feel if you were an epicure and saw someone standing over the sink, slurping mayonnaise directly from the jar.

There is “manoeuvre,” the British spelling of “maneuver,” for example, whose unpleasant extraneous vowels evoke the sound of “a cat coughing up a hairball,” Dreyer says. There is “reside,” with its unnecessary stuffiness. (“You mean ‘live’?”) There is the use of quotation marks after the term “so-called,” as in “the so-called ‘expert,’” which just looks stupid.

And there are the words Dreyer currently dislikes most, even more than he dislikes “munch” and “nosh” and other distasteful eating-adjacent terms. Sitting recently in his book-crammed office at Penguin Random House, where he is vice president, executive managing editor and copy chief for Random House — a division within the larger company — Dreyer scribbled “smelly” and “stinky” on a card and slid it speedily across the desk, as if the card itself was emitting a foul stench. “I can’t say them out loud,” he said.

More here.

“A new Bretton Woods” and the problem of “economic order” — also a reply to Adler and Varoufakis

Adam Tooze at his own website:

Davos 2019 was a downbeat affair. That, at least, is how regulars described it. To a newcomer it did not seem that way. Indeed, that gloomy assessment would seem to call for a reality check. Most thoughtful people are a lot gloomier than the folks you bump into at Davos. The 2019 meeting may not have had heroes to celebrate. The luster may have come off crypto. But if Davos lacked pizzazz, neither was it overshadowed by the sense of fundamental crisis that should surely color any realistic assessment of the current world situation.

It may sound paradoxical, but is it not the case that our sense of what is realistic is as much defined by mood (Stimmung) as it is by specific empirical facts?

In this sense the mood of Davos is profoundly unrealistic. What sets the tone is a Vegas-style festival of corporate PR and self-presentation, adorned with a barrage of exhortatory slogans worthy of a Mao-era pep rally.

More here.

Is Anti-Intellectualism Ever Good for Democracy?

Adam Waters and EJ Dionne, Jr. in Dissent:

Donald Trump campaigned for the presidency and continues to govern as a man who is anti-intellectual, as well as anti-fact and anti-truth. “The experts are terrible,” Trump said while discussing foreign policy during the 2016 campaign. “Look at the mess we’re in with all these experts that we have.” But Trump belongs to a long U.S. tradition of skepticism about the role and motivations of intellectuals in political life. And his particularly toxic version of this tradition raises provocative and difficult questions: Are there occasions when anti-intellectualism is defensible or justified? Should we always dismiss charges that intellectuals are out of touch or too protective of established ways of thinking?

In 1963 the historian Richard Hofstadter published Anti-Intellectualism in American Life, in which he traced a recurring mode of thought prevalent, as he saw it, in U.S. religion, business, education, and politics. “There has always been in our national experience a type of mind which elevates hatred to a kind of creed,” he wrote. “[F]or this mind, group hatreds take a place in politics similar to the class struggle in some other modern societies.” On the list of widely hated groups were Masons, abolitionists, Catholics, Mormons, Jews, black Americans, immigrants, international bankers—and intellectuals.

Hofstadter’s skepticism of mass opinion—on both the left and the right—came through quite clearly. “[T]he heartland of America,” he wrote, “filled with people who are often fundamentalist in religion, nativist in prejudice, isolationist in foreign policy, and conservative in economics, has constantly rumbled with an underground revolt against all these tormenting manifestations of our modern predicament.” It is not an accident that these words sound familiar in the Trump era. A liberalism that viewed the heartland with skepticism was bound to encourage the heartland to return the favor.

More here.

African America’s First Protest Meeting

William L. Katz in BlackPast.Org:

In January 1817 nearly 3,000 African American men met at the Bethel A.M.E. Church (popularly known as Mother Bethel AME) in Philadelphia and denounced the American Colonization Society’s plan to resettle free blacks in West Africa.  This gathering was the first black mass protest meeting in the United States. The black leaders who summoned the men to the church endorsed the ACS scheme and fully expected the black men who gathered there to follow their leadership.  Instead they rejected the scheme and forced the black leaders to embrace their position.

The genesis of this remarkable meeting can be found in the early efforts of Captain Paul Cuffee, a wealthy black New Bedford ship owner.  After an 1811 visit to the British colony of Sierra Leone, which had been established to receive free blacks and later recaptives — blacks freed by the British Navy from slaving vessels taking them from Africa to the New World — Cuffee envisioned a similar colony for African Americans somewhere in West Africa. To promote his vision, Cuffee, at great financial loss, brought 38 black volunteer settlers from the United States to Sierra Leone in 1815.

When he returned to the United States with news of the resettlement, he persuaded many of the most influential black leaders of the era to support him in transporting larger numbers of African Americans to a yet-to-be-determined colony on the West African coast.  Rev. Peter Williams of New York, Rev. Daniel Coker of Baltimore, and James Forten, Bishop Richard Allen, and Rev. Absalom Jones, all of Philadelphia, who were the most important African American leaders in the United States at that time, enthusiastically endorsed the plan.

Given the dire situation faced by free blacks in the North during that era, their endorsement was understandable. The vote was denied to all but a handful of black men.  No black women could vote at the time.  Black workers were limited in the occupations they could pursue and the wages they could receive.  In Philadelphia, New York, Boston, and other Northern cities they were subject to attack by white mobs and to the ever-present threat of slave catchers who kidnapped fugitive slaves and, on occasion, those who had never been enslaved. Thus, to many leaders in the “free” North, colonization seemed an attractive option.

More here. (Note: Throughout February, we will publish at least one post dedicated to Black History Month)

Saturday, February 9, 2019

A special class of vivid, textural words defies linguistic theory: could ‘ideophones’ unlock the secrets of humans’ first utterances?

David Robson in Aeon:

If you don’t speak Japanese but would like, momentarily, to feel like a linguistic genius, take a look at the following words. Try to guess their meaning from the two available options:

1. nurunuru (a) dry or (b) slimy?
2. pikapika (a) bright or (b) dark?
3. wakuwaku (a) excited or (b) bored?
4. iraira (a) happy or (b) angry?
5. guzuguzu (a) moving quickly or (b) moving slowly?
6. kurukuru (a) spinning around or (b) moving up and down?
7. kosokoso (a) walking quietly or (b) walking loudly?
8. gochagocha (a) tidy or (b) messy?
9. garagara (a) crowded or (b) empty?
10. tsurutsuru (a) smooth or (b) rough?

The answers are: 1(b); 2(a); 3(a); 4(b); 5(b); 6(a); 7(a); 8(b); 9(b); 10(a).

If you think this exercise is futile, you’re in tune with traditional linguistic thinking. One of the founding axioms of linguistic theory, articulated by Ferdinand de Saussure in the early 20th century, is that any particular linguistic sign – a sound, a mark on the page, a gesture – is arbitrary, and dictated solely by social convention. Save those rare exceptions such as onomatopoeias, where a word mimics a noise – eg, ‘cuckoo’, ‘achoo’ or ‘cock-a-doodle-doo’ – there should be no inherent link between the way a word sounds and the concept it represents; unless we have been socialised to think so, nurunuru shouldn’t feel any more ‘slimy’ than it feels ‘dry’.

Yet many world languages contain a separate set of words that defies this principle. Known as ideophones, they are considered to be especially vivid and evocative of sensual experiences.

More here.

The 500-Year-Long Science Experiment

Sarah Zhang in The Atlantic:

In the year 2514, some future scientist will arrive at the University of Edinburgh (assuming the university still exists), open a wooden box (assuming the box has not been lost), and break apart a set of glass vials in order to grow the 500-year-old dried bacteria inside. This all assumes the entire experiment has not been forgotten, the instructions have not been garbled, and science—or some version of it—still exists in 2514.

By then, the scientists who dreamed up this 500-year experiment—Charles Cockell at the University of Edinburgh and his German and U.S. collaborators—will be long dead. They’ll never know the answers to the questions that intrigued them back in 2014, about the longevity of bacteria. Cockell had once forgotten about a dried petri dish of Chroococcidiopsis for 10 years, only to find the cells were still viable. Scientists have revived bacteria from 118-year-old cans of meat and, more controversially, from amber and salt crystals millions of years old.

All this suggests, according to Ralf Möller, a microbiologist at the German Aerospace Center and collaborator on the experiment, that “life on our planet is not limited by human standards.” Understanding what that means requires work that goes well beyond the human life span.

More here.