Our Obligations to the Other Animals

Thomas Nagel at the NYRB:

Christine Korsgaard is a distinguished philosopher who has taught at Harvard for most of her career. Though not known to the general public, she is eminent within the field for her penetrating and analytically dense writings on ethical theory and her critical interpretations of the works of Immanuel Kant. Now, for the first time, she has written a book about a question that anyone can understand. Fellow Creatures: Our Obligations to the Other Animals is a blend of moral passion and rigorous theoretical argument. Though it is often difficult—not because of any lack of clarity in the writing but because of the intrinsic complexity of the issues—this book provides the opportunity for a wider audience to see how philosophical reflection can enrich the response to a problem that everyone should be concerned about.

Since the publication of Peter Singer’s Animal Liberation in 1975, there has been a notable increase in vegetarianism or veganism as a personal choice by individuals, and in the protection of animals from cruel treatment in factory farms and scientific research, both through law and through public pressure on businesses and institutions. Yet most people are not vegetarians: approximately 9.5 billion animals die annually in food production in the United States, and the carnivores who think about it tend to console themselves with the belief that the cruelties of factory farming are being ameliorated, and that if this is done, there is nothing wrong with killing animals painlessly for food.

more here.



Cloudy With a Chance of War

David Berreby in Nautilus:

“Prof” was the English physicist and mathematician Lewis Fry Richardson, for whom doing science came as naturally as breathing. “It was just the way he looked at the world,” recalls his great-nephew, Lord Julian Hunt. “He was always questioning. Everything was an experiment.” Even at the age of 4, recounts his biographer Oliver Ashford in Prophet or Professor? Life and Work of Lewis Fry Richardson, the young Lewis had been prone to empiricism: Told that putting money in the bank would “make it grow,” he’d buried some coins in a bank of dirt. (Results: Negative.) In 1912, the now-grown Richardson had reacted to news of the Titanic’s sinking by setting out in a rowboat with a horn and an umbrella to test how ships might use directed blasts of noise to detect icebergs in fog. (Onlookers might have shaken their heads, but Richardson later won a patent for the fruit of that day’s work.) Nothing—not fellow scientists’ incomprehension, the distractions of teaching, or even an artillery bombardment—could dissuade him when, as he once put it, “a beautiful theory held me in its thrall.”

…Richardson’s finite-difference work had been too novel and unfamiliar to win him a research post at a major university. But in 1913, it helped get him a plum job: directing a research laboratory for Britain’s Meteorological Office, which hoped Richardson would bring both rigorous thinking and practical lab skills to the search for accurate weather forecasts. Here, with a good salary, a house to himself, and a lab far from any distractions, he would have ample time for research. The following year, however, the Great War arrived. At age 32, with his important research ongoing, Richardson could have kept to his agreeable job. Though his principles would not permit him to serve in the military, he still felt he should take part in the war. “In August 1914,” he later wrote, “I was torn between an intense curiosity to see war at close quarters, an intense objection to killing people, both mixed with ideas of public duty, and doubt as to whether I could endure danger.” Rebuffed when he requested a leave of absence to serve in the ambulance corps, in 1916 he simply quit. A few weeks later, he and his slide rule, notes, and instruments were at the front.

And so for the next few years Richardson’s theories of war and weather advanced in and around the combat zone. Over six weeks in 1916, with a bale of hay for his desk, Richardson patiently solved equation after equation for hundreds of variables. His aim was to demonstrate his method of “weather prediction by numerical processes” by creating a real forecast.
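
Richardson’s “numerical process” amounted to discretizing the governing equations on a grid and stepping them forward in time by hand, one arithmetic operation per cell per step. The sketch below is only a rough illustration of that finite-difference idea, applied to a toy one-dimensional advection problem rather than Richardson’s actual primitive-equation system; every grid size, speed, and time step here is invented for illustration.

```python
# Toy illustration of forecasting by finite differences (not Richardson's
# actual equations): advect a "pressure" bump along a 1D line of grid
# cells, stepping forward in time exactly as a human computer would have
# done by hand, one cell and one time step at a time.

dx = 100.0    # grid spacing in km (illustrative)
dt = 0.5      # time step in hours (illustrative)
wind = 40.0   # constant advection speed in km/h (illustrative)
steps = 24    # number of time steps to integrate

# Initial condition: a uniform field with a single raised bump.
field = [1000.0 + (10.0 if 8 <= i <= 12 else 0.0) for i in range(40)]

for _ in range(steps):
    new_field = field[:]
    for i in range(1, len(field)):
        # Upwind finite difference: each cell's change is estimated from
        # the gradient between it and its upstream neighbour.
        new_field[i] = field[i] - wind * dt / dx * (field[i] - field[i - 1])
    field = new_field

# After 24 steps the bump has drifted downwind by wind * dt * steps km.
print(round(min(field), 2), round(max(field), 2))
```

Richardson’s own hand-computed trial forecast famously came out badly wrong, but the grid-and-timestep approach sketched here is essentially the one modern numerical weather prediction still uses, at vastly higher resolution and on computers rather than by hand.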

More here.

How Don McCullin captured history in the making

Samantha Weinberg in More Intelligent Life:

The horrors of the battlefield are never far away in Tate Britain’s retrospective of Don McCullin’s work: the dead Khmer Rouge soldiers in a crater in Cambodia, Congolese soldiers tormenting freedom fighters in Stanleyville, young Christians on a bombed-out Beirut street, posing like a boy band over the body of a dead Palestinian girl. But McCullin has said again and again that he doesn’t like to be called a war photographer, preferring, simply, “photographer”. He is as interested in the people fighting wars as in the people caught in their rip tide. “Starving Twenty Four Year Old Mother With Child”, taken in Biafra in 1968, shows a woman, so gaunt she appears elderly, trying to feed her baby, who is sucking on empty, wrinkled breasts. Another picture, taken in a psychiatric hospital in Beirut in 1982, shows a child curled up on a mattress, flies settled on his body. He is tied to the metal bedstead with string, to stop him wandering off amid the broken glass. There is no need to see or hear the bombs to understand their effect on the helpless, and the desperation of those who care for them.

I’ve known Don McCullin for many years. He’s soft-spoken, occasionally gruff, but funny, too. He was born into poverty in north London, 83 years ago. His first published photograph, “The Guvnors in their Sunday Suits” (1958), shows some young men he’d been at school with standing in a bombed-out building. When the men were caught up in a fight, during which a policeman was stabbed to death, McCullin sensed an opportunity to sell the photograph to the press. The Observer bought it and, a few years later, after seeing the pictures he had taken of a freshly divided Berlin, offered him a job. It was clear that he had a special eye – and more than that, an empathy that travelled down the lens to his subjects, and was reflected back to his audience.

Over the decades he has wandered the world, from one atrocity to the next, documenting humanity and inhumanity. In between he has turned his lens on Britain: on the poverty of Bradford and London’s East End; the humour of the country at play; the naked beauty of the landscape around his home in Somerset.

More here.

Thursday, March 7, 2019

Seven moral rules found all around the world

From Science Daily:

Anthropologists at the University of Oxford have discovered what they believe to be seven universal moral rules.

The rules are: help your family, help your group, return favors, be brave, defer to superiors, divide resources fairly, and respect others’ property. They were found in a survey of 60 cultures from all around the world.

Previous studies have looked at some of these rules in some places — but none has looked at all of them in a large representative sample of societies. The present study, published in Current Anthropology (volume 60, no. 1) by Oliver Scott Curry, Daniel Austin Mullins, and Harvey Whitehouse, is the largest and most comprehensive cross-cultural survey of morals ever conducted.

More here.

Merchants of Hype

Sabine Hossenfelder in Backreaction:

Once upon a time, the task of scientists was to understand nature. “Merchants of Light,” Francis Bacon called them. They were a community of knowledge-seekers who subjected hypotheses to experimental test, using what we now simply call “the scientific method.” Understanding nature, so the idea went, would both satisfy human curiosity and better our lives.

Today, the task of scientists is no longer to understand nature. Instead, their task is to uphold an illusion of progress by wrapping incremental advances in false promise. Merchants they still are, all right. But now their job is not to bring enlightenment; it is to bring excitement.

Nowhere is this more obvious than with big science initiatives. Quantum computing, personalized medicine, artificial intelligence, simulated brains, mega-scale particle colliders, and everything nano and neuro: While all those fields have a hard scientific core that justifies some investment, the big bulk is empty headlines. Most of the money goes into producing papers whose only purpose is to create an appearance of relevance.

More here.

Modern Monetary Theory Isn’t Helping

Doug Henwood in Jacobin:

Now that policies made famous by Bernie Sanders, like Medicare for All and free college, and newer ones like the Green New Deal, are infiltrating the political mainstream, advocates are always faced with the question: “how would you pay for them?” Although there are good answers to this question that could even be shrunk down to a TV-friendly length and vocabulary, they’re not always forthcoming. Even self-described socialists seem to have a hard time saying the word “taxes.” How lovely would it be if you could just dismiss the question as an irrelevant distraction?

Conveniently, there’s an economic doctrine that allows you to do just that: Modern Monetary Theory (MMT). Newly elected Rep. Alexandria Ocasio-Cortez is at least MMT-curious, and it’s all over Marxist reading groups and Democratic Socialists of America chapters. It’s even seeping into the business press — Bloomberg’s Joe Weisenthal is friendly to the doctrine. James Wilson of the New York Times tweeted recently, “The speed with which young activists on both left and right are migrating toward MMT is going to have a profound effect on US politics in the 2020s and 2030s.”

More here.  [Thanks to Tony Cobitz.]

This Photo of a 7-Year-Old Girl Transformed the Abolition Movement

Maurice Berger in the New York Times:

The daguerreotype shows a 7-year-old girl. Her face is pale, her expression somber. Her elegant plaid dress, trimmed in lace, and the notebook on the cloth-covered table behind her suggest that she comes from a prosperous family.

Though modest, the photograph, taken in Boston in 1855, is actually historic. It shows not a white child but a black girl — Mary Mildred Williams — who was born into slavery. It was an image so compelling to white Americans at the time that it helped transform the abolition movement. Housed in relative obscurity at the Massachusetts Historical Society, the daguerreotype was recently rediscovered by the photographer and scholar Jessie Morgan-Owens while researching her dissertation.

More here.

How Cruelty Made Us Human

Barbara J. King at the TLS:

What was the driving force that made us human, akin to but separate from other apes and our evolutionary cousins such as the Neanderthals? In The Goodness Paradox, the anthropologist Richard Wrangham approvingly quotes Frederick the Great in pointing to “the wild beast” within each man: our nature, he argues, is rooted in an animal violence that morphed over time to become uniquely human. When male human ancestors began to plot together to execute aggressive men in their communities, indeed to carry out such killings through what Wrangham calls “coalitionary proactive aggression”, they were launched towards full humanity.

Proactive aggression is premeditated, a feature that sets it apart from reactive aggression, which is impulsive, a response to some immediate threat. Hot emotion drives reactive aggression: someone insults you and you respond with a swing at their jaw.

more here.

When Mario Vargas Llosa Punched Gabriel García Márquez

Silvana Paternostro at The Paris Review:

It was about eleven or twelve in the morning and I was in my house in Colonia Nápoles, where I had an office, a big house with an editorial office in one part, and in the other part I lived with my girlfriend and my two children. There’s a knock at the door and it’s Gabo and Mercedes. I was very happy and very surprised to see him. Gabo was already a friend of mine, but there are hierarchies in friendships. It was a friendship of guarded proportions. I was a newspaper photographer and he was what he is. Back then I didn’t presume to call him Gabo. Calling him Gabito was for me like calling Cervantes “Miguelito.” For me, he’s Gabriel García Márquez. They came for the photographs. He told me, “I want you to take some pictures of my black eye.” They came to my house because they trust me.

more here.

Is Anti-Intellectualism Ever Good for Democracy?

Waters and Dionne Jr in Dissent:

Donald Trump campaigned for the presidency and continues to govern as a man who is anti-intellectual, as well as anti-fact and anti-truth. “The experts are terrible,” Trump said while discussing foreign policy during the 2016 campaign. “Look at the mess we’re in with all these experts that we have.” But Trump belongs to a long U.S. tradition of skepticism about the role and motivations of intellectuals in political life. And his particularly toxic version of this tradition raises provocative and difficult questions: Are there occasions when anti-intellectualism is defensible or justified? Should we always dismiss charges that intellectuals are out of touch or too protective of established ways of thinking?

In 1963 the historian Richard Hofstadter published Anti-Intellectualism in American Life, in which he traced a recurring mode of thought prevalent, as he saw it, in U.S. religion, business, education, and politics. “There has always been in our national experience a type of mind which elevates hatred to a kind of creed,” he wrote. “[F]or this mind, group hatreds take a place in politics similar to the class struggle in some other modern societies.” On the list of widely hated groups were Masons, abolitionists, Catholics, Mormons, Jews, black Americans, immigrants, international bankers—and intellectuals. Hofstadter’s skepticism of mass opinion—on both the left and the right—came through quite clearly. “[T]he heartland of America,” he wrote, “filled with people who are often fundamentalist in religion, nativist in prejudice, isolationist in foreign policy, and conservative in economics, has constantly rumbled with an underground revolt against all these tormenting manifestations of our modern predicament.” It is not an accident that these words sound familiar in the Trump era. A liberalism that viewed the heartland with skepticism was bound to encourage the heartland to return the favor.

More here.

Thursday Poem

A Glittering

One mourner says if I can just get through this year as if salvation comes in January.

Slow dance of suicides into the earth:

I see no proof there is anything else. I keep my obituary current, but believe that good times are right around the corner

Una grande scultura può rotolare giù per una collina senza rompersi, Michelangelo is believed to have said (though he never did): To determine the essential parts of a sculpture, roll it down a hill. The inessential parts will break off.

That hill, graveyard of the inessential, is discovered by the hopeless and mistaken for the world just before they mistake themselves for David’s white arms.

They are wrong. But to assume oneself essential is also wrong: a conundrum.

To be neither essential nor inessential—not to exist except as the object of someone’s belief, like those good times lying right around the corner—is the only possibility.

Nothing, nobody matters.

And yet the world is full of love . . .

by Sarah Manguso
From Blackbird Magazine, Spring 2005

The Black Death may have transformed medieval societies in sub-Saharan Africa

Lizzie Wade in Science:

In the 14th century, the Black Death swept across Europe, Asia, and North Africa, killing up to 50% of the population in some cities. But archaeologists and historians have assumed that the plague bacterium Yersinia pestis, carried by fleas infesting rodents, didn’t make it across the Sahara Desert. Medieval sub-Saharan Africa’s few written records make no mention of plague, and the region lacks mass graves resembling the “plague pits” of Europe. Nor did European explorers of the 15th and 16th centuries record any sign of the disease, even though outbreaks continued to beset Europe. Now, some researchers point to new evidence from archaeology, history, and genetics to argue that the Black Death likely did sow devastation in medieval sub-Saharan Africa. “It’s entirely possible that [plague] would have headed south,” says Anne Stone, an anthropological geneticist who studies ancient pathogens at Arizona State University (ASU) in Tempe. If proved, the presence of plague would put renewed attention on the medieval trade routes that linked sub-Saharan Africa to other continents. But Stone and others caution that the evidence so far is circumstantial; researchers need ancient DNA from Africa to clinch their case. The new finds, to be presented this week at a conference at the University of Paris, may spur more scientists to search for it.

Plague is endemic in parts of Africa now; most historians have assumed it arrived in the 19th century from India or China. But Gérard Chouin, an archaeologist and historian at the College of William & Mary in Williamsburg, Virginia, and a team leader in the French National Research Agency’s GLOBAFRICA research program, first started to wonder whether plague had a longer history in sub-Saharan Africa while excavating the site of Akrokrowa in Ghana. Founded around 700 C.E., Akrokrowa was a farming community surrounded by an elliptical ditch and high earthen banks, one of dozens of similar “earthwork” settlements in southern Ghana at the time. But sometime in the late 1300s, Akrokrowa and all the other earthwork settlements were abandoned. “There was a deep, structural change in settlement patterns,” Chouin says, just as the Black Death ravaged Eurasia and North Africa. With GLOBAFRICA funding, he has since documented a similar 14th century abandonment of Ife, Nigeria, the homeland of the Yoruba people, although that site was later reoccupied.

More here.

Wednesday, March 6, 2019

Dictators Kill Poets: On Federico García Lorca’s Last Days

Aaron Shulman in Literary Hub:

Before dawn on August 17th, 1936, a man dressed in white pajamas and a blazer stepped out of a car onto the dirt road connecting the towns of Víznar and Alfacar in the foothills outside Granada, Spain. He had thick, arching eyebrows, a widow’s peak sharpened by a tar-black receding hairline, and a slight gut that looked good on his 38-year-old frame.

It was a moonless night and he wasn’t alone under the dark tent of the Andalusian sky. He was escorted by five soldiers, along with three other prisoners: two anarchist bullfighters and a white-haired schoolteacher with a wooden leg. The headlights from the two cars that had delivered them here illuminated the group as they made their way over an embankment onto a nearby field dotted with olive trees. The soldiers carried Astra 900 semiautomatic pistols and German Mauser rifles. By now the four captives knew that they were going to die. The man in the pajamas was the poet Federico García Lorca.

More here.

Robert Berwick & Noam Chomsky: The Siege of Paris

Robert Berwick & Noam Chomsky in The Inference Review:

In 1866, the Linguistic Society of Paris issued a stern injunction: “The Society does not accept any communication concerning either the origin of language or the creation of a universal language.”1 One can easily imagine why. The late eighteenth and early nineteenth centuries, as Giorgio Graffi observed, marked the blossoming of modern comparative linguistics.2 William Jones, a British judge in India, and Jacob Grimm, the author of a collection of morbid German fairy tales, were among the pioneering linguists studying Indo-European languages. They aimed collectively to discover historical connections among languages and to reconstruct their origins in an Indo-European Ursprache. But their work focused on the external features of individual languages, rather than on the origin of language as a cognitive faculty; and it was conducted, as Sylvain Auroux has emphasized, against a backdrop of evolutionary and phylogenetic thought.3 Linguists told themselves many stories about the evolution of language, and so did evolutionary biologists; but stories, as Richard Lewontin rightly notes, are not hypotheses, a term that should be “reserved for assertions that can be tested.”4

The human language faculty is a species-specific property, with no known group differences and little variation. There are no significant analogues or homologues to the human language faculty in other species.

More here.

Ilhan Omar Has a Less Bigoted Position on Israel Than Almost All of Her Colleagues

Eric Levitz in New York Magazine:

It should be “okay” for Americans who want their country to have a close alliance with a foreign power to form political organizations that advance their views. The problem with AIPAC is not that it pushes American lawmakers to show deference to the interests of another country. The problem is that it pushes them to show deference to a country that practices de facto apartheid rule in much of the territory it controls. If there were a lobby pushing Congress to put the humanitarian needs of Bangladesh over the immediate economic interests of Americans — by imposing a steep carbon tax and drastically increasing foreign aid to that low-lying nation — would the left decry the idea that such lobbying was “okay”? Of course not. Because progressives aren’t hypernationalists. And I don’t think Omar is either. So she shouldn’t frame her opposition to the Israel lobby in nationalist terms. The problem isn’t Congress’s “allegiance to a foreign country,” but its complicity in Jewish supremacy in the West Bank, an inhuman blockade in Gaza, and discrimination against Arab-Israelis in Israel proper.

More here.

Islam and its Past: Jahiliyya, Late Antiquity, and the Qur’an

Aziz Al-Azmeh at The Marginalia Review of Books:

Current scholarship on Paleo-Islam was strongly marked by the publication in 1977 of Hagarism: The Making of the Islamic World by Patricia Crone and Michael Cook. This domain of history had then for decades attracted little consolidated academic energy, its themes generally considered plain and uncomplicated, corresponding to a grand narrative pervading classical Arabic sources. Hagarism proposed that Arabic historical sources should be disregarded, that other sources, including an Armenian chronicle, should be preferred, and that at its inception Islam was really a Jewish sectarian movement. The book’s cognitive harvest was scant, but its warning against uncritical reliance on classical Arabic sources, and its narrative revisionism, were carried forth by a sprightliness altogether uncommon in Islamic studies. Hagarism’s unfledged source-critical skepticism, and the consequences drawn from it, came together to define the major commonplaces of what has rapidly come to be regarded as the mainstream of Paleo-Islam studies since. A default setting of hyper-skepticism congealed rapidly into an academic orthodoxy that came to project an air of assurance, self-evidence, and effortless repeatability. This setting was reinforced by the relative institutional isolation of the field from the broader reaches of the historical sciences, by inbred, tribal habits of reading, and by dedication to in-house issues and concerns.

more here.

Newton the Alchemist

Dmitri Levitin at Literary Review:

The discovery of Newton’s alchemical manuscripts – containing no fewer than a million words, some of the pages mutilated by the acids used during his quest for the philosopher’s stone – led to a flurry of scholarly activity. This culminated in the 1980s in the work of Richard Westfall, still Newton’s greatest biographer, and Betty Jo Teeter Dobbs. In a spectacular rejection of Butterfield’s dismissiveness, they argued that alchemy underpinned Newton’s whole world-view. Newton’s belief in transmutation, Dobbs claimed, was akin to a religious quest, with the ‘philosophic mercury’, believed to be able to break down metals into their constituent parts, acting as a spirit mediating between the physical and divine realms. Westfall suggested that it was assumptions born in Newton’s alchemical researches about invisible forces acting at a distance that allowed him to develop his greatest theory: that of universal gravitation, which he announced to the world in his Principia of 1687.

more here.

Joan Miró’s Modernism for Everybody

Peter Schjeldahl at The New Yorker:

“Painting,” painted by Joan Miró in 1933, in Barcelona, is a composition of black, red, and white blobby shapes and linear glyphs on a ground of bleeding and blending greens and browns. It hangs in “Joan Miró: Birth of the World,” an enchanting show at the Museum of Modern Art that draws on the museum’s immense holdings of Miró’s work, along with a few loans. “Painting” is a bit sombre, for him, but it has the ineffably friendly air of nearly all his art: adventurous but easy-looking, an eager gift to vision and imagination. It invokes a word inevitably applied to Miró: “poetic,” redolent of the magic, residual in us, of childhood rhymes, with or without figurative elements. Never unsettling in the ways of, say, Matisse or, for heaven’s sake, Picasso, Miró is a modernist for everybody. (He died in 1983, at the age of ninety.) This has given him a peculiar trajectory in the modern-art canon: he was considered majestic at points in the past, in ways that feel somewhat flimsy now. Looking at “Painting” helps me think about the art world’s shifting estimation of the “international Catalan,” as Miró termed himself. It stirs a personal memory.

more here.