Save the NEA: One Poet’s Story of How the Arts Build Community

Patricia Traxler in Agni:

Just to give some idea of what killing the NEA will (or, more aptly, will not) accomplish, the $146 million budget of the National Endowment for the Arts represents just 0.012% (about one one-hundredth of one percent) of our federal discretionary spending. According to 2012 NEA figures, the annual budget for the arts per capita (in dollars) in Germany was $19.81; in England, $13.54; in Australia, $8.16; in Canada, $5.19; and in the United States, just $0.47. Yes, 47 cents annually per capita. For all the arts combined. And the new POTUS feels that’s too much.
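
As a quick sanity check on those figures, here is a back-of-the-envelope sketch; the roughly $1.2 trillion discretionary total and the population figure are assumptions of ours, not numbers from Traxler’s piece:

    # Back-of-the-envelope check of the NEA budget share quoted above.
    # ASSUMPTION: ~$1.2 trillion in federal discretionary spending and
    # ~310 million people; neither figure comes from the excerpt itself.
    nea_budget = 146_000_000           # NEA budget, dollars
    discretionary = 1_200_000_000_000  # assumed discretionary total, dollars
    population = 310_000_000           # assumed U.S. population

    print(f"Share of discretionary spending: {nea_budget / discretionary:.4%}")
    # -> Share of discretionary spending: 0.0122%
    print(f"NEA spending per capita: ${nea_budget / population:.2f}")
    # -> NEA spending per capita: $0.47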

It would be impossible to enumerate all the programs that will likely die when the NEA and the NEH are killed, and the many people these cuts will deprive of things like public television programming and National Public Radio; school enrichment programs in the arts; and community programs to encourage music, dance, theater, visual art and literary art, literacy, and the pleasure of reading.

More here.

Music as medicine: how songs could soon replace painkillers and help you sleep better

From Wired:

In September 2013, Marko Ahtisaari resigned from his position as the head of product design at Nokia. The Finnish company had just been acquired by Microsoft and Ahtisaari, the son of a former president of Finland, decided it was time to look for his next startup. He joined the MIT Media Lab shortly after, where he was introduced by Joi Ito, the Lab’s director, to Ketki Karanam, a biologist who was studying how music affects the brain. Ahtisaari was naturally interested: he grew up playing the violin and later studied music composition at Columbia University. “I used to be part of the New York scene,” Ahtisaari says. “I left to do product design and to be an entrepreneur. For 15 years I didn’t play much. I have friends who are now playing with Thom Yorke and the Red Hot Chili Peppers.”

Karanam showed Ahtisaari that there was an increasing body of evidence from imaging studies showing what happens in the brain when it is exposed to music. “It fires very broadly,” Ahtisaari says. “It’s not just the auditory cortex. What happens is essentially similar to when we take psycho-stimulants. In other words, when we take drugs.”

To Ahtisaari, this indicated that music could, at least in principle, complement or even replace the effects that pharmaceuticals had on our neurology. For instance, there were studies that showed that patients with Parkinson’s disease improved their gait when listening to a song with the right beat pattern.
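
The gait studies rest on a simple mechanism: matching a song’s tempo to the patient’s step cadence. Here is a toy sketch of that idea in Python; the function, song library, and BPM values are invented for illustration and are not from the article:

    # Toy illustration of rhythmic auditory stimulation: select songs whose
    # tempo is close to a target step cadence (steps per minute).
    # All songs and numbers here are made up for the example.
    def pick_songs(cadence_spm, songs, tolerance=0.05):
        """Return titles whose BPM is within `tolerance` of the cadence."""
        return [title for title, bpm in songs
                if abs(bpm - cadence_spm) / cadence_spm <= tolerance]

    library = [("Song A", 96), ("Song B", 112), ("Song C", 118), ("Song D", 140)]
    print(pick_songs(115, library))  # -> ['Song B', 'Song C']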

More here. And see more here.

how to write ‘romantic’ biographies

Michael Dirda at The Washington Post:

Holmes’s “This Long Pursuit” is itself a complement to two earlier volumes: “Footsteps: Adventures of a Romantic Biographer” (1985) and “Sidetracks: Explorations of a Romantic Biographer” (2000). All three are, essentially, collections of essays, talks, reminiscences and reviews held together by their author’s description of himself as a “romantic biographer.” That phrase carries multiple meanings: While Holmes’s field is, roughly, England in the age of Coleridge, he sometimes writes about romantic figures of other nations and periods (poet Gérard de Nerval, novelist Robert Louis Stevenson) and he himself clearly possesses an adventurous, romantic spirit.

more here.

capitalism and christianity

Barrett Swanson at Dissent:

Though few contemporary Christians would likely admit it, many of the American colonies were built upon the idea of redistribution. Those dour Puritans who first populated the territories of New England were not lured by the promise of windfall profits. Nor had they endured months of seasickness and disease for the chance to start a small business. Instead, they were hopeless utopians, runaway apostates of the established church who yearned to embrace a higher manner of being, one founded upon a system of communitarian ethics. John Winthrop, the Puritan governor of the Massachusetts Bay Colony, sketched the tenets of this new society in a sermon called “A Model of Christian Charity,” which he delivered in 1630 while on board a British ship headed across the Atlantic. A gusty ode to American exceptionalism, the homily christened the new continent “The City Upon a Hill,” a metaphor that Ronald Reagan would make a watchword for Republicans some three-hundred-and-fifty years later. But in Winthrop’s eyes what gave the New World its luster were the egalitarian principles of the Protestant gospel, central among them the commitment to redistributing wealth on the basis of individual need. “We must be willing,” Winthrop said, “to abridge ourselves of our superfluities for the sakes of others’ necessities . . . we must bear one another’s burdens.”

It is stupefying to consider how, over the course of four centuries, American Christianity would forsake these humble sentiments for the telegenic hucksterism of preachers like Joel Osteen. This Pentecostal quack with a garish smile doesn’t tout the spiritual benefits of communal interdependence. Nor does he acknowledge the ethical requirements of the Christian social contract. Instead, like so many stewards of the “prosperity gospel,” Osteen thinks individual wealth is a hallmark of Christian virtue and urges his followers to reach inside themselves to unlock their hidden potential.

more here.

orpheus in bulgaria

Allegra Hyde at Threepenny Review:

My husband and I have lived in Bulgaria for six months, lived in this country often confused for other places. “You’ll have to brush up on your French,” said a friend before I left the U.S., believing me bound for Algeria. “Enjoy the northern lights,” said another. Bulgaria is one of the forgotten nations once tucked behind the Iron Curtain, its cities now stocked with crumbling Soviet tenements and silent factories and stray dogs too hungry to bark. In the winter, in Haskovo—the city where I teach English to three hundred hardened teenagers—the air thickens to a gray haze as residents burn brush and scraps of trash to heat their homes. The smoke makes me cough, makes my eyes sting, makes my thoughts turn dark.

Today, though, we have left Haskovo. We have left winter as well. The first spring blossoms are starting to show, forsythia yellowing the countryside. As the road to the Devil’s Throat continues its manic winding route through the Rhodopes, we pass the occasional village of squat red-roofed dwellings, laundry lines strung with colorful underwear like prayer flags. Chickens bustle after bugs. Kids kick soccer balls on smears of new grass.

“21 km,” says a sign.

Even in the presence of spring, I feel nervous. I can’t help imagining the ways we might die on this mountain road, squeezed between cliffs and a squalling river. It’s a bad habit of mine: envisioning worst-case scenarios.

more here.

A.I. VERSUS M.D.

Siddhartha Mukherjee in The New Yorker:

Explanations run shallow and deep. You have a red blister on your finger because you touched a hot iron; you have a red blister on your finger because the burn excited an inflammatory cascade of prostaglandins and cytokines, in a regulated process that we still understand only imperfectly. Knowing why—asking why—is our conduit to every kind of explanation, and explanation, increasingly, is what powers medical advances. Hinton spoke about baseball players and physicists. Diagnosticians, artificial or human, would be the baseball players—proficient but opaque. Medical researchers would be the physicists, as removed from the clinical field as theorists are from the baseball field, but with a desire to know “why.” It’s a convenient division of responsibilities—yet might it represent a loss? “A deep-learning system doesn’t have any explanatory power,” as Hinton put it flatly. A black box cannot investigate cause. Indeed, he said, “the more powerful the deep-learning system becomes, the more opaque it can become. As more features are extracted, the diagnosis becomes increasingly accurate. Why these features were extracted out of millions of other features, however, remains an unanswerable question.” The algorithm can solve a case. It cannot build a case.

Yet in my own field, oncology, I couldn’t help noticing how often advances were made by skilled practitioners who were also curious and penetrating researchers. Indeed, for the past few decades, ambitious doctors have strived to be at once baseball players and physicists: they’ve tried to use diagnostic acumen to understand the pathophysiology of disease. Why does an asymmetrical border of a skin lesion predict a melanoma? Why do some melanomas regress spontaneously, and why do patches of white skin appear in some of these cases? As it happens, this observation, made by diagnosticians in the clinic, was eventually linked to the creation of some of the most potent immunological medicines used clinically today. (The whitening skin, it turned out, was the result of an immune reaction that was also turning against the melanoma.) The chain of discovery can begin in the clinic. If more and more clinical practice were relegated to increasingly opaque learning machines, if the daily, spontaneous intimacy between implicit and explicit forms of knowledge—knowing how, knowing that, knowing why—began to fade, is it possible that we’d get better at doing what we do but less able to reconceive what we ought to be doing, to think outside the algorithmic black box?

More here.

Remember why we work on cancer

Levi Garraway in Nature:

I first realized I'd been bitten by the science bug in the summer of 1987. I was walking home from the laboratory, mulling over an organic chemistry reaction that I had been attempting — and mostly failing — to execute. Suddenly, a notion coalesced in my 19-year-old brain: all human biology and disease must ultimately come down to reactions that either proceed properly or go awry. As I savoured the evening breeze, I knew that I wanted to dedicate my career to understanding these mechanisms and thereby to hasten new treatments.

Nearly every scientist remembers moments like these. I am saddened, therefore, by the cynical view that has become increasingly common in both academia and industry: that much biomedical science, even — or perhaps especially — that which appears in 'high-profile' journals, is bogus. I am one of many scientists who have seen their past research subjected to unexpected scrutiny as a result. An attempt to replicate work from my team was among the first described by the Reproducibility Project: Cancer Biology, an initiative that independently repeated experiments from high-impact papers. In this case, as an editorial that surveyed the first replications explained, differences between how control cells behaved in the two sets of experiments made comparisons uninformative [1]. The replicators' carefully conducted experiment showed just how tough it can be to reproduce results.

…We scientists search tenaciously for information about how nature works through reason and experimentation. Who can deny the magnitude of knowledge we have gleaned, its acceleration over time, and its expanding positive impact on society? Of course, some data and models are fragile, and our understanding remains punctuated by false premises. Holding fast to the three Rs ensures that the path — although tortuous and treacherous at times — remains well lit.

More here.

Islamic Foreshadowing of Evolution

Paul Braterman at Muslim Heritage:

We know what comes next, and they don’t. Bear that in mind whenever you see scholars commenting on the significance, in the context of today’s science, of thinkers who died centuries ago. To do them justice, we need to see the world through their eyes, not ours. But we too are people of our own time, and if we are looking for the origins of the concepts that concern us today, we would do well to start off by clarifying those concepts.

And so, in this article, I will summarise the key elements of the modern science of evolution, and the reasons why the evidence in its favour is generally regarded among scientists as conclusive, before turning to my main theme, which is the extent to which Muslim scholars anticipated key aspects of the modern theory. But remember that the aim is to understand their thinking in the context of their own time, rather than in the light of later knowledge.

I conclude that they made important contributions, and that one scholar (al-Jahiz) even made the crucial step of realising that one species can evolve into another, and that what are now distinct species share a common ancestor. However, this is still a long way from recognising that such change is universal, or that even highly dissimilar species share a common ancestor, or that these facts are significant.

More here.

Sexual selection, in humans and in lizards

Ambika Kamath writes:

Over the last few months, there’s been a slow-boiling battle underway between Holly Dunsworth and Jerry Coyne about the evolution of sexual dimorphism in humans, surrounding the question of why male and female humans, on average, differ in size. The battlefield ranged from blogposts to twitter to magazine articles. In a nutshell, Coyne argued that “sexual dimorphism for body size (difference between men and women) in humans is most likely explained by sexual selection” because “males compete for females, and greater size and strength give males an advantage.” His whole argument was motivated by this notion that certain Leftists ignore facts about the biology of sex differences because of their ideological fears, and are therefore being unscientific.

Dunsworth’s response to Coyne’s position was that “it’s not that Jerry Coyne’s facts aren’t necessarily facts, or whatever. It’s that this point of view is too simple and is obviously biased toward some stories, ignoring others. And this particular one he shares…has been the same old story for a long long time.” Dunsworth went on to propose, seemingly off the cuff, alternative hypotheses for sexual dimorphism in body size in humans that were focussed not on men but on women, as examples of the kind of hypothesis that is relatively rarely considered or tested in this field.

Though on the surface this battle may seem to be about specific biological facts (Coyne certainly tries to win by treating it that way), in reality this disagreement is, as Dunsworth argues, about the process by which hypotheses are tested and about how knowledge comes into existence. About which hypotheses are considered for testing in the first place.

More here.

Long-Sought Research Deregulation Is Upon Us

Richard A. Shweder and Richard E. Nisbett in the Chronicle of Higher Education:

It has been a 40-year labor: Regulatory systems are not easy to undo. Nevertheless, in January the federal government opened the door for universities to deregulate vast portions of research in the social sciences, law, and the humanities. This long-sought and welcome reform of the regulations requiring administrative oversight of federally funded human-subject research on college campuses limits the scope of institutional review board, or IRB, management by exempting low-risk research with human subjects from the board’s review.

The new regulations state: "We acknowledge that guidance may be useful for interpreting some of the terms in this exemption, and that some cases will be debatable. However, we also believe that a substantial number of research activities will plainly fit this exemption, and should be allowed to proceed without IRB review."

The exempted research activities include surveys, interviews, and other forms of free communication between researchers and human adults; aptitude testing; the observation and recording of verbal and nonverbal behavior in schools and public places (for example, courtrooms); benign behavioral interventions (including ordinary psychology experiments); secondary-data analysis; and other low-risk projects and research procedures.

More here.

ARE LIBERALS ON THE WRONG SIDE OF HISTORY?

Adam Gopnik in The New Yorker:

Of all the prejudices of pundits, presentism is the strongest. It is the assumption that what is happening now is going to keep on happening, without anything happening to stop it. If the West has broken down the Berlin Wall and McDonald’s opens in St. Petersburg, then history is over and Thomas Friedman is content. If, by a margin so small that in a voice vote you would have no idea who won, Brexit happens; or if, by a trick of an antique electoral system designed to give country people more power than city people, a Donald Trump is elected, then pluralist constitutional democracy is finished. The liberal millennium was upon us as the year 2000 dawned; fifteen years later, the autocratic apocalypse is at hand. Thomas Friedman is concerned.

You would think that people who think for a living would pause and reflect that whatever is happening usually does stop happening, and something else happens in its place; a baby who is crying now will stop crying sooner or later. Exhaustion, or a change of mood, or a passing sound, or a bright light, something, always happens next. But for the parents the wait can feel the same as forever, and for many pundits, too, now is the only time worth knowing, for now is when the baby is crying and now is when they’re selling your books.

And so the death-of-liberalism tomes and eulogies are having their day, with the publishers who bet on apocalypse rubbing their hands with pleasure and the ones who gambled on more of the same weeping like, well, babies. Pankaj Mishra, in “Age of Anger” (Farrar, Straus & Giroux), focusses on the failures of what is sometimes called “neoliberalism”—i.e., free-market fundamentalism—and, more broadly, on the failure of liberal élites around the world to address the perpetual problem of identity, the truth that men and women want to be members of a clan or country with values and continuities that stretch beyond merely material opportunity. Joel Mokyr’s “A Culture of Growth” (Princeton) is an attempt to answer the big question: Why did science and technology (and, with them, colonial power) spread west to east in the modern age, instead of the other way around? His book, though drier than the more passionate polemics, nimbly suggests that the postmodern present is powered by the same engines as the early-modern past. In “Homo Deus” (HarperCollins), Yuval Noah Harari offers an elegy for the end of the liberal millennium, which he sees as giving way to post-humanism: the coming of artificial intelligence that may leave us contented and helpless, like the Eloi in H. G. Wells’s “Time Machine.” Certainly, the anti-liberals, or, in Harari’s case, post-humanists, have much the better of the rhetorical energy and polemical brio. They slash and score. The case against the anti-liberals can be put only slowly and with empirical caution. The tortoise is not merely a slow runner but an ugly one. Still, he did win the race.

More here.

Doing the write thing: Angie Thomas

Afua Hirsch in The Guardian:

If a spaceship landed in northern Texas and beamed every adolescent within a 50-mile radius into its desolate interior, the scene would look a lot like what now lies in front of me. It’s difficult to believe there are any teenagers in north Texas not currently forming orderly queues at the Las Colinas conference centre – a formidably angular set of slabs in the Texan wasteland. Yet among the lines of young readers at the North Texas Teen Book Festival, their arms cradling impractical numbers of books, and the row of authors signing on an industrial scale, one woman stands out.

Angie Thomas, one of the youngest writers in the place, is one black face in a sea of white. She’s upbeat, her hair tied with a perky bow, and when a fan says she looks “so pretty” in a top that combines a hood with sheer lace panels, she laughs and says “thank you” in a Mississippi accent whose vowels are so many notes, it’s a beguiling song. She fingers the garment. “My friend called it Thug Life with a feminine twist.”

However you interpret that description, it will mean something different after reading Thomas’s book, the recently released The Hate U Give. She’s a 29-year-old woman from Jackson who has written a novel that is a strident and utterly compelling march into the most sensitive and contentious subjects in America today: race, privilege and the killings of unarmed black people at the hands of the police. And she has done so for the young adult fiction scene – the popular “YA” genre still best known for Harry Potter and the Twilight trilogy. Among these overwhelmingly white adolescents in suburban Texas, the book has completely sold out and will, a few days later, debut at number one on the New York Times bestseller list. It’s a publishing miracle.

The Hate U Give tells the story of Starr, a 16-year-old black girl who lives in inner-city America in a neighbourhood that is poor and black, but goes to school in a suburb that is affluent and white. At home, Starr’s loving and protective parents usher their children into a room they call the “den” not just to watch basketball games, but to shield them from the machine gun fire that frequently erupts on the street outside. One night Starr and her childhood friend Khalil are driving home from a party when they are pulled over by police. Khalil, who is unarmed, is made to get out of the vehicle, and an officer – who later claims he mistook the boy’s hairbrush for a gun – shoots and kills him, traumatising Starr.

More here.

The Wisdom of the Aging Brain

Anil Ananthaswamy in Nautilus:

At the 2010 Cannes Film Festival premiere of You Will Meet A Tall Dark Stranger, director Woody Allen was asked about aging. He replied with his characteristic, straight-faced pessimism. “I find it a lousy deal. There is no advantage in getting older. I’m 74 now. You don’t get smarter, you don’t get wiser … Your back hurts more, you get more indigestion … It’s a bad business, getting old. I’d advise you not to do it if you can avoid it.”

Creaking bones and bad digestion notwithstanding, is that really the only face of aging? Turns out, it’s not. At least for the fortunate few, old age may not be Woody Allenesque; instead old age is when they become compassionate and wise. Yes, wise.

While aging diminishes activity in certain brain regions, there’s tantalizing evidence this may be compensated for by changes in brain regions associated with supportive and social behavior. This shift in brain activity may foster wisdom in some people, a way of being that moves one away from self-centeredness toward emotional equanimity and wider social consciousness. We may even be able to work toward wisdom in old age.

For millennia, philosophers and theologians have been preoccupied with the notion of wisdom (the Greek word philosophia means “love of wisdom”). Centuries before the Greeks got into the act, the religious traditions of India and China, such as Hinduism, Buddhism, and Daoism, were thinking about wisdom, emphasizing the regulation of emotion—or emotional balance—as key to it. Aristotle delineated wisdom into two types. One was the general, god-like, all-knowing wisdom, and the second (more pertinent to us mere mortals) was something called phronesis, or practical wisdom, which is the ability to be discerning about one’s actions, knowing when and why to act in a pragmatic manner. Ideally, such wise actions—whether involving emotion regulation or reasoning—would balance self-interest against the interests of others and those of society in general.

More here.

Sunday Poem

In Broken Images

He is quick, thinking in clear images;
I am slow, thinking in broken images.

He becomes dull, trusting to his clear images;
I become sharp, mistrusting my broken images,

Trusting his images, he assumes their relevance;
Mistrusting my images, I question their relevance.

Assuming their relevance, he assumes the fact,
Questioning their relevance, I question the fact.

When the fact fails him, he questions his senses;
When the fact fails me, I approve my senses.

He continues quick and dull in his clear images;
I continue slow and sharp in my broken images.

He in a new confusion of his understanding;
I in a new understanding of my confusion.

by Robert Graves
from To Read a Poem
Edited by Donald Hall
Harcourt Brace, 1992