The climate of history: Four theses


Dipesh Chakrabarty in Eurozine:

The current planetary crisis of climate change or global warming elicits a variety of responses in individuals, groups, and governments, ranging from denial, disconnect, and indifference to a spirit of engagement and activism of varying kinds and degrees. These responses saturate our sense of the now. Alan Weisman's best-selling book The World without Us suggests a thought experiment as a way of experiencing our present: “Suppose that the worst has happened. Human extinction is a fait accompli. […] Picture a world from which we all suddenly vanished. […] Might we have left some faint, enduring mark on the universe? […] Is it possible that, instead of heaving a huge biological sigh of relief, the world without us would miss us?” I am drawn to Weisman's experiment as it tellingly demonstrates how the current crisis can precipitate a sense of the present that disconnects the future from the past by putting such a future beyond the grasp of historical sensibility. The discipline of history exists on the assumption that our past, present, and future are connected by a certain continuity of human experience. We normally envisage the future with the help of the same faculty that allows us to picture the past. Weisman's thought experiment illustrates the historicist paradox that inhabits contemporary moods of anxiety and concern about the finitude of humanity. To go along with Weisman's experiment, we have to insert ourselves into a future “without us” in order to be able to visualize it. Thus, our usual historical practices for visualizing times, past and future, times inaccessible to us personally – the exercise of historical understanding – are thrown into a deep contradiction and confusion. Weisman's experiment indicates how such confusion follows from our contemporary sense of the present insofar as that present gives rise to concerns about our future. Our historical sense of the present, in Weisman's version, has thus become deeply destructive of our general sense of history.

I will return to Weisman's experiment in the last part of this essay. There is much in the debate on climate change that should be of interest to those involved in contemporary discussions about history. For as the idea gains ground that the grave environmental risks of global warming have to do with excessive accumulation in the atmosphere of greenhouse gases produced mainly through the burning of fossil fuel and the industrialized use of animal stock by human beings, certain scientific propositions have come into circulation in the public domain that have profound, even transformative, implications for how we think about human history or about what the historian C. A. Bayly recently called “the birth of the modern world”.

More here. And a response by Timothy J. LeCain.

How I Acted Like A Pundit And Screwed Up On Donald Trump


Nate Silver over at FiveThirtyEight:

Trump is one of the most astonishing stories in American political history. If you really expected the Republican front-runner to be bragging about the size of his anatomy in a debate, or to be spending his first week as the presumptive nominee feuding with the Republican speaker of the House and embroiled in a controversy over a tweet about a taco salad, then more power to you. Since relatively few people predicted Trump’s rise, however, I want to think through his nomination while trying to avoid the seduction of hindsight bias. What should we have known about Trump and when should we have known it?

It’s tempting to make a defense along the following lines:

Almost nobody expected Trump’s nomination, and there were good reasons to think it was unlikely. Sometimes unlikely events occur, but data journalists shouldn’t be blamed every time an upset happens, particularly if they have a track record of getting most things right and doing a good job of quantifying uncertainty.

We could emphasize that track record; the methods of data journalism have been highly successful at forecasting elections. That includes quite a bit of success this year. The FiveThirtyEight “polls-only” model has correctly predicted the winner in 52 of 57 (91 percent) primaries and caucuses so far in 2016, and our related “polls-plus” model has gone 51-for-57 (89 percent). Furthermore, the forecasts have been well-calibrated, meaning that upsets have occurred about as often as they’re supposed to but not more often.
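(To make "well-calibrated" concrete: across many forecasts, events given a 70 percent chance should come true roughly 70 percent of the time. Below is a minimal Python sketch of such a check; the forecast probabilities and outcomes are toy values of my own, not FiveThirtyEight data.)

```python
# Toy calibration check for probabilistic forecasts.
# The probabilities and outcomes below are illustrative values,
# not FiveThirtyEight data.

forecasts = [0.95, 0.90, 0.80, 0.75, 0.60, 0.55, 0.30, 0.20, 0.10, 0.05]
outcomes  = [1, 1, 1, 1, 1, 0, 0, 1, 0, 0]  # 1 = the forecast favorite won

def calibration_table(probs, wins, bins=((0.0, 0.5), (0.5, 1.01))):
    """For each probability bin, compare the mean forecast to the
    observed win rate. A well-calibrated forecaster's 70% calls come
    true about 70% of the time: upsets happen as often as predicted,
    but not more often."""
    rows = []
    for lo, hi in bins:
        group = [(p, w) for p, w in zip(probs, wins) if lo <= p < hi]
        if group:
            mean_forecast = sum(p for p, _ in group) / len(group)
            observed_rate = sum(w for _, w in group) / len(group)
            rows.append((lo, hi, len(group), mean_forecast, observed_rate))
    return rows

for lo, hi, n, pred, obs in calibration_table(forecasts, outcomes):
    print(f"bin [{lo:.1f}, {hi:.2f}): n={n}, "
          f"mean forecast={pred:.2f}, observed rate={obs:.2f}")
```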

But I don’t think this defense is complete — at least if we’re talking about FiveThirtyEight’s Trump forecasts. We didn’t just get unlucky: We made a big mistake, along with a couple of marginal ones.

The big mistake is a curious one for a website that focuses on statistics. Unlike virtually every other forecast we publish at FiveThirtyEight — including the primary and caucus projections I just mentioned — our early estimates of Trump’s chances weren’t based on a statistical model. Instead, they were what we sometimes called “subjective odds” — which is to say, educated guesses. In other words, we were basically acting like pundits, but attaching numbers to our estimates. And we succumbed to some of the same biases that pundits often suffer, such as not changing our minds quickly enough in the face of new evidence. Without a model as a fortification, we found ourselves rambling around the countryside like all the other pundit-barbarians, randomly setting fire to things.

More here.

The empty brain


Robert Epstein in Aeon:

No matter how hard they try, brain scientists and cognitive psychologists will never find a copy of Beethoven’s 5th Symphony in the brain – or copies of words, pictures, grammatical rules or any other kinds of environmental stimuli. The human brain isn’t really empty, of course. But it does not contain most of the things people think it does – not even simple things such as ‘memories’.

Our shoddy thinking about the brain has deep historical roots, but the invention of computers in the 1940s got us especially confused. For more than half a century now, psychologists, linguists, neuroscientists and other experts on human behaviour have been asserting that the human brain works like a computer.

To see how vacuous this idea is, consider the brains of babies. Thanks to evolution, human neonates, like the newborns of all other mammalian species, enter the world prepared to interact with it effectively. A baby’s vision is blurry, but it pays special attention to faces, and is quickly able to identify its mother’s. It prefers the sound of voices to non-speech sounds, and can distinguish one basic speech sound from another. We are, without doubt, built to make social connections.

A healthy newborn is also equipped with more than a dozen reflexes – ready-made reactions to certain stimuli that are important for its survival. It turns its head in the direction of something that brushes its cheek and then sucks whatever enters its mouth. It holds its breath when submerged in water. It grasps things placed in its hands so strongly it can nearly support its own weight. Perhaps most important, newborns come equipped with powerful learning mechanisms that allow them to change rapidly so they can interact increasingly effectively with their world, even if that world is unlike the one their distant ancestors faced.

Senses, reflexes and learning mechanisms – this is what we start with, and it is quite a lot, when you think about it. If we lacked any of these capabilities at birth, we would probably have trouble surviving.

But here is what we are not born with: information, data, rules, software, knowledge, lexicons, representations, algorithms, programs, models, memories, images, processors, subroutines, encoders, decoders, symbols, or buffers – design elements that allow digital computers to behave somewhat intelligently. Not only are we not born with such things, we also don’t develop them – ever.

More here.

Is Life Worth Living?

Catherine Hollis at Public Books:

Any number of recent memoirs—most, but not all, by women—face down the question William James posed in his essay “Is Life Worth Living?” Should we go on living, and if so, what will our lives look like? If terrible things have happened to us, is healing possible? Each of the four writers under discussion here—Jessa Crispin, Jacqueline Rose, Rachel Moran, and Sandra Cisneros—confronts these questions to varying degrees. Constructing a life, like constructing a memoir or a biography, involves asking how to live. Any reader confined to her bed by depression may find whole communities of fellow travelers in the ranks of memoirs published during the so-called memoir boom. Best-selling examples of this trend include Elizabeth Gilbert’s Eat, Pray, Love (2006), Jeannette Walls’s The Glass Castle (2006), and Cheryl Strayed’s Wild (2012). Although Walls and Strayed face serious issues including poverty, homelessness, and addiction, Gilbert’s own memoir has been taken to exemplify a genre of narrative we might call “first-world white girl problems.” It’s hard to feel sorry for someone who treats her depression with trips to Italy, India, and Bali, even if we empathize with her grief at ending a marriage.

Other notable, if less famous, entrants in the “Live Through This” genre include three memoirs about surviving difficult parents. Nick Flynn’s Another Bullshit Night in Suck City (2004) is probably the best known, and documents with lyrical honesty his relationship with his homeless father. Domenica Ruta’s With or Without You (2013) is a stunning portrait of a mother-daughter relationship shaped by addiction and violence in an Italian working-class neighborhood near Boston. And Ariel Gore (publisher of Hip Mama) writes tenderly about caregiving and emotional boundaries in The End of Eve (2014), when she finds herself tending to an insane and impossible mother dying from cancer.

more here.

How Photomontage Ended the Lebanese Civil War

Rasha Salti at The Brooklyn Rail:

Officially, the Lebanese Civil War ended when the Taif Agreement was ratified (November 1989), the Lebanese parliament voted to adopt an Amnesty Law (March 1991), and militias were dissolved (May 1991). The Amnesty Law pardoned all political crimes committed prior to its enactment. I, for myself, cannot recall a specific moment or event when the war ended. The transition from the state of war to a state of non-war is blurred in my memory. It was gradual, and even though the country felt relatively safer, tension pervaded the atmosphere and the looming threat of the conflict reigniting remained. The actors of the Civil War—those still alive, as well as the ghosts—transformed from warlords to political leaders, populating the legislative and executive bodies of governance. It was as if we were all handed a new, additional script, and we, mutatis mutandis, all slipped into new roles. The militias became political parties with an overt agenda to strictly serve their sectarian constituencies, even as the militants upheld the claim that they were constitutive protagonists of a republic. And we, perpetrators and victims, everyday folks, claimed to be citizens of this republic. The longest serial ever performed: for the past twenty-five years, we have been voting for these very same leaders to govern our destinies—to ensure our safety, well-being, and prosperity. It has not been smooth sailing; the “old script” surges every once in a while, in relative degrees of intensity. It was and remains necessary to believe that the war is over, and that we all wanted it to end. In the span of twenty-five years, the theatricality deployed to that end has been superlative and tireless, with public showcases of redemption, songs, music videos, and speeches to cheer our resolve to end the war.

more here.

Is this the end of sex?

Philip Ball at The New Statesman:

Is it time to give up sex? Oh, it has plenty to recommend it; but as a way of making babies it leaves an awful lot to chance. I mean, you might have some pretty good genes, but – let’s face it – some of them aren’t so great. Male pattern baldness, phenylketonuria, enhanced risk of breast cancer: I’m not sure you really want those genetic conditions passed on in the haphazard shuffling of chromosomes after sperm meets egg.

It is already possible to avoid more than 250 grave genetic conditions by genetic screening of few-days-old embryos during in vitro fertilisation (IVF), so that embryos free from the genetic mutation responsible can be identified for implantation. But that usually works solely for diseases stemming from a single gene – of which there are many, though most are rare. The procedure is called pre-implantation genetic diagnosis (PGD), and it is generally used only by couples at risk of passing on a particularly nasty genetic disease. Otherwise, why go to all that discomfort, and possibly that expense, when the old-fashioned way of making babies is so simple and (on the whole) fun?

In The End of Sex, Henry Greely, a law professor and bioethicist at Stanford University, argues that this will change. Thanks to advances in reproductive and genetic technologies, he predicts that PGD will become the standard method of conception within a few decades. (Recreational sex might nonetheless persist.)

more here.

The spectrum of sex development: Eric Vilain and the intersex controversy

Sara Reardon in Nature:

As a medical student in Paris in the 1980s, Eric Vilain found himself pondering the differences between men and women. What causes them to develop differently, and what happens when the process goes awry? At the time, he was encountering babies that defied simple classification as a boy or girl. Born with disorders of sex development (DSDs), many had intermediate genitalia — an overlarge clitoris, an undersized penis or features of both sexes. Then, as now, the usual practice was to operate. And the decision of whether a child would be left with male or female genitalia was often made not on scientific evidence, says Vilain, but on practicality: an oft-repeated, if insensitive, line has it that “it's easier to dig a hole than build a pole”. Vilain found the approach disturbing. “I was fascinated and shocked by how the medical team was making decisions.”

Vilain has spent the better part of his career studying the ambiguities of sex. Now a paediatrician and geneticist at the University of California, Los Angeles (UCLA), he is one of the world's foremost experts on the genetic determinants of DSDs. He has worked closely with intersex advocacy groups that campaign for recognition and better medical treatment — a movement that has recently gained momentum. And in 2011, he established a major longitudinal study to track the psychological and medical well-being of hundreds of children with DSDs. Vilain says that he doesn't seek out controversy, but his research seems to attract it. His studies on the genetics of sexual orientation — an area that few others will touch — have attracted criticism from scientists, gay-rights activists and conservative groups alike. He is also a medical adviser for the International Olympic Committee, which about five years ago set controversial rules by which intersex individuals are allowed to compete in women's categories.

More here.

Wednesday Poem

Strange Fruit

Where the plows can’t reach
snow crusts brick tenements in
a black-and-white photograph.
Outside the apartments
streetlamps glow like twin moons,
as if belonging to another solar system,
one where Billie Holiday didn’t die.
Still, the thin blade of her voice
keeps slicing, fragile and honeyed,
transporting me to a closet-sized
chamber redolent with beeswax,
illuminated by a single bare bulb
swinging from its cord.

Rebecca Hart Olander
originally published in Brilliant Corners

Tuesday, May 17, 2016

Designing Time: The Idea of Plot in the Lyric Essay

Tyler Mills in Agni:

What is “plot” in a lyric essay? As I worked on “Home” (AGNI issue 83), I kept thinking about this question. Why? My process involves piecework. I handwrote scenes in a notebook, typed them up, and moved them around.

A half hour here. An hour there. Forty-five minutes in the dark early light of October.

Primarily a poet, I’ve always been tentative about plot. But I’ve always kept a notebook. Words, phrases, scraps of description—these are the things that the plot of the lyric essay must transform. In “On Keeping a Notebook,” Joan Didion writes,

“our notebooks give us away, for however dutifully we record what we see around us, the common denominator of all we see is always, transparently, shamelessly, the implacable ‘I.’… we are talking about something private, about bits of the mind’s string too short to use, an indiscriminate and erratic assemblage with meaning only for its maker.”

The lyric essay must transform our “erratic assemblage,” moving it into meaning like the night sky that turns toward morning. The constellations change positions, and we pick out their patterns from the chaos of darkness. The crisis that spins everything toward the main thing is realization. Realization is what the mind does with these observations. Realization is what the mind does with the world. Realization is the heart of the lyric essay—what makes it move, what makes all of its light-riddled parts hold together.

More here.

This Man Memorized a 60,000-Word Poem Using Deep Encoding

Lois Parshley in Nautilus:

“Of man’s first disobedience, and the fruit of that forbidden tree,” John Basinger said aloud to himself, as he walked on a treadmill. “Of man’s first disobedience…” In 1992, at the age of 58, Basinger decided to memorize Paradise Lost, John Milton’s epic poem, as a form of mental activity while he was working out at the gym. An actor, he’d memorized shorter poems before, and he wanted to see how much of the epic he could remember. “As I finished each book,” he wrote, “I began to perform it and keep it alive in repertory while committing the next to memory.”

The twelve books of Paradise Lost contain over 60,000 words; it took Basinger about 3,000 hours to learn them by rote. He did so by reciting the piece, line-by-line out loud, for about an hour a day for nine years. When he memorized all 12 books, in 2001, Basinger performed the masterpiece in a live recital that lasted three days. Since then, he’s performed smaller sections for various audiences, eventually attracting the attention of John Seamon, a psychologist at Wesleyan University, in Connecticut. In 2008, “He recited for an hour in the Wesleyan library,” says Seamon. “He’d given out copies of Milton’s book so we could follow along. At the end of the talk I introduced myself and said ‘I’d love to study your memory.’” Basinger agreed, and so Seamon devised a test.

Then 74, Basinger came into the lab to perform a series of cued recall tests. Scientists read two successive lines from each of the poem’s 12 books and then asked Basinger to recall the next 10 lines. The results, published in Memory in 2010, were surprising: Despite the amount of elapsed time since his memorization process, Basinger’s recall was, overall, word-perfect 88 percent of the time. When he was prompted with lines that opened one of the 12 books, his accuracy increased to 98 percent.
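(As an aside on method: a cued-recall score like the 88 percent figure above is, at bottom, a count of word-perfect lines over lines attempted. The sketch below shows one way such scoring might be computed; the exact-match rule and the sample recall attempt are my illustrative assumptions, not the protocol published in Memory.)

```python
# Sketch of scoring a cued-recall test: given a cue, the subject recites
# the lines that follow, and each line either matches word for word or
# does not. The exact-match rule here is an illustrative assumption,
# not the published protocol.

def word_perfect(recalled: str, target: str) -> bool:
    """A recalled line counts only if every word matches exactly."""
    return recalled.split() == target.split()

# Opening lines of Paradise Lost, Book I (target), and a recall attempt
# with one wrong word in the third line.
target_lines = [
    "Of man's first disobedience, and the fruit",
    "Of that forbidden tree, whose mortal taste",
    "Brought death into the world, and all our woe,",
]
recalled_lines = [
    "Of man's first disobedience, and the fruit",
    "Of that forbidden tree, whose mortal taste",
    "Brought death into the world, and all our woes,",
]

correct = sum(word_perfect(r, t) for r, t in zip(recalled_lines, target_lines))
print(f"word-perfect lines: {correct}/{len(target_lines)} "
      f"({100 * correct / len(target_lines):.0f}%)")
```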

More here.

The grotesque criminalization of poverty in America

Ryan Cooper in The Week:

If you are arrested for a serious crime, you're supposed to be taken to jail and booked. Then there's some sort of hearing, and if the judge doesn't think you will skip town or commit more crimes, you are either released on your own recognizance, or you post bail, and you are free until a pre-trial hearing. After that, you either go to trial, or plead guilty and accept punishment.

But for a great many people, this is not how it works. As a new report from the Prison Policy Initiative demonstrates, over one-third of people who go through the booking process end up staying in jail simply because they can't raise enough cash to post bail. For millions of Americans in 2016, poverty is effectively a crime.

This flowchart lays out the basic reality for people who get booked. A very small minority (4 percent) are denied bail, while about a quarter are released without bail. Thirty-eight percent manage to make bail, while 34 percent can't scrape together the cash:

[Flowchart: outcomes of arrest and pretrial detention]
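(A quick arithmetic check on the shares quoted above; the figures are as reported, with “about a quarter” entered as 25, so the total lands near 101 percent because of rounding.)

```python
# Pretrial outcome shares as reported above, as percentages of everyone
# booked. "About a quarter" is entered as 25, so the total comes to
# roughly 101 percent because of rounding.
outcomes = {
    "denied bail": 4,
    "released without bail": 25,
    "made bail": 38,
    "could not make bail": 34,
}

for name, pct in outcomes.items():
    print(f"{name:>22}: {pct}%")
print(f"{'total':>22}: {sum(outcomes.values())}%")
```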

People who can't make bail (let's call them “bailed-in”) make little money, with a median pre-jail income of $15,109 — less than half the median income for the general population.

More here.

a personal history of L.A. Punk by John Doe with Tom DeSavia

Camden Joy at Bookforum:

In Los Angeles in the middle of the 1970s several hundred diverse misfits came together and began to collaborate. Some were high school glam-rock enthusiasts, like Belinda Carlisle, Jane Wiedlin, or the boys who became Pat Smear and Darby Crash. Others were older, having traveled farther. From Baltimore came John Doe, from Florida came Exene Cervenka; in California they met and fell in love. Together, and against the world, these few hundred sparked an experiment called LA punk rock—an impulse, some might say, a happening, an underground movement, a rebellion, a cultural revolution. Mention of it now usually stirs memories of mohican haircuts and hardcore music, stage-diving and slam-dancing, but those didn’t come until later. There was an initial punk endeavor in the city that was far different. The charismatic Tomata du Plenty at the front of The Screamers. The wonderfully harebrained choreography of Devo, newly arrived from Ohio. Photographers, cartoonists, poets, painters, and performance artists participated fully, supporting and contributing to a movement that was all about risk, immediacy, rule-breaking, and anti-materialism. Despite how that sounds, the scene was a welcoming one, more Brando and Bettie Page than what was going on in New York and London at the time. This is the moment with which John Doe’s new book Under the Big Black Sun concerns itself, shining a light on a legendary but largely unexamined corner of the West Coast counterculture.

This LA moment ran from 1976 through 1981, and Doe, a founder of the band X, saw much of it firsthand. Under the Big Black Sun—which Doe wrote with Tom DeSavia and includes contributions by a number of other musicians—gathers together a few of the musical and critical celebrities, allotting them each a chapter or, in the case of John Doe, several chapters. Here nostalgic fans of LA punk will learn amazing things: how The Go-Gos and The Germs grew out of the same rehearsal space, how the stories of Charles Bukowski inspired not only the lyrics but the lifestyle of X (the cigarettes, tattoos, and booze), how friends became bandmates, parties went on for weeks, everyone was high and no one had any money, and some people died, and some became famous, how the scene was pansexual, gay-friendly.

more here.

michael fried on clement greenberg

Michael Fried at nonsite:

But, again, my aim in these remarks is not to critique Greenberg’s ideas. Instead I want to seize upon the thought of density or intensity or weight of intuited decision and to associate that thought with a body of work with which, on theoretical grounds, it might seem to have nothing in common—the photographic oeuvre of Robert Adams. Very briefly: Adams was born in New Jersey in 1937; his family subsequently moved to Madison, Wisconsin and a few years later to the suburbs of Denver. Adams got his B.A. from the University of Redlands in California, and went on to do a Ph.D. at the University of Southern California. In 1962 he began teaching English at Colorado College but around that time became interested in taking and making photographs; by 1967 he was doing so seriously, and in 1970 he stopped teaching in order to photograph full time. An important photobook, The New West: Landscapes along the Colorado Front Range, appeared in 1974 and a year later his work was shown in the important exhibition (in retrospect a milestone in American photographic history), New Topographics: Photographs of a Man-Altered Landscape (1975). Since that time superb photobooks have appeared with some regularity (Denver: A Photographic Survey of the Metropolitan Area [1977]; Los Angeles Spring [1986]; What We Bought: The New World, Scenes from the Denver Metropolitan Area, 1970-74 [1995 and 2009]; and Turning Back: A Photographic Journal of Re-exploration [2005] among them), and of course for a long time now Adams has been widely recognized as one of the most distinguished photographers at work anywhere. My personal familiarity with his art is quite recent, dating as it does from the major retrospective exhibition, a selection of nearly 300 works, organized by Joshua Chuang for the Yale University Art Gallery, which opened in Vancouver in the fall of 2010 and over the next few years traveled to a number of venues in this country and Europe. (I saw it in New Haven in the fall of 2012 after having caught it some months before at LACMA. Let me also say that I had the privilege of going through the exhibition at LACMA with Jim Welling and at Yale with Josh Chuang; I’m grateful to them both for countless insights.) Simply put, I was swept away by what I saw. Naturally I had admired individual photographs and even small shows of Adams’s work in the past. But Josh Chuang’s exhibition established Adams’s stature as a major artist beyond the possibility of dispute, by virtue both of the taste, intelligence, and amplitude of the selection and, in both museums but especially in New Haven, the effectiveness of the installation.

more here.

Survivor guilt in the Anthropocene

Jennifer Jacquet at Lapham's Quarterly:

The current array of species disappearances is comparable in rate and size to the five other mass extinctions in earth’s 4.5-billion-year history. But only since the second half of the twentieth century—with the creation of international scientific bodies, and databases that tally likely extinct species (to date, nine pages of very small font)—have we come to understand the magnitude. This havoc we have wreaked on earth’s biological system feels fundamentally different than that which we have wreaked on its physical system. We feel bad for warming glaciers and making the oceans more acidic, but we feel particularly bad about annihilating wild animals that managed to struggle for their survival alongside us year after year. They struggled against all odds but one.

Dealing with the disaster we have created means finding a way to reckon with our guilt for causing it. “Why stick around to see the last beautiful wild places getting ruined, and to hate my own species, and to feel that I, too, in my small way, was one of the guilty ruiners?” asked Jonathan Franzen in 2006. “The guilt of knowing what human beings have done” is how conservation biologist George Schaller described the feeling he gets when he looks at the Serengeti. In 2008 Schaller made one of the most definitive statements of Anthropocene-inspired self-reproach. “Obviously,” he said, “humans are evolution’s greatest mistake.” And in 2015 Pope Francis joined the chorus of mourners. “Because of us,” he wrote in his encyclical Laudato Si’, “thousands of species will no longer give glory to God by their very existence, nor convey their message to us. We have no such right.”

In 1961 psychoanalyst William Niederland coined the term survivor syndrome after conducting a study of those who survived Nazi concentration camps as well as survivors of natural disasters and car accidents. Niederland noted that among their symptoms were chronic depression and anxiety. Many camp survivors whom the SS had “selected” to live found it difficult to relate to ordinary people and have ordinary feelings. Sigmund Freud had intimated the idea in an 1896 letter in which he discussed his father’s death, describing a “tendency toward self-reproach which death invariably leaves.”

more here.

The myth of human nature

Tim Lewens in New Humanist:

“What,” asked the distinguished evolutionist Michael Ghiselin in 1997, “does evolution teach us about human nature?” The answer he gave will surprise those who suppose that the evolutionary sciences describe the deepest and most ubiquitous aspects of our psychological makeup. Ghiselin informed his readers that evolution “teaches us that human nature is a superstition.” Why would anyone say such a thing? Doesn’t talk about human nature amount to talk about the ways we are all the same? What could be objectionable about that?

We can begin to understand the problems if we look back 180 years. On 2 October 1836, HMS Beagle landed at Falmouth. She had finally returned to England, after a five-year circumnavigation of the globe. One of the Beagle’s passengers was a 27-year-old Charles Darwin. After disembarking he first went to stay at his father’s house in Shrewsbury, but by March of 1837 he had moved to London. It was here that Darwin began to speculate in a series of notebooks on a wide range of topics in natural history and beyond. He formulated his “transmutationist” view of how species had come into existence, he pointed to intense struggle as the primary agent of change in the natural world, and he reflected openly on the impact this image of life’s history might have for human psychology, morality and aesthetic sensibilities. Many of these notebook jottings were transformed, in 1842, into a short “sketch” of Darwin’s theory. By 1844 that short sketch had expanded into a 230-page statement of the evolutionary view. But it was not until 1859 – 15 years later – that the Origin of Species was published. What had Darwin been doing in the meantime?

The answer is that he spent the eight years between 1846 and 1854 working on a gigantic study of barnacles. This period – sometimes referred to as a “delay”, as though Darwin was ready to publish the Origin in the mid-1840s, but somehow lost his nerve – was a puzzle to historians for some time. But it now seems clear how Darwin used his barnacle work as a detailed empirical testing ground for many of his earlier theoretical speculations. One of the most important lessons Darwin took from his meticulous study of barnacle anatomy concerned the ubiquity of variation: “Not only does every external character vary greatly in most of the species,” he wrote, “but the internal parts very often vary to a surprising degree.” He went so far as to assert that it is “hopeless” to find any part or organ “absolutely invariable in form or structure”. Variability in all parts of all species is a primary fact of nature, says Darwin, and this ubiquitous variation is the fuel that powers natural selection. It is the conviction, inherited from Darwin, that species vary in all respects at any moment in time, and that natural selection causes those species to change in profound ways over time, that has made the likes of Ghiselin so sceptical of the thought that species have “natures”.

Evolutionists are not, however, united in their rejection of “human nature”. The eminent evolutionary psychologists Leda Cosmides and John Tooby announced back in 1990 their intention to defend “the concept of a universal human nature”, and Steven Pinker’s 2002 book The Blank Slate: The Modern Denial of Human Nature implies through its title that the deniers of human nature are misguided.

More here.

An old idea revived: Starve Cancer to Death

Sam Apple in The New York Times:

The story of modern cancer research begins, somewhat improbably, with the sea urchin. In the first decade of the 20th century, the German biologist Theodor Boveri discovered that if he fertilized sea-urchin eggs with two sperm rather than one, some of the cells would end up with the wrong number of chromosomes and fail to develop properly. It was the era before modern genetics, but Boveri was aware that cancer cells, like the deformed sea urchin cells, had abnormal chromosomes; whatever caused cancer, he surmised, had something to do with chromosomes.

Today Boveri is celebrated for discovering the origins of cancer, but another German scientist, Otto Warburg, was studying sea-urchin eggs around the same time as Boveri. His research, too, was hailed as a major breakthrough in our understanding of cancer. But in the following decades, Warburg’s discovery would largely disappear from the cancer narrative, his contributions considered so negligible that they were left out of textbooks altogether.

Unlike Boveri, Warburg wasn’t interested in the chromosomes of sea-urchin eggs. Rather, Warburg was focused on energy, specifically on how the eggs fueled their growth. By the time Warburg turned his attention from sea-urchin cells to the cells of a rat tumor, in 1923, he knew that sea-urchin eggs increased their oxygen consumption significantly as they grew, so he expected to see a similar need for extra oxygen in the rat tumor. Instead, the cancer cells fueled their growth by swallowing up enormous amounts of glucose (blood sugar) and breaking it down without oxygen. The result made no sense. Oxygen-fueled reactions are a much more efficient way of turning food into energy, and there was plenty of oxygen available for the cancer cells to use. But when Warburg tested additional tumors, including ones from humans, he saw the same effect every time. The cancer cells were ravenous for glucose.

Warburg’s discovery, later named the Warburg effect, is estimated to occur in up to 80 percent of cancers. It is so fundamental to most cancers that a positron emission tomography (PET) scan, which has emerged as an important tool in the staging and diagnosis of cancer, works simply by revealing the places in the body where cells are consuming extra glucose. In many cases, the more glucose a tumor consumes, the worse a patient’s prognosis.

More here.

Tuesday Poem

The Last Time

The last time we had dinner together in a restaurant
with white tablecloths, he leaned forward

and took my two hands in his hands and said,
I’m going to die soon. I want you to know that.

And I said, I think I do know.
And he said, What surprises me is that you don’t.

And I said, I do. And he said, What?
And I said, Know that you’re going to die.

And he said, No, I mean know that you are.

by Marie Howe
from What the Living Do
W.W. Norton, 1998

Monday, May 16, 2016

Perceptions: Art in Nature

Acorn Woodpecker. Granary Tree.

Acorn woodpeckers drill into trees not in order to find acorns, but in order to make holes in which they can store acorns for later use, especially during the winter.

As the acorn dries out, it decreases in size, and the woodpecker moves it to a smaller hole. The birds spend an awful lot of time tending to their granaries in this way, transferring acorns from hole to hole as if engaged in some complicated game of solitaire.

Multiple acorn woodpeckers work together to maintain a single granary, which may be located in a man-made structure – a fence or a wooden building – as well as in a tree trunk. And whereas most woodpecker species are monogamous, acorn woodpeckers take a communal approach to family life. In the bird world, this is called cooperative breeding. Acorn woodpeckers live in groups of up to seven breeding males and three breeding females, plus as many as ten non-breeding helpers. Helpers are young birds who stick around to help their parents raise future broods; only about five per cent of bird species operate in this way.

More here, here and here.