The ruins of Palmyra

Ingrid D. Rowland at the NY Review of Books:

In September 2015, the Getty Research Institute in Los Angeles acquired the first photographs ever taken of Palmyra, the great trading oasis in the heart of the Syrian desert. Louis Vignes was a young lieutenant in the French navy whose interest in photography earned him a place on a scientific expedition to the Dead Sea region in 1863. The twenty-nine photographs he made of Palmyra during his visit in 1864 (including two panoramic shots) were finally printed in Paris by the pioneering photographer Charles Nègre (who had taught Vignes) between 1865 and 1867.

With a history that extends back nearly four thousand years, Palmyra has risen and fallen many times. Its original name was Tadmor, which probably meant “palm tree,” an indication of the site’s renowned fertility. Both the Bible and local legend credit the city’s foundation to King Solomon in the tenth century BCE, but in fact it is already mentioned in Mesopotamian texts a millennium earlier. A spring and a wadi, or dry river bed, provided the settlement with water, making it a welcome stop for travelers and traders on the road between Central Asia and the Mediterranean Sea.

The population, almost from the outset, was a mixture of Semitic peoples from surrounding areas: Amorites, Aramaeans, Arabs, and Jews, who developed a distinctive Palmyrene language and a distinctive script expressive of their distinctive cosmopolitan culture, which drew from Persia, Greece, and Rome as well as local tradition.

more here.

How cats conquered the world (and a few Viking ships)

Ewen Callaway in Nature:

Researchers know little about cat domestication, and there is active debate over whether the house cat (Felis silvestris catus) is truly a domestic animal — that is, whether its behaviour and anatomy are clearly distinct from those of wild relatives. “We don’t know the history of ancient cats. We do not know their origin, we don't know how their dispersal occurred,” says Eva-Maria Geigl, an evolutionary geneticist at the Institut Jacques Monod in Paris. She presented the study at the 7th International Symposium on Biomolecular Archaeology in Oxford, UK, along with colleagues Claudio Ottoni and Thierry Grange. A 9,500-year-old human burial from Cyprus also contained the remains of a cat. This suggests that the affiliation between people and felines dates at least as far back as the dawn of agriculture, which occurred in the nearby Fertile Crescent beginning around 12,000 years ago. Ancient Egyptians may have tamed wild cats some 6,000 years ago, and under later Egyptian dynasties, cats were mummified by the million. One of the few previous studies of ancient-cat genetics involved mitochondrial DNA (which, unlike most nuclear DNA, is inherited through the maternal line only) from just three mummified Egyptian cats.

…Cat populations seem to have grown in two waves, the authors found. Middle Eastern wild cats with a particular mitochondrial lineage expanded with early farming communities to the eastern Mediterranean. Geigl suggests that grain stockpiles associated with these early farming communities attracted rodents, which in turn drew wild cats. After seeing the benefit of having cats around, humans might have begun to tame these cats. Thousands of years later, cats descended from those in Egypt spread rapidly around Eurasia and Africa.

More here.

Tuesday Poem

Settling for the Night

It’s a custom with my youngest
to sprinkle “sleeping dust”
over his eyes
before closing them,
combing the sleep down through his hair
and tenderly over his forehead

good night, Dad,
good night…

I listen to our children breathing the night,
their tiny heads under the covers
gone somewhere where we cannot follow
but at least they will return;
the stars of some eternal night
speckle their hair
and their faces are like clocks
in the bedroom twilight.

Their morning is afternoon to us;
their afternoon will see us settled for the night;
some quiet Sunday perhaps
the sun through the blinds will raise
its black ladder on my bedroom wall
and the child fists
will have become adult hands
that will sprinkle the sleeping dust
over my closed eyes
before combing it down
through my peppered grey hair…

Good night, Dad,
good night…
.

by Ifor ap Glyn
from Cerddi Map yr Underground
publisher: Gwasg Carreg Gwalch, Llanrwst, 2001
.

Monday, September 19, 2016

Sunday, September 18, 2016

When Analogies Fail

Alexander Stern in the Chronicle of Higher Education:

An analogy is, according to Webster’s, “a comparison of two things based on their being alike in some way.” The definition seems to capture exactly what Simmons, a sports commentator, and Dowd, a New York Times columnist, are doing in the sentences above: comparing two things and explaining how they’re alike. Being a dictionary, however, Webster’s has little to say about why we use analogies, where they come from, or what role they really play in human life.

Analogies need not, of course, all have the same aim. They’re used in different contexts to varying effect. Still, it is evident that we use analogies for mainly rhetorical reasons: to shed light, to explain, to reveal a new aspect of something, to draw out an unseen affinity, to drive home a point. As Wittgenstein wrote, “A good simile refreshes the mind.”

This, Simmons’s and Dowd’s analogies demonstrably fail to do. Our understanding of Trump is unlikely to benefit from an attentive viewing of Species. The careers of the basketball player Robert Horry and the actor Philip Baker Hall, admirable though they may be, leave Australia similarly unilluminated. This kind of analogy — which often consists of an ostensibly funny pop-culture reference or of objects between which certain equivalences can be drawn (x is the y of z’s) — has become increasingly common.

You also find it in academic writing. For example, from the journal Cultural Critique: “Attempting to define multiculturalism is like trying to pick up a jellyfish — you can do it, but the translucent, free-floating entity turns almost instantly into an unwieldy blob of amorphous abstraction.” The analogy aims not to enlighten, but to enliven, adorn, divert.

Of course there’s nothing wrong with this, as far as it goes, but its increasing prominence reflects more general changes in the way we relate to the world around us.

More here.

Every cognitive bias exists for a reason—primarily to save our brains time or energy

Buster Benson in Quartz:

I’ve spent many years referencing Wikipedia’s list of cognitive biases whenever I have a hunch that a certain type of thinking is an official bias but I can’t recall the name or details. But despite trying to absorb the information of this page many times over the years, very little of it seems to stick.

I decided to try to more deeply absorb and understand this list by coming up with a simpler, clearer organizing structure. If you look at these biases according to the problem they’re trying to solve, it becomes a lot easier to understand why they exist, how they’re useful, and the trade-offs (and resulting mental errors) that they introduce.

Four problems that biases help us address: Information overload, lack of meaning, the need to act fast, and how to know what needs to be remembered for later.

Problem 1: Too much information

There is just too much information in the world; we have no choice but to filter almost all of it out. Our brain uses a few simple tricks to pick out the bits of information that are most likely going to be useful in some way.

More here.

ON THE CELIBATE LOVE AFFAIR OF NORA EPHRON AND MIKE NICHOLS

Richard Cohen in Literary Hub:

“Marlon Brando’s gay, everybody knows that.”

Nora said that one night in my house in Washington. I can’t remember how Brando’s name came up, but there it was, this startling (at the time) piece of information, so inside, so unknown to the general public, who considered him—fools that they were—a womanizer of great repute. I can remember exactly where I was at the time. In the living room. Standing in front of the sofa and to the right. The remark hit with the force of a dumdum bullet. Marlon Brando’s gay? Who knew?

Everybody, it turned out. Everybody knew. And whether they did or they didn’t, whether it was true or not, was totally beside the point. When Nora said one of these things—and she said them quite often—she did not do so with any sort of tentativeness, with hesitation, with the suggestion that this might be the rawest gossip and possibly wrong, but with a firmness and robust confidence that transformed the gossamer of hearsay into something chiseled into the frieze of a Greek temple. It was beyond dispute. Behold what she knew and behold what you didn’t. You knew some things. She knew everything.

More here.

Don’t believe the rumours: Universal Grammar is alive and well

Dan Milway writes:

As I write this I am sitting in the Linguistics Department lounge at the University of Toronto. Grad students and Post-doctoral researchers are working, chatting, making coffee. Faculty members pop in every now and then, taking breaks from their work.

It’s a vibrant department, full of researchers with varied skills and interests. There are those who just got back into town from their summer fieldwork, excited to dig into the new language data from indigenous Canadian, Amazonian, or Australian languages. There are those struggling to find a way to explain the behaviour of some set of prefixes or verbs in Turkish, or Spanish, or Niuean. There are those designing and running experiments to test what young children know about the languages they are acquiring. There are those sifting through massive databases of speech from rural farmers, or lyrics of local hip-hop artists, or the emails of Enron employees, to hopefully gain some understanding of how English varies and changes. And there are those who spend their time thinking about our theories of language and how they might be integrated with theories of Psychology, Neurology, and Biology.

What unites these disparate research agendas is that they are all grounded in the hypothesis, generally attributed to Noam Chomsky, that the human mind contains innate structures, a Universal Grammar, that allows us to acquire, comprehend, and use language.

According to a recent article in Scientific American, however, the community I just described doesn’t exist, and maybe couldn’t possibly exist in linguistics today, because the kind of work that I just described has long since shown the Universal Grammar hypothesis (UG) to be flat-out wrong. But such a community does exist, and not just here at UofT, or in Chomsky’s own department at MIT, but in Berkeley and Manhattan, in Newfoundland and Vancouver, in Norway and Tokyo. Communities that collectively groan whenever someone sounds the death knell of the UG hypothesis or the enterprise of Generative Linguistics it spawned. We groan, not because we’ve been exposed for the frauds or fools that these pieces say we are, but because we are always misrepresented in them. Sometimes the misrepresentation is laughable, but more often it’s damn frustrating.

More here.

Once we blamed Yoko Ono. Now we blame refugees

Shazia Mirza in New Statesman:

Hate and lies are all the rage. Everyone’s at it. “Obama is the founder of ISIS!” and everyone believes the Trump. “We’re at breaking point” – look, here’s a poster of lots of brown men that look just like your dad, cousin and brother. If you vote out, all these scavengers that come over here for the good life, they’ll be gone in the morning.

Hate is fashionable. It’s flourishing in the comments section of the Daily Mail, Facebook, and this morning I saw some photographs on Twitter of Madonna’s cellulite with the comment: “I thought grandma had died of AIDS.”

When people are unhappy, discontent and disillusioned with their own life, they want someone to blame. Once we blamed Yoko Ono. Now we blame refugees. They caused Brexit; they are destroying the NHS, housing, transport, and education. They can’t seem to do any good or make any valuable contribution.

Well, some people might be surprised to hear that Syrian refugees are not coming to Britain for the food, weather and £65.45 a week, which couldn’t even get you a night out at the cinema. They’re not coming because they want to find Harry Potter, drink tea and watch drunk people rolling down every high street looking for their teeth on a Friday night. No. These people want to live. When I was a teacher, I taught in some very challenging schools where a lot of the children had difficult and unstable parents. But no matter how awful their parents, the child always wanted to remain with them. They would rather be with their own volatile parent than with a kind, caring stranger. Refugees would love to remain in their homeland. The place they know and where their family life has been. But they are forced to leave out of desperation.

More here.

What It Feels Like to Die

Jennie Dear in The Atlantic:

For many dying people, “the brain does the same thing that the body does in that it starts to sacrifice areas which are less critical to survival,” says David Hovda, director of the UCLA Brain Injury Research Center. He compares the breakdown to what happens in aging: People tend to lose their abilities for complex or executive planning, learning motor skills—and, in what turns out to be a very important function, inhibition. “As the brain begins to change and start to die, different parts become excited, and one of the parts that becomes excited is the visual system,” Hovda explains. “And so that’s where people begin to see light.”

Recent research points to evidence that the sharpening of the senses some people report also seems to match what we know about the brain’s response to dying. Jimo Borjigin, a neuroscientist at the University of Michigan, first became intrigued by this subject when she noticed something strange in the brains of animals in another experiment: Just before the animals died, neurochemicals in the brain suddenly surged. While scientists had known that brain neurons continued to fire after a person died, this was different. The neurons were secreting new chemicals, and in large amounts. “A lot of cardiac-arrest survivors describe that during their unconscious period, they have this amazing experience in their brain,” she says. “They see lights and then they describe the experience as ‘realer than real.’” She realized the sudden release of neurochemicals might help to explain this feeling.

Borjigin and her research team tried an experiment. They anesthetized eight rats, and then stopped their hearts. “Suddenly, all the different regions of the brain became synchronized,” she says. The rats’ brains showed higher power in different frequency waves, and also what is known as coherence—the electrical activity from different parts of the brain working together.

More here.

Sunday Poem

Q

Somewhere (thank you, father) over the hills,
through some trap-door in my mind, despite my having
no call to speak it, and hearing of it so long ago,
I know the Urdu ishq is love.
And further, how it’s the highest (a divine fervour,
a bolt cued from the round heavens – almost angelic)
among a whole host of forms, or feathers, of love
like that myth of subtle Inuit measures of snow

and now I’ve utterly gone and put my foot in it
and other shoppers are turning round, as we inch
up to the queue’s end, still far from those tills,
and she’s prodding me to explain my short-falling
answer – giving the nod, when she asked me If . . . and Whether . . .
– she swears that at the end of my assent she heard me whisper
ish

by Zaffar Kunial
from The Poetry Review,
Autumn 2014

.

Saturday, September 17, 2016

The true story of a scientist who discovered the equation for altruism—and gave himself away

Michael Regnier in Quartz:

Laura met George in the pages of Reader’s Digest. In just a couple of column inches, she read an abridged version of his biography and was instantly intrigued. In the 1960s, apparently, egotistical scientist George Price discovered an equation that explained the evolution of altruism, then overnight turned into an extreme altruist, giving away everything up to and including his life.

A theatre director, Laura Farnworth recognized the dramatic potential of the story. It was a tragedy of Greek proportions—the revelation of his own equation forcing Price to look back on his selfish life and mend his ways, even though choosing to live selflessly would lead inexorably to his death. But as she delved into his life and science over the next five years, Farnworth discovered a lot more than a simple morality tale.

Born in New York in 1922, George Price realized pretty early on that he was destined for greatness. In a class full of smart kids he was one of the smartest, especially with numbers. He was in the chess club, obviously, and his mathematical brain was naturally drawn to science. Determining that there was no rational argument for God’s existence, he became a militant atheist, too.

His PhD came from the University of Chicago for work he did on the Manhattan Project—having graduated in chemistry, he’d been recruited to find better ways to detect traces of toxic uranium in people’s bodies.

More here.

Pardon Edward Snowden

Ken Roth and Salil Shetty in the New York Times:

Edward J. Snowden, the American who has probably left the biggest mark on public policy debates during the Obama years, is today an outlaw. Mr. Snowden, a former National Security Agency contractor who disclosed to journalists secret documents detailing the United States’ mass surveillance programs, faces potential espionage charges, even though the president has acknowledged the important public debate his revelations provoked.

Mr. Snowden’s whistle-blowing prompted reactions across the government. Courts found the government wrong to use Section 215 of the Patriot Act to justify mass phone data collection. Congress replaced that law with the USA Freedom Act, improving transparency about government surveillance and limiting government power to collect certain records. The president appointed an independent review board, which produced important reform recommendations.

That’s just in the American government. Newspapers that published Mr. Snowden’s revelations won the Pulitzer Prize. The United Nations issued resolutions on protecting digital privacy and created a mandate to promote the right to privacy. Many technology companies, facing outrage at their apparent complicity in mass surveillance, began providing end-to-end encryption by default. Three years on, the news media still refer to Mr. Snowden and his revelations every day. His actions have brought about a dramatic increase in our awareness of the risks to our privacy in the digital age — and to the many rights that depend on privacy.

Yet President Obama and the candidates to succeed him have emphasized not Mr. Snowden’s public service but the importance of prosecuting him.

More here.

Vladimir Nabokov interviewed by Herbert Gold in 1967

Herbert Gold in the Paris Review:

INTERVIEWER: What is most characteristic of poshlust in contemporary writing? Are there temptations for you in the sin of poshlust? Have you ever fallen?

NABOKOV: “Poshlust,” or in a better transliteration poshlost, has many nuances, and evidently I have not described them clearly enough in my little book on Gogol, if you think one can ask anybody if he is tempted by poshlost. Corny trash, vulgar clichés, Philistinism in all its phases, imitations of imitations, bogus profundities, crude, moronic, and dishonest pseudo-literature—these are obvious examples. Now, if we want to pin down poshlost in contemporary writing, we must look for it in Freudian symbolism, moth-eaten mythologies, social comment, humanistic messages, political allegories, overconcern with class or race, and the journalistic generalities we all know. Poshlost speaks in such concepts as “America is no better than Russia” or “We all share in Germany’s guilt.” The flowers of poshlost bloom in such phrases and terms as “the moment of truth,” “charisma,” “existential” (used seriously), “dialogue” (as applied to political talks between nations), and “vocabulary” (as applied to a dauber). Listing in one breath Auschwitz, Hiroshima, and Vietnam is seditious poshlost. Belonging to a very select club (which sports one Jewish name—that of the treasurer) is genteel poshlost. Hack reviews are frequently poshlost, but it also lurks in certain highbrow essays. Poshlost calls Mr. Blank a great poet and Mr. Bluff a great novelist. One of poshlost’s favorite breeding places has always been the Art Exhibition; there it is produced by so-called sculptors working with the tools of wreckers, building crankshaft cretins of stainless steel, Zen stereos, polystyrene stinkbirds, objects trouvés in latrines, cannonballs, canned balls. There we admire the gabinetti wall patterns of so-called abstract artists, Freudian surrealism, roric smudges, and Rorschach blots—all of it as corny in its own right as the academic “September Morns” and “Florentine Flowergirls” of half a century ago.
The list is long, and, of course, everybody has his bête noire, his black pet, in the series. Mine is that airline ad: the snack served by an obsequious wench to a young couple—she eyeing ecstatically the cucumber canapé, he admiring wistfully the hostess. And, of course, Death in Venice. You see the range.

More here.

‘Looking for “The Stranger,”’ the Making of an Existential Masterpiece

John Williams at the New York Times:

In the closing days of World War II, the American publisher Alfred A. Knopf was pursuing English-language rights to Albert Camus’s novel “The Plague,” with its powerful and clear allegorical view of Nazism. With hesitation, he also acquired Camus’s first novel, “The Stranger,” which one reader at the company described as “pleasant, unexciting reading” that seemed “neither very important nor very memorable.”

The novel went on to become, by consensus, one of the most important and memorable books of the 20th century. Alice Kaplan, in the prologue to “Looking for ‘The Stranger,’” her new history of Camus’s profoundly influential debut, writes that critics have seen the novel variously as “a colonial allegory, an existential prayer book, an indictment of conventional morality, a study in alienation, or ‘a Hemingway rewrite of Kafka.’” This “critical commotion,” in Ms. Kaplan’s phrasing, “is one mark of a masterpiece.”

Ms. Kaplan sets out to tell “the story of exactly how Camus created this singular book.” It’s a story that unfolded against one of the most dramatic backdrops in history.

more here.