

Shia LaBeouf. He Will Not Divide Us. January 2017.

“The actor-turned-performance-artist's latest collaboration with fellow performance artists Nastja Säde Rönkkö and Luke Turner invites the public to say the words ‘HE WILL NOT DIVIDE US’ as many times as they like and for as long as they like into a camera mounted on a wall outside the Museum of the Moving Image in New York.”

More here.

Sunday, February 12, 2017

Diet culture is just another way of dealing with the fear of death

Michelle Allison in The Atlantic:

Knowing a thing means you don’t need to believe in it. Whatever can be known, or proven by logic or evidence, doesn’t need to be taken on faith. Certain details of nutrition and the physiology of eating are known and knowable: the fact that humans require certain nutrients; the fact that our bodies convert food into energy and then into new flesh (and back to energy again when needed). But there are bigger questions that don’t have definitive answers, like what is the best diet for all people? For me?

Nutrition is a young science that lies at the intersection of several complex disciplines—chemistry, biochemistry, physiology, microbiology, psychology—and though we are far from having figured it all out, we still have to eat to survive. When there are no guarantees or easy answers, every act of eating is something like a leap of faith.

Eating is the first magic ritual, an act that transmits life energy from one object to another, according to cultural anthropologist Ernest Becker in his posthumously published book Escape From Evil. All animals must feed on other life to sustain themselves, whether in the form of breastmilk, plants, or the corpses of other animals. The act of incorporation, of taking a once-living thing into your own body, is necessary for all animals’ existence. It is also disturbing and unsavory to think about, since it draws a direct connection between eating and death.

More here.

The Indus Jigsaw: Can It Be Pieced Together, At Least Partially?

Vikram Zutshi in Swarajya:

Asko Parpola is a Finnish Indologist and Sindhologist, current professor emeritus of Indology and South Asian studies at the University of Helsinki, Finland. Generally recognised as the world's expert on the Indus script, he has been studying this undeciphered writing for over 40 years at the University of Helsinki. He is co-editor of collections of all seals and inscriptions in India and Pakistan.

As professor of Indology he has led a Finnish team of experts through numerous approaches to the puzzle of one of the world's very earliest writing systems. A grand summary of Dr Parpola's work, Deciphering the Indus Script, was published by Cambridge University Press in 1994. His next opus, The Roots of Hinduism: The Early Aryans and the Indus Civilization, was published by Oxford University Press in 2015.

Vikram Zutshi: Describe the premise of your book The Roots of Hinduism. What can a reader expect to take away from this work? How do you define Hinduism and where can its roots be found?

Dr Parpola: The earliest literature of South Asia, generally dated between about 1300 and 1000 BCE, consists of the hymns of the Rigveda. In these skilfully constructed poems, people calling themselves Arya praise their deities and pray for victory in battle, for sons, and for other good things. What is the prehistoric background of these Aryas? Their Sanskrit language belongs to the Indo-European language family mostly spoken outside India. This raises the question: where are the roots of the Vedic religion?

More here. [Thanks to Omar Ali.]

The great mathematician Abraham A. Fraenkel remembers the challenges he and his Jewish colleagues faced under the slow rise of the Nazis

Abraham A. Fraenkel in Tablet:

My report about this last phase of my life in Germany should not close without my describing some people who in every respect deserve to be highlighted. Those who first come to mind are eight scientists. Of course, I cannot and do not wish to offer biographies or acknowledgments of their scientific accomplishments that can be easily found elsewhere. Instead, I will mention primarily those aspects that were significant for my own development. Of these eight men, there are four mathematicians: Hilbert, Brouwer, Landau, and von Neumann; two physicists: Einstein and Niels Bohr; and two Protestant theologians and philosophers: Rudolf Otto and Heinrich Scholz.

In his time David Hilbert (1862–1943) was the most significant mathematician in the world. For a long time, he shared this honor with Henri Poincaré, who died in 1912. In contrast to most of his colleagues, Hilbert’s discoveries in successive periods encompassed the broadest range of pure mathematics. He hardly dealt with applied mathematics, except for one not very successful period devoted to physics. He was born in Königsberg and never relinquished his East Prussian accent. The number of true anecdotes about him is legion, as he was, without doubt, a highly original character. He became a professor in Göttingen in 1895 and declined appointments to Leipzig, Berlin, Heidelberg, and in 1919 to Bern. He was correctly considered the scientific head of German mathematics, and was acknowledged throughout the world. Students flocked to him from all over Europe and the United States. At the second International Congress of Mathematicians in Paris in 1900 he gave a programmatic lecture on “Mathematical Problems.” The 23 key unsolved problems he enumerated largely determined the developments in mathematics in the subsequent decades. Most of these problems have since been solved, problem No. 1 by Paul J. Cohen in 1963.

More here.

How Americans Die May Depend On Where They Live

Anna Maria Barry-Jester in FiveThirtyEight:

Mortality due to substance abuse has increased in Appalachia by more than 1,000 percent since 1980. Deaths from diabetes, blood and endocrine diseases also increased in most counties in the United States during that time.

That’s according to a new study, published Tuesday in the Journal of the American Medical Association, examining the mortality rates for 21 leading causes of death. The study also found that the death rate from cardiovascular disease, the leading cause of mortality in the U.S., is down in most parts of the country. And the research highlights numerous disparities between counties. For example, a newborn is nearly 10 times more likely to die from a neonatal disorder if she is born in Humphreys County, Mississippi, which has the highest neonatal mortality rate in the country, than if she is born in Marin County, a wealthy area north of San Francisco, which has the lowest rate.

The study also looked at how mortality from the 21 causes of death has changed over time, from 1980 through 2014. For example, neurological disorders such as Alzheimer’s were the third leading cause of death in 2014 and were prevalent across the country. But they have become more common in much of the South, while decreasing in the West.

More here.

Sunday Poem

The Idea of Egypt Begins to Emerge in 1955
AD After a Field Trip to the Mummy in
the Nashville Museum of Natural History

Unlike us they spoke with a foreign accent
Egyptian I guess and they didn’t live very long

or maybe they just kept shrinking like an old person
does on account of their being very extremely
wrinkled they took trinkets with them into their

burial chambers kind of like the stuff I would
pull out of my Mother’s top dresser drawer

that she always made me put back and couldn’t
for the life of her understand why I took it out
in the first place the mummy was kind of scary

the first time I saw it but by the second field trip
I was older and all and could look into the holes

where the ears eyes nose and mouth used to be
and count the threads on the dirty gray linen
that wound around the bones there was hardly

any other place to go on a field trip to except
Fort Nashboro which was a pretty poor excuse

for a fort which I guess is why the Cherokee
made the early Nashvillians mostly miserable
and the Upper Room which when you got there

all it was a big wooden carving of a famous
painting of the last supper which we had every
year at my house anyway except instead of Jesus
and the disciples we had my family and my sisters’
dumb boyfriends one girl fainted the first time

we saw the mummy and lots of girls screamed
kind of like they thought they were supposed to

even though it was a museum there’s a limit to
how long you can look at a mummy though and
after about two minutes we moved on down to

the glass cases that held real Indian arrowheads
that were probably just dug up out of the ground

but which I couldn’t help hoping had been ripped
from the still-beating hearts of the coon-skinned
soldiers at Fort Nashboro and plunked on Jesus’

seder plate what do you think y’all get back on the bus
now would have sounded like in hieroglyphic
.

by Arne Weingart
from ABZ Press
.

Why I Taught Myself to Procrastinate

Adam Grant in The New York Times:

NORMALLY, I would have finished this column weeks ago. But I kept putting it off because my New Year’s resolution is to procrastinate more. I guess I owe you an explanation. Sooner or later. We think of procrastination as a curse. Over 80 percent of college students are plagued by procrastination, requiring epic all-nighters to finish papers and prepare for tests. Roughly 20 percent of adults report being chronic procrastinators. We can only guess how much higher the estimate would be if more of them got around to filling out the survey. But while procrastination is a vice for productivity, I’ve learned — against my natural inclinations — that it’s a virtue for creativity. For years, I believed that anything worth doing was worth doing early. In graduate school I submitted my dissertation two years in advance. In college, I wrote my papers weeks early and finished my thesis four months before the due date. My roommates joked that I had a productive form of obsessive-compulsive disorder. Psychologists have coined a term for my condition: pre-crastination.

…Steve Jobs procrastinated constantly, several of his collaborators have told me. Bill Clinton has been described as a “chronic procrastinator” who waits until the last minute to revise his speeches. Frank Lloyd Wright spent almost a year procrastinating on a commission, to the point that his patron drove out and insisted that he produce a drawing on the spot. It became Fallingwater, his masterpiece. Aaron Sorkin, the screenwriter behind “Steve Jobs” and “The West Wing,” is known to put off writing until the last minute. When Katie Couric asked him about it, he replied, “You call it procrastination, I call it thinking.”

So what if creativity happens not in spite of procrastination, but because of it?

More here.

Gloomed and Uglied Away: a letter Zora Neale Hurston sent to her editor

Dan Piepenbring in The Paris Review:

Have you ever been tied in close contact with a person who had a strong sense of inferiority? I have, and it is hell. They carry it like a raw sore on the end of the index finger. You go along thinking well of them and doing what you can to make them happy and suddenly you are brought up short with an accusation of looking down on them, taking them for a fool, etc., but they mean to let you know and so on and so forth. It colors everything. For example, I took this man that I cared for down to Carl Van Vechten’s one night so that he could meet some of my literary friends, since he had complained that I was always off with them, and ignoring him. I hoped to make him feel at home with the group and included so that he would go where I went. What happened? He sat off in a corner and gloomed and uglied away, and we were hardly out on the street before he was accusing me of having dragged him down there to show off what a big shot I was and how far I was above him. He had a good mind, many excellent qualities, and I am certain that he loved me. But his feeling of inferiority would crop up and hurt me at the most unexpected moments. Right in the middle of what I considered some sweet gesture on my part, I would get my spiritual pants kicked up around my neck like a horse-collar. I asked him to bring me all the clippings on TELL MY HORSE, and he brought several and literally flung them at me. “You had read them” he accused, “and knew that they were flattering. You just asked me to get them to see how great you were.” You know how many marriages in the literary and art world have broken up on such rocks, to say nothing of other paths of life. A business man is out scuffling for dear life to get things for the woman he loves, and she is off pouting and accusing him of neglecting her. She feels that way because she does not feel herself able to keep up with the pace that he is setting, and just be confident that she is wanted no matter how far he goes.
Millions of women do not want their husbands to succeed for fear of losing him. It is a very common ailment. That is why I decided to write about it.

More here. (Note: At least one post throughout February will be in honor of Black History Month)

Saturday, February 11, 2017

Book Review: Daniel Dennett rides again

Dan Jones in Nature:

In Joel and Ethan Coen's 2009 film A Serious Man, physics professor Larry Gopnik is in the middle of an existential crisis. In a dream, he gives a lecture on Heisenberg's uncertainty principle; Sy Ableman, the older man with whom Gopnik's wife is having an affair, stays on after the students disperse. In a condescending drawl, he addresses Gopnik and his equation-covered chalkboard: “I'll concede that it's subtle, clever — but at the end of the day, is it convincing?”

Philosopher and cognitive scientist Daniel Dennett has been hearing variants of this riposte for decades. If history is a guide, his latest book, From Bacteria to Bach and Back, will elicit similar responses. It is a supremely enjoyable, intoxicating work, tying together 50 years of thinking about where minds come from and how they work. Dennett's path from the origins of life to symphonies is long and winding, but you couldn't hope for a better guide. Walk with him and you'll learn a lot.

The book's backbone is Charles Darwin's theory of natural selection. That replaced the idea of top-down intelligent design with a mindless, mechanical, bottom-up process that guides organisms along evolutionary trajectories into ever more complex regions of design space. Dennett also draws heavily on the idea of 'competence without comprehension', best illustrated by mathematician Alan Turing's proof that a mechanical device could do anything computational. Natural selection has created, through genetic evolution, a world rich in competence without comprehension — the bacteria, trees and termites that make up so much of Earth's biomass.

Yet, as Dennett and others argue, genetic evolution is not enough to explain the skills, power and versatility of the human mind.

More here.

Centennial: Einstein’s Desperate Mistake, the Cosmological Constant

Vasudevan Mukunth in The Wire:

On February 8, 1917, Einstein published a paper titled ‘Kosmologische Betrachtungen zur allgemeinen Relativitätstheorie‘ (‘Cosmological Considerations in the General Theory of Relativity’). In it, he described a number called the cosmological constant. The constant had a value such that, when used in his newly created equations describing the behaviour of the gravitational force, a non-changing universe was spit out – agreeing with knowledge at the time, as well as his belief, that the universe was static. Without the constant in the picture, on the other hand, Einstein’s general theory of relativity suggested that the gravitational pull of masses contained in the universe would pull all the matter together, keeping the universe dynamic.
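
In modern notation (added here for illustration; Einstein's 1917 paper used different conventions and symbols), the field equations with the constant term read:

```latex
% Einstein's field equations with the cosmological constant \Lambda
R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} + \Lambda\, g_{\mu\nu}
    = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
```

With \(\Lambda = 0\), these equations admit no static, homogeneous, matter-filled solution; Einstein tuned the value of \(\Lambda\) so that its repulsive contribution exactly balanced the mutual attraction of the universe's matter, yielding the static universe he believed in.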

It would be more than a decade before evidence would begin to emerge that the universe was expanding. And it would be scores of years before astronomers would find that the expansion was also accelerating. Then again, it would be many years before Einstein realised his actual mistake.

While he popularly considered his addition of the constant to be an affront to his own work, it may not have been as bad as he thought. The cosmological principle states, or rather assumes, that the universe at the largest scales has the same properties everywhere. This is a spatial definition. An extension called the ‘perfect’ cosmological principle states that the universe at the largest scales has the same properties everywhere and at every time, implying that it has always remained the way it is today and it will continue to be this way forever. This is also called the steady-state theory, an alternative to the Big Bang theory that has been widely discredited – and which Einstein himself pursued for a while in 1931.

Anyway, in defence of Einstein, theoretical astrophysicist Peter Coles writes on his blog, “General relativity, when combined with the cosmological principle, but without the cosmological constant, requires the universe to be dynamical rather than static. If anything, therefore, you could argue that Einstein’s biggest blunder was to have failed to predict the expansion of the Universe!” Indeed, if Einstein had not decided to fudge his own monumental equations, he may have been onto something.

More here.

The Madness of King Donald

Andrew Sullivan in New York Magazine:

I guess I should start by saying this is not a blog. Nor is it what one might call a column. It’s an experiment of sorts to see if there’s something in between those two. Most Fridays, from now on, I’ll be writing in this space about, among other things, the end of Western civilization, the collapse of the republic, and, yes, my beagles. If you’re a veteran reader of my former site, the Dish, you may find yourselves at times in an uncanny valley. So may I. The model I’m trying to follow is more like the British magazine tradition of a weekly diary — on the news, but a little distant from it, personal as well as political, conversational more than formal.

I want to start with Trump’s lies. It’s now a commonplace that Trump and his underlings tell whoppers. Fact-checkers have never had it so good. But all politicians lie. Bill Clinton could barely go a day without some shading or parsing of the truth. Richard Nixon was famously tricky. But all the traditional political fibbers nonetheless paid some deference to the truth — even as they were dodging it. They acknowledged a shared reality and bowed to it. They acknowledged the need for a common set of facts in order for a liberal democracy to function at all. Trump’s lies are different. They are direct refutations of reality — and their propagation and repetition is about enforcing his power rather than wriggling out of a political conundrum. They are attacks on the very possibility of a reasoned discourse, the kind of bald-faced lies that authoritarians issue as a way to test loyalty and force their subjects into submission. That first press conference when Sean Spicer was sent out to lie and fulminate to the press about the inauguration crowd reminded me of some Soviet apparatchik having his loyalty tested to see if he could repeat in public what he knew to be false. It was comical, but also faintly chilling.

More here.

‘COLONEL LÁGRIMAS’ BY CARLOS FONSECA

Diego Azurdia at The Quarterly Conversation:

Colonel revitalizes the notion of a literature that exists for and from the moment of writing, and it avoids the accompanying unchecked optimism in the possibility of transcendence by foregrounding failure. We are reminded of the pure effort, an idea coined by Ortega y Gasset in order to describe Don Quixote’s adventures and King Philip II’s construction of El Escorial. Both gargantuan in dimensions, much like the Colonel’s Vertigos of the Century, they exemplify those projects that foreground will over structure and design, so as to find their justification in effort itself. They are doomed to fail and inevitably result in a sheer state of melancholy, arguably Iberian and Latin American.

I guess it is unavoidable to end with a discussion of the tradition. Latin American writers have often dealt with challenges springing from the historical and problematic relation between cultural production and politics. The place of the intellectual in the continent has been a highly contested matter, and what we see in the Colonel is a kind of embodiment of its different stages: from an “enlightened” academic in its universal (mathematical) labors, to the political subject attempting to participate in the movements of his time, to the forgotten hermit attempting to memorialize his own life. If anything, Colonel is a novel that attempts to work out the difficult question about the place of the contemporary intellectual.

more here.

why time flies

Carlo Rovelli at The New York Times:

Alan Burdick’s “Why Time Flies” certainly does not answer our every question. And precisely for this reason it captures us. Because it opens up a well of fascinating queries and gives us a glimpse of what has become an ever more deepening mystery for humans: the nature of time.

Time may appear unproblematic at first. What is there to say about it? It flies, things happen in the fullness of it, clocks measure it, and we are well aware of its passage. This review shall take you perhaps three minutes to read. Nothing particularly curious about that. But the closer we look, the less clear our temporal sense becomes: First, our brain, body and cells all keep track of time in a variety of ways that are not all that well understood. Psychologists are puzzled by a wealth of experiments showing that we process time in more subtle and complex ways than we expected. Some neuroscientists interpret the brain as a “time machine,” whose core mechanism is to collect past memories in order to predict the future. Philosophers debate the very existence of time. And perhaps most disconcertingly of all, physics teaches us that physical time happens to be astonishingly different from how we intuit it: runs at different speeds, at different altitudes; is distorted by matter; is not organized in a straightforward past, present and future. Advanced tentative theories of the universe even discard temporality altogether from the basic ingredients of the world. From whatever side we address it, the nature of time is a source of perplexity and wonder.

more here.

‘Oliver Goldsmith in Grub Street’ by Norma Clarke

John Mullan at The Guardian:

Oliver Goldsmith has always been a puzzle. So he was to his contemporaries, many of whom found him, as the actor David Garrick put it, “a mixture so odd” of contradictory qualities. Was he brilliant or foolish? The painter Joshua Reynolds recalled that Goldsmith liked to argue from “false authorities” and talk humorous nonsense. Listeners never knew when to take him seriously. He is a puzzle to literary history too: he dabbled in this genre and that, producing no coherent body of work, yet managed to write a handful of small masterpieces.

There is his brilliant comedy of social pretensions and mistaken identities She Stoops to Conquer, almost the only play of the 18th century apart from Sheridan’s work still to be staged and relished. There is his nostalgic, melancholy poem “The Deserted Village”, once a favourite of all poetry anthologists, its quotability adaptable to any political perspective. “Ill fares the land, to hastening ills a prey, / Where wealth accumulates and men decay”. Above all, perhaps, there is The Vicar of Wakefield, one of the most frequently reprinted novels in English. This hilarious pastiche of the Book of Job manages to seem both a deliciously innocent tale and a wicked mockery of sentimentality. In its naive, sententious, oddly endearing narrator, Dr Primrose, Goldsmith created one of the great unreliable narrators of British fiction.

The 100 best nonfiction books: No 53 – The Varieties of Religious Experience by William James (1902)

Robert McCrum in The Guardian:

William James, brother of the more famous Henry, was a classic American intellectual, a brilliant New Englander and renowned pragmatist – a celebrity in his time who coined the phrase “stream of consciousness”. He responded to the cultural and social ferment of the late 19th century with the Gifford lectures, given in Edinburgh during 1900-02. When he turned these talks into a book, James, a Harvard psychologist and the author of The Principles of Psychology, placed himself at the crossroads of psychology and religion to articulate an approach to religious experience that would help liberate the American mind at the beginning of the 20th century from its puritan restrictions by advancing a pluralistic view of belief inspired by American traditions of tolerance. Like his brother, he was obsessed by the problem of expressing individual consciousness through language; this is just one of the principal themes of The Varieties of Religious Experience.

…The idea that all citizens were equally and independently close to God sponsored among the James family the conviction that religious experience should not become confined within the narrow prison of a denomination. The same irreverence towards categories encouraged William James to adopt a high-low style that gives his writing a fresh and populist character that’s rather different from the mature style of his brother the novelist. William used his populism to suggest that any “religious experience” was true if the consequences of holding it were pleasing to the individual concerned. This restatement of the American pursuit of happiness gave his audiences a new appreciation of human dignity grounded in everyday reality.

More here.