Seduction and Power: Cathedrals and Capitalism

Alexander Strecker in lensculture:


In his two-part series, “Seduction” and “Meeting”, Cyril Porchet examines the nature of spectacle, comparing Baroque churches to the modern-day corporate shareholder meeting. Porchet created “Seduction” by photographing from the choirs of Baroque churches across Europe. In person, these places feel enormous. And indeed, Porchet’s prints are giant, 1.2 m x 1.6 m, allowing the viewer to fall into the details almost as if they were present at the churches themselves. But Porchet’s works are not mere re-creations. His photographs transform these massive spaces into a series of flat, but still nearly infinite, displays of extravagance. Although these churches, as physical structures, took decades to build and remain with us today, they were also spectacles, designed to elicit a much more immediately felt sense of wonder and piety.

More here.

This “Death Watch” Allegedly Counts Down the Last Seconds of Your Life

Tuan C. Nguyen in Smithsonian:

There’s now a watch that reminds us of the one appointment that we won’t be able to cancel. It’s called the Tikker. And it counts down the minutes, and even seconds, we have before we will likely meet our demise. Currently being sold on the crowdfunding website Kickstarter, the concept for a so-called “death watch” isn’t as morbidly depressing as it may appear on the surface. In fact, the watch’s creator, Fredrik Colting, believes his invention does exactly the opposite, inspiring and motivating people to “live better.” For Colting, the cold finality of death only fully set in when his grandfather passed away several years ago. Tikker was born out of his desire to use this acceptance to spur positive changes in one’s life. “It’s my belief that if we are aware of death, and our own expiration,” says Colting, “that we will have a greater appreciation for life.”

To arrive at an estimate of how much longer someone has to live, users fill out a questionnaire designed to add or subtract years based on current age, exercise habits and other health-related factors. That exact time can then be programmed into the watch, at which point the final countdown begins. However, the method by which Tikker calculates each person’s individualized expiration date is superficially scientific at best. Though the use of so-called longevity calculators has gained some credibility among researchers, some experts, such as actuary Steve Vernon of the Stanford Center on Longevity, have warned that people shouldn’t rely too much on these kinds of approximations, since there’s a “50 percent chance you’ll live beyond this estimate.” As an example of how inexact these kinds of formulas are, Vernon tested several popular online calculators, including one from the Northwestern Mutual Life Insurance Company; his results were 95, 101 and 95.6 years. In any case, it’s probably best not to view this generated date as a hard deadline. Instead, Colting says, the notion of a “use by” time stamp is meant to have a more symbolic meaning, serving as a practical reminder to pay heed to such often-echoed existential epiphanies as “Carpe Diem!” and “You only live once!”
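The arithmetic behind such a countdown is easy to sketch. Here is a minimal, purely illustrative version in Python; the baseline and the adjustment values are invented for the example, not Tikker’s actual questionnaire weights:

```python
from datetime import timedelta

# Hypothetical baseline life expectancy, in years -- not Tikker's real figure.
BASELINE_YEARS = 80

def estimate_years_left(age, exercises_weekly, smoker):
    """Add or subtract years from a baseline expectancy, questionnaire-style."""
    expectancy = BASELINE_YEARS
    if exercises_weekly:
        expectancy += 3    # invented adjustment
    if smoker:
        expectancy -= 10   # invented adjustment
    return max(expectancy - age, 0)

def seconds_remaining(age, exercises_weekly, smoker):
    """Convert the estimate into the countdown the watch would start from."""
    years = estimate_years_left(age, exercises_weekly, smoker)
    return int(timedelta(days=365.25 * years).total_seconds())

# A 40-year-old non-smoker who exercises gets 43 estimated years,
# which the watch would display as a countdown in seconds.
print(seconds_remaining(40, exercises_weekly=True, smoker=False))
```

The watch then simply decrements this figure once per second, which is why Vernon’s caveat matters: the countdown looks precise to the second while resting on an estimate that is uncertain by years.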

More here.

Sunday Poem

In Plain Sight

When my son closes his eyes
and cups his hands over them,
he thinks he's hiding from me.

It doesn't work that way, I say.
You can't see me, but I can
still see you. He repeats

his Morse Code of peeking
and closing his eyes
a few more times in a mix

of shyness and regret.
I finally get it out
he didn't want me home yet,

he had five more math problems,
and hoped to be finished
before I walked in with the world.

I'm glad it's only math, I say.
Here, maybe this will help,
and cover my eyes,

announce I'm not home,
wishing I were seven again,
living in plain sight.

wanting to promise him
a simple plan, to show
how to live with all illusions intact,

living inside my dreams,
instead of dying inside my life.

by Jim Gwyn
from The Red Wheelbarrow 6

Love in the Gardens

Zadie Smith in the New York Review of Books:

When my father was old and I was still young, I came into some money. Though it was money “earned” for work done, it seemed, both to my father and me, no different than a win on the lottery. We looked at the contract more than once, checking and rechecking it, just like a lottery ticket, to ensure no mistake had been made. No mistake had been made. I was to be paid for writing a book. For a long time, neither of us could work out what to do about this new reality. My father kept on with his habit of tucking a ten- or twenty-pound note inside his letters to me. I took the rest of my family (my parents having separated long before) to a “resort” back in the “old country” (the Caribbean) where we rode around bored in golf carts, argued violently, and lined up in grim silence to receive a preposterous amount of glistening fruit, the only black folk in line for the buffet.

It took a period of reflection before I realized that the money—though it may have arrived somewhat prematurely for me—had come at the right time for my father. A working life launched when he was thirteen, which had ended in penury, old age, and divorce, might now, finally, find a soft landing. To this end, I moved Harvey from his shabby London flat to a cottage by the sea, and when the late spring came we thought not of Cornwall or Devon or the Lake District but of Europe.

Outrageous thought! Though not without precedent. The summer before I went to college, my father, in his scrupulous way, had worked out a budget that would allow the two of us to spend four days in Paris. Off we went. But it is not easy for a white man of almost seventy and a black girl of seventeen to go on a mini-break to Europe together; the smirks of strangers follow you everywhere.

More here.

mary beard confronts the classics

Joanna Scutts at The Washington Post:

The essays that make up most of “Confronting the Classics” are as much about what happens in the gap between antiquity and modernity as they are about the ancient works of art, literature, history and architecture themselves. Beard has no time for misty-eyed idealizing of the culture of Greece and Rome (or the era when it was more widely studied), beyond what it may reveal about those doing the looking back. Classical study has been lamented as “in decline” since at least Thomas Jefferson, as she points out, and such laments “are in part the expressions of the loss and longing and the nostalgia that have always tinged classical studies.” The scholars whose books she praises most highly are therefore those who do not try to smooth over the gaps in the record but to jump into them and peek about.

In the past few years, Beard has become the public face of modern classical scholarship in Britain, fronting BBC television shows about Pompeii and Caligula. Her high profile has made her the target of some vicious online attacks, which she has discussed in a blog she writes for the Times Literary Supplement. In light of her experience, her examinations of the misogyny and mythmaking surrounding powerful women such as Boadicea, Cleopatra and Augustus’s wife Livia are particularly trenchant.

more here.

Skull of Homo erectus throws story of human evolution into disarray

Ian Sample at The Guardian:

The spectacular fossilised skull of an ancient human ancestor that died nearly two million years ago has forced scientists to rethink the story of early human evolution.

Anthropologists unearthed the skull at a site in Dmanisi, a small town in southern Georgia, where other remains of human ancestors, simple stone tools and long-extinct animals have been dated to 1.8m years old.

Experts believe the skull is one of the most important fossil finds to date, but it has proved as controversial as it is stunning. Analysis of the skull and other remains at Dmanisi suggests that scientists have been too ready to name separate species of human ancestors in Africa. Many of those species may now have to be wiped from the textbooks.

The latest fossil is the only intact skull ever found of a human ancestor that lived in the early Pleistocene, when our predecessors first walked out of Africa. The skull adds to a haul of bones recovered from Dmanisi that belong to five individuals, most likely an elderly male, two other adult males, a young female and a juvenile of unknown sex.

more here.

the death of letter writing

Andrew Hill at the Financial Times:

For more than 200 years from its beginnings in the 1770s, the Dead Letter Office was where Americans’ letters and parcels were sent if they were unclaimed or undeliverable. Some items were redirected: the DLO had a “blind reading” department trained to decipher illegible or vague addresses (“To my Son he lives out West he drives a red ox the rale rode goes By Thar”). The office would incinerate the others or auction their contents, which included, according to one sale list, anything from wedding rings to “False Bosoms” and quack medicines, such as “the cure-all Tennessee Swamp Shrub”. It was estimated that 6bn pieces of mail were posted in the US in 1898, of which 6.3m ended up at the DLO in Washington, DC. “What romance was to be had in an undelivered or undeliverable letter!” Simon Garfield writes in To The Letter. “And what mystery and sadness too.”

Well, the romance and mystery have certainly gone. The US Postal Service has renamed the DLO the Mail Recovery Center, consolidated four locations into one in Atlanta, Georgia, and is pushing through a “Lean Six Sigma” process improvement project to make it more efficient. Asked if they write letters, most people would echo the DLO’s famous fictional former clerk Bartleby in the Herman Melville story: “I would prefer not to.”

more here.

Saturday Poem

Yangon 2010

Parents and guardians jostle over the quality of education available to their children, it’s been observed. People work on mutual trust and understanding; it is important to avoid abuses, it’s been observed. Black (e)mailing and threats that undermine the social integrity and propriety of girls online are commonplace, it’s been observed. Certain teenagers now patronise and pirate barnyard videos and obscene images, it’s been observed. Youths are now overtly into neoteric fashion and luxury items, it’s been observed. Those about to leave for jobs overseas need to systematically prep themselves in every possible way before departure, it’s been observed. Documents and certificates are forged, bogus offices are set up to con the people, it’s been observed. The anti-plastic ecological campaign has not really caught on among the general populace, it’s been observed. To minimise the risk of molestation on public transport most ladies now choose special bus services, it’s been observed. The unemployment rate is on the increase and salaries are on the decrease, compared to last year; this has put migrant workers from rural areas in reverse immigration, it’s been observed. Movies that feature pleasant backdrops with an array of actors and comedies that poke fun of people tripping over, falling face down in the cow dung, are increasingly popular among the Yangon film buffs, it’s been observed.

by Maung Yu Py
from Bones Will Crow: Fifteen Contemporary Burmese Poets
publisher: Arc, Todmorden, UK, 2012
translation: ko ko thett and James Byrne

A psychological history of the NSA

Joe Kloc in The Daily Dot:

Before the NSA came to life on the eve of Dwight Eisenhower's election, its job was done by a loose group of three independent intelligence outfits in the Army, Navy, and Air Force. The groups came into their own during World War II, as Washington began to see that significant signals intelligence, or SIGINT, could be invaluable in wartime: On the Western front, British cryptographer Alan Turing’s work breaking the German Enigma cipher had made it possible to decode German communications in the run-up to the Allied invasion of Normandy. On the Pacific front, U.S. intelligence became so crucial that Admiral Chester Nimitz said SIGINT deserved credit for the Allied victory during the Battle of Midway.

But just as the American SIGINT program’s successes came into focus during the war, so did its weaknesses. The three groups, two of which were run out of separate, converted women's schools, often viewed one another as competitors. At one point, the Army and Navy went so far as to divide up intelligence work based on whether the day of the month was odd or even. (NSA historian Thomas Johnson would describe this peculiar practice as a “Solomonic Solution.”) The British government—in many ways superior in those days in terms of intelligence gathering—would later liken dealing with the American intelligence community to dealing with the colonies after the Revolutionary War.

More here.

My love affair with Scarlett O’Hara

Hannah Betts in The Telegraph:

New Zealand Cousin Carol entered my life when I was a gauche eight-year-old, she a barely less gauche (being from NZ) 22. She stayed with my family at weekends, the two of us becoming unlikely room-mates. At her hands I was initiated in such feminine arcana as scouring my face off with a Buf-Puf (tick), giggling about boys (fail), and the merits of sexy underwear (happily then something of a pass). On one score, however, she proved the oracle. During one of our late-night conferences, she handed me a heavy tome, with a cover of a dancing girl in a jet crinoline. She was the most ravishing woman I had ever seen. “This is something you will need,” NZCC intoned, the weight of womanhood behind her. “Read it and you will understand why.” In one bound, I had been both Scarletted and Viviened.

…One learns that great love may be merely an idea – and a wrong idea, at that. Scarlett World exemplifies the principle that no one can rescue you but yourself: that life means taking responsibility for – then clawing one’s way out of – a series of monumental mistakes; tomorrow, as ever, being another day. Better be all woman than some sappy lady. Not all heroines are blonde. Charm is more important than beauty (as the novel’s opening line asserts: “Scarlett O’Hara was not beautiful, but men seldom noticed it when caught by her charm…”) Flirting can be a girl’s most powerful weapon, for which the right outfit is key, even when one has to drag down one’s late mother’s curtains. Bosoms may indeed be strategically bared before noon.

More here.

His Subject, Himself: Claudia Roth Pierpont’s ‘Roth Unbound’

Martin Amis in The New York Times:

American anti-Semitism, which was running high throughout the 1930s, steadily increased after the onset of war. During the entire period, polls showed, well over a third of the populace stood ready to back discriminatory laws. Nor was this a mere offshoot of the general xenophobia spawned by isolationism. Every synagogue in Washington Heights was desecrated (and some were smeared with swastikas); in Boston, beatings, wreckings and defilements had become near-daily occurrences by 1942. The disgraceful fever, which ruled out all but a trickle of immigration and so cost countless lives, reached its historic apogee in 1944 — by which time the Holocaust was more or less complete. And what of the media? News of the killings emerged in May/June 1942: a verified report with a figure of 700,000 already dead. The Boston Globe gave the story the three-column headline “Mass Murders of Jews in Poland Pass 700,000 Mark,” and tucked it away at the foot of Page 12. The New York Times quoted the report’s verdict — “probably the greatest mass slaughter in history” — but gave it only two inches. We may venture to say that such reticence is slightly surprising, given that the historiography of the events outlined above now runs to many tens of thousands of volumes.

Philip Roth would use this soiled and feckless backdrop in “The Plot Against America” (2004), his 26th book; but ­anti-Semitism and its corollary, anti-anti-­Semitism, wholly dominated the publication of his first, “Goodbye, Columbus and Five Short Stories” (1959). “What is being done to silence this man?” asked a rabbi. “Medieval Jews would have known what to do with him.” Roth’s cheerful debut, some thought, shored up the same “conceptions . . . as ultimately led to the murder of six million in our time.” So he wasn’t only contending with a “rational” paranoia; he was also ensnared in the ongoing anguish of comprehension and absorption, as the sheer size of the trauma inched into consciousness. After a hate-filled public meeting at Yeshiva University in New York in 1962, Roth solemnly swore (over a pastrami sandwich) that he would “never write about Jews again.”

It was a hollow vow.

More here.

Restoring F. P. Ramsey


David Papineau reviews Margaret Paul's Frank Ramsey (1903-1930): A Sister’s Memoir, in the TLS:

F. P. Ramsey has some claim to be the greatest philosopher of the twentieth century. In Cambridge in the 1920s, he singlehandedly forged a range of ideas that have since come to define the philosophical landscape. Contemporary debates about truth, meaning, knowledge, logic and the structure of scientific theories all take off from positions first defined by Ramsey. Equally importantly, he figured out the principles governing subjective probability, and so opened the way to decision theory, game theory and much work in the foundations of economics. His fertile mind could not help bubbling over into other subjects. An incidental theorem he proved in a logic paper initiated the branch of mathematics known as Ramsey theory, while two articles in the Economic Journal pioneered the mathematical analysis of taxation and saving.

Ramsey died from hepatitis at the age of twenty-six in 1930. For some geniuses, an early death accelerates the route to canonization. But for Ramsey it had the opposite effect. Ramsey’s death coincided with Ludwig Wittgenstein’s return to Cambridge after his reclusive years in the Austrian Alps. The cult surrounding Wittgenstein quickly caught fire, and for the next fifty years dominated philosophy throughout the English-speaking world. By the time it subsided, Ramsey had somehow been relegated to a minor role in history, a footnote to an archaic Cambridge of Russell, Keynes and the Bloomsbury set.

More here.

The Essence of the Japanese Mind: Haruki Murakami and the Nobel Prize


Amanda Lewis in the LA Review of Books:

BRITISH ODDSMAKERS Ladbrokes gave 64-year-old Japanese novelist Haruki Murakami a 3-1 chance of winning the 2013 Nobel Prize in Literature. Last year, they had it at 8-1. He didn’t get it this year, losing to Alice Munro, but he has a good shot in 2014 or 2015. If he wins, he’ll be the third Japanese writer to receive the prize, but he is nothing like the two men who came before him.
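For readers unused to bookmakers’ notation, fractional odds of the form payout-to-stake convert to an implied probability of stake / (payout + stake). A quick sketch of the conversion for the odds quoted above:

```python
def implied_probability(payout, stake=1):
    """Convert fractional odds (payout-to-stake) to the implied probability."""
    return stake / (payout + stake)

# Ladbrokes' quoted odds for Murakami:
print(implied_probability(3))  # 3-1: an implied 25% chance
print(implied_probability(8))  # 8-1: an implied chance of about 11%
```

So the move from 8-1 to 3-1 roughly doubled the bookmakers’ implied confidence in a Murakami win.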

The first — Yasunari Kawabata, in 1968 — was cited by the Nobel Committee “for his narrative mastery, which with great sensibility expresses the essence of the Japanese mind.” The second — Kenzaburō Ōe, in 1994 — once wrote that, “The role of literature, insofar as man is obviously a historical being, is to create a model of a contemporary age which encompasses past and future, a model of the people living in that age as well.”

Murakami, however, does not write to capture “the essence of the Japanese mind,” and his characters are not meant to be “a model” of any era or group.

“I fully believe it is the novelist's job to keep trying to clarify the uniqueness of each individual soul by writing stories,” he said when accepting the Jerusalem Prize in 2009, rejecting the conventional notion of Japan as not only racially homogenous but also somehow intellectually and emotionally the same from mind to mind, from prefecture to prefecture.

In fact, from his debut in 1979 until the early 1990s, Murakami wrote bestselling Japanese novels that were almost aggressively un-Japanese. And the Tokyo literati, particularly Ōe, hated him. Hated that his magical tales dripped with brand names and references to American pop culture. Hated that he wrote his first novel out in English before rewriting it in his native language. Hated that he never wrote about war.

More here.

A Philosophy of Tickling


Aaron Schuster in Cabinet magazine:

Aristotle famously defined man as the rational animal (zoon echon logon), and as the political animal (zoon politikon). But there are also passages in his work that indicate another less remarked upon, though no less profound, definition. In Parts of Animals, he writes: “When people are tickled, they quickly burst into laughter, and this is because the motion quickly penetrates to this part, and even though it is only gently warmed, still it produces a movement (independently of the will) in the intelligence which is recognizable. The fact that human beings only are susceptible to tickling is due (1) to the fineness of their skin and (2) to their being the only creatures that laugh.”1 Perhaps this notion of the “ticklish animal” was further elaborated in the second book of the Poetics, the lost treatise on comedy; indeed, the relationship between ticklish laughter and comic laughter remains an open question. Should tickling be investigated under the heading of comedy or of touch? Touch, Aristotle argues, is the most primary sense, and human beings are uniquely privileged in possessing the sharpest sense of touch thanks to the delicate nature of their skin. Though other animals have more advanced smell or hearing, “man’s sense of touch … excels that of all other animals in fineness.”2 We might view tickling as a side effect of the hyper-sensitivity of human touch. Our peculiar vulnerability to tickling is the price to be paid for more sophisticated and discriminating access to the world.

Why do people laugh when tickled? Why can’t you tickle yourself? Why are certain parts of the body more ticklish than others? Why do some people enjoy tickling and others not? And what is tickling, after all? In fact, it is not so easy to pin down the phenomenon, as a number of scientific studies attest. Here’s one attempt at a precise description: “Tickle may thus be finally defined as an intensely vivid complex of unsteady, ill-localized and ill-analyzed sensation, with attention distributed over the immediate sensory contents and the concomitant sensations reflexly evoked.”

More here.

Why Scandinavian Prisons Are Superior

Doran Larson in The Atlantic:

It’s a postcard-perfect day on Suomenlinna Island, in Helsinki’s South Harbor. Warm for the first week of June, day trippers mix with Russian, Dutch, and Chinese tourists sporting sun shades and carrying cones of pink ice cream.

“Is this the prison?” asks a 40-something American woman wearing cargo pants and a floral sleeveless blouse.

Linda, my guide and translator, pauses beside me between the posts of an open picket fence. After six years of teaching as a volunteer inside American prisons, I’ve come from the private college where I work to investigate the Scandinavian reputation for humane prisons. It’s the end of my twelfth prison tour, and I consider the semantics of the question: If you can’t tell whether you’re in a prison, can it be a prison? I’ve never considered this in so many words. Yet I find that I know the answer, having felt it inside a prison cell in Denmark: There is no punishment so effective as punishment that nowhere announces the intention to punish. Linda is an intern working on a degree in public policy. Young and thoroughly practical, she smiles and says to the tourists, “Yes, you are here.”

The Americans look shocked and afraid. The father glances at his wife. The wife cocks her head back, as though she’s ventured too far. The son—fit as my lacrosse-playing students—takes a step in reverse, just outside the gate, and says to his mother: “I told you.” (Linda clearly wonders what she’s said to cause such a reaction.)

Then the son adds, his voice cracking on a nervous attempt at sarcasm: “It’s sure reassuring to know we’re being protected from criminals.”

More here.

Physics: What We Do and Don’t Know


Steven Weinberg in the NYRB:

The modern era of scientific cosmology began forty-eight years ago, with the accidental discovery of the cosmic microwave background radiation. So much for the steady state cosmology—there was an early universe. This microwave radiation has been under intensive study since the mid-1960s, both from unmanned satellites in orbit and from large ground-based radio telescopes. Its present temperature is now known to be 2.725 degrees Centigrade above absolute zero. When this datum is used in calculations of the formation of the nuclei of atoms in the first three minutes after the start of the big bang, the predicted present abundance of light elements (isotopes of hydrogen, helium, and lithium) comes out pretty much in agreement with observation. (Heavier elements are known to be produced in stars.)

More important than the measurement of the precise value of the temperature is the discovery in 1992 that the temperature of the microwave radiation is not the same throughout the sky. There are small ripples in the temperature, fluctuations of about one part in a hundred thousand. This was not entirely a surprise. There would have to have been some such ripples, caused by small lumps in the matter of the early universe that are needed to serve as seeds for the later gravitational condensation of matter into galaxies.

These lumps and ripples are due to chaotic sound waves in the matter of the early universe. As long as the temperature of the universe remained higher than about 3,000 degrees, the electrons in this hot matter remained free, continually scattering radiation, so that the compression and rarefaction in the sound waves produced a corresponding variation in the intensity of radiation. We cannot directly see into this era, because the interaction of radiation and free electrons made the universe opaque, but when the universe cooled to 3,000 degrees the free electrons became locked in hydrogen atoms, and the universe became transparent. The radiation present at that time has survived, cooled by the subsequent expansion of the universe, but still bearing the imprint of the sound waves that filled the universe before it became transparent.
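Two of the numbers quoted above can be checked with one-line calculations: the size of a one-part-in-a-hundred-thousand fluctuation on a 2.725-degree background, and the factor by which the universe has expanded since it became transparent (the radiation temperature falls inversely with the universe's scale factor $a$):

```latex
\Delta T \approx \frac{2.725\ \mathrm{K}}{10^{5}} \approx 2.7 \times 10^{-5}\ \mathrm{K},
\qquad
\frac{a_{\mathrm{now}}}{a_{\mathrm{then}}}
  = \frac{T_{\mathrm{then}}}{T_{\mathrm{now}}}
  \approx \frac{3000\ \mathrm{K}}{2.725\ \mathrm{K}}
  \approx 1100.
```

That is, the ripples being mapped are a few hundred-thousandths of a degree, imprinted when the universe was roughly a thousandth of its present linear size.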

More here.

Why we need fairytales: On Oscar Wilde

Jeanette Winterson in The Guardian:

The work Wilde is remembered for was written over a period of less than 10 years. The Happy Prince and Other Tales was published in 1888. That volume marks the beginning of Wilde's true creativity. He had lectured extensively in the US – but that would not have won him any lasting legacy, any more than his journalism or his poems. He had published a great many poems, but Wilde was a bad poet – he rarely found the right words and he was old-fashioned. Read him next to Emily Dickinson, Walt Whitman or WB Yeats, and you will see for yourself. We don't read his poetry now – it is dated and dead; too much Arcady and Hellenic Hours. The early plays suffer from the same verbal excess. Wilde at his worst wrote in purple. At his best he is dazzling. The birth of his children seems to have regenerated Wilde as a writer. The tedious Hellenism vanished. The purple-isms faded. There are still overwritten images – Dawn's grey fingers clutching at the stars – and he never gives up his fondness for a biblical moment, usually appearing as precious stones or pronouns (thee and thy), but his style did change. The writing became freer and sharper, and also more self-reflective, without being self-absorbed.

Academic criticism of Wilde's work has too often dismissed these fairy stories as a minor bit of sentimentalism scribbled off by an iconoclast during a temporary bout of babymania. But since JK Rowling's Harry Potter and Philip Pullman's His Dark Materials, children's literature has been repositioned as central, not peripheral, shifting what children read, what we write about what children read, and what we read as adults. At last we seem to understand that imagination is ageless. Wilde's children's stories are splendid. In addition, it seems to me that they should be revisited as a defining part of his creative process.

More here.