Sibelius and the forest of the mind

Composing music may be the loneliest of artistic pursuits. It is a laborious traversal of an imaginary landscape. Emerging from the process is an art work in code, which other musicians must be persuaded to unravel. Nameless terrors creep into the limbo between composition and performance, during which a score sits mutely on the desk. Hans Pfitzner dramatized that moment of panic and doubt in “Palestrina,” his 1917 “musical legend” about the life of the Italian Renaissance master. The character of Palestrina speaks for colleagues across the centuries when he stops his work to cry, “What is the point of all this? Ach, what is it for?”

The Finnish composer Jean Sibelius may have asked that question once too often. The crisis point of his career arrived in the late nineteen-twenties and the early thirties, when he was being lionized as a new Beethoven in England and America, and dismissed as a purveyor of kitsch in the tastemaking European music centers, where atonality and other modern languages dominated the scene.

more from The New Yorker here.

12 is the igibum, 5 the igum

The quintic is the Snark of mathematics. It was hunted across Europe until it was finally killed off by a 26-year-old Norwegian called Niels Abel, who starved to death shortly after. But the quintic was a Boojum, you see. Abel proved that, unlike the equations that had gone before, it has no general solution. The reason why this is the case, as the French student Évariste Galois showed, is infinitely more important than the failure of the result. A day after he wrote down the explanation for this boojumish fact, he was shot dead, in a duel, aged twenty-one.
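
To make the reviewer’s claim precise (a gloss of my own, not part of the review): what Abel proved, and what Galois later explained, concerns solvability by radicals, not the existence of roots. A minimal statement, with the familiar quadratic formula for contrast:

```latex
% Abel–Ruffini theorem (added gloss; not part of the review).
% Every quintic has five complex roots, but the general equation
\[
  x^5 + a_4 x^4 + a_3 x^3 + a_2 x^2 + a_1 x + a_0 = 0
\]
% admits no solution by radicals: no formula built from the coefficients
% $a_0, \dots, a_4$ using arithmetic and $n$-th roots yields the roots for
% every choice of coefficients, unlike the quadratic case
\[
  x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a} \qquad \text{for } ax^2 + bx + c = 0.
\]
% Galois's explanation: the symmetry group of the five roots of the general
% quintic is $S_5$, which is not a solvable group.
```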

Historians of mathematics are always complaining that mathematicians are a dry and uninteresting lot; but it’s not so. Algebra has been powered by numerous astonishing characters and absurd situations. The beautiful virgin Hypatia, the first known woman mathematician (there are only three, in this book), was pulled from her chariot by an enraged mob and had her flesh scraped from her bones with oyster shells. (Women and algebra have not always been kind to each other. George Boole, who developed an algebraic system for logic, died because his wife threw buckets of icy water over him when he was in bed with a chill.) Alexandre Grothendieck is the most recent curious fellow: in his prime he knocked down policemen and won the top mathematics prize, the Fields Medal. Now he lives in total retirement in the Pyrenees, pondering how to survive on dandelion soup.

more from Literary Review here.

benjamin’s final trip

On her return she told Birman that she’d heard a ‘loud rattling from one of the neighbouring rooms’. Birman went to investigate and found Benjamin ‘in a desolate state of mind and in a completely exhausted physical condition’. He told her he could not go back to the border and would not move out of the hotel. She said there was no alternative and he disagreed: ‘He hinted that he had some very effective poisonous pills with him. He was lying half naked in his bed and had his very beautiful big golden grandfather watch with open cover on a little board near him, observing the time constantly.’ This ‘big golden grandfather watch’ was perhaps a pocket watch; and if so, surely the one he’d consulted earlier in the day to ration the pauses during his heroic, debilitating ascent. Birman told him about the attempted bribe and urged him to hold off. ‘He was very pessimistic’ and thought the odds were way too long. A little later, Henny Gurland came into the room and Birman left. There were several visits by a local doctor who bled the patient and administered injections, but if Birman was aware of this, she doesn’t say so. She takes it to be a clear case of suicide. ‘The next morning,’ she writes, ‘we heard that he had succeeded and was no more amongst us.’

more from the LRB here.

Woody Allen, Chick Magnet

Mark Warren in Esquire:

True story: It was sophomore year, in the cafeteria at Ross S. Sterling High School in Baytown, Texas. It was Tuesday, I remember that. I had a beat-up copy of Getting Even, by Woody Allen, that I’d been flashing around, trying to impress a girl named Priscilla, who was sitting across from me finishing her Frito pie. She was small and beautiful. I was desperate for her to notice me, in the way sophomores are desperate, which always results in some stupid operatic gesture. Getting Even had become kind of a religious text to me. How had this book gotten to southeast Texas? Only I understood it. Anyway, I sat there facing Priscilla, the book wide open so that she couldn’t possibly miss it. Nothing. I leaned in across the table and asked her if she knew who Woody Allen was. She didn’t. What happened next is a blur, but I’m pretty sure I stood up, said something like, “A reading from the book of Woody Allen,” and then gave a dramatic oration of “Death Knocks,” in which Death comes for Nat Ackerman, but Ackerman beats Death at gin rummy and gets to live longer, forcing Death to look for a hotel. About halfway through, Priscilla got up, said that I was “retarded,” and walked out of my life.

Now comes Mere Anarchy, Allen’s first new essay collection in 25 years. I’m real happy to say it’s funny in the same way that Getting Even was funny when I was 16.

More here.

Karaoke 24/7: The intoxicating appeal of singing online

Michelle Tsai in Slate:

I’ve always felt uncomfortable living my life online. I have a MySpace profile, but it’s empty. I don’t blog. And I won’t post pictures on Flickr if they feature me or anyone I know. But recently, I learned that I’m not completely opposed to Internet exhibitionism. When it comes to online karaoke, I’m a microphone-hogging fame whore.

SingShot is, basically, a social network for people who think they can carry a tune. When I logged on for the first time, I found a karaoke sanctum where fanatics gushed over one another’s songs, made friends (Hi, Vanee!), and thanked their fans with bizarre, New Age-y monologues. The site also tracks each song’s vitals—how many times it’s recorded, which members sang their own versions, and how those renditions were rated.

More here.

Uri Geller Runs Afoul of YouTube Users

Paul Elias in ABC News:

Uri Geller became a 1970s superstar and made millions with an act that included bending spoons, seemingly through the power of his own mind.

Now, the online video generation is so bent out of shape over the self-proclaimed psychic’s behavior that he’s fast reaching the same Internet pariah status as the recording and movie industries.

Geller’s tireless attempts to silence his detractors have extended to the popular video-sharing site YouTube, landing him squarely in the center of a raging digital-age debate over controlling copyrights amid the massive volume of video and music clips flowing freely online.

Geller’s critics say he and others are abusing a federal law meant to protect against online copyright infringement, and that YouTube and other Web sites are not doing enough to combat frivolous claims.

At issue is the Digital Millennium Copyright Act, or DMCA, which makes it easy for Geller and others to persuade Internet companies to remove videos and music simply by sending so-called takedown notices that claim copyright ownership. Most companies, including YouTube, do almost nothing to investigate the claims.

More here.

REGARDING A NEW HUMANISM

Salvador Pániker at Edge.org:

In 1959, C. P. Snow gave a famous lecture at Cambridge entitled “The Two Cultures and the Scientific Revolution”, lamenting the academic and professional scission between the field of science and that of letters. In 1991, the literary agent John Brockman popularized the concept of the third culture, to refer to the dawning of the scientist-writer, and hence, the birth of a new humanism. A humanism no longer bound to the classical sense of the term, but instead a new hybridization between the sciences and the humanities.

As far as philosophy is concerned, this new humanism should be aware not only of the latest in the sciences but also of as many tendencies of contemporary thought as possible. This means that philosophy should not remain shut up in a professional academic department, but should instead take part in an interdisciplinary intersection, “in conversation”—as the recently deceased Richard Rorty would say—with all the other sciences. Philosophy needs to trace the maps of reality. The philosopher is, in the words of Plato, “he who possesses a vision of the whole (synoptikos)”: as such, he is the one who organizes what is most relevant in the “stored information” (culture) and sketches out new world views (provisional, but coherent). Moreover, the initial intuition of the analytic philosophers—who were the first to point out the importance of avoiding the traps set by language—should not be thrown out altogether.

More here.

Food Science

In Cornell’s food science department, chemists, engineers, and microbiologists are working on tomorrow’s hot products, coming soon to a supermarket near you.

Beth Saulnier in Cornell Alumni Magazine:

With nutrition guidelines in constant flux, products jockeying for space on supermarket shelves, and the supply chain now thoroughly global (Hotchkiss notes that today’s undergrads can’t imagine not having Chilean grapes in January), food science is big business. The Institute of Food Technologists, the industry’s professional organization, has 22,000 members, many of whom show up for its popular Annual Meeting & Food Expo. “The application of science to food is a huge thing,” Hotchkiss says. “People don’t advertise this very much, in part because the food industry wants you to think that elves make cookies. But I’ll tell you: I’ve been to a cookie factory, and cookies are made so fast you can’t even see them go by.”

Not everyone is enthusiastic about what happens in food science labs, or in the production facilities that employ so few Keebler elves. In a January cover story in the New York Times Magazine entitled “Unhappy Meals,” best-selling author Michael Pollan condemned food scientists as creators of nutritionally unsound products that have uncoupled Americans from the benefits and pleasures of natural food. (Pollan’s most recent book, The Omnivore’s Dilemma, laments “our national eating disorder” and is highly critical of industrial food production.) “Scientists operating with the best of intentions, using the best tools at their disposal, have taught us to look at food in a way that has diminished our pleasure in eating it,” Pollan writes, “while doing little or nothing to improve our health.” Among Pollan’s credos: avoid anything highly processed, and don’t eat anything your great-great-grandmother wouldn’t recognize.

More here.

sehnsuchtsland [land of longing]

Does the Orient dream of the Orient too? Or is this a speciality of an Occident weary of civilisation and reason, which has long projected its needs for mysterious and meditatively abstemious serenity onto the eastern regions of the world from North Africa to China? In the Romantic period, India was singled out as the ultimate “fernwehland” [a land of Fernweh, the longing for faraway places; the antonym of Heimweh, homesickness], the opposite pole to the goal-oriented rationality and early capitalism of Europe, although – or perhaps because – its admirers never went there. Novalis, Jean Paul and Goethe needed no more than the Bhagavad Gita and the Upanishads to believe in and feed the myth of Arcadian Indian wholeness. Today the huge success of the Ayurveda and yoga travel industry suggests that the myth lives on.

But what does an aristocratic Muslim woman from an Emirate dynasty, a Sheikha, think about such European dreams and longings? This question arose when I found out that the Arabic translation of my first novel, “Sister and Brother”, which deals with a journey to India and its psychological consequences, had landed in the hands of the Sheikha Shamma.

more from Sign and Sight here.

pynchon’s paranoia

There’s a longing at the heart of Against the Day, a tortured desire to redeem and amend—the theme is taken up as vengeance but played out as nostalgia. Order is never restored in Pynchon’s universe, though things change: an old enemy dies ignominiously at the hands of his bodyguard, an assassin is taken unawares, third parties do away with a traitorous spy. No one takes much pleasure in these messy ends—death comes too quickly to afford the living any satisfaction. The final pages of the novel offer a frazzled sentimental tale of coupling and growing old, where antique outlaws are domesticated and matters come more or less right only in the way they go more or less wrong. The idea of time travel, though lugged in for laughs, suggests a hankering to go back and fix things (in science fiction, the theme usually turns into tragic farce—tragedy if you like science fiction, farce if you don’t). Yet when men arrive from some indefinite future, fleeing some unimaginable global catastrophe, they seem only to want to be left alone, the most pitiable of refugees.

more from VQR here.

shelley: alone, being, in utter Being

No poet was more maligned in death. To Matthew Arnold, Percy Bysshe Shelley was an “ineffectual angel, beating his luminous wings in vain”, while, in his hideous sculpture at University College, Oxford, Edward Onslow Ford portrayed him as a naked corpse on the shores of Viareggio. Shelley would have hated both depictions: no poet of the Romantic period was more intent on altering the political and physical structure of the world through an active engagement with it.

There have been some excellent biographies of Shelley, each of its time and with its own agenda: Shelley as belle-lettrist, Shelley as revolutionary, Shelley as hippy. As a result, he remains elusive – the haziest, most evanescent of the Romantics. Even those who tried to paint him from life found that his face “could never be fixed on paper”.

more from the Telegraph UK here.

A Wave of Neologisms Accepted by Webster’s

In the NYT:

No matter how odd some of the words might seem, the dictionary editors say each has the promise of sticking around in the American vocabulary.

“There will be linguistic conservatives who will turn their nose up at a word like ‘ginormous,’” said John Morse, Merriam-Webster’s president. “But it’s become a part of our language. It’s used by professional writers in mainstream publications. It clearly has staying power.”

One of those naysayers is Allan Metcalf, a professor of English at MacMurray College in Jacksonville, Ill., and the executive secretary of the American Dialect Society.

“A new word that stands out and is ostentatious is going to sink like a lead balloon,” he said. “It might enjoy a fringe existence.”

But Merriam-Webster traces ginormous back to 1948, when it appeared in a British dictionary of military slang. And in the past several years, its use has become, well, ginormous.

Window of Possibility: Why one particular photograph should be in every classroom in the world

From Orion Magazine:

We call our galaxy the Milky Way. There are at least 100 billion stars in it and our sun is one of those. A hundred billion is a big number, and humans are not evolved to appreciate numbers like that, but here’s a try: If you had a bucket with a thousand marbles in it, you would need to procure 999,999 more of those buckets to get a billion marbles. Then you’d have to repeat the process a hundred times to get as many marbles as there are stars in our galaxy.

That’s a lot of marbles.

So. The Earth is massive enough to hold all of our cities and oceans and creatures in the sway of its gravity. And the sun is massive enough to hold the Earth in the sway of its gravity. But the sun itself is merely a mote in the sway of the gravity of the Milky Way, at the center of which is a vast, concentrated bar of stars, around which the sun swings (carrying along Earth, Mars, Jupiter, Saturn, etc.) every 230 million years or so. Our sun isn’t anywhere near the center; it’s way out on one of the galaxy’s minor arms. We live beyond the suburbs of the Milky Way. We live in Nowheresville.

But still, we are in the Milky Way. And that’s a big deal, right? The Milky Way is at least a major galaxy, right?

Not really. Spiral-shaped, toothpick-shaped, sombrero-shaped—in the visible universe, at any given moment, there are hundreds of thousands of millions of galaxies. Maybe as many as 125 billion. There very well may be more galaxies in the universe than there are stars in the Milky Way.

So. Let’s say there are 100 billion stars in our galaxy. And let’s say there are 100 billion galaxies in our universe. At any given moment, then, assuming ultra-massive and dwarf galaxies average each other out, there might be 10,000,000,000,000,000,000,000 stars in the universe.  That’s 1.0 X 10 to the twenty-second power.  That’s 10 sextillion. Here’s a way of looking at it: there are enough stars in the universe that if everybody on Earth were charged with naming his or her share, we’d each get to name a trillion and a half of them.
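
The arithmetic here is easy to verify. Below is a minimal back-of-the-envelope check in Python (a sketch of my own, not from the article); the world-population figure of roughly 6.6 billion is an assumption for 2007 and does not appear in the original:

```python
# Back-of-the-envelope check of the figures quoted in the excerpt above.
# Assumption (not in the article): world population of about 6.6 billion, c. 2007.

marbles_per_bucket = 1_000
buckets_needed = 1_000_000      # 1,000 marbles x 1,000,000 buckets = 1 billion marbles
repeats = 100                   # repeat 100 times to reach the galaxy's star count

stars_in_milky_way = marbles_per_bucket * buckets_needed * repeats
assert stars_in_milky_way == 100_000_000_000          # 100 billion stars

galaxies = 100_000_000_000                            # the article's round figure
stars_in_universe = stars_in_milky_way * galaxies
assert stars_in_universe == 10**22                    # 10 sextillion, i.e. 1.0 x 10^22

world_population = 6_600_000_000                      # assumed value
print(f"Stars to name per person: {stars_in_universe / world_population:.2e}")
# -> about 1.5e+12, i.e. roughly a trillion and a half stars each
```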

More here.

Wimbledon 2007

On Saturday, Venus Williams, last year’s leading spokeswoman for equal prize money, fittingly won the first Wimbledon women’s singles title since the success of that campaign.  The match was a lopsided one, with Williams too powerful and too fast for Frenchwoman Marion Bartoli to make an impression other than one of courage against all hope of winning.  Afterwards, Venus paid tribute to Billie Jean King for her efforts in making tennis the most visible and well-remunerated women’s sport in the world.  All to the good, and hopefully her graciousness will quiet some of the detractors who seem to surround the Williams sisters, who are, after all, perhaps the most exceptional story in sports, even a decade after their emergence on the pro tour.

Sunday, Roger Federer earned his (equal) slice of the pie, winning a stunningly well-played and epic five-set Wimbledon final against Rafael Nadal.  Together, these two men occupy a plateau far, far above the rest, and Nadal is the only man alive who can test Federer.  This picture (you may want to click on it to enlarge) should tell you how draining the effort of fending him off once again was for the Rajah, as his fans call him.  Notice anything strange about it?

Answer after the jump…

Love on Campus

Why we should understand, and even encourage, a certain sort of erotic intensity between student and professor.

William Deresiewicz in The American Scholar:

The absentminded professor, that kindly old figure, is long gone. A new image has taken his place, one that bespeaks not only our culture’s hostility to the mind, but also its desperate confusion about the nature of love.

Look at recent movies about academics, and a remarkably consistent pattern emerges. In The Squid and the Whale (2005), Jeff Daniels plays an English professor and failed writer who sleeps with his students, neglects his wife, and bullies his children. In One True Thing (1998), William Hurt plays an English professor and failed writer who sleeps with his students, neglects his wife, and bullies his children. In Wonder Boys (2000), Michael Douglas plays an English professor and failed writer who sleeps with his students, has just been left by his third wife, and can’t commit to the child he’s conceived in an adulterous affair with his chancellor. Daniels’s character is vain, selfish, resentful, and immature. Hurt’s is vain, selfish, pompous, and self-pitying. Douglas’s is vain, selfish, resentful, and self-pitying. Hurt’s character drinks. Douglas’s drinks, smokes pot, and takes pills. All three men measure themselves against successful writers (two of them, in Douglas’s case; his own wife, in Daniels’s) whose presence diminishes them further. In We Don’t Live Here Anymore (2004), Mark Ruffalo and Peter Krause divide the central role: both are English professors, and both neglect and cheat on their wives, but Krause plays the arrogant, priapic writer who seduces his students, Ruffalo the passive, self-pitying failure. A Love Song For Bobby Long (2004) divides the stereotype a different way, with John Travolta as the washed-up, alcoholic English professor, Gabriel Macht as the blocked, alcoholic writer.

More here.

ON SUICIDE BOMBING

G. Sampath reviews Talal Asad's book on suicide bombing in The Hindu:

This is one book you may want to avoid reading on a plane. Its title is On Suicide Bombing. And the author is a Muslim, with an Arab name: Talal Asad.

I came to it via a lecture by the American philosopher Judith Butler. Her subject was ‘the human condition’. She talks about the questions Asad poses in his book: Can suicide bombing be thought? What resources do we need in order to think it? I was intrigued enough by Butler’s remarks to get a copy of the book.

Asad is an anthropologist by training. As an Arab Muslim in American academia, he is uniquely placed to offer an anthropological perspective on the discourse of terrorism in liberal democracies. ‘On Suicide Bombing’ is a collection of lectures he delivered in 2006. It has three chapters: ‘Terrorism’, ‘Suicide terrorism’ and ‘Horror at suicide terrorism’.

Asad begins with the most spectacular instance of suicide terrorism in recent history, the September 11, 2001, attack in the U.S., which sparked worldwide outrage, and rightly so. The mass killing of innocents is simply wrong and condemnable. There is nothing to debate here.

Nonetheless, Asad wants us to temporarily reserve our judgment, so that we can arrive at an understanding of the moral ground from which we pass judgment.

More here.

Parting the Veil

Now is no time to give up on supporting democracy in the Middle East. But to do so, the United States must embrace Islamist moderates.

Shadi Hamid in Democracy: A Journal of Ideas:

America’s post-September 11 project to promote democracy in the Middle East has proven a spectacular failure. Today, Arab autocrats are as emboldened as ever. Egypt, Jordan, Tunisia, and others are backsliding on reform. Opposition forces are being crushed. Three of the most democratic polities in the region, Lebanon, Iraq, and the Palestinian territories, are being torn apart by violence and sectarian conflict.

Not long ago, it seemed an entirely different outcome was in the offing. As recently as late 2005, observers were hailing the “Arab spring,” an “autumn for autocrats,” and other seasonal formulations. They had cause for such optimism. On January 30, 2005, the world stood in collective awe as Iraqis braved terrorist threats to cast their ballots for the first time. That February, Egyptian President Hosni Mubarak announced multi-candidate presidential elections, another first. And that same month, after former Lebanese Prime Minister Rafiq Hariri was killed, Lebanon erupted in grief and then anger as nearly one million Lebanese took to the streets of their war-torn capital, demanding self-determination. Not long afterward, 50,000 Bahrainis (one-eighth of the country’s population) rallied for constitutional reform. The opposition was finally coming alive.

But when the Arab spring really did come, the American response provided ample evidence that while Arabs were ready for democracy, the United States most certainly was not.

More here.

Berlin’s art scene, once wild and free, is increasingly commercialized

Jeffrey Fleishman in the Los Angeles Times:

There’s a Starbucks near where the orgies used to be, and although the aura of Bohemia is distinct, things aren’t as unhinged as they were 17 years ago when punkers, pornographers, anarchists, squatters and artists of all persuasions landed amid the rust and drizzle of this liberated city.

It seems an era from a scrapbook, a time of cheap rents when everyone with a brush and a bit of brio claimed a garret. Some were talented; many were not. But they roamed the east side of a fallen wall, scavenging ideas and materials to make art and revive a naughty, creative spirit that resided here before decades of fascism and the Cold War.

The zeitgeist these days is more commercial. Galleries serve sushi amid prattle about hedge funds and economic indexes. Berlin has become a production center for works sold from Portugal to Dubai. Rents are going up. The dilettantes have departed. The foreign purveyors have nestled in. What remains is less the innocent verve of the past than an atmosphere that — although aesthetically adventurous and more open to experimentation than in most cities — has matured with a shrewd eye toward marketing.

More here.

The History Boys

In the twilight of his presidency, George W. Bush and his inner circle have been feeding the press with historical parallels: he is Harry Truman—unpopular, besieged, yet ultimately to be vindicated—while Iraq under Saddam was Europe held by Hitler. To a serious student of the past, that’s preposterous. Writing just before his untimely death, David Halberstam asserts that Bush’s “history,” like his war, is based on wishful thinking, arrogance, and a total disdain for the facts.

David Halberstam in Vanity Fair:

We are a long way from the glory days of Mission Accomplished, when the Iraq war was over before it was over—indeed before it really began—and the president could dress up like a fighter pilot and land on an aircraft carrier, and the nation, led by a pliable media, would applaud. Now, late in this sad, terribly diminished presidency, mired in an unwinnable war of their own making, and increasingly on the defensive about events which, to their surprise, they do not control, the president and his men have turned, with some degree of desperation, to history. In their view Iraq under Saddam was like Europe dominated by Hitler, and the Democrats and critics in the media are likened to the appeasers of the 1930s. The Iraqi people, shorn of their immensely complicated history, become either the people of Europe eager to be liberated from the Germans, or a little nation that great powerful nations ought to protect. Most recently in this history rummage sale—and perhaps most surprisingly—Bush has become Harry Truman.

More here.  [Thanks to Akbi Khan.]