Galileo, the fat knight


Galileo was afflicted with a cold and crazy mother—after he made his first telescope, she tried to bribe a servant to betray its secret so that she could sell it on the market!—and some of the chauvinism that flecks his life and his writing may have derived from weird-mom worries. He was, however, very close to his father, Vincenzo Galilei, a lute player and, more important, a musical theorist. Vincenzo wrote a book, startlingly similar in tone and style to the ones his son wrote later, ripping apart ancient Ptolemaic systems of lute tuning, as his son ripped apart Ptolemaic astronomy. Evidently, there were numerological prejudices in the ancient tuning that didn’t pass the test of the ear. The young Galileo took for granted the intellectual freedom conceded to Renaissance musicians. The Inquisition was all ears, but not at concerts. Part of Galileo’s genius was to transfer the spirit of the Italian Renaissance in the plastic arts to the mathematical and observational ones. He took the competitive, empirical drive with which Florentine painters had been looking at the world and used it to look at the night sky.

more from Adam Gopnik at The New Yorker here.

Wednesday Poem

My Heart

She keeps a store for hearts lost and stolen.
Should I lose one I know where to look for it.

Life’s longest epic is a day in love. I gave up
All my cares for one that takes no cure.

In life, she is laid back, in love enigmatic.
What if she plays cool, she is aching for you.

For many years now, I have polished this heart.
I will get its value when she puts a price on it.

I know you like to rub salt in my wounds.
This cauterizes me: what does it do for you?

Ghazal from Ghalib
translation by M. Shahid Alam first appeared in Kenyon Review Online, Winter 2013

A brief history of the Buffalo chicken wing

From Smithsonian:

Few of us realize, though, that less than 50 years ago, wings were considered one of the least desirable cuts of the chicken—a throwaway part often cooked into stock—and “buffalo” was just a wooly ungulate that wandered the Plains. Despite the recency of the invention, the event itself is shrouded in mystery. Nevertheless, there is one thing we know for certain: the “buffalo” in the name definitively refers to the city in Western New York. The most authoritative account is by New Yorker writer Calvin Trillin, who investigated the dish’s history in 1980 as he sampled the city’s most well-regarded wing joints. He presented two competing versions of how a stroke of serendipity led Teressa Bellissimo, proprietor of the Anchor Bar, to invent the dish in 1964. Her husband Frank Bellissimo, who founded the bar with Teressa in 1939, told Trillin that the invention involved a mistake—the delivery of chicken wings, instead of necks, which the family typically used when cooking up spaghetti sauce. To avoid wasting the wings, he asked Teressa to concoct a bar appetizer; the result was the wing we know today.

Dominic—Frank and Teressa’s son, who took over management of the restaurant sometime in the ’70s—told a slightly more colorful tale: It was late on a Friday night in 1964, a time when Roman Catholics still confined themselves to fish and vegetables on Fridays…Some regulars had been spending a lot of money, and Dom asked his mother to make something special to pass around gratis at the stroke of midnight. Teressa Bellissimo picked up some chicken wings—parts of a chicken that most people do not consider even good enough to give away to barflies—and the Buffalo chicken wing was born. Both Frank and Dominic agreed on a few other crucial details—that Teressa cut each wing in half to produce a “drumstick” and a “flat,” that she deep-fried them without breading and covered them in a hot sauce, and that she served them with celery (from the house antipasto) and blue cheese salad dressing. They also both reported that the wings became popular within weeks throughout the city, where they were (and are still) simply called “wings” or “chicken wings.”

More here. (Note: For my brother Bhaisab, who introduced me to them and who enjoyed the wings regularly during his three decades of working next door to the Anchor Bar as a cardiac surgeon at Buffalo General Hospital.)

Laurie Anderson interviews Brian Eno

Laurie Anderson in Interview:

If humans were able to hear light and parse the poetry of the spectrum, then perhaps there would be no need for Brian Eno, who seems to do it effortlessly. While the rest of us are generally content to hear sound, Eno can clearly see it. How else to explain the elaborate sonic color fields and glowing soundscapes that he creates, which feel as much like floating shapes and waves of light as they do music? And how else to make sense of a body of work that has been by turns challenging and definitive and spread across an expanse of disparate worlds and genres, from his early work with Roxy Music, to his ever-evolving solo oeuvre, to the colossal swoosh of his frequent collaborations with U2, to his numerous art projects, compositional gambits, and multimedia installations—not to mention the three ambient-music-generating apps, Scape, Bloom, and Trope, that he has created with musician and software designer Peter Chilvers.

Eno, though, has always had the good taste to never confuse grandeur with bombast. From his first gig with Roxy Music in 1971, it was clear that he was not going to play the role of rock musician as anyone had before. (He often preferred to play off stage, behind the mixing board, processing the sound with a VCS3 synthesizer, even as he sang background vocals.) Eno helped paint David Bowie's late-'70s “Berlin Trilogy” of Low (1977), Heroes (1977), and Lodger (1979) with a sense of minimalist discipline and a saturated electronic whir without sacrificing Bowie's sense of playfulness or his desperately romantic croon. With Eno behind the boards, Talking Heads catapulted from quirky New Wave outfit to mind-blowing world-funk collective, and his later work with Heads leader David Byrne, the co-credited 1981 album My Life in the Bush of Ghosts, successfully traversed a precarious edge of experimental music in its use of unconventional instrumentation and found and sampled sounds. But one of Eno's longest and most sustained creative partnerships has been with U2, which began when he and Daniel Lanois joined forces to produce the Irish foursome's brilliantly elliptical 1984 album The Unforgettable Fire, and continued as the group transformed itself from a club act into a force that could dominate football stadiums, the last great big rock group standing, without ever compromising its creative ambition or sense of spirituality.

More here.

Not Talking About Pakistan

Taymiya R. Zaman in Tanqeed:

I drew a secret line around the borders of Pakistan and rarely stepped over it. In the fall of 2007, I began teaching Islamic history at a small liberal arts college in San Francisco; even though my classes on South Asia and the Middle East could easily have included Pakistan, I made sure to exclude Pakistan from all my syllabi. To avoid ever having to talk about Pakistan, I changed the name of a course a predecessor had titled “History of South and Southeast Asia,” to “Indian Civilizations.” This now meant that the course took a leisurely route through the Indus Valley Civilization, the coming of the Aryans, the spread of Jainism and Buddhism in North India, the rise of the Mughal Empire and concluded with British colonial rule and the formation of India and Pakistan in 1947. But, after an emotionally charged lecture on Partition, I would begin a section on modern India and say nothing of Pakistan after the moment of its creation. My class, “The Modern Middle East,” covered American wars in Afghanistan but my syllabus screeched to a halt at the Pakistan border. Although the country inevitably featured in class discussions about U.S. foreign policy, I assigned no readings on Pakistan. In my other classes, I stayed away from the twentieth century, which meant that the question of Pakistan never arose.

Outside the classroom too, I was something of an expert at not talking about Pakistan.

More here.

The Gatekeepers: Six Israeli security chiefs stun world

Samuel Burke at CNN:

Six former heads of the Shin Bet, Israel’s secretive internal security service, have spoken out as a group for the first time and are making stunning revelations.

The men who were responsible for keeping Israel safe from terrorists now say they are afraid for Israel’s future as a democratic and Jewish state.

Israeli film director Dror Moreh managed to get them all to sit down for his new documentary: “The Gatekeepers.” It is the story of Israel’s occupation of the Palestinian Territories, as told by the people at the crossroads of some of the most crucial moments in the security history of the country.

“If there is someone who understands the Israeli-Palestinian conflict, it’s those guys,” the director told CNN’s Christiane Amanpour.

Against the backdrop of the currently frozen peace process, all six argue – to varying degrees – that the Israeli occupation of Palestinian land is bad for the state of Israel.

The oldest amongst the former chiefs, Avraham Shalom, says Israel lost touch with how to coexist with the Palestinians as far back as the aftermath of the Six Day War of 1967, with the occupation of Gaza and the West Bank, when the country started doubling down on terrorism.

“We forgot about the Palestinian issue,” Shalom says in the film.

A major impediment to a meaningful strategy, they say, is the Jewish extremists inside Israel – people like the Jewish Israeli who assassinated Prime Minister Yitzhak Rabin, or those behind the 1980 plot to blow up the Dome of the Rock Islamic shrine in Jerusalem.

More here.

A Legacy of Dark and Light


Marie Curie’s death was brutal. She had long faced the ravages of extended radiation exposure: fevers, cataracts, respiratory distress, running sores on her hands. Aplastic pernicious anemia finished the job. She died on July 4, 1934, at the age of sixty-seven. … What would Madame Curie have thought of the long-term ramifications of her discoveries? The manifold medical and industrial uses of radioactive materials would have staggered her, in the best way. The atomic scientist Alan E. Waltar’s Radiation and Modern Life: Fulfilling Marie Curie’s Dream (2004) gives an idea of the vast scope of the technology, which is used in increasing crop production, controlling insect pests, sterilizing medical equipment, developing new drugs, medical diagnosis, cancer treatment, nuclear power, purifying cosmetics, testing soil at construction sites with radiation gauges, measuring automotive engine wear, inspecting aircraft welds through radiography, determining rail stresses, radioisotope thermoelectric generators for spacecraft, luminescent exit signs in public buildings, DNA forensics, carbon dating, enhancing the beauty of precious gems, authenticating rare paintings, and on and on.

more from Algis Valiunas at The New Atlantis here.

leaving the witness


I began learning Mandarin Chinese in 2003 through a night class offered at my congregation in Vancouver. I had been a devout Jehovah’s Witness from the time I was a child, and I became a full-time missionary the day I graduated from high school. It was a pretty typical path for a young Witness. Pursuing any kind of career was frowned upon as materialistic and a distraction from what really mattered: preaching. Four days a week, I would put on my modest skirt and practical shoes, fill my briefcase with magazines and other Watchtower publications, and walk to the Kingdom Hall near my home in Kitsilano. I’d meet up with a car group of other Witnesses. We’d then head out to our assigned territory—the affluent neighborhoods of Vancouver’s west side. We would knock on doors—street by street, house by house. Some people would be polite, but most were just annoyed. Once in a while someone would slam the door in my face, or yell. But mostly people didn’t answer. Missionary work wasn’t the easiest in Vancouver.

more from Amber Scorah at The Believer here.

indirectly forthright, demure and definitive at once


Marianne Moore is always hiding in plain sight. She is the paradoxical radical, either distracting the reader from her traditionalism with avant-garde trappings or concealing rebellion in prim camouflage. She picketed for women’s rights and voted for Herbert Hoover. She distrusted the “obscenities” in William Carlos Williams and encouraged the “ability” in Allen Ginsberg. She breathed horror of “a sodomite” to one lesbian friend and signed letters to another “your affectionate albino-dactyl.” Those three-cornered hats and men’s polo shirts: do they reflect an old-fashioned aversion to frippery or an innovative preference for androgyny? And her resolute urban celibacy (she lived in an apartment with her mother): a species of piety or a refusal of stereotypes? Moore’s mix of puritan and progressive seems quintessentially American—alert to the virtues of brown bread and the glories of Brancusi’s sculpture, to Pilgrim’s Progress as well as Ezra Pound. Likewise her get-to-the-point distrust of dreaming: “No wonder we hate poetry,” she writes in “Armor’s Undermining Modesty,” and “stars and harps and the new moon.” When Moore ends that poem on an “imperishable wish,” she means something as solid as the “hard yron” of another of her titles. Moore was indirectly forthright, demure and definitive at once.

more from Siobhan Phillips at Boston Review here.

Tuesday Poem

For All

Ah to be alive
on a mid-September morn
fording a stream
barefoot, pants rolled up,
holding boots, pack on,
sunshine, ice in the shallows,
northern rockies.

Rustle and
shimmer of icy creek waters
stones turn underfoot, small and hard as
cold nose dripping
singing inside
creek music, heart
smell of sun on gravel.

I pledge allegiance

I pledge
allegiance to the soil
of Turtle Island,
and to the beings who thereon dwell
one ecosystem
in diversity
under the sun
With joyful
interpenetration for all.

by Gary Snyder

Pigeons Get a New Look

Carl Zimmer in The New York Times:

In 1855, Charles Darwin took up a new hobby. He started raising pigeons. In the garden of his country estate, Darwin built a dovecote. He filled it with birds he bought in London from pigeon breeders. He favored the fanciest breeds — pouters, carriers, barbs, fantails, short-faced tumblers and many more. “The diversity of the breeds is something astonishing,” he wrote a few years later in “On the Origin of Species” — a work greatly informed by his experiments with the birds. Pigeon breeding, Darwin argued, was an analogy for what happened in the wild. Nature played the part of the fancier, selecting which individuals would be able to reproduce. Natural selection might work more slowly than human breeders, but it had far more time to produce the diversity of life around us. Yet to later generations of biologists, pigeons were of little more interest than they are to, say, New Yorkers. Attention shifted to other species, like fruit flies and E. coli.

…Archaeologists have speculated that rock pigeons flocked to the first farms in the Fertile Crescent in the Middle East, where they pecked at loose grain. Farmers then domesticated them for food. Later, humans bred the birds to carry messages. By the eighth century B.C., Greeks were using pigeons to send the results of Olympic Games from town to town. Genghis Khan used pigeons to create a communication network across his empire in the 12th century A.D. Eventually, people began breeding pigeons simply for pleasure. Akbar the Great, a 16th-century Mughal emperor, always traveled with his personal colony of 10,000 pigeons. He bred some of the birds for their ability to tumble through the air, and others for their extravagant beauty.

More here.

The Science Mystique

by Jalees Rehman

Many of my German high school teachers were intellectual remnants of the “68er” movement. They had either been part of the 1968 anti-authoritarian and left-wing student protests in Germany or they had been deeply influenced by them. The movement gradually fizzled out and the students took on seemingly bourgeois jobs in the 1970s as civil servants, bank accountants or high school teachers, but their muted revolutionary spirit remained on the whole intact. Some high school teachers used the flexibility of the German high school curriculum to infuse us with the revolutionary ideals of the 68ers. For example, instead of delving into Charles Dickens in our English classes, we read excerpts of the book “The Feminine Mystique” written by the American feminist Betty Friedan.

Our high school level discussion of the book barely scratched the surface of the complex issues related to women’s rights and their portrayal by the media, but it introduced me to the concept of a “mystique”. The book pointed out that seemingly positive labels such as “nurturing” were being used to propagate an image of the ideal woman, who could fulfill her life’s goals by being a subservient and loving housewife and mother. She might have superior managerial skills, but they were best suited to run a household and not a company, and she would need to be protected from the aggressive male-dominated business world. Many women bought into this mystique, precisely because it had elements of praise built into it, without realizing how limiting it was to be placed on a pedestal. Even though the feminine mystique has largely been eroded in Europe and North America, I continue to encounter women who cling on to this mystique, particularly among Muslim women in North America who are prone to emphasize how they feel that gender segregation and restrictive dress codes for women are a form of “elevation” and honor. They claim these social and personal barriers make them feel unique and precious.

Friedan’s book also made me realize that we were surrounded by so many other similarly captivating mystiques. The oriental mystique was dismantled by Edward Said in his book “Orientalism”, and I have to admit that I myself was transiently trapped in this mystique. Being one of the few visibly “oriental” individuals among my peers in Germany, I liked the idea of being viewed as exotic, intuitive and emotional. After I started medical school, I learned about the “doctor mystique”, which was already on its deathbed. Doctors had previously been seen as infallible saviors who devoted all their time to heroically saving lives and whose actions did not need to be questioned. There is a German expression for doctors which is nowadays predominantly used in an ironic sense: “Halbgötter in Weiß” – Demigods in White.

Through persistent education, books, magazine and newspaper articles, TV shows and movies, many of these mystiques have been gradually demolished.

Read more »

Ancient Paradoxes and the Good Life

by Scott F. Aikin and Robert B. Talisse

Most are already familiar with many of the thoughts driving the Ancient Paradoxical ethical tradition. Surely we’ve all either thought and endorsed or at least heard someone express thoughts along the following lines:

It’s not whether you win or lose, but how you play the game.

It’s not getting what you want, it’s wanting what you get.

Being good is its own reward.

Let’s first note an important terminological point about the paradoxical tradition. Paradox is a Greek word that, in its classical usage, meant something counterintuitive, something surprising. Para, meaning alongside or against, and doxa, belief. So a paradox is something that runs against what we normally believe. In short, those who belong to the paradoxical tradition say surprising things. Now, the paradox is most clearly in view for us, as we endorse sentiments like It’s not whether you win or lose, it’s how you play the game, but we nevertheless cheer for winners, and nobody makes it to any Halls of Fame simply for being a good sport. The same goes for all the familiar old saws – we think they are right, but nevertheless don’t live in accord with them. The paradoxical tradition is one of consistently living in accord with sentiments like these.

Socrates was one of the first great paradoxicalists, and one of the most famous. One particular paradox he announces after the Athenians sentence him to death for impiety and corrupting the young. He says he does not believe “a good man can be harmed in life or in death” (Apology 41d). And so we have the first of the ancient paradoxes of the good life, call it:

The paradox of invulnerability: Insofar as you are virtuous, you cannot be truly harmed.

Now what makes this view paradoxical is that Socrates says this in the face of a jury that has just sentenced him to death. Having to drink hemlock and suffer its effects sounds like a harm. Dying? It certainly seems worse than living on and being Socrates. How else might someone consider it a punishment?

The paradoxical perspective on this is that these slings and arrows of outrageous fortune would be harms only if they harmed our souls. A death sentence is a harm only if it makes you willing to grovel, lie and cheat to avoid it. Poverty and suffering are harms only if they make you a horrible person, violent, or selfish. Illness is a harm only if it makes you resentful and empty.

The world can destroy us, but if we live well, it cannot destroy the good in us. The world can take the light of goodness inside you only if you let it. Our job is to tend and care for that light of decency and goodness inside us. Virtue ensures it’s not snuffed out.

Read more »

Ecology’s Image Problem

By Liam Heneghan

“There are tories in science who regard imagination as a faculty to be avoided rather than employed. They observe its actions in weak vessels and are unduly impressed by its disasters” —John Tyndall, 1870

In his 1881 essay on Mental Imagery, Francis Galton noted that few Fellows of the Royal Society or members of the French Institute, when asked to do so, could imagine themselves sitting at the breakfast-table from which presumably they had only recently arisen. Members of the general public, women especially, fared much better, being able to conjure up vivid images of themselves enjoying their morning meal. From this Galton, an anthropologist, noted polymath, and eugenicist, concluded that learned men, bookish men, relying as they do on abstract thought, depend on mental images little, if at all.

In this rejection of the scientific role for the imagination Galton was in disagreement with Irish physicist John Tyndall, who in an 1870 address to the British Association in Liverpool entitled The Scientific Use of the Imagination claimed that in explaining sensible phenomena, scientists habitually form mental images of that which is beyond the immediately sensible. “Newton’s passage from a falling apple to a falling moon”, Tyndall wrote, “was, at the outset, a leap of the prepared imagination.” The imagination, Tyndall claimed, is both the source of poetic genius and an instrument of discovery in science.

The role of the imagination in chemistry is well enough known. In 1890 the German Chemical Society celebrated the discovery by Friedrich August Kekulé von Stradonitz of the structure of benzene, a ring-shaped aromatic hydrocarbon. At this meeting Kekulé related that the structure of benzene came to him as a reverie of a snake seizing its own tail (the ancient symbol called the Ouroboros).

Read more »

The Land of Oz

by Misha Lepetic

“I never knew I had an inventive talent until phrenology told me so.
I was a stranger to myself until then.”
~Thomas Edison

The question of expertise is a fascinating and vexatious one. Who gets to be an expert? More accurately, who is allowed to be an expert? And what happens when expertise is, for lack of a more polite term, betrayed by one of its own? A recent New Yorker article pillorying Dr. Mehmet Oz provides some interesting lessons in this regard.

Most expertise, it can be reasonably argued, is cultivated and deployed within the context of occupational professions. For the purposes of creating a baseline for the following discussion, let’s define a profession as “an organized body of experts who apply esoteric knowledge to particular cases.” This is according to Andrew Abbott, whose The System of Professions (1988) is the current sociological heavyweight when it comes to theorizing about professions.

Abbott contends that, in order to theorize this phenomenon effectively, sociology must look at professions in a holistic manner: prior research, which focused on the structure and function of individual professions, missed the larger point that the success or failure of any given profession was largely contingent upon the results of “interprofessional competition.” That is, when considered in isolation, professions make claims concerning their relevance for addressing social needs through the formation of associations, credentialing, the courting of favourable regulation, and so on. However, when viewed as a larger social phenomenon, it is apparent that these claims are subject to constant contention by other professions. There is, in fact, an ecology of professions.


As an example, while one might consider alcoholism to be an objective phenomenon centered around the over-consumption of drink by an individual, the subjective nature of alcoholism as a social phenomenon has been viewed alternatively as a moral or spiritual problem, a medical disease, a legal matter, and as a mental disorder. Respectively then, the responsibility to treat alcoholics was claimed by the clergy, doctors, lawyers and police, and psychiatrists. What is worth noting is that these professions actively partook in poaching the objective phenomenon at hand from one another. When a particular profession failed to deliver results, an opening was created for another group to take over, thereby adding to its social legitimacy and influence.

Read more »

Mandatory flu vaccination and other sick policies

by Quinn O'Neill

If you were a patient in a hospital, whom would you rather have caring for you: a healthy professional or someone who was febrile, coughing, and struggling to stay awake? If you’re like most people, you’d want to be treated by someone who seemed healthy. It's a no-brainer.

Unfortunately, health care professionals have a tendency to show up to work even when they have the flu or a flu-like illness. According to one study, more than 80% of medical practitioners and over 60% of nurses, nurse’s aides, allied professionals, and administrative staff do not routinely take sick leave when experiencing influenza-like illness (ILI).

Though paid sick leave policies aren’t the norm in the US, they can have a huge impact on the spread of disease. A study looking at the 2009 flu pandemic estimated that the absence of paid sick leave could add an additional 5 million cases of ILI in the general population. It's worth noting that ILI may be caused by a wide range of pathogens (in addition to influenza viruses) for which flu vaccination offers no protection.

The issue of sick leave – whether it’s paid or not, and whether or not it’s taken when it should be – is particularly interesting in the context of mandatory vaccination of health care workers. A number of employees were recently fired by an Indiana hospital for refusing the flu shot. This wasn't an isolated incident – 29 hospitals fired unvaccinated workers last year. It seems like a pretty extreme measure. Is it justified? In determining this, we should first consider a couple of other questions.

Read more »