A Haiku Primer and Sea Slugs

by Olivia Zhu

A month ago, I wrote about haikus that wended their way through the Internet and found me. Really, truly—I hadn’t gone looking for them, had read them casually, and thought I might forget them. “And yet, and yet…” Chiyo-ni’s and Issa’s haikus kept nagging at me, so I looked for the translations that spoke most to me and wrote about them. Their mark lingers still, though, and when I passed by a used copy of Harold G. Henderson’s An Introduction to Haiku: An Anthology of Poems and Poets from Bashō to Shiki in a bookstore, I had to pick it up.

Relatively speaking, it’s quite an old book on haiku. Published in 1958, it now appears to be out of print—so I really felt quite fortunate stumbling across it as a clueless neophyte. It’s so old that inflation dictated I pay almost double the $1.45 listed on the duck-decorated paperback cover, even in the book’s well-loved condition—still well worth it, of course. From what little I’ve read online, it seems likely that Henderson contributed to the initial interest in haiku in America, as he helped found the Haiku Society of America and published some of the earliest English-language works on Japanese haiku.

Read more »

Are American Professors More Responsive to Requests Made by White Male Students?

by Jalees Rehman

Less than one fifth of PhD students in the United States will be able to pursue tenure-track academic faculty careers once they graduate from their program. Reduced federal funding for research and dwindling institutional support for tenure-track faculty are some of the major reasons why there is such an imbalance between the large number of PhD graduates and the limited availability of academic positions. Upon completing the program, PhD graduates have to consider non-academic job opportunities in industry, government agencies, and non-profit foundations, but not every doctoral program is equally well suited to prepare its graduates for such alternate careers. It is therefore essential for prospective students to carefully assess the doctoral program they want to enroll in and the primary mentor they would work with. The best approach is to proactively contact prospective mentors, meet with them, and learn about the research opportunities in their group, but also to discuss how completing the doctoral program would prepare them for their future careers.

The vast majority of professors will gladly meet a prospective graduate student and discuss research opportunities as well as long-term career options, especially if the student requesting the meeting clarifies its goal. However, there are cases when students wait in vain for a response. Did the email never reach the professor, lost in the internet ether or a spam folder? Was the professor simply too busy to respond? A research study headed by Katherine Milkman from the University of Pennsylvania suggests that the lack of response may in part be influenced by the perceived race or gender of the student.

Read more »

Creative Receptivity

by Dwight Furrow

Goldsworthy, Maple Leaves Arrangement

There is an ingrained set of assumptions and attitudes about creativity in the arts that harms our understanding of art and ultimately human existence. That is the idea of the artist as a relatively unconstrained maker, a fashioner ex nihilo who brings something new into being solely through the force of her imagination and capacity for self-expression. We might contrast this with an older view of art perhaps best expressed by this quote attributed to Michelangelo:

“In every block of marble I see a statue as plain as though it stood before me, shaped and perfect in attitude and action. I have only to hew away the rough walls that imprison the lovely apparition to reveal it to the other eyes as mine see it.”

On the view expressed by Michelangelo, an artist is like a skillful craftsperson who attends to the inherent qualities of a piece of raw material, its shape, grain, texture, or color, and then decides what she can do with it. Art is too varied and complex to wholly fit either description, both of which are drawn too starkly, but I want to make the case that Michelangelo's view has more to recommend it than first meets the eye.

Aesthetic appreciation is often described in terms of adopting an aesthetic attitude, a state of mind in which one attends sympathetically and with focused attention to the aesthetic features of objects. Part of that aesthetic attitude is a willingness to be receptive to what is in the work, to refrain from imposing preconceptions on it, to let the work speak for itself. The viewer or listener must open herself up to being moved by the work and to discover all there is to be discovered in it. As important as this attitude of openness and receptivity is to appreciating art, it would be exceedingly odd if this aesthetic attitude was not also part of the process of creating the work. But if we take this receptive attitude seriously it shows the limitations of our assumptions about artists as ultimate masters.

Read more »

On Saving Nuance

by Shadab Zeest Hashmi

We are at the Original Pancake House. Walking through these double doors, a life-size mirror is always the first to meet me, but I am never shocked by the reflection as I am now. Does the wallpaper seem more ivory than before? Are the white tablecloths casting a glare? Am I more brown? I catch several glances in the mirror, of breakfast-eaters looking up to see who has entered. Complexion is the new currency. The election season of 2016 has ended with rising fear and fading nuance. Shades of skin are suddenly coining new dialects of silence. Language, as we know it, is shedding gradations— those gray areas where subtlety of thought resides; nuance is becoming obsolete like discarded snakeskin, but unlike snakeskin, it will not regenerate on its own.

My husband and I discuss the menu in our usual English mixed with Urdu; he orders a German pancake and a vegetarian omelet. Years ago, when we came here for the first time, he declared that the German pancakes at this restaurant were authentic and reminded him of his mother. His mother is originally from Berlin where she met and married his father (a student from Pakistan) after she converted to Islam in the ‘60s. They moved to Karachi a few years later and have lived there ever since.

Read more »

Frank Ramsey (1903–1930): A Sister’s Memoir

Ray Monk in the New York Review of Books:

“Well, God has arrived. I met him on the 5:15 train.”

Thus, in the New Year of 1929, was Ludwig Wittgenstein’s return to Cambridge announced by John Maynard Keynes in a letter to his wife, Lydia Lopokova. Wittgenstein had previously been at Cambridge before World War I as a student of Bertrand Russell, but had acquired his godlike status through the publication after the war of his first and only book, Tractatus Logico-Philosophicus, which was very quickly recognized as a work of genius by philosophers in both Cambridge and his home city of Vienna. Wittgenstein himself was initially convinced that it provided definitive solutions to all the problems of philosophy, and accordingly gave up philosophy in favor of schoolteaching. In 1929, however, he returned to Cambridge to think again about philosophical problems, having become convinced that his book did not, in fact, solve them once and for all.

What drew him back to Cambridge was not the prospect of working again with Russell, who by this time (having been stripped of his fellowship at Trinity College, Cambridge, because of his opposition to World War I) was a freelance journalist, a political activist, and only intermittently a philosopher. Rather, it was the opportunity of working with Frank Ramsey, the man who had persuaded him of the flaws in the Tractatus. Most significantly, Ramsey had shown that the account Wittgenstein gives of the nature of logic in the Tractatus could not be entirely correct.

More here.

How, in an age in which “the fast eat the slow,” has Thomas Friedman not been gobbled up?

Belén Fernández in Jacobin:

The late Alexander Cockburn, reflecting on the work of decorated New York Times foreign affairs columnist and neoliberal warmonger extraordinaire Thomas Friedman, once observed: “Friedman’s is an industrial, implacable noise, like having a generator running under the next table in a restaurant. The only sensible thing to do is leave.”

But while generators at least serve a rather obvious function, the same can’t usually be said of Friedman, who has just spewed out his latest unnecessarily humongous book Thank You for Being Late: An Optimist’s Guide to Thriving in the Age of Accelerations.

In the nearly eight hundred pages that comprise my electronic version of the manuscript, there is approximately one glimmer of hope: the point at which Friedman remarks that this is “maybe my last book.”

The title Thank You for Being Late is a reference to Friedman’s realization that when his Washington, DC breakfast companions are a few minutes tardy, he can use the time not only to people-watch and eavesdrop on neighboring conversations but also to have ideas. Who knew?

More here.

How Do Gravitational Waves Escape From A Black Hole?

Ethan Siegel in Forbes:

Perhaps the greatest discovery of all announced in 2016 was the direct detection of gravitational waves. Even though they had been predicted by Einstein's general theory of relativity 101 years prior, it took the development of a laser interferometer sensitive to ripples in space that would displace two mirrors separated by multiple kilometers by less than 10^-19 meters, or 1/10,000th the width of a proton. This finally came to pass during LIGO's 2015 data run, and two bona fide black hole-black hole merger events unambiguously popped out of the data. But how does physics actually allow this? Mārtiņš Kalvāns wants to know:

This question has puzzled me for a long time. Articles about LIGO discovery state that some percentage of black hole merger mass was radiated away, leaving [a] resulting black hole smaller than [the] sum of [the] original mergers. Yet it is accepted that nothing escapes black holes […] So my question is: how was energy radiated from black hole mergers?

This is a really deep question, and goes straight to the heart of black hole physics and general relativity.
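For scale, a quick check of the displacement figure quoted in the excerpt, assuming a proton diameter of roughly 10^-15 meters (the excerpt does not state the exact value it uses):

10^-19 m ÷ 10^-15 m = 10^-4 = 1/10,000

so a mirror displacement below 10^-19 meters is indeed about one ten-thousandth of a proton's width.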

More here.

How Stigma Sows Seeds of Its Own Defeat: Defending the liberal project is a Sisyphean task in part because successfully inculcating liberal norms leads to habits that weaken the ability to sustain them

Conor Friedersdorf in The Atlantic:

In the Western world, the percentage of people who say that it is essential to live in a democracy is in precipitous decline. In the United States, only 19 percent of millennials agree that it would be illegitimate for the military to take control of government. The president-elect routinely speculates about authoritarian policies, like stripping citizenship from those who burn the American flag in protest.

During a bygone crisis in global politics, when the liberal order was under sustained attack, Friedrich Hayek published this diagnosis of the challenge before liberals:

If old truths are to retain their hold on men’s minds, they must be restated in the language and concepts of successive generations. What at one time are their most effective expressions gradually become so worn with use that they cease to carry a definite meaning. The underlying ideas may be as valid as ever, but the words, even when they refer to problems that are still with us, no longer convey the same conviction; the arguments do not move in a context familiar to us; and they rarely give us direct answers to the questions we are asking. This may be inevitable because no statement of an ideal that is likely to sway men’s minds can be complete: it must be adapted to a given climate of opinion, presuppose much that is accepted by all men of the time, and illustrate general principles in terms of issues with which they are concerned.

The passage resurfaced this week when Will Wilkinson, in-house philosopher at the Niskanen Center, cited it to suggest that the Sisyphean task of saving liberalism is now ours, the boulder at our feet, the struggle of the hill looming once again.

“If the old truths are not updated for each new age, they will slip from our grasp and lose our allegiance,” he wrote. “The terms in which those truths have been couched will become hollow, potted mottoes, will fail to galvanize, inspire, and move us. The old truths will remain truths, but they’ll be dismissed and neglected as mere dogma, noise. And the liberal, open society will again face a crisis of faith.”

Across the Western world, liberals are grappling with how to execute that project. And while I have no pat answer, I do see an obstacle to success that’s worth understanding.

More here.

Sunday Poem

On Auden’s Musée des Beaux Arts

Not for me so much do I care
what it means—
the parent smiling while
her child’s skating,
cutting figure eights over
a pond’s ice,
veil between two worlds.
One- a world to laugh & breathe in.
The other, you drown in.

Or, seeing something
fall from the sky—
who speaks for
him or her that never grew wings
or simply dreamed the possible—
mention the torturer’s horse
casually scratching its ass,
see how quickly one’s thoughts turn
soft & nuzzy.
Now is the time
to further expand the metaphor:
Off goes the gilded Jolly Roger,
a smiley face
o’er its skull & bones.
At the tiller, a pirate steering,
tacking further, each instant
more distant from those
casually orphaned of human love and care.
See how it gathers speed with all the available air.

Walter Burnham

____________________________________________
Auden's Musée des Beaux Arts

Fake news, a fake president and a fake country: Welcome to America, land of no context

Andrew O'Hehir in Salon:

How much of the “news” is fake? How much of reality is “real”? After an election cycle driven by lies, delusions and propaganda — including lies about lies, multiple layers of fake news and meta-fake news — we are about to install a fake president, elected by way of the machineries of fake democracy. The country that elected him is fake too, at least in the sense that the voters who supported Donald Trump largely inhabit an imaginary America, or at least want to. They think it’s an America that used to exist, one they heard about from their fathers and grandfathers and have always longed to go back to. It’s not. Their America is an illusion that has been constructed and fed to them through the plastic umbilicus of Fox News and right-wing social media to explain the anger and disenfranchisement and economic dislocation and loss of relative privilege they feel. All of which are real, if not necessarily honorable; it represents the height of liberal uselessness to keep on quarreling about whether Trump’s fabled “white working class” suffers real economic pain or is just a cesspool of racism. That argument is really about other things, to be sure: It’s about whether the Democratic Party — whose long-promised era of permanent demographic hegemony and middle-class multiculturalism keeps being delayed into the indefinite future, defeat after defeat after defeat — requires a major reconstruction or just a little cosmetic surgery. Meanwhile, out in the pseudo-reality of Trumpian America, racial resentment and economic suffering are so profoundly intertwined that there’s no way to disentangle them. Arguments that the so-called left should pretty much ignore the deplorables who keep on voting against their own interests, or should abandon “identity politics” in quest of some middle-road economic populism that blends Bill Clinton and FDR, are both missing the point.

In a nation where a candidate who won the popular vote by roughly 2.5 million did not win the election, we are no longer dealing with reality, at least as it used to exist.

Hillary Clinton was the ultimate Establishment candidate facing the ultimate outsider, and also a quintessential old-media personality facing a veritable Voldemort of social media. Given that, she came pretty damn close to pulling it off. But Clinton was also a candidate from reality facing a shimmering celebrity avatar, a clownish prankster who took physical form in our universe but who could say anything and do anything because he was self-evidently not real. That disadvantage proved impossible to overcome. Furthermore, Trump’s supporters may be delusional and misguided, but they aren’t half as dumb as they often look to “coastal elites.” Many of them understood, consciously or otherwise, that his incoherent promises could not be taken literally and that his outrageous personality did not reflect the realm of reality. They were sick of reality, and you can’t entirely blame them. For lots of people in “middle America” (the term is patronizing, but let’s move on) reality has been so debased, or so much replaced, as to seem valueless.

More here.

How Louisa May Alcott won the hearts of generations to come

Christina Beck in The Christian Science Monitor:

One hundred and eighty-four years ago today, a literary giant was born to a small, struggling family in Pennsylvania. Yet within just a few decades, Louisa May Alcott won herself both a reputation and the hearts and minds of generations with her prose. Google Doodle creator Sophia Diao decided to depict Ms. Alcott with her three sisters in commemoration of her birthday and her most beloved work, “Little Women.” Alcott was born in 1832, the daughter of prominent (but impecunious) Transcendentalist intellectual Amos Bronson Alcott. Mr. Alcott moved his family around frequently, finally settling in Concord, Mass., in 1840 when Louisa was eight years old. In Concord, the Alcotts found, if not earthly wealth, then a bounty of friends and intellectual sparring partners. Prominent Transcendentalists and New England intellectuals Ralph Waldo Emerson and Henry David Thoreau also lived in town, as did Margaret Fuller and Nathaniel Hawthorne. In the midst of this intellectual bounty, however, the Alcotts continued to struggle financially, forcing Louisa to take jobs as a school teacher and a seamstress. As abolitionists and New Englanders, the Alcotts supported the North during the American Civil War, and Louisa took her commitment to the cause a step further when she served as a nurse in Union hospitals.

Her experiences during the Civil War inspired her first book, “Hospital Sketches,” which won her some small notice in the literary world. “Hospital Sketches,” although rarely read today, served to highlight Louisa’s writing talent, and attracted the notice of a publisher. That publisher met with her father, and the two men struck a deal, urging Louisa to write a book for young girls. Contingent on Louisa’s writing was a book contract for her father. At first, Louisa was reluctant to write the book, saying that it was not her preferred type of writing. Yet her manuscript, called “Little Women,” which drew on her own experiences with her three sisters, quickly became a massive success, finally lifting the Alcotts out of their longstanding poverty.

More here.

400 Years of Jerusalem Culture

Ruby Namdar at The New York Times:

Jerusalem’s veneer of harmony, tolerance and inclusiveness is as thin and as alluring as the fine layer of gold covering the gray lead dome standing on the top of the contested Temple Mount, or the Haram al-Sharif as it is called in Arabic. Timeless conflict brews under the beautiful surface of the sacred city, whose many names are yet another manifestation of the continuing rivalry around the “ownership” of its holy sites and symbolic history. The controversial resolution passed by Unesco in October, which attempted to classify the Western Wall, one of Judaism’s holiest sites, as a part of the Muslim Al Aqsa Mosque compound, is yet another step in this long tradition of conflict.

This everlasting rivalry, paradoxically, has only enhanced the beauty and cultural richness of Jerusalem. The various churches and mosques, each competing to have the tallest structure in the sacred city, have invested a fortune in building magnificent minarets and bell towers meant to own the Jerusalem skyline. The warring Christian sects, in their struggle to dominate the sacred Church of the Holy Sepulcher (a struggle that at times has led to almost comical fistfights among priests, monks and ministers), have made great efforts to enhance their part of the space and make it outshine the others. In the same tradition, the great resources allocated by the Israelis to restore and celebrate Jerusalem’s Jewish past have made it an attractive destination for travelers from all around the world.

more here.

the almost-forgotten Kathleen Collins

Justin Taylor at The LA Times:

Kathleen Collins was a professor of film history at New York’s City College who made a groundbreaking contribution to the subject that she taught. “Losing Ground” (1982), which Collins wrote and directed, was one of the first feature-length dramas made by an African American woman. Collins, who was also an activist and playwright, never got the chance to make another film. She died in 1988, at age 46, after a bout with breast cancer — a life, and a life’s work, cut brutally short.

“Losing Ground” is the story of a marriage in crisis and an intimate portrait of the black creative class in New York in the 1970s. Sarah, a promising young academic, is married to Victor, an older and somewhat louche painter who has just made his first major sale to a museum. (Notably, his work is acquired not by an American institution but by the Louvre.) To celebrate, they rent a summer house in a majority-Puerto Rican community in the Hudson Valley, where Victor becomes smitten with the local culture (and a local woman) while Sarah starves for intellectual and emotional attention, until one of her students asks her to come back to the city to star in a film of his.

Richard Brody, writing in the New Yorker this past spring, called “Losing Ground” “a nearly lost masterwork” and noted ruefully that “[h]ad it screened widely in its time, it would have marked film history.”

more here.

Julian Barnes admits he was wrong about EM Forster

Julian Barnes at The Guardian:

Sometimes you change your mind about a writer. Perhaps, when you first read them you were only pretending to admire what you’d been told to admire. But also your tastes change. For instance, at 25 I was more open to writers telling me how to live and how to think; by 65 I had come to dislike didacticism. I don’t want to be told how to think and how to live by, say, Bernard Shaw, or D H Lawrence or the later Tolstoy. I don’t like art – especially theatrical art – whose function seems to be to reassure us that we are on the right side. Sitting there complacently agreeing with a playwright that war is bad, that capitalism is bad, that bad people are bad. “You don’t make art out of good intentions,” is one of Flaubert’s wiser pronouncements.

Sometimes, when our tastes become more defined, they become narrower. But this doesn’t have to be the case. I want to address a rarer changing of the mind, which is altogether more enriching: when a writer you had previously been indifferent to, indeed actively despised, suddenly makes sense to you, and you realise – with, yes, a kind of joy – that at last you see the point of them.

I first read EM Forster when an English master handed out a list of Great Books to be read one summer holiday. A Passage to India was on that list. I still have the Penguin edition – a reprint of 1960, costing three shillings and sixpence – in which I read the novel. There are no notes in the margin, not a single cry of “Irony!” It clearly made little impression on me. Later, of my own volition, when I was about 20, I read A Room With a View, and actively began to take against Forster.

more here.

technology is diminishing us

Jonathan Safran Foer in The Guardian:

The first time my father looked at me was on a screen, using technology developed to detect flaws in the hulls of ships. His father, my grandfather, could only rest his hand on my grandmother’s belly and imagine his infant in his mind. But by the time I was conceived, my father’s imagination was guided by technology that gave shape to sound waves rippling off my body. The Glasgow-based Anglican obstetrician Ian Donald, who in the 1950s helped bring ultrasound technology from shipyard to doctor’s office, had devoted himself to the task out of a belief that the images would increase empathy for the unborn, and make women less likely to choose abortions. The technology has also been used, though, to make the decision to terminate a pregnancy – because of deformity, because the parent wants a child of a certain sex. Whatever the intended and actual effects, it is clear that the now iconic black and white images of our bodies before we are born mediate life and death. But what prepares us to make life-and-death decisions? My wife and I debated learning the sex of our first child before birth. I raised the issue with my uncle, a gynaecologist who had delivered more than 5,000 babies. He was prone neither to giving advice nor anything whiffing of spirituality, but he urged me, strongly, not to find out. He said, “If a doctor looks at a screen and tells you, you will have information. If you find out in the moment of birth, you will have a miracle.”

I don’t believe in miracles, but I followed his advice, and he was right. One needn’t believe in miracles to experience them. But one must be present for them. Psychologists who study empathy and compassion are finding that, unlike our almost instantaneous responses to physical pain, it takes time for the brain to comprehend “the psychological and moral dimensions of a situation”. Simply put, the more distracted we become, and the more emphasis we place on speed at the expense of depth – redefining “text” from what fills the hundreds of pages of a novel, to a line of words and emoticons on a phone’s screen – the less likely and able we are to care. That’s not even a statement about the relative worth of the contents of a novel and a text, only about the time we spend with each.

More here.

The Philosopher of Failure: Emil Cioran’s Heights of Despair

Costica Bradatan in the Los Angeles Review of Books:

For some, he was one of the most subversive thinkers of his time — a 20th-century Nietzsche, only darker and with a better sense of humor. Many, especially in his youth, thought him to be a dangerous lunatic. According to others, however, he was just a charmingly irresponsible young man, who posed no dangers to others — only to himself perhaps. When his book on mysticism went to the printers, the typesetter — a good, God-fearing man — realizing how blasphemous its contents were, refused to touch it; the publisher washed his hands of the matter and the author had to publish the blasphemy elsewhere, at his own expense. Who was this man?

Emil Cioran (1911–1995) was a Romanian-born French philosopher and author of some two dozen books of savage, unsettling beauty. He is an essayist in the best French tradition, and even though French was not his native tongue, many think him among the finest writers in that language. His writing style is whimsical, unsystematic, fragmentary; he is celebrated as one of the great masters of aphorism. But the “fragment” was for Cioran more than a writing style: it was a vocation and a way of life; he called himself “un homme de fragment.”

More here.