edward garnett: mentor to genius

Michael Dirda at the Washington Post:

In 1893 the young John Galsworthy booked passage on the clipper Torrens, then sailing from the South Seas to England. During this voyage the future author of “The Forsyte Saga” happened to become friendly with the ship’s first mate. In a letter home he described this “capital chap”— of Polish origin — as “a man of travel and experience in many parts of the world,” with “a fund of yarns.” Seven years after their shipboard conversations, Joseph Conrad — who else could it have been? — would dedicate his most famous novel, “Lord Jim,” to Galsworthy. In 1932 Galsworthy would be awarded the Nobel Prize for Literature; Conrad, of course, is now universally regarded as one of the greatest novelists of all time.

Both these writers counted themselves protégés of Edward Garnett (1868-1937), the subject of Helen Smith’s prizeworthy literary biography, “An Uncommon Reader.” No ordinary acquisitions editor or publisher’s reader, Garnett devoted his life to fostering, with tough love, the work of many young, and now famous, authors. Besides Galsworthy and Conrad, who became his close friends, he championed Stephen Crane, helped D.H. Lawrence reconfigure “Sons and Lovers,” urged T.E. Lawrence to publish “Seven Pillars of Wisdom,” lent moral and financial support to Edward Thomas — “the finest poet of his generation” — and produced the first major essay on Thomas’s American friend Robert Frost.

more here.

THOREAU’S QUESTIONS

Geoff Wisner at The Quarterly Conversation:

Few of Thoreau’s best-known quotations take the form of a question. Yet those that do cut deep. They get under our skin. “Why should we be in such desperate haste to succeed, and in such desperate enterprises?” Thoreau asks in Walden. “What is the use of a house if you haven’t got a tolerable planet to put it on?” he writes in a letter to H.G.O. Blake.

One of the few Thoreau scholars to recognize the importance of Henry’s questions is Jeffrey Cramer, whose collection The Quotable Thoreau devotes a section to questions. In doing so, he recognizes that Thoreau was not necessarily the man with all the answers.

“Thoreau,” writes Cramer, “was the vegetarian who ate meat; the conservationist who surveyed woodlots in Walden Woods; the pacifist who endorsed violence; the hermit who loved gossip.” Thoreau was no hypocrite, as he has often been painted: he was “a questioner of the very concepts we have come to associate with his name.”

more here.

Saturday Poem

Choosing a Dog

—for H & J

"It's love," they say. You touch
the right one and a whole half of the universe
wakes up, a new half.

Some people never find
that half, or they neglect it or trade it
for money or success and it dies.

The faces of big dogs tell, over the years,
that size is a burden: you enjoy it for a while
but then maintenance gets to you.

When I get old I think I'll keep, not a little
dog, but a serious dog,
for the casual, drop-in criminal—

My kind of dog, unimpressed by
dress or manner, just knowing
what's really there by the smell.

Your good dogs, some things that they hear
they don't really want you to know—
it's too grim or ethereal.

And sometimes when they look in the fire
they see time going on and someone alone
but they don't say anything.

William Stafford
from The Way It Is
Graywolf Press, 1998

We Are What We Read

John Sutherland in The New York Times:

I recall Noel Annan, the provost of University College London, declaring in the 1970s that the English literature department, historically the first such in England, was the “very heart” of the school. Any college president making such a claim as Annan’s today could await the men in white coats. It’s with exhilaration, then, that one hails Martin Puchner’s book, which asserts not merely the importance of literature but its all-importance. “Literature,” the first page declares, “since it emerged 4,000 years ago,” has “shaped the lives of most humans on planet Earth.” We are what we read.

“The Written World” makes this grand assertion on the basis of a set of theses. Storytelling is as human as breathing. When fabulation intersected with writing, stories were empowered to propagate themselves in society and around the world as civilization-forming “foundational texts.” Puchner opens, by way of illustration, with Alexander the Great. Under his pillow at night he had, alongside his dagger, a copy of the “Iliad.” His literary GPS, we understand. As important as the epic’s originally oral story of great conquest was the script it was written in: That too would conquer worlds. This review is printed in a variant of it.

The narrative gallops on to Mesopotamia, Nineveh, clay tablets, cuneiform and Gilgamesh. Puchner explains it all with brio. By Page 50 Ashurbanipal is a name the reader will feel able to drop knowingly into any conversation on literary matters. In chronological procession there follow Buddha, Confucius (a notably brilliant chapter), “The Tale of Genji” (hooray, at last, for the woman author), the Mayas (a dark episode), the Gospels, Gutenberg, Muhammad, Luther, Cervantes, Goethe, Benjamin Franklin, Thomas Jefferson, Marx and Engels, the African epic of Sunjata — on, on and on to Derek Walcott (“new nations need stories to tell them who they are,” writes Puchner) and Harry Potter (“repetitive,” alas). The invention and spread of paper gave literature wings. So too did print and, in our day, the web. Looking at his screen, Puchner wonders what foundational texts will flicker down to us.

There is a joyous personality in this book. Puchner gives more of himself to the reader than most literary historians. As a child, he confides, he was entranced by the “Arabian Nights” — only cliff-hanging bedtime stories to her husband can save Scheherazade from being a one-night queen and next morning’s bridal corpse. But who originated this bundle of tales? The question nags at Puchner. He has a dream that he describes at length. What does the dream tell him? Stop looking. Searching is futile.

More here.

Friday, January 5, 2018

The Anatomy of the Urban Dictionary

From the MIT Technology Review:

The Urban Dictionary is a crowdsourced website that records new words and their meanings. It began life in 1999 as a parody of Dictionary.com but has since become an important resource on the Web. Indeed, judges in the U.K. famously used the site in 2005 to help them understand slang used by two rappers involved in a dispute.

Part of Urban Dictionary’s appeal is its informal approach, which allows both definitions and descriptions of words. It even allows opinions, which can sometimes be offensive. It captures new words quickly and registers many of the variations that emerge over time. A voting system allows users to show admiration or disdain, revealing words’ popularity.

Today, many millions of users rely on the site to keep them up to date with slang, common usage, and popular culture.

Of course, Urban Dictionary has its shortcomings. In the absence of style guides, editors, and moderators, the content can be vague and inaccurate. Also, little is known about the people who post new words and whether the entries reflect real changes in the language or just those that affect a small subset of people.

So just how good is the Urban Dictionary at capturing new words, and how does it compare with more conventional approaches to producing online dictionaries?

More here.

Awake Under Anesthesia

Joshua Rothman in The New Yorker:

One day in the nineteen-eighties, a woman went to the hospital for cancer surgery. The procedure was a success, and all of the cancer was removed. In the weeks afterward, though, she felt that something was wrong. She went back to her surgeon, who reassured her that the cancer was gone; she consulted a psychiatrist, who gave her pills for depression. Nothing helped—she grew certain that she was going to die. She met her surgeon a second time. When he told her, once again, that everything was fine, she suddenly blurted out, “The black stuff—you didn’t get the black stuff!” The surgeon’s eyes widened. He remembered that, during the operation, he had idly complained to a colleague about the black mold in his bathroom, which he could not remove no matter what he did. The cancer had been in the woman’s abdomen, and during the operation she had been under general anesthesia; even so, it seemed that the surgeon’s words had lodged in her mind. As soon as she discovered what had happened, her anxiety dissipated.

More here.

How America Is Transforming Islam

Emma Green in The Atlantic:

American culture often presents two opposing paths for young Muslims. On one side are people like President Donald Trump, who retweets unverified videos purporting to show Muslim violence; says things like “I think Islam hates us”; and claims there’s “no real assimilation” among even second- and third-generation Muslims in the U.S. On the other are movies like The Big Sick, which depicts the autobiographical love story of Kumail Nanjiani, a Muslim comedian who rejects religion and falls in love with a white woman, devastating his immigrant family.

In reality, most Muslims are somewhere in between. U.S. Muslims—roughly 60 percent of whom are under 40—are going through a process that’s quintessentially American: finding new, diverse, self-constructed identities in their faith, ranging from fully secular to deeply pious. The contours may be particular to Islam, but the story is one shared by Catholics, Jews, and even the Puritans. Muslims are creating distinctively American forms of their religion.

As a group, Muslims are extremely diverse, and their experiences reflect that diversity. Some young Muslims care deeply about their religious and cultural identities, but choose to prioritize other parts of life. Others self-define new, non-traditional ways of engaging with their faith. Immigrants understand the country differently than people who have been in the U.S. for generations; black Muslims encounter distinctive kinds of discrimination and have particular communal needs. Converts face questions from family members who might not understand their new religion, and have to navigate the sometimes-unfamiliar cultures of new friends and partners. And some Muslims don’t feel accepted by their own community, for reasons of race, gender, or sexuality.

More here.

winter in russia

Francisco de Borja Lasheras at Eurozine:

It is mid-afternoon and leaden rain falls over the city. After traversing a labyrinth of scruffy stairways, ill-lit corridors and backyards, Andrey and I are seated in a space rather like a classroom. This place brings together organisations, lawyers and experts in judicial reform, human rights and other representatives of civil society from one of the most unruly parts of the country from the state’s point of view: Saint Petersburg. At times the meeting takes on a somewhat melancholic air. This impression becomes more acute when one realises the age of some of those present: men and women who were young in the turbulent 90s and who now pass verdict on that period. ‘We weren’t ready for that wave of democracy’, some argue. This is a point I have heard uttered by the Ukrainian-born, Belarusian writer Svetlana Alexievich.

Back then, under Yeltsin, Russia began to dismantle the USSR from within while simultaneously experimenting with democracy and applying drastic measures to transition to a market economy under the guidance of reformists such as Gennady Burbulis and Yegor Gaidar. Convinced that there was no alternative, figures who had been raised on Marx now embraced shock therapy capitalism with the same Messianic fervour. In October 1991 Yeltsin told the Duma that this was ‘Russia’s way to democracy and not empire’. Following the shortages under Gorbachev’s perestroika, the social impact was dramatic as poverty and inequality increased.

more here.

What Is Freedom of Conscience?

Marilynne Robinson at the American Scholar:

The idea of conscience as we think of it is reflected in the Greek of the New Testament. It is to be found in Plato as self-awareness, a capacity for self-appraisal. In the Hebrew Bible, it is pervasively present by implication, an aspect of human experience that must be assumed to be reflected in the writing of Paul and others. In Genesis a pagan king can appeal to the Lord on the basis of the integrity of his heart and the innocence of his hands, and learn that God has honored his innocence and integrity by preventing him from sinning unintentionally. The king’s sense of himself, his concern to conform his conduct to the standard he brings to bear on it, which is a standard God acknowledges, is a kind of epitome of the concept of righteousness so central to the Hebrew Bible. That the king is a pagan, a Philistine, suggests that Torah regards moral conscience as universal, at least among those who respect and cultivate it in themselves.

Beyond the capacity to appraise one’s own actions and motives by a standard that seems, at least, to stand outside momentary impulse or longer-term self-interest and to tell against oneself, conscience is remarkably chimerical. An honor killing in one culture is an especially vicious crime in another. The effective imprisonment at forced labor of unwed mothers, or of young women deemed likely to stray, was practiced until a few decades ago in a Western country, Ireland, despite the many violations of human rights this entailed.

more here.

modernity and lateness

Joe Paul Kroll at the TLS:

Although Adorno was writing against the misunderstanding that lateness was a sufficient explanation of greatness, his own critical amplification of the concept has not escaped popularization. Gordon McMullan and Sam Smiles, in their introduction to the collection Late Style and its Discontents, identify the culprit in Edward Said, whose posthumous book On Late Style, which applies Adorno’s thesis to a number of painters, writers and composers, is charged with spreading “the idea that the work of the last few years of truly ‘great’ creative artists is marked by a profound change of style, tone, and content which tends both to look back to the artist’s earlier years and forward, beyond his death, to future developments in the field”. As such, it offered little more than an “ideological construct, the product of a certain kind of critical” – or rather uncritical – “wish fulfilment” of little heuristic value to scholars. What is more, the very first application of the concept of late style, which appears to have emerged in nineteenth-century Shakespeare criticism, points to an obvious inconsistency, as Ben Hutchinson notes in the same volume: used indiscriminately, “late style” is conflated with the style of old age, Spätstil with Altersstil. This limitation is clear when reference is made to the late works of Mozart, composed in his thirties, or those of Shakespeare, written before he turned fifty. In defence of Said, however, one could point to his particular interest in “the decay of the body, the onset of ill health” – Beethoven’s deafness or Turner’s failing eyesight come to mind – as a fairly specific criterion, albeit one susceptible to the charge of setting too much store by biography.

more here.

Friday Poem

Wednesday's poem by John Milton was (coincidentally) a segue into this
by Gary Snyder, whose book, No Nature, I was reading last night.
What I love about Snyder is that each poem's a ripe fruit:

Milton by Firelight

Piute Creek, August 1955

“O hell, what do mine eyes
with grief behold?”
Working with an old
Singlejack miner, who can sense
The vein and cleavage
In the very guts of rock, can
Blast granite, build
Switchbacks that last for years
Under the beat of snow, thaw, mule-hooves.
What use, Milton, a silly story
Of our lost general parents,
eaters of fruit?

The Indian, the chainsaw boy,
And a string of six mules
Came riding down to camp
Hungry for tomatoes and green apples.
Sleeping in saddle-blankets
Under a bright night-sky
Han River slantwise by morning.
Jays squall
Coffee boils

In ten thousand years the Sierras
Will be dry and dead, home of the scorpion.
Ice-scratched slabs and bent trees.
No paradise, no fall,
Only the weathering land
The wheeling sky,
Man, with his Satan
Scouring the chaos of the mind.
Oh Hell!

Fire down
Too dark to read, miles from a road
The bell-mare clangs in the meadow
That packed dirt for a fill-in
Scrambling through loose rocks
On an old trail
All of a summer’s day.

by Gary Snyder
from Riprap and Cold Mountain Poems
Shoemaker & Hoard Publishers.

Prodigies’ Progress: Parents and superkids, then and now

Ann Hulbert in Harvard Magazine:

In the fall of 1909, when two wonder boys converged on Harvard—among the first, and for a time the most famous, prodigies of the modern era—their parents proudly assumed a Pygmalion role. Norbert Wiener, the nearly 15-year-old son of the university’s first professor of Slavic languages, Leo Wiener, arrived as a graduate student in (at his father’s direction) zoology. William James Sidis (namesake and godson of the renowned Harvard psychologist who had been a mentor to his father, Boris Sidis) was admitted at 11 as a “special student” after strenuous lobbying by his father. The two superprecocious sons of two very upwardly mobile Russian immigrants, outspoken men with accents and bushy mustaches, inspired suspense. The arrival of these brilliant boys with unusual pedigrees fit the mission of Harvard’s outgoing president, Charles William Eliot, a liberal Boston Brahmin and staunch believer in equality of opportunity. He aimed to open the university’s doors to “men with much money, little money, or no money, provided that they all have brains.” And not just brains, Eliot warned complacent WASPs, who mistook “an indifferent good-for-nothing, luxurious person, idling through the precious years of college life” for an ideal gentleman or scholar. Eliot had in mind an elite with “the capacity to prove by hard work that they have also the necessary perseverance and endurance.”

Boris Sidis and his wife, Sarah, had made it their mission to jolt turn-of-the-century Americans with a thrilling, and intimidating, message: learning, if it was begun soon enough, could yield phenomenal results very early and rapidly. Russian Jews, they had fled the pogroms in Ukraine for the garment sweatshops on the United States’ East Coast in the mid-1880s. Within 10 years they had worked their way to the top of American higher education. By 1898, Sarah was a rare woman with an M.D. (from Boston University School of Medicine), and Boris had racked up a B.A., an M.A., and a Ph.D. in psychology at Harvard within four years. But inborn talent had nothing to do with their feats, or their son’s, they insisted. An as-yet-unimagined potential lay in every child, and it was time parents started cultivating it, Boris urged in an address called “Philistine and Genius,” delivered at Harvard’s summer school in 1909. The country, more than ever, needed “the individuality, the originality, the latent powers of talent and genius” too often wasted.

More here.

Reward research that changes society

Editorial in Nature:

There is a classic narrative that stresses the importance and value of fundamental science. To make progress, take persistence by researchers, mix in patient financial support and then add creative imagination and logic (important for creating hypotheses and testing predictions). Then sprinkle on some unpredictable outcomes and stew for a century, or perhaps even longer. The 2016 announcement of the detection of gravitational waves is a fine product of this recipe for success. It was born of theories of relativity that were esoteric but which now, unforeseeable at the time of their origin in 1916, underpin technologies such as global navigation. Readers of Nature probably have their own favourite examples of such success stories. Support for fundamental research remains essential, both as a signal of cultural values and as a driver of future societal progress. But research with a shorter-term or more-local vision of practical outcomes deserves reward and prestige, too — a fact perhaps taken for granted by engineers or clinical scientists, but less so in some other disciplines.

Take, for instance, the way in which regulatory authorities, commercial organizations and physical geographers at the University of Leeds, UK, collaborated to boost water quality and company performance by developing innovative catchment-management strategies in the north of England. Another example is how local health authorities partnered with a digital-media-production company to disseminate content related to a self-help technique developed by psychiatry researchers at King’s College London to combat bulimia. Both these examples are included in a database of case studies collected by the Higher Education Funding Council for England in its pioneering 2014 Research Excellence Framework (REF; see go.nature.com/2zags87).

The council assesses the impact of research retrospectively, and rewards high performers with extra funds. This approach has increased financial support for some universities that pursue ‘useful’ research, but that did not fare well in previous, more-traditional funding frameworks. The next REF, which will be conducted in 2021, will allocate more weight (25%, up from 20%) to impact assessments — a move that Nature supports. Other funders have signalled that they believe in direct impact, and demand a prospective view of such benefits in funding applications. The database of REF case studies is interesting partly because it highlights straightforward ways of documenting impacts through explicit description and endorsement by researchers’ partners in delivery, and partly because it reveals the variety of pathways to impact.

More here.

Thursday, January 4, 2018

The Biggest Secret

James Risen in The Intercept:

My case was part of a broader crackdown on reporters and whistleblowers that had begun during the presidency of George W. Bush and continued far more aggressively under the Obama administration, which had already prosecuted more leak cases than all previous administrations combined. Obama officials seemed determined to use criminal leak investigations to limit reporting on national security. But the crackdown on leaks only applied to low-level dissenters; top officials caught up in leak investigations, like former CIA Director David Petraeus, were still treated with kid gloves.

Initially, I had succeeded in the courts, surprising many legal experts. In the U.S. District Court for the Eastern District of Virginia, Brinkema had sided with me when the government repeatedly subpoenaed me to testify before a grand jury. She had ruled in my favor again by quashing a trial subpoena in the case of Jeffrey Sterling, a former CIA officer who the government accused of being a source for the story about the ill-fated CIA operation. In her rulings, Brinkema determined that there was a “reporter’s privilege” — at least a limited one — under the First Amendment that gave journalists the right to protect their sources, much as clients and patients can shield their private communications with lawyers and doctors.

But the Obama administration appealed her 2011 ruling quashing the trial subpoena, and in 2013, the 4th Circuit Court of Appeals, in a split decision, sided with the administration, ruling that there was no such thing as a reporter’s privilege. In 2014, the Supreme Court refused to hear my appeal, allowing the 4th Circuit ruling to stand. Now there was nothing legally stopping the Justice Department from forcing me to either reveal my sources or be jailed for contempt of court.

But even as I was losing in the courts, I was gaining ground in the court of public opinion. My decision to go to the Supreme Court had captured the attention of the nation’s political and media classes. Instead of ignoring the case, as they had for years, the national media now framed it as a major constitutional battle over press freedom.

More here.

The Marvelous Mrs. Maisel, Season One

Phillip Maciak, Jane Hu, Aaron Bady over at the LA Review of Books:

Here’s the thing about Mrs. Maisel, though: it’s perfect. I don’t even mean that in a strictly evaluative way. Like, I don’t think it’s the best show of the year (hey, The Leftovers!). What I mean is that perfection is a compositional quality and aspiration of the show. Its arguments, as Aaron has also tweeted, are “symphonic,” its visual aesthetic is flawless, the casting is so sharp it feels like Harry Potter for Jewish American character actors, the stand-up sets are exactly as solid and charming as they are diegetically supposed to be, everybody says either the perfectly right thing or the perfectly wrong thing, its complications are precisely calibrated, its surprises are precisely spring-loaded, its best jokes all have call-backs, and Midge Maisel’s ankles are always the same circumference. There’s nothing messy or ragged or loose or baggy about this show. And that makes it good, but that also makes it a very particular type of show.

Gilmore Girls, for instance, was not perfect in this way. Neither was The Leftovers. Neither was Friday Night Lights. Frasier was perfect. So was Breaking Bad, and so was The West Wing. In other words, perfect and not-perfect are aesthetic categories here. Perfect shows do what they’re supposed to do; not-perfect shows do what they’re going to do. Not-perfect shows can be better than perfect shows and vice versa, but it’s a risk to do either. There were moments when The Leftovers did something so seemingly ill-advised that it could have derailed the whole series. But, in the—frequent—case that The Leftovers pulled it off, the show was transcendent. On the other hand, the perfect shows operate at such great heights and require such high-wire execution that, when they falter, it’s very very noticeable. Gilmore Girls was a long, meandering, free-associative, sometimes rapturous monologue; Mrs. Maisel is a tight ten.

The other thing, though, is that Mrs. Maisel is a perfect show about perfection.

More here.

Why did protests erupt in Iran?

Ahmad Sadri over at Al Jazeera:

The Islamic Republic of Iran is the platypus of humanity's political evolution.

Episodic Iranian unrest, from the focused, reformist uprising of 2009 (led by middle-class protesters of Tehran) to the current, wildly rejectionist riots (spearheaded by the underclass and the unemployed in the poor neighborhoods of provincial towns) cannot be understood in isolation from that melange of procedural democracy and obscurantist theocracy that was crammed into the constitution of revolutionary Iran, four decades ago.

Deep within Iran's authoritarian system there is a tiny democratic heart, complete with elective, presidential and parliamentary chambers, desperately beating against an unyielding, theocratic exoskeleton. That palpitating democratic heart has prolonged the life of the system – despite massive mismanagement of the domestic and international affairs by the revolutionary elites.

But it has failed to soften the authoritarian carapace. The reform movement has failed in its mission because the constitution grants three quarters of the political power to the office of the "Supreme Leader": an unelected, permanent appointment whereby a "religious jurist" gains enormous powers, including command of the armed forces and foreign policy, veto power over presidential cabinets and parliamentary initiatives, and the world's most formidable Praetorian Guard (IRGC), with military, paramilitary, intelligence, judicial and extrajudicial powers to enforce the will of its master.

More here.

Clarence Thomas’s Straussian Moment: The Question of Slavery and the Founding

Corey Robin in Crooked Timber:

A question for the political theorists, intellectual historians, and maybe public law/con law experts. The question comes at the very end of this post. Forgive the build-up. And the potted history: I’m writing fast because I’m hard at work on this Clarence Thomas book and am briefly interrupting that work in order to get a reading list.

In the second half of the 1980s, Clarence Thomas is being groomed for a position on the Supreme Court, or senses that he’s being groomed. He’s the head of the EEOC in the Reagan Administration and decides to beef up on his reading in political theory, constitutional law, and American history. He hires two Straussians—Ken Masugi and John Marini—to his staff on the EEOC. Their assignment is to give him a reading list, which they do and which he reads, and to serve as tutors and conversation partners in all things intellectual, which also they do.

These are West Coast Straussians. Both Masugi and Marini hail from the Claremont orbit in California (Masugi was in the think tank, Marini was a student). Unlike the East Coast Straussians—the Blooms and Pangles, who champion a Nietzschean Strauss who’s overtly celebratory of the American Founding but is secretly critical of natural law, natural rights, and the Framers—these West Coast Straussians follow Harry Jaffa, arguing that the American Founding is the consummation of ancient virtue in a modern idiom.

But what’s also true of these West Coast Straussians is that they are intensely interested in race.

More here.

Gender is not a binary—nor is it fluid. The case for “gender viscosity”

Julian Baggini in Prospect Magazine: