On ‘Shakespeare in Swahililand’

Ed Simon at The Millions:

Wilson-Lee’s is an odd hodgepodge of a book—part memoir, part travelogue, part historical account, part literary criticism. And yet despite its chimerical nature, it is an effective book, combining as it does an adept theoretical orientation, an admirable facility with the explication de texte of Shakespeare’s language, and a humanism that is sometimes lacking in the most arid of literary theory. Too often, conservative “defenders” of Shakespeare against some imagined threat to the canon obscure the very real ways in which both Shakespeare in particular and English literature in general were used to erase the lives and culture of people in colonized lands, as a type of soft artillery. But Wilson-Lee isn’t wrong when he says that it’s hard not to feel that Shakespeare “almost alone among writers, defies such cynicism.” He conjectures that though Shakespeare’s genius may simply be “some grand collective delusion, a truism rather than a truth,” he can’t help but find that “every time, the dawning freshness of a turn of phrase, a short exchange or an orchestrated speech makes dull the cleverness which wrote these impressions off as nostalgic.” In what is one of the book’s most poignantly beautiful scenes, Wilson-Lee describes listening to two surviving records of that Urdu production of Hamlet preserved at the British Library (the film itself being lost to posterity), explaining that the music of that production was pressed neither on vinyl nor wax cylinder, but rather “on discs made from shellac, crushed beetle-shell.” And so he could hear “the same sounds that would have rung out of the ramshackle theatres onto the Mombasa streets, the love songs of Hindustani Shakespeare, preserved in the carcasses of beetles which had once footled around the forests of Bengal.”

more here.

The beauty and mystery of Arabic calligraphy

Robert Irwin in The Spectator:

The title of this book, By the Pen and What They Write, is a quotation from the Qur’an and comes from the opening of the ‘Surah al-Qalam’ (Chapter of the Pen), in which the authority of the cosmic scribes in heaven, whose writing determines the fate of humanity, is invoked in order to authenticate the revelation that follows. According to Islamic tradition, the Prophet Muhammad was illiterate (and so presumably were most of his audience). So it is odd to find writing featuring so prominently in this surah and throughout the Qur’an. From before the revelation of the Qur’an in the seventh century, the only surviving texts in the Arabian Peninsula are brief, unargumentative rock inscriptions, and many of these are in languages or scripts other than Arabic. So, as Angelika Neuwirth, one of the distinguished scholars to contribute to this volume, observes:

It is a striking fact then, that the Qur’an appears — seemingly — out of the void, as a fully fledged discursive text, extensive in range and replete with philosophical and theological queries.

The Bible consists of many diverse texts by diverse hands that have been assembled over the centuries. The Qur’an is not like that. Its message is held to be eternal, inimitable and untranslatable, and it was revealed to just one man in a matter of decades. As a consequence, the Arabic language and script had, and still has, a special prestige among Muslims. That prestige increased further towards the end of the seventh century, when the Umayyad Caliph, Abd al-Malik, decreed that Arabic should be the sole language of administration in the Muslim empire and that its currency should bear Arabic inscriptions. Ambitious Nabataeans, Persians, Copts and others hastened to learn the Arabic language and script. Arabic became the major language of international commerce.

Baghdad, a city with a population many times that of medieval London and Paris combined, had an unprecedentedly large literate population. Because of this, and because of the replacement of expensive parchment by paper, literature flourished under the Abbasid caliphs from the late eighth century onwards. Hugh Kennedy concludes his chapter entitled ‘Baghdad as a Centre of Learning and Book Production’ with these resounding words:

I should like to argue that Abbasid Baghdad was probably the first place on the planet where an author could make a living, not by being independently wealthy or having a wealthy patron, or even being part of an institution like a monastery that subsidised his activities, but by writing books to be sold in the market to a literate public.

‘The past is a foreign country: they do things differently there.’ What was true of L.P. Hartley’s presentation of Victorian England was even more the case for the medieval Arab book world.

More here.

A Baby Wails, and the Adult World Comes Running

Natalie Angier in The New York Times:

A normal human baby, according to psychologists, will cry about two hours over the course of a day.

A notorious human crybaby, according to her older siblings, parents and the building superintendent, will cry for two hours every two hours, refusing to acknowledge any distinction between crying and other basic infant activities, like “being awake” or “breathing.” Current and former whine enthusiasts, take heart. It turns out that infant crying is not only as natural and justifiable as breathing: The two acts are physically, neurologically, primally intertwined. Scientists have discovered that the small cluster of brain cells in charge of fast, active respiration also grant a baby animal the power to cry.

Reporting in the Proceedings of the National Academy of Sciences, Carmen Birchmeier and Luis Hernandez-Miranda, of the Max Delbrück Center for Molecular Medicine in Berlin, and their colleagues showed that infant mice stripped of this key node — a mere 17,000 neurons, located in the evolutionarily ancient hindbrain — can breathe slowly and passively, but not vigorously or animatedly. When they open their mouths to cry, nothing comes out. As a result, their mothers ignore them, and the poorly breathing pups quickly die. “This was an astonishing finding,” Dr. Birchmeier said. “The mother could see the pups and smell the pups, but if they didn’t vocalize, it was as though they didn’t exist.” The new study is just one in a series of recent reports that reveal the centrality of crying to infant survival, and how a baby’s bawl punches through a cluttered acoustic landscape to demand immediate adult attention. The sound of an infant’s cry arouses a far quicker and stronger response in action-oriented parts of the adult brain than do similarly loud or emotionally laden noises, like a dog barking or a neighbor weeping. Scientists also have shown that the cries of many infant mammals share a number of basic sonic properties.

More here.

Sunday, September 3, 2017

How to Regulate Artificial Intelligence

Oren Etzioni in the New York Times:

The technology entrepreneur Elon Musk recently urged the nation’s governors to regulate artificial intelligence “before it’s too late.” Mr. Musk insists that artificial intelligence represents an “existential threat to humanity,” an alarmist view that confuses A.I. science with science fiction. Nevertheless, even A.I. researchers like me recognize that there are valid concerns about its impact on weapons, jobs and privacy. It’s natural to ask whether we should develop A.I. at all.

I believe the answer is yes. But shouldn’t we take steps to at least slow down progress on A.I., in the interest of caution? The problem is that if we do so, then nations like China will overtake us. The A.I. horse has left the barn, and our best bet is to attempt to steer it. A.I. should not be weaponized, and any A.I. must have an impregnable “off switch.” Beyond that, we should regulate the tangible impact of A.I. systems (for example, the safety of autonomous vehicles) rather than trying to define and rein in the amorphous and rapidly developing field of A.I.

I propose three rules for artificial intelligence systems that are inspired by, yet develop further, the “three laws of robotics” that the writer Isaac Asimov introduced in 1942: A robot may not injure a human being or, through inaction, allow a human being to come to harm; a robot must obey the orders given it by human beings, except when such orders would conflict with the previous law; and a robot must protect its own existence as long as such protection does not conflict with the previous two laws.

These three laws are elegant but ambiguous: What, exactly, constitutes harm when it comes to A.I.? I suggest a more concrete basis for avoiding A.I. harm, based on three rules of my own.

More here.

The Future Of Work And The Social Welfare State’s Survival

Steven Hill in Social Europe:

A closer look at Germany, one of the strongest economies in Europe, is revealing. Overall, the work force has become increasingly complex and fissured, with many workers moving between different types of work — from self-employed to temp, from full-time to part-time to mini-job to Werkvertrag subcontractor, and back again. More workers now supplement their income with second, third and fourth jobs. Indeed, Eurostat says the number of Germans holding two jobs at once has nearly doubled in ten years, from 1.2 million to 2.2 million.

Businesses especially like hiring self-employed workers because they save 25-30% on their labor costs. Employers don’t have to pay for these workers’ health care, retirement pension, sick leave, vacations or injured-worker and unemployment compensation. Self-employed women are not entitled to maternity leave. The self-employed in Germany, as in most EU member states, are legally required to pay both the employer’s half and their own half of the health care contribution. In Germany, that amounts to a minimum of 14.6% of their wages. And the self-employed are responsible for saving for their own retirement as well, with none of the employer contributions that regularly employed workers receive.

Nevertheless, many self-employed workers are attracted to the flexible scheduling, at least at first. But after a while many grow weary of this new kind of grind. A European Commission report found that the self-employed in Germany are 2.5 times more at risk of poverty than salaried workers. A study by the Wissenschaftliches Institut der AOK found that among low-income workers, solo self-employed Germans spend an astounding 46.5 percent of their income on health insurance. Not surprisingly, one study found that about half of self-employed workers would accept regular employment if decent jobs were available.

More here.

THE SOCIALIST EXPERIMENT

Katie Gilbert in Oxford American magazine:

Chokwe Lumumba had been the mayor of Jackson, Mississippi, for five months when, in November 2013, he stood behind a lectern and addressed a group of out-of-towners with a curious phrase he would soon explain with a story: “Good afternoon, everybody, and free the land!”

On his tall, thin frame he wore a bright blue tie and a loosely fitting suit, extra fabric collecting around the shoulders of his jacket. Wire-rimmed glasses rested over a perpetually furrowed brow on his narrow, thoughtful, frequently smiling face. A faint white mustache grazed his upper lip.

In welcoming the attendees of the Neighborhood Funders Group Conference, a convening of grantmaking institutions, Mayor Lumumba was conversational and at ease, as he tended to be with microphone in hand. His friends had long teased him for his loquaciousness in front of a crowd.

Lumumba informed the room that on the car ride over he’d decided he would tell them a story. He explained that big things were happening in Jackson—or were about to happen—and his story would offer some context. It was one he had recounted many times. Polished smooth, the story was like an object he kept in his pocket and worried with his thumb until it took on the sheen of something from a fable, though the people and events were real. “It was March of 1971 when I first came to the state of Mississippi,” Lumumba began. “It was several months after the students at Jackson State had been murdered,” he said, referring to the tragedy at the city’s predominantly black college, which left two dead and twelve injured after police opened fire on a campus dormitory in May 1970, less than two weeks after the Kent State shootings.

Lumumba had traveled to Mississippi with a group called the Provisional Government of the Republic of New Afrika. He was twenty-three at the time and was taking a break from his second year of law school in Detroit.

More here.

There’s a disaster much worse than Texas. But no one talks about it

Jonathan Freedland in The Guardian:

A quick quiz. No Googling, no conferring, but off the top of your head: what is currently the world’s worst humanitarian disaster? If you nominated storm Harvey and the flooding of Houston, in Texas, then don’t be too hard on yourself. Media coverage of that disaster has been intense, and the pictures dramatic. You’d be forgiven for thinking that this supposedly once-in-a-thousand-years calamity – now happening with alarming frequency, thanks to climate change – was the most devastating event on the planet.

As it happens, Harvey has killed an estimated 44 Texans and forced some 32,000 into shelters since it struck a week ago. That is a catastrophe for every one of those individuals, of course. Still, those figures look small alongside the havoc wreaked by flooding across southern Asia during the very same period. In the past few days, more than 1,200 people have been killed, and the lives of some 40 million others turned upside down, by torrential rain in northern India, southern Nepal, northern Bangladesh and southern Pakistan.

That there is a disparity in the global attention paid to these two natural disasters is hardly a novelty. It’s as old as the news itself, expressed in one, perhaps apocryphal, Fleet Street maxim like a law of physics: “One dead in Putney equals 10 dead in Paris equals 100 dead in Turkey equals 1,000 dead in India equals 10,000 dead in China.”

More here.

Ken Burns’s American Canon

Ian Parker in The New Yorker:

Like Steven Tyler, of Aerosmith, Ken Burns has a summer house on Lake Sunapee, in New Hampshire. The property is furnished with Shaker quilts and a motorboat; every July 4th, a fifteen-foot-long American flag hangs over the back deck. He bought the house in the mid-nineties, with money earned from “The Civil War,” his nine-part PBS documentary series, and its spinoffs. When PBS first broadcast that series, in a weeklong binge in the fall of 1990, the network reached its largest-ever audience. The country agreed to gather as if at a table covered with old family photographs, in a room into which someone had invited an indefatigable fiddle player. Johnny Carson praised the series in successive “Tonight Show” monologues; stores in Washington, D.C., reportedly sold out of blank videocassettes. To the satisfaction of many viewers, and the dismay of some historians, Burns seemed to have shaped American history into the form of a modern popular memoir: a tale of wounding and healing, shame and redemption. (The Civil War was “the traumatic event in our childhood,” as Burns later put it.) History became a quasi-therapeutic exercise in national unburdening and consensus building. Burns recently recalled, “People started showing up at the door, wanting to share their photographs of ancestors.”

Burns is now sixty-four. He is friends with John Kerry and John McCain. He has been a character on “Clifford’s Puppy Days,” the animated children’s series—“What’s a documentary?” “Great question!”—and has been a guest at the Bohemian Grove, the off-the-record summer camp in Northern California for male members of the American establishment. Visitors to his office see a display of framed Burns-related cartoons, most of which assume familiarity with his filmmaking choices: an authoritative narrator offset by more emotionally committed interviewees, seen in half-lit, vaguely domestic surroundings; slow panning shots across photographs of men with mustaches; and a willingness, unusual in the genre, to attempt compendiousness, to keep going. Last year, a headline in the Onion read “Ken Burns Completes Documentary About Fucking Liars Who Claimed They Watched Entire ‘Jazz’ Series.”

More here.

A View of Her Own

Sarah Nicole Prickett in Bookforum:

BEGINNING THE SECOND PARAGRAPH of her 1973 essay on Virginia Woolf and the Bloomsbury set, published in the New York Review of Books, which she cofounded, Elizabeth Hardwick had a line on the lesser members of that mutual entourage: “Certain peripheral names vex the spirits.” When the essay appeared a year later in Seduction and Betrayal, her formational work on the fates of literary women and women in literature, the line had changed and become: “Certain peripheral names scratch the mind.” No editor, and perhaps only this writer, would make such a change. Unnecessary, it’s instructive. “Certain peripheral names” is contemporaneous, cool and distant, idiomatic of her good friend Joan Didion. Wallace Stevens had “scratch the mind” in a 1938 poem, “The Man on the Dump,” but a nightingale (symbolic, like all birds) was doing the scratching, so the verb made sense. Tricky to effect a key change in a decasyllable. Hardwick makes the odd sentence weirder. Where “vex” exaggerated and made tonal her discord, and came naturally with “the spirits” to a writer who’d studied metaphysical poetry in graduate school, the furtive, unlikely “scratch” awakes in midsentence those annoying minor characters from their index and disturbs the reader, on whom the revised line falls faster thanks to the shorter “mind” and the continuous s-word after the plural, the unlikelihood registering when it’s too late to object: Who says that?

Hardwick could do more in six words than any Hemingway type, including Hemingway. Her feats of compression were exactly that, special, not habitual, because she was not really laconic and liked words better than she liked choosing between them. Her complaints about the process remind me of the writer (Jeff Goldblum) in Philip Kaufman’s Invasion of the Body Snatchers (1978), protesting the output of an overproductive hack. “It takes me six months to write one line sometimes,” he says, and when a girl asks why: “Because I pick each word individually, that’s why!” Devoted to “the interest of the mind of the individual [writer],” interested in a plurality of writers and literature, places and persons who don’t belong, Hardwick became singular eventually, not as an event; she is more so now that she has not been greatly imitated. I get it: You would be hard-pressed to trace your own thoughts over her syntax the way you can copy, half-consciously, sentences by Didion, or the way Didion copied sentences by Hemingway. Any Hardwickian rules for writing she left unsaid, subject to desire. Even her students at Barnard, said one who wrote about her later, learned little in the way of technique and found that “the idea was to study her, not a particular subject,” while she thought the purpose of study was to acclimate writers to harder lives.

More here.

Saturday, September 2, 2017

assessing kathy acker

Sarah Ditum at Literary Review:

Kathy Acker is a difficult subject for a biography, largely because, as Chris Kraus notes at the outset of her book, she ‘lied all the time’. Every bit of Acker’s life tended to be fed back through the creative mill, becoming a part of either her experimental writings or her other great project, the invention of Kathy Acker: a pixie-cropped, tattooed, muscle-strapped icon of rebel literature whose confrontational autofiction broke ground, allowing other artists to make the mess of their lives into the medium of their work. Maggie Nelson, the Riot grrrl feminist punk movement and Kraus herself (whose novels cross the boundary between fiction and truth) all have their debts to Acker. But to understand Acker and her piratical, pornographic output, it helps to revisit the culture she belonged to: mid-20th-century America, and the artistic demimondes of New York and San Francisco especially.

Kraus supplies two anecdotes that, though they don’t involve Acker herself, serve to frame the times and attitudes she wrote against and out of. The first is an account of Chris Burden’s 1972 performance piece TV Hijack, which took place around the time Acker was first experimenting with DIY self-publication (producing works such as Politics and The Childlike Life of the Black Tarantula). Burden, Kraus writes, ‘appeared on Phyllis Lutjeans’s cable TV interview show and surprised her by holding a knife to her throat. Lutjeans refused to press charges. Later, she’d explain how his assault “taught her a lesson”: her desire to anchor a show was driven by her own “ego and pride”’. For Kraus, this episode is typical of ‘an era when people seemed eager for “lessons”’.

more here.

kafka: the early years

John Banville at the NYRB:

For a person as sensitive as Kafka was, or at least as he presented himself as being—it is entirely possible to view his life in a light other than the one he himself shone upon it—inner escape was the only available strategy. “If we are to believe his own personal mythology,” Stach writes, “he drifted out of life and into literature,” to the point, indeed, that as an adult he would declare that he was literature, and nothing else. Stach, however, offers another and, in its way, far more interesting possibility when he asks, “What if literature was the only feasible way back for him?” Yet along this route into the psychological depths of Kafka’s emotional and artistic self we must pick our way carefully, recalling Kafka’s own skepticism toward Freudian analysis—“I consider the therapeutic part of psychoanalysis a helpless error”—and keeping in mind one of what are known as the Zürau aphorisms, in which he declares with uncharacteristic vehemence: “No psychology ever again!”

Naturally much of this volume is taken up with an account of Kafka’s formal education. One might expect that the student years of an artist would be of great biographical interest, but it is rarely the case, and Stach’s account of Kafka’s schooling is no exception. Perhaps the reason is that the education of an artist is for the most part a self-administered process, the progress of which is not recorded in class placings and examination results.

more here.

The mysterious masterpiece of Portugal’s great modernist

Adam Kirsch at The New Yorker:

If ever there was a writer in flight from his name, it was Fernando Pessoa. Pessoa is the Portuguese word for “person,” and there is nothing he less wanted to be. Again and again, in both poetry and prose, Pessoa denied that he existed as any kind of distinctive individual. “I’m beginning to know myself. I don’t exist,” he writes in one poem. “I’m the gap between what I’d like to be and what others have made of me. . . . That’s me. Period.”

In his magnum opus, “The Book of Disquiet”—a collage of aphorisms and reflections couched in the form of a fictional diary, which he worked on for years but never finished, much less published—Pessoa returns to the same theme: “Through these deliberately unconnected impressions I am the indifferent narrator of my autobiography without events, of my history without a life. These are my Confessions and if I say nothing in them it’s because I have nothing to say.”

This might sound like an unpromising basis for a body of creative work that is now considered one of the greatest of the twentieth century. If a writer is nothing, does nothing, and has nothing to say, what can he write about? But, like the big bang, which took next to nothing and turned it into a cosmos, the expansive power of Pessoa’s imagination turned out to need very little raw material to work with. Indeed, he belongs to a distinguished line of European writers, from Giacomo Leopardi, in the early nineteenth century, to Samuel Beckett, in the twentieth, for whom nullity was a muse.

more here.

Must we really “love one another or die”? A few words on Auden’s “September 1, 1939”

Cynthia Haven in The Book Haven:

September 1, 1939, is the day Nazi Germany invaded Poland. W.H. Auden famously wrote a poem to commemorate the occasion. “September 1, 1939” begins:

I sit in one of the dives
On Fifty-second Street
Uncertain and afraid
As the clever hopes expire
Of a low dishonest decade:
Waves of anger and fear
Circulate over the bright
And darkened lands of the earth,
Obsessing our private lives;
The unmentionable odour of death
Offends the September night.

Accurate scholarship can
Unearth the whole offence
From Luther until now
That has driven a culture mad,
Find what occurred at Linz,
What huge imago made
A psychopathic god:
I and the public know
What all schoolchildren learn,
Those to whom evil is done
Do evil in return.

The poem was taken up after 9/11, and appeared under thumbtacks and refrigerator magnets throughout the nation. But the last lines of the second stanza got special scrutiny in the new century. Were they referring to eternal truths? Or claiming that the Versailles Treaty that ended World War I justified the new invasion?

More here.

The princess myth: Hilary Mantel on Diana

Hilary Mantel in The Guardian:

Royal time should move slowly and by its own laws: creeping, like the flow of chrism from a jar. But 20 ordinary years have jog-trotted by, and it’s possible to have a grownup conversation with someone who wasn’t born when Diana died. Her widower is long remarried. Her eldest son, once so like her, shows signs of developing the ponderous looks of Philip, his grandfather. Diana should be as passé as ostrich plumes: one of those royal or quasi-royal women, like Mary of Teck or Wallis Simpson or the last tsarina, whose images fade to sepia and whose bones are white as pearls. Instead, we gossip about her as if she had just left the room. We still debate how in 1981 a sweet-faced, puppy-eyed 20-year-old came to marry into the royal house. Was it a setup from the start? Did she know her fiancé loved another woman? Was she complicit, or was she an innocent, garlanded for the slab and the knife?

For some people, being dead is only a relative condition; they wreak more havoc than the living do. After their first rigor, they reshape themselves, taking on a flexibility in public discourse. For the anniversary of her death, the princess’s sons remember her for the TV cameras, and we learn that she was “fun” and “very caring” and “a breath of fresh air”. They speak sincerely, but they have no news. Yet there is no bar on saying what you like about her, in defiance of the evidence. Private tapes she made with her voice coach have been shown in a TV documentary, Diana: In Her Own Words. They were trailed as revealing a princess who is “candid” and “uninhibited”. Yet never has she appeared so self-conscious and recalcitrant. Squirming, twitching, avoiding the camera’s eye, she describes herself hopefully as “a rebel”, on the grounds that she liked to do the opposite of everyone else. You want to veil the lens and explain: that is reaction, not rebellion. Throwing a tantrum when thwarted doesn’t make you a free spirit. Rolling your eyes and shrugging doesn’t prove you are brave. And just because people say “trust me”, it doesn’t mean they’ll keep your secrets.

More here.

Body politic: Women in the cinema of Partition

Feryal Ali Gauhar in Herald:

The cinematic experience is a gratifying hoax, predicated on a suspension of disbelief. We are convinced that all the disparate elements contributing to the production of a filmic experience – such as the transition of time and space, sometimes expanded, oftentimes contracted, the sequencing of scenes, the staging of action, the movement or stillness of camera, the scripted, memorised, rehearsed, measured, timed and delivered dialogue, the birth and nurturing of characters, the orchestration of light, the composition of music – are not crafted but, combined with each other, represent a well-spliced, invisibly strung-together reality. Cinema’s power lies in the illusion it creates, in making us believe that the constructed image, carefully (or carelessly) crafted and structured, is a reality that we are privileged to watch from a safe distance. The act of watching a film, of being in a darkened space, alone yet surrounded by others who are also alone, is like allowing oneself to enter spaces not visible in the stark light of the day. These are constructed spaces, made to seem alive, throbbing with possibility, enabling the human heart to feel things we would otherwise be guarded about.

Film theorists in the 1970s held that cinema provides its viewers a separation from their own egos or perceptions of reality while at the same time reinforcing those egos and perceptions. Perhaps the power of cinema lies in inducing us to subject our ‘self’ to a momentary and perceived loss of control, sort of like a free-fall experience from a twin-engine plane. We know that soon enough, perhaps too soon, we shall hit terra firma and all will be well again; that we will no longer have to engage with difficult situations or deal with suppressed emotions; that we will be unfettered by the suffering of the tragic hero who makes us cry and the buffoonery of the comedian who makes a fool of himself or herself for our pleasure. So why is it, then, that as makers and watchers of films we return constantly to subjects of human misery and turmoil, to representations of what we consider historical truths, to the scenes of terrible violence, to the destruction of nations, cities, memories, lives? More importantly, why is it that cinema based on ‘historical fact’ is usually about turbulence and injustice and not about peace and prosperity? Why do we feel the need to revisit the past, along with its unresolved angst and the agony of things that went terribly wrong? Is the purpose of investing large amounts of money in film production to celebrate human suffering?

More here.