Deng Xiaoping: A Revolutionary Life

Frank Dikötter at Literary Review:

Deng was himself denounced during the Cultural Revolution, a decade of willed chaos ably and critically covered by the authors. Even so, they underplay just how complicit Deng was in creating the very campaign that engulfed him and his family. They never mention his active participation in vicious denunciation meetings against the former minister of public security Luo Ruiqing, who eventually jumped through a window in an unsuccessful bid to commit suicide ('He dived like a popsicle,' Deng scoffed). Likewise missing is even a cursory reference to Deng's leading role in persecuting Ulanfu, the head of Inner Mongolia, whom Deng harshly accused in July 1966 of every conceivable crime, from 'taking the capitalist road' to 'opposing Chairman Mao'. A few months later Deng's own turn came, with accusations that he was the 'Number Two Capitalist Roader'.

But the Chairman protected Deng. Unlike many of his colleagues, he rarely endured struggle sessions with Red Guards, but spent many years in the countryside, sheltered from the vagaries of the Cultural Revolution. As the third and final part, 'The Pragmatist', demonstrates, the Chairman even brought him back twice, using him to counterbalance the growing influence of Zhou Enlai. Here, too, there are curious omissions. It is well known, for instance, that as chief of staff of the People's Liberation Army, Deng ordered a military crackdown on a Muslim-dominated county in Yunnan in 1975, prompting the massacre of over 1,600 people, some of them children.

more here.



Why do we get bored, and what is the point of boredom? The science of being sick and tired

Tosin Thompson in New Statesman:

So, what exactly is boredom?

The Oxford dictionary describes it as: “Feeling weary and impatient because one is unoccupied or lacks interest in one’s current activity”. For a feeling so common, it's surprising that the word first appeared written down in 1852, in Charles Dickens’ Bleak House. In it, Lady Dedlock says she is “bored to death” with her marriage. The late Robert Plutchik, a Professor Emeritus at the Albert Einstein College of Medicine, created a “Wheel of Emotions” (extended in order of intensity) in 1980, and placed boredom after disgust, as a milder form of disgust:

Although boredom is essential for human development, it’s been given a bad rap. “Boredom has traditionally been associated with a range of negative outcomes, both within the workplace and outside it,” Sandi Mann and Rebekah Cadman of the University of Central Lancashire write in their 2014 paper. Mann and Cadman examined the relationship between boredom and creative potential on a range of tasks in two studies. In the first study, 80 eager volunteers visited their lab only to be given the dull, monotonous chore of copying out lengthy lists of telephone numbers, or to be excluded from it (this was the control group), followed by the creative task of thinking of as many uses as possible for a pair of plastic cups. In the second study, a further 90 volunteers were split into three groups, each group being assigned to various types of boring activities (copying numbers, reading the numbers, or being excused from the whole thing – again, a control), followed by a creative task. “Results suggested that boring activities resulted in increased creativity and that boring reading activities lead to more creativity in some circumstances,” the authors write.

More here.

Pregnancy: Prepare for unexpected prenatal test results

Diana W. Bianchi in Nature:

A healthy pregnant woman has a blood test to rule out the possibility that her baby has certain abnormalities, such as Down's syndrome. One week later, a genetic counsellor calls her and recommends a follow-up test such as amniocentesis. When the counsellor calls again, she says that the baby is healthy but that the mother needs to be screened for cancer. Since 2011, clinicians have been able to analyse the genome of a fetus by sequencing DNA fragments found floating in the mother's blood. With the use of these non-invasive prenatal tests soaring (see 'Test scores'), mothers are increasingly facing unexpected, 'incidental' findings about their own health. As of late 2014, at least 26 pregnant women with abnormal blood-test results later learned that they had cancer. In 10 of them, the prenatal tests prompted the medical assessments that revealed this; in the other 16, the cancers were not discovered until the mothers developed symptoms.

Parents, obstetricians and physicians have been taken by surprise. Consent forms used by test providers rarely mention the possibility of findings concerning the mother's health. And caregivers have little guidance on what to do when such findings arise. Test providers need to rethink their consent forms to prevent unwarranted confusion and anxiety — not least, women deciding to terminate their pregnancies on the basis of wrong interpretations of test results. And professional societies, such as the American College of Medical Genetics and Genomics (ACMG), the American College of Obstetricians and Gynecologists and the Society for Maternal-Fetal Medicine (SMFM), need to take the lead on providing education and clinical guidance.

More here.

Hotel Melancholia

Suzanne Joinson in Aeon (image: Western Motel by Edward Hopper, 1957):

A person is not supposed to be in both Asia and Africa in the same week on a regular basis; the world should not be traversed at that speed. It was scrambling, discombobulating; worse, it was damaging – some central element of my subjective self was being ebbed away. Yet, I still said yes. I was the go-to girl for a last-minute flight to anywhere, and whenever I returned home, lightly tethered to a house-share in Brixton, south London, I plotted to be away again.

When I climbed out of a taxi on my way home, or dragged my suitcase towards my front door, I would think of Jean Rhys, writing in Good Morning, Midnight (1939): ‘Walking back in the night. Back to the hotel. Always the same hotel … You go up the stairs. Always the same stairs, always the same room.’ My life on a loop, searching for the new, but in reality going round in circles.

There is a part of the brain called the hippocampus that is shaped just like a seahorse. It is in many ways still an unconquered mystery, but it is believed to act as an internal sat-nav. It provides a crossroads between memory and the processing of location, and not just locations of geography and place – although it does deal in those, contextualising landmark objects and images to understand landscapes, interiors and scenes – but also the mapping of an emotional geography such as future goals and aspirations and how to reach them, or memory sequences, or the systemisation of our own personal narratives. It is how we understand where we are and how we put ourselves into the points of view of others. Depression has been found to have a dampening and distorting effect on the hippocampus, so that we become, in many layers of the word, lost.

I don’t know if my hippocampus navigator was suppressed by too much travel or if I was simply exhausted from a decade of avoiding intimate relationships and any semblance of a stable home. Whatever it was, the suicidal impulse triggered by the architecture of hotels and all the signifiers connected to them – key cards, long corridors, the ting of a service bell – kept growing stronger.

More here.

Separated at Birth

Excerpted from Midnight’s Furies: The Deadly Legacy of India’s Partition by Nisid Hajari, out now from Houghton Mifflin Harcourt, in Slate:

When they imagine the terrible riots that accompanied the 1947 partition of the Indian subcontinent, most people are picturing the bloodshed in the Punjab. On Aug. 15, the new border had split the province in two, leaving millions of Punjabi Hindus and Sikhs in what was now Pakistan, and at least as many Punjabi Muslims in India.

Gangs of killers roamed the border districts, slaughtering minorities or driving them across the frontier. Huge, miles-long caravans of refugees took to the dusty roads in terror. They left grim reminders of their passage—trees stripped of bark, which they peeled off in great chunks to use as fuel; dead and dying bullocks, cattle, and sheep; and thousands upon thousands of corpses lying alongside the road or buried shallowly. Vultures feasted so extravagantly that they could no longer fly.

As awful as the carnage was, though, it was for much of August concentrated in the Punjab. The combatants were mostly peasants, armed with crude weapons. If the two new governments had managed to quell the mayhem quickly, they might in time have found scope to cooperate on issues ranging from economic development to foreign policy. Instead, the infant India and Pakistan would soon be drawn into a rivalry that’s lasted almost 70 years and has cast a nuclear shadow over the subcontinent.

More here.

Review of Jean Dreze and Amartya Sen’s An Uncertain Glory: India and its Contradictions

Sucharita Mukherjee in Logos:

India’s extraordinary heterogeneities have long attracted, dismayed and befuddled visitors and researchers alike, but what is often obscured in discussions of the usual divides of religion, caste, gender, region and language is the burgeoning of income inequality. While India’s poor have been hard to ignore since the colonial era, the persistent deprivation of these masses of humanity seems all the starker in modern India when juxtaposed with the ever growing opulence of the rich.

Class and other inequalities combine to impose multiple vicious circles upon those scraping by at the most disadvantaged end of the social spectrum. Dreze and Sen identify this “resilient division” between the privileged and the rest in Indian society as India’s biggest challenge, yet one seldom highlighted in all the recent hype about Indian prosperity. Making a strong case for equality and social justice amid economic progress, An Uncertain Glory becomes an almost indispensable economic, social and political reader to understand India’s checkered development story.

The distinction between a narrow concept of economic growth, defined in terms of rising aggregate income levels, and a broad notion of economic development, defined as improvement in the average person’s standard of living, is often obscured in political and economic logics based on faith in the eventual trickle-down of growth from the top to propel a broader based development. Admittedly, many of India’s challenges such as poverty, economic and gender inequality or illiteracy are lasting vestiges of a colonial past which either aggravated them or left them unaddressed. Immediate post-independence policies also did little to address them adequately.

More here.

Tuesday, June 9, 2015

What really went on in Wonderland

Anthony Lane at The New Yorker:

Legend has it that a book came out of a boat trip, but nothing is ever that simple. The mathematician, moonlighting as an alchemist, turned things both animate and inanimate into different substances. Dodgson became a dodo (a word that toys not just with extinction but with Dodgson’s own tendency to stammer), while Duckworth, who later became chaplain to Queen Victoria, shrank into a duck; both creatures splash about not in a sun-warmed river but in a pool of a child’s tears. Alice Liddell became “Alice,” with no surname to tether her. “Alice’s Adventures Underground” became what we call, for the sake of convenience, “Alice in Wonderland,” although there is no such book. “Alice’s Adventures in Wonderland” was published in 1865; the hundred-and-fiftieth anniversary has been widely celebrated this year. In 1871 came “Through the Looking-Glass, and What Alice Found There”—another title that we often elide or get wrong. In that fable, our heroine walks into a wood where objects lose their names. She puts her hand on a tree, and can’t summon the word for it. Even her own identity escapes her: “Then it really has happened, after all! And now, who am I?”

Douglas-Fairhurst is at home with transformation. His previous work, “Becoming Dickens” (2012), the best and the most fine-fingered of the many books published to coincide with the bicentenary of the novelist’s birth, touched upon the genesis of “The Pickwick Papers,” “Oliver Twist,” and other early successes. If Dickens scholarship is a crowded field, however, Carroll studies should have a sign nailed firmly above the door: “Standing Room Only.”

more here.

How textiles repeatedly revolutionized technology

Virginia Postrel in Aeon:

The story of technology is in fact the story of textiles. From the most ancient times to the present, so too is the story of economic development and global trade. The origins of chemistry lie in the colouring and finishing of cloth. The textile business funded the Italian Renaissance and the Mughal Empire; it left us double-entry bookkeeping and letters of credit, Michelangelo's David and the Taj Mahal. As much as spices or gold, the quest for fabrics and dyestuffs drew sailors across strange seas. In ways both subtle and obvious, textiles made our world.

Most conspicuously, the Industrial Revolution started with the spinning jenny, the water frame, and the thread-producing mills in northern England that installed them. Before railroads or automobiles or steel mills, fortunes were made in textile technology. The new mills altered where people lived and how they worked. And the inexpensive fabrics they produced changed the way ordinary people looked.

Then, a second conspicuous wave of textile innovation began with the purple shade that francophile marketers named mauve. The invention of aniline dyes in the mid-19th century made a full spectrum of colour – including newly intense blacks – universally available. The synthetic-dye business gave rise to the modern chemical industry, and yet more technology-based fortunes.

More here.

Arabic-language, M.I.A.-inspired Israeli girl band

Gaar Adams in Foreign Policy:

The music video for “Habib Galbi” (Love of My Heart), a sorrowful Yemeni folk song, opens with a simple shot across the desert. Inside a small hut, an exasperated woman pulls back the woven curtain of a Bedouin tent and croons in Arabic over a hollow, hypnotic drumbeat and ghostly minor key: “Love of my heart and eyes, it is a wonder who has turned you against me.”

From the shisha-smoking old lady with kohl-lined eyes, to the Yemeni dance sequences and classically Arabic mournful undertones, “Habib Galbi” looks like it could be straight out of southern Arabia. And in some ways, it is: The song is sung in authentic Yemeni dialect and is composed from the lyrics of ancient Yemeni folk songs. When a Yemeni friend recently played “Habib Galbi” for his elderly grandmother in Sanaa, their accents were so good she thought that the all-girl singing trio might be from the Haraz, a rugged mountainous region just west of the capital.

But the sandy landscape in the music video is far from the Haraz Mountains — it was shot over 1,500 miles away in the Arabah region of southern Israel. Though the Arabic may sound effortless, those singing it actually only know the language as a second tongue. And the band — called A-Wa, a stylized transliteration of Arabic slang for “yeah” — hasn’t even come close to setting foot in Yemen. They’re Israeli.

More here.

Hacking the Brain: How we might make ourselves smarter in the future

Maria Konnikova in The Atlantic:

The perfectibility of the human mind is a theme that has captured our imagination for centuries—the notion that, with the right tools, the right approach, the right attitude, we might become better, smarter versions of ourselves. We cling to myths like “the 10 percent brain”—which holds that the vast majority of our thinking power remains untapped—in part because we hope the minds of the future will be stronger than those of today. It’s as much a personal hope as a hope for civilization: If we’re already running at full capacity, we’re stuck, but what if we’re using only a small fraction of our potential? Well, then the sky’s the limit. But this dream has a dark side: The possibility of a dystopia where an individual’s fate is determined wholly by his or her access to cognition-enhancing technology. Where some ultra-elites are allowed to push the limits of human intelligence, while the less fortunate lose any chance of upward mobility. Where some Big Brother–like figure could gain control of our minds and decide how well we function.

What’s possible now, and what may one day be? In a series of conversations with neuroscientists and futurists, I glimpsed a vision of a world where cognitive enhancement is the norm. Here’s what that might look like, and how we can begin thinking about the implications.

More here.

The Evidence Points to a Better Way to Fight Insomnia

Austin Frakt in The New York Times:

Insomnia is worth curing. Though causality is hard to assess, chronic insomnia is associated with greater risk of anxiety, depression, hypertension, diabetes, accidents and pain. Not surprisingly, and my own experience notwithstanding, it is also associated with lower productivity at work. Patients who are successfully treated experience improved mood, and they feel healthier, function better and have fewer symptoms of depression. Which remedy would be best for me? Lunesta, Ambien, Restoril and other drugs are promised by a barrage of ads to deliver sleep to minds that resist it. Before I reached for the pills, I looked at the data. Specifically, for evidence-based guidance, I turned to comparative effectiveness research. That’s the study of the effects of one therapy against another therapy. This kind of head-to-head evaluation offers ideal data to help patients and clinicians make informed treatment decisions. As obvious as that seems, it’s not the norm. Most clinical drug trials, for instance, compare a drug with a placebo, because that’s all that’s required for F.D.A. approval. In recognition of this, in recent years more federal funding has become available for comparative effectiveness research.

When it comes to insomnia, comparative effectiveness studies reveal that sleep medications aren’t the best bet for a cure, despite what the commercials say. Several clinical trials have found that they’re outperformed by cognitive behavioral therapy. C.B.T. for insomnia (or C.B.T.-I.) goes beyond the “sleep hygiene” most people know, though many don’t employ — like avoiding alcohol or caffeine near bedtime and reserving one’s bed for sleep (not reading or watching TV, for example). C.B.T. adds — through therapy visits or via self-guided treatments — sticking to a consistent wake time (even on weekends), relaxation techniques and learning to rid oneself of negative attitudes and thoughts about sleep.

More here.

Tuesday Poem

Homan and Chicago Ave.

Cross the blood

that quilts your busted lip
with the tender tip
of   your tongue. That lip’s
blood is brackish and white
meat flares from the black
swell. You crossed your mama’s
mind so call her sometimes.
She dreams your dead daddy
still puts his hands on her
waist. She calls his name
then crosses herself, calls
the police then crosses
her fingers. Cross me
and get cut across your cheek,
its fat bag full of   bad words
and cheap liquor you hide
from your badass kids. Make
a wish for bad weather
when the hoodlums get to shooting
in a good summer’s heat.
Cross the territory between
two gangs and feel eyes stare
and cross in a blur of crosshairs.
When a shot man lands
in the garden of trash the block
flares up like an appetite
spurred on by the sight
of prey, by the slurred
prayer of a man so death-close
he sees buzzards burrow
their bladed beaks into
his entry wound. Tune
the trumpets. Make way through
dusk’s clutter. After death
the dead cross over into song,
their bones tuning-forked
into vibrancy. Cross your lips,
mutiny against all speech
when a corpse starts singing
despite its leaded larynx. Don’t
say miracle when butterflies
break from a death-gaped skull,
rout the sky, and scatter.

by Phillip B. Williams
from Poetry Magazine, 2013

Mind Your Own Business

Barbara Ehrenreich in The Baffler (image by Lisa Haney):

At about the beginning of this decade, mass-market mindfulness rolled out of the Bay Area like a brand new app. Very much like an app, in fact, or a whole swarm of apps. Previous self-improvement trends had been transmitted via books, inspirational speakers, and CDs; now, mindfulness could be carried around on a smartphone. There are hundreds of them, these mindfulness apps, bearing names like Smiling Mind and Buddhify. A typical example features timed stretches of meditation, as brief as one minute, accompanied by soothing voices, soporific music, and images of forests and waterfalls.

This is Buddhism sliced up and commodified, and, in case the connection to the tech industry is unclear, a Silicon Valley venture capitalist blurbed a seminal mindfulness manual by calling it “the instruction manual that should come with our iPhones and BlackBerries.” It’s enough to make you think that the actual Buddha devoted all his time under the Bodhi Tree to product testing. In the mindfulness lexicon, the word “enlightenment” doesn’t have a place.

In California, at least, mindfulness and other conveniently accessible derivatives of Buddhism flourished well before BlackBerries. I first heard the word in 1998 from a wealthy landlady in Berkeley, advising me to be “mindful” of the suffocating Martha Stewart-ish decor of the apartment I was renting from her, which of course I was doing everything possible to un-see. A possible connection between her “mindfulness” and Buddhism emerged only when I had to turn to a tenants’ rights group to collect my security deposit. She countered with a letter accusing people like me—leftists, I suppose, or renters—of oppressing Tibetans and disrespecting the Dalai Lama.

More here.

The Science of Scarcity

Cara Feinberg in Harvard Magazine (Photograph by Jim Harrison):

TOWARD THE END of World War II, while thousands of Europeans were dying of hunger, 36 men at the University of Minnesota volunteered for a study that would send them to the brink of starvation. Allied troops advancing into German-occupied territories with supplies and food were encountering droves of skeletal people they had no idea how to safely renourish, and researchers at the university had designed a study they hoped might reveal the best methods of doing so. But first, their volunteers had to agree to starve.

The physical toll on these men was alarming: their metabolism slowed by 40 percent; sitting on atrophied muscles became painful; though their limbs were skeletal, their fluid-filled bellies looked curiously stout. But researchers also observed disturbing mental effects they hadn’t expected: obsessions about cookbooks and recipes developed; men with no previous interest in food thought—and talked—about nothing else. Overwhelming, uncontrollable thoughts had taken over, and as one participant later recalled, “Food became the one central and only thing really in one’s life.” There was no room left for anything else.

Though these odd behaviors were just a footnote in the original Minnesota study, to professor of economics Sendhil Mullainathan, who works on contemporary issues of poverty, they were among the most intriguing findings. Nearly 70 years after publication, that “footnote” showed something remarkable: scarcity had stolen more than flesh and muscle. It had captured the starving men’s minds.

More here.

A Crisis at the Edge of Physics

Adam Frank and Marcelo Gleiser in the NYT (image by Gérard DuBois):

DO physicists need empirical evidence to confirm their theories?

You may think that the answer is an obvious yes, experimental confirmation being the very heart of science. But a growing controversy at the frontiers of physics and cosmology suggests that the situation is not so simple.

A few months ago in the journal Nature, two leading researchers, George Ellis and Joseph Silk, published a controversial piece called “Scientific Method: Defend the Integrity of Physics.” They criticized a newfound willingness among some scientists to explicitly set aside the need for experimental confirmation of today’s most ambitious cosmic theories — so long as those theories are “sufficiently elegant and explanatory.” Despite working at the cutting edge of knowledge, such scientists are, for Professors Ellis and Silk, “breaking with centuries of philosophical tradition of defining scientific knowledge as empirical.”

Whether or not you agree with them, the professors have identified a mounting concern in fundamental physics: Today, our most ambitious science can seem at odds with the empirical methodology that has historically given the field its credibility.

How did we get to this impasse?

More here.

Monday, June 8, 2015

Sunday, June 7, 2015

Must We Mean What We Say? On Stanley Cavell

Charles Petersen in n + 1:

Stanley Cavell, born in 1926 and now 86 years old, is one of the greatest American philosophers of the past half-century. He was also something of a musical prodigy, and like many prodigies his accomplishments struck him as a matter of fraud. During his freshman year at Berkeley, he writes in Little Did I Know, his 2010 memoir, he walked into one of his first piano courses and was asked to prove he had the requisite chops by playing a piece on the spot. Not having practiced anything but jazz for years—this was 1944, and big band swing was at its peak—the budding pianist sat down at the bench, broke into a half-remembered theme from a Liszt impromptu, and “stopped playing as the theme was about to elaborate itself, as if I could have gone on to the end were there time and need.” He could not have gone on to the end, nor even a note further, but his teacher, a brilliant young pianist with some of the look of Marlene Dietrich, was nonetheless taken in. “Isn’t it fine to hear a man’s touch at the piano?” she said to the class. Cavell felt smitten, but also unmanned. “It is true that I had really done whatever . . . . I had done, but I could not go on.” Although he could play almost anything on demand—and would later win praise from Ernest Bloch, Milton Babbitt, and Roger Sessions, rescuing the premiere of one of the latter’s works through an emergency mid-concert transposition—for Cavell it was as if each new performance followed only from instinct, without the understanding that promised a way forward. No matter his successes, he couldn’t escape the feeling that he was a fraud.

Two decades later, in 1965, Cavell, having abandoned music for philosophy, returned to the problem of fraudulence in a now classic essay, “Music Discomposed.” (It would become a centerpiece in his landmark first collection, Must We Mean What We Say? [1969].) The motivating question of the essay — “How can fraudulent art be exposed?” — though couched in the nomenclature of composers like Cage and Stockhausen, seems now, in light of Cavell’s memoir, to be addressed as much to his own uncertainties as a young musician.

More here.