The Nation Is Made Of These

Javed Jabbar in Outlook:

The further we move on from March 23, 1940—and from December 16, 1971, when the original Pakistan disintegrated—the stronger and deeper, and at the same time, more stressful, becomes the search for a cohesive sense of Pakistani nationalism. Composed of several elements, Muslim identity is the prime driver of Pakistani nationalism, but not exclusively so. Non-Muslim identities are small, yet vital and intrinsic parts of Pakistani nationalism. Foremost among them are Pakistani Hindu and Buddhist citizens. In a foundational sense, Pakistani Hindus and Buddhists are the oldest Pakistanis. Their ancestors have lived upon the lands that constitute our territory for centuries before the first Muslims arrived, or before the first local was converted to Islam.

If roots in territory make people eligible to be regarded as ‘sons of the soil’, then these non-Muslims are the first sons and daughters of Pakistan. It is unfortunate that, in the name of a state entity created less than 70 years ago, peoples whose ancestors have lived on these very territories for over 7,000 years are now ‘minorities’. Worse, many of them are regarded (and regard themselves) as second-class citizens. Adherents of all non-Islamic faiths are ineligible, by virtue of their religion, to be elected president or prime minister. This is so despite the fact that Article 20 of the Constitution grants—subject to law, public order and morality—freedom to profess religion and to manage religious institutions. Article 25, which deals with citizens’ equality, says, in part: “All citizens are equal before law and are entitled to equal protection of law.” This is further reinforced by Article 36, which obliges the state to safeguard the legitimate rights and interests of minorities, including their due representation in the federal and provincial services. Working with Hindus, Christians and Zoroastrians for over 30 years in remote areas and in urban centres, this writer has first-hand experience of their attitudes and actions. These non-Muslim Pakistanis have a deep love for Pakistan. They are proud Pakistanis, and contribute abundantly to the country’s progress—and little to its problems! Yet Pakistani Hindus are often regarded by segments of the state and society as Indian ‘agents’ or as being inherently disloyal to Pakistan. Such suspicion is a profanity in itself.

More here.

Tuesday Poem

Gaia's Men

The winds are high, the sun bright.
Near the service station door are parked
two white work trucks, both overloaded
with tillers, shovels and rakes,
the hood is up on one . . . no smoke . . . out of oil, I guess.
I hear a deep melodic voice, but pretend
I’ve heard nothing. Choosing, instead, to watch
the numbers on the pump climb higher,
twelve gallons–– $30.00––seven more gallons
before my tank will be full.
As I wait for the dull thump of the pump as it turns off
I hear the voice again, “Can I pump your gas for you?”
I turn to see two thin men, standing near the trucks.
They have pulled off their overalls,
but their faces are still dirt-smeared,
and their hair bound beneath baseball caps and rags.
All day they have cut furrows in the field just up the road.
I wonder, if Gaia has missed these times
when the men return to resume cutting
wayward roots out of her body,
smoothing lumps with their Hula Hoes,
leaving behind their offering of seeds, and fresh water.
Now, their toil complete, they flirt with Aphrodite,
hoping she will lead them to a warm
cleansing downpour,
hoping she will dry their bodies
with her long wheat colored hair.
Wrap her arms around their narrow waists
kiss their honey brown skin,
rub, and oil their aching muscles
allow them to rest heavy heads on her full breast.

by Georgia Anne Banks Martin
from Thanal Online

A Prescription for Frustration

Abigail Zuger in The New York Times:

Time spent in hospitals brings out the inner Chekhov in some doctors, the inner Che in others. Then there are the occasional hybrids, the storytellers who secretly plot revolution and the revolutionaries who wind up telling fairy tales. One might argue that Dr. Leana Wen and Dr. Joshua Kosowsky belong to the latter group. At least, their impressive “When Doctors Don’t Listen” is a manifesto motivated by very active imaginations, not that this necessarily diminishes the book’s importance. The authors, both emergency room physicians at Brigham and Women’s Hospital in Boston, do a fine job of sorting through most of the serious problems in American medicine today, including the costs, overtesting, overprescribing, overlitigation and general depersonalization. All are caused at least in part, they argue, by the increasing use of algorithms in medical care. Algorithms are flow charts, created by groups ranging from individual hospitals to large professional organizations, dictating what tests doctors should order and what medications they should prescribe in hundreds of different situations. Deployed throughout medicine, the algorithms are perhaps used most frequently in emergency rooms, where any single word a patient utters may set off a long cascade of programmed activity.

…The book’s insights and cautionary tales should appeal to medical and lay readers alike: they combine into a superb analysis of how doctors listen and think, and offer detailed suggestions for how they could do both better. But when the authors embark on an earnest campaign for patients to grab the reins and steer their own wayward doctors gently but firmly onto the right path — there, I would argue, is where the fantasy begins. “Participate in your physical exam,” they urge their readers. “Make the differential diagnosis together.” A healthy person might take on that difficult assignment, especially the advice about learning how to tell an effective story (start at the beginning, don’t leave anything out, avoid technical terms). But most acutely ill people just want an experienced person to take over, to do what has to be done and do it fast.

More here.

Karl Marx: A Man of His Time

Jonathan Freedland reviews Jonathan Sperber's Karl Marx, in the NYT:

The Karl Marx depicted in Jonathan Sperber’s absorbing, meticulously researched biography will be unnervingly familiar to anyone who has had even the most fleeting acquaintance with radical politics. Here is a man never more passionate than when attacking his own side, saddled with perennial money problems and still reliant on his parents for cash, constantly plotting new, world-changing ventures yet having trouble with both deadlines and personal hygiene, living in rooms that some might call bohemian, others plain “slummy,” and who can be maddeningly inconsistent when not lapsing into elaborate flights of theory and unintelligible abstraction.

Still, it comes as a shock to realize that the ultimate leftist, the father of Communism itself, fits a recognizable pattern. It’s like discovering that Jesus Christ regularly organized bake sales at his local church. So inflated and elevated is the global image of Marx, whether revered as a revolutionary icon or reviled as the wellspring of Soviet totalitarianism, that it’s unsettling to encounter a genuine human being, a character one might come across today. If the Marx described by Sperber, a professor at the University of Missouri specializing in European history, were around in 2013, he would be a compulsive blogger, and picking Twitter fights with Andrew Sullivan and Naomi Klein.

But that’s cheating. The express purpose of “Karl Marx: A Nineteenth-­Century Life” is to dispel the dominant notion of a timeless Marx — less man, more ideological canon — and relocate him where he lived and belonged, in his own time, not ours. Standing firm against the avalanche of studies claiming Marx as forever “our contemporary,” Sperber sets out to depict instead “a figure of the past,” not “a prophet of the present.”

Religion Without God

Ronald Dworkin in the NYRB:

The familiar stark divide between people of religion and without religion is too crude. Many millions of people who count themselves atheists have convictions and experiences very like and just as profound as those that believers count as religious. They say that though they do not believe in a “personal” god, they nevertheless believe in a “force” in the universe “greater than we are.” They feel an inescapable responsibility to live their lives well, with due respect for the lives of others; they take pride in a life they think well lived and suffer sometimes inconsolable regret at a life they think, in retrospect, wasted. They find the Grand Canyon not just arresting but breathtakingly and eerily wonderful. They are not simply interested in the latest discoveries about the vast universe but enthralled by them. These are not, for them, just a matter of immediate sensuous and otherwise inexplicable response. They express a conviction that the force and wonder they sense are real, just as real as planets or pain, that moral truth and natural wonder do not simply evoke awe but call for it.

There are famous and poetic expressions of the same set of attitudes. Albert Einstein said that though an atheist he was a deeply religious man:

To know that what is impenetrable to us really exists, manifesting itself as the highest wisdom and the most radiant beauty which our dull faculties can comprehend only in their most primitive forms—this knowledge, this feeling, is at the center of true religiousness. In this sense, and in this sense only, I belong in the ranks of devoutly religious men.

Percy Bysshe Shelley declared himself an atheist who nevertheless felt that “The awful shadow of some unseen Power/Floats though unseen among us….” Philosophers, historians, and sociologists of religion have insisted on an account of religious experience that finds a place for religious atheism. William James said that one of the two essentials of religion is a sense of fundamentality: that there are “things in the universe,” as he put it, “that throw the last stone.” Theists have a god for that role, but an atheist can think that the importance of living well throws the last stone, that there is nothing more basic on which that responsibility rests or needs to rest.

What’s The Question About Your Field That You Dread Being Asked?

Over at Edge, a number of thinkers answer this question. Jon Kleinberg, Professor of Computer Science, Cornell University:

“How can we have this much data and still not understand collective human behavior?”

If you want to study the inner workings of a giant organization distributed around the globe, here are two approaches you could follow—each powerful, but very different from each other. First, you could take the functioning of a large multinational corporation as your case study, embed yourself within it, watch people in different roles, and assemble a picture from these interactions. Alternately, you could do something very different: take the production of articles on Wikipedia as your focus, and download the site's complete edit-by-edit history back to the beginning: every revision to every page, and every conversation between two editors, time-stamped and labeled with the people involved. Whatever happened in the sprawling organization that we think of as Wikipedia—whatever process of distributed self-organization on the Internet it took to create this repository of knowledge—a reflection of it should be present and available in this dataset. And you can study it down to the finest resolution without ever getting up from your couch.

These Wikipedia datasets—and many other sources like them—are completely public; the same story plays out with restricted access if you're a data scientist at Facebook, Amazon, Google, or any of a number of other companies: every conversation within people's interlocking social circles, every purchase, every expression of intent or pursuit of information. And with this hurricane of digital records, carried along in its wake, comes a simple question: How can we have this much data and still not understand collective human behavior?

There are several issues implicit in a question like this. To begin with, it's not about having the data, but about the ideas and computational follow-through needed to make use of it—a distinction that seems particularly acute with massive digital records of human behavior. When you personally embed yourself in a group of people to study them, much of your data-collection there will be guided by higher-level structures: hypotheses and theoretical frameworks that suggest which observations are important. When you collect raw digital traces, on the other hand, you enter a world where you're observing both much more and much less—you see many things that would have escaped your detection in person, but you have much less idea what the individual events mean, and have no a priori framework to guide their interpretation. How do we reconcile such radically different approaches to these questions?

Monday, April 1, 2013

Sunday, March 31, 2013

From Pygmalion to Bladerunner, we keep falling for our robot creations. But then, what else is AI good for?

George Zarkadakis in Aeon:

Artificial intelligence is arguably the most useless technology that humans have ever aspired to possess. Actually, let me clarify. It would be useful to have a robot that could make independent decisions while, say, exploring a distant planet, or defusing a bomb. But the ultimate aspiration of AI was never just to add autonomy to a robot’s operating system. The idea wasn’t to enable a computer to search data faster by ‘understanding patterns’, or communicate with its human masters via natural language. The dream of AI was — and is — to create a machine that is conscious. AI means building a mechanical human being. And this goal, as supposedly rational technological projects go, is deeply strange.

Consider the ramifications of a conscious machine: one that thinks and feels like a human, an ‘electronic brain’ that dreams and ponders its own existence, falls in and out of love, writes sonnets under the moonlight, laughs when happy and cries when sad. What exactly would it be good for? What could be the point of spending billions of dollars and countless hours of precious research time in order to arrive at a replica of oneself?

Technology is a cultural phenomenon, and as such it is molded by our cultural values. We prefer good health to sickness so we develop medicine. We value wealth and freedom over poverty and bondage, so we invent markets and the multitudinous thingummies of comfort. We are curious, so we aim for the stars. Yet when it comes to creating conscious simulacra of ourselves, what exactly is our motive? What deep emotions drive us to imagine, and strive to create, machines in our own image? If it is not fear, or want, or curiosity, then what is it? Are we indulging in abject narcissism? Are we being unforgivably vain? Or could it be because of love?

More here.

Is Wagner bad for us?

Nicholas Spice in the London Review of Books:

In one of the European galleries at the British Museum, there’s a bronze medal of Erasmus made in Antwerp in 1519 by the artist Quentin Metsys. A portrait of Erasmus in profile is on the front of the medal. On the reverse, the smiling bust of Terminus, the Roman god of boundaries, and the words ‘concedo nulli’ – ‘I yield to no one.’ It’s said that Erasmus kept a figurine of the god Terminus on his desk. He wrote: ‘Out of a profane god I have made myself a symbol exhorting decency in life. For death is the real terminus that yields to no one.’

Like anyone who has spent time thinking about Wagner, I have inevitably come back to the subject of boundaries and limits, and in particular to questions about the boundary that lies between Wagner’s works and his listeners, and about the experience, apparently not uncommon, of that boundary becoming blurred or even disappearing, an experience that may hold a clue to the feeling, also not uncommon, that Wagner’s work is in some sense not altogether good for us.

Respecting boundaries was not Wagner’s thing. Transgression he took in his stride – stealing other men’s wives when he needed them, spending other people’s money without worrying too much about paying it back – while artistically his ambitions knew no bounds. There is something awe-inspiring about his productivity under hostile conditions, the way, though living on the breadline, he turned out masterpieces when there was no reasonable prospect of any of them being performed: gigantic works, pushing singers and musicians to the limits of their technique, and taking music itself to the edges of its known universe. Theft; the breaking of vows, promises and contracts; seduction, adultery, incest, disobedience, defiance of the gods, daring to ask the one forbidden question, the renunciation of love for power, genital self-mutilation as the price of magic: Wagner’s work is everywhere preoccupied with boundaries set and overstepped, limits reached and exceeded. ‘Wagnerian’ has passed into our language as a byword for the exorbitant, the over-scaled and the interminable.

More here.

Worst Magazine Cover of the Year?

Seth Mnookin in Slate:

A decade ago, when I was on the national desk at Newsweek, a handful of us would spend slow nights competing to see who could come closest to writing the Platonic ideal of a perfect coverline.

The game only had one real guideline: The headlines had to be vaguely rooted in reality.

That’s a journalistic precept that Time feels free to ignore. Witness the headline emblazoned in all-caps on the cover of the magazine’s April 1 issue: “HOW TO CURE CANCER.” It’s followed by an asterisk that directs you to a subtitle, just to make sure you get the point: “Yes, it’s now possible, thanks to new cancer dream teams that are delivering better results faster.”

Which, of course, is completely, utterly, inarguably false. The roughly 580,000 Americans who will die this year from cancer know the reality all too well. For some context, that’s more people than will die from chronic lower respiratory diseases, strokes, accidents, Alzheimer’s disease, and diabetes combined.

That’s not to say that there haven’t been major advances in treating some types of cancer, including acute lymphoblastic leukemia in children, testicular cancer in men, and early-stage breast cancer in women. On the whole, however, our ability to treat solid tumors in late-stage disease remains, in the words of Nita Maihle, the director of Yale’s Biology of Reproductive Tract Cancers Program, “abysmal.”

More here.

“V.” at L: Pynchon’s First Novel Turns Fifty

Alexander Nazaryan in The New Yorker:

Penguin recently announced that Thomas Pynchon will publish his next novel, “Bleeding Edge,” this fall. Set in Manhattan’s “Silicon Alley,” it will mark Pynchon’s literary return to New York City, where he has not ventured since his début, “V.,” published fifty years ago this month. In the intervening years, Pynchon has journeyed far and wide: Southern California (“The Crying of Lot 49” and “Inherent Vice”), Northern California (“Vineland”), Chicago (“Against the Day”), the American colonies (“Mason & Dixon”), and pretty much all of Europe, Harvard Square, Namibia, and Siberia (“Gravity’s Rainbow”).

The world, too, has changed a little since Benny Profane chased alligators through the sewers of Manhattan. Medgar Evers was killed three months after the publication of “V.,” and J.F.K. five months after that. Then R.F.K. and M.L.K. There was the rise of acid and pot, the riots of Newark and Detroit.

Despite all of the places he’s travelled, despite the near-infinite reach of his fiction, there is nevertheless a tendency, I find, to think of the media-averse Pynchon as hermetically sealed in a vat of his own ideas, puns, and fears. His famous paranoia has to it a pervasive, timeless quality, equally suspicious of all creeds and systems, of individuals and corporations alike.

But to read “V.” today is to experience Pynchon anew. Blast through the multilayered densities of “Gravity’s Rainbow,” “Mason & Dixon,” and “Against the Day,” and you have a young Cornell graduate, an engineer from Long Island, writing with an earnestness you might not have expected, about a world he could never recover. And though we think of Pynchon as the progenitor of postmodern irony, the novel’s central theme, as uttered by the jazz saxophonist McClintic Sphere, is one of sly but unmistakable sincerity: “Keep cool but care.”

I should confess that I have no idea what “V.” is about—and I have read it twice. It may be about Benny Profane, a hopeless schlemiel who, having been discharged from the Navy, bounces around New York City with a comically harmless gang called the Whole Sick Crew, spending a good amount of time in the aforementioned crocodilian pursuit. Or the novel could be about Herbert Stencil, the son of a prominent British consular official, Sidney Stencil, who had “died under unknown circumstances in 1919 while investigating the June Disturbances in Malta.” Stencil’s entire existence is focused on the hunt for V., a classic novelistic quest-without-resolution (in fact, V. might be fiction’s greatest example of a MacGuffin). V. may be a person, or may be a place, though it could also be neither: Pynchon calls it, at one point, “a remarkably scattered concept” and, at another, “the ultimate Plot Which Has No Name.”

No Thing

Richard Marshall reviews No Medium by Craig Dworkin, in 3:AM Magazine:

Something has been fixed in. Something about nothingness, about unreadability and unwritability, about silence and absence, abjection and a special kind of boredom. Craig Dworkin’s book is about an aspect of this fix. He looks at “works that are blank, erased, clear, or silent.” He argues that “we should understand media not as blank, base things but as social events, and that there is no medium, understood in isolation, but always a plurality of media: interpretive activities taking place in socially inscribed places.” The last chapter gives a list of key examples of more than 100 scores and readings of ‘silent’ music.

Blanchot’s ‘gigantic’ de Sade impressed Beckett as being “jealous of Satan and of his eternal torments, and confronting nature more than human-kind.” Satan’s torments were in darkness, alone and in an eternity of ice. Jealousy is a feisty off-shoot of ambition. So why is de Sade jealous? De Sade is jealous of the perturbability of Satan. The 120 Days of Sodom reads like an accountant’s log. What disturbed Beckett when he read Kafka was the imperturbability. “I am wary of disasters that allow themselves to be recorded like a statement of accounts.” De Sade fails in his gigantic quest to be disturbed and so is jealous of Satan’s achievement. This links to the modern fix. In the modern fix there is a crucial disturbance freaking in blankness. There is an instinct in this stuff to not tone down what is mistakenly taken to be superfluous. Oddly, complexity and the amorphous can seem abstract. But they are correspondences of a desperate tormented plenum wriggling at the abyss. Torment in this mode stands time still, skips lives, makes space hard to cross. This is the liveliness of a “nothing that is not there and the nothing that is.”

“You would do better, at least no worse, to obliterate texts than to blacken margins, to fill in the holes of words till all is blank and flat and the whole ghastly business looks like what it is, senseless, speechless, issueless misery.” That’s Sam Beckett. Carl Andre says, “A thing is a hole in a thing it is not.” Dworkin starts to work out what he calls the logic of the substrate by examining the blank-paged poetry book Nudism in Jean Cocteau’s film Orphee of 1950. It is considered a pretentious joke in the film by Orpheus. Dworkin suggests that a sophisticated reading would get that it was a joke, but that a more sophisticated reading would refuse to get the joke. It depends on “how closely one reads a work that seems to ask only that it not be read.” At more or less the same time John Cage was delivering his ‘Lecture on Nothing’ where he said, “I have nothing to say and I am saying it.”

Doctor Who and the New British Empire

Chris Oates in the LA Review of Books:

Doctor Who is so British that Brits tend to disbelieve that it has become popular in the US. Their reaction at being told that one of their quirky national traditions attracts an audience unfamiliar with tea towels and gap years is a bit like an American being told that the Nathan’s Hot Dog Eating Contest is being livestreamed unironically across France. Really? That’s what you’re watching? But only we watch that.

First broadcast in 1963, Doctor Who centers on a humanoid alien, the Doctor, who travels throughout time and space with a human companion from contemporary Britain, fighting aliens and extricating himself from hopeless situations. The show was famous for its low production values. The Doctor’s spaceship/time machine, the TARDIS, is a wooden box that, notwithstanding its transgalactic origins, looks exactly like a police telephone booth from 1960s Britain. The Doctor’s greatest enemies, the Daleks, are slightly smaller wooden boxes whose main weapons look strikingly like toilet plungers. Nonetheless, it was a hit. The show was in production until 1989 and rebooted in 2005. In the UK, the show is a bit like Star Trek. It often inspires sketches for the annual Comic Relief telethon, which in 2011 got a 37-percent audience share, unheard of in the US, where a network on a strong night might average 14 percent. The Guardian art critic Jonathan Jones has called Doctor Who “Britain’s greatest television show.” It has that kind of hyperbolically vaunted status.

Doctor Who is also quintessentially British not because it is made in Britain or because it is popular in Britain, but because it reflects the development of the United Kingdom’s place in the world in the past half century. The show continued the youth adventure literature enabled and encouraged by imperialism into a post-imperial time. The Doctor acts as the epitome of how Britons (and perhaps Westerners in general) would like to see themselves and their actions in the world.

Sunday Poem

Easter in the Cancer Ward

Because it has been years since my hands
have dyed an egg or I’ve remembered
my father with color in his beard,
because my fingers have forgotten
the feel of wax melting on my skin,
the heat of paraffin warping air,
because I prefer to view death politely from afar,
I agree to visit the children’s cancer ward.

In her ballet-like butterfly slippers, Elaine pad-pads
down the carpeted hall. I bring the bright bags,
press down packets of powdered dye, repress my slight unease.
She sweeps her hair from her volunteer’s badge, leaves
behind her own residents’ ward for a few hours’ release.
The new wing’s doors glide open onto great light. Everything is
vibrant and clattered with color. Racing
up, children converge, their green voices rising.

What does one do with the embarrassment of staring
at sickness? Suddenly, I don’t know where to place
my hands. Children with radiant faces
reach out thinly, clamor for the expected bags, lead
us to the Nurses’ kitchen. Elaine introduces me and reads
out a litany of names. Some of the youngest wear
old expressions. The bald little boy loves Elaine’s long mane of hair
and holds the healthy thickness to his face, hearing

her laugh as she pulls him close. “I’m dying,”
he says, and Elaine tells him she is, too: too
much iron silting her veins. I can never accept that truth
yet, in five months, she’ll slip away in a September
night – leaving her parents and me to bow our heads, bury her
in a white wedding gown, our people’s custom.
But right now, I don’t know this. Right now, we are young,
still immortal, and the kids fidget, crying

out for their eggs. Elaine divides them into teams;
I lay out the tools for the operation.
I tell them all how painting Easter eggs used to be done
in the Old Country. Before easy dyes were common,
villagers boiled onion peels, ladled eggs
into pots so the shells wouldn’t break.
They’d scoop them out, flushed a brownish-
red, and the elders would polish and polish
Read more »

How Nature Resets Our Minds and Bodies

From The Atlantic:

Just before the dawn of the twentieth century, William James, one of the early giants of modern psychology, explained that human attention comes in two different forms. The first is directed attention, which enables us to focus on demanding tasks like driving and writing. Reading a book also requires directed attention, and you'll notice that you start to zone out when you're tired, or when you've been reading for hours at a time. The second form is involuntary attention, which comes easily and doesn't require any mental effort at all. As James explained, “Strange things, moving things, wild animals, bright things, pretty things, words, blows, blood, etc., etc., etc.” all attract our attention involuntarily.

Nature restores mental functioning in the same way that food and water restore bodies. The business of everyday life — dodging traffic, making decisions and judgment calls, interacting with strangers — is depleting, and what man-made environments take away from us, nature gives back. There's something mystical and, you might say, unscientific about this claim, but its heart actually rests in what psychologists call attention restoration theory, or ART. According to ART, urban environments are draining because they force us to direct our attention to specific tasks (e.g., avoiding the onslaught of traffic) and grab our attention dynamically, compelling us to “look here!” before telling us to instead “look over there!” These demands are draining — and they're also absent in natural environments. Forests, streams, rivers, lakes, and oceans demand very little from us, though they're still engaging, ever changing, and attention-grabbing. The difference between natural and urban landscapes is how they command our attention. While man-made landscapes bombard us with stimulation, their natural counterparts give us the chance to think as much or as little as we'd like, and the opportunity to replenish exhausted mental resources.

More here.

Maya Angelou: my terrible, wonderful mother

From The Guardian:

The first decade of the 20th century was not a great time to be born black and poor and female in St Louis, Missouri, but Vivian Baxter was born black and poor, to black and poor parents. Later she would grow up and be called beautiful. As a grown woman she would be known as the butter-coloured lady with the blowback hair.

My mother, who was to remain a startling beauty, met my father, a handsome soldier, in 1924. Bailey Johnson had returned from the first world war with officer's honours and a fake French accent. They were unable to restrain themselves. They fell in love while Vivian's brothers walked around him threateningly. He had been to war, and he was from the south, where a black man learned early that he had to stand up to threats, or else he wasn't a man. The Baxter boys could not intimidate Bailey Johnson, especially after Vivian told them to lay off. Vivian's parents were not happy that she was marrying a man from the south who was neither a doctor nor lawyer. He said he was a dietician. The Baxters said that meant he was just a negro cook. Vivian and Bailey left the contentious Baxter atmosphere and moved to California, where little Bailey was born. I came along two years later. My parents soon proved to each other that they couldn't stay together. They were matches and gasoline. They even argued about how they were to break up. Neither wanted the responsibility of taking care of two toddlers. They separated and sent me and Bailey to my father's mother in Arkansas. I was three and Bailey was five when we arrived in Stamps, Arkansas. We had identification tags on our arms and no adult supervision. I learned later that Pullman car porters and dining car waiters were known to take children off trains in the north and put them on other trains heading south.

Save for one horrific visit to St Louis, we lived with my father's mother, Grandmother Annie Henderson, and her other son, Uncle Willie, in Stamps until I was 13. The visit to St Louis lasted only a short time but I was raped there and the rapist was killed. I thought I had caused his death because I told his name to the family. Out of guilt, I stopped talking to everyone except Bailey. I decided that my voice was so powerful that it could kill people, but it could not harm my brother because we loved each other so much. My mother and her family tried to woo me away from mutism, but they didn't know what I knew: that my voice was a killing machine. They soon wearied of the sullen, silent child and sent us back to Grandmother Henderson in Arkansas, where we lived quietly and smoothly within my grandmother's care and under my uncle's watchful eye.

More here.

Saturday, March 30, 2013

THE FACTS, THE MYTHS AND THE FRAMING OF IMMIGRATION

Kenan Malik in Pandaemonium:

At the heart of the current debate about immigration are two issues: the first is about the facts of immigration, the second about public perception of immigration.

The facts are relatively straightforward. Immigration is a good, and the idea that immigrants come to Britain to live off benefits is laughable. Immigrants put more money into the economy than they take out and have negligible impact on jobs or wages. An independent report on the impact of immigration, commissioned by the Home Office in 2003, looked at numerous international surveys and conducted its own study in Britain. ‘The perception that immigrants take away jobs from the existing population, or that immigrants depress the wages of existing workers’, it concluded, ‘do not find confirmation in the analysis of the data laid out in this report.’ More recently, studies have suggested that immigration helps raise wages except at the bottom of the jobs ladder, where it has a slight negative impact. That impact on low paid workers matters hugely, of course, but is arguably more an issue of labour organization than of immigration.

Immigrants are less likely to claim benefits than British citizens. According to the Department for Work and Pensions, of the roughly 1.8 million non-British EU citizens of working age in this country, about 90,000, or around 5%, claim an ‘out of work benefit’, compared with around 13% of Britons. Migrants from outside the EU are also much less likely to claim benefits.

More here.

It’s a part of my paleo fantasy, it’s a part of my paleo dream

David Gorski in Science-Based Medicine:

There are many fallacies that undergird alternative medicine, which evolved into “complementary and alternative medicine” (CAM), and for which the preferred term among its advocates is now “integrative medicine,” meant to imply the “best of both worlds.” If I had to pick one fallacy that rules above all among proponents of CAM/IM, it would have to be either the naturalistic fallacy (i.e., that if it’s natural—whatever that means—it must be better) or the fallacy of antiquity (i.e., that if it’s really old, it must be better). Of course, the two fallacies are not unrelated. In the minds of CAM proponents, old is more likely to have been based on nature, and the naturalistic fallacy often correlates with the fallacy of antiquity. Basically, it’s a rejection of modernity, and from it flow the interest in herbalism, various religious practices rebranded as treatments (thousands of years ago, medicine was religion and religion was medicine—the two were more or less one and physicians were often priests as well), and the all-consuming fear of “toxins,” in which it is thought that the products of modernity are poisoning us.

Yes, there is a definite belief underlying much of CAM that technology and pharmaceuticals are automatically bad and that “natural” must be better. Flowing from that belief is the belief that people were happier and much healthier in the preindustrial, preagricultural past, that cardiovascular disease was rare or nonexistent, and that cancer was seldom heard of. Of course, it’s hard not to note that cancer and heart disease are primarily diseases of aging, and life expectancy was so much lower back in the day that a much smaller percentage of the population lived to advanced ages than is the case today. Even so, an implicit assumption among many CAM advocates is that cardiovascular disease is largely a disease of modern lifestyle and diet and that, if modern humans could somehow mimic preindustrial or, according to some, even preagricultural, lifestyles, cardiovascular disease could be avoided.

More here.