The Myth of American Meritocracy

Ron Unz in The American Conservative:

Just before the Labor Day weekend, a front page New York Times story broke the news of the largest cheating scandal in Harvard University history, in which nearly half the students taking a Government course on the role of Congress had plagiarized or otherwise illegally collaborated on their final exam.1 Each year, Harvard admits just 1600 freshmen while almost 125 Harvard students now face possible suspension over this single incident. A Harvard dean described the situation as “unprecedented.”

But should we really be so surprised at this behavior among the students at America’s most prestigious academic institution? In the last generation or two, the funnel of opportunity in American society has drastically narrowed, with a greater and greater proportion of our financial, media, business, and political elites being drawn from a relatively small number of our leading universities, together with their professional schools. The rise of a Henry Ford, from farm boy mechanic to world business tycoon, seems virtually impossible today, as even America’s most successful college dropouts such as Bill Gates and Mark Zuckerberg often turn out to be extremely well-connected former Harvard students. Indeed, the early success of Facebook was largely due to the powerful imprimatur it enjoyed from its exclusive availability first only at Harvard and later restricted to just the Ivy League.

During this period, we have witnessed a huge national decline in well-paid middle class jobs in the manufacturing sector and other sources of employment for those lacking college degrees, with median American wages having been stagnant or declining for the last forty years. Meanwhile, there has been an astonishing concentration of wealth at the top, with America’s richest 1 percent now possessing nearly as much net wealth as the bottom 95 percent.2 This situation, sometimes described as a “winner take all society,” leaves families desperate to maximize the chances that their children will reach the winners’ circle, rather than risk failure and poverty or even merely a spot in the rapidly deteriorating middle class. And the best single means of becoming such an economic winner is to gain admission to a top university, which provides an easy ticket to the wealth of Wall Street or similar venues, whose leading firms increasingly restrict their hiring to graduates of the Ivy League or a tiny handful of other top colleges.3 On the other side, finance remains the favored employment choice for Harvard, Yale or Princeton students after the diplomas are handed out.4

More here.



Fake Diplomas, Real Cash: Pakistani Company Axact Reaps Millions

Declan Walsh in the New York Times:

Seen from the Internet, it is a vast education empire: hundreds of universities and high schools, with elegant names and smiling professors at sun-dappled American campuses.

Their websites, glossy and assured, offer online degrees in dozens of disciplines, like nursing and civil engineering. There are glowing endorsements on the CNN iReport website, enthusiastic video testimonials, and State Department authentication certificates bearing the signature of Secretary of State John Kerry.

“We host one of the most renowned faculty in the world,” boasts a woman introduced in one promotional video as the head of a law school. “Come be a part of Newford University to soar the sky of excellence.”

Yet on closer examination, this picture shimmers like a mirage. The news reports are fabricated. The professors are paid actors. The university campuses exist only as stock photos on computer servers. The degrees have no true accreditation.

In fact, very little in this virtual academic realm, appearing to span at least 370 websites, is real — except for the tens of millions of dollars in estimated revenue it gleans each year from many thousands of people around the world, all paid to a secretive Pakistani software company.

More here.

Study may explain mysterious cancer–day care connection

Warren Cornwall in Science:

For years, scientists have noticed an interesting pattern of cancer among children. Those who went to day care early in life were less likely to later develop the most common childhood cancer: acute lymphoblastic leukemia (ALL). Now, a 7-year study appears to have unraveled the molecular mechanism driving ALL. The work may explain why early exposure to infections in places such as day cares seems to protect against the disease and why unrelated vaccines help guard against this cancer. For Mel Greaves, a cancer cell biologist at the University of London’s Institute of Cancer Research, the finding provides an explanation for the hypothesis he has long promoted: that when infants in modern societies are sheltered from routine infections, their immune systems are more likely to overreact during later infections, paving the way for ALL. “I see it as the missing link,” he says of the new research.

Most childhood ALL involves a malfunction of B cells, the scouts of the immune system that patrol the bloodstream looking for intruders like viruses and bacteria; they make antibodies that help fight infections. But with leukemia, the immune system goes haywire, churning out flawed, immature B cells at a prodigious rate and crowding out healthy blood cells.

Normal B cells are a marvel of adaptability. As they mature, they reprogram their own DNA, enabling the immune system to produce millions of different B cells programmed to recognize the vast range of potential infections. The DNA rearrangement relies on a sequence of enzymes. First, proteins known as RAGs cut and paste whole chunks of DNA. After that, another enzyme, AID, goes to work “fine-tuning” the DNA by altering single nucleotides. But Greaves and colleagues suspected this process could go awry, introducing mutations that create flawed B cells that could cause leukemia. In a series of experiments, they found evidence that much of the problem lay with a breakdown in the orderly sequence of gene editing during infections. Rather than the RAGs doing their business and then stepping aside for the AID, the AID kicked in simultaneously, potentially increasing the risk of gene-editing errors.

These tantalizing results came to a head in an experiment on mice with a genetic abnormality linked to childhood ALL. The condition, in which two genes associated with blood formation are fused together, is found in the cord blood of 1% of all newborns. But most children with it never go on to develop full-blown ALL. The researchers wondered if unregulated mutations set off by repeated infections later in childhood could make the difference, triggering the leukemia.
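The logic here is essentially a two-hit model, and a quick simulation shows why a common first hit can still yield a rare disease. Below is a minimal Monte Carlo sketch in Python; the 1 percent fusion prevalence comes from the article, while the second-hit probability is a purely illustrative assumption, not a figure from the study.

```python
import random

# Two-hit sketch of the model described above. P_FUSION reflects the
# article's ~1% of newborns carrying the gene fusion; P_SECOND_HIT is an
# assumed, purely illustrative number, not a figure from the study.
P_FUSION = 0.01
P_SECOND_HIT = 0.005

N = 1_000_000
cases = sum(
    random.random() < P_FUSION and random.random() < P_SECOND_HIT
    for _ in range(N)
)
print(f"{cases} ALL cases per {N:,} children "
      f"(expected about {N * P_FUSION * P_SECOND_HIT:.0f})")
```

Even with the first hit present in 1 in 100 children, the joint probability of both hits keeps full-blown disease rare, which is the pattern the researchers set out to explain.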

More here.

Wednesday Poem

Adelle Explains Urgency to the Judge

A few hours before the wedding I started to draw again. I had never taken my sketches seriously. But these new pictures showed a mastery I never thought possible before. In those few hours, I gained a sense of myself. I locked the door to the bridal suite and sketched everything I could: the windows, an armoire, my violet nightgown hanging from a hanger. That was when I heard a knock, followed by shouts and threats. It was my mother with the white dress.

by Kristina Marie Darling
from Amethyst Arsenic, 4.1

Women’s Work: the legacy of the 1970s women’s movement

Vivian Gornick in Book Forum:

FORTY YEARS AGO, when the second wave of the American feminist movement was young, and its signature phrase, “the personal is political,” was electrifying, many of the movement’s radicals (this reviewer among them) went to war with the age-old conviction that marriage and motherhood were the deepest necessities of every woman’s life. If we looked honestly at what many of us really wanted, as we were doing in the 1970s and ’80s, it was not marriage and motherhood at all; it was rather the freedom to discover for ourselves the lives we might actually want to pursue. In our pain and anger at having been denied that freedom, we often turned recklessly on these conventional wisdoms. Marriage was rape, we cried, motherhood slavery. No equality in love? We’ll do without! What we didn’t understand—and this for years on end—was that between the ardor of our revolutionary rhetoric and the dictates of flesh-and-blood reality lay a no-man’s-land of untested pronouncements. How easy it was for us to declare ourselves “liberated,” how chastening to experience the force of contradictory feeling that undermined these defiant simplicities. As we moved inexorably toward the moment when we were bound to see that we were throwing the baby out with the bathwater, nearly every one of us became a walking embodiment of the gap between theory and practice: the place in which we were to find ourselves time and again.

KATE BOLICK is a forty-two-year-old journalist who, since childhood, has harbored a fantasy of living alone and becoming what she calls a “real” writer, but, like many women of her generation, she has found it nearly impossible to pursue that dream. In a memoir, Spinster, she traces the problem to its origins. “Whom to marry, and when it will happen” are the book’s opening words. “These two questions define every woman’s existence, regardless of where she was raised or what religion she does or doesn’t practice. She may grow up to love women instead of men, or to decide she simply doesn’t believe in marriage. No matter. These dual contingencies govern her until they’re answered, even if the answers are nobody and never.”

More here.

Tuesday, May 19, 2015

A Conversation with An-My Lê

Over at The Brooklyn Rail:

Sara Christoph (Rail): The current blockbuster American Sniper, which deals with the same subject matter as your own work, might be a good place for us to begin. The success of these types of movies fascinates me, though it is not surprising, given the way the films tend to mythologize the soldier’s experience in a one-dimensional way. As someone who has spent years carefully parsing the nuances of what it means to live through or participate in a war, what was your reaction to the film?

An-My Lê: You know, I rarely have time to go to the movies, but I did see American Sniper. I also saw Rory Kennedy’s Last Days in Vietnam. I should have seen it months ago. I think I had P.T.S.D. afterwards. I was very happy to see American Sniper, because I am always fascinated with this subject, but I was disappointed. It was kind of a great story—

Rail: Just the feat of his accomplishments, leaving aside the moral issues.

Lê: Yes, the feat of it. The stress, the focus, the psychology of the mission and how it affected him—all of that really interests me more than anything else. But you’re right, it is very one-dimensional. Some filmmakers, like Kathryn Bigelow and her film Zero Dark Thirty, are interested in portraying something that is three-dimensional. She’s an artist, and hers is a fictional account. And there is something about working within that fiction that allows for a satisfying and challenging description. I don’t think Clint Eastwood did that, even though he can be a great filmmaker. I’m not sure why. Perhaps he got so caught up in wanting to pay tribute to Chris Kyle as a veteran. And of course that is important. It is a responsibility.

Rail: Specifically because of the way Kyle’s story ends, being killed by a fellow vet. There’s an added responsibility to an individual’s legacy.

Lê: The topic of the military raises questions in ways that other topics would not. There are photographers who have dealt with extreme poverty, or who have photographed horrific labor conditions, and they are not held accountable in the same way. They aren’t asked: what do you think of poverty? But the question of the military is so complicated that it riles up people’s opinions. And when your work is about the military, people want to know: are you for or are you against it? Maybe American Sniper was too caught up in having a straightforward message.

More here.

What Reading Wordsworth Teaches Us About Poverty

Jamison Kantor in The Brooklyn Quarterly (Photo: Wikimedia Commons):

How does one get from British Romantic writers to Paul Ryan? The answer may lie in the language that each of them used.

Before turning to the emotions that are associated with poetic language, let’s look briefly at the emotional logic of the system itself. With the emergence of industrial labor in England, rural workers had to dramatically change their mindset. Now, people who had never lived under the rule of capitalism were expected to enter the industrial marketplace, endure the vicissitudes of prices—and the poor-relief to which they were connected—and reorganize their lifestyle around an administration over which they had almost no control. Swift market fluctuations did not just mean that foresight and planning were difficult. Existence under this new regime also meant a change in consciousness. In order to tolerate such insecurity, workers would have to believe in the promise of the new capitalist enterprise; that, despite the incessant variability built into their lives, rising industrial productivity would eventually bring them comforts far greater than what they had through rural work.

Ironically, the Speenhamland system may have played a role in this. “Hope,” the economic historian Karl Polanyi writes in his classic The Great Transformation (1944), “…was distilled out of the nightmare population and wage laws, and was embodied in a concept of progress so inspiring that it appeared to justify the vast and painful dislocations to come.”[3] This belief has remained a part of modern urban poverty. The endless, small decisions that the working poor have to make merely in order to survive act as a cruel stand-in for the sanctified idea of capitalist choice. In actuality, the constant pressure of evaluation and selection can fatigue people so much that their cognitive function is diminished. Recently, a group of contemporary neuroscientists from the University of Warwick have shown that poverty actually impedes cognitive function by putting this burden of choice on workers continually: which bus to take, which groceries will be least expensive, which residential utility will be most essential for living.[4] Add to these factors the inherent bustle of urban life, and poverty becomes a twofold deprivation: not only do people lack material provisions, but they also lack the time for deeper moments of contemplation. Poverty literally trades intellection for survival.

More here.

Austerity Bites: Fiscal Lessons from the British General Election

Jonathan Hopkin and Mark Blyth in Foreign Affairs:

Despite Conservative spending cuts, the United Kingdom’s deficit was reduced by only half of what the party anticipated when it took office in 2010. The nation’s economy did not start to grow until late 2013, after a panicked treasury minister, George Osborne, relaxed austerity measures. The United Kingdom’s economic problems, the Conservatives maintained, were the result of Labour’s supposed profligacy in running budget deficits during the boom years of the early 2000s, leaving the economy exposed to the financial crisis. This, they argued, made draconian spending cuts inevitable.

However, as the crisis hit in 2007, the United Kingdom had the lowest debt to GDP ratio in the G7, lower than when Labour had taken power a decade earlier. And if Labour was supposedly running excessive deficits, the markets remained strangely unconcerned, with market rates on British bonds running close to pre-collapse lows. This left many wondering why the British budget exploded in 2008 and what it might say about coalition rule in the United Kingdom.

These questions, however, were strangely missing from discussion during the election. Cameron did not discuss why the United Kingdom’s outsized and overleveraged financial sector made the nation suffer disproportionately from the worst financial crisis since the 1930s. Financial deregulation and the unsustainable growth in private, not public, credit fatally exposed the United Kingdom’s banks to the United States’ subprime credit crisis. The collapse in credit growth in 2007–09 hurt the United Kingdom’s budget not because the Labour government was too deep in debt but because the national economy was more dependent on financial activity than elsewhere. Just prior to the crisis, in 2007, the British Exchequer was taking nearly 25 percent of total tax revenue from the financial sector, a sector that made up a mere ten percent of the economy. With the financial crisis, these revenues plummeted, leaving the government short of cash and needing to borrow heavily.
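The arithmetic behind that last point is worth spelling out. The sketch below uses the revenue and GDP shares quoted above, plus an assumed (purely illustrative) halving of finance-sector tax receipts during the crisis:

```python
# Shares quoted in the article; the size of the crash in finance-sector
# receipts is an assumed, illustrative figure.
finance_share_of_revenue = 0.25   # finance supplied ~25% of total tax revenue
finance_share_of_gdp = 0.10       # while being only ~10% of the economy
finance_receipts_drop = 0.50      # assumption: finance tax receipts halve

revenue_hit = finance_share_of_revenue * finance_receipts_drop
print(f"A sector worth {finance_share_of_gdp:.0%} of GDP "
      f"knocks {revenue_hit:.1%} off total tax revenue")
# The budget hole tracks where revenue was concentrated, not prior overspending.
```

Under those assumptions a sector amounting to a tenth of the economy blows a 12.5 percent hole in total revenue, which is the authors' point about why the deficit exploded in 2008.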

More here.

America and horses

Nell Boeschenstein at The Morning News:

For as long as humans and horses have coexisted, humans have looked at horses and seen in them that ineffable quality we associate with the things we have lumped into a broad box labeled “Beautiful Things,” along with mountain ranges, the night sky, flowers in bloom, and the female form. The human-horse relationship is so much more intimate than the human-cow or human-chicken or human-sheep relationship. These are animals we ride, and when riding them we have been both mistaken for and mythologized as one and the same being. I don’t need to belabor how little man might have accomplished and how much more slowly he would have done it without the horse. The way we feel about the horse is more like how we feel about dogs than how we feel about stock animals.

Yet as urban dog and cat ownership skyrocketed in the United States between 1920 and 1940, so did meat-processing plants supplying demand for pet food with horsemeat. About 200 such meat plants opened during those decades, even as horses were publicly beloved on a scale we can hardly imagine from today’s perspective. There were the YA novels, yes. There was also Seabiscuit running across headlines, and between 1930 and 1948 Gallant Fox, Omaha, War Admiral, Whirlaway, Count Fleet, Assault, and Citation won the Triple Crown in an impressive string, a feat only one American Thoroughbred had managed before. Only three have managed it since; the last one, Affirmed, was in 1978.

more here.

on Jack Smith’s ‘Hamlet in the Rented World’

J. Hoberman at Artforum:

GIVEN THAT JACK SMITH never actually completed another movie after Flaming Creatures (1963), that most of his theater pieces concern the impossibility of their coming into existence, and that many all-but-identical drafts of the same scripts were found among his papers, it’s hardly surprising that he should have been fascinated by the most famously indecisive character in world literature.

Hamlet in the Rented World (A Fragment) is a twenty-seven-minute assemblage put together by Jerry Tartaglia on behalf of the Gladstone Gallery in New York from materials discovered in the Jack Smith Archives, including five quarter-inch audio reels and four rolls of 16-mm film (two of them untouched camera originals), all dating from the early 1970s. Guided by Smith’s scripts, Tartaglia’s reconstruction may be considered the artist’s last, posthumous word. (Hundreds of slides, the material for scores of the slide shows Smith presented during the ’70s and ’80s, remain—but we won’t go there.) Tartaglia, an avant-garde filmmaker whose deep involvement with Smith’s movies began when he discovered Flaming Creatures’ camera original in a laboratory discard bin in 1978 and who has labored over restorations of all Smith’s other film projects, knows this material better than anyone on earth.

more here.

Dennis Cooper’s Haunted HTML Novel

Paige K. Bradley at Bookforum:

Dennis Cooper's latest book, Zac’s Haunted House, was released online in mid-January by the Paris-based small press and label Kiddiepunk. Dubbed an “html novel” and offered as a free download, it consists of seven html files, each of which expands into a long, vertical scroll of animated gifs. You could call Zac’s Haunted House many things: net art, a glorified Tumblr, a visual novel, a mood board, or a dark night of the Internet's soul. It has just a few words—the chapter titles and a few subtitles embedded in some of the gifs—but it still very clearly belongs to Cooper’s own haunted oeuvre, capable of evoking powerful and gnarled emotions. Although it is something of an about-face from his last novel, The Marbled Swarm—with that book’s intentionally contrived, digressive language—Zac’s Haunted House still displays Cooper’s obsessive attention to form and style. It also features his by now nearly classical imagery and interests: The vulnerable young male body juxtaposed with death and failure; charged use of subcultural vernacular; and confused bodies, to say nothing of identities, fumbling through sex and subterfuge. Cooper has always written characters whose ineloquence hints at experiences that defy language; now, telling a story almost exclusively in images, he pushes this inarticulateness in a new direction. The result is surprisingly eloquent, and accurately speaks to our experience of the present, online and IRL.

more here.

Tuesday Poem

I thought I was following a track of freedom
and for awhile it was
Adrienne Rich

Rivers/Roads

Consider the earnestness of pavement
its dark elegant sheen after rain,
its insistence on leading you somewhere

A highway wants to own the landscape,
it sections prairie into neat squares
swallows mile after mile of countryside
to connect the dots of cities and towns,
to make sense of things

A river is less opinionated
less predictable
it never argues with gravity
its history is a series of delicate negotiations with
time and geography

Wet your feet all you want
Heraclitus says,
it's never the river you remember;
a road repeats itself incessantly
obsessed with its own small truth,
it wants you to believe in something particular

The destination you have in mind when you set out
is nowhere you have ever been;
where you arrive finally depends on
how you get there,
by river or by road

by Michael Crummey
from Arguments With Gravity
Kingston, Ont.: Quarry Press, 1996.

The Trouble With Scientists

Philip Ball in Nautilus:

Sometimes it seems surprising that science functions at all. In 2005, medical science was shaken by a paper with the provocative title “Why most published research findings are false.”1 Written by John Ioannidis, a professor of medicine at Stanford University, it didn’t actually show that any particular result was wrong. Instead, it showed that the statistics of reported positive findings were not consistent with how often one should expect to find them. As Ioannidis concluded more recently, “many published research findings are false or exaggerated, and an estimated 85 percent of research resources are wasted.”2 It’s likely that some researchers are consciously cherry-picking data to get their work published. And some of the problems surely lie with journal publication policies. But the problems of false findings often begin with researchers unwittingly fooling themselves: they fall prey to cognitive biases, common modes of thinking that lure us toward wrong but convenient or attractive conclusions. “Seeing the reproducibility rates in psychology and other empirical science, we can safely say that something is not working out the way it should,” says Susann Fiedler, a behavioral economist at the Max Planck Institute for Research on Collective Goods in Bonn, Germany. “Cognitive biases might be one reason for that.”

Psychologist Brian Nosek of the University of Virginia says that the most common and problematic bias in science is “motivated reasoning”: We interpret observations to fit a particular idea. Psychologists have shown that “most of our reasoning is in fact rationalization,” he says. In other words, we have already made the decision about what to do or to think, and our “explanation” of our reasoning is really a justification for doing what we wanted to do—or to believe—anyway. Science is of course meant to be more objective and skeptical than everyday thought—but how much is it, really?
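Ioannidis's point is, at bottom, base-rate arithmetic: if only a minority of tested hypotheses are true, even well-run studies will produce many false positives. Here is a minimal sketch in Python, where the prior, power, and significance threshold are illustrative assumptions rather than figures from his paper:

```python
# Positive predictive value of a published positive finding.
# All three parameters are illustrative assumptions.
prior = 0.10   # fraction of tested hypotheses that are actually true
power = 0.80   # chance a real effect produces a positive result
alpha = 0.05   # chance a null effect produces a false positive

true_pos = prior * power
false_pos = (1 - prior) * alpha
ppv = true_pos / (true_pos + false_pos)
print(f"Share of positive findings that are real: {ppv:.0%}")  # ~64%

# With a more pessimistic prior of 1%, the same arithmetic gives ~14%:
# most published positives would then be false, as the title claims.
```

The calculation makes the title's claim concrete: the lower the prior odds of the hypotheses a field tests (and the more bias inflates the effective alpha), the more of its published positives are false.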

Whereas the falsification model of the scientific method championed by philosopher Karl Popper posits that the scientist looks for ways to test and falsify her theories—to ask “How am I wrong?”—Nosek says that scientists usually ask instead “How am I right?” (or equally, to ask “How are you wrong?”). When facts come up that suggest we might, in fact, not be right after all, we are inclined to dismiss them as irrelevant, if not indeed mistaken. The now infamous “cold fusion” episode in the late 1980s, instigated by the electrochemists Martin Fleischmann and Stanley Pons, was full of such ad hoc brush-offs. For example, when it was pointed out to Fleischmann and Pons that their energy spectrum of the gamma rays from their claimed fusion reaction had its spike at the wrong energy, they simply moved it, muttering something ambiguous about calibration.

More here.

Human Ingenuity Takes On Cancer’s Darwinian Ways

George Johnson in The New York Times:

The powerful algorithm that has populated the earth with 10 million species, each occupying a different ecological niche, is an example of what computer scientists call “random generate and test.” Start with the DNA alphabet, then blindly shuffle the letters to produce a kaleidoscope of living forms. The fittest, selected by the demands of the environment, will multiply and fill their habitats. The Darwinian principle is also at work inside the body, though in very different ways. Through random variation and selection, the immune system spins out the endless diversity of antibodies that it uses to stop microscopic invaders. But cancer also thrives through this evolutionary imperative as, mutation by mutation, a normal human cell transforms into a deadly tumor, which becomes fitter and fitter at the expense of its host. Among the advantages it evolves is the ability to outwit our immunological defenses.
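For readers curious what "random generate and test" looks like in code, below is a minimal sketch in Python, in the spirit of Dawkins's well-known "weasel" demonstration; the target string and parameters are illustrative assumptions, not anything from Johnson's article. Blind mutation generates variants, and selection of the fittest fills the next generation:

```python
import random

# "Random generate and test": blind variation plus selection.
# Target string and parameters are illustrative assumptions.
TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def fitness(candidate: str) -> int:
    # How well the candidate fits its "niche": count of matching characters.
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(parent: str, rate: float = 0.02) -> str:
    # Blindly shuffle letters: each position may flip to a random character.
    return "".join(
        random.choice(ALPHABET) if random.random() < rate else ch
        for ch in parent
    )

# Start from purely random strings, then let selection do the work.
population = ["".join(random.choice(ALPHABET) for _ in TARGET) for _ in range(200)]
for generation in range(10_000):
    population.sort(key=fitness, reverse=True)
    if population[0] == TARGET:
        break
    # The fittest multiply and fill the habitat: offspring of the top 10%.
    population = [mutate(random.choice(population[:20])) for _ in range(200)]

print(f"Generation {generation}: {population[0]}")
```

No individual step "knows" the target; random variation plus a fitness filter is enough, which is exactly the algorithmic point the paragraph above is making about both evolution and the immune system.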

One of the most encouraging developments in medical research has been the effort to help the immune system fight back, beating cancer at its own evolutionary game. That was a dominant theme last month at the annual meeting of the American Association for Cancer Research in Philadelphia as scientists discussed recent successes in immunotherapy while considering how far the field still has to go. Why have these treatments been working so well with some cancers but not others? And why, even in the best cases, don’t all patients respond?

The realization that Darwinian forces, for good and bad, are at work inside us can be traced to the early 1950s, when Frank Macfarlane Burnet, an Australian virologist, was pondering how we manage to fight off a potentially infinite variety of invading microbes, tailoring an antibody against each one. One possibility was that when an interloper is identified, by its molecular bumps and grooves, the immune system systematically engineers an appropriate weapon. But nature doesn’t work in such a methodical manner, and Burnet suggested a messier, more intuitive explanation: the clonal selection theory of immunity.

More here.

Monday, May 18, 2015

Perceptions

Sughra Raza. Carmine Bee-eater With its Beater. Botswana, 2015.

Digital photograph.

“(Bee-eaters) forage over grasslands and Acacia savanna, and are well known for the ingenious use of ‘beaters’ to chase up grasshoppers, dragonflies and other prey species. These beaters usually take the form of grazing herds of game and domestic animals, and large flocks of carmine bee-eaters may gather overhead. They also use various creatures as convenient mobile perches from which to swoop off, snatching insects flushed by their ride.

“Northern Carmine Bee-eaters in particular are masters of this trait, and rides range from elephants, donkeys and goats to Kori and Arabian Bustards, Abyssinian Ground Hornbills …”

More here and here.

Sunday, May 17, 2015

Confessions of a Bitter Cripple

Elizabeth Barnes in Philosop-her:

I didn’t expect to feel so angry. A few years ago, having established a certain amount of professional security, I decided to start doing more work on social and feminist philosophy – especially philosophical issues related to disability. I’d always done some work on the topic, but I considered doing lots of work on it a professional luxury that had to be earned. When I began to focus more of my research on disability, I expected plenty of things – a deeper sense of fulfillment from what I was doing, a fair amount of side eye from colleagues, worries that the topic was too niche to be of general interest – but I didn’t expect the emotional drain that the work would be. I feel angry – more than I could’ve anticipated and more than I often care to admit – when I write about disability. And I also, at times, feel so, so sad.

I have sat in philosophy seminars where it was asserted that I should be left to die on a desert island if the choice was between saving me and saving an arbitrary non-disabled person. I have been told it would be wrong for me to have my biological children because of my disability. I have been told that, while it isn’t bad for me to exist, it would’ve been better if my mother could’ve had a non-disabled child instead. I’ve even been told that it would’ve been better, had she known, for my mother to have an abortion and try again in hopes of conceiving a non-disabled child. I have been told that it is obvious that my life is less valuable when compared to the lives of arbitrary non-disabled people. And these things weren’t said as the conclusions of careful, extended argument. They were casual assertions. They were the kind of thing you skip over without pause because it’s the uncontroversial part of your talk.

More here.