One of the Most Beautiful Film Trilogies of All Time Just Got Even More Beautiful

Dana Stevens in Slate:

To live without seeing the films of the Indian director Satyajit Ray, said Akira Kurosawa in 1975, “means existing in the world without seeing the sun or the moon.” Though Ray was 11 years his junior, Kurosawa spoke of him that day in Moscow as a master. “I can never forget the excitement in my mind after seeing it,” he recalled of Ray’s debut Pather Panchali, 20 years after that film’s success at Cannes helped to usher in a new era of cinematic globalism—one that would eventually make it possible for a Japanese filmmaker to praise an Indian one in a speech being translated for a Russian audience. “It is the kind of cinema that flows with the serenity and nobility of a big river.”

In 2015—now 60 years since Pather Panchali’s release—Kurosawa’s simple words remain the best Ray criticism I’ve heard and, really, all the recommendation his films require. Pather Panchali, along with its two sequels, Aparajito (1956) and The World of Apu (1959)—the three together are known as “the Apu Trilogy,” after their main character—has just been re-released by Janus Films in a pristine 4K restoration, to be made available in a Criterion Blu-ray set later this year. (The original negatives of all three films were burned in a film-lab fire in London in 1993, making the restoration process especially difficult.) If this trilogy comes anywhere near your town—it opened earlier this month for a run at New York’s Film Forum, with plans to spread to more U.S. cities through the summer—I can’t exhort you any more strongly to see it than Kurosawa already has. Do you really want to exist in the world without ever seeing the sun or the moon?

More here.



Why Your Immune System Doesn’t Eat You Alive

Esther Landhuis in Scientific American:

For a long time researchers figured the body had a tidy way of dealing with immune cells that might trigger diabetes, lupus or other autoimmune diseases—it must kill off these rogue cells early in life, before the immune system matures. New research published on May 19 in Immunity challenges this age-old thinking. Instead, the body seems to keep these so-called self-reactive T cells in benign form to fight potential invaders later. That conclusion comes from a comprehensive set of immune analyses in mice and people, in which a team at Stanford University has found surprisingly large numbers of self-reactive T cells lurking in the bloodstream through adulthood. The cells are not easily activated, though, suggesting the presence of “a built-in brake,” says immunologist Mark Davis, the paper’s senior author. The findings renew debate about how the immune system manages to marshal its forces against myriad foreign invaders all the while leaving our own tissues alone.

The controversy emerged decades ago when researchers learned the secret to the immune system’s incredible versatility. They discovered that a special gene-shuffling process makes millions of antibodies and receptors. Their sheer number and variety allow our immune cells to recognize any conceivable pathogen, in principle. But the explanation also posed a puzzle: Those random gene rearrangements also produce T cells that could attack the body’s own tissues. As a solution, some scientists proposed that the body wipes out those self-reactive cells while the immune system is developing. Subsequent experiments by several labs supported this proposal.

More here.

Thursday, May 21, 2015

The surprising links between faith and evolution and climate denial — charted

Chris Mooney in the Washington Post:

For a long time, we’ve been having a pretty confused discussion about the relationship between religious beliefs and the rejection of science — and especially its two most prominent U.S. incarnations, evolution denial and climate change denial.

At one extreme is the position that science denial is somehow deeply or fundamentally religion’s fault. But this neglects the wide diversity of views about science across faiths and denominations — and even across individuals of the same faith or denomination — not all of which are anti-climate science, or anti-evolution.

At the other extreme, meanwhile, is the view that religion has no conflict with science at all. But that can’t be right either: Though the conflict between the two may not be fundamental or necessary in all cases, it is pretty clear that the main motive for evolution denial is, indeed, a perceived conflict with faith (not to mention various aspects of human cognition that just make accepting evolution very hard for many people).

The main driver of climate science rejection, however, appears to be a free market ideology — which is tough to characterize as religious in nature. Nonetheless, it has often been observed (including by me) that evolution denial and climate science rejection often seem to overlap, at least to an extent.

More here.

the british and their weather

Richard Hamblyn at The Times Literary Supplement:

“When two Englishmen meet”, wrote Samuel Johnson in 1758, “their first talk is of the weather; they are in haste to tell each other, what each must already know, that it is hot or cold, bright or cloudy, windy or calm.” It remains an insightful observation, not for what it says about the British obsession with weather – that was a truism even then – but for what it says about the value of natural knowledge. Talking about the weather in the present tense is a more or less futile undertaking, but it was as far as the science of meteorology had advanced in the millennium and a half since the appearance of Aristotle’s influential treatise, the Meteorologica, in the fourth century BC. Since then, the sky had remained an unknowable blue wilderness, populated by meteors (“any bodies in the air or sky that are of a flux and transitory nature”, according to Johnson’s Dictionary: hence “meteorology”), but as the nineteenth century dawned, things began to change. In 1802, Luke Howard gave clouds the names we still use today (cirrus, stratus, cumulus), and in 1804, Francis Beaufort devised the standardized wind-scale that now bears his name. “People were looking at the skies in new ways”, as Peter Moore observes at the outset of The Weather Experiment, his gripping account of nineteenth-century weather science, and by the middle of the century the Meteorological Department of the Board of Trade (better known today as the Met Office) was ready to issue the world’s first official weather forecast.

more here.

walter russell and The Secret of Light

Dan Piepenbring at The Paris Review:

Walter Russell (May 19, 1871–May 19, 1963) was the progenitor of a “new world-thought” centered on light; in books such as The Electrifying Power of Man-Woman Balance, The Book of Early Whisperings, and The Dawn of a New Day in Human Relations, he foresaw “a marriage between religion and science” in which the laws of physics would be rewritten. He believed that weight “should be measured dually as temperature is,” with “an above and below zero,” and that “the sunlight we feel upon our bodies is not actual light from the sun.” (Russell’s Wikipedia entry notes gingerly that his ideology “has not been accepted by mainstream scientists.”)

In what’s ostensibly his seminal text, The Secret of Light, he outlines a philosophy rife with capitalized Nouns and portentous pseudo-erudition:

Man lives in a bewildering complex world of EFFECT of which he knows not the CAUSE. Because of its seemingly infinite multiplicity and complexity, he fails to vision the simple underlying principle of Balance in all things. He, therefore, complexes Truth until its many angles, sides and facets have lost balance with each other and with him.

more here.

On James Wood’s ‘The Nearest Thing to Life’

Jonathan Russell Clark at The Millions:

This book, which manages to be even slimmer than How Fiction Works, also manages to be even better. The Nearest Thing to Life is as close as we’ll ever get to a manifesto from the British-born New Yorker critic. Contained in the book’s 134 pages is a passionate defense of criticism, a memoir of Wood’s early life and influences, and an insightful study of the meaning of fiction.

This should all be old hat by now. Every year, new books arrive promising some meditation on fiction’s quintessence, and though many of them are useful and even well written, they rarely offer truly fresh observations. All of which makes The Nearest Thing to Life that much more remarkable. Wood succeeds so well because of his knack for recognizing defining contradictions. Consider the way he unpacks the duality of fiction through the lens of religion:

The idea that anything can be thought and said inside the novel –– a garden where the great Why? hangs unpicked, gloating in the free air –– had, for me, an ironically symmetrical connection with the actual fears of official Christianity outside the novel: that without God, as Dostoyevsky put it, “everything is permitted.” Take away God, and chaos and confusion reign; people will commit all kinds of crimes, think all kinds of thoughts. You need God to keep a lid on things. This is the usual conservative Christian line. By contrast, the novel seems, commonsensically, to say: ‘Everything has always been permitted, even when God was around. God has nothing to do with it.’

more here.

Meet the Manhattan mothers who think they deserve a ‘wife bonus’

Celia Walden in The Telegraph:

Women are notoriously bad at asking for bonuses. Which is why I did my homework and created – as BusinessInsider.com suggested – “a master plan”. I waited “the appropriate amount of time” (in my case, five years), made sure the big boss was in a good mood and took him out to lunch (“somewhere intimate, where there will be no interruptions”). I eschewed any usage of the word “need” (stinking, as it does, of desperation) in my pitch – which was “backed up with reports, charts and documentation of my positive performance” – and I tried to “remain respectful” as he stared slack-jawed back at me, before throwing his head back and roaring with laughter. Asking my own husband for a bonus simply for being his wife was never going to be anything less than preposterous. Yet according to an author of the forthcoming memoir, Primates of Park Avenue, this is what a glittering tribe of crispy-haired Upper East Side Manhattan wives do every year – depending, of course, on how well they have managed the domestic budget, socialised, upheld a variety-filled performance in the bedroom… and succeeded in getting the kids into a ‘Big Ten’ school.

Wednesday Martin, a social researcher who has been immersing herself in the lives of “Park Lane Primates” for over a decade, explains how the “wife bonus”, as she has called it, works in practice. “It might be hammered out in a pre-nup or post-nup, and distributed on the basis of not only how well her husband’s fund had done, but her own performance — the same way their husbands were rewarded at investment banks. In turn, these bonuses were a ticket to a modicum of financial independence and participation in a social sphere where you don’t just go to lunch, you buy a $10,000 table at the benefit luncheon a friend is hosting.”

More here.

Reproducibility crisis: Blame it on the antibodies

Monya Baker in Nature:

AntibodyIn 2006, things were looking pretty good for David Rimm, a pathologist at Yale University in New Haven, Connecticut. He had developed a test to guide effective treatment of the skin cancer melanoma, and it promised to save lives. It relied on antibodies — large, Y-shaped proteins that bind to specified biomolecules and can be used to flag their presence in a sample. Rimm had found a combination of antibodies that, when used to 'stain' tumour biopsies, produced a pattern that indicated whether the patient would need to take certain harsh drugs to prevent a relapse after surgery. He had secured more than US$2 million in funding to move the test towards the clinic. But in 2009, everything started to fall apart. When Rimm ordered a fresh set of antibodies, his team could not reproduce the original results. The antibodies were sold by the same companies as the original batches, and were supposed to be identical — but they did not yield the same staining patterns, even on the same tumours. Rimm was forced to give up his work on the melanoma antibody set. “We learned our lesson: we shouldn't have been dependent on them,” he says. “That was a very sad lab meeting.”

Antibodies are among the most commonly used tools in the biological sciences — put to work in many experiments to identify and isolate other molecules. But it is now clear that they are among the most common causes of problems, too. The batch-to-batch variability that Rimm experienced can produce dramatically differing results. Even more problematic is that antibodies often recognize extra proteins in addition to the ones they are sold to detect. This can cause projects to be abandoned, and waste time, money and samples. Many think that antibodies are a major driver of what has been deemed a 'reproducibility crisis', a growing realization that the results of many biomedical experiments cannot be reproduced and that the conclusions based on them may be unfounded. Poorly characterized antibodies probably contribute more to the problem than any other laboratory tool, says Glenn Begley, chief scientific officer at TetraLogic Pharmaceuticals in Malvern, Pennsylvania, and author of a controversial analysis1 showing that results in 47 of 53 landmark cancer research papers could not be reproduced.

More here.

Thursday Poem

Captain of the Lighthouse

The late hour trickles into morning. The cattle low profusely by the anthill
where brother and I climb and call Land’s End. We are watchmen
overlooking a sea of hazel-acacia-green, over torrents of dust whipping about
in whirlwinds and dirt tracks that reach us as firths.

We man our lighthouse – cattle as ships. We throw warning lights whenever
they come too close to our jagged shore. The anthill, the orris-earth
lighthouse, from where we hurl stones like light in every direction.

Tafara stands on its summit speaking in sea-talk, Aye-aye me lad – a ship’s a-
coming! And hurls a rock at the cow sailing in. Her beefy hulk jolts and turns.
Aye, Captain, another ship saved! I cry and furl my fingers into an air-long
telescope – searching for more vessels in the day-night.

Now they low on the anthill, stranded in the dark. Their sonorous cries haunt
through the night. Aye, methinks, me miss my brother, Captain of the
lighthouse, set sail from land’s end into the deepest seventh sea.

by Togara Muzanenhamo
from Spirit Brides
Carcanet Press Ltd., Manchester, 2006

Wednesday, May 20, 2015

Comics and the Eternal Present

Kurt Klopmeier in The Critical Flame:

“What, then, is time?” Christian philosopher St. Augustine asked. “If no one asks me, I know what it is. If I wish to explain it to him who asks, I do not know.” We say that time flies or that it drags. We have it on our hands or we are pressed for it. And although we cannot experience any time other than our own present, physicists tell us that there is nothing particularly special about the present. It seems that all times exist at once, and it is only our perception that limits our view of it. Because of this, it’s very hard to understand and even harder to convey the idea that everything, according to special relativity, is constantly happening.

The way people experience time—that it has a direction and a flow—appears to be inaccurate, at least according to our best understanding of physics. In his explanation of special relativity, The Fabric of the Cosmos, Brian Greene explains: “There is no use crying over spilled milk, because once spilled it can never be unspilled: we never see splattered milk gather itself together, rise off the floor, and coalesce in a glass that sets itself upright on a kitchen counter.” Events happen in one direction, and one alone. Time seems to move always forward in a particular sequence that is never interrupted. However, Greene writes, “as hard as physicists have tried, no one has found any convincing evidence within the laws of physics that supports this intuitive sense that time flows. In fact, a reframing of some of Einstein’s insights from special relativity provides evidence that time does not flow … The outside perspective … in which we’re looking at the whole universe, all of space at every moment of time, is a fictitious vantage point, one that none of us will ever have.” But this view of time, and the way that authors have tried to use it, can offer enlightening insights about the world that normal sequential narratives cannot, and can shed light on the way narrative operates on our understanding.

More here.

The Myth of American Meritocracy

Ron Unz in The American Conservative:

Just before the Labor Day weekend, a front-page New York Times story broke the news of the largest cheating scandal in Harvard University history, in which nearly half the students taking a Government course on the role of Congress had plagiarized or otherwise illegally collaborated on their final exam.1 Each year, Harvard admits just 1600 freshmen while almost 125 Harvard students now face possible suspension over this single incident. A Harvard dean described the situation as “unprecedented.”

But should we really be so surprised at this behavior among the students at America’s most prestigious academic institution? In the last generation or two, the funnel of opportunity in American society has drastically narrowed, with a greater and greater proportion of our financial, media, business, and political elites being drawn from a relatively small number of our leading universities, together with their professional schools. The rise of a Henry Ford, from farm boy mechanic to world business tycoon, seems virtually impossible today, as even America’s most successful college dropouts such as Bill Gates and Mark Zuckerberg often turn out to be extremely well-connected former Harvard students. Indeed, the early success of Facebook was largely due to the powerful imprimatur it enjoyed from its exclusive availability first only at Harvard and later restricted to just the Ivy League.

During this period, we have witnessed a huge national decline in well-paid middle class jobs in the manufacturing sector and other sources of employment for those lacking college degrees, with median American wages having been stagnant or declining for the last forty years. Meanwhile, there has been an astonishing concentration of wealth at the top, with America’s richest 1 percent now possessing nearly as much net wealth as the bottom 95 percent.2 This situation, sometimes described as a “winner take all society,” leaves families desperate to maximize the chances that their children will reach the winners’ circle, rather than risk failure and poverty or even merely a spot in the rapidly deteriorating middle class. And the best single means of becoming such an economic winner is to gain admission to a top university, which provides an easy ticket to the wealth of Wall Street or similar venues, whose leading firms increasingly restrict their hiring to graduates of the Ivy League or a tiny handful of other top colleges.3 On the other side, finance remains the favored employment choice for Harvard, Yale or Princeton students after the diplomas are handed out.4

More here.

Fake Diplomas, Real Cash: Pakistani Company Axact Reaps Millions

Declan Walsh in the New York Times:

Seen from the Internet, it is a vast education empire: hundreds of universities and high schools, with elegant names and smiling professors at sun-dappled American campuses.

Their websites, glossy and assured, offer online degrees in dozens of disciplines, like nursing and civil engineering. There are glowing endorsements on the CNN iReport website, enthusiastic video testimonials, and State Department authentication certificates bearing the signature of Secretary of State John Kerry.

“We host one of the most renowned faculty in the world,” boasts a woman introduced in one promotional video as the head of a law school. “Come be a part of Newford University to soar the sky of excellence.”

Yet on closer examination, this picture shimmers like a mirage. The news reports are fabricated. The professors are paid actors. The university campuses exist only as stock photos on computer servers. The degrees have no true accreditation.

In fact, very little in this virtual academic realm, appearing to span at least 370 websites, is real — except for the tens of millions of dollars in estimated revenue it gleans each year from many thousands of people around the world, all paid to a secretive Pakistani software company.

More here.

Study may explain mysterious cancer–day care connection

Warren Cornwall in Science:

For years, scientists have noticed an interesting pattern of cancer among children. Those who went to day care early in life were less likely to later develop the most common childhood cancer: acute lymphoblastic leukemia (ALL). Now, a 7-year study appears to have unraveled the molecular mechanism driving ALL. The work may explain why early exposure to infections in places such as day cares seems to protect against the disease and why unrelated vaccines help guard against this cancer. For Mel Greaves, a cancer cell biologist at the University of London’s Institute of Cancer Research, the finding provides an explanation for the hypothesis he has long promoted: that when infants in modern societies are sheltered from routine infections, their immune systems are more likely to overreact during later infections, paving the way for ALL. “I see it as the missing link,” he says of the new research.

Most childhood ALL involves a malfunction of B cells, the scouts of the immune system that patrol the bloodstream looking for intruders like viruses and bacteria; they make antibodies that help fight infections. But with leukemia, the immune system goes haywire, churning out flawed, immature B cells at a prodigious rate and crowding out healthy blood cells. Normal B cells are a marvel of adaptability. As they mature, they reprogram their own DNA, enabling the immune system to produce millions of different B cells programmed to recognize the vast range of potential infections. The DNA rearrangement relies on a sequence of enzymes. First, proteins known as RAGs cut and paste whole chunks of DNA. After that, another enzyme, AID, goes to work “fine-tuning” the DNA by altering single nucleotides. But Greaves and colleagues suspected this process could go awry, introducing mutations that create flawed B cells that could cause leukemia. In a series of experiments, they found evidence that much of the problem lay with a breakdown in the orderly sequence of gene editing during infections. Rather than the RAGs doing their business and then stepping aside for the AID, the AID kicked in simultaneously, potentially increasing the risk of gene-editing errors. These tantalizing results came to a head in an experiment on mice with a genetic abnormality linked to childhood ALL. The condition, in which two genes associated with blood formation are fused together, is found in the cord blood of 1% of all newborns. But most children with it never go on to develop full-blown ALL. The researchers wondered if unregulated mutations set off by repeated infections later in childhood could make the difference, triggering the leukemia.

More here.

Wednesday Poem

Adelle Explains Urgency to the Judge

A few hours before the wedding I started to draw again. I had never taken my sketches seriously. But these new pictures showed a mastery I never thought possible before. In those few hours, I gained a sense of myself. I locked the door to the bridal suite and sketched everything I could: the windows, an armoire, my violet nightgown hanging from a hanger. That was when I heard a knock, followed by shouts and threats. It was my mother with the white dress.

by Kristina Marie Darling
from Amethyst Arsenic, 4.1

Women’s Work: the legacy of the 1970s women’s movement

Vivian Gornick in Bookforum:

FORTY YEARS AGO, when the second wave of the American feminist movement was young, and its signature phrase, “the personal is political,” was electrifying, many of the movement’s radicals (this reviewer among them) went to war with the age-old conviction that marriage and motherhood were the deepest necessities of every woman’s life. If we looked honestly at what many of us really wanted, as we were doing in the 1970s and ’80s, it was not marriage and motherhood at all; it was rather the freedom to discover for ourselves the lives we might actually want to pursue. In our pain and anger at having been denied that freedom, we often turned recklessly on these conventional wisdoms. Marriage was rape, we cried, motherhood slavery. No equality in love? We’ll do without! What we didn’t understand—and this for years on end—was that between the ardor of our revolutionary rhetoric and the dictates of flesh-and-blood reality lay a no-man’s-land of untested pronouncements. How easy it was for us to declare ourselves “liberated,” how chastening to experience the force of contradictory feeling that undermined these defiant simplicities. As we moved inexorably toward the moment when we were bound to see that we were throwing the baby out with the bathwater, nearly every one of us became a walking embodiment of the gap between theory and practice: the place in which we were to find ourselves time and again.

KATE BOLICK is a forty-two-year-old journalist who, since childhood, has harbored a fantasy of living alone and becoming what she calls a “real” writer, but, like many women of her generation, she has found it nearly impossible to pursue that dream. In a memoir, Spinster, she traces the problem to its origins. “Whom to marry, and when it will happen” are the book’s opening words. “These two questions define every woman’s existence, regardless of where she was raised or what religion she does or doesn’t practice. She may grow up to love women instead of men, or to decide she simply doesn’t believe in marriage. No matter. These dual contingencies govern her until they’re answered, even if the answers are nobody and never.”

More here.

Tuesday, May 19, 2015

A Conversation with An-My Lê


Over at The Brooklyn Rail:

Sara Christoph (Rail): The current blockbuster American Sniper, which deals with the same subject matter as your own work, might be a good place for us to begin. The success of these types of movies fascinates me, though it is not surprising, given the way the films tend to mythologize the soldier’s experience in a one-dimensional way. As someone who has spent years carefully parsing the nuances of what it means to live through or participate in a war, what was your reaction to the film?

An-My Lê: You know, I rarely have time to go to the movies, but I did see American Sniper. I also saw Rory Kennedy’s Last Days in Vietnam. I should have seen it months ago. I think I had P.T.S.D. afterwards. I was very happy to see American Sniper, because I am always fascinated with this subject, but I was disappointed. It was kind of a great story—

Rail: Just the feat of his accomplishments, leaving aside the moral issues.

Lê: Yes, the feat of it. The stress, the focus, the psychology of the mission and how it affected him—all of that really interests me more than anything else. But you’re right, it is very one-dimensional. Some filmmakers, like Kathryn Bigelow and her film Zero Dark Thirty, are interested in portraying something that is three-dimensional. She’s an artist, and hers is a fictional account. And there is something about working within that fiction that allows for a satisfying and challenging description. I don’t think Clint Eastwood did that, even though he can be a great filmmaker. I’m not sure why. Perhaps he got so caught up in wanting to pay tribute to Chris Kyle as a veteran. And of course that is important. It is a responsibility.

Rail: Specifically because of the way Kyle’s story ends, being killed by a fellow vet. There’s an added responsibility to an individual’s legacy.

Lê: The topic of the military raises questions in ways that other topics would not. There are photographers who have dealt with extreme poverty, or who have photographed horrific labor conditions, and they are not held accountable in the same way. They aren’t asked: what do you think of poverty? But the question of the military is so complicated that it riles up people’s opinions. And when your work is about the military, people want to know: are you for or are you against it? Maybe American Sniper was too caught up in having a straightforward message.

More here.

What Reading Wordsworth Teaches Us About Poverty


Jamison Kantor in The Brooklyn Quarterly:

How does one get from British Romantic writers to Paul Ryan? The answer may lie in the language that each of them used.

Before turning to the emotions that are associated with poetic language, let’s look briefly at the emotional logic of the system itself. With the emergence of industrial labor in England, rural workers had to dramatically change their mindset. Now, people who had never lived under the rule of capitalism were expected to enter the industrial marketplace, endure the vicissitudes of prices—and the poor-relief to which they were connected—and reorganize their lifestyle around an administration over which they had almost no control. Swift market fluctuations did not just mean that foresight and planning were difficult. Existence under this new regime also meant a change in consciousness. In order to tolerate such insecurity, workers would have to believe in the promise of the new capitalist enterprise; that, despite the incessant variability built into their lives, rising industrial productivity would eventually bring them comforts far greater than what they had through rural work.

Ironically, the Speenhamland system may have played a role in this. “Hope,” the economic historian Karl Polanyi writes in his classic The Great Transformation (1944), “…was distilled out of the nightmare population and wage laws, and was embodied in a concept of progress so inspiring that it appeared to justify the vast and painful dislocations to come.”[3] This belief has remained a part of modern urban poverty. The endless, small decisions that the working poor have to make merely in order to survive act as a cruel stand-in for the sanctified idea of capitalist choice. In actuality, the constant pressure of evaluation and selection can fatigue people so much that their cognitive function is diminished. Recently, a group of contemporary neuroscientists from the University of Warwick has shown that poverty actually impedes cognitive function by putting this burden of choice on workers continually: which bus to take, which groceries will be least expensive, which residential utility will be most essential for living.[4] Add to these factors the inherent bustle of urban life, and poverty becomes a twofold deprivation: not only do people lack material provisions, but they also lack the time for deeper moments of contemplation. Poverty literally trades intellection for survival.

More here.

Austerity Bites: Fiscal Lessons from the British General Election


Jonathan Hopkin and Mark Blyth in Foreign Affairs:

Despite Conservative spending cuts, the United Kingdom’s deficit was reduced by only half of what the party anticipated when it took office in 2010. The nation’s economy did not start to grow until late 2013, after a panicked treasury minister, George Osborne, relaxed austerity measures. The United Kingdom’s economic problems, the Conservatives maintained, were the result of Labour’s supposed profligacy in running budget deficits during the boom years of the early 2000s, leaving the economy exposed to the financial crisis. This, they argued, made draconian spending cuts inevitable.

However, as the crisis hit in 2007, the United Kingdom had the lowest debt to GDP ratio in the G7, lower than when Labour had taken power a decade earlier. And if Labour was supposedly running excessive deficits, the markets remained strangely unconcerned, with market rates on British bonds running close to pre-collapse lows. This left many wondering why the British budget exploded in 2008 and what it might say about coalition rule in the United Kingdom.

These questions, however, were strangely missing from discussion during the election. Cameron did not discuss why the United Kingdom’s outsized and overleveraged financial sector made the nation suffer disproportionately from the worst financial crisis since the 1930s. Financial deregulation and the unsustainable growth in private, not public, credit fatally exposed the United Kingdom’s banks to the United States’ subprime credit crisis. The collapse in credit growth in 2007–09 hurt the United Kingdom’s budget not because the Labour government was too deep in debt but because the national economy was more dependent on financial activity than elsewhere. By 2007, just prior to the crisis, the British Exchequer was taking nearly 25 percent of total tax revenue out of the financial sector, which made up a mere ten percent of the economy. With the financial crisis, these revenues plummeted, leaving the government short of cash and needing to borrow heavily.

More here.