Eulogy for Mikhail Kalashnikov

Stefany Anne Golberg at Misfit Press:

Among the displays of assault rifles at the Mikhail Kalashnikov Museum in Izhevsk is a small lawnmower Kalashnikov designed to push about the grounds of his summer cottage. It is said that Mikhail Kalashnikov loved to care for his grass. Kalashnikov gave the lawnmower the same sensible qualities he gave the gun that bears his name. The lawnmower is light, simple, cheap to construct and easy to hold—something a child could use.

Kalashnikov didn’t regret inventing the Kalashnikov rifle. “I invented it for the protection of the Motherland,” he said. Still, he once mused that he would like to have been known as a man who helped farmers and gardeners. “I wanted to invent an engine that could run forever,” Kalashnikov once said. “I could have developed a new train, had I stayed in the railway.” But this was not to be.

Mikhail Kalashnikov was born in the rural locality of Kurya, the 17th child of peasants. When Kalashnikov was still a boy, his family’s property was confiscated and they were deported to Western Siberia. The farming was hard there, but harder was the shame of being exiled from the Soviet workers’ paradise. Kalashnikov was a sickly child, and though his studies didn’t take him past secondary school, the future inventor dreamed of being a poet. After finishing the seventh grade, young Kalashnikov gathered his poetry books and worked as a technician on the Turkestan-Siberian railway, until he was conscripted into the Red Army in 1938. He worked with tanks and, in his spare time, tinkered with small arms. In 1941, Kalashnikov was wounded in battle. In the hospital, suffering from war wounds and shellshock, Kalashnikov had his vision. “I decided to build a gun of my own which could stand up to the Germans,” he would later say.

More here.

Saturday Poem

The Chase

They say the chase ends where the earth is put together
by two halves, but no matter —because that is you
at thirty, perhaps forty:
corpus callosum of the brain,
two loaves opening and closing like a book.

Your arms spring out and lungs push and pull
rinsing the midnight air—
no matter, because you are there, chasing
the child of wonder and hope
through cities confined in smog.

You missile through firs, through mouths dusted
with mathematical chalk.
You follow the muddy-water spillways peppered with
bacterial spore.

Not the shadow that greets itself in the dark
but the utter collision of evaporating rain
leads you on.
Not the lightning’s sketch but the black puzzle of night,
as you appear and disappear among people,
chasing he who knows your name
but won’t tell.

by Victor Martinez
from Paper Dance: 55 Latino Poets
Persea Books, 1994

A New Biography Says George W. Bush Really Was the Decider

Jason Zengerle in The New York Times:

It’s an axiom of American politics that presidents become more popular once they are ex-presidents. Admittedly, George W. Bush had nowhere to go but up. With two months left in his second term, Bush’s approval rating sat at an abysmal 25 percent, just one point higher than Richard Nixon’s during Watergate. On the day of Barack Obama’s inauguration, when a Marine helicopter ferried the outgoing president away from the United States Capitol, many in the crowd serenaded him with chants of “Bye-bye Bush!” and “Go home to Texas!” Then the predictable happened. Bush’s absence from public life made Americans’ hearts for him grow fonder. Out of the spotlight, he busied himself painting oil portraits of family pets and world leaders; when he did dip his toe into political waters, it was for laudable and uncontroversial causes like fighting AIDS and malaria in Africa. His poll numbers began their inexorable climb. By June of last year, Bush’s favorability rating was 52 percent — higher than Obama’s at the time. His younger brother, Jeb, started his ill-fated 2016 presidential run with the declaration, “I am my own man.” But by the end of Jeb’s run, he was appearing alongside Dubya at rallies. Although Jeb’s fraternal Hail Mary ultimately fell short, his older brother’s re-emergence on the campaign trail only served to confirm that, fewer than eight years after being hounded from the White House, George W. Bush had become a less polarizing, fairly popular, at times even lovable figure.

Readers of the presidential historian Jean Edward Smith’s mammoth new biography, “Bush,” will surely be cured of this political amnesia. Smith — who has written biographies of Ulysses S. Grant, Franklin Delano Roosevelt and Dwight Eisenhower — is unsparing in his verdict on our 43rd president. “Rarely in the history of the United States has the nation been so ill-served as during the presidency of George W. Bush,” Smith writes in the first sentence of the preface. And then he gets harsh. In Smith’s clipped retelling of his subject’s early years, Bush was an unaccomplished, callow son of privilege who cashed in on his family’s connections for everything from his admission to Yale to his avoidance of Vietnam. Quoting Bush’s tautological explanation of his wasted youth — “When I was young and irresponsible, I behaved young and irresponsibly” — Smith concludes, “That pretty well says it all.” Being Texas governor “was scarcely a full-time job,” and his 2000 victory in the presidential race owed as much to the ineptness of his Democratic opponent, Al Gore — who “came across as wooden and self-important” — as it did to Bush’s “ease on the campaign trail.” None of this prepared Bush for the gravity of the responsibilities he would face as president, Smith argues, and time and again Bush failed to meet the challenges of the office.

More here.

Friday, July 22, 2016

Do Government Incentives Make Us Bad Citizens?

John McMahon reviews Samuel Bowles's The Moral Economy: Why Good Incentives Are No Substitute for Good Citizens, in The Boston Review:

From sin taxes to the Affordable Care Act’s individual mandate, from tax rebates for buying an electric car to performance-based school funding, governments extensively deploy material incentives to regulate citizens’ behaviors. The idea is straightforward: economic costs and benefits shape people’s choices, so changing those costs and benefits can change their actions.

This approach is intuitively appealing in our age, as it uses an enlightened mix of encouragement and coercion to advance public goals. But it works only if people act rationally in their own self-interest and respond accordingly to alterations in cost-benefit calculations. This may not seem much of an “if”; the notion that we all maximize our own good has been the basis of a long strain of economic thinking stretching back at least to Adam Smith, who asserted, “It is not from the benevolence of the butcher, the brewer, or the baker that we expect our dinner, but from their regard to their own interest.” But is this really an accurate depiction of our behavior? And what is the significance of individual or collective political agency in a world of government-by-incentives?

Samuel Bowles’s new book The Moral Economy: Why Good Incentives Are No Substitute for Good Citizens provides a lucid and comprehensive answer to the first of these questions. Synthesizing findings from experimental and behavioral economics, psychology, and anthropology over the last two decades, Bowles convincingly argues that people do not act on the basis of amoral self-interest alone. Rather, we regularly proceed from “ethical and other-regarding motivations.” Furthermore, these “social preferences,” as Bowles calls them, can be crowded out and eventually eroded by policies that rely exclusively on manipulating material self-interest. He then moves these lessons from social science research into the realm of both policymaking and political theory, contending that the proper role of government is to construct a “policy paradigm of synergy between incentives and constraints, on the one hand, and ethical and other-regarding motivations, on the other.”

Bowles doesn’t explore the second question, which is about political agency. This is a striking omission because he emphasizes the need for public policy and governance to cultivate good citizens. Yet there is no place in his recommendations for the active citizen practicing democracy through political participation, protest, and social movements. The book is haunted by the absence of active responses to government-instituted incentives and policies.

More here.

Money: The Brave New Uncertainty of Mervyn King

Paul Krugman reviews Mervyn King's The End of Alchemy: Money, Banking, and the Future of the Global Economy, in the NYRB:

These days, of course, the pound sterling is much less widely used than the dollar, the euro, or even the yen or the yuan, and the Bank of England is correspondingly overshadowed in many ways by its much younger counterparts abroad. Yet the bank still punches above its weight in troubled times. In part that’s because London remains a great financial center. But it’s also thanks to the Bank of England’s intellectual adventurousness.

It was a big departure for the Federal Reserve—which has historically been run by bankers rather than academics—when Ben Bernanke, a distinguished monetary economist, was appointed as chairman in 2006. But Mervyn King, a former professor at the London School of Economics, was already running the Bank of England. And it was these two professors who guided the English-speaking world’s biggest economies through the recent financial crisis.

Now King, like Bernanke, has written a book inspired by his experiences. But it’s not at all the book one might have expected. It’s not a play-by-play of the crisis, or a tell-all, or a personal memoir. In fact, King not-so-subtly mocks the authors of such books, which “share the same invisible subtitle: ‘how I saved the world.’”

King’s book is, instead, devoted to “economic ideas.” It is rich in wide-ranging historical detail, with many stories I didn’t know—the desperate shortage of banknotes at the outbreak of World War I, the remarkable emergence of the “Swiss dinar” (old Iraqi notes printed from Swiss plates) in Kurdistan. But it is mainly an extended meditation on monetary theory and the methodology of economics.

And a fascinating meditation it is. As I’ll explain shortly, King takes sides in a long-running dispute between mainstream economic analysis and a more or less radical fringe that rejects the mainstream’s methods—and comes down on the side of the radical fringe. The policy implications of his methodological radicalism aren’t as clear or, I’d argue, as persuasive as one might like, but he definitely challenges policy as well as research orthodoxy.

You don’t have to agree with everything King says—and I don’t—to be impressed by his willingness to let his freak flag fly. His assertion that we haven’t done nearly enough to head off the next financial crisis will, I think, receive wide assent; I don’t know anyone who thinks, for example, that the US financial reforms enacted in 2010 were sufficient. But his assertion that the whole intellectual frame we’ve been using is more or less irreparably flawed is a brave position that should produce a lot of soul-searching among both economists and policy officials.

More here.

Ending the Violence

Cedric Johnson in Jacobin:

Former New York City mayor Rudolph Giuliani, the supreme booster of “broken windows” policing, was quick to attack Black Lives Matter activists, claiming that BLM is “inherently racist because, number one, it divides us.” He also chastised activists for allegedly ignoring violence within black communities and suggested that they were responsible for civilian-police conflicts because their criticism “puts a target on the backs of” police officers.

Other conservatives have echoed claims that the Obama administration and Black Lives Matter protests have created dangerous conditions for police officers. They are wrong. Policing is not the most hazardous occupation in the United States. In fact, it is not even in the top ten.

And contrary to the claim that the Obama administration — no unwavering supporter of anti–police brutality efforts — has enabled anti-police sentiment, violence against police officers has decreased during Obama’s tenure, especially when compared to the George W. Bush years. Over 70 percent of the violence against law enforcement that has occurred so far this year has been carried out by white men.

Finally, anti–police brutality struggles should not be reduced to the “movement for black lives.” Surely the hashtag and slogan, and the network of activists who align with BLM, have been instrumental in drawing national and international attention to the issue of police violence, but on the ground, protests are comprised of all manner of people representing victims’ families, traditional civil rights organizations, neighborhood and community groups, labor unions, civil liberties advocates, youth and student organizations, various left political tendencies, and solitary actors. And organizing against police brutality has a much longer lineage, one that certainly predates the birth of BLM’s millennial spokespersons.

In the hands of conservatives, Black Lives Matter has become an easy foil for dismissing a longer-standing set of struggles against police violence and mass incarceration.

More here.

On the march to the robot apocalypse

Kevin Hartnett at The Boston Globe:

Here’s a fun game: Tap your finger against a surface, but before you do, predict the sound it will make. Did you get it right? If not, you better practice, because pretty soon a robot’s going to play this game better than you can.

On the march to the robot apocalypse, the ability to perform such a quirky task may not seem especially portentous, but new research out of MIT demonstrates why such a capacity lays the foundation for far more sophisticated actions.

Andrew Owens, a PhD student in MIT’s Computer Science and Artificial Intelligence Laboratory, and his collaborators presented the research at a conference in Las Vegas last month. There they explained how they’d engineered a computer algorithm that can “watch” silent video of a drumstick striking different kinds of objects and create a sound that, in many cases, closely matches the one generated by the actual event.

“The computer has to know a lot about the physical world,” says Owens. “It has to know what material you’re hitting, that a cushion is different than a rock or grass. It also has to know something about the action you’re taking. If you’re [striking the surface] hard, it should be a loud sound; if soft, a softer sound.”

More here.

The moral quandaries of eating meat

Julian Baggini at the Times Literary Supplement:

Indifference, however, appears to be the norm. Most of us live in “carninormative” societies where meat eating is so normal that no matter how many qualms we might have about it, it just doesn’t feel wrong to most of us. This is most evident in the mismatch between the almost universal reflective disapproval of inhumane intensive farming and the unreflective buying choices of most consumers. Christopher Belshaw, in his contribution to The Moral Complexities of Eating Meat, is surely too optimistic when he claims it is unnecessary to say anything about factory farming because “there is little point either in defending the indefensible or in attacking a practice that almost every reader here will already condemn”. I am constantly amazed to see well-educated, thoughtful people order meat at restaurants without any questions about its provenance.

So we ought to be thinking more seriously about animal ethics. Yet it is often the case that the more rigorous we try to be, the more inadequate our conceptual tools look. The crudest tool of all is the utilitarian hedonic yardstick, which equates the good with whatever decreases suffering and increases happiness or pleasure. Utilitarianism starts with the undeniable premiss that the well-being of sentient creatures matters, yet ends with the incredible conclusion that all that matters is maximizing total well-being. Even if that were true, it’s difficult enough applying the principle to humans, since there are important qualitative differences between kinds of positive and ­negative human feelings. When we try to apply the principle across species, these problems multiply. How can you compare Hammy the hamster playing on his wheel and Miles Davis playing his trumpet?

More here.

The Heroic Art of Agnes Martin

Hilton Als at the London Review of Books:

Walking through the show, one can see how ultimately unsuited Martin was to be a hard-core Abstract Expressionist; the movement was too noisy, and what did she have to do with bop, the Beats, that wall of sound and bodies that wanted to shout the squares down in favor of “kicks”? Martin was interested not in discord but in harmony. While Jackson Pollock said he was nature, Martin strove to represent how nature made her feel or should make us feel—humble, free. Nature was to her what it was to Ralph Waldo Emerson’s “transparent eye” in his transcendentalist masterpiece, “Nature” (1836)—a space unrivaled in its ability to inspire and transform.

Emerson’s idealism—“Nature is made to conspire with spirit to emancipate us”—was not unlike Martin’s. Often, in her lovely, empathic writing, she tries to communicate what being an artist must mean if one is going to make real work: becoming a conduit of the beautiful, that which cannot be explained. In her 1989 piece, “Beauty Is the Mystery of Life,” Martin wrote:

When a beautiful rose dies beauty does not die because it is not really in the rose. Beauty is an awareness in the mind. It is a mental and emotional response that we make. We respond to life as though it were perfect. When we go into a forest we do not see the fallen rotting trees. We are inspired by a multitude of uprising trees…. The goal of life is happiness and to respond to life as though it were perfect is the way to happiness. It is also the way to positive art work.

Still, Martin would not be able to create “positive” artwork for years to come; the journey was long.

More here.

In these heartless times, The Little Prince reminds us what it is to be human

Azar Nafisi in The Guardian:

Do you remember the fox? Not just any fox, this one is a sage; the one that reveals the truth to the Little Prince, who reveals it to the pilot, who reveals it to us, the readers. As he says goodbye to his friend, the fox tells the Little Prince, “Here is my secret. It’s quite simple: One sees clearly only with the heart. Anything essential is invisible to the eyes.” When as a child I first heard my father read me The Little Prince in a sunny room in Tehran, I was not aware that the story, along with tales from Shahnameh: The Persian Book of Kings, Pinocchio, the work of Mulla Nasrudin, the Alice stories, The Wizard of Oz and The Ugly Duckling, among others, would become one of the main pillars of my “republic of imagination”. My father’s democratic way of introducing me to these stories shaped my attitude towards works of imagination as universal spaces, transcending the boundaries of geography, language, ethnicity, religion, gender, race, nationality and class. I knew that although this fox and his prince were products of a Frenchman’s mind, and although the book was written in a language foreign to me, at a time before I was born and in a place I had never seen, by virtue of hearing and later reading it, that story would also become my story, that Little Prince and fox belonged as much to me as Scheherazade and her 1,001 nights belonged to the French, American, British, Turkish, German and all other readers who would in reading cherish them and “tame” them, the way the Prince learned to tame the fox.

This is how I, a little girl from Iran, came to know and love France, through a little prince and a fox. I had met foxes before; in fact, my father introduced me to the animal in a fable by Jean de La Fontaine. In this story, like most stories, the fox is sly and clever, cheating a simple crow of his meal. Later, my father translated La Fontaine’s fables, complete with their beautiful illustrations, which he, an amateur painter, drew himself, copying from the originals. In those and most other illustrations the fox looked pretty, with a gorgeous bushy tail and wide eyes. The Little Prince’s fox was not pretty; its bushy tail, more like an upright broom, was not beautiful and his eyes were so narrow they could barely be seen. Yet this animal forever changed my attitude towards the fox – I began to see it in a different light. From this perspective, the fox’s slyness was not due to malice, but to the need to survive. Although I felt sorry for the chickens (which didn’t prevent me from eating them), the fox hunted them so that he could stay alive, unlike some human beings who not only kill and eat the chickens but hunt foxes for entertainment and sport. Gradually, I came to understand why those wide eyes, always brimming with anxiety and fear, seemed to be on the lookout for some invisible but very real menace.

More here.

Friday Poem

Letter from My Ancestors

We wouldn’t write this,
wouldn’t even think of it. We are working
people without time on our hands. In the old country,

we milk cows or deliver the mail or leave,
scattering to South Africa, Connecticut, Missouri,
and finally, California for the Gold Rush –

Aaron and Lena run the Yosemite campground, general
store, a section of the stagecoach line. Morris comes
later, after the earthquake, finds two irons

and a board in the rubble of San Francisco.
Plenty of prostitutes need their dresses pressed, enough
to earn him the cash to open a haberdashery and marry

Sadie – we all have stories, yes, but we’re not thinking
stories. We have work to do, and a dozen children. They’ll
go on to pound nails and write up deals, not musings.

We document transactions. Our diaries record
temperatures, landmarks, symptoms. We
do not write our dreams. We place another order,

make the next delivery, save the next
dollar, give another generation – you,
maybe – the luxury of time

to write about us.

by Krista Benjamin
from The Best American Poetry 2006
Scribner Poetry, NY

How the Brain Builds Memory Chains

Sara Chodosh in Scientific American:

Think about the first time you met your college roommate. You were probably nervous, talking a little too loudly and laughing a little too heartily. What else does that memory bring to mind? The lunch you shared later? The dorm mates you met that night? Memories beget memories, and as soon as you think of one, you think of more. Now neuroscientists are starting to figure out why. When two events happen in short succession, they feel somehow linked to each other. It turns out that apparent link has a physical manifestation in our brains, as researchers from the Hospital for Sick Children in Toronto (SickKids), the University of Toronto and Stanford University describe in this week’s Science. “Intuitively we know that there’s a structure to our memory,” says neuroscientist Paul Frankland, affiliated with both the University of Toronto and SickKids. “These experiments are starting to scratch the surface of how memories are linked in the brain.”

In your brain, and in the brains of lab mice, recollections are physically represented as collections of neurons with strengthened connections to one another. These clusters of connected cells are known as engrams, or memory traces. When a mouse receives a light shock to the foot in a particular cage, an engram forms to encode the memory of that event. Once that memory forms, the neurons that make up the engram are more likely to fire. Furthermore, more excitable neurons—that is, brain cells that activate easily—are more likely to be recruited into an engram, so if you increase the excitability of particular neurons, you can preferentially include them in a new engram. The question was, did that principle apply to two memories that happen close together in time? Neurons in a newly formed memory trace are subsequently more excitable than neighboring brain cells for a transient period of time. It follows then that a memory formed soon after the first might be encoded in an overlapping population of neurons, which is exactly what Frankland and study co-lead author Sheena Josselyn found.

More here.

Thursday, July 21, 2016

How food became a matter of morals

Julian Baggini in The Guardian:

“The way these cream cakes flaunt themselves,” says saucy Carry On star Barbara Windsor, glaring disapprovingly at a chocolate eclair bursting with whipped cream, “it’s enough to lead a girl astray.” Her frown turns into a giggle. “Given half a chance,” she adds before tucking in gleefully.

Nothing captures the peculiarly moralistic British attitude to food better than this 15-second advert from the 1970s. And if poetry is the art of capturing whole worlds in few words, then its immortal slogan “naughty but nice” is greater proof of the artistry of its author, Salman Rushdie, than the Booker prize he would go on to win.

For as long as we can remember, the British have associated delicious food with depraved indulgence. Anything that tastes good has got to be bad for your body, soul or both. The marketing department of Magnum knew this when it called its 2002 limited edition range the Seven Deadly Sins. Nothing makes a product more enticing than its being naughty, or even better, wicked.

More here.

How a Guy From a Montana Trailer Park Overturned 150 Years of Biology

Ed Yong in The Atlantic:

In 1995, if you had told Toby Spribille that he’d eventually overthrow a scientific idea that’s been the stuff of textbooks for 150 years, he would have laughed at you. Back then, his life seemed constrained to a very different path. He was raised in a Montana trailer park, and home-schooled by what he now describes as a “fundamentalist cult.” At a young age, he fell in love with science, but had no way of feeding that love. He longed to break away from his roots and get a proper education.

At 19, he got a job at a local forestry service. Within a few years, he had earned enough to leave home. His meager savings and non-existent grades meant that no American university would take him, so Spribille looked to Europe.

Thanks to his family background, he could speak German, and he had heard that many universities there charged no tuition fees. His missing qualifications were still a problem, but one that the University of Göttingen decided to overlook. “They said that under exceptional circumstances, they could enroll a few people every year without transcripts,” says Spribille. “That was the bottleneck of my life.”

More here.

Liberalism after Brexit

Will Davies at the Political Economy Research Center:

Given that Brexit was an event imagined and delivered from within the Conservative Party, one of the most important analyses of it is Matthew d’Ancona’s examination of how the idea shifted from the party’s margins to its mainstream over the post-Thatcher era. Two things in particular stand out in his account.

Firstly, the political plausibility of Brexit rose as a direct response to Tony Blair’s dogmatic assumption that European integration was a historical destiny, which encompassed the UK. No doubt a figure such as Blair would have discovered a messianic agenda under any historical circumstances. But given he gained power specifically in the mid-90s, he was one palpable victim of the fin de siècle ideology (stereotyped by Francis Fukuyama’s ‘end of history’ thesis, but also present in Anthony Giddens’ ‘Third Way’) that the world was programmed to converge around a single political system.

Neo-conservative faith in violent ‘democratisation’ was Blair’s worst indulgence on this front, but a view of European unification (and expansion) as inevitable was responsible for inciting the Tory reaction within Westminster. Europe could have been viewed as a particular historical path, adopted in view of the particular awfulness of the European 20th century. Instead, in a Hegelian fashion, the idea of Europe became entangled with the idea of ‘globalisation’, and the conservative reaction was to refuse both.

Secondly, Tory Brexiteers view the EU as an anti-market project, which blocks economic freedom. This is also weirdly ahistorical.

More here.

Minds turned to ash

Josh Cohen in 1843 Magazine:

When Steve first came to my consulting room, it was hard to square the shambling figure slumped low in the chair opposite with the young dynamo who, so he told me, had only recently been putting in 90-hour weeks at an investment bank. Clad in baggy sportswear that had not graced the inside of a washing machine for a while, he listlessly tugged his matted hair, while I tried, without much success, to picture him gliding imperiously down the corridors of some glassy corporate palace.

Steve had grown up as an only child in an affluent suburb. He recalls his parents, now divorced, channelling the frustrations of their loveless, quarrelsome marriage into the ferocious cultivation of their son. The straight-A grades, baseball-team captaincy and Ivy League scholarship he eventually won had, he felt, been destined pretty much from the moment he was born. “It wasn’t so much like I was doing all this great stuff, more like I was slotting into the role they’d already scripted for me.” It seemed as though he’d lived the entirety of his childhood and adolescence on autopilot, so busy living out the life expected of him that he never questioned whether he actually wanted it.

Summoned by the bank from an elite graduate finance programme in Paris, he plunged straight into its turbocharged working culture. For the next two years, he worked on the acquisition of companies with the same breezy mastery he’d once brought to the acquisition of his academic and sporting achievements. Then he realised he was spending a lot of time sunk in strange reveries at his workstation, yearning to go home and sleep. When the phone or the call of his name woke him from his trance, he would be gripped by a terrible panic. “One time this guy asked me if I was OK, like he was really weirded out. So I looked down and my shirt was drenched in sweat.”

One day a few weeks later, when his 5.30am alarm went off, instead of leaping out of bed he switched it off and lay there, staring at the wall, certain only that he wouldn’t be going to work. After six hours of drifting between dreamless sleep and blank wakefulness, he pulled on a tracksuit and set off for the local Tesco Metro, piling his basket with ready meals and doughnuts, the diet that fuelled his box-set binges. Three months later, he was transformed into the inertial heap now slouched before me. He did nothing; he saw no one. The concerned inquiries of colleagues quickly tailed off. He was intrigued to find the termination of his employment didn’t bother him. He spoke to his parents in Chicago only as often as was needed to throw them off the scent. They knew the hours he’d been working, so didn’t expect to hear from him all that much, and he never told them anything important anyway.

More here.

Were Ants the World’s First Farmers?

Jackson Landers in Smithsonian:

Humans have been practicing agriculture for about 10,000 years. But the attine ants of South America (which include the well-known leafcutters) have us beat by a long way. According to a new paper co-authored by entomologist Ted Schultz, curator of ants at Smithsonian's National Museum of Natural History, attine ants, which farm on an industrial scale similar to humans, have been carefully cultivating gardens with a complex division of labor to grow an edible fungus. Schultz's team found that the ants have been doing this far longer than previously believed—up to 65 million years—and that we have much to learn from them.

Schultz and his co-authors, led by Sanne Nygaard, Guojie Zhang and Jacobus Boomsma of the University of Copenhagen, conducted an analysis of the genomes of the various species of attine ants as well as the fungus that they cultivate. Their results answer some long-standing evolutionary questions.

The 210 species of attine ants, including the 47 species of leafcutters, forage through the forests of Central and South America in search of leaves and other vegetation, which they carve into pieces using their powerful jaws and carry back to their nests. But they never eat the leaves directly. The plant matter is used as a growth medium for certain varieties of edible fungi which Schultz's team says have been cultivated and passed on by generations of ants going back tens of millions of years.

…Humans may have important lessons to learn from the attine ants. We have struggled to protect the survival of our crops for only about 10,000 years. “We're constantly coming up with herbicides or antibiotics to control pests. And the pests are constantly evolving countermeasures against those things,” Schultz says. The most economically important variety of banana became functionally extinct in the 1960s and another variety is heading in the same direction. “Somehow this system with the ants has been in an equilibrium for millions of years,” he adds.

More here.

Wednesday, July 20, 2016

Fences: A Brexit Diary

Zadie Smith in the New York Review of Books:

Back in the old neighborhood in North West London after a long absence, I went past the local primary school and noticed a change. Many of my oldest friends were once students here, and recently—when a family illness returned us to England for a year—I enrolled my daughter. It’s a very pretty redbrick Victorian building, and was for a long time in “special measures,” a judgment of the school inspection authority called Ofsted, and the lowest grade a state school can receive. Many parents, upon reading such a judgment, will naturally panic and place their children elsewhere; others, seeing with their own eyes what Ofsted—because it runs primarily on data—cannot humanly see, will doubt the wisdom of Ofsted and stay put. Still others may not read well in English, or are not online in their homes, or have never heard of Ofsted, much less ever considered obsessively checking its website.

In my case I had the advantage of local history: for years my brother taught here, in an after-school club for migrant children, and I knew perfectly well how good the school is, has always been, and how welcoming to its diverse population, many of whom are recently arrived in the country. Now, a year later, Ofsted has judged it officially “Good,” and if I know the neighborhood, this will mean that more middle-class, usually white, parents will take what they consider to be a risk, move into the environs of the school, and send their kids here.

If this process moves anything like it does in New York, the white middle-class population will increase, keeping pace with the general gentrification of the neighborhood, and the boundaries of the “catchment area” for the school will shrink, until it becomes, over a number of years, almost entirely homogeneous, with dashes of diversity, at which point the regulatory body will award its highest rating at last.

More here.