It may not feel like anything to be an alien

Susan Schneider in KurzweilAI:

Humans are probably not the greatest intelligences in the universe. Earth is a relatively young planet and the oldest civilizations could be billions of years older than us. But even on Earth, Homo sapiens may not be the most intelligent species for that much longer. The world Go, chess, and Jeopardy champions are now all AIs. AI is projected to outmode many human professions within the next few decades. And given the rapid pace of its development, AI may soon advance to artificial general intelligence—intelligence that, like human intelligence, can combine insights from different topic areas and display flexibility and common sense. From there it is a short leap to superintelligent AI, which is smarter than humans in every respect, even those that now seem firmly in the human domain, such as scientific reasoning and social skills. Each of us alive today may be one of the last rungs on the evolutionary ladder that leads from the first living cell to synthetic intelligence.

What we are only beginning to realize is that these two forms of superhuman intelligence—alien and artificial—may not be so distinct. The technological developments we are witnessing today may have all happened before, elsewhere in the universe. The transition from biological to synthetic intelligence may be a general pattern, instantiated over and over, throughout the cosmos. The universe’s greatest intelligences may be postbiological, having grown out of civilizations that were once biological. (This is a view I share with Paul Davies, Steven Dick, Martin Rees, and Seth Shostak, among others.) To judge from the human experience—the only example we have—the transition from biological to postbiological may take only a few hundred years. I prefer the term “postbiological” to “artificial” because the contrast between biological and synthetic is not very sharp. Consider a biological mind that achieves superintelligence through purely biological enhancements, such as nanotechnologically enhanced neural minicolumns. This creature would be postbiological, although perhaps many wouldn’t call it an “AI.” Or consider a computronium that is built out of purely biological materials, like the Cylon Raider in the reimagined Battlestar Galactica TV series.

More here.



A paean to the strange suburbs of Baltimore

John Waters at Lapham's Quarterly:

If you play that game where you pick your porn-star moniker by using the name of the street of your first childhood home as your first name and your real middle name as your last, I’d be Clark Samuels. John S. Waters Jr., 1401 Clark Avenue, Lutherville, Maryland. My first house. My first arena. My launching pad to creative filth.

Does anyone ever forget the first room you were allowed to make your own? As a kid, I was lucky enough to have my own bedroom far enough away from my parents or brother and sisters that spying on me was nearly impossible. All I ever wanted to be was the rock-and-roll king. When I first saw Elvis on The Ed Sullivan Show in 1957, twitching and moaning “Heartbreak Hotel,” I was almost eleven but immediately knew I was gay. Then the decorator gene kicked in. Down from my walls came the family’s tasteful Audubon prints and up went glossy head shots of the Everly Brothers, Tab Hunter, and the Platters (one of whom sported a pencil mustache). I nagged my mom and dad into buying me a reel-to-reel tape recorder so I could be an early pirate and tape all the hits off the radio without having to wait to buy them. Then I’d play these rockabilly and rhythm-and-blues numbers over and over as I danced around my room lip-synching, gyrating, and talking to myself. Finally I had a think tank.

more here.

Pablo Larraín’s ‘Neruda’

J. Hoberman at Artforum:

WITH HIS NEW FILM, Neruda, Chile’s master of the political gothic, Pablo Larraín, exhumes a sacred monster: namely, his nation’s 1971 Nobel Laureate, the poet Pablo Neruda. Hardly a biopic, Neruda focuses on a brief, if dramatic, stretch of its subject’s life—the fifteen months from January 1948 through March 1949 during which the poet, an elected senator and an outspoken member of the banned Chilean Communist Party, went underground, finally escaping over the Andes to Argentina.

Neruda devotes only a dozen pages to the topic in his memoirs, half of them concerning the exciting last stage of his getaway. The writing is routinely self-aggrandizing: “Even the stones of Chile” knew his voice, he brags, only partially in jest. The movie, written by Guillermo Calderón, is wryly admiring of Neruda’s imperturbable chutzpah.

Embodied with dour, deadpan magnificence by Luis Gnecco (who played a leftist organizer in No [2012], the third installment of Larraín’s anti-Pinochet trilogy), Neruda is introduced running a gauntlet of flashbulbs as he enters the senatorial washroom to denounce the nation’s sellout president, Gabriel González Videla. The tumult continues as leftists party in masquerade. Neruda, dressed in a burnoose as Lawrence of Arabia, thrills comrades and admirers with his poet’s voice, as his indulgent second wife (charmingly portrayed by the Argentinean actress Mercedes Morán) calls it, sonorously reciting, not for the last time, the youthful love poem that begins, “Tonight I can write the saddest lines. . . .”

more here.

English Medieval Embroidery: Opus Anglicanum

Juliet Barker at Literary Review:

English medieval embroidery might seem an odd subject for a book, but this is no ordinary volume. Published to accompany a major exhibition at the V&A (which runs until 5 February 2017), it is not only a catalogue and scholarly monograph but also a visual feast, with magnificent colour plates on virtually every page, bursting at the seams with titbits of fascinating information. It’s the sort of book that makes you want to hug yourself with glee: revelatory and as exquisitely produced as the medieval embroidery it celebrates.

The stereotypical embroiderer, medieval or otherwise, is always female and usually noble-born. Charles Henry Hartshorne, writing in 1845, summed it up neatly in one of the earliest studies on the subject: embroidery ‘served to occupy the leisure of the English gentlewoman when there were but few other modes in which her talents could be employed’; immured within a castle chamber or convent, ‘the needle alone supplied an unceasing source of amusement’. What English Medieval Embroidery demonstrates is that embroidery in England was first and foremost a business, employing both male and female workers whose professional skill was renowned throughout Europe. In the 13th and 14th centuries their work was in demand everywhere from Scandinavia to Portugal and from Riga to Patras, with the Church hierarchy and the papal curia, in particular, proving insatiable in their appetite for what was known outside England as opus anglicanum.

more here.

Wednesday, January 4, 2017

On John Berger, 1926–2017

Bruce Robbins in n+1:

A FEW MONTHS AGO, while watching the lush and loving Tilda Swinton and Colin MacCabe documentary The Seasons in Quincy: Four Portraits of John Berger, I found myself thinking what an endless series of portraits this man has given rise to, and will keep giving rise to. The beauty of the film’s montage—much of it of Berger’s Alpine home—is a self-conscious tribute to the beauty Berger teaches us to see in the world, in art and outside it.

Berger’s decision in the early ’70s to spend what turned out to be five decades of his life in a small village in the French Alps is easy to misunderstand. He wasn’t seeking a refuge from the world, but the right kind of contact with the world. The film overflows with his charm and energy as a friend and neighbor; there’s none of the solitary artist pose. Peasants were a major reason he came; his son became one. And peasants became essential to his politics.

Lyrical about the man, the film skimps on his politics—something that often happens with efforts to sell a political or philosophical outsider to a mainstream audience. The weird thing is of course that one of the best exemplars of transporting seemingly taboo perspectives into the family living room was Berger himself. In 1972, his BBC show Ways of Seeing suddenly made it clear to viewers and to TV executives alike that it could be tremendously enjoyable to look at art through Marxist eyes. The series spent a still-startling amount of time on landscape as property, and on the sexism of nudes.

No matter what he was looking at, Berger never stopped asking uncomfortable and therefore stimulating questions.

More here.

Why Sex Is Mostly Binary but Gender Is a Spectrum

Siddhartha Mukherjee in Nautilus:

Anyone who doubts that genes can specify identity might well have arrived from another planet and failed to notice that the humans come in two fundamental variants: male and female. Cultural critics, queer theorists, fashion photographers, and Lady Gaga have reminded us—accurately—that these categories are not as fundamental as they might seem, and that unsettling ambiguities frequently lurk in their borderlands. But it is hard to dispute three essential facts: that males and females are anatomically and physiologically different; that these anatomical and physiological differences are specified by genes; and that these differences, interposed against cultural and social constructions of the self, have a potent influence on specifying our identities as individuals.

That genes have anything to do with the determination of sex, gender, and gender identity is a relatively new idea in our history. The distinction between the three words is relevant to this discussion. By sex, I mean the anatomic and physiological aspects of male versus female bodies. By gender, I am referring to a more complex idea: the psychic, social, and cultural roles that an individual assumes. By gender identity, I mean an individual’s sense of self (as female versus male, as neither, or as something in between).

For millennia, the basis of the anatomical dissimilarities between men and women—the “anatomical dimorphism” of sex—was poorly understood. At the beginning of the third century, Galen, the most influential anatomist in the ancient world, performed elaborate dissections to try to prove that male and female reproductive organs were analogs of each other, with the male organs turned inside out and the female’s turned outside in. The ovaries, Galen argued, were just internalized testicles retained inside the female body because females lacked some “vital heat” that could extrude the organs. “Turn outward the woman’s [organs] and double the man’s, and you will find the same,” he wrote. Galen’s students and followers stretched this analogy, quite literally, to its absurd point, reasoning that the uterus was the scrotum ballooning inward, and that the fallopian tubes were the seminal vesicles blown up and expanded.

More here.

Drugs du jour

Cody Delistraty in Aeon:

The drugs chosen to pattern our culture over the past century have simultaneously helped to define what each generation has most desired and found most lacking in itself. The drugs du jour thus point towards a cultural question that needs an answer, whether that’s a thirst for spiritual transcendence, or for productivity, fun, exceptionalism or freedom. In this way, the drugs we take act as a reflection of our deepest desires and our inadequacies, the very feelings that create the cultures in which we live.

To be clear, this historical investigation predominately concerns psychoactive drugs. It accounts for a large family of drugs embracing LSD, cocaine, heroin, ecstasy, barbiturates, anti-anxiety medications, opiates, Adderall and the like, but not anti-inflammatories such as ibuprofen (Advil) or pain relievers such as acetaminophen (Tylenol). These pharmaceuticals are not drugs that alter one’s state of mind and are consequently of little use when making sociocultural analyses.

The drugs up for discussion also cut across boundaries of law (just because a drug is illegal does not preclude it from being central to a cultural moment) and class (a drug used by the lower class is no less culturally relevant than drugs favoured by the upper class, although the latter tend to be better recorded and retrospectively viewed as of ‘greater cultural importance’). Finally, the category of drugs under scrutiny cuts across therapeutic, medical and recreational usage.

To understand the way we create and popularise drugs to match the culture we have, consider cocaine. Readily available at the turn of the 20th century, cocaine was outlawed in 1920 with the passing of The Dangerous Drugs Act in the United Kingdom (and in 1922 in the United States under the Narcotic Drugs Import and Export Act). Cocaine’s initial popularity in the late-19th-century was in large part due to ‘its potent euphoric effects’, according to Stuart Walton, an ‘intoxication theorist’ and author of Out of It: A Cultural History of Intoxication (2001). Cocaine, Walton told me, ‘helped potentiate a culture of resistance to Victorian norms, the abandonment of rigorous civility in favour of an emergent “anything goes” social libertarianism in the era of the Jugendstil, and the rise of social-democratic politics’.

More here.

Turning to faith for the greater good: Climate Guardian

Virginia Gewin in Nature:

Veerabhadran Ramanathan has modelled greenhouse-gas dynamics and quantified the chlorofluorocarbon (CFC) contribution to Earth's global warming. His work at the Scripps Institution of Oceanography in La Jolla, California, shows that CFC-replacing hydrofluorocarbons (HFCs) also have a potent climate-warming effect. This finding led in October to HFCs being added to the Montreal Protocol on Substances that Deplete the Ozone Layer. He has engaged for a decade with religious leaders to act on climate change.

When did you realize that science alone might not galvanize climate-change action?

Many of my colleagues and I could see that, by mid-century, we'd shoot past 2 degrees of warming, yet there was no public support for the drastic actions needed to steer us away from the cliff. I was discouraged and depressed. Then I got an e-mail telling me I'd been elected to the Pontifical Academy of Sciences in Vatican City, a body of only 80 members, one-third of whom are Nobel laureates.

How did your early contact with the Vatican affect your outlook?

I initially thought the e-mail was spam. Before I got involved with the Vatican, I didn't have the foggiest notion that religion could help to combat climate change. I've since gone on record to say that global warming has to be taught in every church, synagogue, mosque and temple before we are likely to take the sort of drastic actions necessary to head it off.

Where did your involvement lead?

At a meeting hosted by the Vatican in 2011, I teamed up with Dutch Nobel laureate Paul Crutzen to focus on glaciers. That opened my eyes to the power of the Church. In the meeting's scientific report, we included a prayer to protect humanity. There was tremendous opposition, but I stood behind its inclusion. We saw the potential of mobilizing religion to help, and proposed a Vatican-hosted meeting on sustainability. This took place in 2014 under Pope Francis.

What happened after that meeting?

In a Science paper that followed, we pointed out that we need a moral revolution: solving climate change requires a fundamental shift in humanity's attitude towards each other and nature (P. Dasgupta and V. Ramanathan Science 345, 1457–1458; 2014). Faith leaders can make such a revolution happen. After the sustainability meeting, I had two minutes to give a summary to the Pope in the car park. I showed him that 50–60% of climate-warming pollution comes from the wealthiest people on the planet. The bottom 3 billion contribute just 5%, but will experience the worst effects of climate change. That appealed to the Pope. He asked what to do. I told him to ask people to be better stewards of the planet.

More here.

LOVE, TERROR, AND CIGARETTES

Joan Acocella at The New Yorker:

The German writer Gregor Hens smoked his first cigarette when he was five. His mother gave it to him. It was New Year’s Eve, and the Hens family, like many Germans, were out in the snow setting up fireworks. But they couldn’t light the fuses, because Gregor’s two older brothers were fighting over the lighter. Frau Hens finally lost patience: “She pulled out a cigarette, lit it and held it out to me.” Little Gregor took this wonderful thing and held it to the fuse of one of the rockets, which shot into the sky. Then he saw that the cigarette’s ember had ceased to glow. “You have to take a drag on it, my mother said out of the half-darkness.” He took a drag, the ember glowed again, and the child suffered a near-collapse from coughing and joy.

As Hens tells us in his memoir, “Nicotine” (Other; translated from the German by Jen Calleja), this experience eventually landed him with a decades-long addiction to nicotine. It also, he believes, gave him the beginnings of a personality: “I became myself for the first time.” He means this literally. In his mind, the entire episode—the coughing fit, his mother’s blue hat, his almost uncontainable pride in the fact that he, not his brothers, detonated the first rocket—comes together into a story, the first memory he has that is a story rather than just an image or a sensation. And, because he is a writer, he sees this birth of a story as the birth of his personality. How nice: to have the emergence of one’s self marked by a rocket exploding!

more here.

Carmen Herrera: Art Without Lies

Claire Messud at the NYRB:

When Carmen Herrera is asked to explain her paintings in Alison Klayman’s film about her life and work, The 100 Years Show (2015), she says, “If I could put those things in words, I wouldn’t do the painting. I would tell you…Usually artists are not the best people to talk about art. I think it’s a great mistake. You cannot talk about art—you have to art about art.”

Herrera, now 101 years old, has spent the better part of a century doing just that. For most of that time, the world paid scant attention to her “arting”: only in the last decade has she been granted the attention she should by rights have received half a century ago. As a Cuban, as a woman, she found little public support in the New York art world. Reluctant to be classified by national origin or gender, she struggled to find a place, but continued to produce work, prolifically, for decades: “I kept going,” she says in Klayman’s film. “I couldn’t stop.”

Herrera paints or draws daily even now, and her work has remained an exhilarating example of hard-edged abstraction. Inspired by Miró and Mondrian, a friend of Barnett Newman and of Leon Polk Smith, a young artist in post-war Paris at the same time as Ellsworth Kelly, she is an artist for whom the pure line remains the source of inspiration and joy (“I like straight lines…I like order. In this chaos that we live in, I like to put order,” she explained in a 1994 interview).

more here.

Vulgar Tongues: an Alternative History of English Slang

Lynne Truss at The New Statesman:

This is the trouble with books on slang. However exhaustive they are, they always leave you asking, “But why?” Max Décharné’s engaging book Vulgar Tongues is a spectacular feat, collating information from a mind-boggling range of sources – from jazz lyrics to dime novels, from 18th-century brothel directories to 1960s criminal autobiographies.

Take a word such as “chippie”, meaning whore. Décharné gives us a couple of quotations from Dashiell Hammett’s Red Harvest (1929) and Raymond Chandler’s The High Window (1942) – which is where you would expect him to find some. But his killer examples are the title of the jazz record “Chasin’ Chippies” by Cootie Williams and His Rug Cutters (1938) and an exchange from a 1960 Chester Himes novel set in Harlem, The Big Gold Dream:

“I was watching out for my girls,” Dummy replied.

“Your girls?”

“He’s got two chippie whores,” Grave Digger replied. “He’s trying to teach them how to hustle.”

more here.

Tuesday, January 3, 2017

Sir Tony Atkinson, economist and campaigner, 1944-2017

Chris Giles and Sarah O’Connor in the FT:

When academic economics was obsessed by free markets and a ruthless search for efficiency, while simply seeing societies as populated by multiple copies of one representative individual, the study and measurement of inequality was deeply unfashionable.

Sir Tony Atkinson, who died aged 72 on New Year’s Day, was the British economist who kept that flame alive through the 1980s and 1990s, surviving to see it return to the centre of economic concerns on both the political left and right.

For more than 50 years Atkinson battled for economics to take poverty and inequality seriously, crediting his interest in the subject to a stint of voluntary service working at a deprived hospital in Hamburg in the mid-1960s. From 1967 when he took up a fellowship at St John’s College, Cambridge, he dedicated himself to the theory and the practicalities of understanding differences in society.

Three aspects of Atkinson’s work stand out. First came his concern about understanding the causes and consequences of poverty. This was a practical passion, which led him to ask what had to change in policy to improve people’s lives.

The practicalities did not end in the don’s study or lecture theatre, however. As a social campaigner as well as academic, Atkinson also sought to improve people’s lives on the street — manning a stall in Brightlingsea market in the early 1970s, when a professor at nearby Essex university, to explain benefit entitlements to passers-by.

A second strand of his academic contribution was more theoretical. As editor of the Journal of Public Economics for 25 years, and often in partnership with Joseph Stiglitz, Nobel Prize-winning economist, he challenged the orthodoxy of free-market economics, providing answers to how to reform economies where markets are not working well and policy constraints are severe.

More here.

The whole philosophy community is mourning Derek Parfit. Here’s why he mattered.

Dylan Matthews in Vox:

Derek Parfit, who died at age 74 on Sunday evening, was not the most famous philosopher in the world. But he was among the most brilliant, and his papers and books have had a profound, incalculably vast impact on the study of moral philosophy over the past half century.

His work did not dwell on topics of merely academic interest. He wrote about big topics that trouble everyone, philosopher and layperson alike: Who am I? What makes me “me”? What separates me from other people? How should I weigh my desires against those of others? What do I owe to my children, and to the future in general? What does it mean for an action to be right or wrong, and how could we know?

Parfit was not a prolific author; he tended to write his books over the course of decades, refining them repeatedly after discussions with colleagues and students. In the end, he wrote only two: 1984’s Reasons and Persons, and 2011’s On What Matters, a two-volume, 1,440-page tome whose third volume is yet to be published. But both are classics, the latter generating such furious debate that a volume of essays discussing it was released two years before the book itself even came out (most of the key arguments had circulated in draft form for some time).

For an excellent overview of Parfit’s life and the major themes of his work, I highly recommend Larissa MacFarquhar’s beautiful and incisive New Yorker profile, published as On What Matters finally hit shelves. But perhaps the best way to experience Parfit’s writing, and understand why both his ideas and his method of articulating them proved so influential, is to dig into a few of his most important and fascinating arguments.

If there’s a single idea with which Parfit is most strongly identified, it’s the view that personal identity — who you are, specifically, as a person — doesn’t matter. This argument, made in the 1971 paper “Personal Identity” and in the third section of Reasons and Persons, is jarring at first, but his case is persuasive, and the implications are profound.

Parfit asks us to imagine that he is fatally injured in an accident, but his brain is mostly unharmed. His two brothers are also in the accident, and emerge brain-dead, but with otherwise healthy bodies. Doctors then split his healthy brain in half, and implant a half in each of his brothers’ bodies. “Each of the resulting people believes that he is me, seems to remember living my life, has my character, and is in every other way psychologically continuous with me,” Parfit writes in Reasons and Persons. “And he has a body that is very like mine.”

He then asks: What happened to Derek Parfit in all this? Did he die? That can’t be right; if anything, he doubled.

More here.

The Rules of the Game: A New Electoral System

Eric Maskin and Amartya Sen in The NY Review of Books:

Americans have been using essentially the same rules to elect presidents since the beginning of the Republic. In the general election, each voter chooses one candidate; each state (with two current exceptions) awards all its Electoral College votes to the candidate chosen by the largest number of voters (not necessarily a majority) in that state; and the president-elect is the candidate with a majority of Electoral College votes.

Primary elections for president have also remained largely unchanged since they replaced dealings in a “smoke-filled room” as the principal method for selecting Democratic and Republican nominees. In each state, every voter votes for one candidate. In some states, the delegates to the national convention are all pledged to support the candidate getting a plurality of votes (again, possibly less than a majority). In others, delegates are assigned in proportion to the total votes of the candidates.

These rules are deeply flawed. For example, candidates A and B may each be more popular than C (in the sense that either would beat C in a head-to-head contest), but nevertheless each may lose to C if they both run. The system therefore fails to reflect voters’ preferences adequately. It also aggravates political polarization, gives citizens too few political options, and makes candidates spend most of their campaign time seeking voters in swing states rather than addressing the country at large.
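The vote-splitting failure described here is easy to check with a toy preference profile. In this sketch (the ten-voter profile and the helper functions are illustrative, not from the article), candidates A and B each beat C in a head-to-head contest, yet C wins under plurality rule:

```python
from collections import Counter

# Hypothetical preference profile: each ballot ranks candidates best-first.
ballots = (
    [("C", "A", "B")] * 4   # 4 voters prefer C, then A, then B
  + [("A", "B", "C")] * 3   # 3 voters prefer A, then B, then C
  + [("B", "A", "C")] * 3   # 3 voters prefer B, then A, then C
)

def plurality_winner(ballots):
    """Each voter names only a first choice; most first-place votes wins."""
    tally = Counter(b[0] for b in ballots)
    return tally.most_common(1)[0][0]

def head_to_head(ballots, x, y):
    """Return whichever of x, y a majority of voters ranks above the other."""
    x_wins = sum(1 for b in ballots if b.index(x) < b.index(y))
    return x if x_wins > len(ballots) - x_wins else y

print(plurality_winner(ballots))        # C takes the plurality (4 of 10 first-place votes)
print(head_to_head(ballots, "A", "C"))  # yet A beats C one-on-one, 6 to 4
print(head_to_head(ballots, "B", "C"))  # and B beats C one-on-one, 6 to 4
```

Because A's and B's supporters split the anti-C majority, plurality elects the candidate a majority would reject in any pairwise contest; a majority-based rule over full rankings would not.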

There are several remedies. Perhaps in order of increasing chance of adoption, they are: (1) to elect the president by the national popular vote instead of the Electoral College; (2) to choose the winner in the general election according to the preferences of a majority of voters rather than a mere plurality, either nationally or by state; and, easiest of all, (3) to substitute majority for plurality rule in state primaries.

More here.

A Few Quick Bloggy Thoughts on the Sam Gold / Daniel Craig / David Oyelowo “Othello”

Isaac Butler in Parabasis:

Now that I've seen it, I’d like to briefly talk about a couple of aspects of Sam Gold’s recent, much-lauded production of Othello at New York Theatre Workshop and how it will hopefully come to influence how we produce Shakespeare. No, I’m not talking about the casting of movie stars, although both Daniel Craig’s and David Oyelowo’s performances were excellent. I’m talking instead about the way the production, through sheer excellence, makes a great case against both over-conceptualization and the recent controversial efforts to “translate” Shakespeare plays to make them more accessible to contemporary audiences.

If you’ve seen this Othello, you might balk at my saying that it resists over-conceptualization. This Othello was very directed, and the design was extremely present. The production transformed New York Theatre Workshop into a wood-lined barracks, with only a few light sources (none of them traditional theatrical light). When the audience entered, a soldier sat on stage playing Guitar Hero. The first scene took place almost entirely in the dark. The costumes looked deliberately unfinished. The cast sang “Hotline Bling” at one point. Two cast members besides Oyelowo were Black, shifting the focus away from Othello’s racial otherness and towards other themes.

Yet the production avoided literal conceptualizing. This play did not literally take place during the Iraq war, but it gestured at it. It did not literally take place in a barracks, but used elements of a barracks as scenery. Rather than taking us to a specific place or time, the design choices instead aimed to create the right environment for this interpretation of the text, an interpretation that was focused on war and its effects on masculinity.

More here.

The End of Progressive Neoliberalism

Nancy Fraser in Dissent:

Trump’s victory is not solely a revolt against global finance. What his voters rejected was not neoliberalism tout court, but progressive neoliberalism. This may sound to some like an oxymoron, but it is a real, if perverse, political alignment that holds the key to understanding the U.S. election results and perhaps some developments elsewhere too. In its U.S. form, progressive neoliberalism is an alliance of mainstream currents of new social movements (feminism, anti-racism, multiculturalism, and LGBTQ rights), on the one side, and high-end “symbolic” and service-based business sectors (Wall Street, Silicon Valley, and Hollywood), on the other. In this alliance, progressive forces are effectively joined with the forces of cognitive capitalism, especially financialization. However unwittingly, the former lend their charisma to the latter. Ideals like diversity and empowerment, which could in principle serve different ends, now gloss policies that have devastated manufacturing and what were once middle-class lives.

Progressive neoliberalism developed in the United States over the last three decades and was ratified with Bill Clinton’s election in 1992. Clinton was the principal engineer and standard-bearer of the “New Democrats,” the U.S. equivalent of Tony Blair’s “New Labour.” In place of the New Deal coalition of unionized manufacturing workers, African Americans, and the urban middle classes, he forged a new alliance of entrepreneurs, suburbanites, new social movements, and youth, all proclaiming their modern, progressive bona fides by embracing diversity, multiculturalism, and women’s rights. Even as it endorsed such progressive notions, the Clinton administration courted Wall Street. Turning the economy over to Goldman Sachs, it deregulated the banking system and negotiated the free-trade agreements that accelerated deindustrialization. What fell by the wayside was the Rust Belt—once the stronghold of New Deal social democracy, and now the region that delivered the electoral college to Donald Trump. That region, along with newer industrial centers in the South, took a major hit as runaway financialization unfolded over the course of the last two decades. Continued by his successors, including Barack Obama, Clinton’s policies degraded the living conditions of all working people, but especially those employed in industrial production. In short, Clintonism bears a heavy share of responsibility for the weakening of unions, the decline of real wages, the increasing precarity of work, and the rise of the two-earner family in place of the defunct family wage.

More here.

A Gut Makeover for the New Year

Roni Caryn Rabin in The New York Times:

If you’re making resolutions for a healthier new year, consider a gut makeover. Refashioning the community of bacteria and other microbes living in your intestinal tract, collectively known as the gut microbiome, could be a good long-term investment in your health. Trillions of microbial cells inhabit the human body, outnumbering human cells by 10 to one according to some estimates, and growing evidence suggests that the rich array of intestinal microbiota helps us process nutrients in the foods we eat, bolsters the immune system and does all sorts of odd jobs that promote sound health. A diminished microbial ecosystem, on the other hand, is believed to have consequences that extend far beyond the intestinal tract, affecting everything from allergies and inflammation to metabolic diseases like diabetes and obesity, and even mental health conditions like depression and anxiety.

Much of the composition of the microbiome is established early in life, shaped by forces like your genetics and whether you were breast-fed or bottle-fed. Microbial diversity may be further undermined by the typical high-calorie American diet, rich in sugar, meats and processed foods. But a new study in mice and people adds to evidence that suggests you can take steps to enrich your gut microbiota. Changing your diet to one containing a variety of plant-based foods, the new research suggests, may be crucial to achieving a healthier microbiome. Altering your microbiome, however, may not be easy, and nobody knows how long it might take. That’s because the ecosystem already established in your gut determines how it absorbs and processes nutrients. So if the microbial community in your gut has been shaped by a daily diet of cheeseburgers and pepperoni pizza, for example, it won’t respond as quickly to a healthy diet as a gut shaped by vegetables and fruits that has more varied microbiota to begin with.

More here.

How the Trolley Problem Explains 2016

Clio Chang in The New Republic:

Imagine that you’re a rude teen hanging near some trolley tracks, kicking around rocks, when you look up and notice that five people are tied to the tracks. A trolley suddenly appears, careening towards them. You can save the five people by pulling a lever, thus diverting the trolley to a different set of tracks where another person (dang! Just your luck!) is tied up. Do you pull the lever, killing one to save five?

This thought experiment, devised by philosopher Philippa Foot in 1967, along with its sadder partner dilemma, the Fat Man (instead of flipping a switch, you have to push a fat man off of a bridge to stop the trolley from hitting the five people), has found a resurgence in 2016, becoming what Brian Feldman has termed the “Internet’s Most Philosophical Meme.” The Facebook page Trolley Problem Memes has grown to over 170,000 likes since its inception earlier this year; newly added memes attract thousands of likes and hundreds of in-depth comments (the inside joke “multi-track drifting” inevitably comes up, which riffs off of another meme, the gory details of which are way beyond the scope of this piece). You can now buy t-shirts that say “I pulled the lever” or “I choose you to be my ethical dilemma” with a picture of the trolley on it. If something happened this year, you can bet that there is a trolley problem for it.

More here.

The Art and Life of Louise Bourgeois by Robert Storr

A roundtable discussion at Bookforum:

Christopher Lyon: First I'd like to say something about the book we're here to discuss. This 828-page tome on the art and life of Louise Bourgeois, who was born in 1911 and died in 2010, is the product of some thirty years of work on Robert Storr's part. It comprehensively surveys Bourgeois's career as an artist, which spanned nearly seventy-five years, with more than nine hundred illustrations. Chapters relating Bourgeois's life and analyzing her creative achievement alternate with portfolios, in chronological sequence, that show the unfolding of her oeuvre. The final chapter is a coda that details Rob's close and complicated relationship with his subject, beginning in the early 1980s. It is, and probably will remain, the definitive monograph on Louise.

Robert Storr: I don't think there's such a thing as a definitive book, that's part of my point. It will be the first essai at making a comprehensive book. I should just say in parentheses that the fact it exists at all is very much to the credit of Chris, who has stayed with this project long past the patience of most mere mortals. In terms of design, production, the whole thing.

CL: Thank you for saying that. I thought it would be good if we could pull everybody into the conversation at the beginning. You're all familiar with Louise's work and I wondered to what extent this book confirmed, challenged, or surprised you.

More here.