Donna Huanca, Poly Styrene's Braces, 2015. Performance view.
Curated by Anne Barlow, Art in General, New York, in collaboration with kim? Contemporary Art Centre, Riga, LV.
by Brooks Riley
Stacey Balkan in Public Books:
In the autumn of 1839, an unusually strong tropical storm devastated coastal communities along the Bay of Bengal in what was then the English East India Company’s premier settlement. A decade later, Company merchant and sometime scientist Henry Piddington coined the term “cyclone” to describe this climatological phenomenon, taking a cue from the seaborne storm’s circular movement and eerily hollow center, or “eye.” So common are cyclones in that part of the world that when a tornado—a typically smaller, terrestrial storm—ravaged the land-locked city of New Delhi in 1978, local newspapers erroneously identified the storm as a cyclone.
Amitav Ghosh was a graduate student at the time of the tornado and recounts its aftermath in a new monograph entitled The Great Derangement: Climate Change and the Unthinkable. His first book-length work of nonfiction in decades, The Great Derangement began as a series of lectures delivered last autumn at the University of Chicago. Focusing in part on fictional representations of climate change, the author begins by addressing the bewildering absence of such storms from what he calls the “mansions” of “serious” fiction—an egregious oversight, he argues, given the proliferation of similarly catastrophic storms like Hurricane Sandy. Ultimately he asks: “Is climate change [simply] too wild a stream to be navigated in the accustomed barques of narration?”
More here.
Julia Belluz in Vox:
Over my years in health journalism, I’ve debunked many dubious claims. I’ve discussed how to cover quacks like Dr. Oz and the Food Babe, and how to navigate a medical world so filled with hooey it can make your head spin.
But I wasn’t always fluent in the ways of detecting bull. My eyes were opened in my early 20s, when I met a group of researchers at McMaster University in Canada. They taught me about the limitations of different kinds of evidence, why anecdotes are often wildly misleading, and what a well-designed study looks like. This experience changed how I see the world.
I’ve often wondered why these concepts aren’t taught in schools. We are bombarded with health claims — in the news, on TV, in magazines, at the doctor’s office or the pharmacy — and many of us lack the basic skills to navigate them.
That’s why I found this giant new trial, which is just wrapping up now in Uganda, so compelling. Its mission, according to Sir Iain Chalmers, the Cochrane Collaboration co-founder who’s co-leading it, is to teach children to “detect bullshit when bullshit is being presented to them.”
More here.
Adam Briggle and Robert Frodeman in The Guardian:
In a widely discussed recent essay for The New Atlantis, the policy scholar Daniel Sarewitz argues that science is in deep trouble. While modern research remains wondrously productive, its results are more ambiguous, contestable and dubious than ever before. This problem isn’t caused by a lack of funding or of scientific rigour. Rather, Sarewitz argues that we need to let go of a longstanding and cherished cultural belief – that science consists of uniquely objective knowledge that can put an end to political controversies. Science can inform our thinking; but there is no escaping politics.
Sarewitz, however, fails to note the corollary to his argument: that a change in our expectations concerning the use of science for policy implies the need to make something like philosophical deliberation more central to decision making.
Philosophy relevant? We had better hope so. Because the alternative is value fundamentalism, where rather than offering reasons for our values, we resort to dogmatically asserting them. This is a prescription for political dysfunction – a result increasingly common on both sides of the Atlantic.
More here.
Ryan Lizza in The New Yorker:
The man behind this new message is Steve Bannon, who became the C.E.O. of the Trump campaign in August. Bannon is on leave from Breitbart, the right-wing news site where he served as executive chairman, and where he honed a view of international politics that Trump now parrots. Bannon, who is sixty-two, is new to right-wing rabble-rousing, compared to someone like Stone. Bannon was raised in a blue-collar Democratic family around Norfolk and Richmond, Virginia. He served in the Navy, went to the Harvard Business School, and became wealthy as a mergers-and-acquisitions deal-maker for Goldman Sachs, in the nineteen-eighties. He made a fortune by buying a share of the royalties for “Seinfeld” back in 1993, and receives them to this day. Bannon met Andrew Breitbart, the founder of the news Web site, when Bannon was financing conservative documentaries in Los Angeles in the aughts. Breitbart, who previously worked with the Drudge Report, started Breitbart in 2005 as a conservative news aggregator, much like his former employer. In the fall of 2009, Bannon and Breitbart worked together on a business plan to launch a more ambitious version of the site, and Bannon joined its board in 2011, once the financing deal closed. When Andrew Breitbart died, in 2012, Bannon became executive chairman and took over the site. Back then, Breitbart was a pugnacious but still recognizably conservative site, but, with Bannon in charge, its politics started to change.
Bannon embraced the growing populist movement in America, including the “alt-right,” a new term for white nationalists, who care little about traditional conservative economic ideas and instead stress the need to preserve America’s European heritage and keep out non-whites and non-Christians. Under Bannon, Breitbart promoted similar movements in Europe, including the United Kingdom Independence Party, the National Front in France, Alternative for Germany, and the Freedom Party in the Netherlands. Bannon likes to say that his goal is “to build a global, center-right, populist, anti-establishment news site.” After the election is over, Breitbart, which has offices in London and Rome, plans to open up new bureaus in France and Germany.
More here.
Matt Taibbi in Rolling Stone:
The first symptom of a degraded aristocracy is a lack of capable candidates for the throne. After years of indulgence, ruling families become frail, inbred and isolated, with no one but mystics, impotents and children to put forward as kings. Think of Nikolai Romanov reading fortunes as his troops starved at the front. Weak princes lead to popular uprisings. Which brings us to this year's Republican field.
There wasn't one capable or inspiring person in the infamous “Clown Car” lineup. All 16 of the non-Trump entrants were dunces, religious zealots, wimps or tyrants, all equally out of touch with voters. Scott Walker was a lipless sadist who in centuries past would have worn a leather jerkin and thrown dogs off the castle walls for recreation. Marco Rubio was the young rake with debts. Jeb Bush was the last offering in a fast-diminishing hereditary line. Ted Cruz was the Zodiac Killer. And so on.
The party spent 50 years preaching rich people bromides like “trickle-down economics” and “pulling yourself up by your bootstraps” as solutions to the growing alienation and financial privation of the ordinary voter. In place of jobs, exported overseas by the millions by their financial backers, Republicans glibly offered the flag, Jesus and Willie Horton.
More here.
Video length: 46:55
Steven Rose in The Guardian:
Yet another book about consciousness? These days it seems no self-respecting neuroscientist should be without at least one book-length stab at explaining how the brain enables that most central, if elusive, feature of what makes us human. This is Susan Greenfield’s second. Yet, as she reminds us, it has only been in the last few decades that consciousness studies, once regarded as the province of philosophers, and off-limits for neuroscience, has become a cottage industry for brain researchers, oblivious to the sceptics who joke that the initiator of this new wave was an anaesthesiologist, Stuart Hameroff, whose day job ought surely to be elucidating the processes through which people become unconscious.
This origin may help explain why many brain researchers have such a narrow definition of consciousness, understood by Greenfield, in common with her many peers, as what we retain while awake and lose while asleep or anaesthetised. Such a restricted description raises many questions about this protean term. Can there be consciousness in the abstract, distinct from being conscious of something? Awareness is only one of the several meanings the OED ascribes to consciousness, including self-knowledge and, to me the most important, “the totality of the impressions, thoughts, and feelings, which make up a person’s conscious being”. Neuroscientists are rarely trained in philosophy, but a little modesty might not go amiss. Some committed reductionists among them maintain that consciousness is merely a “user illusion” – that you may think you are making conscious decisions but in “reality” all the hard work is being done by the interactions of nerve cells within the brain. Most, however, are haunted by what their philosophical sympathisers call the “hard problem” of the relationship between objective measures – say of light of a particular wavelength – and qualia, the subjective experience of seeing red.
More here.
Julian Hanna reviews Stefany Anne Golberg and Morgan Meis's Dead People in 3:AM Magazine:
What makes a life noteworthy and important? What makes a good life? And when a life ends, what constitutes a good summing up, a worthy eulogy? How can a writer, pressed for time, do justice to a life – a great life, presumably – within the hackneyed confines of a two-page obituary? How does one attempt to revive such a dead form?
Or if not dead, then at least resting – unsung, taken for granted – like the manifesto before Marx and Marinetti, when it meant simply a straightforward declaration of intent, no “spectre haunting Europe”, no “courage, audacity, and revolt”. The standard obituary form is: so-and-so was born, rose (usually struggled) to greatness, and died. There is almost a sense that words fail in the face of death, so it is best just to state the facts. How do you breathe life into such a predictable story?
In Dead People, Stefany Anne Golberg and Morgan Meis show us one approach to reinvigorating the form. The collection of twenty-nine obituaries has a provocative cover that makes it great fun to read on the metro. The obituaries were written mainly for The Smart Set over the past decade (with a few exceptions drawn from n+1 and The New Yorker), and the fact that nearly all appeared in a single venue contributes to the high degree of intimacy in the telling of each life. The authors strike a tone of late-night candour, loose and flowing in the warm glow of the third or fourth drink. But the easy style belies a deeper engagement, honesty about the subject, and a willingness to deal with difficult themes. “We’ve chosen to take these lives personally”, the authors declare in the preface. What Golberg and Meis achieve for the most part is an effortless distillation, boiling down the essence of a public figure’s achievements. The big idea, the breakthrough, the one thing that makes an indelible mark on the culture – this is what we are shown in each brief life.
More here.
Daniel Larison in The American Conservative:
The Wall Street Journal publishes another shameless pro-Saudi editorial. This part stood out:
A Saudi air strike last week mistakenly killed civilians at a funeral in Yemen, and the White House is now leaking that Mr. Obama is rethinking U.S. support for the Yemen campaign. But the U.S. has made similar targeting errors in many conflicts, and Saudi bombing won’t get more precise if the U.S. bugs out. The U.S. ought to be helping the Saudis with enough support that they can win in Yemen.
Even by the WSJ‘s standards, this is an insane position to take. The funeral massacre last week was obviously not carried out by “mistake.” The coalition repeatedly hit the same target to maximize casualties, and it chose the target because many high-ranking political and military figures were in attendance. The coalition wanted to hit the target, and it did so several times in a row. They weren’t concerned about the hundreds of civilians killed and injured in the process, and it is absurd to claim that they were. When presented with an obvious atrocity committed by a U.S. client, the WSJ predictably ignores the evidence and insists on even more aggressive support for the offending government.
There is also no realistic prospect of a Saudi victory in Yemen. The coalition’s original goals were to drive the Houthis out of the capital and reinstate Hadi. Even if the coalition could somehow manage to do the former, it would come at an excruciating cost for the civilian population. The latter goal has always been hopeless. Hadi had scant domestic support when the war began, and now he has none. There is no chance that the coalition can “win,” and so it makes absolutely no sense for the U.S. to increase support for the war in the hopes that they do. Yemen has been wrecked and its people brought to the brink of famine because of the foolish belief that the Saudis could “win.”
More here.
Nicola Twilley in Aeon:
Readers of Hacker News, a website popular with programmers and tech entrepreneurs, were the first to latch on to Rhinehart’s Soylent post, encouraging him to share the recipe online. When he did, it quickly spawned an animated Reddit thread in which DIY Soylent adopters reviewed recipes, discussed magnesium sourcing, and compared bowel movements. Within three months, Rhinehart decided that demand was sufficient for him to quit his tech start-up and form his own company in order to supply Soylent to the masses. By the time Soylent 1.0 started shipping in May 2014, the company had already accumulated a backlog of more than 20,000 pre-orders, adding up to more than $2 million in sales and – at a conservative estimate – a collective saving of 2,875 years.
What, one wonders, are people doing with all this extra time? Will we see a new Renaissance: a Soylent-fuelled flowering of novels, art or, at the very least, apps? It is perhaps too early to tell, but early signs are mixed. Rhinehart has ploughed his 90 minutes a day into launching his company, and says he still has ‘a long reading list, a long online course list, a lot of personal projects I’d like to do’. He is not against using the time for relaxation, of course, and tells me that he’s heard from other early adopters that they spend an extra hour and a half watching TV, hanging out with friends and family, or just catching up on our pervasive national sleep deficit.
‘Just giving people a little more time in general is something the United States really needs,’ he told me. ‘However you use that time is up to you.’
My own experience bodes less well. I lived on Soylent for five days (Rhinehart sent me a week’s supply, but I cracked early) and I was indeed painfully aware of vast open periods that I would have typically spent planning, shopping for, making, enjoying and cleaning up after meals. Much to my editor’s disappointment, I spent all that extra time joylessly clicking around on the internet, my brain resisting every effort to corral it into more productive activities as if it knew it was being cheated of an expected break. (My editor kindly pointed out that this might be more of a reflection of my own personal failings than a shortcoming inherent to Soylent.)
Of course, this is not the first time Americans have been promised relief from the time-suck of food preparation. Today’s Soylent craze has its roots in the post-Second World War embrace of convenience foods. And, then as now, the range of possible uses for that saved time ranged from the trivial to the substantial – but with a much more gendered twist.
More here.
David Treuer in the Los Angeles Times:
Thesis: The idea behind Ian McEwan’s new novel, “Nutshell”— an imagining of the events leading up to “Hamlet” (Gertrude and Claudius plotting to and then going ahead with killing King Hamlet) set in modern London, with the three principal parts played by John and Trudy and Claude (poet, trophy wife and scheming real estate-dealing brother, respectively) and narrated by Trudy and John’s unborn child in utero — is quite possibly the worst idea for a novel ever.
Antithesis: In McEwan’s hands, the hugely improbable feat of convincingly narrating a novel in the first person from the point of view of a fetus comes alive because McEwan is such a good writer, someone who drives his prose toward the impossible (and has done so throughout his career) in ways that continue to surprise. And a fetus is the perfect spokesling for the as-yet-unexplored terrain leading up to the bloodbath that is “Hamlet”: How can two people, both of whom must have loved another, conspire to kill another (a spouse), get away with it, and then how do they suffer for what their imaginations made them do? This is the stuff of very good literature.
Discussion: The novel (McEwan’s 17th book) begins with the unnamed fetus, in a state of agitated suspension, uncomfortably crammed in his mother’s uterus some weeks before his inevitable eviction. Nabokov observed that many novels lurch forward because of eavesdropping (young Marcel spies on the music teacher’s daughter Mademoiselle Vinteuil during a lesbian frolic; the hedges and closets and hallways and fountains of mountain spas are constructed, it seems, to provide Lermontov’s characters with ways of knowing that which they shouldn’t or can’t otherwise). What better place to eavesdrop on a dastardly plot than from the comfort of someone’s womb? That’s what the fetus does from the start: He wombdrops on his mother and her boyfriend as they discuss the fetus’ father, John.
More here.
Ethan Siegel in Forbes:
Every moment that passes finds us traveling from the past to the present and into the future, with time always flowing in the same direction. At no point does it ever appear to either stand still or reverse; the “arrow of time” always points forwards for us. But if we look at the laws of physics — from Newton to Einstein, from Maxwell to Bohr, from Dirac to Feynman — they appear to be time-symmetric. In other words, the equations that govern reality don’t have a preference for which way time flows. The solutions that describe the behavior of any system obeying the laws of physics, as we understand them, are just as valid for time flowing into the past as they are for time flowing into the future. Yet we know from experience that time only flows one way: forwards. So where does the arrow of time come from?
Many people believe there might be a connection between the arrow of time and a quantity called entropy. While most people normally equate “disorder” with entropy, that’s a pretty lazy description that also isn’t particularly accurate. Instead, think about entropy as a measure of how much thermal (heat) energy could possibly be turned into useful, mechanical work. If you have a lot of this energy capable of potentially doing work, you have a low-entropy system, whereas if you have very little, you have a high-entropy system. The second law of thermodynamics is a very important relation in physics, and it states that the entropy of a closed (self-contained) system can only increase or stay the same over time; it can never go down. In other words, over time, the entropy of the entire Universe must increase. It’s the only law of physics that appears to have a preferred direction for time.
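In symbols (an editorial gloss in standard textbook notation, not a quotation from Siegel's article), the second law for a closed system reads:

% Entropy S of a closed system never decreases as time t advances.
\[
\frac{dS}{dt} \geq 0
\qquad \text{equivalently} \qquad
\Delta S = S(t_2) - S(t_1) \geq 0 \quad \text{for } t_2 > t_1 ,
\]

and this inequality is the lone time-asymmetric statement among the otherwise time-symmetric laws listed above.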
More here.
Julia Felsenthal in Vanity Fair:
At the beginning of Future Sex, Emily Witt’s probing investigation into 21st-century female sexuality, the author is single, 30, and not thrilled about it. She occasionally has sex with men she knows, friends, and friends of friends, casual entanglements that she dismisses as distractions. She and her partners are “souls flitting through limbo, piling up against one another like dried leaves, awaiting the brass trumpets and wedding bells of the eschaton.” Witt feels keenly that she’s missing out on the kind of committed monogamous partnership that had always seemed part and parcel of adulthood, reward for a life of rules followed. “I nurtured my idea of the future,” she writes, “which I thought of as the default denouement of my sexuality, and a destiny rather than a choice. The vision remained suspended, jewel-like in my mind, impervious to the storms of actual experience, a crystalline point of arrival.”
By the book’s end, Witt is several years older and in a different headspace. “I now understood the fabrication of my sexuality,” she writes, “I saw the seams of its construction and the arbitrary nature of its myth.” Her circumstances aren’t markedly changed; the evolution is psychic and semantic. “I knew,” she writes, “that naming sexual freedom as an ideal put the story I told myself about my life in greater alignment with the choices I had already made. It offered continuity between my past and the future. It gave value to experiences that I had viewed with frustration or regret.”
It’s a subtle shift, but the experiences that catalyze it are not so subtle. Future Sex, as the title suggests, takes Witt to the furthest extremes of the erotic vanguard on a quest to establish the contours of her own sexuality, and of female sexuality more generally, in an age of Internet dating and abundant, diverse pornography, of delayed reproduction and more open relationships.
More here.
Why is math so good at describing the world? Video length: 53:05
John Fabian Witt in the New York Times:
Once in every great while, nature and nurture combine in a single person the qualities of erratic genius, herculean work ethic and irrepressible ambition. Think of Picasso in art, Ali in boxing or Roth in literature. Add a penchant for provocation untethered to the constraints of conventional human interaction and you get, in the law, Judge Richard Posner of the United States Court of Appeals for the Seventh Circuit in Chicago.
In the past half-century there has been no figure more dominant or more controversial in American law than Posner. He has written more than 50 books, over 500 articles and nearly 3,000 majority opinions for his court. Not even Supreme Court Justice Oliver Wendell Holmes Jr. — to whom he is often compared — matches his productivity and range.
William Domnarski’s biography, the first such book on Posner, draws on extensive interviews and on access to Posner’s correspondence at the University of Chicago. “Richard Posner” portrays a man who aims self-consciously to be (in his words) a “Promethean intellectual hero,” remaking the world of the law by sheer will. The questions Domnarski asks are, What makes this extraordinary character tick — and to what end?
More here.
The woman is about hair
gathering on the ground and between the breasts
that move up and down with each breath
in suffering.
In twenty years I will exist.
Even if i’m dead in twenty years I will exist
more than I do now.
I shave my legs in the shower
until my ass goes numb.
The water gathers all of me around
and says “that’s what you get”
the same way men say
“that’s just how the world works”
as if they’re happy about it.
I make a prayer for you in front of the closet mirror
where the light from inside moves
around the room to see itself reflected.
The woman sees herself in everything and nothing.
You can open the news and read
anything you want to.
That’s the magic of being alive here.
You can even read about yourself
long after you’re dead.
Joshua Jennifer Espinoza
from Feminist Wire, July 2015
Maria Konnikova in The New York Times:
In 1993, a few years after the success of his firm’s ad campaign that introduced the Apple Mac, the head of the Chiat/Day agency, Jay Chiat, decided that it was time to have a workplace that matched the verve of the agency’s advertising. In Los Angeles, he commissioned Frank Gehry to create a place that was “playful, zany and stylish”: no cubicles, no offices, no traditional desks. Space was filled with a four-story statue of a pair of binoculars and pods from old fairground rides where “people would sit together . . . and think creative thoughts.” In New York, Chiat tasked Gaetano Pesce with much the same vision, resulting in murals of red lips, chairs with springs for feet and a floor in front of a bathroom that would raise many an eyebrow in today’s trigger-warning culture (a picture of a man urinating). When Frank Duffy, an architect who is no stranger to innovation in office design himself (he is credited as a founder of Bürolandschaft, the office-landscaping movement), saw the project, he said, “Perhaps its gravest weakness is that it is a place where ‘play’ is enforced on everyone, all the time.”
That statement is at the heart of “Messy: The Power of Disorder to Transform Our Lives,” the latest book from the economist-turned-journalist Tim Harford. At first, it seems an odd comment to include. After all, isn’t the Chiat/Day approach the quintessence of mess — disrupting the staid old office style, pivoting in a new creative direction, or any of those other business clichés? But it’s actually perfect. Because the mess Harford has in mind is less physical than psychical. It’s not that disruption is inherently good, or that we should strive actually to be messy — unconstrained by desks or real work spaces, free to roam and think, surrounded by playful towers of stuff in stubborn defiance of Kondo-ization. It’s that rigid rules are bad, whether they err on the side of too much mess or too little. Rigidity disempowers people. In telling us to be messy, Harford urges us to recapture our autonomy. A less catchy, but perhaps more accurate, title for the book would be “Control: The Power of Autonomy and Flexibility.”
More here.
Roger Scruton in City Journal:
During the 1960s and 1970s, the consensus in Western academic and intellectual institutions was very much on the left. Writers like Michel Foucault and Pierre Bourdieu shot to eminence by attacking the civilization they dismissed as “bourgeois.” The critical-theory writings of Jürgen Habermas achieved a dominant place in the curriculum in the social sciences, despite their stupefying tediousness. The rewriting of national history as a tale of “class struggle,” undertaken by Eric Hobsbawm in Britain and Howard Zinn in the United States, became a near-orthodoxy not only in university history departments but also in high schools. For us dissidents, it was a dispiriting time, and there was scarcely a morning when I did not wake up during those years, asking myself whether my teaching at the University of London was the right choice of career. Then came the collapse of Communism in Eastern Europe, and I allowed myself to hope.
For a while, it looked as though an apology might be forthcoming from those who had devoted their intellectual and political efforts to whitewashing the crimes of the Soviet Union or praising the “people’s republics” of China and Vietnam. But the moment proved short-lived. Within a decade, the Left establishment was back in the driver’s seat, with Zinn and Noam Chomsky renewing their intemperate denunciations of America, the European Left regrouped against “neoliberalism” (the new name for the free economy) as though this had been the trouble all along, Habermas and Ronald Dworkin collecting prestigious prizes for their barely readable defenses of ruling leftist platitudes, and the veteran Marxist Hobsbawm rewarded for a lifetime of unswerving loyalty to the Soviet Union by his appointment as “Companion of Honour” to the Queen.
True, the enemy was no longer described as before: the Marxist template did not easily fit the new conditions, and it seemed a trifle foolish to champion the cause of the working class, when its last members were joining the ranks of the unemployable or the self-employed. But one thing remained unchanged in the wake of Communism’s collapse: the conviction that it was unacceptable to be on the “right.”
More here.