An unexpected rain of spiders led to a lovely Twitter geek-out between astronomers and arachnologists

Ed Yong in The Atlantic:

Last Wednesday, a spider fell onto Jamie Lomax’s laptop. Two days later, it happened again. Soon enough, several spiders were crawling across the ceiling of her office. “It was a little unnerving,” says Lomax, who’s an astronomer at the University of Washington. “I’m not scared of spiders but if someone else wants to take care of the spider in a room, I’ll gladly let them do it over me. And I don’t really want them raining down on my head.”

Lomax identified the abseiling arachnids as zebra jumping spiders, and tweeted about her experiences with the hashtag #ItIsRainingSpidersNotMen. And after considering options including “nukes and fire,” she settled for notifying her university. They sent over an exterminator, who failed to find any lingering spiders within the ceiling. He figured that a nest had probably hatched, and the newborn spiders had scattered. “But a couple of hours later, there were still spiders everywhere,” she tells me. “As of yesterday, there still were.”

Meanwhile, fellow astronomer Alex Parker had read Lomax’s tweets. “Have you tried lasers?” he replied. “Seriously though, some jumping spiders will chase laser pointers like cats do.”

There are, indeed, many YouTube videos of them doing exactly that. But Emily Levesque—Lomax’s colleague, with an office two doors down—wanted to see it for herself. “She has a laser pointer and she happens to be the only other person with spiders in her office,” says Lomax. “She ran down to me and said: You have to see this.”

More here.

Mohammed Hanif: Not All Attacks Are Created Equal

Mohammed Hanif in the New York Times:

There is a sickeningly familiar routine to terrorist attacks in Pakistan. If one happens in your city, you get a text message or a phone call asking if you are O.K. What happened? you ask. From that, the caller concludes that you are O.K. Then you turn on the TV and watch the screen zoom in on a Google map or an animated blast before cameras reach the scene and start beaming images of bloodied slippers.

Last Saturday, I went through the same routine during a stay in London. It was a friend in Pakistan who alerted me by text message about the attacks here. As I looked for the TV remote, I got another message from him. “Did you ever think you’d hear about London from Pakistan?”

I found the observation slightly upsetting. I wanted to write back: “You are sitting peacefully in your home in Islamabad. This is not the time to be ironic. There is no irony in carnage.” I didn’t reply, and instead got busy trying to track down my son, who happened to be in the area near where the attack happened. Last year, I sent him off to university in London, calculating this was a safer place than home in Karachi.

After I found out that my son was all right, I had time for ironic reflection.

In the weeks during which a concert in Manchester and a lively neighborhood of London were struck by terrorists, a dozen people were killed at two sites in Tehran, an ice cream parlor was blown up in Baghdad, a single bombing killed some 90 people in Kabul, then more Afghans died during protests about that attack and then still more Afghans died in another attack at a funeral for people killed during the protests.

More here.

“Saying something is a miracle is a failure of imagination”

JP O'Malley in New Humanist:

For over half a century, the philosopher and scientist Daniel Dennett has sought to answer two questions: how come there are minds? And how is it possible for minds to answer this question? The short answer, Dennett argues, is that minds evolved – a proposition that has implications for the major cultural and scientific debates of our time. His new book, “From Bacteria to Bach and Back” (Allen Lane), is his most thorough exploration of the territory yet, drawing on ideas from computer science and biology.

How exactly would you define consciousness?
Defining it is not a useful activity. Consciousness isn’t one thing. It’s a bunch of things. So I wouldn’t want to fall into the trap of giving a strict definition.

Okay, let’s put it another way: what does consciousness consist of?
It consists of all the thoughts and experiences that we can reflect on and think about. We also know that there are lots of things in us that happen unconsciously, too.

Why are people resistant to seeing the mind described in the computational terms you use?
Because they are afraid that this method of thinking about the mind will somehow show that their minds are not as wonderful as they thought they were. On the contrary, my approach shows, I think, that minds are even more wonderful than what we thought they were. Because what minds do is stupendous. There are no miracles going on, but it’s pretty amazing. And the informed scientific picture of how the mind works is just ravishingly beautiful and interesting.

What is your main disagreement with the Cartesian view that mind and body are separate?
That it predicts nothing. And it postulates a miracle. Saying something is a miracle is basically just deciding that you are not even going to try. It’s a failure of imagination.

Why was the arrival of language such an important moment in the development of human beings?
Other mammals and vertebrates can have social learning and elements of culture. They can have local traditions, which are not instincts; instincts are carried by the genes. Traditions are carried by organisms imitating their elders, for instance. There are the rudiments of cultural accumulation, or cultural exploitation, in chimpanzees, and in birds, for example. But it never takes off. Let’s be generous and say there might be a dozen things they pass on by imitation. But we humans have hundreds of thousands of things that we pass on that we don’t have to carry in our genes. It’s language that makes all of that possible.

Do the minds of humans differ from the minds of other animals because of culture?
Yes, I think that’s the main reason for it. Our brains are not that different from chimpanzees’ brains. They are not hugely bigger, they are not made of different kinds of neurons and they don’t use much more energy, or anything like that. But they have a lot of thinking tools that chimpanzees don’t have. And you can’t do much thinking with your bare brain. You need thinking tools, which fortunately we humans don’t have to build for ourselves. They have already been made for us.

More here.

I saw Wonder Woman, and you should, too

Caperton in Feministe:

I saw Wonder Woman last night, and here’s the completely spoiler-free part of my review: You should go see it. We saw it in IMAX 3D, and I would pay IMAX 3D money to see it that way again. The Boy said he would just as happily have watched it at home after it hits Redbox, but he still liked it and thought it was cool. The important thing is that the action was great, the story was believable and touching, the characters were three-dimensional, the character arcs were compelling, and there were some parts where I teared up. (I also teared up at the commercial where the parents put the lion’s mane on the dog, so take that as you will.) It’s really good. You should see it. I’m serious, you should. Thus ends my completely spoiler-free review. The next section might get mildly spoilery, and the last part extremely so, so be warned.

Wonder Woman was a really good movie.

Patty Jenkins had a lot on her plate. She didn’t have to just make a good movie. She had to make the Wonder Woman movie that so many have called for and so few have had the guts to take on. She had to do justice to a character who has inspired countless girls and women. And on top of that, she had to create the Platonic ideal of a superhero movie, because of course if a woman-led movie gets even a marginally bad response, it’s taken as proof that studios shouldn’t waste their money on women at all. So no pressure there.

Wonder Woman did it. Jenkins and her cast and crew have made a movie that has broken box-office records, gotten rave reviews, and been loved by Wonder Woman’s longtime fans. Heretofore unrevealed Wonder Woman fans have been popping up on my social media feeds like mushrooms, in well-loved Wonder Woman t-shirts. So I guess it’s pretty good, or something. People seem to think so.

It’s not a perfect movie. There are some glaring faults. Despite obvious efforts, diversity remains a problem. The island of the Amazons has women of every age and ethnicity taking roles in the army, the senate, and the town. That said, women of color are still underutilized in featured roles — Florence Kasumba’s turn as Senator Acantha was disappointingly brief — and in prominent visibility. Plans are already in the works for a Wonder Woman 2, with Jenkins again at the helm, to be set in the modern-day U.S., so hopefully that’s something that will be improved upon in the next go-round.

The other thing that bugged me (and I don’t consider this a spoiler, because come on, you knew it was coming) was the huge super-on-super battle at the end. It suffered from the same problem as Man of Steel‘s five-hour Let’s Wreck Metropolis final battle: At some point, when you’ve punched each other through enough walls and thrown enough trucks at each other, throwing a truck at someone no longer seems like a big deal. “An armored van? Meh. Come back when you’ve hit him with a 747.” I need to see consequences for the dueling supers and not just the 1.38 million casualties of the Battle of Metropolis.

Wow. If that’s how much I had to say about the negative stuff, you might want to grab a snack and a pillow before we start in on the stuff I liked.

More here.

Saturday, June 10, 2017

Fredric Jameson on ‘One Hundred Years of Solitude’

Fredric Jameson in the London Review of Books:

The first centennial of the Soviet revolution, indeed the fifth centennial of Luther’s, risk distracting us from a literary earthquake which happened just fifty years ago and marked the cultural emergence of Latin America onto that new and larger stage we call globalisation – itself a space that ultimately proves to be well beyond the separate categories of the cultural or the political, the economic or the national. I mean the publication of Gabriel García Márquez’s One Hundred Years of Solitude in 1967, which not only unleashed a Latin American ‘boom’ on an unsuspecting outside world but also introduced a host of distinct national literary publics to a new kind of novelising. Influence is not a kind of copying, it is permission unexpectedly received to do things in new ways, to broach new content, to tell stories by way of forms you never knew you were allowed to use. What is it, then, that García Márquez did to the readers and writers of a still relatively conventional postwar world?

He began his productive life as a movie reviewer and a writer of movie scenarios nobody wanted to film. Is it so outrageous to consider One Hundred Years of Solitude as a mingling, an intertwining and shuffling together of failed movie scripts, so many fantastic episodes that could never be filmed and so must be consigned to Melquíades’s Sanskrit manuscript (from which the novel has been ‘translated’)? Or perhaps it may be permitted to note the astonishing simultaneity of the beginning of his literary career with the so-called Bogotazo, the assassination in 1948 of the great populist leader Jorge Eliécer Gaitán (and the beginning of the seventy-year long Violencia in Colombia), just as García Márquez was having lunch down the street and, not much further away, the 21-year-old Fidel Castro was waiting in his hotel room for an afternoon meeting with Gaitán about the youth conference he had been sent to organise in Bogota that summer.

The solitude of the title should not at first be taken to mean the affective pathos it becomes at the end of the book: first and foremost, in the novel’s founding or refounding of the world itself, it signifies autonomy.

More here.

It’s Not Islam That Drives Young Europeans to Jihad, France’s Top Terrorism Expert Explains

Davide Lerner in Haaretz:

Salman Abedi, the suicide bomber who killed 22 people at a Manchester pop concert this week, started life advantageously enough, born to parents who had fled Gadhafi’s Libya for a new life in Britain. But it was actually that kind of dislocation that would send him off kilter two decades later, says Olivier Roy, one of France’s top experts on Islamic terrorism.

“An estimated 60 percent of those who espouse violent jihadism in Europe are second-generation Muslims who have lost their connection with their country of origin and have failed to integrate into Western societies,” Roy says.

They are subject to a “process of deculturation” that leaves them ignorant of and detached from both European society and the society of their origins. The result, Roy argues, is a dangerous “identity vacuum” in which “violent extremism thrives.”

More here.

Teaching Humility in an Age of Arrogance

Michael Patrick Lynch in The Chronicle of Higher Education:

One way the internet distorts our picture of ourselves is by feeding the human tendency to overestimate our knowledge of how the world works. Most of us know what it’s like to think we remember more from high-school physics or history than we actually do. As the cognitive scientists Steven Sloman and Philip Fernbach have detailed recently, such overestimation extends farther than you might think: Ask yourself whether you can really explain how a toilet or a zipper works, and you may find yourself surprisingly stumped. You assume you know how things work when you often don’t know at all.

This sort of ignorance is partly due to the fact that human beings aren’t isolated knowing machines. We live in an economy of knowledge that distributes cognitive and epistemic labor among specialists. That’s a good thing — no one person can know everything, or even very much. But put all the doctors, scientists, mechanics, and plumbers together, and we collectively know quite a bit.

Yet this often means we blur the line between what’s inside our heads and what’s not. Some philosophers have argued that this blurring is actually justified because knowing itself is often an extended process, distributed in space. When I know something because of your expert testimony — say, that my car’s alternator is broken — what I know is partly in your head and partly in mine. If that’s right, then living in a knowledge economy literally increases my knowledge because knowing is not just an individual phenomenon.

Suppose this extended, distributed picture of knowledge is right. Add the personalized internet, with its carefully curated social-media feeds and individualized search results, and you get not one knowledge economy, but many different ones, each bounded by different assumptions of which sources you can trust and what counts as evidence and what doesn’t. The result is not only an explosion of overconfidence in what you individually understand but an active encouragement of epistemic arrogance.

More here.

WHY WE LIE: THE SCIENCE BEHIND OUR DECEPTIVE WAYS

Yudhijit Bhattacharjee in National Geographic:

The ubiquity of lying was first documented systematically by Bella DePaulo, a social psychologist at the University of California, Santa Barbara. Two decades ago DePaulo and her colleagues asked 147 adults to jot down for a week every instance they tried to mislead someone. The researchers found that the subjects lied on average one or two times a day. Most of these untruths were innocuous, intended to hide one’s inadequacies or to protect the feelings of others. Some lies were excuses—one subject blamed the failure to take out the garbage on not knowing where it needed to go. Yet other lies—such as a claim of being a diplomat’s son—were aimed at presenting a false image. While these were minor transgressions, a later study by DePaulo and other colleagues involving a similar sample indicated that most people have, at some point, told one or more “serious lies”—hiding an affair from a spouse, for example, or making false claims on a college application.

That human beings should universally possess a talent for deceiving one another shouldn’t surprise us. Researchers speculate that lying as a behavior arose not long after the emergence of language. The ability to manipulate others without using physical force likely conferred an advantage in the competition for resources and mates, akin to the evolution of deceptive strategies in the animal kingdom, such as camouflage. “Lying is so easy compared to other ways of gaining power,” notes Sissela Bok, an ethicist at Harvard University who’s one of the most prominent thinkers on the subject. “It’s much easier to lie in order to get somebody’s money or wealth than to hit them over the head or rob a bank.”

As lying has come to be recognized as a deeply ingrained human trait, social science researchers and neuroscientists have sought to illuminate the nature and roots of the behavior. How and when do we learn to lie? What are the psychological and neurobiological underpinnings of dishonesty? Where do most of us draw the line?

More here.

H.G. Wells still speaks to our fears and dreams

Michael Dirda at The Washington Post:

During the first half of his writing career, H.G. Wells (1866-1946) imagined a machine that would travel through time, the fearsome tripods of Martian invaders, a moon rocket powered by Cavorite, the military tank (in the short story “The Land Ironclads”) and other engineering marvels. But, as Jeremy Withers’s “The War of the Wheels” reminds us, the father of science fiction was also fascinated by the bicycle.

If you look through Wells’s bibliography, you’ll notice that he was never strictly a writer of what he called “fantasias of possibility.” Yes, he found his first success in “The Time Machine,” published in 1895, but that same year he also brought out a collection of slight fictional pieces titled “Selected Conversations With an Uncle,” a satirical fantasy called “The Wonderful Visit” — about an angel who is mistaken for a bird and shot by a clergyman — and “The Stolen Bacillus and Other Incidents,” a volume of his early science fictional short stories. In the following year, the industrious Wells then published his terrifying, Swiftian nightmare, “The Island of Dr. Moreau,” but also “The Wheels of Chance,” his first realistic, mildly comic novel, in which a young draper’s apprentice goes off on a two-week bicycle holiday.

Obviously somewhat autobiographical — Wells had been a draper’s apprentice — that book reflects its author’s early passion for the bicycle. Throughout the 1890s, as Withers notes, England was crazy about cycling. What was called the “safety bicycle” — essentially the basic clunker we still know today — had supplanted those elegant big-and-small-wheeled marvels of earlier years. Pneumatic tires had improved ease of pedaling. Cycling clubs and specialty shops flourished. Hotshots, who sped along hunched over their handlebars, were called “scorchers.” Moralists worried that ladies might find sitting on a bike saddle sexually stimulating.

more here.

‘being wagner’ by simon callow

Thomas Laqueur at The Guardian:

His radically innovative, embarrassingly voluptuous, riveting – or, some will say, boring – music is at the heart of the controversy, and of Callow’s attraction to his subject. He says he has been a Wagnerian since early adolescence: he knew all about leitmotiven and the “Tristan chord”. But pointing to his music is only to push the question one step back: why? Lots of composers before Wagner used the same notes in a chord but he managed to keep it unresolved from the beginning of a work to its orgasmic ending – five hours of Tantric harmonic deferral. That got the naming rights, one supposes.

Beethoven, the only composer who comes close to Wagner in his daring breaks from the past and who was met initially with a similarly uncomprehending and hostile reaction, was well on his way to being the assimilated prototypical genius of the 19th century within 20 years of his death. In fact, Wagner’s Dresden performances of Beethoven’s Ninth in the 1840s played an important role in inserting a wild and unruly symphony into the heart of the musical canon. Wagner’s operas are unquestionably canonical but they still generate the sort of hostility they did when they were first performed. None of the great 20th-century masters – not Stravinsky, not Schoenberg, not Boulez – is as divisive today as this composer born more than 200 years ago.

more here.

The trucker’s life

Nathan Deuel at the LA Times:

What hooks Murphy so thoroughly, despite society’s apparent disapproval, is that in addition to the money and freedom, the rough-and-tumble underworld of big trucks and long drives actually feels like a meaningful lesson in the pride and purity of hard work. “When you hired movers,” he writes, “they moved it. Execution was the imperative. This unequivocation was very attractive to me then, as it is now.”

There are all kinds of truckers. Murphy’s a mover (or a bedbugger), not to be confused with car haulers (parking lot attendants), animal transporters (chicken chokers), refrigerated food haulers (reefers) or hazmat haulers (suicide jockeys). What unites most of them, Murphy explains with some distaste, is how happily they communicate with each other over CB radios, in a kind of private social network Murphy doesn’t seem to relish like he does all that time alone.

The way Murphy thinks of it, most of the other long-haul drivers are all too happy to gather around the gas station and guffaw. What they’re probably missing out on, Murphy suggests, are lonelier and more poetic thoughts, such as the way the engines themselves “want to work hard. What they like is a full load and twenty-hour run at 65.” When you maintain one properly, he writes, the thing can run a million miles.

more here.

Now let’s fight back against the politics of fear

Naomi Klein in The Guardian:

Shock. It’s a word that has come up again and again since Donald Trump was elected in November 2016 – to describe the poll-defying election results, to describe the emotional state of many people watching his ascent to power, and to describe his blitzkrieg approach to policymaking. A “shock to the system” is precisely how his adviser Kellyanne Conway has repeatedly described the new era.

For almost two decades now, I’ve been studying large-scale shocks to societies: how they happen, how they are exploited by politicians and corporations, and how they are even deliberately deepened in order to gain advantage over a disoriented population. I have also reported on the flipside of this process: how societies that come together around an understanding of a shared crisis can change the world for the better.

Watching Donald Trump’s rise, I’ve had a strange feeling. It’s not just that he’s applying shock politics to the most powerful and heavily armed nation on earth; it’s more than that. In books, documentary films and investigative reporting, I have documented a range of trends: the rise of superbrands, the expanding power of private wealth over the political system, the global imposition of neoliberalism, often using racism and fear of the “other” as a potent tool, the damaging impacts of corporate free trade, and the deep hold that climate change denial has taken on the right side of the political spectrum. And as I began to research Trump, he started to seem to me like Frankenstein’s monster, sewn together out of the body parts of all of these and many other dangerous trends.

Ten years ago, I published The Shock Doctrine: The Rise of Disaster Capitalism, an investigation that spanned four decades of history, from Chile after Augusto Pinochet’s coup to Russia after the collapse of the Soviet Union, from Baghdad under the US “Shock and Awe” attack to New Orleans after Hurricane Katrina. The term “shock doctrine” describes the quite brutal tactic of systematically using the public’s disorientation following a collective shock – wars, coups, terrorist attacks, market crashes or natural disasters – to push through radical pro-corporate measures, often called “shock therapy”.

Though Trump breaks the mould in some ways, his shock tactics do follow a script, one familiar from other countries that have had rapid changes imposed under the cover of crisis. During Trump’s first week in office, when he was signing that tsunami of executive orders and people were just reeling, madly trying to keep up, I found myself thinking about the human rights advocate Halina Bortnowska’s description of Poland’s experience when the US imposed economic shock therapy on her country in the midst of communism’s collapse. She described the velocity of change her country was going through as “the difference between dog years and human years” and she observed that “you start witnessing these semi-psychotic reactions. You can no longer expect people to act in their own best interests when they’re so disoriented they don’t know – or no longer care – what those interests are.”

More here.

Arundhati Roy’s Return to the Form That Made Her Famous

Karan Mahajan in The New York Times:

Roy’s first and only other novel, “The God of Small Things,” was a commercial and critical sensation. The gorgeous story of a doomed South Indian family, it sold six million copies and won the Booker Prize. It became a sort of legend — both for its quality and for its backwater publishing story: Roy, unlike so many other successful Indian writers in English, didn’t live abroad or attend an elite college. She had trained as an architect and had an obscure career as an indie actress and screenwriter. Her success, which involved foreign agents and a startling advance, was linked to India’s kick-starting, liberalizing economy as well. It seemed everything had come together for Roy’s book.

Roy reacted with instinctive defiance. She stopped writing fiction and began protesting against the Indian state, which, she felt, was steamrollering the rights of the poor and collaborating with capitalist overlords. Several books of essays followed. Their titles — “The Algebra of Infinite Justice,” “The End of Imagination,” “Capitalism: A Ghost Story” — convey the largeness of her concerns. She traveled with Maoist guerrillas in an Indian forest, marched with anti-big-dam protesters, met with Edward Snowden in a Moscow hotel room, and was threatened and even briefly imprisoned by the Indian government — and she continued to write. But the writing was not of the same standard as her fiction. Though occasionally witty in its put-downs, it was black-and-white and self-righteous — acceptable within the tradition of political writing, but not artful.

So it is a relief to encounter the new book and find Roy the artist fully and brilliantly intact: prospering with stories and writing in gorgeous, supple prose. The organs of a slaughtered buffalo in one scene “slip away like odd-shaped boats on a river of blood”; the “outrageous” femininity of transgender women or hijras in a neighborhood makes the “real, biological women” look “cloudy and dispersed”; a boat is seen “cleaving through a dark, liquid lawn” of a weed-choked lake. Again and again beautiful images refresh our sense of the world.

The story concerns several people who converge over an abandoned baby at an anti-corruption protest in Delhi in 2011. There is a hijra named Anjum who has survived the anti-Muslim Gujarat riots of 2002. There is her sidekick, a former mortuary worker who calls himself Saddam Hussain because he is obsessed with the “courage and dignity” of Saddam “in the face of death.” And there is an enigmatic middle-class woman called S. Tilottama who ferries the abandoned baby to her home.

More here.

Friday, June 9, 2017

HOW DIFFERENT—AND DANGEROUS—IS TERRORISM TODAY?

Robin Wright in The New Yorker:

On Sunday, just hours after three men launched an assault on London Bridge, British Prime Minister Theresa May stepped in front of 10 Downing Street and told the world, “We believe we are experiencing a new trend in the threat we face.” In many ways, the attack in the British capital and others over the past two years in Nice, Berlin, Stockholm, Paris, and Manchester actually weren’t all that unusual in terms of tactics, targets, or even motive.

A century ago, a battered horse-drawn wagon loaded with a hundred pounds of dynamite—attached to five hundred pounds of cast-iron weights—rolled onto Wall Street during lunch hour. The wagon stopped at the busiest corner in front of J. P. Morgan’s bank. At 12:01 p.m., it exploded, spraying lethal shrapnel and bits of horse as high as the thirty-fourth floor of the Equitable Building, on Broadway. A streetcar was derailed a block away. Thirty-eight people were killed; many were messengers, stenographers, clerks, and brokers who were simply on the street at the wrong time—what are today known as “soft targets.” Another hundred and forty-three people were injured.

That attack, on September 16, 1920, was, at the time, the deadliest act of terrorism in American history. Few surpassed it for the next seventy-five years, until the Oklahoma City bombing, in 1995, and then the September 11th attacks, in 2001. The Wall Street case was never solved, although the investigation strongly pointed to followers of a charismatic Italian anarchist named Luigi Galleani. Like ISIS and its extremist cohorts today, they advocated violence and insurrection against Western democracies and justified innocent deaths to achieve it.

Europe has also faced periods when terrorist attacks were more frequent than they are today. Between 1970 and 2015, more than ten thousand people were killed in over eighteen thousand attacks, according to the University of Maryland’s Global Terrorism Database. The deadliest decades were, by far, the nineteen-seventies and eighties—during the era of Germany’s Baader-Meinhof gang, Italy’s Red Brigades, Spain’s E.T.A., Britain’s Irish Republican Army, and others. The frequency of attacks across Europe reached as high as ten a week. In 1980, I covered what was then the deadliest terrorist attack in Europe since the Second World War, when a bomb, planted in a suitcase, blew up in the waiting room of Bologna’s train station. Eighty-five people were killed; body parts were everywhere. A neo-fascist group, the Armed Revolutionary Nuclei, claimed credit.

Yet May is correct: modern terrorism is still evolving.

More here.

How Machine-Learning Helps Us Understand Strange Materials and Their Stranger Physics

Vasudevan Mukunth in The Wire:

If you’ve visited The Wire‘s Facebook page, you must have noticed a globe of 24 dots joined by lines in our social media logo. Now say you have a real-life replica of that globe made of a very elastic polymer in your hands, and you stretch it, squeeze it, twist it around, even knot it with itself. Let’s say the polymer does not tear or break. The study of those features of the globe that are preserved while you mess with it is called topology.

A topological phase of matter is one whose topology and energy are related. For example, physicists have known that at a lower temperature, the surface of a single-atom-thick layer of superfluid helium develops vortices in pairs that move around each other according to how they are both rotating. At a slightly higher temperature, the vortices become unpaired – but stay put instead of moving around. This is a topological phase transition: the topology of the substance changes according to the temperature.

This exact example – of vortices in liquid helium – is called the Kosterlitz-Thouless (KT) transition, for its discoverers David Thouless and John M. Kosterlitz. There are many other examples of topological phase transitions, all utilising the quirky things that quantum mechanics makes possible in strange, sometimes useful, ways. For example, physicists use topological concepts to understand electrical conductors better (especially insulators and superconductors), apply them to the study of the smallest packets of energy, and even use them to discover the shape of the universe. In engineering, topological phases are used to find particles that, when they bump into others of their own kind, vanish in a puff of energy; build more efficient hard-drives; and make better robots.

This breadth of applications, as the British-American physicist F. Duncan Haldane has remarked, is thanks to quantum mechanics’s willingness, and classical physics’s reluctance, to be bizarre.

More here.

How (Not) to Criticize Karl Polanyi

Steven Klein in Democracy:

Once a relatively obscure Hungarian academic, Karl Polanyi has posthumously become one of the central figures in debates about globalization. This recent interest in his thought has occasioned an unsympathetic treatment by Jeremy Adelman in the Boston Review. Adelman, a Princeton professor, has scores to settle with Polanyi. But his article ends up revealing more about the limits of our current political debates than anything about the man himself.

Polanyi’s classic book, The Great Transformation: The Political and Economic Origins of Our Time, published in 1944, argued that the utopian obsession with self-adjusting markets had wreaked havoc in nineteenth-century European society, eventually laying the groundwork for the rise of fascism. His once unfashionable views have witnessed a remarkable revival of late. His name is frequently invoked when describing the dangers that global market integration poses to democracy. Polanyi has now moved one step closer to intellectual canonization with the publication of Gareth Dale’s excellent biography, Karl Polanyi: A Life on the Left (2016), the impetus for Adelman’s article.

First, there are aspects of Polanyi’s thought worth criticizing. His historical account of the origins of the market society is murky. He neglects gender, race, and colonialism, although he was a supporter of anti-colonial struggles. Yet, instead, Adelman returns to a well-worn and wrong-headed criticism of Polanyi: that his thought represents a romantic revolt against markets in favor of a warm communalism, a stance that inevitably leads to violent nationalism and tyrannical “collectivism.”

More troubling still is Adelman’s explanation for why Polanyi was supposedly attracted to romantic attacks on liberalism. In Adelman’s telling, Polanyi, who was born into an assimilated Jewish family but converted to Christianity, suffered from a sort of intellectual Stockholm Syndrome: Excluded from European society, he romanticized his murderous oppressors.

More here.

What Both the Left and Right Get Wrong About Race

Dalton Conley and Jason Fletcher in Nautilus:

[L]et us ask what is perhaps the most controversial question in the human sciences: Do genetic differences by ancestral population subgroup explain observed differences in achievement between self-identified race groups in the contemporary United States over and above all the environmental differences that we also know matter? In their best-selling 1994 book, The Bell Curve: Intelligence and Class Structure in American Life, Richard Herrnstein and Charles Murray indeed made the argument that blacks are genetically inferior to whites with respect to cognitive ability. Their “evidence,” however, contained no molecular genetic data, and was flawed as a result. But today we have molecular data that might potentially allow us to directly examine the question of race, genes, and IQ. We raise this pernicious question again only to demonstrate the impossibility of answering it scientifically.

If Herrnstein and Murray redux wanted to proceed, perhaps an obvious way would be to examine whether all the small differences across the genomes of the average black and average white person in a dataset “add up” in a way that suggests that one group has, on average, genetic signatures that predict higher levels of important phenotypes, such as educational attainment. There are at least two ways of “adding up” genomes. The first is to use polygenic scores. The second is the use of principal components. Both have serious drawbacks.

A polygenic score is a single number that captures the sum total of thousands of little effects in the genome on a given trait. It is constructed by running a million or more separate comparisons for each place along the 23 pairs of chromosomes where there is variation (i.e. you have an A-A and I have a G-A) measured in a dataset. When summed, these measures can predict—albeit noisily—the distribution of a given phenotype in the population. The best performing polygenic score to date is for height. A single number calculated from someone’s DNA can explain about 50 percent of the variation in actual height in the population. A score that has been developed for education (and cognitive ability) can explain about 7 percent of the variation in years of schooling, according to a 2016 Nature study, and that score has since been refined to improve its predictive power. So while these are not explaining all of the genetic variation (we think height is about 80 percent genetic and education is at least 25 percent genetic), they do predict. Someone at the upper end of the education distribution is likely to get more than two more years of schooling on average than someone at the bottom of the pack (lowest 10 percent) in terms of his or her polygenic score.
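To make the “adding up” concrete, here is a minimal sketch of a polygenic score as a weighted sum: for each measured variant, multiply a person’s count of effect alleles (0, 1, or 2) by the effect size estimated in an association study, and sum across variants. The variant IDs, effect sizes, and genotypes below are invented for illustration, not taken from any real study.

```python
# Minimal, illustrative sketch of a polygenic score: a weighted sum of
# allele counts. All variant IDs and effect sizes here are hypothetical;
# real scores use effect sizes estimated from genome-wide association
# studies over a million or more variants.

effect_sizes = {      # per-allele effect on the trait (made-up numbers)
    "rs0001": 0.021,
    "rs0002": -0.013,
    "rs0003": 0.008,
    "rs0004": 0.017,
}

person_genotype = {   # count of effect alleles (0, 1, or 2) at each variant
    "rs0001": 2,
    "rs0002": 0,
    "rs0003": 1,
    "rs0004": 1,
}

def polygenic_score(effects, allele_counts):
    """Sum each variant's effect size times the carried allele count."""
    return sum(beta * allele_counts.get(snp, 0) for snp, beta in effects.items())

print(polygenic_score(effect_sizes, person_genotype))  # one number from many small effects
```

The single number that comes out is only meaningful relative to the rest of a population, and, as the next paragraph notes, weights estimated in one population can predict poorly in another.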

As it turns out, however, these scores when developed for one population—say, those of European descent—fail to predict for other populations.

More here.

the iceman cometh…

James Hamblin at The Atlantic:

On the stage stood a Dutch man in black shorts and a synthetic blue shirt. His grayish hair flopped as he paced. He looked somehow robust despite an absence of prominent musculature and a sort of convex abdomen. This was Wim Hof.

“Depression, fear, pain, anxiety—you name it,” Hof’s voice boomed through the speakers. “We are able to get into any cell and change the chemistry. We are able to get into the DNA.”

Hof claims that people can address, prevent, and treat most any malady by focusing the mind to control the metabolic processes in their cells. For example, we can will our bodies to heat up in cold situations. He told the audience “we can beat cancer” by shutting down malignant cells. “I challenge any university in the world to test this out,” he roared.

For a four-hour seminar in The Wim Hof Method, attendees paid around $200. The ticket offered an opportunity to hear Hof speak and to perform his famous breathing exercises, and then to take a brief dip in an inflatable pool of ice water.

more here.