Alice Oswald’s odd, brilliant ode to the English countryside

Dan Chiasson at The New Yorker:

Oswald, born in 1966, is the daughter of a renowned garden designer, and read classics at Oxford. She lives with her family near a bend in the River Dart, in Devon, the misty setting for “The Hound of the Baskervilles.” In an interview, she has said that she likes “the way that the death of one thing is the beginning of something else,” an Ovidian mind-set equally fit for the gardener and the translator. Her nature poems tend to be revisions of earlier poems on the same subjects: John Clare’s “Badger” has become “Body,” in which the sleep of the dead “under their mud roof” is disturbed by a badger “hard at work / with the living shovel of himself”; you can hear Dickinson’s “I heard a Fly buzz—” amid the “horrible trapped buzzing” of Oswald’s “Flies”; Andrew Marvell’s “On a Drop of Dew” bequeaths to Oswald “A Rushed Account of the Dew”; Ted Hughes’s “The Thought Fox” stalks Oswald’s “Fox.” This isn’t simply influence or homage, though Oswald is generous about crediting her forebears. The deeper urge is to collaborate with the dead, whose descriptions of badgers and foxes and flies are part of a timeless continuum that now includes Oswald and her readers, each new mind capturing the world according to its distinct angle and music.

A poet whose thoughts are saturated with prior literature recognizes the actual, living fox by mentally matching it to the fox on the page, a reversal of the usual perceptual order—observe, then describe—that threatens to fog up her vision. There is an impulse in these poems to inventory the natural world without the palliatives of conventional description; the paradox, as old as classical pastoral and georgic, is that our nature is to describe, an imperative that seems perfectly unnatural when measured against the unselfconscious work of bees or ants or oxen.

More here.

Quantum Hanky-Panky

Seth Lloyd in Edge:

Right now, there's been a resurgence of interest in ideas of applying quantum mechanics and quantum information to ideas of quantum gravity, and what the fundamental theory of the universe actually is. It turns out that quantum information has a lot to offer people who are looking at problems like, for instance, what happens when you fall into a black hole? (By the way, my advice is don't do that if you can help it.) If you fall into a black hole, does any information about you ever escape from the black hole? These are questions that people like Stephen Hawking have been working on for decades. It turns out that quantum information has a lot to give to answer these questions.

Twenty-five years ago, I started working on the problem of quantum computing, which is how atoms and molecules, photons, elementary particles, process information. At that point, there were only half a dozen people in the world looking at this problem, and now there are thousands. There goes the neighborhood. In any field that expands by so much so rapidly, there are now all kinds of branches on this tree. There are still branches of the fundamental questions of how we understand the world, in terms of how it processes information.

More here.

How to raise a genius: lessons from a 45-year study of super-smart children

Tom Clynes in Nature:

On a summer day in 1968, professor Julian Stanley met a brilliant but bored 12-year-old named Joseph Bates. The Baltimore student was so far ahead of his classmates in mathematics that his parents had arranged for him to take a computer-science course at Johns Hopkins University, where Stanley taught. Even that wasn't enough. Having leapfrogged ahead of the adults in the class, the child kept himself busy by teaching the FORTRAN programming language to graduate students. Unsure of what to do with Bates, his computer instructor introduced him to Stanley, a researcher well known for his work in psychometrics — the study of cognitive performance. To discover more about the young prodigy's talent, Stanley gave Bates a battery of tests that included the SAT college-admissions exam, normally taken by university-bound 16- to 18-year-olds in the United States. Bates's score was well above the threshold for admission to Johns Hopkins, and prompted Stanley to search for a local high school that would let the child take advanced mathematics and science classes. When that plan failed, Stanley convinced a dean at Johns Hopkins to let Bates, then 13, enrol as an undergraduate.

Stanley would affectionately refer to Bates as “student zero” of his Study of Mathematically Precocious Youth (SMPY), which would transform how gifted children are identified and supported by the US education system. As the longest-running current longitudinal survey of intellectually talented children, SMPY has for 45 years tracked the careers and accomplishments of some 5,000 individuals, many of whom have gone on to become high-achieving scientists. The study's ever-growing data set has generated more than 400 papers and several books, and provided key insights into how to spot and develop talent in science, technology, engineering, mathematics (STEM) and beyond. “What Julian wanted to know was, how do you find the kids with the highest potential for excellence in what we now call STEM, and how do you boost the chance that they'll reach that potential,” says Camilla Benbow, a protégé of Stanley's who is now dean of education and human development at Vanderbilt University in Nashville, Tennessee. But Stanley wasn't interested in just studying bright children; he wanted to nurture their intellect and enhance the odds that they would change the world. His motto, he told his graduate students, was “no more dry bones methodology”.

More here.

Tuesday, September 6, 2016

Israel’s Impending Civil War

Uri Avnery in the London Review of Books:

Something happens to retired chiefs of the Israeli internal Security Service, Shin Bet. Once they leave their jobs, they become spokesmen for peace. How come? Shin Bet agents are the only members of the establishment who come into real, direct, daily contact with Palestinians. They interrogate Palestinian suspects, torture them, try to turn them into informers. They collect information, penetrate the most remote parts of Palestinian society. They know more about the Palestinians than anybody else in Israel (and perhaps in Palestine, too).

The intelligent among them (intelligence officers can be intelligent) also come to conclusions that evade many politicians: that there is a Palestinian nation, that this nation will not disappear, that the Palestinians want a state of their own, that the only solution to the conflict is a Palestinian state next to Israel. And so, on leaving the service, Shin Bet chiefs become outspoken advocates of the two-state solution.

The identity of all secret service personnel is, well, secret, except the chiefs. (When I was a member of the Knesset, I submitted a bill which stipulated that the name of the service chiefs be made public. The bill was rejected, like all my proposals, but soon afterwards the prime minister decreed that the names of the chiefs be made public.) Some time ago, Israeli TV showed a documentary called The Doorkeepers, in which all the living ex-chiefs of the Shin Bet and the Mossad advocated peace based on the two-state solution. They expressed their opinion that there will be no peace unless the Palestinians achieve a national state of their own.

More here.

Smarter brains are blood-thirsty brains

From Phys.org:

A University of Adelaide-led project has overturned the theory that the evolution of human intelligence was simply related to the size of the brain, showing instead that it was more closely linked to the supply of blood to the brain.

The international collaboration between Australia and South Africa showed that the human brain evolved to become not only larger, but more energetically costly and blood thirsty than previously believed.

The research team calculated how blood flowing to the brain of human ancestors changed over time, using the size of two holes at the base of the skull that allow arteries to pass to the brain. The findings, published in the Royal Society Open Science journal, allowed the researchers to track the increase in human intelligence across evolutionary time.

“Brain size has increased about 350% over human evolution, but we found that blood flow to the brain increased an amazing 600%,” says project leader Professor Emeritus Roger Seymour, from the University of Adelaide. “We believe this is possibly related to the brain's need to satisfy increasingly energetic connections between nerve cells that allowed the evolution of complex thinking and learning.

“To allow our brain to be so intelligent, it must be constantly fed oxygen and nutrients from the blood.”

More here.

Harold Bloom on Alvin Feinman’s Self-limiting Transcendence

Harold Bloom in The Critical Flame:

I first met Alvin Feinman in September 1951, the day before I encountered another remarkable young man who also became a life-long friend, Angus Fletcher. Alvin was twenty-two, a year older than we were, and a graduate student in philosophy at Yale, where Angus and I were students of literature. Alvin, to my lasting sorrow, died in 2008. Of my closest friends I am fortunate still to have Angus, having lost Alvin, Archie Ammons, and John Hollander, three superb poets and majestic intellects.

I am no poet; I cannot forget. Many of my friends are or were poets: Mark Strand, a recent loss; Robert Penn Warren, and happily still with us, William Merwin, John Ashbery, Jay Wright; and younger figures: Rosanna Warren, Henri Cole, Martha Serpas, Peter Cole.

Alvin at twenty-two was already a poet of astonishing individuation: the emergence of voice in him clarified as rapidly as it had in Rimbaud and Hart Crane.

More here.

The Long 20th Century of Terror

Robert Zaretsky at The American Scholar:

Terrorism is as old as recorded history. Plutarch describes how ancient Spartans would ambush and kill a few enslaved helots every year to keep the rest in a state of terror. A few centuries later, according to Josephus, the Jewish Zealots earned the moniker sicarii, or dagger men, thanks to their practice of slitting the throats of Roman officials in crowded marketplaces. The dagger was also the weapon of choice for the Assassins, a medieval Shiite sect dedicated to the destruction of both the Sunnis and the Crusaders. For more than a millennium, a Hindu offshoot known as the Thuggees strangled unsuspecting travelers as offerings to the goddess Kali.

Fast forward to the modern age, when the French Revolution ushered in a century and a half of guillotines, gulags, and gas chambers. The defining trait of totalitarian states ever since, from Nazi Germany and Fascist Italy to Communist Russia and China, has been the systematic and sustained use of terror to maintain power. Whether used by states that capitalize on violence and repression, or by stateless movements that monopolize the attention of our media and governments (and justify wars), terror remains the order of the day.

Historians and sociologists, philosophers and political theorists have interpreted terrorism, adding a great deal to our knowledge, but less to our understanding. For the latter, perhaps we need to turn to novelists.

More here.

Exploring the riven region of Ukraine

William T. Vollmann at Bookforum:

We Americans suffered on September 11, but we have no dread that our nation will imminently disappear. Our states and territory remain intact; no invader struts on our soil. We live as if tomorrow will closely resemble today. The past influences us powerfully, for a fact, but almost invisibly. The Civil War, for instance, has small perceptible impact on our daily getting and spending. But our complacency is no more than a local peculiarity. In certain other places, among them Ukraine, history blocks the horizon like a range of mountains. “Today,” writes Judah, “what you think of this past, how you relate to it, determines what you think about the future of Ukraine. And what you think of the past is quite likely to be bound up with the history of your own family and where you live.”

The nineteenth-century ancestors of a Donetsk coal-mining family could easily have been Russian—and even back then, Russians and Ukrainians were bickering. Nowadays this Donetsk family might well look kindly on Putin. Meanwhile, a family from the southwestern region of Transcarpathia looks back on a past when, as Judah puts it, their land appeared in the same travel guidebook as Vienna, Prague, and Trieste, those glamorous cities of the Austro-Hungarian Empire. One would expect these people to be less drawn to Russia, as is indeed the case. Transcarpathia then belonged to Galicia, another bygone region whose identity was cultivated by Hapsburg overlords “keen to divide and rule and to balance Polish identity and aspirations.” So Judah carefully explains the matter: “Today, when we see [from] voting patterns in Ukraine” that eastern Galicia is “more nationalistic and proud of its Ukrainianness, this is the historical root of the reason why.”

More here.

The Last Time I Saw Basquiat

Luc Sante at The New York Review of Books:

The first time I met Jean-Michel Basquiat was in the spring of 1979, at the Mudd Club. His hair was dyed orange and cut very short with a V-shaped widow’s peak in the front. He wore a lab coat and carried a briefcase. “Going on a trip?” I asked him. “Always,” he replied. He had a disquieting stare. He had probably taken fifty drugs that night, but it was clear there was a lot more to him than that.

He was sleeping on the floors of a rotating set of NYU dorm rooms then. He had no money at all. He had recently stopped tagging as SAMO and had renamed himself MAN-MADE, although that wasn’t a tag but a signature for things he made, T-shirts and collages and these color-Xerox postcards, which he sold for a buck or two. Eventually he sold one to Henry Geldzahler and one to Andy Warhol, and his name became currency.

Before that, though, he was still writing on walls, but as a poet rather than a tagger. I wish I could remember more of his works than just the one someone photographed him writing on Lafayette Street near Houston: “The whole livery line/ Bow like this with/ The big money all/ Crushed into these feet.”

More here.

Researchers Confront an Epidemic of Loneliness

Katie Hafner in The New York Times:

Loneliness, which Emily Dickinson described as “the Horror not to be surveyed,” is a quiet devastation. But in Britain, it is increasingly being viewed as something more: a serious public health issue deserving of public funds and national attention. Working with local governments and the National Health Service, programs aimed at mitigating loneliness have sprung up in dozens of cities and towns. Even fire brigades have been trained to inspect homes not just for fire safety but for signs of social isolation. “There’s been an explosion of public awareness here, from local authorities to the Department of Health to the media,” said Paul Cann, chief executive of Age UK Oxfordshire and a founder of The Campaign to End Loneliness, a five-year-old group based in London. “Loneliness has to be everybody’s business.” Researchers have found mounting evidence linking loneliness to physical illness and to functional and cognitive decline. As a predictor of early death, loneliness eclipses obesity.

“The profound effects of loneliness on health and independence are a critical public health problem,” said Dr. Carla M. Perissinotto, a geriatrician at the University of California, San Francisco. “It is no longer medically or ethically acceptable to ignore older adults who feel lonely and marginalized.” In Britain and the United States, roughly one in three people older than 65 live alone, and in the United States, half of those older than 85 live alone. Studies in both countries show the prevalence of loneliness among people older than 60 ranging from 10 percent to 46 percent. While the public, private and volunteer sectors in Britain are mobilizing to address loneliness, researchers are deepening their understanding of its biological underpinnings. In a paper published earlier this year in the journal Cell, neuroscientists at the Massachusetts Institute of Technology identified a region of the brain they believe generates feelings of loneliness. The region, known as the dorsal raphe nucleus, or D.R.N., is best known for its link to depression.

More here.

The Fear is Real

like some wild horse chained
to his stall just ripped out
the post & chewed on the links
& got free & burned down the barn
so he could see
the moon dance
an irish amazing reel
& ran & ran & ran
until the sweat poured
like honey
& the wounds
cleaned the
tired arabian
trail

that's what this honesty
tells me rip out the post

& i never knew my father's
loneliness & never knew my mother's
fear although i wore them like
hard saddles

there's plenty of time
to die

stones on the road
shattered glass

by Jim Bell
from Crossing the Bar
Slate Roof: a Publishing Collective

Monday, September 5, 2016

Sunday, September 4, 2016

Paul Feyerabend’s defense of astrology

Massimo Pigliucci in Plato's Footnote:

Paul Feyerabend was the enfant terrible of 1960s philosophy of science. His most famous book, Against Method, argued that science is a quintessentially pragmatic enterprise, with scientists simply using or discarding what does and does not work, meaning that there is no such thing as the scientific method. It’s not for nothing that he was referred to as a methodological anarchist. (Incidentally, the new edition of the book, with introduction by Ian Hacking, is definitely worth the effort.)

Throughout his career as an iconoclast he managed to piss off countless philosophers and scientists, for example by once cheering creationists in California for their bid to get “creation science” taught in schools. That, Feyerabend thought, would teach a lesson to self-conceited scientists and keepers of order and rationality. But he wasn’t stupid, immediately adding that the creationists themselves would then surely become just as dogmatic and self-conceited as the scientific establishment itself. His hope was for a balance of forces, a 1960s version of John Stuart Mill’s famous concept of the free market of ideas, where the best ones always win, in the long run. (If only.)

When I was a young scientist I wasn’t too fond of Feyerabend, to put it mildly. And even as an early student of philosophy of science, I felt much more comfortable with the likes of Popper or even Kuhn (despite the famous intellectual rivalry between the two) than with the very idea of methodological anarchism. But while some people turn more conservative when they age, I guess I’ve become — to my surprise — more of an anarchist, and I have slowly, though not quite completely, re-evaluated Feyerabend.

More here.

Is Artificial Intelligence Permanently Inscrutable?

Aaron Bornstein in Nautilus:

Dmitry Malioutov can’t say much about what he built.

As a research scientist at IBM, Malioutov spends part of his time building machine learning systems that solve difficult problems faced by IBM’s corporate clients. One such program was meant for a large insurance corporation. It was a challenging assignment, requiring a sophisticated algorithm. When it came time to describe the results to his client, though, there was a wrinkle. “We couldn’t explain the model to them because they didn’t have the training in machine learning.”

In fact, it may not have helped even if they were machine learning experts. That’s because the model was an artificial neural network, a program that takes in a given type of data—in this case, the insurance company’s customer records—and finds patterns in them. These networks have been in practical use for over half a century, but lately they’ve seen a resurgence, powering breakthroughs in everything from speech recognition and language translation to Go-playing robots and self-driving cars.

HIDDEN MEANINGS: In neural networks, data is passed from layer to layer, undergoing simple transformations at each step. Between the input and output layers are hidden layers, groups of nodes and connections that often bear no human-interpretable patterns or obvious connections to either input or output. “Deep” networks are those with many hidden layers. (Image: Michael Nielsen / NeuralNetworksandDeepLearning.com)
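The layer-to-layer transformations the caption describes can be sketched in a few lines of NumPy. This is a minimal toy network with made-up sizes and random weights, not the model Malioutov built; it only illustrates why hidden activations carry no obvious human meaning:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # A common "simple transformation": zero out negative values.
    return np.maximum(0.0, x)

# A tiny network: 4 inputs -> two hidden layers of 8 nodes -> 1 output.
# (Applying relu at the output too is a simplification for the sketch.)
layer_sizes = [4, 8, 8, 1]
weights = [rng.normal(size=(m, n)) for m, n in zip(layer_sizes, layer_sizes[1:])]
biases = [np.zeros(n) for n in layer_sizes[1:]]

def forward(x):
    activations = [x]
    for W, b in zip(weights, biases):
        # Each layer: matrix multiply, shift, nonlinearity.
        x = relu(x @ W + b)
        activations.append(x)  # hidden activations: numbers with no labels
    return activations

acts = forward(rng.normal(size=4))
print([a.shape for a in acts])  # shapes (4,), (8,), (8,), (1,)
```

Each hidden vector here is just eight unlabeled numbers; inspecting them tells you nothing about insurance customers, which is exactly the opacity the article goes on to discuss.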

As exciting as their performance gains have been, though, there’s a troubling fact about modern neural networks: Nobody knows quite how they work. And that means no one can predict when they might fail.

Take, for example, an episode recently reported by machine learning researcher Rich Caruana and his colleagues. They described the experiences of a team at the University of Pittsburgh Medical Center who were using machine learning to predict whether pneumonia patients might develop severe complications. The goal was to send patients at low risk for complications to outpatient treatment, preserving hospital beds and the attention of medical staff. The team tried several different methods, including various kinds of neural networks, as well as software-generated decision trees that produced clear, human-readable rules.

The neural networks were right more often than any of the other methods. But when the researchers and doctors took a look at the human-readable rules, they noticed something disturbing: One of the rules instructed doctors to send home pneumonia patients who already had asthma, despite the fact that asthma sufferers are known to be extremely vulnerable to complications.

The model did what it was told to do: Discover a true pattern in the data. The poor advice it produced was the result of a quirk in that data. It was hospital policy to send asthma sufferers with pneumonia to intensive care, and this policy worked so well that asthma sufferers almost never developed severe complications. Without the extra care that had shaped the hospital’s patient records, outcomes could have been dramatically different.
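The mechanism behind that quirk can be reproduced with synthetic data. Everything below is a hypothetical reconstruction with invented numbers, not the Pittsburgh records: because a hospital policy routes asthma patients to intensive care, the *recorded* outcomes make asthma look protective, and any model fit to those records will faithfully learn the dangerous rule:

```python
# Hypothetical reconstruction of the pneumonia/asthma quirk (invented rates).
import random

random.seed(1)

rows = []
for _ in range(10_000):
    asthma = random.random() < 0.15
    # True underlying risk: asthma patients are MORE vulnerable.
    base_risk = 0.40 if asthma else 0.10
    # Hospital policy: asthma patients get intensive care, which slashes
    # the complication rate that actually gets recorded in the data.
    recorded_risk = base_risk * (0.1 if asthma else 1.0)
    complication = random.random() < recorded_risk
    rows.append((asthma, complication))

def rate(flag):
    group = [c for a, c in rows if a == flag]
    return sum(group) / len(group)

print(f"recorded complication rate, asthma:    {rate(True):.3f}")
print(f"recorded complication rate, no asthma: {rate(False):.3f}")
# The records truthfully show asthma patients faring better, so a rule
# learner will "correctly" classify them as low risk and send them home.
```

The model's pattern is real in the data; the error lies in treating outcomes shaped by treatment as if they reflected untreated risk, which is why the human-readable rules were so valuable for catching it.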

More here.

The Myth of Thumbprints: Reading John Berger in Berlin


Alexis Zanghi in the LA Review of Books:

“THEY TRAVELED in groups of 100. Mostly at night. In lorries. And on foot.”

During the 1970s, migrants leaving Portugal in search of opportunity developed a system to ensure their safe arrival at their destination, and to deter fleecing by people smugglers. Before departing, each man would take his own picture. Then, he would rip the picture in two, keeping one half of his face for himself and giving the other half to the smuggler. Once over the border, the man would mail his half back to his family, to indicate that he had arrived safely in France, Germany, or Switzerland, or any of the other northern European countries reliant on cheap labor from the depressed and volatile countries ringing the Mediterranean. Then, the smuggler would come to collect payment from the migrant’s family, bearing his half of the man’s face as evidence.

These pictures stare up at the reader of A Seventh Man like eerie passport photos. Written by John Berger in collaboration with Swiss photographer Jean Mohr in the 1970s, the book sought to document the daily lives of migrant workers in the industrial north of Europe. In one, ripped in half on a diagonal, a man’s forehead drifts apart from his chin, eyes obscured by the tear, suspended on the page. The effect is one of facelessness and anonymity. This is perhaps the intention of Berger and Mohr: to highlight, and in doing so, hopefully negate the erasure inherent in migration. Berger sought to facilitate “working class solidarity”: to promote empathy among workers, across linguistic and cultural borders. Then, one in every seven workers in Europe was a migrant.

Today, as well, one in every seven people in the world is a migrant, refugee, or otherwise displaced individual.

At the Museum Europäischer Kulturen in Berlin, a collective of migrant artists called KUNSTASYL (literally “art asylum”) are at work on a “peaceful takeover” of the museum’s east wing. On the ground floor, one artist, Dachil Sado, has painted a large Hokusai wave washing over a giant, oversized thumbprint.

More here.

Martha Nussbaum thinks we shouldn’t lose our tempers

Julian Baggini in Prospect:

When a philosopher writes a book with five abstract nouns in a six-word title, you might justly fear a laboured tome of desiccating logical analysis. When the author is Martha Nussbaum, however, you can be reassured. Nussbaum is one of the most productive and insightful thinkers of her generation, though strangely undervalued in the UK. She combines a philosopher’s demand for conceptual clarity and rigorous thinking with a novelist’s interest in narrative, art and literature. The result is an impressive body of work spanning the overlapping territories of politics, ethics and the emotions.

Her latest work examines the significance of anger and forgiveness in the intimate and political spheres, as well as in the “middle realm” between them in which we interact with each other as colleagues, acquaintances and fellow citizens. It belongs to a genre entirely of its own, a kind of highbrow political-, social- and self-improvement.

Its core thesis is summed up in her opening discussion of Aeschylus’ Oresteia trilogy. In its final part, The Eumenides, Athena brings the bloody cycle of vengeance to an end by establishing a court, judge and jury. This allows reasoned law to take the place of the Furies, the ancient goddesses of revenge, who are nonetheless invited to take their place in the city. Nussbaum says that many understand the play “to be a recognition that the legal system must incorporate the dark vindictive passions and honour them.” However, when the Furies accept Athena’s offer they do so with “a gentle temper” and change their name to the “Kindly Ones” (Eumenides). Anger and revenge are not reintegrated, they are transformed.

More here.

He may have invented one of neuroscience’s biggest advances. But you’ve never heard of him

Anna Vlasits in Stat:

The next revolution in medicine just might come from a new lab technique that makes neurons sensitive to light. The technique, called optogenetics, is one of the biggest breakthroughs in neuroscience in decades. It has the potential to cure blindness, treat Parkinson’s disease, and relieve chronic pain. Moreover, it’s become widely used to probe the workings of animals’ brains in the lab, leading to breakthroughs in scientists’ understanding of things like sleep, addiction, and sensation.

So it’s not surprising that the two Americans hailed as inventors of optogenetics are rock stars in the science world. Karl Deisseroth at Stanford University and Ed Boyden at the Massachusetts Institute of Technology have collected tens of millions in grants and won millions in prize money in recent years. They’ve stocked their labs with the best equipment and the brightest minds. They’ve been lauded in the media and celebrated at conferences around the world. They’re considered all but certain to win a Nobel Prize.

There’s only one problem with this story:

It just may be that Zhuo-Hua Pan invented optogenetics first.

Even many neuroscientists have never heard of Pan.

More here.

The Dark Undone

D R Haney in The Nervous Breakdown:

The thought came to me when I was fifteen and trying to sleep on New Year’s Eve. Nothing I recall had happened to incite it. I’d spent the night babysitting my younger siblings while my mother attended a party, and she returned home around one in the morning and everyone went to bed. (My parents had divorced, though they continued to quarrel as if married.) My brother was sleeping in the bunk below mine, and as I stared at the ceiling and listened to the house settle, I thought: Why don’t you go into the kitchen and get a knife and stab your family to death? It wasn’t an impulse; it was a kind of philosophical question that I found myself pursuing. I thought of true-crime cases and wondered at the difference between, say, Charles Manson and me. Why was he capable of killing? Why was I not? Was it a matter of morality? But for me morality was tied to religion, and I’d declared myself an atheist a year or so before. Nor did man’s law amount to an automatic deterrent; some killings — those sanctioned or even performed by the state — were viewed as “right.” But wasn’t a life a life? So, if I wanted to get a knife and stab my family to death, as I knew I didn’t, why would that be any more “wrong” than a soldier killing in combat? Because my family was “innocent”? But weren’t many victims of war also innocent? And why was I wondering in the first place? Didn’t serial killers similarly brood before acting? I knew some did. I’d read the letters they sent to the press and police: Stop me before I kill again. I don’t want to do it, but I must. Maybe I was one of them. Maybe there was no difference between me and Charles Manson. You can’t choose what you are; you simply are.

I tossed and turned. The quiet of the sleeping house was loud — how loud was the quiet that followed murder? Maybe I was destined to know. I desperately wanted proof — irrefutable proof — that I would never hurt anyone as, more by the minute, seemed inevitable.

More here.