The Last Time I Saw Basquiat

Luc Sante at The New York Review of Books:

The first time I met Jean-Michel Basquiat was in the spring of 1979, at the Mudd Club. His hair was dyed orange and cut very short with a V-shaped widow’s peak in the front. He wore a lab coat and carried a briefcase. “Going on a trip?” I asked him. “Always,” he replied. He had a disquieting stare. He had probably taken fifty drugs that night, but it was clear there was a lot more to him than that.

He was sleeping on the floors of a rotating set of NYU dorm rooms then. He had no money at all. He had recently stopped tagging as SAMO and had renamed himself MAN-MADE, although that wasn’t a tag but a signature for things he made, T-shirts and collages and these color-Xerox postcards, which he sold for a buck or two. Eventually he sold one to Henry Geldzahler and one to Andy Warhol, and his name became currency.

Before that, though, he was still writing on walls, but as a poet rather than a tagger. I wish I could remember more of his works than just the one someone photographed him writing on Lafayette Street near Houston: “The whole livery line/ Bow like this with/ The big money all/ Crushed into these feet.”

more here.

Researchers Confront an Epidemic of Loneliness

Katie Hafner in The New York Times:

Loneliness, which Emily Dickinson described as “the Horror not to be surveyed,” is a quiet devastation. But in Britain, it is increasingly being viewed as something more: a serious public health issue deserving of public funds and national attention. Working with local governments and the National Health Service, programs aimed at mitigating loneliness have sprung up in dozens of cities and towns. Even fire brigades have been trained to inspect homes not just for fire safety but for signs of social isolation. “There’s been an explosion of public awareness here, from local authorities to the Department of Health to the media,” said Paul Cann, chief executive of Age UK Oxfordshire and a founder of The Campaign to End Loneliness, a five-year-old group based in London. “Loneliness has to be everybody’s business.” Researchers have found mounting evidence linking loneliness to physical illness and to functional and cognitive decline. As a predictor of early death, loneliness eclipses obesity.

“The profound effects of loneliness on health and independence are a critical public health problem,” said Dr. Carla M. Perissinotto, a geriatrician at the University of California, San Francisco. “It is no longer medically or ethically acceptable to ignore older adults who feel lonely and marginalized.” In Britain and the United States, roughly one in three people older than 65 live alone, and in the United States, half of those older than 85 live alone. Studies in both countries show the prevalence of loneliness among people older than 60 ranging from 10 percent to 46 percent. While the public, private and volunteer sectors in Britain are mobilizing to address loneliness, researchers are deepening their understanding of its biological underpinnings. In a paper published earlier this year in the journal Cell, neuroscientists at the Massachusetts Institute of Technology identified a region of the brain they believe generates feelings of loneliness. The region, known as the dorsal raphe nucleus, or D.R.N., is best known for its link to depression.

More here.

The Fear is Real

like some wild horse chained
to his stall just ripped out
the post & chewed on the links
& got free & burned down the barn
so he could see
the moon dance
an irish amazing reel
& ran & ran & ran
until the sweat poured
like honey
& the wounds
cleaned the
tired arabian
trail

that's what this honesty
tells me rip out the post

& i never knew my father's
loneliness & never knew my mother's
fear although i wore them like
hard saddles

there's plenty of time
to die

stones on the road
shattered glass

by Jim Bell
from Crossing the Bar
Slate Roof: a Publishing Collective

Monday, September 5, 2016

Sunday, September 4, 2016

Paul Feyerabend’s defense of astrology

Massimo Pigliucci in Plato's Footnote:

Paul Feyerabend was the enfant terrible of 1960s philosophy of science. His most famous book, Against Method, argued that science is a quintessentially pragmatic enterprise, with scientists simply using or discarding what does and does not work, meaning that there is no such thing as the scientific method. It’s not for nothing that he was referred to as a methodological anarchist. (Incidentally, the new edition of the book, with an introduction by Ian Hacking, is definitely worth the effort.)

Throughout his career as an iconoclast he managed to piss off countless philosophers and scientists, for example by once cheering creationists in California for their bid to get “creation science” taught in schools. That, Feyerabend thought, would teach a lesson to self-conceited scientists and keepers of order and rationality. But he wasn’t stupid, immediately adding that the creationists themselves would then surely become just as dogmatic and self-conceited as the scientific establishment itself. His hope was for a balance of forces, a 1960s version of John Stuart Mill’s famous concept of the free market of ideas, where the best ones always win, in the long run. (If only.)

When I was a young scientist I wasn’t too fond of Feyerabend, to put it mildly. And even as an early student of philosophy of science, I felt much more comfortable with the likes of Popper or even Kuhn (despite the famous intellectual rivalry between the two) than with the very idea of methodological anarchism. But while some people turn more conservative when they age, I guess I’ve become — to my surprise — more of an anarchist, and I have slowly, though not quite completely, re-evaluated Feyerabend.

More here.

Is Artificial Intelligence Permanently Inscrutable?

Aaron Bornstein in Nautilus:

Dmitry Malioutov can’t say much about what he built.

As a research scientist at IBM, Malioutov spends part of his time building machine learning systems that solve difficult problems faced by IBM’s corporate clients. One such program was meant for a large insurance corporation. It was a challenging assignment, requiring a sophisticated algorithm. When it came time to describe the results to his client, though, there was a wrinkle. “We couldn’t explain the model to them because they didn’t have the training in machine learning.”

In fact, it may not have helped even if they were machine learning experts. That’s because the model was an artificial neural network, a program that takes in a given type of data—in this case, the insurance company’s customer records—and finds patterns in them. These networks have been in practical use for over half a century, but lately they’ve seen a resurgence, powering breakthroughs in everything from speech recognition and language translation to Go-playing robots and self-driving cars.

HIDDEN MEANINGS: In neural networks, data is passed from layer to layer, undergoing simple transformations at each step. Between the input and output layers are hidden layers, groups of nodes and connections that often bear no human-interpretable patterns or obvious connections to either input or output. “Deep” networks are those with many hidden layers. Michael Nielsen / NeuralNetworksandDeepLearning.com
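The layer-to-layer transformation the caption describes can be sketched in a few lines. This is a minimal illustration, not any real model from the article: the weights are random rather than trained, the layer sizes are arbitrary, and `relu` stands in for whatever nonlinearity a production network would use.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # A simple nonlinearity applied after each layer's transformation.
    return np.maximum(0.0, x)

def forward(x, weights):
    """Pass input x through each layer in turn: multiply by the
    layer's weight matrix, then apply the nonlinearity."""
    activation = x
    for w in weights:
        activation = relu(activation @ w)
    return activation

# 4 inputs -> hidden layers of 8 and 6 nodes -> 2 outputs.
# The hidden layers' weights carry no human-interpretable meaning.
layers = [rng.normal(size=(4, 8)),
          rng.normal(size=(8, 6)),
          rng.normal(size=(6, 2))]

x = rng.normal(size=(1, 4))
y = forward(x, layers)
print(y.shape)  # (1, 2)
```

Even in this toy, inspecting the hidden weight matrices tells you nothing legible about how an input becomes an output, which is the inscrutability the article goes on to describe.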

As exciting as their performance gains have been, though, there’s a troubling fact about modern neural networks: Nobody knows quite how they work. And that means no one can predict when they might fail.

Take, for example, an episode recently reported by machine learning researcher Rich Caruana and his colleagues. They described the experiences of a team at the University of Pittsburgh Medical Center who were using machine learning to predict whether pneumonia patients might develop severe complications. The goal was to send patients at low risk for complications to outpatient treatment, preserving hospital beds and the attention of medical staff. The team tried several different methods, including various kinds of neural networks, as well as software-generated decision trees that produced clear, human-readable rules.

The neural networks were right more often than any of the other methods. But when the researchers and doctors took a look at the human-readable rules, they noticed something disturbing: One of the rules instructed doctors to send home pneumonia patients who already had asthma, despite the fact that asthma sufferers are known to be extremely vulnerable to complications.

The model did what it was told to do: Discover a true pattern in the data. The poor advice it produced was the result of a quirk in that data. It was hospital policy to send asthma sufferers with pneumonia to intensive care, and this policy worked so well that asthma sufferers almost never developed severe complications. Without the extra care that had shaped the hospital’s patient records, outcomes could have been dramatically different.
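The mechanism behind that quirk is easy to reproduce with synthetic data. The records and rates below are invented for illustration and have nothing to do with the actual Pittsburgh dataset: because the treatment (intensive care) is not among the features the model sees, asthma looks protective in the raw outcomes.

```python
# Synthetic records: (has_asthma, got_intensive_care, severe_complication).
# By hospital policy, asthma patients always went to intensive care,
# so their recorded outcomes were good.
records = [
    (True,  True,  False),
    (True,  True,  False),
    (False, False, True),
    (False, False, False),
    (False, False, True),
]

def complication_rate(rows, asthma):
    # Rate of severe complications among patients with/without asthma,
    # ignoring the intensive-care column -- just as a model trained
    # only on (features -> outcome) would.
    matching = [r for r in rows if r[0] == asthma]
    return sum(r[2] for r in matching) / len(matching)

print(complication_rate(records, asthma=True))   # 0.0
print(complication_rate(records, asthma=False))  # ~0.67
```

A learner fed these rows honestly discovers the pattern “asthma implies low risk”; the confounder that produced the pattern is invisible to it.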

More here.

The Myth of Thumbprints: Reading John Berger in Berlin


Alexis Zanghi in the LA Review of Books:

“THEY TRAVELED in groups of 100. Mostly at night. In lorries. And on foot.”

During the 1970s, migrants leaving Portugal in search of opportunity developed a system to ensure their safe arrival at their destination, and to deter fleecing by people smugglers. Before departing, each man would take his own picture. Then, he would rip the picture in two, keeping one half of his face for himself and giving the other half to the smuggler. Once over the border, the man would mail his half back to his family, to indicate that he had arrived safely in France, Germany, or Switzerland, or any of the other northern European countries reliant on cheap labor from the depressed and volatile countries ringing the Mediterranean. Then, the smuggler would come to collect payment from the migrant’s family, bearing his half of the man’s face as evidence.

These pictures stare up at the reader of A Seventh Man like eerie passport photos. Written by John Berger in collaboration with Swiss photographer Jean Mohr in the 1970s, the book sought to document the daily lives of migrant workers in the industrial north of Europe. In one, ripped in half on a diagonal, a man’s forehead drifts apart from his chin, eyes obscured by the tear, suspended on the page. The effect is one of facelessness and anonymity. This is perhaps the intention of Berger and Mohr: to highlight, and in doing so, hopefully negate the erasure inherent in migration. Berger sought to facilitate “working class solidarity”: to promote empathy among workers, across linguistic and cultural borders. Then, one in every seven workers in Europe was a migrant.

Today, as well, one in every seven people in the world is a migrant, refugee, or otherwise displaced individual.

At the Museum Europäischer Kulturen in Berlin, a collective of migrant artists called KUNSTASYL (literally “art asylum”) are at work on a “peaceful takeover” of the museum’s east wing. On the ground floor, one artist, Dachil Sado, has painted a large Hokusai wave washing over a giant, oversized thumbprint.

More here.

Martha Nussbaum thinks we shouldn’t lose our tempers

Julian Baggini in Prospect:

When a philosopher writes a book with five abstract nouns in a six-word title, you might justly fear a laboured tome of desiccating logical analysis. When the author is Martha Nussbaum, however, you can be reassured. Nussbaum is one of the most productive and insightful thinkers of her generation, though strangely undervalued in the UK. She combines a philosopher’s demand for conceptual clarity and rigorous thinking with a novelist’s interest in narrative, art and literature. The result is an impressive body of work spanning the overlapping territories of politics, ethics and the emotions.

Her latest work examines the significance of anger and forgiveness in the intimate and political spheres, as well as in the “middle realm” between them in which we interact with each other as colleagues, acquaintances and fellow citizens. It belongs to a genre entirely of its own, a kind of highbrow political-, social- and self-improvement.

Its core thesis is summed up in her opening discussion of Aeschylus’ Oresteia trilogy. In its final part, The Eumenides, Athena brings the bloody cycle of vengeance to an end by establishing a court, judge and jury. This allows reasoned law to take the place of the Furies, the ancient goddesses of revenge, who are nonetheless invited to take their place in the city. Nussbaum says that many understand the play “to be a recognition that the legal system must incorporate the dark vindictive passions and honour them.” However, when the Furies accept Athena’s offer they do so with “a gentle temper” and change their name to the “Kindly Ones” (Eumenides). Anger and revenge are not reintegrated, they are transformed.

More here.

He may have invented one of neuroscience’s biggest advances. But you’ve never heard of him

Anna Vlasits in Stat:

The next revolution in medicine just might come from a new lab technique that makes neurons sensitive to light. The technique, called optogenetics, is one of the biggest breakthroughs in neuroscience in decades. It has the potential to cure blindness, treat Parkinson’s disease, and relieve chronic pain. Moreover, it’s become widely used to probe the workings of animals’ brains in the lab, leading to breakthroughs in scientists’ understanding of things like sleep, addiction, and sensation.

So it’s not surprising that the two Americans hailed as inventors of optogenetics are rock stars in the science world. Karl Deisseroth at Stanford University and Ed Boyden at the Massachusetts Institute of Technology have collected tens of millions in grants and won millions in prize money in recent years. They’ve stocked their labs with the best equipment and the brightest minds. They’ve been lauded in the media and celebrated at conferences around the world. They’re considered all but certain to win a Nobel Prize.

There’s only one problem with this story:

It just may be that Zhuo-Hua Pan invented optogenetics first.

Even many neuroscientists have never heard of Pan.

More here.

The Dark Undone

D R Haney in The Nervous Breakdown:

The thought came to me when I was fifteen and trying to sleep on New Year’s Eve. Nothing I recall had happened to incite it. I’d spent the night babysitting my younger siblings while my mother attended a party, and she returned home around one in the morning and everyone went to bed. (My parents had divorced, though they continued to quarrel as if married.) My brother was sleeping in the bunk below mine, and as I stared at the ceiling and listened to the house settle, I thought: Why don’t you go into the kitchen and get a knife and stab your family to death? It wasn’t an impulse; it was a kind of philosophical question that I found myself pursuing. I thought of true-crime cases and wondered at the difference between, say, Charles Manson and me. Why was he capable of killing? Why was I not? Was it a matter of morality? But for me morality was tied to religion, and I’d declared myself an atheist a year or so before. Nor did man’s law amount to an automatic deterrent; some killings — those sanctioned or even performed by the state — were viewed as “right.” But wasn’t a life a life? So, if I wanted to get a knife and stab my family to death, as I knew I didn’t, why would that be any more “wrong” than a soldier killing in combat? Because my family was “innocent”? But weren’t many victims of war also innocent? And why was I wondering in the first place? Didn’t serial killers similarly brood before acting? I knew some did. I’d read the letters they sent to the press and police: Stop me before I kill again. I don’t want to do it, but I must. Maybe I was one of them. Maybe there was no difference between me and Charles Manson. You can’t choose what you are; you simply are.

I tossed and turned. The quiet of the sleeping house was loud — how loud was the quiet that followed murder? Maybe I was destined to know. I desperately wanted proof — irrefutable proof — that I would never hurt anyone as, more by the minute, seemed inevitable.

More here.

A Different Kind of Safe Space

Ted Gup in The Chronicle of Higher Education:

“This is a sanctuary for free speech,” I will tell my students next week at their first class in nonfiction writing. With this manifesto I begin each semester and have for 30-plus years. What does it mean? It means that in here, you can say anything. “Anything?” a student will inevitably ask, signaling both mischief and anxiety.

Anything.

If they want to say the F-word, they are free to do so. (Frankly, I prefer it to an endless barrage of “likes” and “you knows.”) They can say or write the N-word, and any other term of racial, gender, or ethnic disparagement. Now I have their attention. I may or may not be in violation of a college rule, but I am clearly outside the “safe zone.” And that’s fine, because outside of it is where I want to be, and where I want them to be. This is, after all, a class in writing, and words are not to be trifled with. They have consequences. You want to use a word — any word — fine by me, but be prepared to accept what happens next. No, I will not reprimand you, but neither will I rush in to save you. It is a lesson not only in the power of words, but in democracy, free speech, and responsibility.

Words are dangerous, but not as dangerous as efforts to suppress them, be it by government or dean — and certainly not as insidious as self-censorship. I know my views are not shared by some colleagues and students (although not a one has ever complained), particularly in this age of safe spaces, trigger warnings, microaggressions, and speech codes. And no, I am not a reactionary conservative but an old-school liberal, a devotee of John Stuart Mill and Bertrand Russell, who, in a joyful display of defiance, titled one of his collections Unpopular Essays. But on one point Russell and I part company. He said that “all movements go too far.” He is mostly right, but not with regard to free speech. Here, there is no too far. The very idea of capping speech suffocates the imagination. Another of my favorite writers, E.B. White, wrote that “in a free country it is the duty of writers to pay no attention to duty.” I’ve always loved that quote, but I think it craves an asterisk. There is a duty: to be true to oneself.

More here.

Saturday, September 3, 2016

Attica, Attica: The Story of the Legendary Prison Uprising

James Forman Jr. at the NY Times:

Attica. The name itself has long signified resistance to prison abuse and state violence. In the 1975 film “Dog Day Afternoon,” Al Pacino, playing a bank robber, leads a crowd confronting the police in a chant of “Attica, Attica.” The rapper Nas, in his classic “If I Ruled the World,” promises to “open every cell in Attica, send ’em to Africa.” And Attica posters were once commonplace in the homes of black nationalists. The one in my family’s apartment in the 1970s featured a grainy black-and-white picture of Attica’s protesting prisoners, underneath the words “We are not beasts.”

But memories of the 1971 uprising at Attica prison have grown hazy. I recently mentioned the word to a politically active Yale College student, who responded: “I know it’s a prison where something important happened. But I’m not sure of the details.”

Heather Ann Thompson, a professor of history at the University of Michigan, has the details. Thompson spent more than a decade poring over trial transcripts, issuing countless requests for hidden government documents, and interviewing dozens of survivors and witnesses.

more here.

‘I Contain Multitudes’ by Ed Yong

Tim Radford at The Guardian:

We are not alone. We have never been alone. We are possessed. Our inner demons cannot be cast out, because they did not move in and take possession: they were here before us, and will live on after us. They are invisible, insidious and exist in overwhelming numbers. They manage us in myriad ways: deliver our minerals and vitamins, help digest our lunch, and provide in different ways all our cheese, yoghurt, beer, wine, bread, bacon and beef. Microbes can affect our mood, take charge of our immune system, protect us from disease, make us ill, kill us and then decompose us.

As complex, multicellular lifeforms, we are their sock puppets. We spread them, introduce variety into their brief lives and provide them with all they need to replicate and colonise new habitats. We are perambulating tower blocks, each occupied by maybe 40tn tiny tenants. Our skins are smeared with a thin film of microbial life, with ever greater numbers occupying every orifice and employed in colossal numbers in our guts.

Yet, until late in human history, we didn’t know they were there at all. We still do not know who they all are, or what they do. We discover new things almost every day.

more here.

‘Horses, Horses, in the End the Light Remains Pure’

Robert Fay at The Quarterly Conversation:

The Japanese novelist Hideo Furukawa is interested in the “blank spaces,” as he puts it, what’s not said in official histories and school textbooks. He reminds his countrymen that violence has always been central to Japanese history. In his newly translated novel, Horses, Horses, in the End the Light Remains Pure: A Tale that Begins with Fukushima (translated by Doug Slaymaker with Akiko Takenaka), the narrator Hideo Furukawa tells us:

I’m about to touch on Japanese History. This is unbearably uncomfortable, to me anyway, all this history stuff. Our history, the history of the Japanese, is nothing more than the history of killing people.

Furukawa (born in 1966) is well-regarded in Japan as a literary novelist: in addition to receiving the Mishima Yukio Prize in 2006 for his novel Love, he has also won Japanese science fiction and mystery awards. His only other work in English is the novel Belka, Why Don’t You Bark? published by Haikasoru in 2012 (translated by Michael Emmerich). Belka is a rather curious work, at times told from the point of view of various dogs and often employing prose that reads like a poor imitation of hard-boiled fiction. It is not the best introduction to Furukawa and his considerable talents.

Furukawa is a native of Fukushima prefecture, the so-called ground zero of 3.11, and he found himself in Kyoto and Tokyo during 3.11 and its immediate aftermath. Feeling the call to go home, he convinced Shincho Publishing to underwrite a trip to Fukushima and the surrounding prefectures.

more here.

Revealing the Whole Truth About Mother Teresa

Kai Schultz in the New York Times:

Taking on a global icon of peace, faith and charity is not a task for everyone, or, really, hardly anyone at all. But that is what Dr. Aroup Chatterjee has spent a good part of his life doing as one of the most vocal critics of Mother Teresa.

Dr. Chatterjee, a 58-year-old physician, acknowledged that it was a mostly solitary pursuit. “I’m the lone Indian,” he said in an interview recently. “I had to devote so much time to her. I would have paid to do that. Well, I did pay to do that.”

His task is about to become that much tougher, of course, when Mother Teresa is declared a saint next month.

In truth, Dr. Chatterjee’s critique is as much or more about how the West perceives Mother Teresa as it is about her actual work. As the canonization approaches, Dr. Chatterjee hopes to renew a dialogue about her legacy in Kolkata, formerly Calcutta, where she began her services with the “poorest of the poor” in 1950.

More here.

For first time, carbon nanotube transistors outperform silicon

Adam Malecek in Phys.org:

For decades, scientists have tried to harness the unique properties of carbon nanotubes to create high-performance electronics that are faster or consume less power—resulting in longer battery life, faster wireless communication and faster processing speeds for devices like smartphones and laptops.

But a number of challenges have impeded the development of high-performance transistors made of carbon nanotubes, tiny cylinders made of carbon just one atom thick. Consequently, their performance has lagged far behind semiconductors such as silicon and gallium arsenide used in computer chips and personal electronics.

Now, for the first time, University of Wisconsin-Madison materials engineers have created carbon nanotube transistors that outperform state-of-the-art silicon transistors.

Led by Michael Arnold and Padma Gopalan, UW-Madison professors of materials science and engineering, the team's carbon nanotube transistors achieved current that's 1.9 times higher than silicon transistors. The researchers reported their advance in a paper published Friday (Sept. 2) in the journal Science Advances.

“This achievement has been a dream of nanotechnology for the last 20 years,” says Arnold. “Making carbon nanotube transistors that are better than silicon transistors is a big milestone. This breakthrough in carbon nanotube transistor performance is a critical advance toward exploiting carbon nanotubes in logic, high-speed communications, and other semiconductor electronics technologies.”

More here. [Thanks to Farrukh Azfar.]