Minutes that changed the course of rock history

Brian Doyle at The American Scholar:

On a spring day in 1964, a boy walked into the Oldfield Hotel in the London suburb of Greenford—or perhaps the White Hart Hotel in Acton, or perhaps an unknown pub on London’s North Circular Road; fact slides so easily into myth—and demanded an audition from the band playing there that night.

The boy was 17 years old. He was the drummer for a surf band called the Beachcombers. He hit his drums so hard that six-inch nails had to be driven through the base of his kit into the stage to keep it from wandering off when he played. He had been playing drums for five years. He had first tried the bugle, but then he heard the American jazz drummers Gene Krupa and Philly Joe Jones, and he was enlightened, so he switched to the drums and practiced in a music store with a kindhearted and probably hard-of-hearing owner. At age 14 he quit school altogether and got a job repairing radios. Part of the reason he quit school was that his teachers thought he was a dolt: “Retarded artistically, idiotic in other respects,” wrote his art teacher.

More here.

How mortality shapes our existence

William Boyd at The New Statesman:

I want to start with a luminously beautiful – and luminously profound – quotation from Vladimir Nabokov’s autobiography Speak, Memory. He writes: “The cradle rocks above an abyss, and common sense tells us that our existence is but a brief crack of light between two eternities of darkness.”

“Common sense”. I believe that the knowledge of this state of affairs is the fundamental truth about our human nature: the fact that our lives simply amount to our individual occupation of this “brief crack of light” between two eternities of darkness shapes everything that makes us human and is responsible for everything good – and everything bad – about us.

You might argue that if you believe in a religious faith, where life and an afterlife are ordained and somehow controlled by a supernatural being – a god or gods – then this awareness of our temporal, bounded existence in time doesn’t apply. In response, you might counter-argue that religious faith is created expressly to confound and disprove this primordial conviction: a faith created, as Philip Larkin put it, to “pretend we never die”.

But whatever the nature of a faith in a supernatural being, or beings, and whatever its unprovable postulates, I am convinced that what makes our species unique among the fauna of this small planet circling its insignificant star is that we know we are trapped in time, caught briefly between these two eternities of darkness, the prenatal darkness and the posthumous one.

More here.

Kim Philby and the hazards of mistrust

Malcolm Gladwell at The New Yorker:

In December of 1961, a high-ranking K.G.B. agent knocked on the door of the U.S. Embassy in Helsinki, asking for asylum. His name was Anatoliy Golitsyn, and he had a remarkable secret to share. There had existed within the British intelligence service, he said, a “ring of five”—all of whom knew one another and all of whom had been recruited by the Soviets in the nineteen-thirties. Burgess and Maclean, who had decamped to Moscow a decade earlier, were No. 1 and No. 2. The art historian Anthony Blunt had been under suspicion by M.I.5 for some time. He was No. 3. No. 4 sounded a lot like Philby: that was why M.I.5 rekindled its investigation of him shortly thereafter. But who was the fifth? When Philby managed to escape to Moscow, concern grew. Had the mysterious fifth man tipped him off?

Within the espionage world, Golitsyn was a deeply divisive figure. Some suspected that he was a fabulist, who embroidered his accounts of K.G.B. secrets in order to extend his usefulness to Western intelligence. Two people remained firmly convinced of Golitsyn’s bona fides, however. The first was Philby’s lunchmate at the C.I.A., James Angleton. The news about Philby convinced Angleton that the C.I.A. must be riven with moles as well, and he set off on a frenzied search for traitors which consumed the American intelligence community for the next decade.

More here.

Mohammed Suliman’s Tweets from the War in Gaza

Mohammed Suliman's tweets from Gaza (via Juan Cole):

More here.

How to Unmarry Your Wife

Sarah Viren in The Morning News:

It was sweltering the day I unmarried Marta, and we weren’t even together. I was with my little brother in a Penske truck, the flat haze of West Texas rising before us like the credits at the end of a movie. Marta was with our three-month-old daughter back in Iowa, where the weather was temperate. Highs were in the 70s, lows in the 50s, and Marta was still married to me. Don McLean was coming in concert that weekend and there were drink specials at our favorite vegan restaurant. Our three-month-old baby cried for milk and slept and cried some more. A couple of days later, the two of them flew out to West Texas to join me in our new home next to a university where Marta and I both had jobs, and where we were no longer married to each other.

It’s hard to define when the act of unmarrying takes place. Were we unmarried as soon as I drove out of Iowa in that Penske van and into Missouri, where same-sex marriage is not recognized? Or was it only official once Marta joined me in Texas, where marriages like ours are outright banned? Or perhaps the real unmarrying occurred when we changed our mailing address with the post office, which would mean we were unmarried for a week without even realizing it. Getting unmarried to someone is also quite different from divorcing them. There are no legal documents to sign. There are no lawyers or judges explaining the terms to you. There is just you and your once-wife and your still-legal baby in a one-story orange brick house under the beaming sun of a West Texas neighborhood where you feel the same as you did before. Almost the same—you are both aware a difference exists, and you can also feel that something small but significant has changed.

More here.

Against Intersectionality

Justin Smith in Berfrois:

Does a Muslim Chechen migrant laborer in a provincial Siberian city – a ‘Caucasian’ if anyone ever was – enjoy ‘white privilege’? It seems offensive to suggest that he does. Of course, there is some scenario on which his children could be taken to the US and raised by Americans, and if this were to happen they would have a set of privileges denied to African adoptees. But that scenario is so remote from the actual range of advantages of which this Chechen can avail himself as he navigates his own social reality that one may as well not mention it. In his context, though racially ‘white’ by American standards, he is the object of suspicion, contempt, and exclusion. The thought that he is ‘white’ has almost certainly never crossed his mind.

Now of course there is nothing wrong in principle with focusing on our own parochial context—indeed it is our responsibility to be concerned with it, and to strive to improve it. When Kimberlé Crenshaw first introduced the intersectional approach, she had just such a focused and non-global concern, namely, to analyze the actors’ categories that come into play in government responses to domestic violence against women in the United States. But one serious problem with staying faithful to actors’ categories and thinking of local contexts in terms of ‘race’ is that this seems to imply a universal natural order in which the locally salient distinctions between different types of people are grounded. And there simply is no such order. What we find when we move to the global context, and to the longue durée, rather, is that the focus on supposedly racial physical attributes is generally an a posteriori rationalization of a prior unequal system of interaction between members of different ethnic groups. The more aggravated this inequality, typically, the more racially different the people on different sides of the ethnic divide will appear to one another.

More here.

Don’t Send Your Kid to the Ivy League

William Deresiewicz in TNR [h/t: Simon During]:

Let’s not kid ourselves: The college admissions game is not primarily about the lower and middle classes seeking to rise, or even about the upper-middle class attempting to maintain its position. It is about determining the exact hierarchy of status within the upper-middle class itself. In the affluent suburbs and well-heeled urban enclaves where this game is principally played, it is not about whether you go to an elite school. It’s about which one you go to. It is Penn versus Tufts, not Penn versus Penn State. It doesn’t matter that a bright young person can go to Ohio State, become a doctor, settle in Dayton, and make a very good living. Such an outcome is simply too horrible to contemplate.

This system is exacerbating inequality, retarding social mobility, perpetuating privilege, and creating an elite that is isolated from the society that it’s supposed to lead. The numbers are undeniable. In 1985, 46 percent of incoming freshmen at the 250 most selective colleges came from the top quarter of the income distribution. By 2000, it was 55 percent. As of 2006, only about 15 percent of students at the most competitive schools came from the bottom half. The more prestigious the school, the more unequal its student body is apt to be. And public institutions are not much better than private ones. As of 2004, 40 percent of first-year students at the most selective state campuses came from families with incomes of more than $100,000, up from 32 percent just five years earlier.

The major reason for the trend is clear. Not increasing tuition, though that is a factor, but the ever-growing cost of manufacturing children who are fit to compete in the college admissions game. The more hurdles there are, the more expensive it is to catapult your kid across them. Wealthy families start buying their children’s way into elite colleges almost from the moment they are born: music lessons, sports equipment, foreign travel (“enrichment” programs, to use the all-too-perfect term)—most important, of course, private-school tuition or the costs of living in a place with top-tier public schools. The SAT is supposed to measure aptitude, but what it actually measures is parental income, which it tracks quite closely. Today, fewer than half of high-scoring students from low-income families even enroll at four-year schools.

More here.

Writers or Missionaries?

Adam Shatz in The Nation:

Shortly after September 11, I interviewed V.S. Naipaul about his views on Islam for The New York Times Magazine. Much of what he said was predictably ugly, a provocation calculated to offend liberal sensibilities. “Non-fundamentalist Islam,” he told me, is “a contradiction.” September 11 had no cause other than “religious hate.” But Naipaul said something else that I will never forget: that ultimately, you have to make a choice—are you a writer, or are you a missionary? At the time, this remark struck me as glib, even dishonest. If anyone was a missionary, wasn’t it Naipaul, with his crude attacks on Muslims, his extreme Hindu nationalism and his snobbery, all of it dressed up as devotion to the noble calling of writing and art?

Still, the remark stayed with me. I couldn’t dismiss it; I have since seen its wisdom, although I am no fonder of Naipaul’s views now than I was then. Naipaul was evoking the tension between the writer, who describes things as he or she sees them, and the missionary or the advocate, who describes things as he or she wishes they might be under the influence of a party, movement or cause. The contrast is not as stark as Naipaul suggests, but it exists, and the more closely you analyze a society, the more you allow yourself to see and to hear, the more you experience this tension.

In Finding the Center, Naipaul writes that travel “became a necessary stimulus for me. It broadened my worldview; it showed me a changing world and took me out of my own colonial shell…. My uncertainty about my role withered; a role was not necessary. I recognized my own instincts as a traveler and was content to be myself, to be what I had always been, a looker. And I learned to look in my own way.” He continues:

To arrive in a place without knowing anyone there, and sometimes without an introduction; to learn how to move among strangers for the short time one could afford to be among them; to hold oneself in constant readiness for adventure or revelation; to allow oneself to be carried along, up to a point, by accidents; and consciously to follow up other impulses—that could be as creative and imaginative a procedure as the writing that came after. Travel of this sort became an intense experience for me. It used all the sides of my personality; I was always wound up…. There was always the possibility of failure—of not finding anything, not getting started on the chain of accidents and encounters. This gave a gambler’s excitement to every arrival. My luck held; perhaps I made it hold.

In this passage, Naipaul captures some of the most crucial aspects of reporting: an alert or receptive passivity; a willingness to expose oneself to unfamiliar and even unsettling experiences and people, to give up control and to get lost. This is not as easy as it sounds. That “readiness for adventure or revelation” has to be cultivated. As Walter Benjamin writes in his memoir Berlin Childhood Around 1900, “not to find one’s way around a city does not mean much. But to lose one’s way in a city, as one loses one’s way in a forest, requires some schooling.”

More here.

Beyond Energy, Matter, Time and Space

George Johnson in The New York Times:

Though he probably didn’t intend anything so jarring, Nicolaus Copernicus, in a 16th-century treatise, gave rise to the idea that human beings do not occupy a special place in the heavens. Nearly 500 years after replacing the Earth with the sun as the center of the cosmic swirl, we’ve come to see ourselves as just another species on a planet orbiting a star in the boondocks of a galaxy in the universe we call home. And this may be just one of many universes — what cosmologists, some more skeptically than others, have named the multiverse. Despite the long string of demotions, we remain confident, out here on the edge of nowhere, that our band of primates has what it takes to figure out the cosmos — what the writer Timothy Ferris called “the whole shebang.” New particles may yet be discovered, and even new laws. But it is almost taken for granted that everything from physics to biology, including the mind, ultimately comes down to four fundamental concepts: matter and energy interacting in an arena of space and time.

There are skeptics who suspect we may be missing a crucial piece of the puzzle. Recently, I’ve been struck by two books exploring that possibility in very different ways. There is no reason why, in this particular century, Homo sapiens should have gathered all the pieces needed for a theory of everything. In displacing humanity from a privileged position, the Copernican principle applies not just to where we are in space but to when we are in time.

More here.

Sunday, July 20, 2014

Superintelligence: We need to endow robots with human values

Clive Cookson in the Financial Times:

Since the 1950s, proponents of artificial intelligence have maintained that machines thinking like people lie just a couple of decades in the future. In Superintelligence – a thought-provoking look at the past, present and above all the future of AI – Nick Bostrom, founding director of Oxford University’s Future of Humanity Institute, starts off by mocking the futurists.

“Two decades is a sweet spot for prognosticators of radical change: near enough to be attention-grabbing and relevant, yet far enough to make it possible that a string of breakthroughs, currently only vaguely imaginable, might by then have occurred,” he writes. He notes, too, that 20 years may be close to the typical remaining duration of a forecaster’s career, limiting “the reputational risk of a bold decision”.

Yet his book is based on the premise that AI research will sooner or later produce a computer with a general intelligence (rather than a special capability such as playing chess) that matches the human brain. While the corporate old guard such as IBM has long been interested in the field, the new generation on the US West Coast is making strides. Among the leaders, Google offers PR-led glimpses into its work, from driverless cars to neural networks that learn to recognise faces as they search for images in millions of web pages.

Approaches to AI fall into two overlapping classes. One, based on neurobiology, aims to understand and emulate the workings of the human brain. The other, based on computer science, uses the inorganic architecture of electronics and appropriate software to produce intelligence, without worrying too much how people think. Bostrom makes no judgment about which is most likely to succeed.

More here.

The Free Market is an Impossible Utopia

Henry Farrell in The Monkey Cage:

Fred Block (research professor of sociology at the University of California, Davis) and Margaret Somers (professor of sociology and history at the University of Michigan) have a new book, “The Power of Market Fundamentalism: Karl Polanyi’s Critique” (Harvard University Press, 2014). The book argues that the ideas of Karl Polanyi, the author of “The Great Transformation,” a classic of 20th century political economy, are crucial if you want to understand the recession and its aftermath. I asked the authors a series of questions.

HF – Your book argues for the continued relevance of Karl Polanyi’s work, especially “The Great Transformation.” What are the ideas at the core of Polanyi’s thought?

FB & MS – Polanyi’s core thesis is that there is no such thing as a free market; there never has been, nor can there ever be. Indeed he calls the very idea of an economy independent of government and political institutions a “stark utopia”—utopian because it is unrealizable, and the effort to bring it into being is doomed to fail and will inevitably produce dystopian consequences. While markets are necessary for any functioning economy, Polanyi argues that the attempt to create a market society is fundamentally threatening to human society and the common good. In the first instance the market is simply one of many different social institutions; the second represents the effort to subject not just real commodities (computers and widgets) to market principles but virtually all of what makes social life possible, including clean air and water, education, health care, personal, legal, and social security, and the right to earn a livelihood. When these public goods and social necessities (what Polanyi calls “fictitious commodities”) are treated as if they are commodities produced for sale on the market, rather than protected rights, our social world is endangered and major crises will ensue.

Free market doctrine aims to liberate the economy from government “interference”, but Polanyi challenges the very idea that markets and governments are separate and autonomous entities. Government action is not some kind of “interference” in the autonomous sphere of economic activity; there simply is no economy without government rules and institutions. It is not just that society depends on roads, schools, a justice system, and other public goods that only government can provide. It is that all of the key inputs into the economy—land, labor, and money—are only created and sustained through continuous government action. The employment system, the arrangements for buying and selling real estate, and the supplies of money and credit are organized and maintained through the exercise of government’s rules, regulations, and powers.

More here.

The New Science of Evolutionary Forecasting

Carl Zimmer in Quanta:

If you want to understand why evolutionary biologists have been so loath to make predictions, read “Wonderful Life,” a 1989 book by the late paleontologist Stephen Jay Gould.

The book is ostensibly about the Cambrian explosion, a flurry of evolutionary innovation that took place more than 500 million years ago. The oldest known fossils of many of today’s major animal groups date to that time. Our own lineage, the vertebrates, first made an appearance in the Cambrian explosion, for example.

But Gould had a deeper question in mind as he wrote his book. If you knew everything about life on Earth half a billion years ago, could you predict that humans would eventually evolve?

Gould thought not. He even doubted that scientists could safely predict that any vertebrates would still be on the planet today. How could they, he argued, when life is constantly buffeted by random evolutionary gusts? Natural selection depends on unpredictable mutations, and once a species emerges, its fate can be influenced by all sorts of forces, from viral outbreaks to continental drift, volcanic eruptions and asteroid impacts. Our continued existence, Gould wrote, is the result of a thousand happy accidents.

To illustrate his argument, Gould had his readers imagine an experiment he called “replaying life’s tape.” “You press the rewind button and, making sure you thoroughly erase everything that actually happened, go back to any time and place in the past,” he wrote. “Then let the tape run again and see if the repetition looks at all like the original.” Gould wagered that it wouldn’t.

Although Gould only offered it as a thought experiment, the notion of replaying the tape of life has endured. That’s because nature sometimes runs experiments that capture the spirit of his proposal.

For an experiment to be predictable, it has to be repeatable. If the initial conditions are the same, the final conditions should also be the same. For example, a marble placed at the edge of a bowl and released will end up at the bottom of the bowl no matter how many times the action is repeated.
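Gould’s tape-replay thought experiment can be made concrete with a toy simulation. The sketch below is my own illustration, not from Zimmer’s article: the model, the numbers, and the function name replay_tape are invented. It evolves a population’s fitness through random mutations and selection; rerunning it with the same random seed reproduces the outcome exactly, while changing the seed (Gould’s “happy accidents”) sends the replay somewhere else.

```python
# Toy model of "replaying life's tape" (illustration only, not from the article).
# Same seed (same "accidents") gives an identical outcome; a different seed diverges.
import random

def replay_tape(seed, generations=1000):
    rng = random.Random(seed)
    fitness = 0.0
    for _ in range(generations):
        mutation = rng.gauss(0, 1)   # unpredictable mutation
        if mutation > 0:             # selection keeps only beneficial changes
            fitness += mutation
    return round(fitness, 2)

print(replay_tape(seed=1), replay_tape(seed=1))  # repeatable: same inputs, same result
print(replay_tape(seed=1), replay_tape(seed=2))  # different accidents, different history
```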

Biologists have found cases in which evolution has, in effect, run the same experiment several times over. And in some cases the results of those natural experiments have turned out very similar each time. In other words, evolution has been predictable.

More here.

This Man Wants to Genetically Engineer Trees to Save the World

Annalee Newitz in io9:

The poplar tree's genome has been sequenced and it has 42 thousand genes — roughly twice as many as a human. It turns out that this is typical for a perennial plant like the poplar. Though we animals think of ourselves as far more sophisticated than plants, Tuskan explained that trees have to be a lot tougher and more resilient than the typical animal. He explained:

Humans or mice or elephants can move. If it's cold they can go underground or build shelter. Perennial plants have to stand there and take it for thousands of years in some cases — they have to be equipped biochemically for a drought, ready for heat or cold, ready for an insect attack. I think that's part of why plants have larger arrays of genes — that's their way of surviving.

Out of all these genes, only a handful may turn out to be useful for industry. “Half of the genes have no known function,” Tuskan said, “and with lignin it's probably somewhere between a dozen and three or four dozen genes that will turn out to be important.”

A lot of what Tuskan's lab does with poplars is an effort to link the behavior of specific genes to physical traits in the tree. This kind of analysis is called a genome-wide association study or GWAS, which everybody in the field pronounces “gee wass,” like J-Lo for genome geeks. “Basically it's figuring out the genome's relationship to the phenotype,” said Tuskan.
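For readers curious what “figuring out the genome’s relationship to the phenotype” looks like in practice, here is a minimal sketch of the core GWAS calculation: regress a trait on each variant in turn and keep the variants whose association survives a multiple-testing correction. This is my illustration, not Tuskan’s pipeline; the data, sample sizes, and causal variant are simulated, and real studies also correct for population structure and relatedness, which this toy omits.

```python
# Minimal GWAS-style association scan on simulated data (illustration only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_trees, n_variants = 500, 2000

# Genotypes coded as 0/1/2 copies of the minor allele for each tree and variant.
genotypes = rng.integers(0, 3, size=(n_trees, n_variants))

# Simulate a phenotype (say, lignin content) driven by one causal variant plus noise.
causal = 42
phenotype = 0.8 * genotypes[:, causal] + rng.normal(0, 1, n_trees)

# Test each variant: simple linear regression of phenotype on genotype.
p_values = np.array([stats.linregress(genotypes[:, j], phenotype).pvalue
                     for j in range(n_variants)])

# Bonferroni-corrected threshold; surviving variants are candidate associations.
hits = np.where(p_values < 0.05 / n_variants)[0]
print("candidate variants:", hits)   # should recover the simulated causal variant
```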

He and his colleagues have already had some success isolating genes that control various aspects of the tree's metabolism. In one case, they were able to start and stop the growth of a symbiotic fungus in poplar tree roots. Ultimately, Tuskan would like to have genetic switches that control many aspects of the poplar's development. Farmers could do things like grow a tree that's designed to have more lignin or less, depending on what the market demands.

More here.

A YouTube Video Is Doctor’s Secret Weapon Against Back Pain

John Henning Schumann in NPR:

A woman in her late 20s came to see me recently because her back hurt. She works at a child care center in town where she picks up babies and small children all day long. She felt a twinge in her lower back when hoisting a fussy kid. The pain was bad enough that she went home from work early and was laid out on the couch until she came to see me the next day.

In my office she told me she had “done some damage” to her back. She was worried. She didn't want to end up like her father, who'd left his factory job in his mid-50s on disability after suffering what she called permanent damage to his back.

Back pain is common. I see someone with back pain almost every day. Nearly all of us have at least one episode in our lives, and two-thirds of us will have it repeatedly. If you've somehow lived into your 40s and never suffered low back pain, congratulations! You're what doctors like me call an outlier.

In my patient's case, I was confident that her back pain wasn't serious. A minor injury was the clear cause. And nearly all back pain like hers from a simple mechanical strain gets better on its own. I wanted to reassure her. I told her to go about her daily life. Keep exercising, but try to take it just a little bit easy until she felt better. At a minimum, I said, she should be walking 30 minutes a day. Also, try some ibuprofen, which helps with inflammation and doesn't require a prescription.

But she wasn't buying it. “Don't I need an MRI, or at least an X-ray?” she asked. “My father had three herniated discs and wound up with two back operations. He still never has a day without at least some pain.”

I upped the ante. I told her I could refer her to physical therapy, one of the few things shown to be truly helpful for low back pain. No dice. She insisted on an MRI just to be sure. A test like that wasn't warranted, in my opinion, because it would neither change her treatment nor the course of this first-ever bout of back pain. She would just get better.

To convince her of this, I had to resort to my secret weapon: I showed her an 11-minute educational video created by Dr. Mike Evans of Toronto. You may be familiar with Evans' work, even if you've never heard of him. He's the man behind the famous “23 1/2 Hours” whiteboard video that says the single-best move for health is being active for a half-hour or so a day. The video became a viral Internet sensation, racking up millions of page views, and even a shoutout on the hit TV show Orange Is the New Black. Evans is passionate about making complex medical ideas simple. He and his team have made more than a dozen whiteboard videos on health topics including how to deal with stress, acne, quitting smoking and even flatulence.

More here.