Brian Greene: How Einstein Changed the World

Brian Greene in Scientific American:

Einstein shot to fame within the scientific community in 1905, a year christened as his annus mirabilis. While working eight-hour days, six days a week at the Swiss patent office in Bern, he wrote four papers in his spare time that changed the course of physics. In March of that year he argued that light, long described as a wave, is actually composed of particles, called photons, an observation that launched quantum mechanics. Two months later, in May, Einstein's calculations provided testable predictions of the atomic hypothesis, later confirmed experimentally, cinching the case that matter is made of atoms. In June he completed the special theory of relativity, revealing that space and time behave in astonishing ways no one had ever anticipated—in short, that distances, speeds and durations are all relative, depending on the observer. And to cap it off, in September 1905 Einstein derived a consequence of special relativity, an equation that would become the world's most famous: E = mc².

Science usually progresses incrementally. Few and far between are contributions that sound the scientific alert that a radical upheaval is at hand. But here one man in one year rang the bell four times, an astonishing outpouring of creative insight. Almost immediately, the scientific establishment could sense that reverberations of Einstein's work were shifting the bedrock understanding of reality. For the wider public, however, Einstein had not yet become Einstein.

That would change on November 6, 1919.

More here.

Why Isis fights

Martin Chulov in The Guardian:

For more than a century, Dabiq was one of northern Syria’s forsaken villages, a speck on a vast agricultural plain between the Turkish border and the deserts of Iraq, which hardly seemed likely to shape the fate of nations. A weathered sign at its entrance said 4,000 people lived there, most of whom appeared to have left by 2013, driven out over time by a lack of work – and lately by insurrection. For the first three years of Syria’s civil war, the arrival of a strange car would lure bored children to the town’s otherwise empty streets, scattering cats and chickens as they scampered after it. Little else moved.

Dabiq’s few remaining men worked on the odd building project: a half-finished mosque, a humble house for one local who had just returned after 10 years labouring in Lebanon, or a fence for the shrine that was the town’s only showpiece – the tomb of Sulayman ibn Abd al-Malik. The Ummayad caliph was buried under a mound of earth in 717, which over many centuries had somehow grown into a small hill. The war was happening elsewhere, it seemed.

That was until the jihadists of Islamic State (Isis) arrived in early 2014, an event that the Dabiq elders had feared from the moment the war began – and which the new arrivals had anticipated for much longer. To the foreigners, and the leaders of the new militant juggernaut who were beckoning them, the war had by then entered a new phase that would transform the tussle for power in Syria into something far more grand and important. For them, the conflict that was slicing the country apart was not merely, as the Syrian opposition had seen it, a modern struggle between a ruthless state and a restive underclass. The jihadis instead saw themselves at the vanguard of a war that many among them believed had been preordained in the formative days of Islam.

More here.

Moynihan, New Orleans, and the Making of the Gentrification Economy

Megan French-Marcelin at nonsite:

In 1969, Daniel Patrick Moynihan sent a rushed note to then secretary of Housing and Urban Development (HUD) George Romney. Unfazed by the backlash some four years prior to The Negro Family: The Case for National Action, Moynihan, now Richard Nixon’s counselor on Urban Affairs, enumerated the obstacles city mayors were up against on the eve of the 1970s. Armed with a slew of new statistics, the academic-turned-policy wonk doubled down on his thesis that the disintegration of black family structure was at the root of urban poverty.1 However, as the impact of deindustrialization and stagnation came into focus, Moynihan now observed other discouraging elements at work in cities as well. In little more than a few years, the failure of antipoverty programs had contributed to a more serious urban crisis motivated now by two “opposing trends”: the flight of white families and “just as clearly, the social structure of the Negro poor [which] continues to deteriorate.”2 As antithetical sides of the same crisis, city administrators must now consider how dual threats—white flight and black poverty—undermined urban stability, the academician warned.

Over the course of the next decade, local administrators in New Orleans embraced these twinned crises as means to pivot focus from inequality to matters of urban demography—conveniently ignoring broader structural economic shifts. Planners argued that the consequences of this binary left local administrators with few achievable pathways to urban stability. The city, Mayor Moon Landrieu argued, had become a “city of minorities,” “poor” and “weak.”3 Consequently, the mayor said, the fiscal crisis with which American cities contended was exacerbated in New Orleans by the sheer number of residents who “require[d] services but who [we]re less capable of paying for them.”4

more here.

THE DIVINE INSPIRATION OF JIM JONES

Adam Morris at The Believer:

Though obscure in popular memory, the “cause” advocated by the Peoples Temple was nothing less than total revolution: the Jonestown agricultural settlement was intended as a utopian social experiment in communism. Jones adapted the notion of “revolutionary suicide” from Huey Newton’s autobiographical account of the early days of the Black Panther Party and his personal crusade against American imperialism and racism. The young Reverend Jones may have started out preaching in the McCarthy-era Midwest, but he claimed to have held socialist and communist convictions ever since he was a child. He was a notorious liar, but if this was one of Jones’s overstatements, it was only a slight one: his wife, Marceline, recalled that her husband had privately expressed his admiration for Mao Zedong just after they were married, in 1949, when Jones was only eighteen years old.

Peoples Temple was nominally a church that originated with a core of Pentecostal followers in Indianapolis. But the Temple’s classification as a religious organization overdetermined its representation in the American media in the 1970s. Indeed, the conservative media watchdog Accuracy in Media complained that the group’s radical Marxist objectives were mostly lost in the general confusion surrounding the Jonestown massacre.

more here.

figuring out László Krasznahorkai

Peter Marshall at The Point:

This May, after decades of steadily gaining acclaim in the English-speaking world, the Hungarian author László Krasznahorkai was awarded the Man Booker International Prize. Even with this honor, and although he has packed venues in London and New York, Krasznahorkai’s reputation for difficulty will likely continue to limit his audience. Open one of his books and you will be greeted by pages of unbroken text that take readers into a labyrinth of ideas and minute details, contradictions and verbal energy that is unlike anything else in contemporary literature.

In contrast to the overriding trend in contemporary American literature, which can be crudely, though not inaccurately, generalized as being concerned with the psychological and emotional lives of individuals in specific sociopolitical settings, Krasznahorkai’s fiction is populated by ideas, and boiling through his flood of language is a very philosophic conflict with time. Teetering on the brink of madness, characters devise systems of meaning, devote themselves to art, follow charlatans and place their faith in absurd causes in the ultimately futile attempt to halt the onslaught of change.

more here.

ON WALDEN POND

Paul Richardson in More Intelligent Life:

It is one of the great American sententiae, as sonorous and moving as the Gettysburg Address. “I went to the woods because I wished to live deliberately, to front only the essential facts of life, and see if I could not learn what it had to teach, and not, when I came to die, discover that I had not lived.” Henry David Thoreau went to the woods in 1845, living for two years and two months in a cabin he had built on the north shore of Walden Pond. The book resulting from his experiment in simplicity was published in 1854, to lukewarm reviews. A century and a half later, however, “Walden” is a fundamental text of the ecological movement, and the pond, a crucial topos of American history, has become a place of pilgrimage. I come to the woods in a taxi from Logan Airport, leaving Boston on Route 2. My taxi driver is a young Ethiopian woman with a printed headscarf wound around her head, nervous on her first day of work. We leave the highway at the turn-off for Lincoln, and up there on the exit sign I see the name in big letters: Walden Pond. It has become a destination in itself.

…Thoreau went to Walden out of conviction, but also out of necessity. In March 1845 the poet William Ellery Channing, his companion on the week-long boating trip on the Concord and Merrimack rivers that would form the basis for his first book, wrote to him, “I see nothing for you on this earth but that field which I once christened ‘Briars’; go out upon that, build yourself a hut, and there begin the grand process of devouring yourself alive.” At 27, Thoreau’s published writings had been limited to essays in the Dial, the in-house magazine of the Transcendentalists, and the Democratic Review. He was close to penniless, with no immediate prospects. Helping out in the pencil-making workshop alongside his father was far from a dream job. He was comfortable enough but feared the effects of his too easeful life. More than anything, he realised, he needed to strike out on his own. In the autumn of 1844 he had helped to raise the family’s new house on Texas Street. Now that he knew something about foundations, rafters and roofing, perhaps he would be able to build something much smaller by and for himself.

More here.

The man who wants to beat back aging

Stephen S. Hall in Science:

On a blazingly hot morning this past June, a half-dozen scientists convened in a hotel conference room in suburban Maryland for the dress rehearsal of what they saw as a landmark event in the history of aging research. In a few hours, the group would meet with officials at the U.S. Food and Drug Administration (FDA), a few kilometers away, to pitch an unprecedented clinical trial—nothing less than the first test of a drug to specifically target the process of human aging. “We think this is a groundbreaking, perhaps paradigm-shifting trial,” said Steven Austad, chairman of biology at the University of Alabama, Birmingham, and scientific director of the American Federation for Aging Research (AFAR).

After Austad’s brief introductory remarks, a scientist named Nir Barzilai tuned up his PowerPoint and launched into a practice run of the main presentation. Barzilai is a former Israeli army medical officer and head of a well-known study of centenarians based at the Albert Einstein College of Medicine in the Bronx, New York. To anyone who has seen the ebullient scientist in his natural laboratory habitat, often in a short-sleeved shirt and always cracking jokes, he looked uncharacteristically kempt in a blue blazer and dress khakis.

But his practice run kept hitting a historical speed bump. He had barely begun to explain the rationale for the trial when he mentioned, in passing, “lots of unproven, untested treatments under the category of anti-aging.” His colleagues pounced. “Nir,” interrupted S. Jay Olshansky, a biodemographer of aging from the University of Illinois, Chicago. The phrase “anti-aging … has an association that is negative.” “I wouldn’t dignify them by calling them ‘treatments,’” added Michael Pollak, director of cancer prevention at McGill University in Montreal, Canada. “They’re products.” Barzilai, a 59-year-old with a boyish mop of gray hair, wore a contrite grin. “We know the FDA is concerned about this,” he conceded, and deleted the offensive phrase.

Then he proceeded to lay out the details of an ambitious clinical trial. The group—academics all—wanted to conduct a double-blind study of roughly 3000 elderly people; half would get a placebo and half would get an old (indeed, ancient) drug for type 2 diabetes called metformin, which has been shown to modify aging in some animal studies. Because there is still no accepted biomarker for aging, the drug’s success would be judged by an unusual standard—whether it could delay the development of several diseases whose incidence increases dramatically with age: cardiovascular disease, cancer, and cognitive decline, along with mortality. When it comes to these diseases, Barzilai is fond of saying, “aging is a bigger risk factor than all of the other factors combined.”

More here.

Wednesday, September 16, 2015

AGAINST LOLITA: TRULY, A MINOR WORK

Roxana Robinson at Literary Hub:

Lolita stands as a monument to Nabokov’s resentment: it is not a novel of sexual consummation but of cultural contempt. Contempt is the true driving passion: the landscape, the characters, the narrator, the narrative are all drenched in it. Nabokov despises his characters and their bright vulgar world, with its populist architecture and cheap displays, its tawdry, ersatz culture. Lolita is a raging cry for the world Nabokov lost, which was one of refinement, perception and beauty. It’s a cry of rage at a world that represents his world’s counterpart: young, vulgar, unrefined and irredeemably seductive.

Lolita becomes the object of his perverted desire. Early in his life, in Europe, Humbert had an adolescent fling with a girl who died. Now that he has been exiled from Paradise, now that his first love is dead, America and Lolita are what Humbert must endure. Lolita herself is common, superficial and unintelligent, though Humbert ignores her mind and her sensibility. He despises the vulgar clothes and makeup, snacks and advertisements that entrance her, America’s cheap post-war offerings. He hates these philistine vulgarians for owning this huge green continent across which he drives his unwilling partner, full of meadows and mountains and sunsets, rich and open and empty, “end of the summer mountains, all hunched up, their heavy Egyptian limbs folded under folds of tawny moth-eaten plush… a last rufous mountain, with a rich rug of lucerne at its feet.” These people don’t deserve this place.

more here.

picasso the sculptor

Peter Schjeldahl at The New Yorker:

Of the scores of pieces that merit lengthy discussion, I’ll cite one: “Woman with Vase” (1933), a bronze of a plaster sculpture that, cast in cement, accompanied “Guernica” at the Spanish Pavilion of the World’s Fair in Paris, in 1937. She stands more than seven feet tall, with a bulbous head, breasts, and belly, on spindly legs. Her left arm is missing, as if ripped off. Her right arm extends far forward, clutching a tall vase. Seen from the side, the gesture suggests a tender offering. Viewed head on, it delivers a startling, knockout punch. What isn’t this work about? It conjoins Iberian antiquity and Parisian modernity, love and loss, hope and anger, celebration and mourning. Another bronze cast of it stands at Picasso’s tomb, in the Château de Vauvenargues, as a memorial and, perhaps, as a master key to the secrets of his art. Certainly, it overshadows the somewhat indulgent—and, now and then, plain silly—sculptural creations of his later years, such as the gewgaw-elaborated bronze “Little Girl Jumping Rope” (1950). Exceptions from that time include a stunning selection of his riffs on ceramic vessels, lively bent-metal maquettes for public art, and a group of six “Bathers” from 1956: flat figures, one almost nine feet tall, made of scrap wood and standing in a shared, beachlike bed of pebbles. Its éclat might well sink the hearts of contemporary installation artists.

more here.

Leftists often describe Henry Kissinger as a unique moral monster, but his intellectual framework pervades the entire national security state

Greg Grandin in The Nation:

When I told friends and colleagues that I was writing a book about the legacy of Henry Kissinger’s foreign policy, many made mention of Christopher Hitchens’s The Trial of Henry Kissinger. But I saw my purpose as antithetical to Hitchens’s polemic, which is a good example of what the great historian Charles Beard, in 1936, dismissed as the “devil theory of war”—placing the blame for militarism on a single, isolable cause: a “wicked man.” To really understand the sources of conflict, Beard argued, you had to look at the big picture, to consider the way “war is our own work,” emerging out of “the total military and economic situation.” In making the case that Kissinger should be tried—and convicted—for war crimes, Hitchens didn’t look at the big picture. Instead, he focused obsessively on the morality of one man, his devil: Henry Kissinger.

Aside from assembling the docket and gathering the accused’s wrongdoings in one place, The Trial of Henry Kissinger isn’t very useful and is actually counterproductive; righteous indignation doesn’t provide much room for understanding. Hitchens burrows deep into Kissinger’s dark heart: The statesman was implicated in horrors in Cambodia, Laos, Bangladesh, Vietnam, East Timor, Latin America, southern Africa, and Washington, DC (the assassination of Orlando Letelier), as well as against the Kurds. Readers are left waiting for Hitchens to come out and tell us what it all means (that is, besides the obvious: Kissinger is a criminal). But Hitchens never does. In the end, we learn more about the prosecutor than the would-be prosecuted; the book provides no insights into the “total situation” in which Kissinger operated, and makes no effort to explain the power of his ideas or how they tapped into the deeper intellectual currents of American history.

More here.

The Evolution of Everything by Matt Ridley – the rightwing libertarian gets it wrong

John Gray in The Guardian:

Matt Ridley has made a discovery. The natural selection that Darwin described in The Origin of Species is only a particular example of a universal process. As he tells us at the start of this book, Darwinism is “the special theory of evolution”. But there is a general theory of evolution, too, and it applies to society, money, technology, language, law, culture, music, violence, history, education, politics, God, morality. The general theory says that things do not stay the same; they change gradually but inexorably; they show “path dependence”; they show descent with modification; they show selective persistence.

In the course of the book’s 16 chapters, which deal with the evolution of everything from the internet to leadership, Ridley repeats this mantra many times: Darwin’s mechanism of selective survival resulting in cumulative complexity applies to human culture in all its aspects, too. Our habits and institutions, from language to cities, are constantly changing, and the mechanism of change turns out to be surprisingly Darwinian: it is gradual, undirected, mutational, inexorable, combinatorial, selective and “in some sense vaguely progressive”.

It’s curious that Ridley thinks this a new idea. There is nothing at all novel in theories of social evolution. I have a vivid memory of listening to the late FA Hayek, some 30 years ago, lecturing on what he called “the natural selection of religions” – a supposedly Darwinian process in which the religions that survive and spread are those that promote private property and market exchange and thereby support growing numbers of believers. I recall wondering how this account squared with the actual history of religion. The polytheistic cults of Greece and Rome didn’t die out in an incremental process of evolutionary decline; they were stamped out when the emperor Constantine converted to Christianity. If Tibet’s brand of Buddhism disappears from the country, or the Baha’i faith vanishes from Iran, the reason won’t be that these faiths suffer from any evolutionary disadvantage. It will be because state power has been used to destroy them.

More here.

Genetics: Dawkins, redux

Nathaniel Comfort in Nature:

A curious stasis underlies Dawkins's thought. His biomorphs are grounded in 1970s assumptions. Back then, with rare exceptions, each gene specified a protein and each protein was specified by a gene. The genome was a linear text — a parts list or computer program for making an organism — insulated from the environment, with the coding regions interspersed with “junk”. Today's genome is much more than a script: it is a dynamic, three-dimensional structure, highly responsive to its environment and almost fractally modular. Genes may be fragmentary, with far-flung chunks of DNA sequence mixed and matched in bewildering combinatorial arrays. A universe of regulatory and modulatory elements hides in the erstwhile junk. Genes cooperate, evolving together as units to produce traits. Many researchers continue to find selfish DNA a productive idea, but taking the longer view, the selfish gene per se is looking increasingly like a twentieth-century construct.

Dawkins's synopsis shows that he has not adapted to this view. He nods at cooperation among genes, but assimilates it as a kind of selfishness. The microbiome and the 3D genome go unnoticed. Epigenetics is an “interesting, if rather rare, phenomenon” enjoying its “fifteen minutes of pop science voguery”, which it has been doing since at least 2009, when Dawkins made the same claim in The Greatest Show on Earth (Transworld). Dawkins adheres to a deterministic language of “genes for” traits. As I and other historians have shown, such hereditarianism plays into the hands of the self-styled race realists (N. Comfort Nature 513, 306–307; 2014).

His writing can still sparkle. He excels at capturing the scenes behind a scene, deftly explaining a scientific principle, capping a story with an amusing anecdote. His tale of palaeoanthropologist Richard Leakey hauling his legs (amputated after a plane crash) to Kenya in his hand luggage for burial is funny and touching. Dawkins also makes an important case for the “poetic” side of science, arguing that the imperative to justify research in terms of potential medical or financial benefits bleeds the beauty out of it. Amen. At such moments, one feels transported to a tweedy evening at Oxford, pouring the sherry as a charming senior faculty member holds court. But too often, the professor rambles. He quotes friends' and colleagues' tributes from dust-jackets and afterwords. He mentions the fish genus Dawkinsia. He repeatedly slams his late rival, Gould (“whose genius for getting things wrong matched the eloquence with which he did so”). His digressions often come off as twee and self-indulgent. Mentioning the limping family dog, Bunch, in an apt example of an acquired characteristic that cannot be inherited, he is reminded of an unfinished poem his mother wrote after Bunch died, which he prints. “If you can't be sentimental in an autobiography, when can you?” he asks.

For a time, Dawkins was a rebellious scientific rock star. Now, his critique of religion seems cranky, and his immovably genocentric universe is parochial. Brief Candle is about as edgy as Sir Mick and the Rolling Stones cranking out the 3,578th rendition of 'Brown Sugar' — a treat for fans, but reinscribing boundaries rather than crossing them.

More here.

Wednesday Poem

Possibilities

I prefer movies.
I prefer cats.
I prefer the oaks along the Warta.
I prefer Dickens to Dostoyevsky.
I prefer myself liking people
to myself loving mankind.
I prefer keeping a needle and thread on hand, just in case.
I prefer the color green.
I prefer not to maintain
that reason is to blame for everything.
I prefer exceptions.
I prefer to leave early.
I prefer talking to doctors about something else.
I prefer the old fine-lined illustrations.
I prefer the absurdity of writing poems
to the absurdity of not writing poems.
I prefer, where love's concerned, nonspecific anniversaries
that can be celebrated every day.
I prefer moralists
who promise me nothing.
I prefer cunning kindness to the over-trustful kind.
I prefer the earth in civvies.
I prefer conquered to conquering countries.
I prefer having some reservations.
I prefer the hell of chaos to the hell of order.
I prefer Grimms' fairy tales to the newspapers' front pages.
I prefer leaves without flowers to flowers without leaves.
I prefer dogs with uncropped tails.
I prefer light eyes, since mine are dark.
I prefer desk drawers.
I prefer many things that I haven't mentioned here
to many things I've also left unsaid.
I prefer zeroes on the loose
to those lined up behind a cipher.
I prefer the time of insects to the time of stars.
I prefer to knock on wood.
I prefer not to ask how much longer and when.
I prefer keeping in mind even the possibility
that existence has its own reason for being.

by Wislawa Szymborska
from Map: Collected and Last Poems
translated by S. Baranczak & C. Cavanagh

How a Black Man From Missouri Transformed Himself Into the Indian Liberace

Liesl Bradner in The New Republic:

Before Liberace, there was Korla Pandit. He was a pianist from New Delhi, India, and dazzled national audiences in the 1950s with his unique keyboard skills and exotic compositions on the Hammond B3 organ. He appeared on Los Angeles local television in 900 episodes of his show, “Korla Pandit’s Adventures in Music”, smartly dressed in a suit and tie or silk brocade Nehru jacket and cloaked in a turban adorned with a single shimmering jewel. The mysterious, spiritual Indian man with a hypnotic gaze and sly grin was transfixing.

Offstage, Korla—known as the “Godfather of Exotica“— was living the American dream: he had a house in the Hollywood hills, a beautiful blonde wife, two kids, and a social circle that included Errol Flynn and Bob Hope. He even had his own floral-decorated organ float in the Rose Bowl parade in 1953.

Like most everything in Hollywood, it was all smoke and mirrors. His charade wasn’t his stage name—it was his race. Korla Pandit, born John Roland Redd, was a light-skinned black man from St. Louis, Missouri. It was a secret he kept until the day he died.

A new documentary, Korla, explores Pandit’s extraordinary life and career. Filmmakers John Turner and Eric Christiansen grew up in the Bay Area watching Korla on TV and listening to his music. The two worked together for 35 years at KGO-TV in San Francisco, where Korla had a live show in 1964. Both fell under his spell before learning the truth in a Los Angeles Magazine exposé in 2001, three years after Pandit’s death. “He was a slight man with a beatific smile who was spouting pearls of wisdom about how we could get along better and the universal language of music,” Turner told me. “Why question a person like that?”

More here.

Economists vs. Economics

Dani Rodrik in Project Syndicate:

Ever since the late nineteenth century, when economics, increasingly embracing mathematics and statistics, developed scientific pretensions, its practitioners have been accused of a variety of sins. The charges – including hubris, neglect of social goals beyond incomes, excessive attention to formal techniques, and failure to predict major economic developments such as financial crises – have usually come from outsiders, or from a heterodox fringe. But lately it seems that even the field’s leaders are unhappy.

Paul Krugman, a Nobel laureate who also writes a newspaper column, has made a habit of slamming the latest generation of models in macroeconomics for neglecting old-fashioned Keynesian truths. Paul Romer, one of the originators of new growth theory, has accused some leading names, including the Nobel laureate Robert Lucas, of what he calls “mathiness” – using math to obfuscate rather than clarify.

Richard Thaler, a distinguished behavioral economist at the University of Chicago, has taken the profession to task for ignoring real-world behavior in favor of models that assume people are rational optimizers. And finance professor Luigi Zingales, also at the University of Chicago, has charged that his fellow finance specialists have led society astray by overstating the benefits produced by the financial industry.

This kind of critical examination by the discipline’s big names is healthy and welcome – especially in a field that has often lacked much self-reflection. I, too, have taken aim at the discipline’s sacred cows – free markets and free trade – often enough.

But there is a disconcerting undertone to this new round of criticism that needs to be made explicit – and rejected. Economics is not the kind of science in which there could ever be one true model that works best in all contexts. The point is not “to reach a consensus about which model is right,” as Romer puts it, but to figure out which model applies best in a given setting.

More here.

There Is No Theory of Everything

Simon Critchley in the NYT's The Stone:

Over the years, I have had the good fortune to teach a lot of graduate students, mostly in philosophy, and have noticed a recurring fact. Behind every new graduate student stands an undergraduate teacher. This is someone who opened the student’s eyes and ears to the possibility of the life of the mind that they had perhaps imagined but scarcely believed was within their reach. Someone who, through the force of their example, animated a desire to read more, study more and know more. Someone in whom the student heard something fascinating or funny or just downright strange. Someone who heard something significant in what the student said in a way that gave them confidence and self-belief. Such teachers are the often unknown and usually unacknowledged (and underpaid) heroes of the world of higher education.

Some lucky people have several such teachers. This was the case with me. But there is usually one teacher who sticks out and stays in one’s mind, and whose words resound down through the years. These are teachers who become repositories for all sorts of anecdotes, who are fondly recalled through multiple bon mots and jokes told by their former students. It is also very often the case that the really good teachers don’t write or don’t write that much. They are not engaged in “research,” whatever that benighted term means with respect to the humanities. They teach. They talk. Sometimes they even listen and ask questions.

In relation to philosophy, this phenomenon is hardly new. The activity of philosophy begins with Socrates, who didn’t write and about whom many stories were told. Plato and others, like Xenophon, wrote them down and we still read them. It is very often the case that the center of a vivid philosophical culture is held by figures who don’t write but who exist only through the stories that are told about them. One thinks of Sidney Morgenbesser, long-time philosophy professor at Columbia, whom I once heard described as a “mind on the loose.” The philosopher Robert Nozick said of his undergraduate education that he “majored in Sidney Morgenbesser.” On his deathbed, Morgenbesser is said to have asked: “Why is God making me suffer so much? Just because I don’t believe in him?”

These anecdotes seem incidental, but they are very important. They become a way of both revering the teacher and humanizing them, both building them up and belittling them, giving us a feeling of intimacy with them, keeping them within human reach. Often the litmus test of an interesting philosopher is how many stories circulate about them.

More here.

Tuesday, September 15, 2015

Deep Learning Machine Teaches Itself Chess in 72 Hours

Over at MIT Technology Review:

It’s been almost 20 years since IBM’s Deep Blue supercomputer beat the reigning world chess champion, Garry Kasparov, for the first time under standard tournament rules. Since then, chess-playing computers have become significantly stronger, leaving the best humans little chance even against a modern chess engine running on a smartphone.

But while computers have become faster, the way chess engines work has not changed. Their power relies on brute force, the process of searching through all possible future moves to find the best next one.

Of course, no human can match that or come anywhere close. While Deep Blue was searching some 200 million positions per second, Kasparov was probably searching no more than five a second. And yet he played at essentially the same level. Clearly, humans have a trick up their sleeve that computers have yet to master.

This trick is in evaluating chess positions and narrowing down the most profitable avenues of search. That dramatically simplifies the computational task because it prunes the tree of all possible moves to just a few branches.
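The idea of cutting off the search at a fixed depth and scoring the frontier with an evaluation function can be sketched in a few lines. This is a generic, illustrative depth-limited minimax over a toy game, not the actual Giraffe engine; all names and the toy game itself are invented for illustration.

```python
# Illustrative sketch: depth-limited minimax where an evaluation
# function scores positions at the search horizon, so the tree of
# all possible moves is cut off after a few plies instead of being
# expanded to the end of the game.

def minimax(state, depth, maximizing, moves, apply_move, evaluate):
    """Best achievable score from `state`, looking `depth` plies
    ahead and scoring horizon positions with `evaluate`."""
    legal = moves(state)
    if depth == 0 or not legal:
        return evaluate(state)  # evaluation stands in for deeper search
    if maximizing:
        return max(minimax(apply_move(state, m), depth - 1, False,
                           moves, apply_move, evaluate) for m in legal)
    return min(minimax(apply_move(state, m), depth - 1, True,
                       moves, apply_move, evaluate) for m in legal)

if __name__ == "__main__":
    # Toy "game": a state is an integer, each move adds or subtracts
    # 1 or 2, and the evaluation is simply the state's value.
    moves = lambda s: [1, 2, -1, -2]
    apply_move = lambda s, m: s + m
    evaluate = lambda s: s
    # A 3-ply search visits 4**3 = 64 horizon positions, versus
    # 4**6 = 4096 for a full 6-ply expansion of the same tree.
    print(minimax(0, 3, True, moves, apply_move, evaluate))  # prints 2
```

A real engine's strength then depends almost entirely on how good `evaluate` is; the article's point is that Giraffe learns that function rather than having it hand-tuned.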

Computers have never been good at this, but today that changes thanks to the work of Matthew Lai at Imperial College London. Lai has created an artificial intelligence machine called Giraffe that has taught itself to play chess by evaluating positions much more like humans and in an entirely different way to conventional chess engines.

Straight out of the box, the new machine plays at the same level as the best conventional chess engines, many of which have been fine-tuned over many years. On a human level, it is equivalent to FIDE International Master status, placing it within the top 2.2 percent of tournament chess players.

More here.

The Most Misread Poem in America

David Orr in the Paris Review:

On a word-for-word basis, it may be the most popular piece of literature ever written by an American.

And almost everyone gets it wrong. This is the most remarkable thing about “The Road Not Taken”—not its immense popularity (which is remarkable enough), but the fact that it is popular for what seem to be the wrong reasons. It’s worth pausing here to underscore a truth so obvious that it is often taken for granted: Most widely celebrated artistic projects are known for being essentially what they purport to be. When we play “White Christmas” in December, we correctly assume that it’s a song about memory and longing centered around the image of snow falling at Christmas. When we read Joyce’s Ulysses, we correctly assume that it’s a complex story about a journey around Dublin as filtered through many voices and styles. A cultural offering may be simple or complex, cooked or raw, but its audience nearly always knows what kind of dish is being served.

Frost’s poem turns this expectation on its head. Most readers consider “The Road Not Taken” to be a paean to triumphant self-assertion (“I took the one less traveled by”), but the literal meaning of the poem’s own lines seems completely at odds with this interpretation.

More here. [For my sister Azra.]