Sunday, October 28, 2012

Atul Gawande: Excellence Is Recognizing Details, Failures

From Harvard Magazine:

In the professional world, what separates greatness from mere competence? Why is a cystic fibrosis treatment center in Minnesota miles ahead of a similar program in Cincinnati? Why are certain teachers getting first-rate results in the classroom when others are merely getting by?

Atul Gawande, a Harvard Medical School professor, surgeon at Brigham and Women’s Hospital, and New Yorker staff writer who has traveled the country researching answers to this question, says the answer has little to do with income level, education, or high intelligence. The key to being great at any given profession, he says, is the ability to recognize failure. “What I found over time, trying to follow and emulate people that were focused on achieving something more than competence, is that they weren’t smarter than anybody else, they weren’t geniuses,” Gawande told an audience at the Harvard Graduate School of Education’s Askwith Forum on Wednesday. “Instead they seemed to be people that could come to grips with their inherent fallibility—fallibility in the systems that they work in, and with what it took to overcome that fallibility.”

More here.

BRAINS PLUS BRAWN

Daniel Lieberman in Edge:

I've been thinking a lot about the concept of whether or not human evolution is a story of brains over brawn. I study the evolution of the human body and how and why the human body is the way it is, and I've worked a lot on both ends of the body. I'm very interested in feet and barefoot running and how our feet function, but I've also written and thought a lot about how and why our heads are the way they are. The more I study feet and heads, the more I realize that what's in the middle also matters, and that we have this very strange idea—it goes back to mythology—that human evolution is primarily a story about brains, about intelligence, about technology triumphing over brawn.

Another good example of this would be the Piltdown hoax. The Piltdown forgery was a fossil that was discovered in the early 1900s, in a pit in southern England. This fossil consisted of a modern human skull that had been stained and made to look really old, and an orangutan jaw whose teeth had been filed down and broken up, all thrown into a pit with a bunch of fake stone tools. It was exactly what Edwardian scientists were looking for, because it was an ape-like face with a big human brain, and it was found in England, so it proved that humans evolved in England, which of course made sense to any Victorian or Edwardian. It also fit with the prevailing idea at the time of Elliot Smith, that brains led the way in human evolution because, if you think about what makes us so different from other creatures, people always thought it's our brains. We have these big, enormous, large, fantastic brains that enable us to invent railways and income tax and insurance companies and all those other wonderful inventions that made the Industrial Revolution work.

More here.

Remembering Sri Lanka’s Killing Fields


Gareth Evans in Project Syndicate (illustration by Dean Rohrer):

Three years ago, in the bloody endgame of the Sri Lankan government’s war against the separatist Liberation Tigers of Tamil Eelam, some 300,000 civilians became trapped between the advancing army and the last LTTE fighters in what has been called “the cage” – a tiny strip of land, not much larger than New York City’s Central Park, between sea and lagoon in the northeast of the country.

With both sides showing neither restraint nor compassion, at least 10,000 civilians – possibly as many as 40,000 – died in the carnage that followed, as a result of indiscriminate army shelling, rebel gunfire, and denial of food and medical supplies.

The lack of outrage mainly reflects the Sri Lankan government’s success in embedding in the minds of policymakers and publics an alternative narrative that had extraordinary worldwide resonance in the aftermath of the terrorist attacks of September 11, 2001. What occurred in the cage, according to this narrative, was the long-overdue defeat, by wholly necessary and defensible means, of a murderous terrorist insurrection that had threatened the country’s very existence.

The other key reason behind the world’s silence is that the Sri Lankan government was relentless in banning independent observers – media, NGOs, or diplomats – from witnessing or reporting on its actions. And this problem was compounded by the timidity of in-country United Nations officials in communicating such information as they had.

President Mahinda Rajapaksa’s government claimed throughout, and still does, that it maintained a “zero civilian casualties” policy. Officials argued that no heavy artillery fire was ever directed at civilians or hospitals, that any collateral injury to civilians was minimal, and that they fully respected international law, including the proscription against execution of captured prisoners.

But that narrative is now being picked apart in a series of recent publications, notably the report last year of a UN Panel of Experts, and in two new books: UN official Gordon Weiss’s relentlessly analytical The Cage: The Fight for Sri Lanka and the Last Days of the Tamil Tigers, and BBC journalist Frances Harrison’s harrowingly anecdotal Still Counting the Dead: Survivors of Sri Lanka’s Hidden War.

What Can You Really Know? Another Round of Physicists vs. Philosophers


Freeman Dyson reviews Jim Holt's Why Does the World Exist?: An Existential Detective Story, in the NYRB:

The fading of philosophy came to my attention in 1979, when I was involved in the planning of a conference to celebrate the hundredth birthday of Einstein. The conference was held in Princeton, where Einstein had lived, and our largest meeting hall was too small for all the people who wanted to come. A committee was set up to decide who should be invited. When the membership of the committee was announced, there were loud protests from people who were excluded. After acrimonious discussions, we agreed to have three committees, each empowered to invite one third of the participants. One committee was for scientists, one for historians of science, and one for philosophers of science.

After the three committees had made their selections, we had three lists of names of people to be invited. I looked at the lists of names and was immediately struck by their disconnection. With a few exceptions, I knew personally all the people on the science list. On the history list, I knew the names, but I did not know the people personally. On the philosophy list, I did not even know the names.

In earlier centuries, scientists and historians and philosophers would have known one another. Newton and Locke were friends and colleagues in the English parliament of 1689, helping to establish constitutional government in England after the bloodless revolution of 1688. The bloody passions of the English Civil War were finally quieted by establishing a constitutional monarchy with limited powers. Constitutional monarchy was a system of government invented by philosophers. But in the twentieth century, science and history and philosophy had become separate cultures. We were three groups of specialists, living in separate communities and rarely speaking to each other.

When and why did philosophy lose its bite? How did it become a toothless relic of past glories? These are the ugly questions that Jim Holt’s book compels us to ask.

Literature is not Data: Against Digital Humanities


Stephen Marche in The LA Review of Books:

BIG DATA IS COMING for your books. It’s already come for everything else. All human endeavor has by now generated its own monadic mass of data, and through these vast accumulations of ciphers the robots now endlessly scour for significance much the way cockroaches scour for nutrition in the enormous bat dung piles hiding in Bornean caves. The recent Automate This, a smart book with a stupid title, offers a fascinatingly general look at the new algorithmic culture: 60 percent of trades on the stock market today take place with virtually no human oversight. Artificial intelligence has already changed health care and pop music, baseball, electoral politics, and several aspects of the law. And now, as an afterthought to an afterthought, the algorithms have arrived at literature, like an army which, having conquered Italy, turns its attention to San Marino.

The story of how literature became data in the first place is a story of several, related intellectual failures.

In 2002, on a Friday, Larry Page began to end the book as we know it. Using the 20 percent of his time that Google then allotted to its engineers for personal projects, Page and Vice-President Marissa Mayer developed a machine for turning books into data. The original was a crude plywood affair with simple clamps, a metronome, a scanner, and a blade for cutting the books into sheets. The process took 40 minutes. The first refinement Page developed was a means of digitizing books without cutting off their spines — a gesture of tender-hearted sentimentality towards print. The great disbinding was to be metaphorical rather than literal. A team of Page-supervised engineers developed an infrared camera that took into account the curvature of pages around the spine. They resurrected a long dormant piece of Optical Character Recognition software from Hewlett-Packard and released it to the open-source community for improvements. They then crowd-sourced textual correction at a minimal cost through a brilliant program called reCAPTCHA, which employs an anti-bot service to get users to read and type in words the Optical Character Recognition software can’t recognize. (A miracle of cleverness: everyone who has entered a security identification has also, without knowing it, aided the perfection of the world’s texts.) Soon after, the world’s five largest libraries signed on as partners. And, more or less just like that, literature became data.

Remarkable Facts: Ending Science As We Know It


Elliott Sober reviews Thomas Nagel's Mind and Cosmos: Why the Materialist Neo-Darwinian Conception of Nature Is Almost Certainly False, in Boston Review:

Thomas Nagel, a distinguished philosopher at NYU, is well known for his critique of “materialistic reductionism” as an account of the mind-body relationship. In his new and far-reaching book Mind and Cosmos, Nagel extends his attack on materialistic reductionism—which he describes as the thesis that physics provides a complete explanation of everything—well beyond the mind-body problem. He argues that evolutionary biology is fundamentally flawed and that physics also needs to be rethought—that we need a new way to do science.

Nagel’s new way is teleological—scientific explanations need to invoke goals, not just mechanistic causes. The conventional story of the emergence of modern science maintains that Galileo and Newton forever banished Aristotle’s teleology. So Mind and Cosmos is an audacious book, bucking the tide. Nagel acknowledges that he has no teleological theory of his own to offer. His job, as he sees it, is to point to a need; creative scientists, he hopes, will do the heavy lifting.

Nagel’s rejection of materialistic reductionism does not stem from religious conviction. He says that he doesn’t have a religious bone in his body. The new, teleological science he wants is naturalistic, not supernaturalistic. This point needs to be remembered, given that the book begins with kind words for proponents of intelligent design. Nagel applauds them for identifying problems in evolutionary theory, but he does not endorse their solution.

Nagel’s main goal in this book is not to argue against materialistic reductionism, but to explore the consequences of its being false. He has argued against the -ism elsewhere, and those who know their Nagel will be able to fill in the details. But new readers may be puzzled, so a little backstory may help.

Sunday Poem

Smell and Envy

You nature poets think you've got it, hostaged
somewhere in Vermont or Oregon,
so it blooms and withers only for you,
so all you have to do is name it: primrose
– and now you're writing poetry, and now
you ship it off to us, to smell and envy.

But we are made of newspaper and smoke
and we dunk your roses in vats of blue.
Birds don't call, our pigeons play it close
to the vest. When the moon is full
we hear it in the sirens. The Pleiades
you could probably buy downtown. Gravity
is the receiver on the hook. Mortality
we smell on certain people as they pass.

by Douglas Goetsch
from Nobody's Hell

Hanging Loose Press, Brooklyn, NY, 1999

Saturday, October 27, 2012

Lewis Lapham’s Antidote to the Age of BuzzFeed

From Smithsonian:

The counterrevolution has its embattled forward outpost on a genteel New York street called Irving Place, home to Lapham’s Quarterly. The street is named after Washington Irving, the 19th-century American author best known for creating the Headless Horseman in his short story “The Legend of Sleepy Hollow.” The cavalry charge that Lewis Lapham is now leading could be said to be one against headlessness—against the historically illiterate, heedless hordesmen of the digital revolution ignorant of our intellectual heritage; against the “Internet intellectuals” and hucksters of the purportedly utopian digital future who are decapitating our culture, trading in the ideas of some 3,000 years of civilization for…BuzzFeed.

Lapham, the legendary former editor of Harper’s, who, beginning in the 1970s, helped change the face of American nonfiction, has a new mission: taking on the Great Paradox of the digital age. Suddenly thanks to Google Books, JSTOR and the like, all the great thinkers of all the civilizations past and present are one or two clicks away. The great library of Alexandria, nexus of all the learning of the ancient world that burned to the ground, has risen from the ashes online. And yet—here is the paradox—the wisdom of the ages is in some ways more distant and difficult to find than ever, buried like lost treasure beneath a fathomless ocean of online ignorance and trivia that makes what is worthy and timeless more inaccessible than ever. There has been no great librarian of Alexandria, no accessible finder’s guide, until Lapham created his quarterly five years ago with the quixotic mission of serving as a highly selective search engine for the wisdom of the past.

More here.

Wartime Rations

From The New York Times:

I want to hate David Benioff. He’s annoyingly handsome. He’s already written a pair of unputdownable books, one of which was made into Spike Lee’s most heartbreaking film, “The 25th Hour” — for which Benioff was asked to write the screenplay, leading to a second career in Hollywood. (They should just get it over with and put the man in the movies already.) He takes his morning orange juice next to Amanda Peet. And he’s still in his 30s. See what I mean?

Benioff’s new novel reveals why there are so many Russians — not oligarchs or prostitutes, but soldiers and old babushkas — in this nice American boy’s fiction. “City of Thieves” follows a character named Lev Beniov, the son of a revered Soviet Jewish poet who was “disappeared” in the Stalinist purges, as Lev and an accomplice carry out an impossible assignment during the Nazi blockade of Leningrad. Before Lev begins to tell his story, however, a young Los Angeles screenwriter named David visits his grandfather in Florida, pleading for his memories of the siege. But this is no postmodern coquetry. In fact, the novel tells a refreshingly traditional tale, driven by an often ingenious plot. And after that first chapter Benioff is humble enough to get out of its way. For some writers, Russia inspires extravagant lamentations uttered into the eternity of those implacable winters. Happily, Benioff’s prose doesn’t draw that kind of attention to itself.

More here. (Note: This is an old review but, thanks to Abbas and Margit, I have just now read the book and recommend it strongly.)

Andrew Gelman on How Americans Vote


A Five Books interview:

I notice from your blog as well that one of the stereotypes that you are keen on debunking is this idea that working-class people in America vote conservative. A number of people have gone to some lengths to try to explain this phenomenon, but you seem to think it’s a bit of a red herring.

Somehow people on the left and on the right find it difficult to understand. On the left, people think that 100% of working-class people should vote for the left, so anything less than 100% makes them feel that there is something that went wrong. They just cannot understand how this could be. On the right, you get the opposite. It’s considered a validation – they want to believe that these more virtuous people are voting for them. But even in the days of Franklin Roosevelt and Harry Truman, a lot of low-income people voted Republican. There was no magic golden age in which lower-income working-class people were uniformly Democrat. It was always various subgroups of the population.

How many of the poor did vote for the Democrats, say, in the last election?

Of the lowest third of the population about 60% voted for the Democrats.

What if you narrow it down to blue-collar workers though? Don’t the majority of them vote conservative?

Then you have to ask, what does that exactly mean? Someone could make $100,000 a year and be blue collar. Conversely, if you’re a woman cleaning bedpans and making very little money, you’re not blue collar. Cleaning bedpans is not considered blue-collar work. There is the way that, firstly, blue collar conveys some sort of moral superiority, and secondly that it just happens to exclude a lot of the female workforce, who are more likely to be Democrats. If you take only blue collar – which is mostly male – and don’t even restrict for income and then you go beyond that to only include whites, you’re chipping away at various groups that support the Democrats, without noticing what’s happening. It sounds very innocuous to talk about blue-collar whites, but you’re selecting a subgroup among this social class which is particularly conservative, and then making some claims about them.

X-phi is Here to Stay


Richard Marshall interviews Chris Weigel in 3:AM Magazine:

3:AM: By 2009 you were enthusiastically supporting X-phi. You wrote a paper, ‘Experimental Philosophy Is Here to Stay’. Why did you write that? Was there a feeling at the time that the approach needed defending?

CW: Yes, it did need defending and explaining and sometimes still does. In 2009, I bumped into someone at a conference who said, “Oh, you’re doing that? That’s too bad. I read a paper that refutes it.” And my thought was, “Which ‘it’ are we talking about? The projects are really diverse, and it seems unlikely that one argument could refute all of them at once.” Over time, that person and the field in general have become much more sympathetic. Writing the paper was a way not so much of defending but of explaining experimental philosophy systematically. After attending the phenomenal Experimental Philosophy summer workshop directed by Ron Mallon and Shaun Nichols, I wanted to try to explain experimental philosophy to a wide audience.

3:AM: When talking about this approach to philosophy Josh Knobe, Shaun Nichols and others give the impression that it is a more collaborative approach than the traditional, armchair variety. Have you found this to be the case in your own experience? It seems very cool and unstuffy. Josh Knobe in his interview said he feared ending up as just an academic read by a couple of other academics. X-phi seems to be a way of escaping this fear. Is this something that you relate to?

CW: Yes, and if you look at how so many of the major papers have co-authors, you’ll see that experimental philosophers tend to work collaboratively. I’ve also had many more opportunities for collaboration since starting in experimental philosophy. And I think you’re right that the research tends to be, as you say, cool and unstuffy. I think of it like this: When my daughter was fifteen months old, I took her to a pumpkin patch, and she was so excited, she started uttering—screaming, really—her first sentence while pointing all around: “Look at that! Look at that! Look at that!” Experimental philosophy presentations have much the same feel. They offer a pumpkin patch full of philosophically rich ideas just waiting to be explored.

From the Naturalism Workshop, Part I


Massimo Pigliucci reports on the Naturalism Workshop conceived of by Sean Carroll, over at Rationally Speaking:

During the roundtable introductions, Dawkins (as well as the rest of us) was asked what he would be willing to change his mind about; he said he couldn’t conceive of a sensible alternative to naturalism. Rosenberg, interestingly, brought up the (hypothetical) example of finding God’s signature in a DNA molecule (just like Craig Venter has actually done). Dawkins admitted that that would do it, though immediately raised the more likely possibility that that would be a practical joke played by a superhuman — but not supernatural — intelligence. Coyne then commented that there is no sensible distinction between superhuman and supernatural, in a nod to Clarke’s third law.

There appeared to be some interesting differences within the group. For instance, Rosenberg clearly has no problem with a straightforward functionalist computational theory of the mind; DeDeo accepts it, but feels uncomfortable about it; and Deacon outright rejects it, without embracing any kind of mystical woo. Steven Weinberg asked the question of whether — if a strong version of artificial intelligence is possible — it follows that we should be nice to computers.

The first actual session was about the nature of reality, with an introduction by Alex Rosenberg. His position is self-professedly scientistic, reductionist and nihilist, as presented in his The Atheist’s Guide to Reality. (Rationally Speaking published a critical review of that book, penned by Michael Ruse.) Alex thinks that complex phenomena — including of course consciousness, free will, etc. — are not just compatible with, but determined by and reducible to, the fundamental level of physics. (Except, of course, that there appears not to be any such thing as the fundamental level, at least not in terms of micro-things and micro-bangings.)

A Matter of Taste?


William Deresiewicz in the NYT:

Foodism has taken on the sociological characteristics of what used to be known — in the days of the rising postwar middle class, when Mortimer Adler was peddling the Great Books and Leonard Bernstein was on television — as culture. It is costly. It requires knowledge and connoisseurship, which are themselves costly to develop. It is a badge of membership in the higher classes, an ideal example of what Thorstein Veblen, the great social critic of the Gilded Age, called conspicuous consumption. It is a vehicle of status aspiration and competition, an ever-present occasion for snobbery, one-upmanship and social aggression. (My farmers’ market has bigger, better, fresher tomatoes than yours.) Nobody cares if you know about Mozart or Leonardo anymore, but you had better be able to discuss the difference between ganache and couverture.

Young men once headed to the Ivy League to acquire the patina of high culture that would allow them to move in the circles of power — or if they were to the manner born, to assert their place at the top of the social heap by flashing what they already knew. Now kids at elite schools are inducted, through campus farmlets, the local/organic/sustainable fare in dining halls and osmotic absorption via their classmates from Manhattan or the San Francisco Bay Area, into the ways of food. More and more of them also look to the expressive possibilities of careers in food: the cupcake shop, the pop-up restaurant, the high-end cookie business. Food, for young people now, is creativity, commerce, politics, health, almost religion.

It took me some effort to explain to a former student recently that no, my peers did not talk about food all the time when we were her age, unless she meant which diner we were going to for breakfast. “But food is everything!” she said.