The Nagel debate continues…


In Mind and Cosmos, Nagel holds that materialism can’t deliver the goods. Drawing on his bolder and more recent paper “The Psychophysical Nexus,” he now says that materialistic reductionism is false, not that we currently don’t understand how it could be true. For Nagel, perception and other psychological processes involve irreducibly subjective facts; important aspects of the mind are, therefore, forever beyond the reach of physical explanation. This position is compatible with many doctrines that are associated with materialism. For example, Nagel doesn’t gainsay the slogan “no difference without a physical difference”—if you and I have different psychological properties, then we must be physically different. Indeed, Nagel’s position is even compatible with the idea that every mental property is identical with some physical property—for example, it may be that being in pain and being in some neurophysiological state X are identical in the same way that being made of water and being made of H2O are identical properties.

more from Elliott Sober at The Boston Review here.

Mind games: Why everything you thought you knew about yourself is wrong

From The Independent:

So you remember your wedding day like it was yesterday. You can spot when something is of high quality. You keep yourself well-informed about current affairs but would be open to debate and discussion. You love your phone because it's the best, right? Are you sure? David McRaney from Hattiesburg, Mississippi, is here to tell you that you don't know yourself as well as you think. The journalist and self-described psychology nerd's new book, You Are Not So Smart, consists of 48 short chapters on the assorted ways that we mislead ourselves every day. “The central theme is that you are the unreliable narrator in the story of your life. And this is because you're unaware of how unaware you are,” says McRaney. “It's fun to go through legitimate scientific research and pull out all of the examples that show how everyone, no matter how smart or educated or experienced, is radically self-deluded in predictable and quantifiable ways.” Based on the blog of the same name, You Are Not So Smart is not so much a self-help book as a self-hurt book. Here McRaney gives some key examples.

The Misconception: Your opinions are the result of years of rational, objective analysis.

The Truth: Your opinions are the result of years of paying attention to information that confirmed what you believed, while ignoring information that challenged your preconceived notions.

More here.

Killing the Computer to Save It

From The New York Times:

Many people cite Albert Einstein’s aphorism “Everything should be made as simple as possible, but no simpler.” Only a handful, however, have had the opportunity to discuss the concept with the physicist over breakfast. One of those is Peter G. Neumann, now an 80-year-old computer scientist at SRI International, a pioneering engineering research laboratory here.

As an applied-mathematics student at Harvard, Dr. Neumann had a two-hour breakfast with Einstein on Nov. 8, 1952. What the young math student took away was a deeply held philosophy of design that has remained with him for six decades and has been his governing principle of computing and computer security. For many of those years, Dr. Neumann (pronounced NOY-man) has remained a voice in the wilderness, tirelessly pointing out that the computer industry has a penchant for repeating the mistakes of the past. He has long been one of the nation’s leading specialists in computer security, and early on he predicted that the security flaws that have accompanied the pell-mell explosion of the computer and Internet industries would have disastrous consequences.

More here.

Monday, October 29, 2012

Sunday, October 28, 2012

Atul Gawande: Excellence Is Recognizing Details, Failures

From Harvard Magazine:

In the professional world, what separates greatness from mere competence? Why is a cystic fibrosis treatment center in Minnesota miles ahead of a similar program in Cincinnati? Why are certain teachers getting first-rate results in the classroom when others are merely getting by?

Atul Gawande, a Harvard Medical School professor, surgeon at Brigham and Women’s Hospital, and New Yorker staff writer who has traveled the country researching answers to this question, says the answer has little to do with income level, education, or high intelligence. The key to being great at any given profession, he says, is the ability to recognize failure. “What I found over time, trying to follow and emulate people that were focused on achieving something more than competence, is that they weren’t smarter than anybody else, they weren’t geniuses,” Gawande told an audience at the Harvard Graduate School of Education’s Askwith Forum on Wednesday. “Instead they seemed to be people that could come to grips with their inherent fallibility—fallibility in the systems that they work in, and with what it took to overcome that fallibility.”

More here.

BRAINS PLUS BRAWN

Daniel Lieberman in Edge:

I've been thinking a lot about the concept of whether or not human evolution is a story of brains over brawn. I study the evolution of the human body and how and why the human body is the way it is, and I've worked a lot on both ends of the body. I'm very interested in feet and barefoot running and how our feet function, but I've also written and thought a lot about how and why our heads are the way they are. The more I study feet and heads, the more I realize that what's in the middle also matters, and that we have this very strange idea—it goes back to mythology—that human evolution is primarily a story about brains, about intelligence, about technology triumphing over brawn.

Another good example of this would be the Piltdown hoax. The Piltdown forgery was a fossil that was discovered in the early 1900s, in a pit in southern England. This fossil consisted of a modern human skull that had been stained and made to look really old, and an orangutan jaw whose teeth had been filed down and broken up, all thrown into a pit with a bunch of fake stone tools. It was exactly what Edwardian scientists were looking for, because it was an ape-like face with a big human brain, and also it evolved in England, so it proved that humans evolved in England, which of course made sense to any Victorian or Edwardian. It also fit with the prevailing idea at the time of Elliot Smith, that brains led the way in human evolution because, if you think about what makes us so different from other creatures, people always thought it's our brains. We have these big, enormous, large, fantastic brains that enable us to invent railways and income tax and insurance companies and all those other wonderful inventions that made the Industrial Revolution work.

More here.

Remembering Sri Lanka’s Killing Fields


Gareth Evans in Project Syndicate (illustration by Dean Rohrer):

Three years ago, in the bloody endgame of the Sri Lankan government’s war against the separatist Liberation Tigers of Tamil Eelam, some 300,000 civilians became trapped between the advancing army and the last LTTE fighters in what has been called “the cage” – a tiny strip of land, not much larger than New York City’s Central Park, between sea and lagoon in the northeast of the country.

With both sides showing neither restraint nor compassion, at least 10,000 civilians – possibly as many as 40,000 – died in the carnage that followed, as a result of indiscriminate army shelling, rebel gunfire, and denial of food and medical supplies.

The lack of outrage mainly reflects the Sri Lankan government’s success in embedding in the minds of policymakers and publics an alternative narrative that had extraordinary worldwide resonance in the aftermath of the terrorist attacks of September 11, 2001. What occurred in the cage, according to this narrative, was the long-overdue defeat, by wholly necessary and defensible means, of a murderous terrorist insurrection that had threatened the country’s very existence.

The other key reason behind the world’s silence is that the Sri Lankan government was relentless in banning independent observers – media, NGOs, or diplomats – from witnessing or reporting on its actions. And this problem was compounded by the timidity of in-country United Nations officials in communicating such information as they had.

President Mahinda Rajapaksa’s government claimed throughout, and still does, that it maintained a “zero civilian casualties” policy. Officials argued that no heavy artillery fire was ever directed at civilians or hospitals, that any collateral injury to civilians was minimal, and that they fully respected international law, including the proscription against execution of captured prisoners.

But that narrative is now being picked apart in a series of recent publications, notably the report last year of a UN Panel of Experts, and in two new books: UN official Gordon Weiss’s relentlessly analytical The Cage: The Fight for Sri Lanka and the Last Days of the Tamil Tigers, and BBC journalist Frances Harrison’s harrowingly anecdotal Still Counting the Dead: Survivors of Sri Lanka’s Hidden War.

What Can You Really Know? Another Round of Physicists vs. Philosophers


Freeman Dyson reviews Jim Holt's Why Does the World Exist?: An Existential Detective Story, in the NYRB:

The fading of philosophy came to my attention in 1979, when I was involved in the planning of a conference to celebrate the hundredth birthday of Einstein. The conference was held in Princeton, where Einstein had lived, and our largest meeting hall was too small for all the people who wanted to come. A committee was set up to decide who should be invited. When the membership of the committee was announced, there were loud protests from people who were excluded. After acrimonious discussions, we agreed to have three committees, each empowered to invite one third of the participants. One committee was for scientists, one for historians of science, and one for philosophers of science.

After the three committees had made their selections, we had three lists of names of people to be invited. I looked at the lists of names and was immediately struck by their disconnection. With a few exceptions, I knew personally all the people on the science list. On the history list, I knew the names, but I did not know the people personally. On the philosophy list, I did not even know the names.

In earlier centuries, scientists and historians and philosophers would have known one another. Newton and Locke were friends and colleagues in the English parliament of 1689, helping to establish constitutional government in England after the bloodless revolution of 1688. The bloody passions of the English Civil War were finally quieted by establishing a constitutional monarchy with limited powers. Constitutional monarchy was a system of government invented by philosophers. But in the twentieth century, science and history and philosophy had become separate cultures. We were three groups of specialists, living in separate communities and rarely speaking to each other.

When and why did philosophy lose its bite? How did it become a toothless relic of past glories? These are the ugly questions that Jim Holt’s book compels us to ask.

Literature is not Data: Against Digital Humanities


Stephen Marche in The LA Review of Books:

BIG DATA IS COMING for your books. It’s already come for everything else. All human endeavor has by now generated its own monadic mass of data, and through these vast accumulations of ciphers the robots now endlessly scour for significance much the way cockroaches scour for nutrition in the enormous bat dung piles hiding in Bornean caves. The recent Automate This, a smart book with a stupid title, offers a fascinatingly general look at the new algorithmic culture: 60 percent of trades on the stock market today take place with virtually no human oversight. Artificial intelligence has already changed health care and pop music, baseball, electoral politics, and several aspects of the law. And now, as an afterthought to an afterthought, the algorithms have arrived at literature, like an army which, having conquered Italy, turns its attention to San Marino.

The story of how literature became data in the first place is a story of several, related intellectual failures.

In 2002, on a Friday, Larry Page began to end the book as we know it. Using the 20 percent of his time that Google then allotted to its engineers for personal projects, Page and Vice-President Marissa Mayer developed a machine for turning books into data. The original was a crude plywood affair with simple clamps, a metronome, a scanner, and a blade for cutting the books into sheets. The process took 40 minutes. The first refinement Page developed was a means of digitizing books without cutting off their spines — a gesture of tender-hearted sentimentality towards print. The great disbinding was to be metaphorical rather than literal. A team of Page-supervised engineers developed an infrared camera that took into account the curvature of pages around the spine. They resurrected a long dormant piece of Optical Character Recognition software from Hewlett-Packard and released it to the open-source community for improvements. They then crowd-sourced textual correction at a minimal cost through a brilliant program called reCAPTCHA, which employs an anti-bot service to get users to read and type in words the Optical Character Recognition software can’t recognize. (A miracle of cleverness: everyone who has entered a security identification has also, without knowing it, aided the perfection of the world’s texts.) Soon after, the world’s five largest libraries signed on as partners. And, more or less just like that, literature became data.
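The reCAPTCHA trick Marche describes can be made concrete with a small sketch. The idea, as the passage explains it, is that each challenge pairs a "control" word whose answer is already known with a "candidate" word the OCR software failed to read; a user who types the control word correctly is treated as human and trusted, and their reading of the candidate word is counted as one vote toward a correct transcription. This is a hypothetical illustration of that voting scheme, not the actual reCAPTCHA implementation — all class and parameter names here are invented for the example.

```python
from collections import Counter

class CandidateWord:
    """Accumulates trusted readings of one word the OCR could not recognize."""

    def __init__(self, required_votes=3):
        self.votes = Counter()           # reading -> number of trusted votes
        self.required_votes = required_votes

    def record(self, control_expected, control_typed, candidate_typed):
        """Count the candidate reading only if the control word was typed right.

        Returns the accepted transcription once enough trusted users agree,
        otherwise None.
        """
        if control_typed.strip().lower() != control_expected.strip().lower():
            return None  # failed the known word: likely a bot or a careless user
        self.votes[candidate_typed.strip().lower()] += 1
        reading, count = self.votes.most_common(1)[0]
        return reading if count >= self.required_votes else None

word = CandidateWord()
word.record("morning", "morning", "quarterly")    # trusted vote 1
word.record("morning", "morning", "quarterly")    # trusted vote 2
word.record("morning", "mornng", "qvarterly")     # rejected: control word wrong
accepted = word.record("morning", "morning", "quarterly")  # vote 3: accepted
print(accepted)  # -> quarterly
```

The design choice worth noticing is that the system never needs to know the candidate word's true reading in advance; agreement among users who independently passed the control check stands in for ground truth.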

Remarkable Facts: Ending Science As We Know It


Elliott Sober reviews Thomas Nagel's Mind and Cosmos: Why the Materialist Neo-Darwinian Conception of Nature Is Almost Certainly False, in Boston Review:

Thomas Nagel, a distinguished philosopher at NYU, is well known for his critique of “materialistic reductionism” as an account of the mind-body relationship. In his new and far-reaching book Mind and Cosmos, Nagel extends his attack on materialistic reductionism—which he describes as the thesis that physics provides a complete explanation of everything—well beyond the mind-body problem. He argues that evolutionary biology is fundamentally flawed and that physics also needs to be rethought—that we need a new way to do science.

Nagel’s new way is teleological—scientific explanations need to invoke goals, not just mechanistic causes. The conventional story of the emergence of modern science maintains that Galileo and Newton forever banished Aristotle’s teleology. So Mind and Cosmos is an audacious book, bucking the tide. Nagel acknowledges that he has no teleological theory of his own to offer. His job, as he sees it, is to point to a need; creative scientists, he hopes, will do the heavy lifting.

Nagel’s rejection of materialistic reductionism does not stem from religious conviction. He says that he doesn’t have a religious bone in his body. The new, teleological science he wants is naturalistic, not supernaturalistic. This point needs to be remembered, given that the book begins with kind words for proponents of intelligent design. Nagel applauds them for identifying problems in evolutionary theory, but he does not endorse their solution.

Nagel’s main goal in this book is not to argue against materialistic reductionism, but to explore the consequences of its being false. He has argued against the -ism elsewhere, and those who know their Nagel will be able to fill in the details. But new readers may be puzzled, so a little backstory may help.

Sunday Poem

Smell and Envy

You nature poets think you've got it, hostaged
somewhere in Vermont or Oregon,
so it blooms and withers only for you,
so all you have to do is name it: primrose
– and now you're writing poetry, and now
you ship it off to us, to smell and envy.

But we are made of newspaper and smoke
and we dunk your roses in vats of blue.
Birds don't call, our pigeons play it close
to the vest. When the moon is full
we hear it in the sirens. The Pleiades
you could probably buy downtown. Gravity
is the receiver on the hook. Mortality
we smell on certain people as they pass.

by Douglas Goetsch
from Nobody's Hell

Hanging Loose Press, Brooklyn, NY, 1999

Saturday, October 27, 2012

Lewis Lapham’s Antidote to the Age of BuzzFeed

From Smithsonian:

The counterrevolution has its embattled forward outpost on a genteel New York street called Irving Place, home to Lapham’s Quarterly. The street is named after Washington Irving, the 19th-century American author best known for creating the Headless Horseman in his short story “The Legend of Sleepy Hollow.” The cavalry charge that Lewis Lapham is now leading could be said to be one against headlessness—against the historically illiterate, heedless hordesmen of the digital revolution ignorant of our intellectual heritage; against the “Internet intellectuals” and hucksters of the purportedly utopian digital future who are decapitating our culture, trading in the ideas of some 3,000 years of civilization for…BuzzFeed.

Lapham, the legendary former editor of Harper’s, who, beginning in the 1970s, helped change the face of American nonfiction, has a new mission: taking on the Great Paradox of the digital age. Suddenly thanks to Google Books, JSTOR and the like, all the great thinkers of all the civilizations past and present are one or two clicks away. The great library of Alexandria, nexus of all the learning of the ancient world that burned to the ground, has risen from the ashes online. And yet—here is the paradox—the wisdom of the ages is in some ways more distant and difficult to find than ever, buried like lost treasure beneath a fathomless ocean of online ignorance and trivia that makes what is worthy and timeless more inaccessible than ever. There has been no great librarian of Alexandria, no accessible finder’s guide, until Lapham created his quarterly five years ago with the quixotic mission of serving as a highly selective search engine for the wisdom of the past.

More here.

Wartime Rations

From The New York Times:

I want to hate David Benioff. He’s annoyingly handsome. He’s already written a pair of unputdownable books, one of which was made into Spike Lee’s most heartbreaking film, “The 25th Hour” — for which Benioff was asked to write the screenplay, leading to a second career in Hollywood. (They should just get it over with and put the man in the movies already.) He takes his morning orange juice next to Amanda Peet. And he’s still in his 30s. See what I mean?

Benioff’s new novel reveals why there are so many Russians — not oligarchs or prostitutes, but soldiers and old babushkas — in this nice American boy’s fiction. “City of Thieves” follows a character named Lev Beniov, the son of a revered Soviet Jewish poet who was “disappeared” in the Stalinist purges, as Lev and an accomplice carry out an impossible assignment during the Nazi blockade of Leningrad. Before Lev begins to tell his story, however, a young Los Angeles screenwriter named David visits his grandfather in Florida, pleading for his memories of the siege. But this is no postmodern coquetry. In fact, the novel tells a refreshingly traditional tale, driven by an often ingenious plot. And after that first chapter Benioff is humble enough to get out of its way. For some writers, Russia inspires extravagant lamentations uttered into the eternity of those implacable winters. Happily, Benioff’s prose doesn’t draw that kind of attention to itself.

More here. (Note: Old review but, thanks to Abbas and Margit, I just read the book now and recommend it strongly).