Brian Leiter on the Analytic/Continental Distinction

Over at Philosophy Bites:

Many philosophers self-identify as 'analytic' or 'continental' philosophers. But does this sort of label make sense? Brian Leiter, who, amongst other things, is an expert on Nietzsche, is sceptical of the value of the terms as typically used. In this episode of the Philosophy Bites podcast he explains why.

Listen to Brian Leiter on the Analytic/Continental Distinction

Rejoice for Utopia is Nigh!

Prospero in The Economist:

One hundred years ago an American immigrant invented science fiction.

Okay, that’s not true. Not even close. People have been building fantastic narratives out of scientific gobbledygook since the days of the Greeks. Lucian of Samosata imagined a trip to the moon over 17 centuries before Jules Verne took a whack at it. And decades before 1911 Verne and H.G. Wells wrote the stories that established the contours of the genre: fantastic voyages in space and time, alien encounters, technology run amok, and so forth. The term “science fiction” wouldn’t even be invented until 1929.

But the genre as a coherent field of literary endeavour—as the thing that takes up a whole wall at your local Barnes & Noble or Waterstone’s—might not have come to be if it weren’t for a failed inventor-turned-publisher with aesthetic ambitions. Naive, utopian and romantic, a man named Hugo Gernsback ended up establishing a new strand of science fiction, one that helped shape (and was shaped by) the American century.

Gernsback had come to America in 1904 with the common immigrant dream of striking it rich. He planned to revolutionise battery technology, but when that didn’t pan out he turned to scientific-magazine publishing. He started out with mail-order catalogues for his imported radio-equipment business, but, as the years went on, his efforts took a more explicitly literary turn. Amazing Stories, which he founded in 1926, has a fair claim to being the first magazine dedicated solely to what he called “scientifiction”. It would go on to help define the genre, publishing the debuts of some of its greatest authors.

Imran Khan: My time has come

From The Telegraph:

Imran Khan has been written off before. As a cricketer, he was initially dismissed as having average ability before captaining his team to World Cup glory. For the past 15 years his political party has stumbled from one election humiliation to the next. Now though, he is convinced his time has come. Riding a tsunami of popular support ahead of elections widely expected next year, he is bracing himself for a campaign of dirty tricks. “During a match there comes a time when you know you have the opposition on the mat. It is exactly the feeling now, that I have all the opposition by their balls,” he said, in an interview last month with The Daily Telegraph as he travelled to the north-western city of Peshawar for yet another rally on his 59th birthday. “Whatever they do now will backfire.”

Further evidence of Mr Khan’s steepling ascent was on display in Karachi, Pakistan’s biggest city, today on Christmas Day, when at least 100,000 people turned out to hear his message that change was sweeping the country. The figure is all the more remarkable as the city is far from Mr Khan’s stronghold of Lahore. “I’ve never seen a gathering like this in Karachi in two decades,” said a local journalist covering the rally. Everything changed in October, when he attracted more than 100,000 supporters to a parade ground in Lahore. The world took notice of a new star in Pakistan’s political firmament, dominated for decades by a handful of the richest families. A YouGov-Cambridge poll released on Friday, December 23, found that Khan is the most popular political figure in Pakistan by far, with some 81 per cent of respondents choosing him as the person they think best suited to lead the country. Two thirds, meanwhile, said they would vote for his PTI party.

More here.

The Hormone Surge of Middle Childhood

Natalie Angier in The New York Times:

Viewed superficially, the part of youth that the psychologist Jean Piaget called middle childhood looks tame and uneventful, a quiet patch of road on the otherwise hairpin highway to adulthood. Said to begin around 5 or 6, when toddlerhood has ended and even the most protractedly breast-fed children have been weaned, and to end when the teen years commence, middle childhood certainly lacks the physical flamboyance of the epochs fore and aft: no gotcha cuteness of babydom, no secondary sexual billboards of pubescence. Yet as new findings from neuroscience, evolutionary biology, paleontology and anthropology make clear, middle childhood is anything but a bland placeholder. To the contrary, it is a time of great cognitive creativity and ambition, when the brain has pretty much reached its adult size and can focus on threading together its private intranet service — on forging, organizing, amplifying and annotating the tens of billions of synaptic connections that allow brain cells and brain domains to communicate.

Subsidizing the deft frenzy of brain maturation is a distinctive endocrinological event called adrenarche (a-DREN-ar-kee), when the adrenal glands that sit like tricornered hats atop the kidneys begin pumping out powerful hormones known to affect the brain, most notably the androgen dehydroepiandrosterone, or DHEA. Researchers have only begun to understand adrenarche in any detail, but they see it as a signature feature of middle childhood every bit as important as the more familiar gonadal reveille that follows a few years later. Middle childhood is when the parts of the brain most closely associated with being human finally come online: our ability to control our impulses, to reason, to focus, to plan for the future.

More here.

Tuesday Poem

An Ox Looks at Man

They are more delicate even than shrubs and they run
and run from one side to the other, always forgetting
something. Surely they lack I don't know what
basic ingredient, though they present themselves
as noble or serious, at times. Oh, terribly serious,
even tragic. Poor things, one would say that they hear
neither the song of the air nor the secrets of hay;
likewise they seem not to see what is visible
and common to each of us, in space. And they are sad,
and in the wake of sadness they come to cruelty.
All their expression lives in their eyes–and loses itself
to a simple lowering of lids, to a shadow.
And since there is little of the mountain about them —
nothing in the hair or in the terribly fragile limbs
but coldness and secrecy — it is impossible for them
to settle themselves into forms that are calm, lasting
and necessary. They have, perhaps, a kind
of melancholy grace (one minute) and with this they allow
themselves to forget the problems and translucent
inner emptiness that make them so poor and so lacking
when it comes to uttering silly and painful sounds:
desire, love, jealousy
(what do we know?) — sounds that scatter and fall in the field
like troubled stones and burn the herbs and the water,
and after this it is hard to keep chewing away at our truth.

by Carlos Drummond de Andrade
from In Praise of Fertile Land
translated by Mark Strand


Václav Havel: not quite who you thought….


Havel once said that the true dissident is not interested in power, has no desire for office and does not gather votes. It’s an ironic statement coming from a future president. The absurdity of his own rise to power has been pointed out numerous times, first and foremost by Havel himself. And yet, whether or not he lived up to his own values as a politician, Havel always felt that the role of a leader should be no different than the role of a dissident — a leader should simply be a voice for the people. The only kind of politics that makes sense, said Havel, is one that is guided by conscience. Political institutions should be open, dynamic and small, rather than closed, inviolable and huge. “It is better to have organizations springing up ad hoc,” he wrote in The Power of the Powerless, “infused with enthusiasm for a particular purpose and disappearing when that purpose has been achieved.” In other words, institutions are best when they serve a specific purpose, and are not a replacement for community. And they are best when they place moral concerns before political ones. Without addressing the spiritual needs of people, without focusing on real human relationships and personal trust, democracy was likely to be just as absurd as communism. In this, Havel was much more radical than most of his post-democracy peers, at least intellectually. He was a politician who saw good politics as a result rather than a solution.

more from Stefany Anne Golberg at The Smart Set here.

speaking N’Ko


When Ibrahima Traore takes his sons to a park in Montclair, N.J., he often sits on a bench and reads. He reads English, French and Arabic, but most of the time he reads N’Ko, a language few speakers of those languages would recognize. N’Ko is the standardized writing system for Mande languages, a family of closely related tongues — among them Traore’s language of Mandinka, but also Jula, Bamana, Koyaga, Marka — spoken, for the most part, in eight West African countries, by some 35 million people. N’Ko looks like a cross between Arabic and ancient Norse runes, written from right to left in a blocky script with the letters connected underneath.

Traore types e-mail to his family on his laptop in N’Ko, works on his Web site in N’Ko, tweets in N’Ko on his iPhone and iPad and reads books and newspapers written in N’Ko to prepare for the N’Ko classes he teaches in the Bronx and for his appearances on an Internet radio program to discuss cultural issues around the use of N’Ko.

For years, the Web’s lingua franca was English. Speakers of French, Hindi and Urdu, Arabic, Chinese and Russian chafed at the advantage the Internet gave not only American pop culture but also its language. For those who lived at the intersection of modern technology and traditional cultures, the problem was even worse. “For a long time, technology was the enemy,” says Inée Slaughter, executive director of the New Mexico-based Indigenous Language Institute, which teaches Native Americans and other indigenous peoples how to use digital technologies to keep their languages vital. Heritage languages were being killed off by increasing urbanization, the spread of formal education and the shift to cash crops, which ended the isolation of indigenous communities. Advances in technology seemed to intensify the decline. “Even in 1999 or 2000, people were saying technology killed their language,” Slaughter says. “Community elders worried about it. As television came into homes, English became pervasive 24/7. Mainstream culture infiltrated, and young kids want to be like that. It was a huge, huge problem, and it’s still there. But now we know ways technology can be helpful.”

more from Tina Rosenberg at the NYT Magazine here.

somewhat toilsome and perhaps fruitless adventure?


To put it another way, how do we make the ancient world make sense to us? How do we translate it? Young Taplow doesn’t actually rate Browning’s translation very highly, and indeed—to our tastes—it is written in awful nineteenth-century poetry-speak (“Who conquers mildly, God, from afar, benignantly regardeth,” as Browning puts the key line, is hardly going to send most of us rushing to the rest of the play). But when, in his lessons, Taplow himself gets excited by Aeschylus’ Greek and comes out with a wonderfully spirited but slightly inaccurate version of one of the murderous bits, the Crock reprimands him—“you are supposed to be construing Greek”—that is, translating the language literally, word for word—“not collaborating with Aeschylus.” Most of us now, I suspect, are on the side of the collaborators, with their conviction that the classical tradition is something to be engaged with, and sparred against, not merely replicated and mouthed. In this context, I can’t resist reminding you of the flagrantly modern versions of Homer’s Iliad by the English poet Christopher Logue, who died on December 2—Kings, War Music, and others—“the best translation of Homer since [Alexander] Pope’s,” as Garry Wills called them in these pages.* This was, I think, both a heartfelt and a slightly ironic comment. For the joke is that Logue, our leading collaborator with Homer, knew not a word of Greek.

more from Mary Beard at the NYRB here.

Monday, December 26, 2011

Sunday, December 25, 2011

What is the Greatest Invention?

Different writers at More Intelligent Life offer their own answers. Samantha Weinberg argues it is the Web. Edward Carr makes the case for the blade. Roger Highfield's candidate, the modern scientific method, is probably the answer I agree with most:

All great inventions rest on understanding how things work. And the greatest of all is the über-invention that has provided the insights on which other inventions depend: the modern scientific method, the realisation that we cannot grasp the way the world works by rational thought alone.

To gain meaningful insights into the scheme of things, logic has to be accompanied by asking probing questions of nature. To advance understanding, we need to devise rational conjectures and probe them to destruction through controlled tests, precise observations and clever analysis. The upshot is an unending dialogue between theory and experiment.

Unlike a traditional invention, the scientific method did not come into being at a particular time: its history is complex and stretches back long before 1833, when the term “scientist” was coined by the English polymath William Whewell. The method is not a concrete gadget like Gutenberg’s press, the computer or the Pill. Nor is it a brainwave like the non-geocentric universe, the Indo-Arab counting system or the theory of evolution. It is a fecund way of thinking on which the modern world rests. In relatively few generations, the rigorous application of the method has bootstrapped modern society through a non-linear accumulation of both knowledge and technology. Its impact on everyday life is ubiquitous and indisputable, even though a surprising number of people, including some senior politicians, have only a feeble grasp of its significance.

Top 12 Longreads of 2011

Ed Yong over at Not Exactly Rocket Science:

The wonderful site Longreads is collating people’s picks of the best long features of the year. Some say that the internet is triggering a renaissance for long-form writing and I very much agree. Over the past 12 days, I’ve been tweeting my picks and the full list should be up soon. Here it is:

The world of science offers great opportunities for journalists to flex their writing muscles by fusing rich storytelling and reporting with deft explanatory skill. After all, what could make for better stories than intelligent people trying to understand how the world works?

Here are my top dozen stories from the year, originally tweeted as daily treats in the run-up to Christmas. Yes, I know everyone else has picked five, but we bloggers hate word restrictions – I’ll pick my Top 67 of 2011 and you’ll like it. Each of these features left a firm impression so, taking my lead from Jodi Ettenberg, each choice comes with a note about where I was when I read it.

Here they are, in no particular order:

The Mystery of the Canadian Whiskey Fungus by Adam Rogers (Wired; read at my desk during an uneventful work day)

This is a superb whodunit featuring James Scott, the Sherlock Holmes of fungus – an old-school scientist in the modern world, trying to solve the mystery of the “angel’s share”. It’s packaged with confident wit and vivid, sensory prose (check out that lede), and Rogers finds space to take in a brief history of distillation and a look at the dying art of mycology. The best piece about fungus you’ll likely ever read.

Bad Christmas Gifts – A Neuroscientific Gifting Guide

Jordan Gaines in Brain Blogger:

Gift-giving isn’t easy — particularly during the holidays, when there are so many different people for whom to buy. It’s overwhelming and stressful, and people cope with the burden in different ways. Some, like myself, begin lists in September, all the while picking up hints from others and taking note, then making my purchases before Thanksgiving. Others rush to the mall the weekend before — or of — Christmas, hoping something will catch their eye or they’ll snag a great deal.

At one point or another, we’ve all been on the receiving end of a poor or ill-fitting gift. How did you react to it? Or, more importantly, what did it mean to you in terms of your relationship with the giver? A recent study has explored exactly how men and women react upon receiving good and bad gifts.

A paper published in Social Cognition by Elizabeth Dunn and colleagues at the University of British Columbia explored the theory that while “good” gifts would reaffirm similarity between couples, poor gift-giving may cause partners to question their compatibility.

What Love Looks Like

From Orion Magazine:

From the moment I heard about Bidder #70 raising his paddle inside a BLM auction to outbid oil and gas companies in the leasing of Utah’s public lands, I recognized Tim DeChristopher as a brave, creative citizen-activist. That was on December 19, 2008, in Salt Lake City. Since that moment, Tim has become a thoughtful, dynamic leader of his generation in the climate change movement. While many of us talk about the importance of democracy, Tim has put his body on the line and is now paying the consequences.

On March 2, 2011, Tim DeChristopher was found guilty on two felony charges for violation of the Federal Onshore Oil and Gas Leasing Reform Act and for making false statements. He refused to entertain any type of plea bargain. On July 26, 2011, he was sentenced to two years in a federal prison with a $10,000 fine, followed by three years of supervised probation. Minutes before receiving his sentence, Tim DeChristopher delivered an impassioned speech from the courtroom floor. At the end of the speech, he turned toward Judge Dee Benson, who presided over his trial, looked him in the eye, and said, “This is what love looks like.” Minutes later, he was placed in handcuffs and briskly taken away.

More here.

The accidental universe: Science’s crisis of faith

Alan P. Lightman in Harper's:

The history of science can be viewed as the recasting of phenomena that were once thought to be accidents as phenomena that can be understood in terms of fundamental causes and principles. One can add to the list of the fully explained: the hue of the sky, the orbits of planets, the angle of the wake of a boat moving through a lake, the six-sided patterns of snowflakes, the weight of a flying bustard, the temperature of boiling water, the size of raindrops, the circular shape of the sun. All these phenomena and many more, once thought to have been fixed at the beginning of time or to be the result of random events thereafter, have been explained as necessary consequences of the fundamental laws of nature—laws discovered by human beings. This long and appealing trend may be coming to an end. Dramatic developments in cosmological findings and thought have led some of the world’s premier physicists to propose that our universe is only one of an enormous number of universes with wildly varying properties, and that some of the most basic features of our particular universe are indeed mere accidents—a random throw of the cosmic dice. In which case, there is no hope of ever explaining our universe’s features in terms of fundamental causes and principles.

It is perhaps impossible to say how far apart the different universes may be, or whether they exist simultaneously in time. Some may have stars and galaxies like ours. Some may not. Some may be finite in size. Some may be infinite. Physicists call the totality of universes the “multiverse.” Alan Guth, a pioneer in cosmological thought, says that “the multiple-universe idea severely limits our hopes to understand the world from fundamental principles.” And the philosophical ethos of science is torn from its roots. As put to me recently by Nobel Prize–winning physicist Steven Weinberg, a man as careful in his words as in his mathematical calculations, “We now find ourselves at a historic fork in the road we travel to understand the laws of nature. If the multiverse idea is correct, the style of fundamental physics will be radically changed.”

More here.

Saturday, December 24, 2011

Forced Merriment: The True Spirit of Christmas

A previously unpublished, oddly timely, contrarian piece by the late Christopher Hitchens in the WSJ Online (I'm still celebrating by the way):

[T]he thing about the annual culture war that would probably most surprise those who want to “keep the Christ in Christmas” is this: The original Puritan Protestants regarded the whole enterprise as blasphemous. Under the rule of Oliver Cromwell in England, Christmas festivities were banned outright. The same was true in some of the early Pilgrim settlements in North America.

Last year I read a recent interview with the priest of one of the oldest Roman Catholic churches in New York, located downtown and near Wall Street. Taking a stand in favor of Imam Rauf's “Ground Zero” project, he pointed to some parish records showing hostile picketing of his church in the 18th century. The pious protestors had been voicing their suspicion that a profane and Popish ceremonial of “Christ Mass” was being conducted within.

Now, that was a time when Americans took their religion seriously. But we know enough about Puritans to suspect that what they really disliked was the idea of a holiday where people would imbibe strong drink and generally make merry. (Scottish Presbyterians did not relax their hostility to Yuletide celebrations until well into the 20th century.) And the word “Yule” must be significant here as well, since pagans of all sorts have been roistering at the winter solstice ever since records were kept, and Christians have been faced with the choice of either trying to beat them or join them.

Kenneth Arrow on Economics and Inequality

Kenneth Arrow in the Boston Review:

The specific problems of the current U.S. economy—the drastic increase in unemployment and sluggish increase in output—overlay a tendency of much longer duration, a drastic and rapid increase in the inequality of income. Every economy of complexity produces an unequal distribution of the good things in life. But the period immediately following World War II showed a considerably increased equality of income compared with either the Great Depression or the previous period of relative prosperity.

Since the middle 1980s, this tendency has been reversed. In the United States, median family income (adjusted for size) has remained virtually constant since 1995, while per capita income has risen at about 2 percent per annum. The difference in income between college graduates and those with only high school degrees increased at a rapid rate, even during the period before 1990 when per capita income grew very slowly. Further, the proportion of the college-age population enrolled in college, which had been rising rapidly, stopped increasing and has remained the same for thirty years.

Clearly, the bulk of the gains from increased productivity went to a small group of upper-income recipients. Indeed, closer study has shown that the bulk of the increase went to the top 1 percent of income recipients and much of that to those in the top .1 percent.

More here.