“The Most Human Human”: Can computers truly think?

From Salon:

This week's recommendation must be delivered with a caveat: Brian Christian's “The Most Human Human: What Talking With Computers Teaches Us About What It Means to Be Alive” is billed as an account of the author's participation in a Turing Test, but it's best enjoyed if you don't expect to read much about the test itself. A Turing Test — named for Alan Turing, the 20th-century mathematician who proposed it — asks a judge to converse with two unseen entities, a computer and a human being, then attempt to determine which is which. Turing estimated that by 2000 there would exist a computer sophisticated enough to pass itself off as a person in the course of a five-minute conversation. At that point, Turing contended, “one will be able to speak of machines thinking without expecting to be contradicted.”

Christian played the piquant role of “confederate” in the 2009 Loebner Prize competition, an annual Turing Test sponsored by an eccentric entrepreneur. His job was to seem more human than the computer software entered by the primary contestants, who are programmers. The prize money goes to the author of the most human-seeming chatbot, but there's also a citation for the confederate rated “most human” by the judges. “Most Human Human” was the title that Christian, a science writer and poet, coveted for himself.

More here.

Age affects us all

From PhysOrg:

You don't need to read obituaries or sell life insurance to know that death and disease become more common as we transition from middle to old age. But scientists studying creatures from mice to fruit flies long assumed the aging clock ticked more slowly for humans. Early comparisons with rats, mice, and other short-lived creatures confirmed the hunch. But now, the first-ever multi-species comparison of human aging patterns with those in chimps, gorillas, and other primates suggests the pace of human aging may not be so unique after all. The findings appear in the March 11 issue of Science.

“We had good reason to think human aging was unique,” said co-author Anne Bronikowski of Iowa State University. For one, humans live longer than many animals. There are some exceptions – parrots, seabirds, clams and tortoises can all outlive us – but humans stand out as the longest-lived primates. “Humans live for many more years past our reproductive prime,” Bronikowski said. “If we were like other mammals, we would start dying fairly rapidly after we reach mid-life. But we don't,” she explained.

More here.

Thursday, March 10, 2011

The Scientific Method is Alive and Well

Daniel Holz over at Cosmic Variance:

I’ve been on somewhat of an unintended hiatus for the past few months, as I try to wrap up some projects, and deal with a few other things in my life. However, I just read something that has given me a kick in the pants. And I don’t mean that in a good way. In late December there was an article by Jonah Lehrer in the New Yorker titled “The truth wears off”. Much more suggestive was the subtitle, “Is there something wrong with the scientific method?”. The story discusses the “decline effect”: an article is published with startling results, and then subsequent work finds increasingly diminished evidence for the initial unexpected result. It’s as if there’s “cosmic habituation”, with the Universe conspiring to make a surprising result go away with time. The last paragraph sums things up:

The decline effect is troubling because it reminds us how difficult it is to prove anything. We like to pretend that our experiments define the truth for us. But that’s often not the case. Just because an idea is true doesn’t mean it can be proved. And just because an idea can be proved doesn’t mean it’s true. When the experiments are done, we still have to choose what to believe.

I don’t particularly disagree with any of this. But it’s completely beside the point, and to untutored ears can be immensely misleading. The article is a perfect example of precisely the effect it seeks to describe (there must be a catchy word for this? Intellectual onomatopoeia?). The article gives a few examples of people finding interesting results, only to have them disappear on sustained scrutiny.

Rwanda in Six Scenes

Stephen W. Smith in the LRB:

A number of memories connected with Rwanda play in my mind like scenes from a movie, although I don’t pretend they add up to a film. In 1994 a genocide was committed against the Tutsi minority in Rwanda. All else about this small East African country, ‘the land of a thousand hills’, is open to question and, indeed, bears re-examination. ‘Freedom of opinion is a farce,’ Hannah Arendt wrote in 1966 in ‘Truth and Politics’, ‘unless factual information is guaranteed and the facts themselves are not in dispute.’ The problem with Rwanda is not only that opinions and facts have parted company but that opinion takes precedence.

The first scene: I’m walking beside Paul Kagame, the current president of Rwanda and then a rebel leader, past low picket fences and small prefabricated houses in a residential suburb of Brussels. It’s cold and our breath mingles in the air as we speak. Kagame is swaddled in a thick coat. Even so, he remains a spindly figure with a birdlike face. I can’t warm to him, but I know him well enough by now to hazard the question that has been preying on my mind for a while: ‘Why is it always you, the vice-president, whom I meet when I have dealings with the Rwandan Patriotic Front, and not Alexis Kanyarengwe?’ Kanyarengwe was the movement’s president. ‘Don’t worry,’ he chuckles. ‘You’re seeing the boss. Kanyarengwe is only our front man. You’d be wasting your time.’

This was in 1992. The RPF had been set up in 1987 in Uganda by Tutsi exiles. Kagame’s parents had fled with him to Uganda when he was four. At the time of our meeting in Brussels, Kagame was avoiding the French. A few months earlier, in 1991, he’d just returned to his hotel near the Eiffel Tower from a meeting with officials at the Elysée when the French police called him in for interrogation. They were inquiring into a murky incident that was never entirely elucidated. Police sources claimed that members of Kagame’s delegation were ‘roaming around town with bags full of cash to buy weapons’; Kagame claimed the police were trying to discredit him. Tensions were running high between the rebel movement and France.

Understanding Libya’s Michael Corleone

Benjamin Pauker interviews Benjamin Barber about Saif al-Qaddafi, Libya and the Monitor Group, in Foreign Policy:

Foreign Policy: How is it that so many people got Saif al-Qaddafi so wrong?

Benjamin Barber: Who got it wrong? I don't think anyone got him wrong. Is that the idea: to go back and say in 2006, 2007, 2008, when the U.S. recognized the government of Muammar al-Qaddafi, when the sovereign oil fund that Libya set up and that people like Prince Andrew and Peter Mandelson, or organizations like the Carlyle Group and Blackstone, were doing business with, and the heavy investments oil companies were making while others were running around and making all sorts of money — that those of us who went in trying to do some work for democratic reform, that we somehow got Saif wrong?

Until Sunday night a week ago [Feb. 27], Saif was a credible, risk-taking reformer. He several times had to leave Libya because he was at odds with his father. The [Gaddafi] Foundation's last meeting in December wasn't held in Tripoli because he was nervous about being there; it was held in London. And the people who worked for it and the foundation's work itself have been recognized by Amnesty International and Human Rights Watch as genuine, authentic, and having made real accomplishments in terms of releasing people from prison, saving lives. The Carnegie Endowment for International Peace wrote in a report in January that: “For much of the last decade, Qadhafi's son Saif was the public face of human rights reform in Libya and the Qadhafi Foundation was the country's only address for complaints about torture, arbitrary detention, and disappearances. The Foundation issued its first human rights report in 2009, cataloging abuses and calling for reforms, and a second report released in December 2010 regretted 'a dangerous regression' in civil society and called for the authorities to lift their 'stranglehold' on the media. In the interim, Saif assisted Human Rights Watch in conducting a groundbreaking press conference which launched a report in Tripoli in December 2009.”

Aside from the foundation, one of the things that I was involved with in my interaction with Muammar as well as Saif Qaddafi was the release of the hostages: the four Bulgarian nurses and the Palestinian doctor. I had said to the colonel in our first meeting that the release of the hostages was a condition for any more such interactions and, indeed, for the continuation of the rapprochement with the West, and he had said he understood. That modest pressure added one more incentive to the decision to release the hostages. I was called the day before the public announcement of the release by Qaddafi's secretary and told: “You see; the leader has acted on his word.”

Well today of course, it's all radically changed. But second-guessing the past, I mean, it's just 20/20 hindsight.

amongst the chimps

Chance

During my stay at Fauna, some of my relationships with the chimps were much more demanding than others. Take my friendship with Chance, who along with Rachel is one of the most disturbed chimpanzees at Fauna. At first, I thought Chance hated my guts. This is because she was always trying to scare me. Whenever I approached her room, or simply walked past it, Chance made this loud puckering sound with her lips and juked her shoulder at me as if she was about to charge. Even though there was thick caging between us, her aggressive feints and angry noises always made me jump. But as I learned more about her personality and her awful childhood, I came to understand that when Chance tried to frighten me, she was actually doing all that she could to reach out to me. Chance was, in fact, desperate for her existence to be acknowledged, and she had been this way since the day she was born.

more from Andrew Westoll at The Walrus here.

Industrious Amazement: A Notebook


Poetry as a cemetery. A cemetery of faces, hands, gestures. A cemetery of clouds, colors of the sky, a graveyard of winds, branches, jasmine (the jasmine from Swidnik), the statue of a saint from Marseilles, a single poplar over the Black Sea, a graveyard of moments and hours, burnt offerings of words. Eternal rest be yours in words, eternal rest, eternal light of recollection. Cemeteries of sunsets, running with arms spread, a child’s short dress, winter, snowstorms, footsteps on the stairs, tears, a letter with a serious confession, silver faces, the shoemaker’s stall, parting, pain, sorrow. Everything preserved, buried in amber tombs of words. The sea, grief trickling from someone’s eyes, parting; faith in God, arrivals and departures, loneliness heavier than death, sweet as death. Anxiety and peace. The streets of cities. A monk’s belly bumps up against a tourist in catacombs. First communion. First love. First storm at sea. First night. A dog’s eyes, eyes of the beloved, unclosed eyes of a dead man, glazed with one tear. Barrows of memory. Mummies, the amputated hands and feet of statues. A deer emerges from the grove, stops and stares. A footbridge across the river in the flutter of geese and bare feet, flowering fields. Grandfather’s death, his moustache in the coffin. A dog’s howl.

more from Anna Kamienska at Poetry here.

the last lingua franca


The dominance of English today as the language of business, science and popular entertainment appears unassailed and perhaps unassailable. For the linguist David Crystal, it is entirely plausible that “English, in some shape or form, will find itself in the service of the world community for ever”. In his new, engaging and learned book, The Last Lingua Franca, Nicholas Ostler challenges this widespread confidence in the continued future of English as the dominant global language, and, more radically, questions whether there will be any need at all for a single language of international communication “in a world where digital technology is cheap and ubiquitous”. Acknowledging the unparalleled geographical expanse of English, Ostler draws a firm distinction between English as a mother tongue and English as a lingua franca or communicative tool for non-native speakers. This distinction motivates two separate questions concerning the future of English. First, will English, given its spread as a native language, split into a range of separate languages akin to the development of Latin into the Romance languages? Second, will it continue to be a widely used lingua franca, possibly even increase its influence? Ostler’s answer to both of these questions is a resolute no. In a world linked by digital communication networks, native speaker communities of English, no matter their location, are able to remain in constant contact and familiar with each other’s varieties. Regular communicative interaction promotes parallel linguistic development; only when a group of speakers becomes communicatively isolated (or isolates itself) can its dialect diverge from related dialects to the point of becoming a distinct language.

more from Kerstin Hoge at the TLS here.

Thursday Poem

Bullet

I have a bullet made of icy silver to give you.

I prepared it last night with dirty, sweet, infallible blood. I prayed with it for hours. I attended it with candles and the most secret invocations.

First off, I blinded it, because a bullet must never see the ominous air or the body it will encounter. After, I deafened it, so that it wouldn’t hear the cries or threats or music of the flesh and bones while shattering.

I only left it lips so it could whistle.

Understand what I say:

whistles are bullets’ words: they are their ruthless final kisses piercing the smoothness of the night; their wonder and their plea, their breath.

by Carlos López Degregori
© 2010
translation: Robin Myers
publisher: PIW, 2010

Read more »

Quotable Harvard

From Harvard Magazine:

“Some may find the list below revealing of my biases,” he writes. “In particular, there is a tendency toward liberalism. In my own defense I note only that it is not easy to find quotations of a conservative nature by Harvard people. There are some—such as ‘Any one may so arrange his affairs that his taxes shall be as low as possible…there is not even a patriotic duty to increase one’s taxes,’ from Learned Hand, LL.B. 1896, or ‘States like those [Iraq, Iran, and North Korea] and their terrorist allies, constitute an axis of evil, aiming to threaten the peace of the world,’ by David Frum, J.D. ’87, in a speech written for George W. Bush, M.B.A. ’75—but they are few and far between. I leave it to others to explain what historical and sociological factors may underlie a Crimson slant to the left, or whether there is some inherent correlation between political and quotational innovation in general.” ~The Editors

We may have democracy, or we may have wealth concentrated in the hands of a few, but we cannot have both.—Louis D. Brandeis, LL.B. 1877,
quoted in Labor, October 14, 1941

There may be said to be two classes of people in the world: those who constantly divide the people of the world into two classes, and those who do not.—Robert Benchley, A.B. 1912,
Of All Things (1921)

Nobody dies from lack of sex. It’s lack of love we die from.—Margaret Atwood, A.M. ’62, Litt.D. ’04,
The Handmaid’s Tale (1986)

I must study Politicks and War that my sons may have liberty to study Mathematicks and Philosophy. My sons ought to study Mathematicks and Philosophy, Geography, natural History, Naval Architecture, navigation, Commerce, and Agriculture, in order to give their Children a right to study Painting, Poetry, Musick, Architecture, Statuary, Tapestry, and Porcelaine.—John Adams, A.B. 1755, LL.D. 1781,
Letter to Abigail Adams, May 12, 1780

More here.

From Middlemarch to Money, the best stories of societies under stress

From The Guardian:

Justin Cartwright was born in South Africa and educated in America and England. His novels have won numerous awards. In Every Face I Meet was shortlisted for the Booker Prize, Leading the Cheers won the Whitbread novel award and The Promise of Happiness won the Hawthornden Prize for Literature in 2005. He has won other awards including a Commonwealth Writers' prize and the South African Sunday Times award. He lives in north London with his wife and, occasionally, with his two sons. From Middlemarch to Money, the novelist picks the best stories of societies under stress

1. Rabbit at Rest by John Updike

The Rabbit series is stunningly observant of changing America over five novels and four decades. Rabbit at Rest stands out. It is wonderfully assured, as though after three decades Updike had come to know Rabbit Angstrom to the depths of his being.

2. Midnight's Children by Salman Rushdie

India and its bewildering diversity, deployed in extravagant and beautiful prose.

3. Little Dorrit by Charles Dickens

Dickens lived with the dark personal knowledge that you could go up or down in society, and his novels often have the dark shadow of the workhouse hanging over them. I could add at least three others, but Mr Merdle in Little Dorrit seems to come straight from one of Dickens's own nightmares.

4. Disgrace by JM Coetzee

Devastating and prescient on the state of South Africa, post-apartheid. Although his take on the new South Africa was dark, his intimations both about the tolerance of violence and the disregard for high culture have proved horribly prophetic.

5. Middlemarch by George Eliot

The father of all English state-of-the-nation novels and strangely contemporary in its multiple layers and themes, which include marriage, hypocrisy, politics and the status of women.

More here.

Zakaria’s World: Are America’s best days really behind us?

Joseph S. Nye Jr. in Foreign Policy:

Fareed Zakaria is one of our most perceptive analysts of America's role in the world, and I generally agree with him. But in the case of his new special essay for Time, “Are America's Best Days Behind Us?,” I think he paints too gloomy a picture of American decline.

Americans are prone to cycles of belief in decline, and the term itself confuses various dimensions of changing power relations. Some see the American problem as imperial overstretch (though as a percentage of GDP, the United States spends half as much on defense as it did during the Cold War); some see the problem as relative decline caused by the rise of others (though that process could still leave the United States more powerful than any other country); and still others see it as a process of absolute decline or decay such as occurred in the fall of ancient Rome (though Rome was an agrarian society with stagnant economic growth and internecine strife).

Such projections are not new. As Zakaria notes, America's Founding Fathers worried about comparisons to the decline of the Roman Republic. A strand of cultural pessimism is simply very American, extending back to the country's Puritan roots. English novelist Charles Dickens observed a century and a half ago: “[I]f its individual citizens, to a man, are to be believed, [America] always is depressed, and always is stagnated, and always is at an alarming crisis, and never was otherwise.”

More here.

People don’t know when they’re lying to themselves

Ed Yong in Not Exactly Rocket Science:

Whether it’s a drug-addled actor or an almost-toppled dictator, some people seem to have an endless capacity for rationalising what they did, no matter how questionable. We might imagine that these people really know that they’re deceiving themselves, and that their words are mere bravado. But Zoe Chance from Harvard Business School thinks otherwise.

Using experiments where people could cheat on a test, Chance has found that cheaters not only deceive themselves, but are largely oblivious to their own lies. Their ruse is so potent that they’ll continue to overestimate their abilities in the future, even if they suffer for it. Cheaters continue to prosper in their own heads, even if they fail in reality.

Chance asked 76 students to take a maths test, half of whom could see an answer key at the bottom of their sheets. Afterwards, they had to predict their scores on a second longer test. Even though they knew that they wouldn’t be able to see the answers this time round, they imagined higher scores for themselves (81%) if they had the answers on the first test than if they hadn’t (72%). They might have deliberately cheated, or they might have told themselves that they were only looking to “check” the answers they knew all along. Either way, they had fooled themselves into thinking that their strong performance reflected their own intellect, rather than the presence of the answers.

And they were wrong – when Chance asked her recruits to actually take the hypothetical second test, neither group outperformed the other. Those who had used the answers the first time round were labouring under an inflated view of their abilities.

More here.

How the penis lost its spikes

Zoë Corbyn in Nature:

Sex would be a very different proposition for humans if — like some animals including chimpanzees, macaques and mice — men had penises studded with small, hard spines.

Now researchers at Stanford University in California have found a molecular mechanism for how the human penis could have evolved to be so distinctly spine-free. They have pinpointed it as the loss of a particular chunk of non-coding DNA that influences the expression of the androgen receptor gene involved in hormone signalling.

“It is a small but fascinating part of a bigger picture about the evolution of human-specific traits,” said Gill Bejerano, a developmental biologist at Stanford who led the work along with colleague David Kingsley. “We add a molecular perspective to a discussion that has been going on for several decades at least.”

Published in Nature today, the research also suggests a molecular mechanism for how we evolved bigger brains than chimpanzees and lost the small sensory whiskers that the apes — who are amongst our closest relatives and with whom it has been estimated we share 96% of our DNA — have on their faces.

More here.

Wednesday, March 9, 2011

Why do conservatives hate trains so much?

David Weigel in Slate:

In the movie version of Atlas Shrugged, there is a scene in which Ayn Rand's libertarian heroes defy all odds, deploy some untold amount of private funding, and launch the fastest high-speed train in history over rails of experimental metal. “The run of the John Galt Line is thrilling,” wrote the libertarian federal judge Alex Kozinski. “When it crossed the bridge made of Rearden Metal, I wanted to stand up and cheer.”

That's in the fantasy world. In the real world, libertarians aren't cheering for high-speed rail but rather trying to stop it from being built. They are succeeding. In Ohio, Gov. John Kasich campaigned against a high-speed rail line funded by the stimulus, got elected, and turned down the funding. In Wisconsin, Gov. Scott Walker did the same thing, only more so—his anti-train campaign even had its own Web site. In Florida, the state Supreme Court has just approved Gov. Rick Scott's decision to reject $2.4 billion of federal funds to build a Tampa-Orlando rail line; the state was being asked to contribute only $280 million to finish it off. The funding was originally agreed to by Charlie Crist, one of the Tea Party's archenemies, so Scott's victory could hardly be any sweeter.

But it could hardly make less sense to liberals. What, exactly, do Republicans, conservatives, and libertarians have against trains? Seriously, what?

More here.

An Unsung Hero of the Nuclear Age

Ron Rosenbaum in Slate:

It was a risk. Dedicating a book to someone I'd had a five-minute phone conversation with three decades ago. Someone who, last I'd heard, had become a long-haul trucker and whom I'd given up trying to track down.

But I went ahead and dedicated my new book, How the End Begins: The Road to a Nuclear World War III, to Maj. Harold Hering because Maj. Hering sacrificed his military career to ask a Forbidden Question about launching nuclear missiles. A question that exposed the comforting illusions of the so-called fail-safe system designed to prevent “unauthorized” nuclear missile launches.

It was a question that changed his life, and changed mine, and may have changed—even saved—all of ours by calling attention to flaws in our nuclear command and control system at the height of the Cold War. It was a question that makes Maj. Hering an unsung hero of the nuclear age. A question that came from inside the system, a question that has no good answer: How can any missile crewman know that an order to twist his launch key in its slot and send a thermonuclear missile rocketing out of its silo—a nuke capable of killing millions of civilians—is lawful, legitimate, and comes from a sane president?

More here. [Thanks to Moin Rahman.]

What Scientists Believe


The notion that science and religion are at war is one of the great dogmas of the present age. For journalists, it is a prism through which to understand everything from the perennial kerfuffles over teaching evolution to the ethics of destroying human embryos for research. To many scientists, religious belief seems little more than a congeries of long-discredited pre-modern superstitions. For many religious believers, modern science threatens a deeply held faith that man is more than a mere organism and that our status as free beings bound by natural law implies the existence of a transcendent deity. But this is not the whole story. Every year, countless new books try to reconcile the claims of truths revealed by divine inspiration and those that are the product of earthly reason. Foundational developments and arcane speculations from theoretical physics — from the latest findings of quantum mechanics to the search for a “Theory of Everything” — take on a metaphysical import in the popular mind. One of the best known examples involves the cosmologist Stephen Hawking, who famously concluded his 1988 bestseller A Brief History of Time with the suggestion that our search for scientific meaning may someday allow us to “know the mind of God.” More recently, Hawking has backed away from this statement. His new book, The Grand Design, which posits that the universe may have created itself out of quantum fluctuations, is but the latest in a long line of volumes by prominent physicists and cosmologists translating scientific theory for a popular audience. Along with volumes by biologists with a flair for explaining complex concepts, these books have become a locus of debate about the place of God and man in our understanding of the universe.

more from Peter Lopatin at The New Atlantis here.

gutters awash in blood


In the early months of 1861—as the Confederate flag unfurled above Fort Sumter, as bands played and newly formed regiments paraded in towns and cities throughout the North and the South—two civilians sat disconsolately at the sidelines of the Civil War. One had recently taken a desk job running a horse-drawn trolley line. He spent most of his days pushing papers, trying his hardest to concentrate on the minutiae of fare revenues and fodder costs, in an office permeated with pungent aromas from the company’s adjacent stables. The other man was a down-at-the-heels, small-town shop clerk who had come to the city in search of an officer’s commission. He camped out at his in-laws’ house, trudging around the city each day, fruitlessly trying to attract the attention of the local military authorities. The trolley-car executive was named William Tecumseh Sherman. The luckless clerk was Ulysses S. Grant. Both—as unknown to one another, probably, as each was to the nation—had found themselves in St. Louis.

more from Adam Goodheart at The American Scholar here.

an unembarrassed steamroller of multiculturalism


The writer William Dalrymple lives in a farmhouse on the outskirts of New Delhi with his wife, their three children, four incestuous goats, a cockatoo, and the usual entourage of servants that attends any successful man in India’s capital city. The previous resident of the house, a British journalist, was driven from the country by death threats after he published an article in Time magazine outing the previous Indian prime minister’s bladder problems and habit of nodding off during meetings. Dalrymple is also British—Scottish, to be exact—but his controversial statements are more likely to concern the country’s Mughal or British past. He is today India’s most famous narrative historian. A number of modern British writers—including Geoff Dyer, Patrick French, and the late Bruce Chatwin—have been fascinated by the land that their ancestors once ruled, but Dalrymple is unique, in the past twenty years, for how rigorously he has pursued that fascination, writing one brilliant travel book (City of Djinns: A Year in Delhi), two vivid histories (White Mughals and The Last Mughal), and one anthology of acute journalism (The Age of Kali) about South Asia. He came to India before it had achieved its status as a frontier boomland for computer programmers and writers alike, and he has lived there, on and off, since 1989. As a result, at the age of forty-five, he has become something of a godfather to a generation of writers who are producing nonfiction about the country. The fact that Dalrymple looks like a sunnier version of the actor James Gandolfini and loves to party no doubt helps with this reputation.

more from Karan Mahajan at Bookforum here.