Giving thanks for error bars

Sean Carroll in Cosmic Variance:

Error bars are a simple and convenient way to characterize the expected uncertainty in a measurement, or for that matter the expected accuracy of a prediction. In a wide variety of circumstances (though certainly not always), we can characterize uncertainties by a normal distribution — the bell curve made famous by Gauss. Sometimes the measurements are a little bigger than the true value, sometimes they’re a little smaller. The nice thing about a normal distribution is that it is fully specified by just two numbers — the central value, which tells you where it peaks, and the standard deviation, which tells you how wide it is. The simplest way of thinking about an error bar is as our best guess at the standard deviation of what the underlying distribution of our measurement would be if everything were going right. Things might go wrong, of course, and your neutrinos might arrive early; but that’s not the error bar’s fault.

Now, there’s much more going on beneath the hood, as any scientist (or statistician!) worth their salt would be happy to explain. Sometimes the underlying distribution is not expected to be normal. Sometimes there are systematic errors. Are you sure you want the standard deviation, or perhaps the standard error? What are the error bars on your error bars?
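
For readers who want Carroll's distinction made concrete, here is a minimal Python sketch (mine, not from the post) contrasting the standard deviation of a set of measurements with the standard error of their mean, which is the number you would normally quote as the error bar on an average. The sample size and the simulated data are invented for illustration.

```python
import numpy as np

# Invented example: repeated measurements of the same quantity, drawn from a
# normal distribution centered on a "true" value of 10.0 with a spread of 0.5.
rng = np.random.default_rng(0)
measurements = rng.normal(loc=10.0, scale=0.5, size=25)

mean = measurements.mean()
# Standard deviation: the typical scatter of a single measurement.
std_dev = measurements.std(ddof=1)
# Standard error: the uncertainty on the mean, which shrinks as 1/sqrt(N).
std_err = std_dev / np.sqrt(len(measurements))

print(f"mean = {mean:.3f}")
print(f"standard deviation (scatter of one measurement) = {std_dev:.3f}")
print(f"standard error of the mean (error bar on the average) = {std_err:.3f}")
```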

More here.

the suffering of the vegetables

Sir Jagadish Chandra Bose, the aforementioned carrot vivisector, was a serious man of science. Born in what is today Bangladesh in 1858, Bose was a quintessential polymath: physicist, biologist, botanist, archaeologist. He was the first person from the Indian subcontinent to receive a U.S. patent, and is considered one of the fathers of radio science, alongside such notables as Tesla, Marconi, and Popov. He was elected Fellow of the Royal Society in 1920, becoming the first Indian to be honored by the Royal Society in the field of science. It’s clear that Sir Jagadish Chandra Bose was a scientist of some weight. And, like many scientists of weight, he has become popularly known for his more controversial pursuits — in Bose’s case, his experiments in plant physiology. Perhaps it was his work in radio waves and electricity that inspired Bose’s investigations into what we might call the invisible world. Bose strongly felt that physics could go far beyond what was apparent to the naked eye. Around 1900, Bose began his investigations into the secret world of plants. He found that all plants, and all parts of plants, have a sensitive nervous system not unlike that of animals, and that their responses to external stimuli could be measured and recorded. Some plant reactions can be seen easily in sensitive plants like the Mimosa, which, when irritated, will react with the sudden shedding or shrinking of its leaves. But when Bose attached his magnifying device to plants from which it was more difficult to witness a response, such as vegetables, he was astounded to discover that they, too, became excited when vexed. All around us, Bose realized, the plants are communicating. We just don’t notice it.

more from Stefany Anne Golberg at The Smart Set here.

on the American way of death

There are, in other words, two aspects to the phenomenon of death. On the one hand, there is death itself — immutable, the single certainty all of us face, unchanging as it has always been. On the other hand, though, is how we living face the death of others, which is constantly changing, composed of ritual, emotion, and something that each culture and each generation must define — and redefine — for itself. Our current culture seems generally uncomfortable with facing the prospect of mourning, and even more uncomfortable with the dead body itself. Only nine days after the attacks of September 11, 2001, George W. Bush forcefully declared that it was time to turn grief into action, attempting to foreclose any extended period of public mourning. And personal losses aren’t much different; half a century ago, Jessica Mitford’s The American Way of Death laid bare the amount of chemicals, makeup, and money we waste in order to give death a pleasant, less death-like appearance. Death is a thing to be acknowledged but not dwelled on, not faced head-on.

more from Colin Dickey at the LA Review of Books here.

ken russell (1927-2011)

It is the commonplace fate of British cinema’s more visionary talents to end their careers marginalised and even mocked. This was certainly what happened to Ken Russell, who has died aged 84. In his latter years, with his shock of white hair and his red face, the director cut a cantankerous and slightly buffoonish figure. He asked for money for interviews. His greatest work wasn’t much in circulation. Those who knew him from such lesser efforts as The Fall Of The Louse Of Usher (2002), his eccentric and low-budget Edgar Allan Poe adaptation, or from his Cliff Richard and Sarah Brightman videos, were probably baffled that he had such a glowing reputation. The director’s son Alex Verney-Elliott said his father had died in hospital after a series of strokes. Russell’s widow, Elize, said she was “devastated” by her husband’s death, which had been “completely unexpected”. Even in his pomp, he had always been a figure of considerable controversy. He was so often called the “enfant terrible” of British film that no one paid as much attention to his craftsmanship as they should have done.

more from Geoffrey Macnab at The Independent here.

Tuesday Poem

Vocation

Each cold October morning he went out
into the Gate Field and walked up and down,
like the horse-drawn seed-drill quartering every inch
to make sure the harvest was kept constant,
reading his Office, every sentence
of the forty pages for the day. In the evening,
as the colder darkness fell with the crows’
harsh calling, he sat alone in the back
benches of the unheated chapel, hour
after hour, staring for inspiration
at the golden, unresponsive tabernacle.

by Bernard O'Donoghue
Publisher: PIW, © 2011

Aging stem cells may explain higher prevalence of leukemia, infections among elderly

From PhysOrg:

Human stem cells aren't immune to the aging process, according to scientists at the Stanford University School of Medicine. The researchers studied hematopoietic stem cells, which create the cells that comprise the blood and immune system. Understanding when and how these stem cells begin to falter as the years pass may explain why some diseases, such as acute myeloid leukemia, increase in prevalence with age, and also why elderly people tend to be more vulnerable to infections such as colds and the flu.

“We know that immune system function seems to decline with increasing age,” said Wendy Pang, MD. “This is the first study comparing the function and gene expression profiles of young and old purified, human hematopoietic stem cells, and it tells us that these clinical changes can be traced back to stem cell function.” Specifically, the researchers found that hematopoietic stem cells from healthy people over age 65 make fewer lymphocytes — cells responsible for mounting an immune response to viruses and bacteria — than stem cells from healthy people between ages 20 and 35. (The cells were isolated from bone marrow samples.) Instead, elderly hematopoietic stem cells, or HSCs, have a tendency to be biased in their production of another type of white blood cell called a myeloid cell. This bias may explain why older people are more likely than younger people to develop myeloid malignancies.

More here.

Human Nature’s Pathologist

Carl Zimmer in The New York Times:

Dr. Pinker has focused much of his research on language on a seemingly innocuous fluke: irregular verbs. While we can generate most verb tenses according to a few rules, we also hold onto a few arbitrary ones. Instead of simply turning “speak” into “speaked,” for example, we say “spoke.” As a young professor at the Massachusetts Institute of Technology, he pored over transcripts of children’s speech, looking for telling patterns in the mistakes they made as they mastered verbs. Out of this research, he proposed that our brains contain two separate systems that contribute to language. One combines elements of language to build up meaning; the other is like a mental dictionary we keep in our memory.
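
Pinker's "words and rules" picture lends itself to a toy illustration. The short Python sketch below is my construction, not Pinker's: it forms an English past tense by first consulting a small memorized dictionary of irregular forms (the "mental dictionary") and otherwise falling back on the regular "-ed" rule. The verb list and the past_tense helper are invented for the example.

```python
# Toy illustration of the "words and rules" idea: a memorized lexicon of
# irregular past tenses, plus a general rule applied to everything else.
IRREGULAR_PAST = {
    "speak": "spoke",
    "go": "went",
    "sing": "sang",
    "bring": "brought",
}

def past_tense(verb: str) -> str:
    # Lexical lookup first (the memorized exceptions)...
    if verb in IRREGULAR_PAST:
        return IRREGULAR_PAST[verb]
    # ...otherwise apply the productive rule: add "-ed" (with a minimal spelling tweak).
    if verb.endswith("e"):
        return verb + "d"
    return verb + "ed"

print(past_tense("speak"))  # spoke   (retrieved from memory)
print(past_tense("walk"))   # walked  (generated by rule)
print(past_tense("blick"))  # blicked (the rule extends to novel verbs, as children do)
```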

This research helped to convince Dr. Pinker that language has deep biological roots. Some linguists argued that language simply emerged as a byproduct of an increasingly sophisticated brain, but he rejected that idea. “Language is so woven into what makes humans human,” he said, “that it struck me as inconceivable that it was just an accident.” Instead, he concluded that language was an adaptation produced by natural selection. Language evolved like the eye or the hand, thanks to the way it improved reproductive success. In 1990 he published a paper called “Natural Language and Natural Selection,” with his student Paul Bloom, now at Yale. The paper was hugely influential. It also became the seed of his breakthrough book, “The Language Instinct,” which quickly became a best seller and later won a place on a list in the journal American Scientist of the top 100 science books of the 20th century.

More here.

Monday, November 28, 2011

Sunday, November 27, 2011

Inverting the Turing Test

Stuart Shieber in American Scientist:

In his book The Most Human Human, Brian Christian extrapolates from his experiences at the 2009 Loebner Prize competition, a competition among chatbots (computer programs that engage in conversation with people) to see which is “most human.” In doing so, he demonstrates once again that the human being may be the only animal that overinterprets.

You may not have heard of the Loebner competition, and for good reason. The annual event was inspired by the Turing test, proposed by Alan Turing in his seminal 1950 paper “Computing Machinery and Intelligence” as a method for determining in principle whether a computer possesses thought. Turing meant his test as a thought experiment to address a particular philosophical question, namely, how to define a sufficient condition for properly attributing intelligence, the capacity of thinking, to a computer. He proposed that a blind controlled test of verbal indistinguishability could serve that purpose. If a computer program were indistinguishable from people in a kind of open-ended typewritten back-and-forth, the program would have passed the test and, in Turing’s view, would merit attribution of thinking.

The Loebner competition picks up on this idea; it charges a set of judges to engage in conversation with the chatbot entrants and several human confederates, and to determine which are the humans and which the computers. At the end, a prize is awarded to the “most human” chatbot—that is, the chatbot that is most highly ranked as human in paired tests against the human confederates. “Each year, the artificial intelligence (AI) community convenes for the field’s most anticipated and controversial annual event,” Christian says. Well, not so much. The AI community pretty much ignores this sideshow. It’s the chatbot community that has taken up the Loebner competition.
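
As a rough illustration of how the "most human" ranking might be tallied, here is a short Python sketch. It is a simplification of my own, not the Loebner Prize's official scoring rules, and the judges, bots, and verdicts are invented.

```python
# Simplified tally: each judge compares one chatbot against one human confederate
# and records whether the chatbot seemed more human than the confederate.
# The entrant judged "more human" most often takes the "most human chatbot" prize.
from collections import Counter

# Invented example data: (judge, chatbot, ranked_more_human_than_confederate)
paired_verdicts = [
    ("judge1", "botA", False),
    ("judge1", "botB", True),
    ("judge2", "botA", True),
    ("judge2", "botB", False),
    ("judge3", "botB", True),
]

wins = Counter(bot for _, bot, more_human in paired_verdicts if more_human)
most_human_bot, win_count = wins.most_common(1)[0]
print(f"Most human chatbot: {most_human_bot} ({win_count} paired wins)")
```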

More here.

Offense, irony, comedy, and who knows what else

Jed Perl in The New Republic:

They are selling postcards of Hitler in the gift shop at the Guggenheim Museum. To be precise, they are selling photographic reproductions of a work entitled Him, a polyester portrayal of the Führer that is one of the works by Maurizio Cattelan in his retrospective at the museum. I can imagine being outraged or at least troubled by the postcards in the gift shop, except that by the time I saw them I had already been bombarded by this exhibition in which nearly all of Cattelan’s oversized neo-Dadaist baubles have been hung from the ceiling of Frank Lloyd Wright’s rotunda. Cattelan’s Hitler doll—like his Picasso doll, his bicycle, his dinosaur, and the rest of the 128 items in this stupefyingly sophomoric show—is engineered for offense, irony, comedy, or who knows what else. Those who are bothered by the Hitler postcards in the gift shop are naturally going to be dismissed as insufficiently hip. The same goes for those who are disturbed by the sight of one of the world’s greatest public spaces once again turned over to an art world charlatan as his personal playpen. My own feeling is that the postcards, however misbegotten, are speech we accept, although not necessarily embrace, in a society we prize for its openness. What is really disquieting is the event that has occasioned these postcards. “Maurizio Cattelan: All”—that’s the title of the show—amounts to hate speech directed at the sponsoring institution.

More here.

Sunday Poem

Mapping the Interior

Imagine that you had a dishcloth
Bigger than the one mothers put on the bread
To slow its cooling, that you could spread
Over the whole kitchen floor to bring up its face
As clearly as the features on the cake.

You’d have a print you could lift up
To the light and examine for individual traces
Of people who came to swap yarns, and sit on
Sugan chairs that bit into the bare floor, leaving
Unique signatures on concrete that creased
Over time into a map you could look at and

Imagine what those amateur cartographers
Were thinking when their eyes fell, in the silence
Between the stories, that was broken only by
The sound of the fire and whatever it was that
Was calling in the night outside.

by Eugene O'Connell
from One Clear Call
Bradshaw Books, Cork, © 2003

Storms

Mitch Dobrowner in Lensculture:

We landscape photographers count ourselves lucky to be in the right place at the right time if a storm system is moving through — but I wanted to actively pursue these events. Since storms are a process (not a thing) I needed a guide. I soon connected with Roger Hill (regarded as the most experienced storm-chaser in the world); he introduced me to Tornado Alley and the Great Plains of the United States. In July 2009 Roger and I tracked a severe weather system for nine hours — from its formation outside of Sturgis, South Dakota, through Badlands National Park and into Valentine, Nebraska. Eventually we stopped in a field outside of Valentine, and there we stood in awe of the towering supercell (a thunderstorm with a deep rotating updraft), which was building with intake wind gusts of 60 mph. It was like standing next to a 65,000-foot-high vacuum cleaner. It was unlike anything I had seen before in my life; the formation of the supercell had an ominous presence and power that I had never witnessed or experienced before. I remember turning to Roger, who was standing next to me, and saying, 'what the ****… you have to be kidding me'. It was only the second day of my “experiment” in shooting storms, but I knew without a doubt that this experiment would become an important project to me.

Words are inadequate to describe the experience of photographing this immense power and beauty. And the most exciting part is with each trip I really don’t know what to expect. But now I see these storms as living, breathing things.

More here. [Editor's note: You'll discover a lot more detail when you look at these images in our high-resolution slide show.]

The Era of Small and Many

Bill McKibben in Orion Magazine:

Earlier this year, my state’s governor asked if I’d give an after-lunch speech to some of his cabinet and other top officials who were in the middle of a retreat. It’s a useful discipline for writers and theorists to have to summarize books in half an hour, and to compete with excellent local ice cream. No use telling these guys how the world should be at some distant future moment when they’ll no longer be in office—instead, can you isolate themes broad enough to be of use to people working on subjects from food to energy to health care to banking to culture, and yet specific enough to help them choose among the options that politics daily throws up? Can you figure out a principle that might undergird a hundred different policies? Or another way to say it: can you figure out which way history wants to head (since no politician can really fight the current) and suggest how we might surf that wave?

Here’s my answer: we’re moving, if we’re lucky, from the world of few and big to the world of small and many. We’ll either head there purposefully or we’ll be dragged kicking, but we’ve reached one of those moments when tides reverse. Take agriculture. For 150 years the number of farms in America has inexorably declined. In my state—the most rural in the nation—the number of dairies fell from 11,000 at the end of World War II to 998 this summer. And of course the farms that remained grew ever larger—factory farms, we called them, growing commodity food. Here in Vermont most of the remaining dairies are big, but not big enough to compete with the behemoths in California or Arizona; they operate so close to the margin that they can’t afford to hire local workers and instead import illegal migrants from Mexico. But last year the USDA reported that the number of farms in America had actually increased for the first time in a century and a half. The most defining American demographic trend—the shift that had taken us from a nation of 50 percent farmers to less than 1 percent—had bottomed out and reversed. Farms are on the increase—small farms, mostly growing food for their neighbors.

More here.

Saturday, November 26, 2011

Lynn Margulis 1938-2011

John Brockman in Edge:

Biologist Lynn Margulis died on November 22nd. She stood out from her colleagues in that she would have extended evolutionary studies nearly four billion years back in time. Her major work was in cell evolution, in which the great event was the appearance of the eukaryotic, or nucleated, cell — the cell upon which all larger life-forms are based. Nearly forty-five years ago, she argued for its symbiotic origin: that it arose by associations of different kinds of bacteria. Her ideas were generally either ignored or ridiculed when she first proposed them; symbiosis in cell evolution is now considered one of the great scientific breakthroughs.

Margulis was also a champion of the Gaia hypothesis, an idea developed in the 1970s by the freelance British atmospheric chemist James E. Lovelock. The Gaia hypothesis states that the atmosphere and surface sediments of the planet Earth form a self-regulating physiological system — Earth's surface is alive. The strong version of the hypothesis, which has been widely criticized by the biological establishment, holds that the earth itself is a self-regulating organism; Margulis subscribed to a weaker version, seeing the planet as an integrated self-regulating ecosystem. She was criticized for succumbing to what George Williams called the “God-is-good” syndrome, as evidenced by her adoption of metaphors of symbiosis in nature. She was, in turn, an outspoken critic of mainstream evolutionary biologists for what she saw as a failure to adequately consider the importance of chemistry and microbiology in evolution. I first met her in the late '80s and in 1994 interviewed her for my book The Third Culture: Beyond the Scientific Revolution (1995). Below, in remembrance, please see her chapter, “Gaia is a Tough Bitch”. One of the compelling features of The Third Culture was that I invited each of the participants to comment about the others. In this regard, the end of the following chapter has comments on Margulis and her work by Daniel C. Dennett, the late George C. Williams, W. Daniel Hillis, Lee Smolin, Marvin Minsky, Richard Dawkins, and the late Francisco Varela. Interesting stuff.

More here.

Jeffrey Eugenides talks about ‘The Marriage Plot’ and pokes fun at literary theorists

From The Christian Science Monitor:

Jeffrey Eugenides published his first novel at 33 after he was fired from his position as executive secretary at the Academy of American Poets. The reason he lost his job? He was spending too much time at work honing the manuscript of his debut novel “The Virgin Suicides” (1993). “Middlesex,” his second novel, earned Eugenides the Pulitzer Prize for fiction in 2003. “The Marriage Plot,” just released in October, is Eugenides' third novel, which took him nine years to complete. The story centers around three college students – Madeleine, Leonard, and Mitchell – all of whom graduate from Brown University in 1982. The book is a postmodernist take on the original marriage plot within the Victorian novel. A lot of the time, it is also a novel about other novels, in which the characters spend their time discussing Derrida, Tolstoy, Austen, and Hemingway. The second half of the book moves away from literary theory, and in some colorful scenes set in Paris, Calcutta, and New York, Eugenides explores the difficulties of dealing with mental illness, failed romance, and one man’s struggle with religious faith. And of course Eugenides also returns to his central source of inspiration: the coming-of-age story. Eugenides recently spoke to the Monitor about the extent of free will, why semiotics is needlessly convoluted, and how reading James Joyce nearly made him choose a career in religion over a career in writing.

Your new novel moves more towards realism than your previous work. Why the change in style?

I’ve always considered myself a realist at heart. I’ve never written a book that violated physical principles. My books often have an atmosphere of the fantastic or the surreal, but actually nothing happens in them that couldn’t happen in reality, so I don’t know if this book is that much of a departure in terms of realism.

More here.

Saturday Poem

Arriving Shortly

When amma came
to New York City,
she wore unfashionably cut
salwar kurtas,
mostly in beige,
so as to blend in,
her body
a puzzle that was missing a piece –
the many sarees
she had left behind:
that peacock blue
Kanjeevaram,
that nondescript nylon in which she had raised
and survived me,
the stiff chikan saree
that had once held her up at work.

When amma came to
New York City,
an Indian friend
who swore by black
and leather,
remarked in a stage whisper,

“This is New York, you know –
not Madras.
Does she realise?”

Ten years later,
transiting through L.A. airport
I find amma
all over again
in the uncles and aunties
who shuffle past the Air India counter
in their uneasily worn, unisex Bata sneakers,
suddenly brown in a white space,
louder than ever in their linguistic unease
as they look for quarters and payphones.
I catch the edge of amma’s saree
sticking out
like a malnourished fox’s tail
from underneath
some other woman’s sweater
meant really for Madras’ gentle Decembers.

by K. Srilata
from Arriving Shortly
Publisher: Writers Workshop, Kolkata, © 2011