Carl Zimmer in his excellent blog, The Loom:
Scientists have learned a lot more about parasitoid wasps since Darwin wrote about them in 1860, and their elegant viciousness is now even more staggering to behold. Not only do they devour their hosts alive from the inside out, but they also manipulate the behavior of their hosts to serve their own needs (see my post on zombie cockroaches for one particularly startling example).
To be fair, though, parasitoid wasps are not just vicious to their hosts. They can be just as nasty to other parasitoid wasps. Some wasp larvae can only mature inside other parasitoids, turning their host into a grotesque Russian doll. And, as I write in tomorrow’s New York Times, some wasps turn their caterpillar host into a battlefield, waging all-out war with other wasps. They kill other species of wasps, and will even kill their own siblings by the thousands. (Be sure to see the diagram of the sci-fi life cycle of the wasp Copidosoma floridanum. By the end of it, the caterpillar is a mummified mass of pupae.)
These creatures are certainly bizarre, but bizarre in a scientifically interesting way. Scientists have found that the evolutionary forces that shape other animals can also explain these wasps. As I explain in the article, the warfare among the wasps probably arises thanks to the peculiar way they develop. A single egg (like the one being laid inside a host egg in the picture) gives rise to thousands of genetically identical siblings. Up to a quarter of them become vicious soldiers, while the rest become passive feeders. The soldiers are sterile, lacking any sex cells. In a way, they’re not even really individuals. In a genetic sense, they’re like disembodied organs. Imagine you could send your liver off to kill your enemies.
His last personal secretary returns to Havana and discovers that the novelist’s mythic presence looms larger than ever.
Valerie Hemingway in Smithsonian Magazine:
Nine miles outside the city I arrived at what I had come to see: Finca Vigía, or Lookout Farm, where Ernest Hemingway had made his home from 1939 to 1960, and where he had written seven books, including The Old Man and the Sea, A Moveable Feast and Islands in the Stream.
The Finca Vigía had been my home too. I lived there for six months in 1960 as Hemingway’s secretary, having met him on a sojourn to Spain the previous year, and I returned to the finca for five weeks in 1961 as a companion to his widow, Mary. (Later, I married Ernest’s youngest son, Gregory; we had three children before we divorced in 1987; he died in 2001.) I well remember the night in 1960 when Philip Bonsall, the U.S. ambassador to Cuba and a frequent visitor, dropped by to say that Washington was planning to cut off relations with Fidel Castro’s fledgling government, and that American officials thought it would be best if Hemingway demonstrated his patriotism by giving up his beloved tropical home. He resisted the suggestion, fiercely.
Tom Standage in the Christian Science Monitor:
In many cases, bottled water is actually derived from tap water and filtered – which is why PepsiCo has just agreed to add the words “public water source” to the label of its Aquafina water. But water from glacial springs is not inherently superior. Worse, shipping it around causes unnecessary environmental damage. Refrigeration wastes even more energy. Then there are the millions of plastic bottles, many of which end up in landfills.
Surely bottled water is purer and safer? Actually, no. The regulations governing the quality of public water supplies are far stricter than those governing bottled-water plants. True, there are sometimes contamination problems with tap water, but the same is true of bottled water.
The industry responds that it is not selling water; it is selling “portable hydration.” But filling a bottle from the tap works just as well. The industry also likes to point out that bottled water is a healthy, calorie-free alternative to sugary soda drinks. The same goes for tap water.
Alun Anderson at Edge.org:
Knowing that Arctic climate models are imperfect, it would be reassuring for me, if not for the scientists, to be able to write that scientists keep making grim predictions that just don’t come true. If that were so, we could follow Dyson’s line that the models aren’t so good and “the fuss is exaggerated”. Scarily, the truth is the other way around. The ice is melting faster than the grimmest of the scientists’ predictions, and the predictions keep getting grimmer. Now we are talking about an Arctic free of ice in summer by 2040. That’s a lot of melting given that, in the long, dark winter the ice covers an area greater than that of the entire United States.
Margaret Moorman in Columbia Magazine:
While Collins is an unabashed advocate for rigorous classical training, he seems too relaxed and sophisticated to proselytize, and he eschews negativity. “I don’t really want to advertise myself as disaffected,” he says. “It’s not that I didn’t like modernism, but I loved extraordinary draftsmanship. I looked at Hans Holbein and Raphael and Michelangelo, and that’s what I wanted to do so much. If it doesn’t have that classical, underlying, structured draftsmanship, it’s just not what I’m interested in.” He respects some 20th-century painters, especially the abstract expressionists, whose sense of the transformative power of art is close to his own, but “that doesn’t mean that I care about them very much. What happened after that — the irony of postmodernism — is just ridiculous. I don’t care about it at all.”
Martin Haake in the Los Angeles Times:
The subway cars in New York are plastered with ads featuring cartoonish character faces with absolutely no hint about the advertisement’s purpose except for a come-on with the word Windorphins.
So what are Windorphins? A video game? A kiddie show? A sugary snack? A new drug to make you feel like your endorphins are kicking in?
None of the above. Windorphins is a new marketing gimmick for EBay, with ads so inscrutable as to be ridiculous. If the ads aroused enough curiosity for people to check out windorphins.com, some might have been disappointed to find that the mysterious windorphins (whatever those are) were simply a big tease.
But curiosity — in the form of a riddle, a mystery, a puzzle, even a clever bit of deception — is a powerful thing. Advertising has learned that teasing the public without giving too much away can be an effective marketing tool that can create a tremendous amount of excitement. Why? Because of our tremendous need to know.
Patricia Cohen in the NY Times:
PROVIDENCE, R.I. — For children, play is easy. You can do it anytime, anywhere, with anyone, and it’s fun. For adults, play is hard. They want to know if it’s safe for their kids, if it’s educational, if it promotes motor coordination, if it’s environmentally friendly, if it will look good on a preschool application.
The tension between how children spend their free time and how adults want them to spend it runs through Howard P. Chudacoff’s new book, “Children at Play: An American History” (New York University Press), like a yellow line smack down the middle of a highway.
“Kids should have their own world, and parents are nuisances,” said Mr. Chudacoff, a professor of history at Brown University.
His critique is increasingly echoed today by parents, educators and children’s advocates who warn that organized activities, overscheduling and excessive amounts of homework are crowding out free time and constricting children’s imaginations and social skills.
“It seems like a really timely book,” said Cindy Dell Clark, a historian at Penn State Delaware County and a consultant to the Please Touch Museum in Philadelphia. “We’ve taken a lot of privacy and autonomy out of a child’s day.”
The topic may seem an odd choice for Mr. Chudacoff, 64, given that he has no children of his own, but then again, Mr. Chudacoff is also the author of a book about bachelors (“The Age of the Bachelor,” Princeton University Press, 1999) even though he has been married for nearly 40 years.
Chris Cheesman in Amateur Photographer:
Famed photo agency Magnum will celebrate 60 years of fashion photography with an exhibition split between two London venues from 14 September.
One show, at the Atlas Gallery, will feature vintage images from legends such as Robert Capa and Eve Arnold.
A separate show at The Magnum Print Room will focus on images from Fashion Magazine, featuring work from photographers such as Martin Parr, Bruce Gilden and Alec Soth.
The exhibition is entitled Documenting Style: 60 Years of Fashion Photography from Magnum Photos.
The author’s book tour—for God Is Not Great—takes a few miraculous turns, including the P.R. boost from Jerry Falwell’s demise, a chance encounter with the Archbishop of Canterbury, and surprising support for an attack on religion.
Christopher Hitchens in Vanity Fair:
You hear all the time that America is an intensely religious nation, but what you don’t hear is that there are almost as many religions as there are believers. Moreover, many ostensible believers are quite unsure of what they actually believe. And, to put it mildly, the different faiths don’t think that highly of one another. The emerging picture is not at all monolithic.
People seem to be lying to the opinion polls, as well. They claim to go to church in much larger numbers than they actually do (there aren’t enough churches in the country to hold the hordes who boast of attending), and they sometimes seem to believe more in Satan and in the Virgin Birth than in the theory of evolution. But every single time that the teaching of “intelligent design” has actually been proposed in conservative districts, it has been defeated overwhelmingly by both courts and school boards. A fascinating new book, 40 Days and 40 Nights, describes this happening in detail in the small town of Dover, Pennsylvania.
Michael J. Disney in American Scientist:
It is true that the modern study of cosmology has taken a turn for the better, if only because astronomers can now build relevant instruments rather than waiting for serendipitous evidence to turn up. On the other hand, to explain some surprising observations, theoreticians have had to create heroic and yet insubstantial notions such as “dark matter” and “dark energy,” which supposedly overwhelm, by a hundred to one, the stuff of the universe we can directly detect. Outsiders are bound to ask whether they should be more impressed by the new observations or more dismayed by the theoretical jinnis that have been conjured up to account for them.
My limited aim here is to discuss this dilemma by looking at the development of cosmology over the past century and to compare the growing number of independent relevant observations with the number of (also growing) separate hypotheses or “free parameters” that have had to be introduced to explain them. Without having to understand the complex astrophysics, one can still ask, at an epistemological level, whether the number of relevant independent measurements has overtaken and comfortably surpassed the number of free parameters needed to fit them—as one would expect of a maturing science. This approach should be appealing to nonspecialists, who otherwise would have little option but to believe experts who may be far too committed to supply objective advice. What one finds, in my view, is that modern cosmology has at best very flimsy observational support.
From The Harvard Gazette:
The Sombrero Galaxy – 28 million light years from Earth – was voted best picture taken by the Hubble telescope. The dimensions of the galaxy, officially called M104, are as spectacular as its appearance. It has 800 billion suns and is 50,000 light years across.
From The Washington Post:
AWAY by Amy Bloom.
Amy Bloom knows the urgency of love. As a practicing psychotherapist, she must have heard that urgency in her patients’ stories, and in 1993 when she broke onto the literary scene with Come To Me, we heard it in hers. She has never strayed from that theme. Four years later, she published Love Invents Us and followed that with another collection in 2000, A Blind Man Can See How Much I Love You. A finalist for the National Book Award and the National Book Critics Circle Award, Bloom writes with extraordinary care about people caught in emotional and physical crosswinds: desires they can’t satisfy, illnesses they can’t survive, and — always — love that exceeds the boundaries of this world.
It’s the kind of humid, overwrought territory where you’d expect to find pathos and melodrama growing like mold, but none of that can survive the blazing light of her wisdom and humor. Now, with her aptly named second novel, Away, Bloom has stepped confidently into America’s past to work in that old and ever-expanding genre of immigrant lit. It seems, at first, a familiar tune, but she plays it with lots of brio and erotic charge.
In Ars Technica (for Maya Nair):
Orange you glad I didn’t say banana?
This is the kind of humor we’re used to hearing from kids, colleagues who think they are funnier than they really are, and the likes of Steve Wozniak. Well, Woz may have a newly interested audience for his hilarious joke-telling appearances, because researchers at the University of Cincinnati have developed a robot that is capable of recognizing simple humor made up of wordplay and bad puns.
“The ability to appreciate humor is an enormous increment in subtlety,” said researcher Tom Mantei from the University of Cincinnati’s College of Engineering in a statement. “You need to know a lot to ‘get’ humor—a computer does not find it easy.”
That’s what UC doctoral student Julia Taylor and professor Larry Mazlack have discovered in their project on data mining. They reported on their progress with the project at the American Association for Artificial Intelligence conference in Vancouver this week, and while they feel they have made great progress so far, they also feel that they have a long way to go.
“The ‘robot’ is just a software program that still needs a lot of work,” says Taylor. “The idea is to be able to recognize jokes that are based on phonological similarity of words.”
The software recognizes humor by processing the words used in the joke and comparing them with a vocabulary database, which must be created by a world-wise human being.
[H/t: Anandaroop Roy.]
James Galbraith, Olivier Giovannoni and Ann J. Russo suggest that the Fed acts on a bias against full employment, not against inflation per se.
Using a VAR model of the American economy from 1984 to 2003, we find that, contrary to official claims, the Federal Reserve does not target inflation or react to “inflation signals.” Rather, the Fed reacts to the very “real” signal sent by unemployment, in a way that suggests that a baseless fear of full employment is a principal force behind monetary policy. Tests of variations in the workings of a Taylor Rule, using dummy variable regressions on data going back to 1969, suggest that after 1983 the Federal Reserve largely ceased reacting to inflation or high unemployment, but continued to react when unemployment fell “too low.” We further find that monetary policy (measured by the yield curve) has significant causal impact on pay inequality, a domain where the Federal Reserve refuses responsibility. Finally, we test whether Federal Reserve policy has exhibited a pattern of partisan bias in presidential election years, with results that suggest the presence of such bias, after controlling for the effects of inflation and unemployment.
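For readers unfamiliar with the benchmark the authors test against, here is a minimal sketch of the standard Taylor (1993) rule, which prescribes a policy rate as a function of inflation and the output gap. This is purely illustrative: the coefficients below are Taylor’s original suggested values, not estimates from the Galbraith–Giovannoni–Russo paper, whose point is precisely that the Fed’s actual behavior deviates from such a rule.

```python
def taylor_rule(inflation, output_gap, real_rate=2.0, target_inflation=2.0):
    """Prescribed nominal policy rate (percent) under the classic Taylor rule:
    i = r* + pi + 0.5*(pi - pi*) + 0.5*(output gap)."""
    return (real_rate + inflation
            + 0.5 * (inflation - target_inflation)
            + 0.5 * output_gap)

# Example: 3% inflation with output 1% above potential
print(taylor_rule(3.0, 1.0))  # -> 6.0
```

The paper’s dummy-variable regressions amount to letting these reaction coefficients shift across subperiods (e.g. before and after 1983) and testing whether the post-1983 responses to inflation and high unemployment are statistically distinguishable from zero.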