When I first learned about the fungus Cordyceps, I refused to believe it.
I was working on a book about the glories of parasites, so I was already in the tank for parasites, you could say. But when I read about how Cordyceps infects its insect hosts, I thought, this simply cannot be. The spores penetrate an insect’s exoskeleton and then work their way into its body, where the fungus starts to grow. Meanwhile, the insect wanders up a plant and clamps down, whereupon Cordyceps grows a long stalk that sprouts out of the dead host’s body. It can then shower down spores on unfortunate insects below.
I mean, really.
Yet this video from David Attenborough faithfully depicts the actual biology of this flesh-and-blood fungus. I also discovered that Cordyceps is not the only species that drives insect hosts upward. You don’t even have to visit a remote jungle to see one. Here in the United States, houseflies sometimes end up stuck to screen doors thanks to a fungus called Entomophthora muscae. And the lancet fluke Dicrocoelium dendriticum uses the same strategy to get into cows.
Call me naive, but I assumed that creatures as freakish and wonderful as Cordyceps and company would attract enormous amounts of scientific attention. Yet I was frustrated to discover that hardly any research has been carried out on their powers of manipulation. That’s a shame, because you cannot assume that these parasites are indeed manipulating their hosts. It’s possible, but it’s just a hypothesis that requires testing.
Two teams of Chinese researchers working separately have reprogrammed mature skin cells of mice to an embryonic-like state and used the resulting cells to create live mouse offspring. The reprogramming may bring scientists one step closer to creating medically useful stem-cell lines for treating human disease without having to resort to controversial laboratory techniques. However, the advance poses fresh ethical challenges because the results could make it easier to create human clones and babies with specific genetic traits. The latest findings are a bit of a surprise, given that Chinese scientists' contribution to lab-based stem-cell research has been modest over the years. However, Chinese scientists have been publishing more basic-research findings than in the past. The country is better known for its growing trade in unproven stem-cell therapies that have attracted patients from around the world. Reports suggest that China's health authorities have moved to regulate such activities.
The plots of soap operas are not only melodramatic; unlike any other kind of serial, they are written with no end in sight. This engages the viewer in an experience whose pace startlingly mimics that of reality, and plot itself is incidental. Like life, once a soap starts you’re along for the ride, never knowing how or when it will end. You focus on the characters’ daily affairs and less on the overall story. Soap opera characters act in real time — day by day by day, just as you and I do — but theirs are infinite, fantastic lives. To quote Guiding Light’s “Gus Aitoro,” “Everything’s easy for me. Although next year might be a problem because I was legally dead, partially, briefly.”

It’s no wonder soap operas have been so loved by women who stay home all day. The incremental timing of the narrative mimics daily life, even if the events don’t. There’s an immediacy to all the melodrama. (This might be the reason why there were so many protests when networks tried to replace the ugly rawness of standard video with the gloss of high-def.) And while the content of the narrative sounds outrageous when summarized, it doesn’t feel as strange when you’re watching it unfold over time. Maybe you haven’t yet been divorced six times, but try to write the story of your life in three paragraphs and I promise you will be shocked at the theater of it all.

In structure, soaps are far different from a show like C.S.I. The latter is self-contained, complete. The plots are generally simple and focused. It doesn’t matter much whether you watch the episodes in sequence, and the characters’ development tends to be static. Serials, however, are different. Each episode concludes with loose ends, teases that lead you along. As the plots unfold, the characters become more complicated. You watch what the characters on Law & Order do, but you don’t grow with them. With serials like soaps, you learn characters’ dark secrets, watch them slowly fall in love. And out of love.
And into love again. The way serials involve you completely in their logic is not just engaging — it’s magical.
More from Stefany Anne Golberg at The Smart Set here.
The personal honor of the private eye is the genre’s most hallowed convention. He owes nothing to anyone. He is in it only for himself; therefore, he is selfless. In Chandler’s description: “He is a relatively poor man, or he would not be a detective at all. He is a common man, or he could not go among common people. He has a sense of character, or he would not know his job. He will take no man’s money dishonestly and no man’s insolence without a due and dispassionate revenge. He is a lonely man and his pride is that you will treat him as a proud man or be very sorry you ever saw him. . . . The story is his adventure in search of a hidden truth, and it would be no adventure if it did not happen to a man fit for adventure.” The detective in Chandler’s books is Philip Marlowe, a character probably created on the model of Dashiell Hammett’s Sam Spade. (Hammett was a mystery writer Chandler did admire. “Hammett gave murder back to the kind of people that commit it for reasons, not just to provide a corpse,” he said.) Lew Archer is Ross Macdonald’s private eye; Mike Hammer is Mickey Spillane’s. Thomas Pynchon’s is named Larry (Doc) Sportello. Sportello is the best thing in Pynchon’s self-consciously laid-back and funky new novel, “Inherent Vice” (Penguin; $27.95). The title is a term in maritime law (a specialty of one of the minor characters). It refers to the quality of things that makes them difficult to insure: if you have eggs in your cargo, a normal policy will not cover their breaking. Getting broken is in the nature of being an egg. The novel gives the concept some low-key metaphysical play—original sin is an obvious analogy—but, apart from this and a death-and-resurrection motif involving a saxophonist in a surf-rock band, “Inherent Vice” does not appear to be a Pynchonian palimpsest of semi-obscure allusions. (I could be missing something, of course. I could be missing everything.)
“Pen and Parchment: Drawing in the Middle Ages” is the most original museum show in this country since 2002’s “Tapestry in the Renaissance: Art and Magnificence.” These audacious exhibitions turn scholarly probity into artistic revelation; it speaks volumes about the curatorial esprit at the Metropolitan Museum of Art that this great institution has been responsible for both events. “Tapestry in the Renaissance,” which made a definitive case for the centrality of woven images in fifteenth- and sixteenth-century European art, was the defining moment in the career of Thomas Campbell, a relatively untested curator who is now the director of the Metropolitan. It is anyone’s guess where the curator Melanie Holcomb will be in seven years, but there is no doubt that with this new, gorgeously focused show, she has reframed the place of drawing in the history of European art. I cannot imagine someone going through this epochal exhibition without being convinced that drawing was recognized as a deeply personal avowal as early as the ninth century. We may know next to nothing about the artists who did most of this work, but we can see that they were expressing their own sense of life through the energy that they brought to marks made with pen and ink on parchment.
In a famous series of stories in the 1940s, physicist George Gamow related the adventures of one Mr. C.G.H. Tompkins, a humble bank clerk who had vivid dreams of worlds where strange physical phenomena intruded into everyday life. In one of these worlds, for instance, the speed of light was 15 kilometers per hour, putting the weird effects of Einstein's theory of special relativity on display if you so much as rode a bicycle.
Not long ago I figuratively encountered one of Mr. Tompkins's great grandsons, Mr. E. M. Everard, a philosopher and engineer who is carrying on his ancestor's tradition. He told me of an amazing experience he had involving some recently discovered aspects of Einstein's theory of general relativity, which I will share with you. His remarkable story is replete with curved spacetime, cats twisting in midair, an imperiled astronaut dog paddling through a vacuum to safety—and Isaac Newton perhaps spinning in his grave.
In his 1985 essay “Freaks and the American Ideal of Manhood,” Baldwin wrote of Michael Jackson:
The Michael Jackson cacophony is fascinating in that it is not about Jackson at all. I hope he has the good sense to know it and the good fortune to snatch his life out of the jaws of a carnivorous success. He will not swiftly be forgiven for having turned so many tables, for he damn sure grabbed the brass ring, and the man who broke the bank at Monte Carlo has nothing on Michael.
Baldwin goes on to claim that “freaks are called freaks and are treated as they are treated—in the main, abominably—because they are human beings who cause to echo, deep within us, our most profound terrors and desires.” But Jackson was not quite that articulate or vocal about his difference, if he even saw it as such after a while. Certainly his early interest in subtext—expressed primarily by wordplay and choice of metaphor—receded after he released his synthesizer-heavy 1991 album, Dangerous. That album gave us “In the Closet,” where an uncredited Princess Stéphanie of Monaco pleads, at the beginning of the song, for the singer not to ignore their love, “woman to man.” (It's another link in the chain of influence; she sounds like Jackson doing Diana Ross.) In a later part of the song, Michael pleads: “Just promise me/Whatever we say/Or whatever we do/To each other/For now we'll make a vow/To just keep it in the closet.”
But this would be his last engagement of this kind. Unlike Prince, his only rival in the black pop sweepstakes, Jackson couldn't keep mining himself for material for fear of what it would require of him—a turning inward, which, though arguably not the job of a pop musician, is the job of the artist.
Merce Cunningham, who has died aged 90, was one of the greatest choreographers of the 20th century, and the greatest American-born one. As a choreographer, he never abandoned the voyage of discovery that he embarked on at the beginning of his career. Like his life partner and frequent collaborator, the composer John Cage, he remained intransigent to the last. He continued to lead his dance company, founded in 1953, until his death, and presented a new work, Nearly Ninety, last April, at the Brooklyn Academy of Music, New York, to mark his 90th birthday. In spite of what was often seen as his iconoclasm, his work was essentially classical in its formal qualities, its rigour, and its purity. Both Cunningham and Cage used chance processes, though in very different ways: Cage carried them through to the actual performance of his music, while Cunningham used them only in the creation of the choreography itself. As with any other compositional tool, what really matters is the quality of the imagination at work. Apart from Cunningham’s sheer fecundity of invention, his choreography was notable for its strength of structure, even though that structure was organic rather than preconceived.
The unresolved debate over how to monitor older drivers points not only to the difficulty of regulating an important social activity, but to the underappreciated complexity of driving itself. Getting behind the wheel of a car may be an everyday activity, but it’s also the most dangerous and cognitively assaultive thing most of us do, and the only realm in which most people are regularly confronted with split-second, life-or-death decisions. That also makes it a valuable laboratory for the study of human attention, perception, and concentration – an arena where brain science is turning seeming abstractions into hard knowledge about important life skills. “[Studying driving] turns out to be an excellent way to look at the limits of our attentional abilities, especially as we get older and we start to show significant declines,” says David Strayer, a psychology professor at the University of Utah. “It’s one of the most direct ways to be able to look at how attention works, how multi-tasking works.”
In Psychology Today, Gad Saad responds to Sharon Begley’s article on evo psych in Newsweek. One of Saad’s points is that many evo psych models incorporate contingent behavioral strategies, the “it depends” mode of explanation. I wonder, though. If the claims of evolutionary psychology are given credence by identifying them in cross-cultural, transhistoric, universal patterns of behavior, how can we know whether the variations in behavior are the result of an “it depends” hardwiring or of socio-cultural development?:
Sharon Begley has just written an article in Newsweek wherein she castigates the field of evolutionary psychology (EP) using the same antiquated and perfectly erroneous set of criticisms that have been addressed by evolutionary psychologists on endless occasions. If cats have nine lives then critics of evolutionary psychology à la Ms. Begley have infinite lives. The anti-EP dragon is slain repeatedly and yet it always resurfaces, emboldened by its blind and prideful ignorance of the facts. Unfortunately, it would take several posts for me to provide a point-by-point retort to the endless number of falsehoods that appear in her article. Instead, I will focus on a few key ones that were central to her critique.
(1) Ms. Begley’s article title, “Can We Blame Our Bad Behavior on Stone-Age Genes?”, seems to raise yet again the specter that evolutionary psychology is tantamount to genetic determinism. Evolutionary psychologists posit that the human mind does indeed consist of evolved computational systems that can be instantiated in one of several ways as a function of specific triggering inputs. Put simply, evolutionary psychologists are perfectly aware that humans are an inextricable mélange of their genes and idiosyncratic life experiences. This is known as the interactionist perspective. Epigenetic rules by definition recognize the importance of the environment in shaping the manner by which biological blueprints will be instantiated. Hence, EP does not imply that we are endowed with a perfectly rigid and inflexible human nature. Rather, we do possess an evolutionary-based human nature that subsequently interacts with environmental cues. That said, this does not imply that human nature is infinitely malleable. I challenge Ms. Begley to find a culture in the annals of recorded history where parents were overwhelmingly more concerned about their son’s chastity as compared to their daughter’s.
Who is Britain's favourite American dramatist? One year it seems to be Arthur Miller, the next it's David Mamet. Right now, Tennessee Williams is having a moment. Rachel Weisz opens in A Streetcar Named Desire tonight, at the Donmar in London. In December, a Broadway production of Cat On a Hot Tin Roof with an African-American cast, starring James Earl Jones and Adrian Lester, comes to the West End. And, in between, there is the European premiere of a forgotten 1937 play, Spring Storm, at the Royal & Derngate in Northampton. But, for all our enthusiasm for Williams, I think we still get him subtly wrong. He is most often dubbed a “psychological” dramatist, but this ignores his social and political radicalism – as well as his rich talent for comedy.
Of course, perceptions of Williams have evolved over the years. When Streetcar was first seen in London in 1949, in a production directed by Laurence Olivier and starring Vivien Leigh, Williams was viewed as a kind of filthy American sleaze-merchant. The confrontation of Blanche Dubois and Stanley Kowalski sent the British press into a tizzy: Logan Gourlay in the Sunday Express spoke for many when he condemned the play as “the progress of a prostitute, the flight of a nymphomaniac, the ravings of a sexual neurotic”. The play was attacked in Parliament as “low and repugnant”, and by the Public Morality Council as “salacious and pornographic”. When Cat On a Hot Tin Roof had its British premiere in 1958, it had to be presented under the polite fiction of a “club performance” – lest the broader public be corrupted by the discreet suggestion that its hero, Brick, is gay.
Creativity is commonly thought of as a personality trait that resides within the individual. We count on creative people to produce the songs, movies, and books we love; to invent the new gadgets that can change our lives; and to discover the new scientific theories and philosophies that can change the way we view the world. Over the past several years, however, social psychologists have discovered that creativity is not only a characteristic of the individual, but may also change depending on the situation and context. The question, of course, is what those situations are: what makes us more creative at times and less creative at others?
One answer is psychological distance. According to the construal level theory (CLT) of psychological distance, anything that we do not experience as occurring now, here, and to ourselves falls into the “psychologically distant” category. It’s also possible to induce a state of “psychological distance” simply by changing the way we think about a particular problem, such as attempting to take another person's perspective, or by thinking of the question as if it were unreal and unlikely. In a new paper, Lile Jia and colleagues at Indiana University at Bloomington demonstrate that increasing psychological distance, so that a problem feels farther away, can actually increase creativity.
History to the defeated
May say Alas but cannot help or pardon.
Auden’s anthem to the doomed Spanish Republic, his somber warning, has rarely been more relevant.
Last September Spain’s homegrown “super-judge” Baltasar Garzón—best known for his dramatic 1998 effort to arrest the late Chilean dictator Augusto Pinochet in London—announced that he was investigating not only the whereabouts of the remains of the “disappeared” of the Spanish Civil War (1936-1939), but also the huge numbers of defeated Republicans executed by General Francisco Franco in the grim postwar years. His goal was to try to amass enough evidence to charge Franco’s regime posthumously with crimes against humanity. Could it be that, after so long, “help” and “pardon” were finally coming to the descendants of those who died defending the Spanish Republic?
According to the great Hispanist Hugh Thomas, the three-year Civil War claimed the lives of 365,000 Spaniards, a toll that includes both those loyal to the fascist rebel Franco and those who opposed him. Some historians put the figure higher. Both sides carried out brutal executions, the bodies of victims often ending up in unmarked mass graves.
When the Civil War ended in 1939, the victorious Franco regime executed an additional one hundred thousand-plus Republican prisoners, many of whose corpses were flung into yet more mass-burial pits. These unmarked mounds, visited stealthily by the families of the “defeated” during the dictatorship, are scattered the length and breadth of Spain.
Because thousands of a certain generation's cinematic lives have been changed by this film, its territory is best approached with caution. Mine, however, happens to be among those thousands, 1998 marking as it did the opening of my prime window of cultural absorption. Cinephilic teenagers of the 1960s had The 400 Blows, Breathless, Dr. Strangelove; cinephilic teenagers of the 1970s had Harold and Maude, Chinatown, Taxi Driver; cinephilic teenagers of the 1980s had Repo Man, Blue Velvet, Stranger than Paradise; cinephilic teenagers of the 1990s had Rushmore.
The impact of Wes Anderson's second film didn't propel me immediately from the screening room to a new, theretofore unseen world illuminated by pure light cast forth by the angels of cinema. Its effects were those of a gradually dissolving ingested substance, working only in the fullness of time. I knew I'd seen something epiphanic, but damned if I could put my finger on what or why. While it has sparked and continues to spark in young viewers as much of a fanatic enthusiasm for film, both its appreciation and its craft, as the most radical, stylistically transgressive piece of deliberate provocation, it does so within a shell of relative normality. But though translucently thin, this shell appears to have confused almost as many filmgoers as it's blindsided with slow-acting inspiration.
“You can't tell if it's a comedy, or if it's a drama, or what it is!” complained some with whom I excitedly sought to discuss the movie. While my adolescent mind couldn't counter this grievance, I now realize that coming up with a genre to fit Rushmore into is an exercise not only doomed to futility but ignorant of the very seat of the film's strength: you can't tell if it's a comedy or a drama or what because it isn't. It is, strictly speaking, a film without genre, which is to say, a film without any of the bundles of clichés that constitute the genres' membership qualifications. This must have rendered marketing a futile ordeal, which would account for the movie's unimpressive domestic box office performance. (But since genre is a labor-saving marketer's device in the first place, perhaps this is a simple case of reaping what's been sown.)
When it comes to “memetics,” which some say is the new science of studying “memes,” consider me a skeptic. Doesn't a science need to have a clearly defined subject and verifiable findings? At this point the “meme” concept seems more or less to be where the “artificial intelligence” idea was twenty years ago: That is, it's not so much a hypothesis as it is an analogy – a somewhat vague and fluid analogy – one that lets people think in some new and smart ways but leaves them subject to flights of excessive rhetoric.
Which means it's useful … but not exactly real.
The uninitiated among you may be wondering what, exactly, is meant by the word “meme.” You're not alone. Meme advocates are still arguing about that. The word was first used by Richard Dawkins in his 1976 book The Selfish Gene, as a contraction of “mimeme” (meaning imitated behavior). Dawkins was suggesting that cultural behaviors, reproduced as one person mimics the actions of another, could be considered analogous to genes.
What are some examples of memes? Opinions vary. But the word has caught on in the blogging and Internet world, where its definition seems to be indistinguishable from “fads” or “catchphrases.” Lolcats is described as a “meme” on the Web, for example, and so is “rickrolling.” Expressions like “Jump the shark” and “FAIL” are memes in the online universe, too. A more rigorous and universally agreed-upon definition appears to be lacking.