How Leonard Nimoy (RIP) grew to love Spock as much as we did

Andrew Collins in The Guardian:

As with many others of my generation, Mr Spock was my babysitter. What we now refer to as “the original series” of Star Trek – it having since been superseded by four others, not to mention a dozen motion pictures – was famously cancelled by NBC in America in 1969 after three seasons, but it started airing here that July and boldly went into eternal syndication like no show had done before. I’m guessing I started watching it a couple of years later, unaware that its vision of a pioneering American future was already history. Spock was my favourite character on that famous bridge. Wasn’t he logically everybody’s?

As played by Leonard Nimoy, a Boston-raised polymath of Ukrainian parentage who eventually learned to embrace the pixie-eared half-Vulcan who made him an international icon, the starship Enterprise’s science officer was our appointment to view in those glory years when those of us too young to see science fiction at the cinema snaffled it up on TV. It was Mr – never Doctor – Spock who kept his head while all around were losing theirs, whether to a sexy female alien like fallible farmboy Captain Kirk, or amid some engine-room catastrophe like Scotty. (I seem to remember my mum having at least one baby book by the famous American paediatrician and Olympic rowing medallist Doctor Spock, who empowered mothers with his 1946 book Baby and Child Care. He was not Spock.)

Although the thespian and the half-Vulcan were two different people, to us they were one and the same. We assumed Nimoy to be as calmly logical and emotionally repressed as Spock. Nimoy’s relationship with his alter-ego was encapsulated by the titles of his two autobiographies, I Am Not Spock, published in 1975, and I Am Spock, 20 years later. But Nimoy was Spock; he even invented the famous Vulcan “neck-pinch” as a fighting technique suitable for a vegetarian, which Spock was. And the Vulcan salute (do it now), which he adapted from a blessing sign used by kohanim priests. The actor admitted that Spock’s personality had influenced his own in real life. Nimoy is in Spock’s green blood and Spock is in Nimoy’s red equivalent.

More here.

From computational complexity to quantum mechanics

Michael Segal in Nautilus:

Scott Aaronson, theoretical computer scientist and professor at the Massachusetts Institute of Technology (MIT), runs a popular blog called “Shtetl-Optimized.” Which is a curious title, given its focus on computational complexity. When I asked Aaronson about the connection, he replied that he saw himself as someone designed for a different era—like, for instance, the 19th-century Jewish village, or shtetl, from which he descended, and where studying was, for many, the central activity of life.

Completing his undergraduate studies at age 18, and earning tenure at MIT at age 31, Aaronson has certainly made study a central part of his own life. But it’s not just computer science that draws his interest. His book, Quantum Computing Since Democritus, touches on consciousness, free will, and time travel. A recent discussion on his blog about gender roles in science has drawn 609 comments as of this writing. And he does not shy away from public debate, having become one of the most persistent critics of claims made by startup D-Wave Systems that they are selling operational quantum computers. Why not just turn a blind eye and let those claims slide? “This is just not who I am,” says Aaronson.

More here.

People are animals, too

The human brain is special. Just not that special. To understand animal minds, and our own place in the living world, we should remove ourselves from centre stage.

Peter Aldhous in Mosaic:

Tommy the chimpanzee got his day in court on 8 October 2014. He was unable to attend the hearing in ‘person’ – spending the day, like any other, in a cage at a used trailer sales lot in Gloversville, New York. But an hour’s drive away, in a courtroom in the state capital of Albany, Steven Wise of the Nonhuman Rights Project argued that Tommy should indeed be considered a person under New York state law. If so, Patrick and Diane Lavery of Circle L Trailer Sales could be summoned to determine whether they are imprisoning him illegally.

Central to Wise’s arguments in Tommy’s case, and similar suits his organisation has filed on behalf of other captive chimpanzees, is the assertion that apes are highly intelligent and self-aware beings with complex emotional lives. “The uncontroverted facts demonstrate that chimpanzees possess the autonomy and self-determination that are supreme common law values,” Wise told the five judges hearing the case.

It is a bold legal move – and so far unsuccessful. The court in Albany, like a lower court before it, rejected the idea that Tommy has legal rights of personhood. But Wise intends to fight on, taking Tommy’s case to the state’s ultimate arbiter, the New York Court of Appeals.

Events elsewhere in New York State stand in stark contrast to its courts’ willingness to consider the legal implications of the science of animal cognition. In March 2014, the Rip Van Winkle Rod and Gun Club in Palenville, a hamlet of some 1,000 people on the Hudson River, held the fourth instalment of an annual festival that makes a competitive sport out of shooting down creatures that – judged by objective measures of their mental abilities – are arguably just as deserving of personhood as Tommy.

More here.

Right Brained, Wrong Brained: How Caltech Neuroscience Became a Buzzfeed Quiz

Jason G. Goldman in Los Angeles Magazine:

Somewhere between art class and algebra, most of us learn—probably after struggling in one area and excelling in the other—which “side” of our brain is dominant. You are either left brained or right brained. (And if you are in doubt, you can turn to any number of online tests to peg your hemispheric tendencies once and for all.) Left brainers are supposed to be analytical, orderly, mathematical, and good with language. Right brainers tend to be more disorganized, creative, artistic, and visual. A test on BuzzFeed informs me that I’m right brained, though as a science writer, my background would suggest that I draw more from the left. This cognitive shorthand for establishing left- or right-brain dominance doesn’t just aid us in discerning the nature of our talents and shortcomings; it has fueled TED talks, magazine articles, and best-selling books on how to make the most of your alpha side and shore up the weaker one.

Angelenos are always looking for new target areas for self-improvement, but there’s an inherent flaw in treating the two sides of the brain as if they’re biceps. Cognitive development doesn’t work that way. As the neuroscientist whose research helped define split-brain theory in humans will tell you, you’re limiting your potential by trying to fit something as complicated as the brain into two tidy categories.

“There’s a folk psychology that says people have two skills—they’re more verbal or more visual, more artistic or more analytical,” says Michael S. Gazzaniga, director of the SAGE Center for the Study of the Mind at the University of California, Santa Barbara. “But the simple dynamics of ‘the left brain does this, the right brain does that’ are way overdone.”

More here.

What I Mean by Mexico

James Fenton at the NY Review of Books:


Birds everywhere. Fish everywhere. And when we came near the ocean we began to encounter fishermen of clearly African descent. Their village, with its two aspects, the tourist side facing the ocean and a great expanse of beach, the fishing side facing the lagoon, looked like a little paradise. One had to remember that, if its isolation had been an advantage for its founder members, peons of the lowest class of the society of the day, it must also—as was the whole coast—have been plagued with disease and other forms of daily risk. The fishing (crocodiles apart) might have been the easiest thing about it. As it is, today the village is painted with propaganda about public health, explaining the risks and symptoms of tuberculosis, influenza and so forth.


The houses are often painted in bright colors, combinations that can probably be traced to the availability of individual pigments at the time of painting. But sometimes a color plan is evident: an imposing bright green facade with a yellow trim, or a yellow facade with magenta stripes. One small inn showed boldly in bright green at the base and different hues of blue on an upper story, but set off by contrasting panels of chocolate and hazelnut. Gardens, where they existed, pushed in the direction of a riot of reds. A Catholic church, with no priest or weekly service, used mainly for weddings and baptisms, was painted in the colour known to Benjamin Moore as Bermuda teal. And in this pleasant environment, women with hair combed out in Afrostyle, but men—obeying the current international fashion—tending towards the faux-hawk.

more here.


‘Memory Theatre’ by Simon Critchley

Will Rees at The Quarterly Conversation:

An English philosopher called Simon Critchley moves from Essex to New York after becoming disillusioned with the state of British academia. On returning to Essex to clear out his office he finds five boxes, each marked with a zodiac sign from Capricorn to Gemini (the Taurus box is missing), containing the unpublished notebooks and manuscripts of the great, recently deceased, Michel Haar.

Among the boxes is a text called “Le théâtre de mémoire selon G.W.F. Hegel,” “an entirely original interpretation of Hegel’s monumental 1807 Phenomenology of Spirit.” Haar’s interpretation of Hegel is informed by a reading of British art historian Frances Yates’s The Art of Memory, a book tracing mnemonic systems from antiquity to the early modern period; in particular by her account of the Renaissance philosopher Giulio Camillo’s theatre of memory, a theoretical structure that “hold[s] the sum of knowledge in a way that would permit total recall.”

Having lost a great deal of his memory in an industrial accident, Critchley finds the idea of the memory theatre at once quaint and irresistible. Later he opens the fifth box—marked Pisces; his astrological sign—and finds a series of astrological projections. “They weren’t so much birth charts as death charts, necronautical rather than genethlialogical. Their purpose was to plot the major events in a philosopher’s life and then to use those events to explain their demise.” One of them, produced 18 years before Haar’s death, predicts that death to the minute. “Knowing his fate, he had simply lost the will to live. He arrived dead just on time.”

more here.

‘The Buried Giant’, by Kazuo Ishiguro

Jason Cowley at the Financial Times:

Kazuo Ishiguro’s first novel for a decade is, on one level, a complete surprise. It’s set in England in the Dark Ages no less, perhaps in the fifth or early sixth century, a period about which little certain is known. The Romans have left Britain and the Saxons have arrived, built settlements, and fought wars of conquest and survival. The people Ishiguro calls “Britons” have been forced into an uneasy accommodation with the settlers, and ogres and pixies roam a bleak, damp landscape.

Ishiguro has set novels in a parallel dystopian England in which child clones are being reared for organ donation in ignorance of their ultimate fate (Never Let Me Go, 2005), and in an imaginary central European city in which a concert pianist finds himself lost in a kind of surrealist nightmare of coincidence, farce and mistaken identity (The Unconsoled, 1995). He is no realist. But I never expected to encounter a she-dragon in his fiction or, for that matter, the wizard Merlin, from Arthurian legend.

Yet for all its flights of fantasy and supernatural happenings — a mist has settled over the land forcing people into a condition of forgetfulness, or so they believe — The Buried Giant is absolutely characteristic, moving and unsettling, in the way of all Ishiguro’s fiction.

more here.

Sojourner Truth Speech of 1851, “Ain’t I a Woman”

On April 28, six NWHM Board Members and Advisors attended the dedication of the Sojourner Truth bust in Emancipation Hall of the new Capitol Visitor Center. She becomes only the 10th statue of a woman to stand in the U.S. Capitol building, out of 211 statues. The Hall was filled with excitement and pride. Secretary of State Hillary Clinton noted that the event was long past due, Speaker Nancy Pelosi acknowledged standing on the shoulders of the women who came before, and Michelle Obama said that as a descendant of slaves, Sojourner Truth must be pleased to see her standing there as first lady. House Minority Leader John Boehner, Senate Minority Leader Mitch McConnell and Senate Majority Leader Harry Reid spoke in honor of the dedication – good speeches, but not nearly as impassioned as the women’s. The female speakers understood the indebtedness that they owed Truth. They had a vested interest and it showed.
One of the highlights was Cicely Tyson’s reenactment of Truth’s famous “Ain’t I a Woman?” speech. It brought the house down. She made the words – and Truth – come alive.
More here. (Note: This is our last post in honor of Black History Month. Please take 4 minutes to listen to this speech)

A Gaza Artist Creates 100 Square Feet of Beauty, and She’s Not Budging

Jodi Rudoren in The New York Times:


Nidaa Badwan’s room is less than 100 square feet, lit by a single window and a bare bulb. She has slathered one wall with aquamarine paint and covered another in a patchwork of colored egg cartons. There is a medium-size mirror, an antique sewing machine and iron, two easels, a large yellow ladder and a gas canister on which she boils water to make a sweet cappuccino drink from a packet. Alienated by Gaza’s restrictive religiosity and constant conflict with Israel, Ms. Badwan, 27, has hardly left the room for more than a year. Within its walls she has created her own world, and a striking set of self-portraits that are at once classical and cutting-edge. “I wait for the light,” said Ms. Badwan, who sometimes takes a week or even a month to construct photographs that look like paintings. “Everything is beautiful, but only in my room, not in Gaza. I’m ready to die in this room unless I find a better place.”

The project is called “100 Days of Solitude” in homage to Gabriel García Márquez’s landmark novel, though Ms. Badwan’s isolation has been much longer. Its 14 self-portraits, all about 40 by 22 inches, are on display at an East Jerusalem art gallery, whose director, Alia Rayyan, said they evoked the Dutch masters of the 16th and 17th centuries with modern splashes, made more meaningful against the unseen backdrop of chaos outside. That may be a stretch, but the images are undeniably compelling. Here is Ms. Badwan lying on her tummy in jean overalls and a wool hat, bare ankles crossed, staring at a laptop. There she is on a tire swing hung from that yellow ladder, buttoning a man’s shirt with a look of defiant pride. Wiping tears as she peels an onion. Threading a needle to sew a quilt. Gulping from a tin cup. Tapping an old typewriter, meditating, applying lipstick.

More here.

Jean-Luc Godard: still brooding on the end of cinema

J. Hoberman at The Nation:

What to make of the Godardian mind? You might say that, as prolific as he is, Godard suffers from the attention-deficit disorder of genius, a condition Bob Dylan evoked repeatedly in his mid-’60s work, as when he wailed, “I need a dump truck, baby, to unload my head.” There are more ideas about more things in any five minutes of Godard’s latest opus, Goodbye to Language, than in the year’s five next most intelligent movies combined.

Largely devoted to startling stereoscopic effects, alternating sections labeled “Nature” and “Metaphor,” ultimately devolving to the quandary of a youngish couple, apparently played by two sets of actors, about whether to have a baby or get a dog, Goodbye to Language is neither a narrative film, nor a film essay, nor even a documentary, but an almost indescribable mélange of gorgeous images, slapstick interactions, unanswerable questions and strident assertions, including the playful observation, opening the movie, that “Those lacking imagination take refuge in reality.” Godard is surely referring to himself. No filmmaker has ever been more interested in the fiction of the real—or crankier.

Peppered with questions during his Concordia talks, Godard routinely shoots from the hip. The most opinionated of cinephiles, as well as a former critic, he has no difficulty articulating his preferences.

more here.

virgil thomson: writing about music

Jeff Tompkins at The Brooklyn Rail:

The first pages of the Library of America’s new collection make it clear that when Virgil Thomson was named head music critic of the New York Herald Tribune in the fall of 1940, he came in spoiling for a fight. At that time, New York’s staid musical establishment was still in thrall to the 19th century and the Austro-German tradition, whereas Thomson was not only an ardent Francophile—he lived in Paris from 1925 to 1940, fleeing one step ahead of the Nazis in June of that year—but a composer of avant-garde tendencies whose opera Four Saints in Three Acts boasted a libretto by Gertrude Stein.

His opening salvos aimed at fat targets: Brahms’s music, Thomson informed his readers, is “timid and over-respectful of the past,” devotion to it “the mark of a quite definite musical conservatism”; Sibelius was “vulgar, self-indulgent, and provincial beyond all description”; the violinist Jascha Heifetz produced “silk-underwear music”; and the playing of the New York Philharmonic (in Thomson’s inaugural review, no less) was “dull and brutal.”

The Library of America volume, Music Chronicles 1940–1954, restores to print four collections of Thomson’s criticism from the years indicated, along with a miscellany of previously uncollected pieces.

more here.

Why the Dismal Science Needs a Richer Moral Anthropology

Christina McRorie at Hedgehog Review:

It’s tempting to conclude that Piketty understands and delivers what the public wants from economics: a return to classical political economy. More than a few reviews of his work make this exact claim; he’s been hailed as the Smith, Mill, or Marx the twenty-first century has been waiting for.

If only it were true. Or, to be fair, completely true. Piketty’s work is certainly a welcome step in the right direction, but it doesn’t make him the new Adam Smith. Empirical attention to the sweep of history was not the only thing that enabled the grand theorists of classical political economy to make sense of market life for their readers. The other half of what made political economy—the forerunner of economics—so compelling was its ability to take on “big questions” by connecting economic matters with moral and philosophical concerns, often by way of what might be called moral anthropology.

For those longing for a revival of political economy in the tradition of Smith, even Piketty’s remarkable turn to richer data doesn’t fully satisfy their hunger for a fully worked out exploration of the connection between economics and larger philosophical and moral questions. These questions are about more than mere markets and politics; they pertain directly to the moral dimensions of economic life.

Because early economists such as Smith (1723–90) were also moral philosophers, they took up such questions naturally. They assumed that theories of justice and normative reflections on society were inseparable from their theories about pricing, the distribution of resources, and national wealth. They assumed that their field could advance only if all such concerns were part of a seamless whole.

more here.

Ending the Creditor’s Paradise


Mark Blyth in Jacobin:

As I sat in my office at Brown University on December 16, 2014, an email popped into my inbox with the title “Herzlichen Glückwunsch – Sie sind der 1. Preisträger des Hans-Matthöfer-Preises für Wirtschaftspublizistik” (“Congratulations – you are the first winner of the Hans Matthöfer Prize for economics writing”). This was the award given by the Friedrich Ebert Stiftung (FES), the research foundation closest to the German Social Democratic Party (SPD), and the Hans-Matthöfer Stiftung for the best economics publication in German in 2014. I was, to say the least, surprised.

My 2013 Oxford University Press book, Austerity: The History of a Dangerous Idea, had recently been translated into German by the publishing arm of the FES. Indeed, I had been there a month earlier, in Berlin, to do a book launch, which was very well attended. Since then the book has been reviewed, positively, in the German press, with Süddeutsche Zeitung giving it a rather glowing review. Something odd was going on.

Clearly, despite the impression we get in the US, there was movement away from the “austerity is the only way” approach to thinking about the eurozone crisis in Berlin, at least among the social democrats — but how much movement?

Consider that during the negotiations to form the current coalition with German Chancellor Angela Merkel’s Christian Democratic Union, the SPD could have made an issue out of how the policies designed to heal Europe were causing great harm, a fact acknowledged even by the International Monetary Fund by 2012. But they chose not to do so.

True, as German (and French) politicians know only too well, there are no votes in talking about Europe, only costs, so not speaking up is locally rational. But not speaking up when such inappropriate policies are being applied to Germany’s European partners is collectively disastrous. Indeed, what is so tragic in this crisis is how the center-left throughout Europe have not just accepted, but in many cases actively supported, policies that have done nothing but hurt their supposed core constituencies.

So I was awarded the prize at a ceremony in Berlin for “thinking differently” about economics. Martin Schulz, the president of the European Parliament, gave the introduction. Peter Bofinger, the voice of macroeconomic reason on the German equivalent of the US Council of Economic Advisers, gave a speech praising the book. I had ten minutes to say something useful at the end of the event. But what should I say that would be of use to the six hundred social democrats gathered in the room?

More here.

Friday Poem

Unfinished Book

The Achaeans have been pinned against their hollow ships
all winter. Not because this is how the epic unfolds,
but because this is where I left off, tired of the clashing of swords,
the clamor of armor, the spilling of black blood,
tired of the myriad of forgettable Greek names and
their fathers’ places of birth.

Maybe later this month, after I’ve read The Elements of Style
and my daily horoscope for the second time, the Achaeans will once again
raise themselves from the dirt, released from their paused state,
and drive back onto the plain of Scamander to rescue a reluctant wife.

Or maybe this summer, when beach reading gives way
to more heroic struggles, Achilles will quit
his stubborn stance and rejoin the reckless battle;
but nothing will happen until I pick up that book and silently
recite the words, turn the pages, and continue the struggle.
I, Eric, son of Brent of Salt Lake.

by Eric Parker

Displaced Personhood: A Pakistani-British writer chronicles his odyssey through the war on terror

Jake Lamar in Bookforum:

One could say, with no snark intended, that back in the year 2000, twenty-nine-year-old Mohsin Hamid was the ultimate bourgeois bohemian. He had just published a well-received first novel. He lived on lovely Cornelia Street, in a corner of the West Village once inhabited by artists and writers but, by the dawn of the twenty-first century, affordable mainly to investment bankers and management consultants. As it happened, this debut novelist was also a management consultant. And in a deal of sugar-shock sweetness, his employer, McKinsey & Company—famous for overworking its bright young climbers—allowed this graduate of Princeton and Harvard Law three months off per year to write fiction. This guy, as Frank Sinatra might have crooned, had the world on a string, the string around his finger. But even back then, before the twin towers came tumbling down, Hamid felt the sting of Islamophobia in New York City. In “International Relations,” one of the many superb pieces in his first collection of essays, Discontent and Its Civilizations, Hamid describes how he was made to squirm every time he went to the Italian consulate in Manhattan to receive official clearance to visit his then-girlfriend in her European homeland. Hamid’s passport “runs suspiciously backward, the right-hand cover its front, and above the curved swords of its Urdu lettering . . . reads, ‘Islamic Republic of Pakistan.’ Words to make a visa officer tremble.”

For Hamid, life in the Big Apple would turn sour fast. As he writes in another essay: “The 9/11 attacks placed great strain on the hyphen bridging that identity called Muslim-American. As a man not known for frequenting mosques, and not possessing a US passport, I should not have felt it. But I did, deeply. It seemed two halves of myself were suddenly at war.” He arranged to have McKinsey transfer him, indefinitely, to London. All was well there, at least for a while: “Like many Bush-era self-exiles from the United States, I found that London combined much of what first attracted me to New York with a freedom America seemed to have lost in the paranoid years after 9/11.” In London, Hamid met the love of his life: “She and I had been born on the same street in Lahore.” He quit McKinsey. He published his mesmerizing second novel, The Reluctant Fundamentalist, which became an internationally acclaimed best seller. Marching with a million other people in Hyde Park to protest the 2003 invasion of Iraq, Hamid thought: “I am one of them. I am a Londoner.”

More here.

Why companies are rewarding shareholders instead of investing in the real economy


Lydia DePillis in The Washington Post's Wonkblog (via Doug Henwood) (image Jewel Samad/AFP/Getty Images):

In the past several years, profits have been increasingly paid back out to shareholders, rather than invested in hiring more people and paying them better. And lately, companies have even been borrowing money to make those shareholder payouts, because with interest rates so low, it’s a relatively cheap way to push stock prices higher.

That’s according to a new paper from the Roosevelt Institute, a left-leaning think tank that's launching a project exploring how the financialization of the economy has unlinked corporations from the well-being of regular people.

“The health of the financial system might matter less for the real economy than it once did,” writes J.W. Mason, an assistant professor of economics at John Jay College who wrote the paper, “because finance is no longer an instrument for getting money into productive businesses, but for getting money out of them.”

If it holds up, that has some pretty serious implications for how the Federal Reserve should go about tending the “real economy” in the future.

Here’s the data at the center of the report: In the 1960s, 40 percent of earnings and borrowing used to go into investment. In the 1980s, that figure fell to less than 10 percent, and hasn’t risen since. Instead of investment, borrowing is now closely correlated with shareholder payouts, which have nearly doubled as a share of corporate assets since the 1980s.

So what happened in the 1980s? The “shareholder revolution,” starting with a wave of hostile takeovers, propelled a shift in American corporate governance. Investors began demanding more control over the firm’s cash flow. Rather than plowing profits back into expansion and employee welfare, managers would pay them out in the form of dividends.

The years since the recession have given firms even more of an incentive to dispense cash rather than invest in growth: The Fed’s policy of keeping interest rates low has made credit cheap, and with weak consumer demand, high-yield investment opportunities have been scarce. So instead, companies have been borrowing in order to buy back stock, which boosts their share price and keeps investors happy — but doesn’t give anything back to the world of job listings and salary freezes, where most of us still exist.

More here.

French hip-hop

Jesse McCarthy in The Point:

In 1988 no one in France took the hip-hop movement seriously. It was the rec-room era. JoeyStarr and Kool Shen were just two kids from Seine-Saint-Denis, the 93rd ward, a neglected tract of housing projects on the northern outskirts of Paris. One black, the other white, they shared a love and talent for breakdancing and got together practicing moves in bleak lots and house parties. They started crews and listened to Doug E. Fresh, Masta Ace, Grandmaster Flash and Marley Marl. DJs played the breaks looped over jazzy horn riffs, cats sported Kangol hats and Cosby sweaters, and they tagged the walls of the city with their calling card: NTM, an acronym for Nique Ta Mère (Fuck Your Mother). There were no labels, no official concerts or shows, and the only airplay was after midnight on Radio Nova, a station dedicated to underground and avant-garde music, created and directed by French countercultural hero Jean-François Bizot.

I was at a house party in a spacious bourgeois apartment somewhere in the 16th arrondissement when I first heard DJ Cut Killer’s track “La Haine,” better known by its infamous refrain “Nique la police” (Fuck the police). I hadn’t yet seen the film La Haine (1995), which made the song famous, and which remains arguably the most important French film of the 1990s. I was at a boum, slang for a teenage house party and a tradition of Parisian coming of age that involves a great deal of slow dancing and emotional espionage. Sophie Marceau immortalized it as a mesmerizing ingénue in the greatest French teen romance ever produced, La Boum (1980). But I wasn’t dancing with Sophie Marceau.

More here.

Parasitic Wasps Infected with Mind-Controlling Viruses

Carl Zimmer in his excellent blog, The Loom:

In November, National Geographic put a ladybug and a wasp on its cover. They made for a sinister pair. The wasp, a species called Dinocampus coccinellae, lays an egg inside the ladybug Coleomegilla maculata. After the egg hatches, the wasp larva develops inside the ladybug, feeding on its internal juices. When the wasp is ready to develop into an adult, it crawls out of its still-living host and weaves a cocoon around itself.

As I wrote in the article that accompanied that photograph, the ladybug then does something remarkable: it becomes a bodyguard. It hunches over the wasp and defends it against predators and other species of parasitic wasps that would try to lay their eggs inside the cocoon. Only after the adult wasp emerges from its cocoon does the bodyguard ladybug move again. It either recovers, or dies from the damage of growing another creature inside of it.

How parasites turn their hosts into zombie slaves is a tough question for scientists to answer. In some cases, researchers have found evidence suggesting that the parasites release brain-controlling chemicals. But the wasp uses another strategy: there’s a parasite within this parasite.

More here.